Rocket Man

Paul Algreen on custom trading platforms.

Building a hedge fund computer platform capable of handling a global, around-the-clock trading operation is highly specialized and intricate work worthy of a rocket scientist. So as III Associates — a $4 billion Boca Raton, Florida–based quantitative hedge fund — undertakes that mission, the firm is in good hands. Its chief technology officer, Paul Algreen, is a rocket scientist — a Massachusetts Institute of Technology–trained aeronautical and astronautical engineer with a diverse background in computer programming and software development.

In addition to working in the aerospace industry, Algreen has served as a securities analyst and trader — a unique combination of skills. These days he is using all of them as he creates a customized trading platform that is part of III Associates’ efforts to expand its trading activities beyond the firm’s fixed-income interest-rate strategies to a broad range of credit instruments and derivatives.

Since taking over as CTO some 18 months ago, Algreen has significantly bulked up the firm’s technology team and has been aggressively pulling together the pieces of its new trading system. III Associates now boasts 17 technology specialists, up from 12 when he took the position — more than 15 percent of its 110-person workforce and a testament to the importance the firm places on technology.

In recent years, as III Associates has moved into new areas of trading in an effort to find additional ways to create alpha, the firm found that its 15-year-old legacy system was having difficulty handling the increasing amount of information it needed to process. When Algreen became CTO, he had to make a choice: buy or build a new system. He chose the latter, based partly on the firm’s culture, which values technical expertise and leans toward systems developed in-house that can be customized.

“We don’t like a black box,” says Algreen. “One of our primary core competencies is our quantitative development expertise. We have a very tech-savvy trading desk and quantitative analysis group.” Roughly a year into the project, he says, the firm is already conducting its credit trading on the new system, with the rest of its strategies on track to be switched over by the end of next year.

Algreen followed an unusual path into finance. After graduating from MIT in 1997, he took a job as a technical consultant with a private telecommunications firm, Kenan Systems Corp., now Comverse, a division of Alcatel-Lucent, helping implement software at client sites around the world. His next stop: a job as an aerospace engineer working on software development in Tucson, Arizona, for aerospace systems supplier Raytheon Co. After a stint as a technical engineer and consultant in the Boston office of Waltham, Massachusetts–based software maker Idiom Technologies, he returned to Tucson for his first CTO job, at start-up Octane Games, a developer of computer games for hand-held devices. During his time there, he learned about an opening for an entry-level analyst on the fixed-income desk at III Associates, where some of his college buddies had landed. Switching gears, he jumped to III Associates in 2003, obtaining his Series 7 license and learning about the principles and guidelines the firm has developed in its 25 years of operation. Over the next few years, he took on numerous other roles at the firm — becoming a trader, working with clients to execute trades, making trading recommendations and ultimately moving to the Treasury desk.

“I had always been interested in finance,” says Algreen. “So it made a lot of sense to make another drastic change.”

But then the CTO job opened up, and Algreen decided to move back to his original area of expertise. His trading experience provided him with a unique perspective that is helping to shape the way III Associates will use technology for years to come. In January, Algreen spoke with Alpha Contributing Writer Irwin Speizer to discuss that technology and the challenges of managing the computational needs of a large hedge fund.

Alpha: What are the biggest challenges that technology officers at hedge funds confront today?
Algreen: It depends on the type of hedge fund. As a fixed-income, relative-value hedge fund, our challenges center on our analytics and the models we use to develop our strategies. The challenge everyone faces — especially funds that have gone multistrategy — is to create a consistent, unified view of portfolio risk and profit and loss.

So risk management is the central focus of everything?
Generally, yeah. The risk in the trades that risk managers are looking at should be the same as what traders and the operations folks are looking at. A lot of funds use a prime broker, but we choose not to. We do all our own clearing and settlement in-house. So providing a consistent view is critical.

What is the biggest decision you’ve had to make as CTO?
We were faced with a rapidly expanding set of asset classes that we were trading. We were at a crossroads where our legacy system — our in-house proprietary system — was starting to fall behind. So we had to decide whether to build upon that platform, buy a big, third-party system or develop our own next-generation platform. We build a lot of our analytics and models ourselves, using combinations of third-party components and custom stuff. But at the end of the day, we like to maintain control internally, or at least have the ability to understand what’s going on.

What kind of resources did you need to undertake this project?
We didn’t have enough people or the right focus, so I had to convince our management committee to make an investment. Fortunately, I had circumstantial evidence. The operations guys were saying, “Look, we’ve got to do something different,” and the traders were saying, “We need to make this better.” We weren’t inadequate, but we needed things to be better to avoid unnecessary risk or the need to hire a lot of people for a job one person could do with technology. Take credit trades. Manually confirming hundreds of trades a day would take two or three people, but with technology one person can do it in half a day.

What role does technology play in III Associates’ strategies?
Technology is the only way to manage the volume of information in today’s market. Using technology we can do things much more efficiently — things like automation of data feeds and confirmation processes.

The complex nature of over-the-counter derivatives, in which the details are agreed upon with our counterparties at the time of a trade, lends itself well to a robust technology solution. All the details must be checked by someone in the middle office and back office, and we can do that through a combination of manual and electronic eyes. We input the details into our system, the other side inputs the details into their system, and wherever possible we use electronic matching to make sure the details match. There are large organizations and platforms we utilize to do that, but we have to build those integrations in the first place.
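
Conceptually, that matching step is a field-by-field comparison of the two counterparties’ records of the same trade. Here is a minimal sketch in Python, assuming hypothetical field names and a hypothetical price tolerance rather than any actual confirmation schema or matching platform:

    # Minimal sketch of electronic trade-confirmation matching.
    # Field names and the price tolerance are hypothetical examples,
    # not the actual schema used by any firm or matching platform.

    FIELDS_EXACT = ("trade_date", "notional", "currency", "counterparty", "direction")
    PRICE_TOLERANCE = 1e-6  # allow tiny rounding differences in price

    def match_trade(ours: dict, theirs: dict) -> list[str]:
        """Return a list of mismatched fields; an empty list means the trade matches."""
        breaks = [f for f in FIELDS_EXACT if ours.get(f) != theirs.get(f)]
        if abs(ours.get("price", 0.0) - theirs.get("price", 0.0)) > PRICE_TOLERANCE:
            breaks.append("price")
        return breaks

    # Example: one field disagrees, so the trade is routed for manual review.
    ours = {"trade_date": "2008-01-15", "notional": 10_000_000, "currency": "USD",
            "counterparty": "BANK_A", "direction": "BUY", "price": 99.875}
    theirs = dict(ours, price=99.880)
    print(match_trade(ours, theirs))  # ['price'] -> flag for the middle office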

Similarly, for credit there is a spread curve for every single name in the credit market — thousands of curves. Loading them into your system, ensuring data integrity and making sure all the curves are accurate requires a fair bit of analysis and automatic data scrubbing. If a trader is looking at a curve used to price a certain instrument, or an operations guy is looking at it, they need the same tools and data. Technology is really the only way to manage that volume of information.
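
The scrubbing Algreen describes can be pictured as a set of sanity checks applied before a curve is loaded. A minimal sketch, with illustrative thresholds and a made-up curve name rather than the firm’s actual rules:

    # Minimal sketch of automatic scrubbing for a single-name credit spread curve.
    # The curve is a list of (tenor_in_years, spread_in_basis_points) points.
    # Thresholds below are illustrative assumptions, not production rules.

    def scrub_curve(name: str, points: list[tuple[float, float]]) -> list[str]:
        """Return a list of data-integrity problems found for this curve."""
        problems = []
        tenors = [t for t, _ in points]
        if tenors != sorted(tenors):
            problems.append(f"{name}: tenors are not in ascending order")
        for tenor, spread in points:
            if spread <= 0:
                problems.append(f"{name}: non-positive spread at {tenor}y")
            elif spread > 10_000:  # more than 100% running spread is almost surely a bad tick
                problems.append(f"{name}: implausible spread {spread}bp at {tenor}y")
        return problems

    curve = [(1, 85.0), (3, 120.0), (5, 150.0), (7, -3.0), (10, 210.0)]
    for issue in scrub_curve("ACME_CORP_SR", curve):
        print(issue)  # flags the bad 7-year point for review before loading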

What are the technology challenges in risk analysis, and how do you deal with them?
Our technology strategy includes a few approaches. First, we maintain rigorous calibration for our analytics. We calibrate to various sources, including third-party sources and our actual counterparty valuations, on a daily basis. That is a key way to make sure that the way we quantify risk internally matches what the market does.
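
That daily calibration can be thought of as comparing internal model marks against outside marks and flagging anything that drifts beyond a tolerance. A minimal sketch of that feedback loop, in which the 2 percent tolerance and the position names are assumptions for illustration:

    # Minimal sketch of a daily calibration check: compare internal model marks
    # to external marks (pricing services, counterparty valuations) and flag
    # positions whose difference exceeds a tolerance.
    # The 2% tolerance and the sample data are illustrative assumptions.

    TOLERANCE = 0.02  # flag marks that differ by more than 2%

    def calibration_breaks(model_marks: dict, external_marks: dict) -> dict:
        """Return {position: (model, external)} for positions outside tolerance."""
        breaks = {}
        for position, model_price in model_marks.items():
            external_price = external_marks.get(position)
            if external_price is None:
                continue  # no outside mark today; nothing to compare against
            if abs(model_price - external_price) / abs(external_price) > TOLERANCE:
                breaks[position] = (model_price, external_price)
        return breaks

    model = {"CDX_IG_5Y": 101.2, "ACME_CDS_5Y": 97.0}
    external = {"CDX_IG_5Y": 101.3, "ACME_CDS_5Y": 92.5}
    print(calibration_breaks(model, external))  # ACME_CDS_5Y needs recalibration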

The second piece is grid technology — which is essentially a fancy way of taking applications and chunks of code and running them remotely on computers in parallel fashion. You are leveraging assets you already have that are idle. In order to really get a handle on your risk, you need to run scenarios and do stress analysis and regression analysis. To do that requires an awful lot of computational work. Grid technology allows us to take advantage of all of our hardware and to really assure our investors that we have looked at all the scenarios of concern; the grid is, in my mind, the most cost-efficient way to do that.
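
In spirit, the grid farms independent scenario runs out to whatever hardware is idle. The sketch below uses Python’s local process pool as a stand-in for a real grid, which would dispatch the same units of work across many machines; the scenarios and the toy revaluation function are invented for illustration:

    # Minimal stand-in for grid computing: run independent stress scenarios in
    # parallel with a local process pool. A real grid would dispatch the same
    # units of work to idle machines across the firm rather than to local cores.
    # The revaluation function and scenarios below are toy examples.

    from concurrent.futures import ProcessPoolExecutor

    def revalue_portfolio(scenario: dict) -> tuple[str, float]:
        """Toy revaluation: shift rates and widen spreads, return portfolio value."""
        base_value = 100.0
        pnl = -scenario["rate_shift_bp"] * 0.04 - scenario["spread_widening_bp"] * 0.02
        return scenario["name"], base_value + pnl

    scenarios = [
        {"name": "rates_up_100", "rate_shift_bp": 100, "spread_widening_bp": 0},
        {"name": "credit_crunch", "rate_shift_bp": -50, "spread_widening_bp": 300},
        {"name": "parallel_up_200", "rate_shift_bp": 200, "spread_widening_bp": 50},
    ]

    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:  # workers spread across local cores
            for name, value in pool.map(revalue_portfolio, scenarios):
                print(f"{name}: portfolio value {value:.1f}")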

How have these changes affected your ability to do modeling and analysis?
We can get risk to the users more quickly — whether a risk manager, a trader, an accountant or an operations desk — allowing them to do things on a more ad hoc basis. If they want to run a big simulation, instead of having to wait all night or kick it off before they leave for the weekend, they can get results in a few minutes.

How do you use technology to model and price assets, and how did these systems hold up in the recent volatile market?
Our portfolios include instruments priced on models, and we use a combination of models developed in-house as well as commercial models. They stood up quite nicely to the market gyrations. There is seldom much of a lag between what our models kick out and what the market is doing. Even if things are gyrating, we can weather the storm and keep up to date on what the true market prices are.

The recent market gyrations weren’t caused so much by models that didn’t stand up well as by people using models inappropriately. Any model just does what it is told. It’s up to the trader, the analyst and the risk manager to understand what the model is telling them and to know its limitations. A lot of desks out there simply don’t have that expertise and are treating these models as a black box. That is a dangerous thing.

When you say some people treat it as a black box and your people don’t — what exactly does that mean?
Let’s say I buy my models from a third-party vendor. I am a tranche trader, and I am looking at a tranche of a subprime-backed CDO. I want to buy a certain amount of a senior tranche. What should I pay for that risk? I punch it into the model and say, “Okay, that is the price.” If it’s a black box, that trader may not know what assumptions the model made. Did it make assumptions about the homogeneity of the spreads inside the basket? Did it make assumptions about the probability of default over time? If you go to any of the risk conferences, any of the forums or discussions, the development of these models is still an ongoing process. At the end of the day, the truth is what the market will pay. That is where calibration comes in. That is where our continuous feedback ensures that our models are as accurate as possible.
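
One way to see the difference between a black box and a transparent model is whether the user has to state the assumptions. Here is a deliberately crude toy in Python, not a real tranche or CDO model, in which the flat default probability, fixed recovery and homogeneity assumption must be passed in explicitly:

    # Toy illustration only: a crude expected-loss calculation whose assumptions
    # must be passed in explicitly, so the trader cannot overlook them.
    # This is not a real tranche or CDO pricing model.

    def expected_pool_loss(notional: float,
                           annual_default_prob: float,  # assumption: flat default rate
                           recovery_rate: float,        # assumption: fixed recovery
                           years: int,
                           homogeneous_pool: bool) -> float:
        """Expected loss on a credit pool under the stated (very strong) assumptions."""
        if not homogeneous_pool:
            raise ValueError("this toy model only handles a homogeneous pool")
        survival = (1.0 - annual_default_prob) ** years
        return notional * (1.0 - survival) * (1.0 - recovery_rate)

    # The caller sees exactly which assumptions drive the number.
    print(expected_pool_loss(10_000_000, annual_default_prob=0.03,
                             recovery_rate=0.40, years=5,
                             homogeneous_pool=True))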

So it all boils down to having a very robust modeling system?
Not only having a robust system, but also having traders who understand the models well enough to know their limitations. And second is the feedback, not just assuming that your models are always right. I might be stubborn and think, “The market is pricing that wrong, my model is right.” If Goldman Sachs, Merrill Lynch and Deutsche Bank will pay only so much for it, then, guess what, your model’s wrong. Maybe someday they will agree with you. But if you have mark-to-market exposure and you have margin calls, you can find yourself in trouble before the market corrects and comes back to your model. It is all about robust models, traders who understand the models and having calibration feedback.

What is the challenge of bringing these new systems in without disruption?
We test everything we build thoroughly. We’re fortunate that the end users of our software are our employees, which makes it easier to get it right the first time. It requires more effort than sticking with the status quo, but I doubt anyone here would want to switch back to the legacy system.
