Data Volume Skyrockets
Automated trading strategies are driving market data message volumes through the roof. "Especially in the last six months, it's been very difficult to stay ahead of the curve," according to Paul Famighetti, director of automated trading at Trillium Trading, a direct-market-access broker whose infrastructure supports black-box trading. Speaking at Wall Street & Technology's Data Management Conference in New York, Famighetti said his firm has had to add T1 and T3 communications lines to accommodate the growing data volume.
Over the past three years, market data volumes have risen by 1,750 percent, according to ComStock, a market data supplier. The volume of messages has more than doubled in the past 12 months alone and is projected to continue growing exponentially.
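A quick back-of-the-envelope calculation connects the two ComStock figures; the base-year volumes themselves are not given, so the sketch below only shows the implied growth rate:

    # Back-of-the-envelope growth arithmetic for the ComStock figures.
    # A 1,750% *rise* means volumes ended at 18.5x their starting level.
    growth_factor = 1 + 17.50          # +1,750% over three years
    annual_factor = growth_factor ** (1 / 3)

    print(f"Total growth factor over 3 years: {growth_factor:.1f}x")
    print(f"Implied annual growth factor:     {annual_factor:.2f}x "
          f"(~{(annual_factor - 1) * 100:.0f}% per year)")
    # About 2.65x per year, consistent with volumes "more than
    # doubling" in the past 12 months.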
"Bandwidth has been a really big [challenge], particularly in the last six to nine months, as algorithms have proliferated," said Jenny Drake, managing director, corporate client group, at Archipelago Exchange, who spoke at an industry event hosted by ComStock. Archipelago recently increased the bandwidth requirements of ArcaBook, its full depth-of-book data, from 3 megabytes to 4.5 megabytes, which represents a 50 percent increase in bandwidth in less than a year, Drake noted.
Meanwhile, investment banks, exchanges and electronic communications networks are under pressure to upgrade their market data infrastructures. "All we really could do is get higher powered hardware and faster bandwidth," says Andrew Goldsmith, director, global head of market data, at Dresdner Kleinwort Wasserstein (DrKW).
Though there have been cutbacks in the market data budgets of many brokerage firms over the past three years, the paradigm has changed, according to Robert Iati, partner, TABB Group. Because of the link between trading profitability and market data, the emphasis on accuracy, speed and distribution of data within and across a firm has never been more urgent, he asserts. "Now, the traders are saying, 'I can't execute my strategies effectively if I don't have accurate data, a robust infrastructure, and if I can't access analytics, decision support capabilities and update black boxes,'" Iati says.
In a recent report, "Data: The Life Blood of the New Electronic Marketplace," Iati predicts that "The emergence of advanced electronic trading, which hinges on real-time analysis of market information as a driver of trading strategies, will force firms to aggressively improve their data infrastructures."
What's causing the data avalanche? "The trend started with decimalization because there are so many more data points on the wire," Famighetti told conference attendees. "It's growing at an extremely high level in the past six months because of algorithmic trading, black-box trading and automated trading, which is capable of trading more rapidly and also can scale across thousands of symbols," he explained.
One of the catalysts is program trading, which covers a variety of portfolio-trading and list-trading strategies. Whereas program trading accounted for 20 percent of New York Stock Exchange volume in 2002, it jumped to 70 percent of the exchange's volume in 2004, according to TABB Group. And Aite Group, a Boston-based financial advisory and research firm, estimates that by 2008, 40 percent of all equity trading volume will be algorithmic.
Meanwhile, the rise of electronic options exchanges and auto-quoting by remote market makers is leading to escalating options quote volumes. The Options Price Reporting Authority (OPRA) projects that the industry will require capacity for 130,000 messages per second (MPS) by January 2006, up from the 110,000 MPS it requested for July 2005. "Three years ago, OPRA was probably coming across a couple of T1 [lines]. Now, we're getting OPRA on a DS-3, which is more like 31 T1s," says Daniel Connell, president of ComStock.
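Translating those message rates into line capacity shows why the jump from T1s to DS-3s was unavoidable. The sketch below assumes roughly 100 bytes per message on the wire, an illustrative figure rather than OPRA's actual encoding:

    # Rough capacity math: messages per second -> line bandwidth.
    # Assumes ~100 bytes per message on the wire (an illustrative
    # figure; actual encoded message sizes vary by feed and era).
    MSG_RATE = 130_000          # projected messages per second, Jan 2006
    BYTES_PER_MSG = 100         # assumption, see comment above

    T1_BPS = 1_544_000          # T1 line: 1.544 Mbps
    DS3_BPS = 44_736_000        # DS-3 line: 44.736 Mbps (28 multiplexed T1s)

    required_bps = MSG_RATE * BYTES_PER_MSG * 8
    print(f"Required bandwidth: {required_bps / 1e6:.0f} Mbps")
    print(f"That is ~{required_bps / T1_BPS:.0f} T1s "
          f"or ~{required_bps / DS3_BPS:.1f} DS-3s")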
Chicago-based Townsend Analytics (TAL), a direct-market-access provider to 40 markets in Europe and North America, recently upgraded its two data centers in Chicago and New York to a new market data architecture that it calls "TAL Hermes." "We're looking at several thousand servers in our data centers," says Jim Holt, VP of server development at Townsend Analytics.
"Machines are certainly taking the place of people," commented Peter Esler, former managing director, principal and global head of market data at Bear Stearns, speaking at the ComStock event. "Margins are thin - they're getting thinner. Spreads are requiring such immediate response, the only way to trade this market is through algorithmic trading," he said.
Location, Location, Location
As the market data traffic continues to surge, financial trading houses will need to fight the battle on two fronts: reducing data latency and increasing bandwidth, says TABB Group's Iati.
Already, many market centers and electronic communications networks - including INET, Archipelago and Nasdaq - are offering colocation services that enable clients to house their models at the execution facility's data center. The strategy can cut down on data latency, allowing firms to shave off milliseconds from data delivery times.
For example, Trillium Trading, the black-box and algorithmic trading division of proprietary trading firm Schonfeld Group, has colocated its data center with several of the equity market centers, including the INET ECN. "Really, the first bit or the first message on the wire is the most important thing. Even though you are traveling at the speed of light, geographic latency does come into play," Trillium's Famighetti tells WS&T.
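Famighetti's point about geographic latency is easy to quantify: light in optical fiber travels at roughly 200,000 kilometers per second, so distance alone sets a floor on delivery time. A rough sketch (the city distance is approximate):

    # Propagation-delay floor imposed by geography.
    # Light in fiber travels at roughly 2/3 c, about 200,000 km/s.
    FIBER_SPEED_KM_PER_S = 200_000

    def one_way_delay_ms(distance_km: float) -> float:
        """Best-case one-way propagation delay over fiber, in milliseconds."""
        return distance_km / FIBER_SPEED_KM_PER_S * 1_000

    # Approximate straight-line distances (illustrative, not route miles).
    for label, km in [("Chicago -> New York", 1_150),
                      ("Colocated, same data center", 0.1)]:
        print(f"{label}: {one_way_delay_ms(km):.4f} ms one way")
    # Chicago -> New York is about 5.75 ms each way; colocation cuts
    # the floor to fractions of a microsecond, which is where the
    # milliseconds firms are shaving come from.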
"As far as housing applications off-site at these managed networks, the concept is fine, but I don't know that a lot of financial firms will give external parties rights to their applications," notes Dresdner Kleinwort Wasserstein's Goldsmith.
Historical Plus Real-Time Data
High-frequency trading groups that are writing strategies against these feeds require market data platforms that can capture and store tick data, test strategies against historical data, and change them on the fly. "They run their algorithmic trading engines off of the real-time data feeds. Usually they're using direct exchange feeds," explains Alan Paris, director of PricewaterhouseCoopers' Data Management Group.
Companies such as Kx Systems, Vhayu, Apama (which was acquired by Progress Software) and StreamBase offer platforms that can handle the high tick volumes, integrate historical with real-time data, and allow firms to write trading strategies against the feeds. "They can handle terabytes of data, normalize it and provide APIs [application programming interfaces] against it," says DrKW's Goldsmith. "That's a great solution on the exchange side," he adds. DrKW is using one of the platforms "as an analytics engine to crunch enormous amounts of history along with real-time feeds in order to do some analysis," Goldsmith explains.
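The vendors' actual APIs differ, but the underlying pattern - capturing normalized ticks once and exposing the same store to both real-time and historical queries - can be sketched in a few lines of Python. All names below are hypothetical, not any vendor's interface:

    # Minimal sketch of a tick store that serves both live and
    # historical reads, the pattern these platforms productize at far
    # larger scale. All class and method names here are hypothetical.
    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Tick:
        symbol: str
        ts: float      # epoch seconds
        price: float
        size: int

    class TickStore:
        def __init__(self):
            self._ticks = defaultdict(list)   # symbol -> time-ordered ticks

        def append(self, tick: Tick) -> None:
            """Capture a normalized real-time tick."""
            self._ticks[tick.symbol].append(tick)

        def history(self, symbol: str, start: float, end: float):
            """Historical query API against the same store."""
            return [t for t in self._ticks[symbol] if start <= t.ts <= end]

    store = TickStore()
    store.append(Tick("MSFT", 1.0, 25.10, 200))
    store.append(Tick("MSFT", 2.0, 25.12, 500))
    print(store.history("MSFT", 0.0, 1.5))    # -> the 1.0s tick only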
To avoid the hassle of capturing and storing tick data, The Oak Group, a $100 million hedge fund based in Chicago, taps into Great Neck, N.Y.-based FlexTrade Systems. "I log into their system and I run my algorithms and my back testing on their side," explains Gajender Singh, head of trading for the hedge fund.
Singh uses Fast Simulator, which FlexTrade made available for testing last November, for back testing the firm's models. "There is no messaging back and forth," says Singh. The Oak Group uses a trading model that figures out how to trade during the day in an automated fashion. "With Fast Simulator, I'm able to go back in time and tweak a strategy and see what will happen," Singh notes. He's also able to optimize strategies and to test new models, he adds.
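FlexTrade has not published Fast Simulator's internals, but the core idea Singh describes - replaying recorded ticks through a strategy to see what would have happened - reduces to a simple loop. The sketch below is generic, and all names are hypothetical:

    # Generic tick-replay backtest loop, the idea behind "going back
    # in time and tweaking a strategy." This is not FlexTrade's Fast
    # Simulator; all names are hypothetical.
    def backtest(ticks, strategy):
        """Replay historical ticks through a strategy; collect its orders."""
        fills = []
        for tick in ticks:                 # ticks in timestamp order
            order = strategy(tick)         # strategy decides on each tick
            if order is not None:
                fills.append((tick["ts"], order, tick["price"]))
        return fills

    # A toy strategy: buy whenever the price dips below a threshold.
    def dip_buyer(tick, threshold=25.11):
        return "BUY" if tick["price"] < threshold else None

    ticks = [{"ts": 1, "price": 25.10}, {"ts": 2, "price": 25.12},
             {"ts": 3, "price": 25.09}]
    print(backtest(ticks, dip_buyer))   # -> fills at ts 1 and ts 3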
Before tapping into Fast Simulator, Singh was paying $1,000 a month to the New York Stock Exchange to purchase its tick data on a daily basis. "It was a big pain because I had to transfer this file from their systems to my system and maintain that data," he relates. "Now, I don't have to take care of it at all - FlexTrade maintains it," he says.
In fact, many hedge funds, proprietary trading groups and other firms that use black-box models are relying on their direct-market-access providers to house their models and provide rapid access to real-time quotes and the market venues. For example, Lime Brokerage, an agency broker, provides an infrastructure that meets the high-velocity needs of hedge funds.
"Many of our customers will put a trading server into our computer room," says Michael Richter, executive vice president, business development, at Lime Brokerage. "Capacity, throughput and low latency are critical dimensions of what we are supplying," he says.
Say a firm has a model that scans a universe of 3,000 stocks, Richter relates. Every quote streams into a trading algorithm, which scans the markets and scalps for opportunities. The strategies run on autopilot throughout the trading day and are recalibrated dynamically at night. There is very little human intervention, except to monitor the system for time lags in the data, or to stop the model if it's using a stale quote or if news has come out that will affect the model.
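The stale-quote safeguard Richter mentions is one of the few points of human oversight in such a setup. A minimal version simply refuses to trade a symbol whose last quote is too old; the 500-millisecond threshold below is illustrative, not any firm's actual setting:

    import time

    # Minimal stale-quote guard for an unattended model. The 500 ms
    # threshold is illustrative; a real desk tunes this per strategy.
    STALE_AFTER_S = 0.5

    last_quote_ts: dict[str, float] = {}   # symbol -> last quote arrival

    def on_quote(symbol: str) -> None:
        last_quote_ts[symbol] = time.monotonic()

    def safe_to_trade(symbol: str) -> bool:
        """False if we have never seen a quote, or the last one is stale."""
        ts = last_quote_ts.get(symbol)
        return ts is not None and (time.monotonic() - ts) < STALE_AFTER_S

    on_quote("MSFT")
    if not safe_to_trade("MSFT"):
        print("Halting model: stale or missing quote for MSFT")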
When Algorithms Go Wild
One concern is that black-box users can generate massive quote volumes that tie up a marketplace. With all the black boxes spewing bids, offers, and cancel-and-replace messages, they contribute to quotation traffic jams. "A program that's putting in prices 20 cents off the value of Microsoft - what's the value of that relative to one that's one or two cents off?" Archipelago's Drake asked WS&T conference attendees. This can increase the bandwidth requirements for all clients, she added.
Broker-dealers like Trillium Trading that sponsor DMA platforms, and marketplaces like Archipelago, say their users are "self-policing." Still, an algorithm can go awry or need fine-tuning, which is why, according to Drake, Archipelago monitors the quality of the bids and offers that black-box users post. "We work with the client and look at the data and the distribution of prices to make sure that the data that's coming in is valuable and that we're not getting a bunch of junk, quite honestly," she told conference attendees.
Jeff Brown, director of product development at UNX, an institutional agency broker in Berkeley, Calif., says that over the past six months, he has seen high levels of message traffic at prices that are away from the prevailing market. "If you're a dollar or two away from the prevailing market and you generate 50,000 messages per minute, you're really not contributing to price discovery in any meaningful way," he laments.
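Brown's complaint can be expressed as a simple screening rule: compare each quoted price to the prevailing market and flag traffic priced too far away to matter. The sketch below uses the one-dollar threshold from his example; a venue would tune its own:

    # Flag quote traffic priced too far from the prevailing market to
    # contribute to price discovery, the pattern Brown describes. The
    # $1.00 threshold comes from his example; venues would tune it.
    AWAY_THRESHOLD = 1.00

    def is_away_from_market(quote_price: float, best_bid: float,
                            best_ask: float) -> bool:
        mid = (best_bid + best_ask) / 2
        return abs(quote_price - mid) > AWAY_THRESHOLD

    quotes = [25.11, 26.50, 24.00]        # incoming quote prices
    bid, ask = 25.10, 25.12
    away = [q for q in quotes if is_away_from_market(q, bid, ask)]
    print(f"{len(away)} of {len(quotes)} quotes priced away from market")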
Townsend Analytics, the technology company that provides direct market access to asset managers and sell-side institutions, also has noticed the phenomenon. "If we see a bigger customer doing that, we'll talk to them," says the company's Holt.
"Obviously, it's not in our best interest to cut a customer off. Sometimes their code isn't optimized," Holt continues. Townsend might move the customer onto a different server, however, to protect the other customers in the server farm, he adds.
However, the problem is tricky, relates Brown, because when the models are working properly, they do contribute to price discovery and efficient markets, and they generate a lot of trading activity for brokers like UNX and markets like Archipelago. Yet data overflow could lead to the shutdown of a venue, he cautions.
Ivy is Editor-at-Large for Advanced Trading and Wall Street & Technology, responsible for writing in-depth feature articles, daily blogs and news articles with a focus on automated trading in the capital markets.