
Tapping the Pipeline

Automated trading is driving demand for low-latency data. In response, business units and IT groups are searching for ways to manage direct-exchange feeds before slow data drags down the bottom line.

Two years ago, Merrill Lynch had to tear down some long-standing IT and business silos before it put market data experts at the trading desks to monitor the speed and quality of all the data being piped in. "It was a new model for our business, as this had traditionally been completely an IT function," says Mike Stewart, managing director and head of global portfolio and automated trading at Merrill Lynch in New York. "This was done in partnership with IT."

Today, the surge in automated-trading strategies has put so much emphasis on getting fast and accurate - did we mention fast? - data that the kind of team approach to information technology that Merrill and other securities firms are taking has become necessary for survival. Financial institutions are measuring the speed at which data is delivered from the point when it leaves an exchange to the moment it arrives inside an analytic application and fires off an order. The slightest delay could make an automated trading application or algorithm price a security incorrectly, costing both clients and the firm.
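In outline, that measurement reduces to comparing an exchange timestamp against the local clock at the moment a message reaches the analytic application. The Python sketch below assumes a feed that stamps messages in epoch milliseconds and local clocks kept in sync with the exchange's (via NTP or similar); both are illustrative assumptions, not any firm's actual setup.

```python
import time

def tick_to_app_latency_ms(exchange_ts_ms: float) -> float:
    """Delay from the exchange's send timestamp to the moment the
    message reaches the analytic application, in milliseconds.

    Assumes (hypothetically) that the feed message carries an exchange
    timestamp in epoch milliseconds and that the local clock is
    synchronized with the exchange's.
    """
    arrival_ms = time.time() * 1000.0
    return arrival_ms - exchange_ts_ms
```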

Because automated trading applications are so sensitive to data latency, an increasing number of Wall Street firms are bringing in direct-exchange feeds from stock exchanges and electronic communications networks (ECNs), as well as futures and options markets, to populate specific trading applications with data more quickly. In the case of algorithms, brokers crunch real-time data continuously via analytic engines and generate "interval statistics," such as one-minute, five-minute and 10-minute bid-offer sizes, as well as spreads and volatilities, which then interact with historical databases. All of this must occur in milliseconds for the algorithm to capture opportunities before an order is canceled.
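A minimal sketch of such interval statistics, assuming a simple (timestamp, bid, ask) quote stream rather than any real feed format, might maintain a rolling window like this:

```python
from collections import deque
from statistics import pstdev

class IntervalStats:
    """Rolling statistics over a fixed time window (60 seconds here).

    A simplified stand-in for the "interval statistics" described above:
    average bid-offer spread and mid-price volatility over the window.
    The quote fields (ts, bid, ask) are illustrative.
    """

    def __init__(self, window_sec: float = 60.0):
        self.window = window_sec
        self.quotes = deque()  # (timestamp, bid, ask), oldest first

    def on_quote(self, ts: float, bid: float, ask: float) -> None:
        self.quotes.append((ts, bid, ask))
        # Evict quotes that have aged out of the interval.
        while self.quotes and ts - self.quotes[0][0] > self.window:
            self.quotes.popleft()

    def avg_spread(self) -> float:
        if not self.quotes:
            return 0.0
        return sum(ask - bid for _, bid, ask in self.quotes) / len(self.quotes)

    def mid_volatility(self) -> float:
        mids = [(bid + ask) / 2.0 for _, bid, ask in self.quotes]
        return pstdev(mids) if mids else 0.0
```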

With little tolerance for latency, companies are investing in their market-data infrastructures and taking control of managing their data internally - even if this means installing ticker plants, feed handlers and middleware to process, cleanse and redistribute each exchange feed internally into trading applications.
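That internal plumbing is, in outline, a parse-cleanse-redistribute pipeline. The toy bus and cleansing rule below sketch the middleware tier under invented message fields; they are not any vendor's product, and real cleansing rules are far more elaborate.

```python
class MarketDataBus:
    """Minimal internal redistribution layer: feed handlers push
    cleansed ticks in, trading applications subscribe by symbol.
    A toy stand-in for the middleware tier described above.
    """

    def __init__(self):
        self.subscribers = {}  # symbol -> list of callbacks

    def subscribe(self, symbol: str, callback) -> None:
        self.subscribers.setdefault(symbol, []).append(callback)

    def publish(self, symbol: str, tick: dict) -> None:
        for callback in self.subscribers.get(symbol, []):
            callback(tick)

def cleanse(tick: dict):
    """Drop obviously bad prints (crossed or non-positive quotes)
    before redistribution; returns None to discard the tick."""
    if tick["bid"] <= 0 or tick["ask"] <= 0 or tick["bid"] > tick["ask"]:
        return None
    return tick
```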

The trend is being driven by the surge in program trading, algorithmic trading and other types of automated trading strategies that are sensitive to delays too brief for the human eye to detect. Programs that scan ECN order books - which display multiple price levels for each security - need to see every "tick" in order to make trading decisions in milliseconds. And options market-makers use automatic quoting to calculate the bids and offers they transmit to the options exchanges based on movement in the underlying stocks.
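A depth-scanning program's core data structure can be sketched as a map of price levels to displayed size, updated on every tick. The update schema below (side plus signed size delta) is invented for illustration, not a real ECN protocol:

```python
class PriceLevelBook:
    """Minimal depth-of-book view: price level -> aggregate displayed
    size per side. Adds and cancels arrive as signed size deltas; the
    inside market is re-read after every update.
    """

    def __init__(self):
        self.bids = {}  # price -> total displayed size
        self.asks = {}

    def apply(self, side: str, price: float, size_delta: int) -> None:
        book = self.bids if side == "B" else self.asks
        book[price] = book.get(price, 0) + size_delta
        if book[price] <= 0:
            del book[price]  # level fully canceled

    def inside(self):
        best_bid = max(self.bids) if self.bids else None
        best_ask = min(self.asks) if self.asks else None
        return best_bid, best_ask
```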

"Every time an order is entered or canceled in the order book, it changes market data," says Larry Tabb, founder and chief executive officer of The Tabb Group. "The inside bid changes, you recalculate a whole chain of options and those options are listed on multiple venues - plus you have puts and calls and multiple strike prices."

At Merrill, Stewart works in partnership with Ed Keenan, Merrill's chief technology officer for global equities, to make data a core competency of the business unit.

Over the past year, the firm has focused on the problem of market-data latency in connection with its buildup in options market-making and automated trading. It has tackled the issue with two different approaches.

First, the firm receives price feeds directly from exchanges and ECNs and uses them to build the equivalent of a montage, or consolidated view, of prices. Second, Keenan is exploring ways to continue utilizing feeds from data vendors while eliminating some of the hops that the vendor takes when it consolidates the data. That requires building systems across both companies' hardware, software and network architectures. "We're still working with some of the vendors to see if we can essentially eliminate the latencies of their standard feeds," Keenan relates.

Merrill isn't alone in its quest to reduce latency. Other brokerage firms, proprietary trading desks and hedge funds are looking for alternatives to circumvent traditional consolidated-quote vendors, too.

"Data is oxygen," says Rob Flatley, managing director of electronic trading services at Banc of America Securities. "We go right to the floor for ours." Flatley's group takes in direct-exchange feeds to fuel a direct-access trading service, which it provides to smaller broker-dealers and their clients. It acquired Direct Access Financial Corp. in February to launch a direct-access trading operation, and now it runs a data-access center in Dallas with screens that monitor the data coming in from all the ECNs and futures and options exchanges.

Firms have two choices when they bring in direct market data: build the IT factory of ticker plants, feed handlers and middleware platforms to manage it, or hire vendors to do it for them. Banc of America wrote its own feed handlers to receive Level I and Level II data direct from Nasdaq and other stock, options and futures exchanges. Flatley finds data from the exchanges "faster and cheaper," and the company prefers to manage the cancel, replace and update messages that come in from exchanges itself.
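The heart of a feed handler is a dispatch over those message types. The sketch below uses an invented schema and field names rather than Nasdaq's real wire format, which is binary and carries far more message types:

```python
def handle_message(orders: dict, msg: dict) -> None:
    """Toy dispatch over the add/cancel/replace/update traffic a feed
    handler must absorb. Maintains an order-id -> (price, size) map.
    """
    kind = msg["type"]
    if kind == "add":
        orders[msg["id"]] = (msg["price"], msg["size"])
    elif kind == "cancel":
        orders.pop(msg["id"], None)
    elif kind == "replace":
        # A replace is an atomic cancel of the old order plus an add
        # of the new one, usually under a fresh order id.
        orders.pop(msg["old_id"], None)
        orders[msg["id"]] = (msg["price"], msg["size"])
    elif kind == "update" and msg["id"] in orders:
        price, _ = orders[msg["id"]]
        orders[msg["id"]] = (price, msg["size"])
```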

The trend could spell trouble for big market-data vendors such as Reuters and others that supply consolidated market-data feeds. Reuters recognizes the risk and is working on a direct-feed strategy of its own (see related story). But it's not panicking. "We've seen this movie before," says Peter Lankford, senior vice president and head of enterprise information systems at Reuters. Ten to 15 years ago, Wall Street firms spent millions to build their own ticker plants and did all the processing. "Gradually, it was no longer a competitive differentiator for them - if it ever was - and it became a cost issue," Lankford says. A lot of them had their own market-data system, and almost all of them now have a commercial system, he notes.

One element that's clearly changed from a decade ago is the volume of data that the entire market is attempting to digest. All those data messages generated by computerized trading create a tsunami of traffic, especially in options trading. The Options Price Reporting Authority (OPRA) - which aggregates all the quotes and trades from the options exchanges - estimates that peak rates will reach 62,000 messages per second by 2005, double the 31,000 messages per second it reached in February. Consolidated-quote vendors have capacity constraints and are beginning to use techniques known as "conflation" to filter the data so as not to transmit every quote update. This could push financial users that need to see every tick or canceled order toward direct-exchange feeds.
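Conflation, in its generic form, works roughly like the sketch below: within each publishing interval, only the most recent quote per symbol survives. This shows the basic technique under assumed names; vendors' actual filters and intervals differ.

```python
import threading
import time

class Conflater:
    """Generic quote conflation: within each publishing interval, keep
    only the most recent quote per symbol, so a burst of updates
    collapses to a single one.
    """

    def __init__(self, publish, interval_sec: float = 0.1):
        self.latest = {}  # symbol -> most recent quote
        self.publish = publish
        self.interval = interval_sec
        self.lock = threading.Lock()

    def on_quote(self, symbol: str, quote) -> None:
        with self.lock:
            # Overwrites any quote received earlier in this interval,
            # including the canceled orders a tick-sensitive app needs.
            self.latest[symbol] = quote

    def flush_forever(self) -> None:
        while True:
            time.sleep(self.interval)
            with self.lock:
                batch, self.latest = self.latest, {}
            for symbol, quote in batch.items():
                self.publish(symbol, quote)
```

The trade-off is visible in the overwrite: any quote superseded within an interval never reaches subscribers, which is precisely why applications that must see every tick gravitate toward direct feeds.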

For all these reasons, the pendulum appears to be swinging back toward market-data management as a core competency.
