Data Opportunities: The Concepts Don't Change Over Time
Discussion abounds in the industry as to how firms are handling data to improve performance. Although the investment and trading environments have changed over the past few years, the quantitative principles driving the analysis of data are still the same. What has changed is the means by which data is handled and processed, a shift tied closely to the use and development of technology.
Pick up any statistics book. Speak with any quant. Talk to any finance professor. The principles of finance, risk and mathematics have not changed since they were advanced in the mid-20th century. Fundamentally driven investment firms still conduct financial statement and industry analysis the way they have for decades. Quantitatively oriented firms still apply the same mathematical principles to time series data to benefit from divergences from established relationships. Trading firms still look for opportunities among undervalued and overvalued securities and related derivative products.
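As a concrete, purely hypothetical illustration of that last point, the sketch below checks for a divergence from an established relationship between two related price series using a simple z-score of their spread. The synthetic data, hedge ratio and two-sigma threshold are assumptions for the example, not any particular firm's model.

```python
import numpy as np

def spread_zscore(series_a, series_b, hedge_ratio=1.0):
    """Z-score of the latest spread value between two related price series."""
    spread = np.asarray(series_a) - hedge_ratio * np.asarray(series_b)
    return (spread[-1] - spread.mean()) / spread.std()

# Synthetic, closely related series stand in for real market data.
rng = np.random.default_rng(0)
a = np.cumsum(rng.normal(0, 1, 500)) + 100
b = a + rng.normal(0, 0.5, 500)

# Flag a divergence when the latest spread sits more than two standard
# deviations from its historical mean (the threshold is arbitrary here).
if abs(spread_zscore(a, b)) > 2.0:
    print("Spread has diverged from its established relationship")
```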
Technology is what has changed. Computer power, memory and storage are significantly cheaper than they were five, 10 and 30 years ago. These drivers have given modern-day analysts and supporting technologists the ability to process, store and normalize considerably more data than in the past. Analysts, leveraging that technology, arguably spend more time analyzing than performing the underlying tasks of data collection, scrubbing and the like.
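A minimal sketch of that collection-and-scrubbing work, assuming (purely for illustration) tick data held in a pandas DataFrame indexed by trade timestamp with "price" and "size" columns:

```python
import pandas as pd

def normalize_ticks(ticks: pd.DataFrame) -> pd.DataFrame:
    """Scrub raw trade prints and normalize them to one-minute OHLC bars."""
    clean = ticks[ticks["price"] > 0]                 # drop zero/negative prints
    bars = clean["price"].resample("1min").ohlc()     # normalize to one-minute bars
    bars["volume"] = clean["size"].resample("1min").sum()
    return bars.dropna()
```

The column names and one-minute bar size are arbitrary choices; the point is that this routine work is now cheap enough to automate, leaving more time for the analysis itself.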
Cheap technology and the off-the-shelf availability of sophisticated analytics packages have been the game changers. The commonly discussed "race to zero" -- the tremendous technology spend undertaken by a handful of firms whose profitability, if not survival, depends on the success of latency-sensitive strategies predicated on the fastest execution speed -- is a result of cheap, widely available technology backed by tremendous financial resources. Those less inclined to play the "speed game" can play the "smarts game." While at one point those with Excel spreadsheets had an advantage over others, the availability of packages like Matlab -- if not its open source equivalents -- puts analytical power in the hands of newly minted quants (and many not-so-quants).
Clearly, the availability of cheap technology and analytics has set the stage for extremely short-term holding periods and time horizons, fostered the rise of high-frequency traders, and brought about considerable industry, regulatory and political discussion as to the merits and detriments of such practices. Regardless of one's view on whether such changes benefit overall market operations, no one denies that they have in fact occurred.
Changes in investment horizon and analytics have shifted the overall trading and investment environment. Still, they haven't altered the fundamental objective of data analysis or the underlying means by which it is performed. On the short end, firms that garner returns through extremely short holding periods still examine time series data for opportunities, just more of them across ever-smaller windows, as sketched below.
The former "short-term" traders who can no longer play in the smallest windows must make do by examining data over longer ones. Then there are new, ancillary data types working their way into the information equation, such as news made actionable as trading triggers. While all of the above are legitimate topics for discussion and analysis in their own right, none invalidates the premise that the underlying use of data is what it has always been: to look for return-generating opportunities for a given level of risk.
The trading and investment landscape has changed over the years, and a substantial share of that change has come from technology-driven initiatives. On the surface, it may be plausible to suggest that the use of data has changed as well. Fundamentally, it has not. Certainly, particular elements of data handling and analysis have evolved. But the overarching principles of alpha generation still apply.
Matt Samelson is a Principal at Woodbine Associates, Inc. focusing on strategic, business, regulatory, market structure and technology issues that impact firms active in and supporting the global equity markets.