The Art and Science of Leveraging Cloud Infrastructure
Having demonstrated an initial, and understandable, wariness of cloud services, trading firms and other players such as hosting providers in the institutional trading environment are slowly coming to embrace the technology. With storage demands for mid-size firms growing at approximately 90 terabytes a year, perhaps this is not surprising. Building proprietary infrastructure that can keep up with this level of growth is comparable to painting the Golden Gate Bridge: a never-ending task.
The growth of data volumes brings the advantages of the cloud into clear focus. The immediate benefit of cloud services is that, being built on shared infrastructure, they offer significant economies of scale. Leveraging the enterprise architecture and facilities that have been put in place by cloud service providers can therefore be a far more economical option than building out in-house resources.
Commodity providers, such as Google, Amazon and Microsoft, can also be a faster option for accessing infrastructure, hosting software services or managing storage – particularly when the alternative is building large data centers, with all the routing, networking and security that entails.
Specialist providers have also established themselves in greater numbers as more industry sectors and individual businesses engage with the possibility of using cloud-based services. These specialists tend to offer more predictable and always-available service with extremely high levels of computational power to serve more data-intensive businesses.
As a result of these developments, we have a diverse cloud ecosystem that can meet a wide variety of needs. From simple hosting of servers and applications for firms that do not require or cannot house a data center of their own, to hosted spaces for distributing software, hosting web services or maintaining a virtualized environment, the cloud is adding value.
But remaining concerns are legitimate, particularly for trading firms. In scenarios where latency is a concern, or time-sensitive data needs to be processed, a cloud-based option is clearly not appropriate. Worries about security and protection of sensitive data are equally pressing, and concern about the location of data centers remains, as firms have to ensure that the necessary connectivity and market data are available.
More attractive than a wholesale adoption of cloud computing therefore, is a hybrid model whereby a firm maintains its own infrastructure and data center alongside the service from a cloud provider. In this set-up, data is segregated and treated in the most appropriate environment.
In a hybrid scenario, a client’s personally identifiable information remains in a firm’s local infrastructure, while tick data used for compliance purposes is directed to cloud storage. By the same token, pricing data whose value and utility is determined in large part by the speed at which it can be accessed, understood and responded to remains in the local environment. Conversely, data acquired in the name of market research is directed to the cloud since its value is considerably less dependent on the speed at which it is available.
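The segregation rules described above amount to a routing policy keyed on data type. A minimal sketch of such a policy might look like the following; the category names and the `route` helper are illustrative assumptions, not drawn from any particular vendor's API.

```python
# Hypothetical routing policy for a hybrid cloud deployment.
# Category names are illustrative, not from any specific platform.
ROUTING_POLICY = {
    "client_pii": "local",       # personally identifiable information stays on-premises
    "tick_compliance": "cloud",  # tick data retained for compliance goes to cloud storage
    "live_pricing": "local",     # latency-sensitive pricing stays in the local environment
    "market_research": "cloud",  # research data tolerates higher access latency
}

def route(record_type: str) -> str:
    """Return the storage tier for a record type, defaulting to local
    when in doubt, since misplacing sensitive data is the costlier error."""
    return ROUTING_POLICY.get(record_type, "local")
```

Defaulting unknown categories to the local environment reflects the conservative stance the article describes: data only moves to the cloud once its sensitivity and latency profile are understood.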
Now that cloud providers have addressed many of the more practical concerns of their users – security being the obvious example – data segregation has become the major challenge in cloud deployments.
This is about more than separating the signal from the noise in vast amounts of data. Not all signals are needed in real time, after all. There is also little to be gained from a static, binary decision-making process, particularly as sensitivity to latency can change over time. As the gap narrows between 'historical' data used for dynamic or intra-day trading analytics and real-time pricing data, the need for more dynamic data analysis grows more pressing.
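The point about latency sensitivity changing over time can be made concrete: rather than fixing a tier at ingest, the local-versus-cloud decision is re-evaluated as data ages. The sketch below assumes a simple age cutoff (the one-hour window is a placeholder a firm would tune, not a figure from the article).

```python
from datetime import datetime, timedelta, timezone

# Assumed cutoff for illustration; a real deployment would tune this
# per data type and trading strategy.
HOT_WINDOW = timedelta(hours=1)

def storage_tier(timestamp: datetime, now: datetime) -> str:
    """Treat data inside the hot window as latency-sensitive (kept local);
    older data can migrate to cheaper cloud storage."""
    return "local" if now - timestamp <= HOT_WINDOW else "cloud"

now = datetime(2015, 6, 1, 12, 0, tzinfo=timezone.utc)
fresh = now - timedelta(minutes=5)
stale = now - timedelta(hours=6)
```

Because the tier is a function of age rather than a one-off label, the same record naturally drifts from the local environment to the cloud as the 'historical' boundary moves, which is the dynamic behavior the paragraph above calls for.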
Interestingly, the answer to the dilemma lies in human engagement as much as automated analytics. Greater collaboration between clients, traders and IT teams will be a key feature of the more successful hybrid models, since it allows for feedback regarding data types and usage to be incorporated into development plans.
This may appear counter-intuitive at first. After all, in cloud computing we have a phenomenon that is both driver and consequence of greater automation and the unfortunately named ‘rise of the machines’. But the truth is that this level of data analysis is an art as well as a science. As the key that opens up the multiple benefits of cloud computing, it is an art that is worth mastering.
David Meitz is Managing Director, CTO at ITG, the execution and research broker. Responsible for software development, technology and trading support services, and information security, Mr. Meitz joined ITG in 2002 from Reuters America, where he was an EVP responsible for ...