Is HCatalog The REST Of The Hadoop Capital Markets Story?
The necessary prerequisites for Hadoop to play a key role in the enterprise data architecture of capital markets firms are quickly falling into place.

Jennifer L. Costley, Ashokan Advisors
In a previous article, I discussed how new resource management features will allow multiple processing modes -- batch, interactive, online and streaming -- to run simultaneously with defined quality of service. Here, I address another key element in making Hadoop enterprise-ready: HCatalog.
A key component of Apache Hive, HCatalog is the metadata and table management system for the Hadoop platform; it stores and shares information about the structure of data. Critically, HCatalog also shares that data structure with external systems, including traditional data management tools. As Jim Walker, director of product marketing at Hortonworks, describes it: "It is the glue that enables these systems to interact effectively and efficiently and is a key component in helping Hadoop fit into the enterprise."
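To make that concrete, here is a minimal sketch of how a Java application might read a table definition that HCatalog manages, using the HCatClient API from org.apache.hive.hcatalog.api. The metastore URI and the "trades" table are assumptions for illustration, not part of the original article.

import org.apache.hadoop.conf.Configuration;
import org.apache.hive.hcatalog.api.HCatClient;
import org.apache.hive.hcatalog.api.HCatTable;
import org.apache.hive.hcatalog.data.schema.HCatFieldSchema;

public class ShowTableSchema {
    public static void main(String[] args) throws Exception {
        // Point the client at the shared Hive metastore.
        // The host is an assumption; 9083 is the usual metastore default port.
        Configuration conf = new Configuration();
        conf.set("hive.metastore.uris", "thrift://metastore-host:9083");

        HCatClient client = HCatClient.create(conf);
        try {
            // Fetch the table definition that HCatalog shares across tools.
            // "default"/"trades" is a hypothetical database/table pair.
            HCatTable table = client.getTable("default", "trades");
            for (HCatFieldSchema field : table.getCols()) {
                System.out.println(field.getName() + " : " + field.getTypeString());
            }
        } finally {
            client.close();
        }
    }
}

Any tool that goes through HCatalog sees the same column names and types, which is exactly the sharing Walker describes.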
Hive, as the de facto SQL interface for Hadoop, provides a relational view of data within Hadoop through an SQL-like language. HCatalog publishes Hive's table abstraction so other tools can use it, and exposes it through a REST interface as well.
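To illustrate that relational view, a query against HiveServer2 through the standard Hive JDBC driver might look like the following sketch. The host name and the "trades" table are hypothetical, and the Hive JDBC driver is assumed to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveQueryExample {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC endpoint; 10000 is the usual default port.
        // The host and the "trades" table are assumptions for illustration.
        String url = "jdbc:hive2://hive-host:10000/default";
        try (Connection conn = DriverManager.getConnection(url, "analyst", "");
             Statement stmt = conn.createStatement();
             // An ordinary SQL query over data that lives in Hadoop.
             ResultSet rs = stmt.executeQuery(
                     "SELECT symbol, COUNT(*) AS n FROM trades GROUP BY symbol")) {
            while (rs.next()) {
                System.out.println(rs.getString("symbol") + "\t" + rs.getLong("n"));
            }
        }
    }
}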
HCatalog includes:

-- A shared schema and data-type mechanism
-- A table abstraction
-- Interoperability across data processing tools in the Hadoop ecosystem, such as Pig, MapReduce and Hive
-- A REST interface that allows language-independent access to Hive's metadata (sketched below)
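That REST interface is served by the WebHCat server (formerly Templeton), which listens on port 50111 by default. Here is a minimal sketch of fetching a table's metadata over plain HTTP; the host, the database/table pair and the user.name value are assumptions for illustration.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class WebHCatSchemaLookup {
    public static void main(String[] args) throws Exception {
        // Describe a table through WebHCat's DDL resource.
        // Host, database/table and user.name are hypothetical values.
        URL url = new URL("http://webhcat-host:50111/templeton/v1/"
                + "ddl/database/default/table/trades?user.name=analyst");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // JSON describing columns, location, etc.
            }
        } finally {
            conn.disconnect();
        }
    }
}

Because the response is JSON over HTTP, any language with an HTTP client can discover what data exists in Hadoop and how it is structured, without linking against Hive itself.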
Data technology leaders are starting to use HCatalog to integrate Hadoop into their overall data architectures. Teradata recently announced its Teradata SQL-H product, which uses HCatalog to provide direct access to Hadoop data through standard ANSI SQL and allows that data to be processed in-memory on Teradata.
Not bad for an open-source project that was conceived (back in 2011) not as an enterprise enabler, but as a way to avoid having to track down the producer of a Hadoop data set to ask where the data is written, what format it is in and what its schema is.
About The Author: Jennifer L. Costley, Ph.D. is a scientifically trained technologist with broad multidisciplinary experience in enterprise architecture, software development, line management and infrastructure operations, primarily (although not exclusively) in capital markets. She is also a non-profit board leader recognized for talent in building strong governance and process. Her current focus is on helping companies, organizations and individuals with opportunities related to data, analysis and sustainability. She can be reached at www.ashokanadvisors.com.