Data Integrity: A Necessity, Not an Option
Why would any financial firm or bank want to rely on its software or data provider to ensure its data integrity? As the saying goes: garbage in, garbage out.
Proper data integrity, and the results that data produces, matter more now than ever for reducing risk: incorrect or inconsistent data can lead to false conclusions and misdirected investments. With exchanges, content, and formats multiplying every day, the job becomes more laborious and complex. Increased regulatory compliance and the need for more transparent reporting mean the accuracy of the data is more important as well.
Financial institutions that have taken on the data integrity task in the past now have to spend more money on hardware, software, and people just to keep up with the demand -- at a time when many are looking to reduce costs. Now firms are turning to strategic data partners that have the expertise and economies of scale not only to deliver content, but also to serve as their data integrity specialists.
It takes a substantial team and sophisticated tools to maintain large historical databases and updates. By outsourcing IT responsibilities to experts, firms have an opportunity to achieve cost savings and realign internal teams to focus on more important tasks. Experts that invest in innovating and maintaining high-performance data management infrastructures are best equipped to validate, clean, and deliver the data that firms need to drive their business activities and decisions.
Two main tasks drive high-quality data: data integrity checks and an effective correction process.
Checking the integrity of data as it is loaded is crucial. When database corrections are sent throughout the day for historical pricing master data, for example, all exceptions and questionable data items should be researched and verified during the next business day. In addition, checks should be performed on a daily or intraday basis, generating reports and email alerts. These reports should be reviewed by data analysts who can make corrections to the database as required.
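To make the idea concrete, here is a minimal sketch in Python of the kind of load-time check described above: it flags missing or non-positive closes and outsized day-over-day moves in a historical pricing record, then collects the exceptions into a report a data analyst could review the next business day. The record layout, field names, and threshold are illustrative assumptions, not any vendor's actual schema.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record layout for a historical pricing master;
# field names are illustrative only.
@dataclass
class PriceRecord:
    symbol: str
    trade_date: date
    close: float
    prev_close: float

def check_record(rec: PriceRecord, max_move: float = 0.25) -> list[str]:
    """Return a list of exception messages for one record (empty if clean)."""
    exceptions = []
    if rec.close is None or rec.close <= 0:
        exceptions.append(f"{rec.symbol} {rec.trade_date}: missing or non-positive close")
    elif rec.prev_close and abs(rec.close / rec.prev_close - 1) > max_move:
        exceptions.append(
            f"{rec.symbol} {rec.trade_date}: close moved "
            f"{abs(rec.close / rec.prev_close - 1):.0%} vs. prior day; verify"
        )
    return exceptions

def daily_integrity_report(records: list[PriceRecord]) -> list[str]:
    """Run checks over the day's load and collect every exception for analyst review."""
    report = []
    for rec in records:
        report.extend(check_record(rec))
    return report

# Example run: one clean record, one suspicious jump flagged for next-day research.
today = [
    PriceRecord("ABC", date(2013, 5, 6), 101.50, 100.75),
    PriceRecord("XYZ", date(2013, 5, 6), 42.00, 12.00),
]
for line in daily_integrity_report(today):
    print(line)
```

In practice the same checks would feed the daily or intraday reports and email alerts mentioned above, rather than printing to a console.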
One of the greatest challenges in achieving data integrity involves the correction of values and invalid entries. If invalid entries are simply deleted, information is lost, making it harder to base quality decisions on the complete data set. Data cleansing can be an expensive and exhaustive process, since data updates are made every day. A provider can make these corrections quickly and efficiently because it is the provider's core competency. For example, SunGard makes more than 1 million data corrections a month to its data content in a managed services environment, which eliminates the need for firms to do so on their own.
A successful correction process will maintain a database that is complete and accurate. This requires the right tools, a well-thought-out process, and a team dedicated to both automated and manual checks across the full cycle: data collection, validation, investigation, re-issue, and distribution.
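As an illustration of correcting rather than deleting, the sketch below keeps every version of a price record, so the invalid value remains available for audit while the latest, corrected value is what gets re-issued downstream. The class and field names are hypothetical, chosen only to show the pattern, not to reflect any provider's system.

```python
from datetime import datetime

# Minimal sketch of a correction log that overwrites bad values
# without losing the original entry.
class PriceHistory:
    def __init__(self):
        # key: (symbol, trade_date) -> list of versions, newest last
        self._versions: dict[tuple, list[dict]] = {}

    def load(self, symbol, trade_date, close):
        """Initial load from the daily feed."""
        self._versions.setdefault((symbol, trade_date), []).append(
            {"close": close, "corrected_at": None, "reason": "initial load"}
        )

    def correct(self, symbol, trade_date, new_close, reason):
        """Apply an analyst correction as a new version; the bad value is kept for audit."""
        self._versions[(symbol, trade_date)].append(
            {"close": new_close, "corrected_at": datetime.now(), "reason": reason}
        )

    def current(self, symbol, trade_date):
        """Value to distribute downstream: always the latest version."""
        return self._versions[(symbol, trade_date)][-1]["close"]

    def audit_trail(self, symbol, trade_date):
        """Full history of the record, so nothing is silently deleted."""
        return list(self._versions[(symbol, trade_date)])

# Example: a fat-finger close is corrected after next-day investigation,
# and the original value remains available for review.
db = PriceHistory()
db.load("XYZ", "2013-05-06", 420.0)
db.correct("XYZ", "2013-05-06", 42.0, "decimal shift confirmed against exchange feed")
print(db.current("XYZ", "2013-05-06"))           # 42.0 goes out in the re-issue
print(len(db.audit_trail("XYZ", "2013-05-06")))  # 2 versions retained
```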
The value of data depends on its quality. And in today's marketplace, quality is not an option but a major requirement. Experts that provide the services that ensure validated and cleansed data every day can help firms quickly gain the transparency and efficiency they need to reduce risk and cost, meet regulatory requirements, and make investment decisions that drive greater profitability.
Jason Finley is a technical analyst of historical data and technologies for SunGard's brokerage business and has 15 years of experience in the financial services industry. Prior to joining SunGard, he served as head technical account manager at Activ ...