Dr. Thomas C. Redman, President, Navesink Consulting Group

Data: A Sea Change Ahead

If I'm reading the tea leaves correctly, we're on the verge of a sea change in the way organizations manage data. I'm referring specifically to data quality, which has been a thorn in management's side for some time.

Data quality - in the "BIG-Q" sense - means having the right and correct data in the right format, in the right place, at the right time to complete an operation, satisfy a client, conduct an analysis, or make and execute a plan. BIG-Q data quality elevates the people using data to the role of customers and recognizes that they are the final arbiters of its quality.

Implicit in BIG-Q data quality is the notion that customers can effectively complete their tasks millions of times a day at the lowest possible cost. BIG-Q data quality means that a firm need not worry that its data is incorrect in risk, regulatory and compliance activities. It means that clients, regulators and others can trust opinions, reports and statements. It means that the firm can exploit ever-expanding technological capabilities to take in, assimilate, organize and process massive quantities of new data to better understand clients, markets and products.

Data Quality Starts With the Business

The thrust of the sea change will not be about computers, databases, applications or systems. Rather, it will take place in the hearts and minds of leaders as they come to appreciate the quantum leaps in data management that are theirs for the taking. But the change will come at a price - not so much in dollars and cents, but in the replacement of long-cherished methods of managing data.

The signs are everywhere. Perhaps the most important change is in the approach to data quality. Historically, firms focused their data-quality efforts on finding and fixing errors. Indeed, entire operations and IT departments aimed to fix failed trades, correct errors in client account data and repair discrepancies between reference data drawn from two sources.

Over time, firms got better at finding and fixing errors. But companies are coming to realize that the results often are not very good. These operations take time and cost money. Worse, they don't catch all the errors, exposing firms to client dissatisfaction, increased risk and litigation.

Leading financial services firms are beginning to see that there is a better way to manage data. They are coming to realize that preventing errors at their sources yields better results at lower total cost. The reason is simple - eliminating a root cause may obviate the need to find thousands of errors at a later date.

The one-time investment needed to eliminate a root cause is repaid many times over by the reduction in ongoing costs. Most manufacturers learned these lessons during the 1960s, '70s and '80s. Those that didn't are no longer around.
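
To make prevention at the source concrete, consider a hypothetical illustration (mine, not the author's): a mistyped instrument identifier caught at the point of entry rather than surfacing later as a failed trade. The Python sketch below applies the standard ISIN check-digit test, so the bad value never enters the process.

    import re

    def isin_is_valid(isin: str) -> bool:
        """Check an ISIN's format and check digit at the point of entry."""
        if not re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}[0-9]", isin):
            return False
        # Expand letters to two-digit numbers (A=10 ... Z=35), then run the
        # standard Luhn test over the resulting digit string.
        digits = "".join(str(ord(c) - 55) if c.isalpha() else c for c in isin)
        total = 0
        for i, ch in enumerate(reversed(digits)):
            d = int(ch)
            if i % 2 == 1:      # double every second digit from the right
                d = d * 2 - 9 if d * 2 > 9 else d * 2
            total += d
        return total % 10 == 0

    # Rejecting the record at capture time replaces downstream repair work.
    print(isin_is_valid("US0378331005"))   # True  (Apple Inc. common stock)
    print(isin_is_valid("US0378331006"))   # False (a one-digit typo is caught)

A few lines of validation at the source stand in for an open-ended stream of downstream repairs.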

A related change involves the expectations of technology. During the Industrial Age, Dr. W. Edwards Deming advised manufacturers against automating poorly performing processes. "Doing so will only produce junk faster," he counseled.

Firms are beginning to learn that Deming's wisdom is especially pertinent in the Information Age. You simply can't computerize your way out of a broken accounts process, settlement process, client reporting process or any other process that involves data. Even if the technology works as planned, the newly automated process still will produce bad data.

Data Isn't IT's Problem

So, managers are learning new roles. In the past, it was common to bury responsibility for data quality somewhere in IT. After all, "The data is in the computer."

But now, firms are learning that it is not the computer that creates new data, but their business processes. Firms are learning that their most important processes cross departmental boundaries and that it is not enough for each department simply to do its job. The entire enterprise must work in concert.

Importantly, firms depend on data created by others. Reference, pricing, ratings, demographic, transaction and other data all are created by outsiders. Reference data, in particular, provides a special lesson. Many firms have adopted the so-called "golden copy" strategy, purchasing data from multiple suppliers and, using business rules, loading a golden copy into their security masters. But this simply is a superior form of finding and fixing errors.
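
For concreteness, the business rules behind a golden copy often reduce to field-level precedence across feeds, roughly as in the Python sketch below; the vendor names, fields and precedence rules here are hypothetical.

    # Per-field vendor precedence: the first vendor supplying a value wins.
    FIELD_PRECEDENCE = {
        "issuer_name":   ["vendor_a", "vendor_b", "vendor_c"],
        "coupon_rate":   ["vendor_b", "vendor_a", "vendor_c"],
        "maturity_date": ["vendor_b", "vendor_c", "vendor_a"],
    }

    def build_golden_copy(records_by_vendor: dict) -> dict:
        """Merge one security's records from several vendors into a golden copy."""
        golden = {}
        for field, precedence in FIELD_PRECEDENCE.items():
            for vendor in precedence:
                value = records_by_vendor.get(vendor, {}).get(field)
                if value is not None:
                    golden[field] = value
                    break
        return golden

    feeds = {
        "vendor_a": {"issuer_name": "ACME Corp", "coupon_rate": 5.25},
        "vendor_b": {"issuer_name": "Acme Corporation", "maturity_date": "2030-06-15"},
    }
    print(build_golden_copy(feeds))
    # {'issuer_name': 'ACME Corp', 'coupon_rate': 5.25, 'maturity_date': '2030-06-15'}

However elaborate the rules become, the work is still triggered by conflicting inputs, which is why it remains a form of find-and-fix.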

Yet firms are learning a new way: They are following the lead of manufacturers that now manage suppliers to obtain better quality parts, delivered exactly when and where they are needed.

Financial services firms are learning how to engage their vendors in a new way - proactively explaining their data requirements, helping vendors measure their performance against those requirements and insisting that vendors eliminate the root causes of gaps between requirements and performance. And in doing so, they are learning that they don't need so much redundant data. As Mark Twain observed, "A man with a watch knows what time it is. A man with two is never sure."
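
One way to picture measuring vendor performance against requirements is to state each requirement as an executable check and report the resulting defect rates back to the supplier. The checks, field names and sample feed in this Python sketch are hypothetical.

    # Hypothetical, executable requirements for a vendor price feed.
    REQUIREMENTS = {
        "price_present_and_positive": lambda r: r.get("price") is not None and r["price"] > 0,
        "currency_is_3_letter_code":  lambda r: isinstance(r.get("currency"), str)
                                                and len(r["currency"]) == 3
                                                and r["currency"].isupper(),
    }

    def defect_rates(records):
        """Fraction of records failing each requirement - the substance of the
        performance report shared with, and acted on by, the vendor."""
        n = len(records) or 1
        return {name: sum(1 for r in records if not check(r)) / n
                for name, check in REQUIREMENTS.items()}

    feed = [
        {"price": 101.30, "currency": "USD"},
        {"price": -4.00,  "currency": "USD"},   # fails the price requirement
        {"price": 99.90,  "currency": "us"},    # fails the currency requirement
    ]
    print(defect_rates(feed))   # each requirement fails on 1 of 3 records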

Firms are beginning to recognize that they need hard facts to manage data quality and are adapting statistical tools that have proved effective on the factory floor to their processes. Also, firms are coming to recognize that their continuing need for more and more data, faster and faster, is only accelerating. They understand the need for well-defined processes in acquiring new data, integrating the data into systems and processes, and using the data to create new value for their businesses.
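
Control charts are among the factory-floor statistical tools that translate most directly to data. A minimal sketch of a p-chart-style calculation on a daily record-error rate, with invented figures:

    # A p-chart-style check on a daily record-error rate; figures are invented.
    daily = [  # (records processed, records with at least one defect)
        (5000, 60), (5200, 55), (4800, 70), (5100, 58),
        (5050, 62), (4950, 57), (5150, 140),
    ]

    total_n = sum(n for n, _ in daily)
    total_defects = sum(d for _, d in daily)
    p_bar = total_defects / total_n                 # long-run average defect rate

    for day, (n, defects) in enumerate(daily, start=1):
        p = defects / n
        sigma = (p_bar * (1 - p_bar) / n) ** 0.5    # binomial standard error
        ucl = p_bar + 3 * sigma                     # upper control limit
        lcl = max(0.0, p_bar - 3 * sigma)           # lower control limit
        status = "in control" if lcl <= p <= ucl else "out of control: find the root cause"
        print(f"day {day}: p={p:.4f}  limits=({lcl:.4f}, {ucl:.4f})  {status}")
    # Only day 7 falls outside the limits; that day, not every error, warrants investigation.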

As firms struggle with these concepts, they come to realize that data is an asset that must be managed in its own right. People must be assigned to manage vendors and cross-departmental processes. Leading firms are beginning to identify the small fraction of their data that matters most and to focus their efforts there.

Data Accountability

Finally, since everyone touches data in some way or other, firms are realizing the need for data governance and a senior leader (i.e., a chief data officer) to drive the needed enterprisewide changes. The greatest challenge is gaining consensus across the enterprise that everyone is accountable for the quality of data he or she creates, not just for completing his or her assigned task.

The changes described here are happening in pockets throughout the industry. Some firms are using them to process check and credit card transactions more efficiently, others are creating superior reference data and still others are sourcing data.

Some firms have used these techniques to create account data completely and correctly and to keep it that way throughout the lifetime of the account. Others are finding that some of the controls they had to define and implement to comply with Sarbanes-Oxley also have the salutary effect of helping them create better data - controls that actually add value.

Most important, some firms are positioning themselves as the high-quality providers in their markets. To win and keep business, these firms must back up the claim that they are better than their competitors with cold, hard facts.

But pockets throughout the industry do not a sea change make. A variety of external and internal forces will drive early adopters to extend data quality efforts throughout their firms. And these forces will, over time, compel firms not yet employing modern methods of data quality management to do so.

First are legal and regulatory concerns. Attempting to meet them through the find-and-fix-errors approach to data quality simply is too uncertain and expensive. Second is the marketplace. Early adopters are finding that data quality pays, and they work even smarter as a result.

In industry after industry, history shows that, once customers see that higher quality is indeed feasible, they come to expect it. And no firm can afford to be left too far behind.

Finally, the lowered risk, reduced cost and improved internal customer satisfaction enjoyed by early adopters cannot be ignored. Each of these forces will ebb and flow. And each will affect different firms differently. But in aggregate, they will only grow in force.

---

About The Author

Thomas Redman, president of Navesink Consulting Group, was one of the first to extend quality principles to data and information. Redman has helped companies in a wide range of industries, including financial services, understand the importance of high-quality data. He has authored three books on the subject: Data Quality: The Field Guide, Data Quality for the Information Age, and Data Quality: Management and Technology.

---

On The Net

www.dataqualitysolutions.com
