7 Data Sins Series: Serving Multiple Masters
Data management has its paradoxes. One is that firms quite often maintain multiple “master” databases for their price data, their customer data and the terms and conditions of the products they invest in, trade or issue. The record we have seen is a firm with 32 different, widely used databases just for financial product terms and conditions – and that is not counting the large number of small local databases and spreadsheets that also stored some of this information.
The Pitfall of Redundant Data Storage
The “sin” here is clear: avoid storing information redundantly! Keeping the same information in multiple places forces constant reconciliation and cross-comparison and, in general, creates uncertainty about which data points are valid. At best, you can hope to keep your different databases consistent. But at what cost?
Firms have set up multiple databases holding essentially the same data for several reasons:
- Decision-making and budgeting along departmental lines made it easier to build something locally than to establish firm-wide services.
- Historically, a lack of sound data governance and metadata tracking made it difficult to track permissions and data usage when consolidating onto a single database or a single (external) service.
- Departments often added their own custom fields to their datasets, such as identifiers for homegrown applications, specific product taxonomies, industry classifications or derived data elements.
- Departments may have wanted privileged access to a dataset, or performance concerns may have led them to keep their own local copy.
Needless to say, the departments that rely most on aggregated, enterprise-wide information, such as risk and finance, have suffered the most from this fragmented approach to data management and storage, which causes endless rework, reconciliation and verification of data sets.
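To make that cost concrete, here is a minimal sketch of the cross-checking that multiple masters force on a firm. The departmental stores, the ISIN and the field values are entirely hypothetical; a real reconciliation runs across millions of records and far more fields.

```python
# Minimal sketch: the reconciliation work forced by multiple "master" stores.
# The stores, ISIN and values below are illustrative, not real data.

# Each department keeps its own copy of product terms and conditions.
trading_db = {"XS1234567890": {"coupon": 4.25, "maturity": "2031-06-15"}}
risk_db    = {"XS1234567890": {"coupon": 4.25, "maturity": "2031-06-16"}}
finance_db = {"XS1234567890": {"coupon": 4.00, "maturity": "2031-06-15"}}

stores = {"trading": trading_db, "risk": risk_db, "finance": finance_db}

def reconcile(isin: str) -> list[str]:
    """Cross-compare one instrument across all stores and report mismatches."""
    breaks = []
    records = {name: db.get(isin, {}) for name, db in stores.items()}
    fields = {f for rec in records.values() for f in rec}
    for field in sorted(fields):
        values = {name: rec.get(field) for name, rec in records.items()}
        if len(set(values.values())) > 1:  # more than one distinct value
            breaks.append(f"{isin}.{field}: {values}")
    return breaks

for line in reconcile("XS1234567890"):
    print(line)
# Every break needs a human to decide which copy, if any, is right.
```

With three stores this is tedious; with 32, plus local spreadsheets, it becomes a permanent reconciliation workload.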
Modern Solutions with Managed Data Services
Setting up departmental-level data stores may have made some sense ten or even five years ago. With today’s managed data services, however, this is no longer needed, and here’s why:
- Managed data services have come a long way in enabling business users: data can be reached via browsers and enterprise search, integrates smoothly into user workflows, and is exposed through APIs for those who need programmatic access.
- Today’s managed data services take a comprehensive approach to tracking metadata, including permissions, usage rights, quality checks performed and data lineage information – a full account of which sources, formulas or human actions led to a given data value.
- New cloud-based services provide the scalability and uptime required to serve many different departments.
- Providers such as Gresham, via Prime EDM, make it possible to work from a single firm-wide data set while still catering to idiosyncratic local requirements – all in the same service, as sketched below.
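The pattern behind that last point is easy to picture: one shared golden copy of the terms, with each department layering only its own derived fields on top. The sketch below illustrates the idea in plain Python; it is not the Prime EDM API, and every name and field in it is hypothetical.

```python
# Minimal sketch: one shared golden copy, with department-local overlays
# instead of full redundant copies. All names and fields are illustrative.

# Single source of truth for product terms and conditions.
golden_copy = {
    "XS1234567890": {"coupon": 4.25, "maturity": "2031-06-15"},
}

# Department-local augmentations: extra derived fields only, never
# duplicated terms, so there is nothing to reconcile.
local_overlays = {
    "risk": {"XS1234567890": {"internal_rating": "A2"}},
    "trading": {"XS1234567890": {"desk_code": "EMEA-CR-07"}},
}

def get_record(isin: str, department: str) -> dict:
    """Serve the shared terms plus the department's own fields."""
    record = dict(golden_copy.get(isin, {}))  # start from the golden copy
    record.update(local_overlays.get(department, {}).get(isin, {}))
    return record

print(get_record("XS1234567890", "risk"))
# {'coupon': 4.25, 'maturity': '2031-06-15', 'internal_rating': 'A2'}
```

The design point is that local needs are met by augmenting the shared record, not by copying it: a correction to the golden copy reaches every department immediately.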
Keeping redundant copies of data may have made sense at some point to prevent resource conflicts and stop applications or users from waiting for access. The flip side of multiple master databases, however, is redundant entry points for commercial data feeds into the organization, which often leads to avoidable data spend. In our experience, teams are best connected through shared, transparent data assets that integrate easily into their existing workflows and can be augmented to cater to local requirements. Our cloud-native EDM platform does exactly that.