
Achieving Data Alpha: Top FAQ’s in Financial Data Management


Financial services has always been a data-driven business. Obtaining accurate and timely data, and with it an information advantage over the competition, has long driven the industry: from carrier pigeons to early automation, and from the low-latency race to modern-day data integration, data governance, and data accessibility technologies that fuel user productivity and informed decision-making.

With an explosion of data sources (the alt data boom), the opportunity to achieve and maintain an information advantage is immense, and so is the challenge. We call this challenge achieving data alpha.

In this blog series, we answer some of the questions we are most often asked, to help firms improve their data management and achieve data alpha.

Q: What does a financial data management solution do?

A: A financial data management solution helps financial services firms effectively source, onboard, cross-reference, quality-check, and distribute financial data. This includes prices for valuation, historical price data for risk and scenario management, and master or reference data such as legal entity data, index data, ESG data, calendar data, financial product terms and conditions, and corporate actions, covering changes in company structures as well as income events such as dividends. Simply put, a data management solution should ensure users and applications are effectively supplied with the data they need to do their jobs. See our Solutions Guide for more information.
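
To make this concrete, here is a minimal sketch, in Python, of the kind of record such a solution manages: a security master entry with cross-referenced identifiers and a simple quality check before distribution. The field and function names are illustrative assumptions, not the schema of any particular product.

```python
# Illustrative sketch only: a security master record with cross-referenced
# identifiers and a basic quality check before distribution.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SecurityMaster:
    isin: str                      # primary identifier
    ticker: Optional[str] = None   # cross-referenced exchange symbol
    lei: Optional[str] = None      # issuer legal entity identifier
    close_price: Optional[float] = None
    currency: Optional[str] = None

def validate(record: SecurityMaster) -> list[str]:
    """Return a list of quality issues; an empty list means ready to distribute."""
    issues = []
    if len(record.isin) != 12:
        issues.append("ISIN must be 12 characters")
    if record.close_price is not None and record.close_price <= 0:
        issues.append("price must be positive")
    if record.close_price is not None and record.currency is None:
        issues.append("price without currency")
    return issues

record = SecurityMaster(isin="US0378331005", ticker="AAPL",
                        close_price=189.95, currency="USD")
print(validate(record) or "OK to distribute")
```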

Q: How do I improve the data quality?

A: To an extent, data quality depends on the use case. There are different aspects of quality that can be measured, including timeliness, accuracy, and completeness, and often there are trade-offs between them. For use in the front office, speed is paramount; in risk and financial reporting, the turnaround time for decision-making is longer, and a different trade-off will be made. Generally put, a data management system keeps track of gaps or delays in incoming data feeds and of any manual interventions that occur. It should differentiate between false positives and overlooked mistakes, and feed that insight back into the configuration of its screening rules. Reporting on Data Quality Intelligence will help optimize the mix of data sources, business rules, and operations specialists.
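
As an illustration of one common screening rule, the sketch below flags day-over-day price moves above a threshold and tracks how often operators dismiss flags as false positives, the kind of feedback that can tune the rule over time. The 5% threshold and the sample prices are made-up assumptions, not a recommended configuration.

```python
# Hedged sketch: flag large day-over-day price moves and measure the
# false-positive rate from operator review, to feed back into rule tuning.
def flag_price_jumps(prices: list[float], threshold: float = 0.05) -> list[int]:
    """Return indices where the relative day-over-day change exceeds the threshold."""
    flags = []
    for i in range(1, len(prices)):
        change = abs(prices[i] - prices[i - 1]) / prices[i - 1]
        if change > threshold:
            flags.append(i)
    return flags

prices = [100.0, 100.4, 107.2, 107.5, 99.1]   # illustrative price series
flags = flag_price_jumps(prices)

# Operators confirm or dismiss each flag; the dismissal rate feeds back
# into threshold tuning and Data Quality Intelligence reporting.
dismissed = 1   # e.g. one flag reviewed and found to be a genuine corporate action
false_positive_rate = dismissed / len(flags) if flags else 0.0
print(flags, false_positive_rate)
```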

Q: How do I reduce my data cost?

A: Financial data costs have been sky-rocketing, reaching $32B in 2019. Data management solutions can help keep tabs on costs simply by streamlining data sourcing and supply, preventing multiple independent entry points. They can also warehouse data to prevent unnecessary repeat requests. Using the quality metrics mentioned above, these solutions help firms make more informed data sourcing decisions. Another aspect of data cost control is that data management solutions can track usage permissions to ensure firms do not breach content license agreements. Lastly, by tracking consumption and other data flows, firms can better match and map costs to individual users and departments.
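
To illustrate the last point, here is a simple, hypothetical sketch of allocating a vendor invoice across departments pro rata to their tracked consumption. The departments, request counts, and invoice amount are invented for the example.

```python
# Illustrative cost allocation: count data requests per department and
# split a vendor invoice in proportion to usage.
from collections import Counter

requests = ["risk", "front-office", "front-office", "finance", "risk", "risk"]
usage = Counter(requests)

invoice_total = 120_000.0          # annual vendor invoice, for illustration only
total_requests = sum(usage.values())

allocation = {dept: invoice_total * count / total_requests
              for dept, count in usage.items()}
print(allocation)  # {'risk': 60000.0, 'front-office': 40000.0, 'finance': 20000.0}
```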

Q: What is data governance?

A: Data governance is a rapidly developing concept that speaks to an organization's capabilities to ensure high-quality data and appropriate controls on that data. It covers a range of topics, including the accessibility of data, clarity on the data assets a firm has through a proper inventory, and documentation of metadata aspects, leading to transparency on where those data sets can be used. For instance, it can include documentation and monitoring of quality metrics, content licensing restrictions, and data sensitivity or regulatory constraints. Data governance counters poor quality and raises awareness of available data, improving business operations and Data ROI. See our Data Quality Intelligence use case for more information.
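
As a rough illustration, the sketch below shows the kind of metadata a governance process might keep per data set, covering ownership, licensing restrictions, sensitivity, and quality metrics, together with a simple check of a requested use against the documented permitted uses. The field names and values are assumptions, not a standard schema.

```python
# Hypothetical data-catalog entry: per-data-set metadata that a governance
# process documents and monitors. Vendor name and figures are placeholders.
catalog_entry = {
    "dataset": "eod_equity_prices",
    "owner": "market-data-team",
    "source_vendor": "VendorX",
    "license": {"redistribution": False, "max_users": 50},
    "sensitivity": "internal",
    "quality": {"completeness": 0.998, "timeliness_sla_met": 0.993},
    "permitted_uses": ["valuation", "risk"],
}

def can_use(entry: dict, use_case: str) -> bool:
    """Check a requested use case against the documented permitted uses."""
    return use_case in entry["permitted_uses"]

print(can_use(catalog_entry, "risk"))            # True
print(can_use(catalog_entry, "redistribution"))  # False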

Q: What is data lineage?

A: Data lineage refers to the ability to track and trace data flows, not just from source to destination but also from end result back upstream. Concretely put, data lineage should explain the values of verified data points by identifying and exposing the process that led to those values, including which sources played a role, which business rules were enacted, and any user interventions that happened along the way. Data lineage is a diagnostic tool for data errors and helps field questions from customers, internal audit, risk, regulators, or other users. Increasingly, it is a regulatory requirement and a common practice in supplying analytical models, as firms realize that the best models in the world will fall flat when fed poor data. See our Managed Data Services capabilities for more information.
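
To give a flavour of lineage capture, the minimal sketch below attaches a trail of steps to a value as it moves through validation and manual intervention, so the end result can be traced back to its original source. The structure, source names, and rule descriptions are illustrative, not a specific product's data model.

```python
# Minimal lineage sketch: each transformation appends a step, so a verified
# value can be traced from end result back to sources, rules and interventions.
from dataclasses import dataclass, field

@dataclass
class TracedValue:
    value: float
    lineage: list[str] = field(default_factory=list)

    def apply(self, step: str, new_value: float) -> "TracedValue":
        """Return a new value with the applied step appended to its lineage."""
        return TracedValue(new_value, self.lineage + [step])

raw = TracedValue(101.3, ["source: VendorX feed, 2024-03-01 17:30 UTC"])
validated = raw.apply("rule: price within 5% tolerance vs. VendorY", 101.3)
final = validated.apply("intervention: analyst confirmed after vendor query", 101.3)

for step in final.lineage:
    print(step)   # full trail from end result back upstream to the source
```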