Standardizing Reference Data

Paul McInnis, Head of Enterprise Data Management


The financial crisis and the resulting changes to the regulatory environment have focused the attention of investment firms on data management. Greater demands for reporting and risk management have made organizations recognize the necessity of clean and accurate data. At the same time, to help defray the rising cost of compliance, they are increasingly seeking greater efficiency and cost savings.

As a result, we are witnessing a fundamental change in how investment firms view and consume reference data. It’s an area I will be discussing in some detail as part of a panel discussion at the North American Financial Information Summit taking place on May 20th.

Reference data offers little in the way of competitive advantage; rather, it’s a necessary requirement of doing business. Yet acquiring and integrating it is typically an ongoing and resource-intensive undertaking, and many firms spend significant time post-trade repairing their security reference data to fulfill risk and reporting demands.

As a result, investment managers have been looking for alternative ways to manage their reference data and are increasingly exploring the use of third parties, such as Eagle, that can provide managed services. The question is, how far can this go? IT outsourcing was initially slow to take off in the investment community, but the trend has gathered momentum in recent years as the cost savings and efficiencies have become more apparent. We anticipate the trend will only grow as we continue to innovate and enhance our managed services offering to deliver new efficiencies to clients.

For example, we’re working on initiatives to decouple public from private APIs (application programming interfaces), which will enable us to offer a variety of standardized APIs and improve data integration. We are also continuing to extend and develop our ecosystem of data providers and complementary applications – which already includes firms such as MathWorks, FINCAD, MicroStrategy, and Vermilion – allowing us to provide broader solutions with faster time to market.
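To make the idea concrete, the sketch below shows, in Python, how a stable public contract for reference data lookups might be separated from a private implementation that can change freely behind it. The class and method names here are purely illustrative assumptions for this post, not Eagle's actual interfaces.

```python
# Hypothetical sketch of decoupling a public reference data API from a
# private implementation; all names are illustrative, not a vendor's API.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass(frozen=True)
class SecurityRecord:
    """Minimal reference data attributes exposed by the public contract."""
    identifier: str   # e.g. an ISIN
    issuer: str
    asset_class: str
    currency: str


class ReferenceDataService(ABC):
    """Public, versioned contract that integrators code against."""

    @abstractmethod
    def get_security(self, identifier: str) -> SecurityRecord:
        """Return the golden-copy record for a security identifier."""


class InternalSecurityMaster(ReferenceDataService):
    """Private implementation; free to change without breaking clients."""

    def __init__(self, store: dict[str, SecurityRecord]):
        self._store = store

    def get_security(self, identifier: str) -> SecurityRecord:
        return self._store[identifier]


if __name__ == "__main__":
    service: ReferenceDataService = InternalSecurityMaster(
        {"US0378331005": SecurityRecord("US0378331005", "Apple Inc.", "Equity", "USD")}
    )
    print(service.get_security("US0378331005"))
```

Because integrators depend only on the public contract, the internal security master can be reworked or replaced without disrupting downstream data integration.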

Interest in reference data utilities is another trend that has taken off in the last few years. The open question for the industry is how close we are to commoditizing reference data. The obvious appeal, beyond cost reduction, is that reference data utilities would allow investment management firms to redirect internal resources to higher-value work.

These are telling signs of the direction the industry is moving in; however, firms can only rely on standardized reference data up to a point. There is always a portion of data that is specific to each firm and requires a bespoke integration solution to meet its particular requirements and processes.

Reference data is still some way from being commoditized. Indeed, the industry hasn’t yet agreed on a single business ontology. However, steps are being taken in this direction, with the EDM Council introducing the Financial Industry Business Ontology (FIBO) standard. This is a good start, but FIBO still has room to grow in both completeness and adoption. Furthermore, firms looking to move in this direction will need to make a critical and honest self-assessment of where they sit on the Data Management Maturity Model.
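As a rough illustration of what aligning in-house reference data with a shared ontology such as FIBO could look like, the sketch below uses Python's rdflib to express a single security record as RDF triples. The namespaces and class path are assumptions made for this example only and should be checked against the ontology published by the EDM Council at https://spec.edmcouncil.org/fibo/.

```python
# Illustrative only: maps an in-house security record to FIBO-style classes
# using rdflib. The FIBO base IRI and class path below are assumptions for
# this sketch, not verified terms from the published ontology.
from rdflib import Graph, Literal, Namespace, RDF, URIRef

FIBO_SEC = Namespace("https://spec.edmcouncil.org/fibo/ontology/SEC/")  # assumed base IRI
FIRM = Namespace("http://example.com/refdata/")                          # hypothetical in-house namespace

g = Graph()
g.bind("fibo-sec", FIBO_SEC)
g.bind("firm", FIRM)

security = URIRef(FIRM["US0378331005"])
g.add((security, RDF.type, FIBO_SEC["Equities/EquityInstruments/CommonShare"]))  # assumed class path
g.add((security, FIRM["issuerName"], Literal("Apple Inc.")))
g.add((security, FIRM["tradingCurrency"], Literal("USD")))

# Serialize the graph so the shared vocabulary is visible alongside firm-specific terms.
print(g.serialize(format="turtle"))
```

The point of the exercise is that the shared ontology covers the common vocabulary, while firm-specific attributes remain in the firm's own namespace – exactly the split between standardized and bespoke data discussed above.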

We’re in the early stages of defining what the future will look like for reference data, and standardization has to be seen as the ultimate goal. Crucially, there is pressure from the investment community that will only help make this a reality. Before we get there, though, a number of issues still need to be resolved.
