Leveraging Data to Streamline and Optimize Implementations

The discussion at a recent TSAM NY panel underscored the role of data governance in systems implementation

Rich Vivolo, Lead Consultant


“You can’t manage that which you can’t measure” is a business maxim that has only become more true as data has become increasingly available to track and benchmark business processes. While there has been a concerted effort throughout financial services to leverage the operational metrics available through next-generation data management platforms, we are also finding that this growing reliance on data can be just as useful in guiding new system implementations themselves. This was a topic that we discussed as part of a panel at TSAM New York in June.

The principal dilemma for many organizations is whether a data governance program should be in place before an implementation occurs or whether it should come afterward. While the answer often depends on whether an existing governance framework is already in place, the most successful implementations do indeed start with a defined process that, at some point before or during the implementation, incorporates metrics to both educate and motivate key stakeholders.

The importance of measuring and tracking progress during implementations was something that I stressed during the TSAM panel, “Establishing a Firm-Wide Data-Quality Management Strategy,” which was moderated by Accenture’s Mick Cartwright. Each of the panelists, including representatives from a global investment bank and an investment management firm, spoke to the challenges of creating a data governance model, often characterized as a chicken-or-egg predicament. The benefit of prioritizing data quality and governance at the earlier stages of an implementation is that the data can then inform everything that comes afterward.

Some of the key metrics that can be particularly revealing during an implementation include total workflow completion time; both the number of service-level agreements (SLAs) created and the percentage of SLAs met; the number of exceptions processed; and system availability, to name a few. Each of these metrics can generally be correlated to total cost reduction or workforce utilization, serving as critical signposts for understanding whether key assumptions are being met and, if not, which areas require refinement. As one fellow panelist emphasized, data governance can also help stakeholders identify “whose throat to choke” when issues emerge.
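To make these metrics concrete, here is a minimal sketch in Python of how they might be rolled up from implementation tracking records. The WorkflowRun structure, field names, and sample values are hypothetical rather than drawn from any particular platform; the intent is simply to show how completion time, SLA attainment, and exception counts can be reported side by side.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class WorkflowRun:
    """One tracked workflow execution during the implementation (hypothetical schema)."""
    name: str
    started: datetime
    finished: datetime
    sla_hours: float           # agreed completion window for this workflow
    exceptions_processed: int  # exceptions raised and worked during the run

    @property
    def duration_hours(self) -> float:
        return (self.finished - self.started).total_seconds() / 3600

    @property
    def met_sla(self) -> bool:
        return self.duration_hours <= self.sla_hours

def summarize(runs: list[WorkflowRun]) -> dict:
    """Roll up the metrics discussed above: total completion time, SLAs met, exceptions."""
    return {
        "total_completion_hours": round(sum(r.duration_hours for r in runs), 1),
        "slas_defined": len(runs),
        "pct_slas_met": round(100 * sum(r.met_sla for r in runs) / len(runs), 1),
        "exceptions_processed": sum(r.exceptions_processed for r in runs),
    }

if __name__ == "__main__":
    runs = [
        WorkflowRun("daily NAV", datetime(2016, 6, 1, 6), datetime(2016, 6, 1, 9),
                    sla_hours=4, exceptions_processed=3),
        WorkflowRun("month-end close", datetime(2016, 6, 30, 6), datetime(2016, 6, 30, 18),
                    sla_hours=10, exceptions_processed=12),
    ]
    print(summarize(runs))
```

Tracked over successive phases, a summary like this is what lets stakeholders tie the implementation back to cost-reduction and utilization assumptions rather than to anecdote.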

For any implementation, data governance and the use of metrics can also help organizations avoid some of the more traditional pitfalls. One of the more common issues firms will encounter relates to the scope of the project. Most firms are initially inclined to take on more rather than less.

As an example, we have worked with clients that wanted to accelerate their implementations. In one case, an asset management firm was seeking to simultaneously stand up fund accounting systems for both its equities and fixed-income businesses. The challenge was that parallel implementations of this scale demand dedicated resources and careful oversight. The advice we often provide is to be sure you are resourced accordingly to support both your business-as-usual and implementation needs. If not, we recommend limiting the number of initiatives undertaken at one time and, importantly, setting up mechanisms for each one to track progress against specific objectives. The alternative, to quote Yogi Berra, is that you’ll be “lost, but making good time.”

This is where utilization data can come in. It can help an organization prioritize which areas are most important and inform a hierarchy structure that serves as a roadmap for future project sequencing.

Another issue, also related to scope, is that as data sets become cheaper to acquire, many organizations take in far more data than they need. This can create duplicative or, worse, conflicting data that consumes resources to store and manage while adding cost and unnecessary complexity. In one recent implementation, an Eagle client was able to apply utilization metrics to “obsolete” hundreds of data fields and reduce data-source complexity.
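As a rough illustration of that kind of clean-up, the snippet below flags rarely used fields as candidates for retirement. The field names, usage counts, and threshold are invented for the example; in practice the counts would come from whatever utilization reporting the platform makes available.

```python
# Hypothetical field-utilization counts: how often each field was read by
# downstream reports or interfaces during a review period.
field_usage = {
    "security_id": 48210,
    "coupon_rate": 9034,
    "legacy_region_code": 0,
    "alt_issuer_name_3": 2,
}

OBSOLESCENCE_THRESHOLD = 5  # reads per period; an assumed cutoff

# Fields read fewer times than the threshold become candidates to obsolete.
candidates = sorted(
    field for field, reads in field_usage.items()
    if reads < OBSOLESCENCE_THRESHOLD
)
print("Candidates to obsolete:", candidates)
```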

One of the biggest issues discussed during the TSAM panel was the need to secure senior-level buy-in. This is important both to maintain data quality and to fully leverage the capabilities available through new technology. Again, when organizations can quantify the impact of a new system or the need for data governance, it is typically easier to secure leadership’s commitment to the process. Data management for the sake of data management will generally fail to generate the enthusiasm or engender the top-down support needed to effect the cultural change required for success.

Even as one TSAM panelist focused on the “throats to choke” in describing how data can drive change, another emphasized that the carrots are just as important as the sticks. This is particularly true as technology applications extend from IT to business users. When data can be used to quantify the value add, users are that much more inclined to embrace new systems and the work that goes into both the implementation and the development of the muscle memory required to use them fully. This panelist also keenly noted that while technology is an enabler, metrics can be crucial to telling the story of how automation and deeper analytical capabilities can empower employees and advance the business at large.