Decaying Platforms: Addressing the Growing Risks Posed by Outdated Portfolio Management and Accounting Systems

 

At recent Eagle Investment Systems client events in Chicago and Toronto, Citisoft COO Tom Secaur and Accenture Principal Michael Kerrigan offered their take on why so many asset managers have put off replacing their legacy systems, even as the enterprise risks grow with every passing year.

 

Jeremy Skaling, Managing Director, Global Head of Marketing


The challenge for many asset managers is that it can be difficult to actually define what a legacy system is. Unlike an automobile, there is no odometer from which to measure the lifespan. The most widely accepted definition is that a legacy system describes a platform or a vendor that can no longer keep pace with the growth of the business or evolution of the markets. It might be an internally built system that is no longer fit for purpose; it could be a platform running obsolete technology; or, in this era of consolidation, it could be a vendor that is effectively sunsetting a platform through inattention and waning R&D. The result? Years of customizations and patchwork modifications become twisted and dangerously intertwined. Sooner or later, the negative impact—be it poor data quality, operational inefficiencies or the inability to accommodate new products—can be felt throughout the investment management organization.

“You’ll see an accounting system sitting in the middle of an organization with point-to-point interfaces going everywhere…It’s a warehouse, a client-reporting system, an internal reporting system, it generates analytics, keeps pricing history, you name it,” said Citisoft COO Tom Secaur at the recent Eagle client event in Chicago. While Rube Goldberg would be proud, the prevalence of legacy systems actually represents one of the biggest threats facing many asset managers today.

At one point, these systems were likely fit for purpose and may even have been considered cutting edge when they were implemented. Fast forward 10, 15 or in some cases 30 years, however, and many of these systems have not received the investment required to remain relevant. The impact is becoming even more pronounced as the engineers who developed the original technology and code reach retirement age, leaving precious few professionals who understand how to patch these systems and keep them alive and functioning.

At best, these systems may introduce inefficiencies to the middle and back offices. More often, asset management firms are pushing platforms past their capabilities until these aging systems can no longer keep pace with the volumes of data, the complexity of new products or even new account growth. At worst, however, legacy systems can present a massive risk to an enterprise whose day-to-day business depends on the ability to manage, process and accurately report on the data flowing through the organization.

“There are so many of these firms out there today who are actually operating on systems that can no longer be upgraded,” Secaur said. “They’ll limp along and limp along, and the risk just goes up as they kick the can further down the road.”

Defining a Legacy

For those running legacy systems today, the prevailing questions are often, “How did we get here?” and “How do we get out?”

“Everyone in this room has wrestled with this,” noted Michael Kerrigan, principal at Accenture and a co-founder of Beacon Consulting Group. He spoke to a tendency among many firms that, over time, built upon their existing systems to add updates or modifications as needed, but in the process made their platforms unwieldy and less agile. “It’s often driven by scale, as an organization will create offline tools to support their platforms. It makes sense early on, because you’re supporting new products or meeting front-office needs.”
However, as these offline tools have proliferated within organizations, many firms end up with “controls on top of controls,” introducing several complications. “When you need to actually get the information,” Kerrigan added, “it takes users across the core platform, ‘daisy chaining’ into Access databases, Excel spreadsheets—and navigating through several different sources in an attempt to find any true authoritative data.”

The inefficiencies are obvious, as are the risks posed by poor data quality. Still, getting out from under a legacy system can be a gargantuan task. In many cases, firms may have been exploring transformation projects during the capital markets’ heyday in 2006 or 2007, but these efforts were moved to the backburner amid the global credit crisis. Moreover, these system-replacement initiatives are resource- and capital-intensive endeavors at a time when margins have been shrinking across the industry. As a result, many firms will explore any possible alternative to system replacement until it is apparent that no easy alternative exists. Many neglect to consider that a failed platform upgrade, even one attempted using internal resources, is not cheap either; it can cost millions of dollars and eat up years of time.

“With many larger clients, though, that’s exactly what’s happening; they’ve either tried to upgrade their legacy platforms and failed or they’ve reached the point where they absolutely need a new system because their existing platform is about to tip over,” noted Secaur. This inclination to put off these decisions is slowly changing as the lines between the middle and front office start to blur. Organizations also have a better appreciation of how a data-centric model and straight-through processing drive data accessibility and quality.

Kerrigan also noted that many organizations are just starting to fully understand the potential costs of doing nothing, particularly when looking three or five years out. “Even with conservative growth estimates, when you consider the added scale and volume of data, the cracks in the foundation today start to become really suspect, which is helping to create the business case for many organizations.” He added that these concerns are percolating all the way up to boards of directors, who are increasingly concerned about the enterprise risks created by legacy systems.

When it is time to make the move, Kerrigan and Secaur both agree that this is not a decision to be taken lightly, nor one that should be made in a vacuum. An organization needs to consider where it wants to be in five years and how it intends to get there. If it is seeking to add new products or grow into new regions, for instance, that should be factored into the equation as it weighs on-premise technology against a hosted solution, or considers whether it makes more sense to outsource certain functions or even the entire back and middle office. Yet before determining the desired future state, the essential first step is recognizing existing legacy systems and the significant organizational risk they pose.

This is the first in a series of blogs focused on the impact of legacy systems in asset management. Future posts will explore the impact of consolidation as both a catalyst for obsolescence and a consideration in evaluating new platforms, as well as possible strategies to make the business case for new systems and the ROI exercises that can support those efforts.
