Monthly Archives: September 2017

Macro Platforms, Micro Experiences

As we progress on Eagle’s deployment-model transformation, certain key principles will continue to guide our cloud-native journey.

Steve Taylor, Chief Technology Officer


The consumerization of IT has arrived, and it is having a deep and dramatic effect on our everyday lives. In fact, whether we realize it or not, digitalization has seeped into our subconscious to the point that it has reshaped both our digital interactions and our assumptions about what to expect from technology.

We expect a seamless client experience, which translates primarily into “always on” and a confidence that platforms will learn and evolve over time. Application personalization based on user behavior is more commonplace than many people realize. At home or in the office, the technology and digital platforms we use are focused on personalizing the experience and improving efficiency so we can get the task done faster.

Whether we realize it or not, it is our micro experiences that are critical to the success of the platform. This personalization drives the value of the technology we use—it may be the ease of finding a movie on Netflix that suits our individualized preferences or the ability to settle a credit card with a simple swipe or tap.

The challenge, however, is that old architectures cannot simply be wrapped or encapsulated to deliver these strategies. Legacy platforms were built to support a different set of use cases, and “cloud wrapping” (encapsulating old capabilities in new paradigms) will not deliver the desired business outcomes. These dated systems are often built around batch processes, lack a clear separation of concerns, and historically focus on a small set of super users rather than on information consumers and citizen developers.


Leveraging Data to Streamline and Optimize Implementations

The discussion at a recent TSAM NY panel underscored the role of data governance in systems implementation

Rich Vivolo, Lead Consultant


You can’t manage what you can’t measure: it is a business maxim that has only become more true as data has become increasingly available to track and benchmark business processes. While there has been a concerted effort throughout financial services to leverage the operational metrics available through next-generation data management platforms, we are also finding that this growing reliance on data can be just as useful in guiding new system implementations themselves. This was a topic we discussed as part of a panel at TSAM New York in June.

The principal dilemma for many organizations is whether a data governance program should be in place before an implementation occurs or if it should come afterward. While the answer often depends on whether an existing governance framework is already in place, the most successful implementations do indeed start with a defined process that at some point—either before or during an implementation—can incorporate metrics to both educate and motivate key stakeholders.

The importance of measuring and tracking progress during implementations was something I stressed during the TSAM panel, “Establishing a Firm-Wide Data-Quality Management Strategy,” moderated by Accenture’s Mick Cartwright. Each of the panelists, including representatives from a global investment bank and an investment management firm, spoke to the challenges of creating a data governance model—often characterized as a chicken-or-egg predicament. The benefit of prioritizing data quality and governance at the earlier stages of an implementation is that the data can then inform everything that comes afterward.
