Posts by: Teague Duncan

The Rapid Adoption of Managed Services in Asia

Michelle Wong explains why managed services are proving to be so popular among Asian investment firms intent on improving their data management capabilities

Michelle Wong, Lead Manager, Client Service


Over the last 12 months, I have met with investors across Asia, including both asset managers and asset owners. One common thread spanning those conversations has been the automation of data management processes, though this focus on achieving greater efficiency is no different from that of any other region. The preferred path, however, is unique: many APAC firms have been keen to augment their operating model with managed services, and less inclined either to build their own proprietary data management solution or to use a third-party system to facilitate the move away from spreadsheets.

Until fairly recently, investment organisations in APAC have been able to meet their investment and operational data needs using Excel alone, relying on manual processes and adding headcount when necessary as the business grows. However, rising employment costs undermine the feasibility of that approach, particularly as the growing demand for data, within the organisation and externally, highlights the need for workflow automation.

At the same time, firms are increasingly focused on achieving greater scale. Larger firms are getting bigger, and a key element of their growth strategies is the ability to achieve greater operational efficiency. Firms that have a solid data management platform are better positioned to achieve economies of scale and to do so more quickly. In APAC, we are seeing firms scale through rapid asset growth and the extension of investment strategies. As a result, there is a greater need to adopt the same tools and solutions as their counterparts in Europe and North America.

However, instead of looking to buy a solution—as many EMEA firms did when they moved off spreadsheets—asset managers and institutions in Asia are turning directly to managed services.

From a cost perspective, this allows organisations to efficiently import data from multiple sources and then validate and consolidate onto a single platform. From an operational perspective, managed services can deliver actionable investment, performance and analytical data—such as an investment book of record—to a customised environment. Investment firms, in turn, get the benefit of in-depth analytics and business intelligence, but without the upfront cost of a new system implementation or the need to build out a team of experts to run the software.
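To make the idea concrete, here is a minimal sketch of the validate-then-consolidate step described above. The record fields, validation rules, and source hierarchy are hypothetical illustrations, not a description of any vendor's actual platform:

```python
# Hypothetical sketch: validate position records arriving from multiple
# sources, then consolidate them onto a single store keyed by security.
# Field names and rules are illustrative only.

REQUIRED_FIELDS = {"security_id", "quantity", "source"}

def validate(record):
    """Reject records missing required fields or carrying a negative quantity."""
    return REQUIRED_FIELDS <= record.keys() and record["quantity"] >= 0

def consolidate(feeds):
    """Merge validated records from all feeds, keyed by security_id.
    Later feeds overwrite earlier ones, mimicking a simple source hierarchy."""
    master = {}
    for feed in feeds:
        for record in feed:
            if validate(record):
                master[record["security_id"]] = record
    return master

custodian = [{"security_id": "XYZ", "quantity": 100, "source": "custodian"}]
market = [{"security_id": "XYZ", "quantity": 120, "source": "market"},
          {"security_id": "ABC", "quantity": -5, "source": "market"}]  # invalid

book = consolidate([custodian, market])
print(len(book))              # the invalid ABC record is dropped
print(book["XYZ"]["source"])  # the later source wins
```

A managed service performs this kind of validation and consolidation on the client's behalf, at far greater scale and sophistication, which is precisely where the cost and operational benefits come from.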

The drivers and benefits are no different from those spurring interest in managed services in the EMEA region or in the U.S. In those markets, however, most firms looking to adopt managed services must also contend with an added layer of complexity: retiring their legacy systems. By contrast, APAC firms can take advantage of state-of-the-art managed services with greater ease and speed.

Globally, the demand for managed services is only growing as firms look to offload the operational and data management functions that no longer provide a competitive edge. However, asset managers and asset owners in Asia, largely unburdened by the baggage of outdated or obsolete legacy systems, are perhaps in the best position to benefit. With this foundation for successful adoption, we expect the demand for managed services to only become more pronounced in the APAC region.

Performance Measurement: From Cost Center to Business Driver

At TSAM Boston, panelists highlighted how the ongoing evolution of the performance measurement function—enabled by technology—is creating a competitive advantage for those able to meet the growing data demands of investors.

Richard Flokos, Performance Product Owner


Recent research from Chestnut Advisory Group, published in November, emphasized the central role of performance reviews in helping asset managers secure lasting relationships with their clients. Even when performance suffers, the report concluded, informed investors are more likely to remain long-term investors when the connection to broader market conditions is clear and communicated in advance.

This takeaway speaks to the organizational shift occurring within many investment firms, as performance measurement and risk teams are increasingly being viewed not as a cost center but as a fundamental driver of the business at large. This was the very topic that we addressed during a recent panel discussion at TSAM Boston. The consensus among the assembled participants was that as performance teams begin to assume more prominent roles, their value is being recognized both internally, by other operating areas and the front office, and externally, by clients demanding more granularity and color around returns.

According to the Chestnut Advisory Group research, which featured a survey of nearly 90 asset owners and consultants, institutional investors today are particularly focused on performance data that can shed additional light on portfolio positioning, detailed attribution, and outcomes relative to strategy benchmarks. The timely delivery of performance data was also cited as a critical element of performance reviews by nearly three-fourths of the investors polled. Less important, according to the survey, are the macro outlooks of fund managers or discussions of the best- or worst-performing positions.

Read More…

Can I Buy a Vowel? Why the Move from QA to QE is Necessary for Agile Development

While QA merely provides a quality check before deployment, QE is the process that instills quality into software design

Eric Getchell, Head of Quality and Infrastructure Services


A recent inquiry on the question-and-answer site Quora asked, “What is agile?” A seemingly simple question, yet it drew 40 responses, ranging from basic analogies to complex explanations complete with diagrams and related threads on the various types of agile frameworks. While Eagle has certainly touched on the many benefits of an agile development model (here and here, for example), the transition from QA (Quality Assurance) to QE (Quality Engineering) can help contextualize what an agile model looks like in action and, more importantly, underscore how DevOps translates into improved quality and a faster time to value for clients.

Traditionally, developers have relied on quality assurance analysts to measure the quality of software ahead of deployment. In a customary waterfall operational model—in which products are designed, implemented and verified in sequential order—the quality assurance team is typically the last stop to eliminate any bugs in the code prior to release. Within this model, the role of a QA analyst is primarily to detect defects, measure the impact, and—when they invariably discover an issue—send the code back to the developers to begin the cycle anew.

Quality engineering, in contrast, is about defect prevention rather than defect measurement. QE is effectively an upstream activity in which quality engineers work alongside cross-functional development teams to discover and solve issues in real time. Enabled by an agile, iterative development model, the move to QE ensures quality is baked into the software at the outset of development and remains in focus not only up to but also beyond deployment. The process applies quality measurements at build time, creating continuous quality gates prior to code submission. Gone is the protracted feedback loop in which QA and the development team play hot potato with the code base until a fix occurs. In an agile model, quality is simply engineered into the entire process.
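As a rough illustration, the build-time quality gate described above can be expressed in a few lines of code. This is a hypothetical sketch: the metric names and thresholds are illustrative, not those of any particular toolchain.

```python
# Hypothetical sketch of a build-time quality gate: the build fails fast
# when quality metrics fall below agreed thresholds, so defects are
# prevented before code is submitted rather than detected after release.

THRESHOLDS = {"test_pass_rate": 1.0, "coverage": 0.80, "critical_defects": 0}

def quality_gate(metrics):
    """Return a list of gate failures; an empty list means the build may proceed."""
    failures = []
    if metrics["test_pass_rate"] < THRESHOLDS["test_pass_rate"]:
        failures.append("failing tests")
    if metrics["coverage"] < THRESHOLDS["coverage"]:
        failures.append("coverage below 80%")
    if metrics["critical_defects"] > THRESHOLDS["critical_defects"]:
        failures.append("open critical defects")
    return failures

build = {"test_pass_rate": 1.0, "coverage": 0.72, "critical_defects": 0}
print(quality_gate(build))  # gate failures block the submission
```

Running such a check on every build, rather than once before release, is what turns quality from a final inspection into a continuous engineering practice.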

Read More…

How Eagle’s People Will be the Catalysts for Change

Eagle’s Global Head of Change Management and Organization Development discusses Eagle’s strategy to empower and engage frontline employees to drive transformation

Anna Domino, Global Head of Change Management and Organizational Development


There’s a saying in college football that programs are built in the offseason. This is particularly true when there is a change in strategy and coaches are tasked with matching existing players and recruits to new roles and schemes. Change management in financial services is no different. And in the era of digital disruption, it’s the people and their range of skill sets that will dictate whether organizations are successful in effecting large-scale transformations.

McKinsey & Co., in February, highlighted the critical importance of frontline employees in driving business transformation. In a survey of more than 1,600 respondents whose organizations have completed change initiatives in the past five years, the consultancy found a direct correlation between visible frontline engagement in the effort and success in reaching the desired goals.

Eagle’s Chief Technology Officer Steve Taylor recently outlined the principles guiding Eagle’s deployment-model transformation. While it is clear the adoption of a cloud-native architecture and agile business model will bring new efficiencies and deliver material value to clients for years to come, what can be harder to recognize externally are all the different ways we’re engaging with our people to support and sustain the new operating model. As much as this effort will enhance Eagle’s go-to-market strategy, we expect it to be just as impactful in creating opportunities for both existing and future employees.

Read More…

New LGPS Pools Put Spotlight on Data Management Practices

Amit Bharakda, Sales Director EMEA


Local government pension schemes in England and Wales are undergoing their most radical shake-up in years. Currently, the LGPS is organised into 89 pension funds; under the new model, these funds will be combined into eight large investment pools, each managing assets of up to £40bn. One of the central aims of the reform is to reduce investment costs and offer ‘excellent value for money’ by achieving greater economies of scale and introducing improved governance and decision-making frameworks.

Creating the operational structures required to establish a common framework and consolidate the assets of multiple separate entities is no mean feat. As Stephen Doyle, BNY Mellon’s head of UK institutional relationship development for asset servicing, identified in his recent article for the Local Government Chronicle, one of the primary considerations is the authorised entity’s ability to receive a consolidated view of the assets within the pool and to deliver consolidated reporting. Having the right data management practices and platforms in place is vital to achieving this and, ultimately, to delivering on the UK government’s goals of improved efficiency and decision-making.

Read More…

Macro Platforms, Micro Experiences

As we progress on Eagle’s deployment-model transformation, certain key principles will continue to guide our cloud-native journey.

Steve Taylor, Chief Technology Officer


The consumerization of IT has arrived and is having a deep and dramatic effect on our everyday lives. In fact, whether we realize it or not, digitalization has seeped into our subconscious to the point that it has reshaped both how we interact with technology and what we have come to expect from it.

We expect a seamless client experience, which translates primarily into “always on” and a confidence that platforms will learn and evolve over time. Personalization of applications based on users’ behaviors is perhaps more commonplace than people realize. At home or in the office, the technology and digital platforms we use are focused on personalizing the experience and improving efficiency to get the task done faster.

Whether we realize it or not, it is our micro experiences that are critical to the success of the platform. This personalization drives the value of the technology we use—it may be the ease of finding a movie on Netflix that suits our individualized preferences or the ability to settle a credit card with a simple swipe or tap.

The challenge, however, is that old architectures cannot be wrapped or encapsulated to deliver these strategies. Legacy platforms were built to support a different set of use cases, and “cloud wrapping” (encapsulating old capabilities in new paradigms) will not deliver the desired business outcome. These dated systems are often based on batch processes, lack a clear separation of concerns, and were historically focused on a small set of super users rather than information consumers and citizen developers.

Read More…

Vendor Relationships in the Time of Consolidation: Are Your Vendors in it for the Long Haul?

The impending sunset of Barclays POINT was prominent at the recent FTF Performance Measurement Americas (PMA) conference, underscoring the long-term risks of a short-term focus on software

Jeff Cullen, Solutions Principal


While most performance system RFPs focus on a given product’s user interface or the specific features built directly into the software, organizations should also consider the potential for technology debt, which stems from distributed systems and the unhealthy dependencies that often develop with their presence. I recently participated in a panel at the FTF PMA conference in New York entitled “How to Justify System Migration Pain”, and technology debt was clearly top of mind among the panelists, as the recent consolidation among fintech vendors has underscored this growing challenge.

The discussion also featured Jeffrey Malmin, General Director, Performance Reporting at John Hancock Investments; Shankar Venkatraman, Director, Global Head of Performance, Risk, Analytics and Compliance at Citi; Jeremy Welch, Head of US Hub Operations at BNP Paribas; and Jeffrey Bellavance, Manager Performance and Analytics at PanAgora Asset Management.

One takeaway from the event resonated with me as the sole vendor on the panel: the consistent recognition that investment firms and asset managers aren’t simply buying software anymore; they’re entering into a relationship with their vendor.

Read More…

Taking Control of Your Investment Operations

Ravi Patel, Solution Specialist


Operating under cost pressures, reduced margins and tough competition, the manufacturing industry has long embraced lean operations to eliminate wasted resources and time. Adopting automated process governance, from stock control through bill of materials to final packaging, provides an early diagnostic, helps eliminate defects and improves overall operational efficiency for manufacturers. Ultimately, this helps them to control costs, meet quality standards, and maintain the consistency and reliability of their end product.

Lean operations in manufacturing offer a model that many investment managers could apply to their middle- and back-office operations. This idea resonated well with the assembled crowd at TSAM London, where I presented Eagle’s Control Centre. My conversations with various delegates at TSAM confirmed that the reality for investment operations and accounting teams today is very different: siloed, disparate legacy systems force them into fragmented manual workflows, which cause considerable data quality challenges and reconciliation overhead.

Resource-intensive manual interventions such as ledger-to-sub-ledger reconciliation, NAV impact checklists, NAV reconciliation and market data variance checks regularly delay the valuation process. The reactive nature of such manual workflows means upstream data quality issues often surface only once valuations have been calculated. Today, these issues are identified late in the process through in-depth root-cause analysis, costing productivity and often resulting in missed SLAs.
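As an illustration of what automating one such control might look like, here is a minimal ledger-to-sub-ledger variance check. The account names and tolerance are hypothetical, chosen only to show how an automated check surfaces breaks before the valuation is struck:

```python
# Hypothetical sketch: an automated ledger-to-sub-ledger check that flags
# variances above a tolerance, surfacing data issues before the NAV is
# calculated rather than after. Accounts and tolerance are illustrative.

TOLERANCE = 0.01  # acceptable absolute variance per account

def reconcile(ledger, sub_ledger):
    """Return {account: variance} for every account breaching tolerance."""
    breaks = {}
    for account in ledger.keys() | sub_ledger.keys():
        variance = ledger.get(account, 0.0) - sub_ledger.get(account, 0.0)
        if abs(variance) > TOLERANCE:
            breaks[account] = round(variance, 2)
    return breaks

ledger = {"cash": 1_000_000.00, "equities": 5_250_000.00}
sub_ledger = {"cash": 1_000_000.00, "equities": 5_249_750.00}
print(reconcile(ledger, sub_ledger))  # breaks to investigate before the NAV
```

Run continuously as part of process governance, a check like this turns a reactive, post-valuation investigation into a proactive exception queue.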

Read More…

A Journey Through the Evolution of Buy-Side Investment Data Management

Marc Rubenfeld, CIPM, Head of Eagle Solutions EMEA/APAC


At the recent TSAM London event, I had the honour of chairing the data management conference stream. This involved moderating debates, introducing speakers and facilitating conversation throughout the day. In preparation for the event, I began thinking about the history of data management and what I have seen as the stages in its evolution within buy-side investment managers.

Each new innovation or concept in data management has followed a similar evolutionary path, starting with awareness or recognition around the concept itself. Then follows a ripple of early adopters that look to build their own solutions, before vendors step in to refine and improve the innovation with commercial solutions. Over time, these vendor solutions can offer even more cost savings in the form of a managed service.

With each new innovation, this evolutionary cycle has contracted, as firms have become quicker to embrace new concepts and vendors more agile in reacting to client needs. To illustrate this, I put together the diagram below, based on the design of the London Underground map. The horizontal line roughly approximates time, and each evolutionary stage is represented as a station.

Read More…
