Monthly Archives: December 2017

The Rapid Adoption of Managed Services in Asia

Michelle Wong explains why managed services are proving to be so popular among Asian investment firms intent on improving their data management capabilities

Michelle Wong, Lead Manager, Client Service

Over the last 12 months, I have met with investors across Asia, including both asset managers and asset owners. One common thread spanning those conversations has been the automation of data management processes, but this focus on achieving greater efficiency is no different from that of any other region. The preferred path, however, is unique: many APAC firms have been keen to augment their operating model with managed services and less inclined either to build their own proprietary data management solution or to utilise a third-party system to facilitate the move away from spreadsheets.

Until fairly recently, investment organisations in APAC were able to meet their investment and operational data needs using Excel alone, relying on manual processes and adding headcount as the business grew. However, rising employment costs undermine the feasibility of that approach, particularly as the growing demand for data, both within the organisation and externally, highlights the need for workflow automation.

Read More…

Performance Measurement: From Cost Center to Business Driver

At TSAM Boston, panelists highlighted how the ongoing evolution of the performance measurement function—enabled by technology—is creating a competitive advantage for those able to meet the growing data demands of investors.

Richard Flokos, Performance Product Owner

Recent research from Chestnut Advisory Group, published in November, emphasized the central role of performance reviews in helping asset managers secure lasting relationships with their clients. Even when performance suffers, the report concluded, informed clients are more likely to remain long-term investors when the connection to broader market conditions is clear and communicated in advance.

This takeaway speaks to the organizational shift occurring within many investment firms, as performance measurement and risk teams are increasingly being viewed not as a cost center but as an essential driver of the business at large. This was the very topic that we addressed during a recent panel discussion at TSAM Boston. The consensus among the assembled participants was that as performance teams begin to assume more prominent roles, their value is being recognized both internally, by other operating areas and the front office, and externally, by clients demanding more granularity and color around returns.

According to the Chestnut Advisory Group research, which featured a survey of nearly 90 asset owners and consultants, institutional investors today are particularly focused on performance data that can shed additional light on portfolio positioning, detailed attribution, and outcomes relative to strategy benchmarks. The timely delivery of performance data was also cited as a critical element of performance reviews by nearly three-quarters of the investors polled. Less important, according to the survey, are the macro outlooks of fund managers or discussion of the best- or worst-performing positions.

Read More…

Can I Buy a Vowel? Why the Move from QA to QE Is Necessary for Agile Development

While QA merely provides a quality check before deployment, QE is the process that instills quality into software design

Eric Getchell, Head of Quality and Infrastructure Services

A recent inquiry on the question-and-answer site Quora asked, “What is agile?” A seemingly simple question, yet it drew 40 responses, ranging from basic analogies to complex explanations complete with diagrams and related threads on the various types of agile frameworks. While Eagle has certainly touched on the many benefits of an agile development model (here and here, for example), the transition from QA (Quality Assurance) to QE (Quality Engineering) can help contextualize what an agile model looks like in action and, more importantly, underscore how DevOps translates into improved quality and a faster time to value for clients.

Traditionally, developers have relied on quality assurance analysts to measure the quality of software ahead of deployment. In a customary waterfall operational model—in which products are designed, implemented and verified in sequential order—the quality assurance team is typically the last stop to eliminate any bugs in the code prior to release. Within this model, the role of a QA analyst is primarily to detect defects, measure the impact, and—when they invariably discover an issue—send the code back to the developers to begin the cycle anew.

Quality engineering, in contrast, is about defect prevention rather than defect measurement. QE is effectively an upstream event in which quality engineers work alongside cross-functional development teams to discover and solve issues in real time. Enabled by an agile, iterative development model, the move to QE ensures quality is baked into the software from the onset of development and remains in focus not only up to but also beyond deployment. The process applies quality measurements at build time, allowing for continuous quality gates prior to code submittal. Gone is the protracted feedback loop in which QA and the development team play hot potato with the code base until a fix occurs. In an agile model, quality is simply engineered into the entire process.
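To make the idea of a build-time quality gate concrete, here is a minimal sketch in Python of the kind of check that could run before code is submitted. It is illustrative only: the 80% coverage floor, the reliance on pytest and coverage.py (version 7 or later for the --format=total option), and the script itself are assumptions, not a description of Eagle's actual tooling.

# Hypothetical pre-submit quality gate; tool names and threshold are illustrative assumptions.
import subprocess
import sys

MIN_COVERAGE = 80.0  # assumed minimum acceptable line coverage, in percent

def run_tests_with_coverage() -> float:
    """Run the unit test suite under coverage.py and return the total coverage percentage."""
    # Execute the test suite; a non-zero exit code raises CalledProcessError.
    subprocess.run(["coverage", "run", "-m", "pytest", "-q"], check=True)
    # Ask coverage.py (7+) for just the total percentage as plain text.
    report = subprocess.run(
        ["coverage", "report", "--format=total"],
        capture_output=True, text=True, check=True,
    )
    return float(report.stdout.strip())

def main() -> None:
    try:
        coverage = run_tests_with_coverage()
    except subprocess.CalledProcessError:
        print("Quality gate failed: the test run or coverage report did not complete cleanly.")
        sys.exit(1)
    if coverage < MIN_COVERAGE:
        print(f"Quality gate failed: coverage {coverage:.1f}% is below the {MIN_COVERAGE:.0f}% floor.")
        sys.exit(1)
    print(f"Quality gate passed: coverage {coverage:.1f}%.")

if __name__ == "__main__":
    main()

Wired into a continuous-integration pipeline or a pre-submit hook, a check of this kind surfaces defects while the code is still being written rather than at the end of the cycle.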

Read More…
