
Sovereign Investors and ESG: Appeal and Reality

Corinne Neale, Global Head of Business Applications, BNY Mellon Data and Analytics Solutions


Environmental, Social and Governance (ESG) considerations are increasingly front of mind for sovereign investors – including sovereign funds, central banks and public pension funds. In fact, most of the sovereign investors we have come across already consider underlying ESG issues as part of their investment decision-making processes – at least informally.

A discussion at BNY Mellon’s recent Sovereign Academy, however, highlighted that widespread ESG integration continues to be hindered by confusion stemming from the lack of common industry standards and the lack of consistency in ESG data. With hundreds of vendors flooding the market, each offering its own take on sustainability scoring, integration has become a more confusing prospect, creating a paradox of choice.

Anecdotally, most sovereign investors are in fact keen to understand and adopt ESG analysis in their investment strategies. Given their varied mandates and often longer investment horizons, ESG and impact investing can offer a compelling opportunity to put capital to work in sustainable investment strategies and markets. Singapore’s Temasek, for instance, is among those that have already established an impact fund to invest in companies supporting everything from financial inclusion and health to smart cities and companies tackling climate change.

Still, while sovereign institutions will have a critical role in scaling up ESG investments globally, discussions with sovereign stakeholders in recent months have surfaced no shortage of frustrations around the issues currently hindering adoption.

Consider, for instance, the lack of consistency in ESG scores across suppliers. There are currently more than 150 vendors providing data or sustainability rankings, which leads to discrepancies and trust issues in the underlying factor data. Some vendors may reward companies for disclosure alone, whereas others focus more closely on specific metrics reflecting a corporation’s environmental and social impact. Unlike credit rating agencies, which are largely aligned in how they assess and report credit risk, the lack of a common standard for arriving at ESG scores makes it difficult for investors to contextualize the materiality of the rankings.

Another issue relates to the scale of available opportunities. While ESG as a topic continues to surge in popularity, identifying compelling opportunities remains a challenge. The latest GIIN survey, published in June, saw nearly three quarters of respondents (74%) cite the lack of high-quality investment opportunities as either a significant or moderate challenge for those seeking ESG or impact investments. This shortage of opportunities is even more pronounced for sovereign investors with assets under management that can extend to 12 figures. Read More…

Credit Events: When the Effects of Weak Markets Leak into Operations

While credit derivatives provide a valued hedge during downturns, the heavy lifting involved with defaults and bankruptcies requires an automated solution.

Jawann Swislow, Instrument Engineering Analyst


During economic downturns, consequences are felt across market participants. Individual investors may see a dip in the value of their retirement funds, while investment managers or asset owners will likely have to rebalance portfolios or consider more sweeping reallocations into vehicles or asset classes that carry less downside risk. Although derivatives investors may appear more protected through certain hedged positions, they aren’t immune either: many are affected by a lesser-known phenomenon, the credit event.

For the uninitiated, a credit event occurs when an organization is unable to meet its financing obligations. This is most often due to a bankruptcy filing, payment default, or debt restructuring, which can trigger payments on credit derivatives linked to that organization. To fully understand credit events and their potentially sweeping implications, it’s helpful to consider the economic climate roughly a decade before the financial crisis of 2008.

The first credit default swap (CDS) was created in 1994 by Blythe Masters of JP Morgan. The instrument provided a means for investors to take a position on the creditworthiness of almost any organization in the market that carried debt. The CDS is based on a specially designated debt obligation, or reference obligation, provided by the issuer of the financing arrangement.

In a CDS there is a protection buyer, who takes a negative (or short) view on the reference obligation’s creditworthiness, and a protection seller, who takes a positive (or long) view. The buyer pays a quarterly coupon to the seller in exchange for protection against any credit events. In the event of a default, bankruptcy, or other credit event, the deal is terminated and the protection seller must pay the buyer a fee based on the size of the trade’s notional value and the amount of capital that can be recovered through a restructuring. A similar and newer derivative, the credit default index swap (CDX), tracks a basket of 100 reference obligations and trades on a factor after a credit event instead of being terminated. Read More…
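To make the mechanics above concrete, here is a minimal sketch with entirely hypothetical figures: the quarterly coupon the protection buyer pays, and the settlement the seller owes after a credit event (the notional less what can be recovered).

```python
# Hypothetical CDS cash flows. All figures are illustrative, not market data.

def quarterly_premium(notional: float, spread_bps: float) -> float:
    """Coupon the protection buyer pays each quarter (spread quoted per annum)."""
    return notional * spread_bps / 10_000 / 4

def credit_event_payout(notional: float, recovery_rate: float) -> float:
    """Cash settlement the protection seller owes: the unrecovered portion."""
    return notional * (1.0 - recovery_rate)

notional = 10_000_000   # $10m notional
spread_bps = 100        # 100 bps running spread
recovery = 0.40         # hypothetical 40% recovery after restructuring

print(quarterly_premium(notional, spread_bps))    # 25000.0
print(credit_event_payout(notional, recovery))    # 6000000.0
```

The payout formula reflects the intuition in the paragraph above: the higher the recovery on the reference obligation, the smaller the protection seller’s bill.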

Reimagining Performance Measurement in an AI World

A recent panel discussion highlighted two opposing theories around which skillsets will shape performance measurement and attribution in the future.

Mark Blakey, Product Management


In an era of disruption and digital transformation – marked by hundreds of fintech and software vendors coexisting across the asset management landscape – performance professionals may be asking themselves how best to leverage the latest capabilities available to support their business. More pointedly, many are also trying to discern how the performance function itself will evolve and whether technology will alter their role altogether. It can be a polarizing topic.

Specific to performance measurement, competing viewpoints generally emphasize one of two skillsets that will be required as the middle office evolves, pitting technological proficiency against domain expertise. The divide between the two camps will only grow wider until organizations have a better sense of where precisely technology will or won’t fit in. In the meantime, many are left wondering: if technology isn’t going to take their jobs outright, should they be worried that a “technologist” someday will?

Read More…

With Global Sanctions Activity Increasing, How Can Compliance Departments Keep Up?

Joel Kornblum, Global Head of Strategic Alliances & Consultant Relations


If the world was flat in the early aughts, as the economist Thomas Friedman implied in his best-selling book, the past few years have become increasingly bumpy thanks to rising populism and a volatile environment for global sanctions. This doesn’t just pose challenges for portfolio managers trying to stay ahead of geopolitics to manage risk. Compliance departments also have their hands full trying to keep track of certain positions and whether or not there are sanctions in place that ostensibly bar a fund from owning specific securities.

The initiation of several notable sanctions actions in the space of just five months shows how quickly activity is growing. In November 2018, for instance, broad sanctions on Iran were reinstated after the U.S. withdrew from the Iranian nuclear deal, while the Office of Foreign Assets Control (OFAC) added Petróleos de Venezuela to its sanctions program (the SDN list) following an executive order citing human rights abuses by the Maduro regime. OFAC also lifted sanctions on EN+ and Rusal, highlighting the challenge of tracking not only securities that are under sanctions, but also those that no longer are.

In light of how challenging these efforts can be, Eagle hosted a webinar in June, “Managing Global Sanctions Data with Eagle and SIX.” Participants included Jeff Bellemare, Product Manager at SIX; Akhar Mathews, Head of Sales Support at Eagle; and myself.

Read More…

A Close-up on Canada: Data, Investment Performance, Technology and Operational Strategy in Focus

Manuel Tereso, CFA, Consulting Lead, & Mark Goodey, Dip IoD, Director


The investment management landscape in Canada continues to change rapidly. Mounting regulation, technological advancement, changing client demands, business transformation initiatives, and consolidation are presenting asset managers with both new opportunities and new risks. This shift was palpable at two recent events in the region – TSAM Toronto and a client event that Eagle hosted jointly with CIBC Mellon – where common, prominent themes emerged among the operations teams of Canadian asset managers and asset owners.

Mastering Strategic Data Quality
Firms are focusing on creating robust governance frameworks and enhancing the strategic management of their data, as evidenced by the growth of the Chief Data Officer role. At TSAM Toronto, a show of hands was requested from those who did not have a data office within their organisation, and the response made clear that a tipping point has been reached. Previously, the assumption when embarking on a new technology project was that data issues would resolve themselves or would be someone else’s problem to address further down the line. Today, by contrast, many enterprise-level projects and initiatives start with alignment to data as a primary governing thought. Firms are investing significant amounts of time, money, and energy in ensuring the quality of their data. Judging from the topics and interest among the TSAM Toronto delegates, it is evident that data management will continue to be a growing priority.

Read More…

Replacing LIBOR: Transitioning to Risk-Free Rates

Brexit has effectively sealed the fate of LIBOR. The transition to global risk-free rates promises to be more taxing than most organizations are anticipating.

Brian Dunton, Head of Instrument Engineering


LIBOR, the most referenced interest rate benchmark in the world, is due to be phased out starting in 2021. The 2012 LIBOR scandal – in which benchmark rates were manipulated by rogue bankers to benefit their derivatives-trading operations – has resulted in a move toward risk-free rates (RFR). The momentum behind this has only become more acute as financial institutions get their arms around the impact of Brexit.

LIBOR, for the uninitiated, refers to the London inter-bank offered rate and is calculated using appraisals from leading financial institutions in which the banks estimate how much they would be charged to borrow from peer institutions. Risk-free rates, alternatively, are generally calculated as the weighted average rate from actual overnight lending between banks. Given the potential for manipulation, inter-bank offered rates are expected to gradually be replaced by global RFRs. For historical context, the LIBOR benchmark has long been used to calculate financing on swaps, bonds, mortgage-backed securities, bank loans and a host of other financial instruments. The expectation is that deals will start to gravitate toward published risk-free rates. While it sounds seamless, replacing the benchmark with a different index to calculate financing accruals is far more complex than it may appear at first blush.

Consider, for instance, that a vanilla interest rate swap would historically represent a fixed rate versus a floating interest rate hedge based upon the current LIBOR rate. Other economic indicators, such as the yield curve, were also generally factored in at the time the deal was struck. While one leg would remain fixed throughout the life of the deal, the other would reset at each payment period. Different tenors of LIBOR were published and used to calculate swap financing fees for each period.
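One way to see why the transition is more than a find-and-replace exercise: a LIBOR-style leg fixes its rate in advance of the accrual period, while compounded overnight RFRs are only known in arrears, at the end of the period. A rough sketch, with entirely made-up rates:

```python
# Illustrative only: contrasts a term-rate accrual (rate fixed at the start
# of the period, as with LIBOR) against a compounded-in-arrears overnight
# RFR accrual (known only once the period has ended).

def term_rate_interest(notional, annual_rate, days, day_basis=360):
    """LIBOR-style accrual: one rate, fixed in advance, simple interest."""
    return notional * annual_rate * days / day_basis

def compounded_rfr_interest(notional, daily_rates, day_basis=360):
    """RFR-style accrual: compound each day's overnight rate in arrears."""
    growth = 1.0
    for rate in daily_rates:
        growth *= 1 + rate / day_basis
    return notional * (growth - 1.0)

notional = 1_000_000
# Hypothetical 90-day period: a 2.00% term rate versus overnight rates
# drifting from 1.95% to 2.05% across the same 90 days.
term = term_rate_interest(notional, 0.02, 90)
daily = [0.0195 + 0.001 * d / 89 for d in range(90)]
rfr = compounded_rfr_interest(notional, daily)
print(round(term, 2), round(rfr, 2))
```

Even with the same average rate, the two accruals differ slightly because of daily compounding, and the RFR amount cannot be booked until the final fixing is published. That uncertainty in payment timing and amount is part of what makes the migration operationally taxing.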
Read More…

Embracing Analytics: What Asset Management Firms Can Learn from Major League Baseball

Baseball has demonstrated the profound impact of analytics. Although it is primarily a major sport in the United States, the lessons learned from its data transformation have global application. As asset managers navigate a similar data transformation in their own industry, they are looking to close the data and technological gap – and discovering Managed Services as a solution that can help them rapidly and effectively make the shift.

Liz Blake, Global Head of Eagle Managed Services℠


When Michael Lewis first published Moneyball, he documented how one baseball team embraced data. Fifteen years later, this data-first mindset has spread across the league, impacting how teams invest in free agents, how the game is managed, and even how they sell tickets.

For asset managers, this is more than a curiosity. Baseball’s data transformation foreshadows the change currently reshaping the investment business. Just as advanced analytics challenged long-held assumptions and, ultimately, rewrote convention in baseball, asset management now is undergoing a similar transformation.

Model for Asset Management Firms
Incorporating a true evidence-based, data-centric mindset into baseball’s scouting and player selection required new thinking; teams that resisted risked being left behind. Today, investment managers confront similar challenges. They’re not only rethinking what they analyze and how they generate alpha; they’re reimagining their entire operating model to treat data as a true asset. This enables them to redeploy resources – both people and capital – in new and more effective ways.

Historically, baseball managers were relegated to manual tabulations of individual data points, stored on paper and in their memories, to drive decision-making. For example, a great manager may have known that a batter was a good hitter and was likely to get on base. However, in order to win more games, teams needed analytics to know how he got on base (he hit a curveball to shallow right field). Using this information in context, managers were able to hone game strategies, like the “shift,” to get the batter out and put their team in a better position to win. Ineffective uses of resources – such as bringing in a curveball pitcher to face a batter who loves hitting curveballs, or leaving the third baseman and shortstop in their traditional positions waiting for a ball that is rarely going to come – are not winning strategies.

Read More…

Performance Measurement: Controls, Workflows, and Technology

Mark Goodey, Director, Senior Principal of Investment Analytics


Recently, I was fortunate enough to observe a number of thought-provoking presentations and panel discussions as chair of the FTF Performance Measurement Americas Forum in New York. Reflecting on the event, I’ve highlighted some of the key themes I found most impactful.

Improved Controls
Performance teams are under increasing pressure from internal audit teams—and, more importantly, external regulatory bodies—to ensure their data is passed through comprehensive control processing. Once validated, the data is deemed reasonably bulletproof in the eyes of consumers. There’s an acceptance that the performance function acts as a safety net for clients and, therefore, needs to act as a data quality feedback loop to other teams across the business. There is a firm ‘quality control’ component to the performance measurement function, requiring significant oversight of data and robust workflows.

Much of the conversation I witnessed centred on the data management challenges for performance teams, as well as the role of manual ‘eyes-on’ processes versus automation. Given the increase in the volume of data, the sources of data, and the frequency of reporting, it’s apparent that processes and workflows need to be streamlined and the ‘maker-checker-supervisor’ process must be systematised. Ultimately, this comes down to a combination of human and technology processes. It’s essential that exception-based reporting, like that provided by Eagle, replaces manual reconciliation. This approach elevates the human operator’s responsibility to supervise and oversee the data, using tools and dashboards to ensure data accuracy and resolve issues as they occur. At Eagle, one of our key considerations, as part of our continual product enhancement, is to enable any number of “checkers” and “supervisors” in the process at any time in order to satisfy regulatory demands. Eagle’s next Performance software release will introduce the ability to attach any number of ‘flags’ to evidence a sign-off by a stakeholder as part of an approval workflow; these sign-offs will be audit-ready.
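The two ideas above can be sketched generically (this is not Eagle’s implementation; the tolerance, portfolio names, and roles are hypothetical): exception-based checking surfaces only the breaks outside tolerance, and an approval record can carry any number of sign-off flags.

```python
# Generic sketch of exception-based checking plus multi-party sign-off.
# Not a vendor implementation; all thresholds and names are illustrative.

from dataclasses import dataclass, field

TOLERANCE_BPS = 5  # hypothetical break threshold, in basis points

def exceptions(returns_a, returns_b, tolerance_bps=TOLERANCE_BPS):
    """Return only portfolios whose two return sources disagree by more
    than the tolerance -- everything else needs no human review."""
    breaks = {}
    for portfolio, r_a in returns_a.items():
        diff_bps = abs(r_a - returns_b[portfolio]) * 10_000
        if diff_bps > tolerance_bps:
            breaks[portfolio] = round(diff_bps, 2)
    return breaks

@dataclass
class ApprovalRecord:
    """An approval workflow item that accumulates any number of flags."""
    portfolio: str
    sign_offs: list = field(default_factory=list)

    def sign_off(self, who, role):
        self.sign_offs.append({"who": who, "role": role})

    def approved(self, required_roles):
        held = {s["role"] for s in self.sign_offs}
        return set(required_roles) <= held

# Usage: two hypothetical return feeds for three portfolios.
a = {"FUND1": 0.0123, "FUND2": 0.0456, "FUND3": -0.0012}
b = {"FUND1": 0.0123, "FUND2": 0.0462, "FUND3": -0.0012}
print(exceptions(a, b))   # only FUND2 breaks: {'FUND2': 6.0}
```

The design point is the one raised on the panel: the operator reviews only the exceptions, while the record of who signed off, and in what role, remains available for audit.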

Read More…

ESG Data: The Case for Transparency

Joao Sousa Dias, Sales Director


It seems there is nothing hotter in investing today than ESG. While long considered a “tick box” activity, environmental, social, and governance factors have taken centre stage in recent years amid pressure to address issues such as climate change and diversity, as well as societal changes spurred by generational transition. Among investors, though, the catalyst with the biggest impact is the mounting evidence that accounting for ESG factors can improve returns.

“Incorporating ESG analysis into the investment process can add between 50 and 100 basis points per annum to returns,” Arabesque’s Andreas Feiner quantified in an interview with Eagle, adding that it “imparts a slight reduction in the overall risk.” The numbers support the narrative that impact investors have been making for years: companies with high ESG standards are likely to be better run, more resilient to changes in regulation, and less susceptible to fines or reputational issues over the long term.
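Those basis points compound. As a rough illustration, assuming a hypothetical 6% base return and the midpoint of the quoted 50–100 bps range:

```python
# Compounding the quoted ESG uplift over a decade. The 6% base return is a
# hypothetical assumption; 75 bps is the midpoint of the 50-100 bps range.

def growth(initial, annual_return, years):
    """Value of an initial sum compounded at a constant annual return."""
    return initial * (1 + annual_return) ** years

base = growth(100.0, 0.06, 10)        # ~179.08
with_esg = growth(100.0, 0.0675, 10)  # ~192.17
print(round(base, 2), round(with_esg, 2))
```

On these assumptions, a seemingly modest 75 bps per annum widens into roughly 13 points of cumulative return over ten years, which is why the finding resonates with long-horizon investors.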

That’s the good news for investors. And it helps explain the gravitation toward socially responsible investment strategies: some €19.2 trillion is committed to sustainable strategies worldwide, according to the Global Sustainable Investment Alliance. Europe leads the way, accounting for well over half (57%) of professionally invested funds employing sustainable strategies globally. The bad news, however, is that ESG can be a labour-intensive pursuit for firms that don’t have their “data house” in order.

According to the consulting firm Opimas, total spending on ESG data will increase by around 48% in the next two years. Asset managers and asset owners alike are looking to incorporate ESG data to drive both investment decision-making and investment analysis. Furthermore, while ESG factors have traditionally been the preserve of equities, ESG-based fixed-income indices are increasingly emerging. As a result, the demand for ESG data has never been higher and will only continue to grow.

Meeting this demand is easier said than done, however. Available data is scarce, with vendors playing catch-up as ESG strategies multiply. To fill this void, a range of heterogeneous ESG data services have been introduced, yet standardisation—and, more importantly, standards—have yet to materialise.

This is no great surprise, since the regulatory environment is still developing and ESG measurement is still in its infancy. As Andreas Feiner points out, in the last two years regulators have introduced nearly 300 different rules focused on sustainability and corporate governance. While this is likely to improve ESG reporting—and provide greater opportunity for investment decision makers to identify metrics that deliver outperformance over the longer term—in the short term it holds back standardisation.

Read More…

Becoming a Data-Driven Organisation

As investment firms turn to data to help inform and improve investment and operational decision-making, they need to take a logistical, rather than tactical, approach to data management.

Marc Rubenfeld, Head of Sales EMEA


Evidence-based management has become the new normal as organisations in every sector look to improve decision-making and, ultimately, the client experience. This isn’t necessarily new, but what has changed is that today they’re leveraging facts and data instead of relying purely on the gut instincts of their workforce. This has been made possible because technology has evolved to a point where the client experience can be substantially improved by utilising and combining data in new and unique ways. Uber, for example, has combined many different types of data to create an entirely new client experience and business model.

At a high level, this has created a new breed of organisation: the data-driven organisation. A data-driven organisation fundamentally relies on data to conduct business and optimise the client experience. Such organisations typically display several characteristics, including a relentless focus on measuring results and continuous improvement, coupled with frictionless self-service capabilities available to clients. To achieve these characteristics, the value of data must be baked into the organisation’s DNA.

Organisations will need to become data-driven if they want to remain competitive. Like countless technological advances in the past, if your organisation does not embrace the potential of data, it will begin a painful journey to irrelevance. Imagine a business that did not embrace electricity or the telephone; this is the same prospect facing businesses that fail to embrace data today.

The investment industry is no different; firms are more focused on data than ever before as a means to rethink both the client experience and how to perform day-to-day business functions. Specifically, investment managers are undertaking transformation programmes to put data at the heart of their organisation in order to realise the benefits of being data-driven.

Read More…
