Reimagining Performance Measurement in an AI World

A recent panel discussion highlighted two opposing theories around which skillsets will shape performance measurement and attribution in the future.

Mark Blakey, Product Management


In an era of disruption and digital transformation – marked by hundreds of fintech and software vendors coexisting across the asset management landscape – performance professionals may be asking themselves how they can best leverage the latest capabilities to support their business. More pointedly, many are also trying to discern how the performance function itself will evolve and whether technology will alter their roles altogether. It can be a polarizing topic.

Specific to performance measurement, competing viewpoints generally emphasize one of two skillsets that will be required as the middle office evolves, pitting technological proficiency against domain expertise. The divide between the two camps will only grow wider until organizations have a better sense of where, precisely, technology will or won't fit in. In the meantime, many are left wondering: if technology isn't going to take their jobs outright, should they be worried that a "technologist" someday will?

This question can elicit a heated debate. On one side are the Luddites, who not only don't buy into the hype around AI and other disruptive technologies but also don't foresee a day when technology will ever replace a performance analyst. They'll point to social media's "chatbot" failures in the recent past – or other examples of AI gone rogue – noting that when it matters most, there is simply too much risk in handing key decisions to a computer.

On the other side are the advocates – professionals who believe that AI is "definitely" worth the hype. This camp also recognizes that AI and machine learning are already creating a distinction between the haves and the have-nots. They'll point to fund flows gravitating toward algorithmic hedge funds such as Two Sigma or Renaissance Technologies, or to outperformance that suggests these firms, and others of their ilk, see something in the data others can't.

Whether performance analysts believe in the power of disruptive technology will generally influence their perceptions of how their roles may or may not change. At a recent performance measurement conference, for instance, the moderator asked a panel whether those new to the field should learn how to code. It's a question that can hit close to home, particularly for professionals two decades or more into a career who wouldn't know which programming language might be most appropriate (SQL? Python? Unix? All of the above?).

As might be expected, the question evoked strong, conflicting opinions among the panelists. One, who said they never expect to learn even basic programming, argued that performance analysis is less about crunching numbers than about translating those numbers into a compelling story. A differing view, from an unabashed technology advocate, won the room: through coding, they noted, they can complete in half an hour tasks that previously required days or even weeks. They also argued that a coding skillset will prove invaluable as cloud technology ushers in more applications and more types of alternative data, and that these advances could even force analysts' hands as data sets become too large to manipulate in Excel.
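The kind of speed-up the panelist described is easy to illustrate: a few lines of Python can compute period returns for any number of funds at once, a task that grows tedious fast in a spreadsheet. A minimal sketch, with invented fund names and NAV figures:

```python
# Hypothetical illustration: batch return calculation across funds.
# Fund names and NAV figures below are invented for the example.

def monthly_returns(navs):
    """Percentage returns from a list of period-end NAVs."""
    return [navs[i] / navs[i - 1] - 1 for i in range(1, len(navs))]

funds = {
    "Fund A": [100.0, 102.0, 101.0, 105.0],
    "Fund B": [50.0, 49.5, 51.0, 52.0],
}

# One dictionary comprehension handles every fund at once; adding a
# thousand more funds changes nothing about the code.
returns = {name: monthly_returns(navs) for name, navs in funds.items()}

for name, series in returns.items():
    print(name, [f"{r:+.2%}" for r in series])
```

The same pattern scales unchanged from two funds to two thousand, which is the advocate's underlying point.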

A likely middle ground
To be sure, AI and machine learning are already being leveraged across asset management. These technologies today are largely focused on reducing costs and freeing back- and middle-office employees to take on more value-added responsibilities (i.e., the “analysis” component of an analyst’s role). Soon, though, AI will become an even more material factor in how asset managers differentiate themselves. For some, it will define their value proposition or investment edge altogether.

As it relates to the performance function specifically, AI and the ability to manipulate big data will conceivably allow teams to perform attribution analysis at a far more granular level. It could also reposition performance analysis as a "coaching" or assessment tool used internally to distinguish between skill and luck in stock selection, or to identify very specific factors that may correlate to outperformance. To ignore AI altogether is effectively to close the door on career development.
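Granular attribution of the kind described here typically builds on a Brinson-style decomposition of active return into allocation, selection, and interaction effects. A simplified sketch of the Brinson-Hood-Beebower model, with hypothetical sector weights and returns:

```python
# Simplified Brinson-Hood-Beebower attribution.
# All weights and returns below are hypothetical.

def bhb_attribution(wp, wb, rp, rb):
    """Per-sector allocation, selection, and interaction effects.

    wp/wb: portfolio/benchmark sector weights
    rp/rb: portfolio/benchmark sector returns
    """
    effects = {}
    for s in wp:
        effects[s] = {
            "allocation":  (wp[s] - wb[s]) * rb[s],
            "selection":   wb[s] * (rp[s] - rb[s]),
            "interaction": (wp[s] - wb[s]) * (rp[s] - rb[s]),
        }
    return effects

wp = {"Tech": 0.40, "Energy": 0.60}   # portfolio weights
wb = {"Tech": 0.30, "Energy": 0.70}   # benchmark weights
rp = {"Tech": 0.08, "Energy": 0.02}   # portfolio sector returns
rb = {"Tech": 0.06, "Energy": 0.03}   # benchmark sector returns

effects = bhb_attribution(wp, wb, rp, rb)
active = sum(wp[s] * rp[s] - wb[s] * rb[s] for s in wp)
total = sum(sum(e.values()) for e in effects.values())
# By construction, the three effects sum exactly to the active return.
```

Doing this at the sector level is routine; doing it per security, per factor, per day is where big data and automation change the picture.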

That being said, domain expertise and human intuition will be no less valuable in the future. Regardless of how powerful and intuitive AI becomes, domain expertise will be required to outline the objectives of the technology, recalibrate the algorithm when circumstances change, and serve as the arbiter of efficacy. Technological experience will be valuable, of course, but if the programmer doesn't understand the difference between a Sharpe and a Sortino ratio, they will need to pair themselves with a performance veteran who does.
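The distinction in question is small in code but large in meaning: the Sharpe ratio penalizes all volatility, while the Sortino ratio penalizes only downside deviation below a target. A sketch using an invented return series (and assuming a zero risk-free rate for simplicity):

```python
import statistics

def sharpe(returns, rf=0.0):
    """Mean excess return over total standard deviation."""
    excess = [r - rf for r in returns]
    return statistics.mean(excess) / statistics.pstdev(excess)

def sortino(returns, rf=0.0, target=0.0):
    """Mean excess return over downside deviation (below-target only)."""
    excess = [r - rf for r in returns]
    downside = [min(0.0, r - target) for r in excess]
    dd = (sum(d * d for d in downside) / len(excess)) ** 0.5
    return statistics.mean(excess) / dd

# Hypothetical monthly returns
rets = [0.04, -0.01, 0.03, -0.02, 0.05]
# For this series the Sortino ratio exceeds the Sharpe ratio,
# because the large upside months are not treated as "risk."
```

Knowing which ratio a client actually cares about, and why, is exactly the domain knowledge the paragraph above says the programmer must borrow.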

Moreover, the performance analyst of the future will still be counted on to understand the unique needs of clients, and then explain how a specific fund or strategy is positioned to meet these objectives. What are the client's future funding liabilities? What are their return expectations and risk tolerance for a given allocation, in both the near-term and longer-term time horizons? How does a given strategy complement other fund commitments? And how should they interpret performance at a given point in time, in a given market environment? Natural language generation, while it can transform raw data into a conversational narrative, still lacks the intuition and nuance that characterizes the most successful and valued performance analysts today.

But while the Luddites and technology's biggest advocates will portray the debate in black-and-white terms, it is far more gray than most realize. No matter how disruptive it is, technology will not change the performance function overnight. New systems will be implemented, and technology will bring new capabilities to bear, but it will happen over time. And new software will accommodate technologists and non-coders alike.

Another, often overlooked, factor is that as AI and quant-driven strategies continue to influence how investment teams construct and manage portfolios, the performance function will be called on to help investors understand what's going on within AI's "black box." It will require both a technological skillset to explain the model and domain expertise to explain how it generates alpha. It could be one person who understands and explains both facets, but more likely it will be multi-dimensional teams collaborating to paint a more comprehensive picture.

To be sure, the world is changing. Students today are learning technology and analytics as part of their core curriculum. And these skillsets will certainly influence how they work and what they're capable of as they progress throughout their careers. At Wharton, for instance, data and analytics isn't a specialty; it's infused throughout the undergraduate and graduate curriculums. Employers, too, have taken notice. A recent job posting for a performance measurement analyst highlighted that industry experience was "a plus," while technical skills (such as SQL and Python) were "preferable."

So, it should be clear that a technological skillset will influence the performance function in time, and probably far more quickly than some might imagine. But without understanding asset management, or recognizing how certain investing strategies are supposed to meet very specific and complex goals, the technologist will struggle to be effective in a vacuum. It's also in these circumstances that automation can produce the kind of unintended consequences that scare the Luddites. But as performance measurement capabilities inevitably evolve, a balance between a technological skillset and domain expertise will be critical to advance the function in a way that creates a true and material competitive advantage.

 
