11.15.2022

Rebooting Investment Management – Outlook 2023


Over the last few years, global investment management has entered a rapidly shifting phase, with disruption now par for the course. Heightened volatility, inflation uncertainty and geopolitical risk dominate the headlines. As a result, investment managers are walking a tightrope: optimising costs while continuing to deliver alpha to their clients.

In this Q&A, David Yardley, Managing Director, North America at Citisoft, and Christopher Farrell, Head of US and Global Strategy at FINBOURNE Technology, discuss the road to Investment Management 2.0, following on from FINBOURNE’s latest industry report, From Mainframe to Modern Financial Data Stack.

The discussion covers key opportunities around digitalization, how Software as a Service delivers a safe path to technology innovation, and the steps firms need to take now to be change-ready.

Q: What is the most critical challenge and the biggest opportunity for asset managers around digitalization?

David Yardley: Digitalization and the cloud mean that we can no longer use technology as an excuse for poor data delivery against business user requests for information. The data is there, waiting to be used, in the cloud. We are moving away from those four-hour overnight batch cycles, and we can now import a shared database in a matter of minutes.

David Yardley, Citisoft

But this doesn’t solve the problem that we have had for the last 20 years.  It just gets us the data faster. The key issues remain: how do we effectively manage and deliver all of this cloud-based data to our business users? How do we query and analyse the vast arrays of data available in the cloud? How do we mesh cloud-based performance and risk data with cloud-based data from our custodians and third parties? How do we solve for the differences around data classification (e.g. security and transaction types) and data structure (e.g. two-legged swaps) across vendor and service provider cloud data sets?
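The classification and structure questions above can be illustrated with a minimal sketch. All vendor codes, field names, and swap attributes below are hypothetical, invented purely to show the shape of the problem: mapping vendor-specific transaction codes onto one canonical scheme, and flattening a two-legged swap so it can sit alongside flat, single-leg data sets.

```python
# Hypothetical sketch: reconciling classification and structure differences
# across vendor data sets. Every code, field, and value here is illustrative,
# not taken from any real vendor feed.

# Each vendor labels the same economic event differently.
VENDOR_TXN_MAP = {
    "vendor_a": {"BY": "BUY", "SL": "SELL"},
    "vendor_b": {"PURCH": "BUY", "SALE": "SELL"},
}

def normalise_transaction(vendor: str, record: dict) -> dict:
    """Translate a vendor transaction record into a canonical form."""
    return {
        "instrument_id": record["id"],
        "txn_type": VENDOR_TXN_MAP[vendor][record["type"]],
        "quantity": record["qty"],
    }

def flatten_swap(swap: dict) -> list[dict]:
    """Represent a two-legged swap as two canonical single-leg rows,
    so it can be queried alongside flat vendor data sets."""
    return [
        {"instrument_id": swap["swap_id"], "leg": "pay",
         "rate": swap["pay_leg"]["rate"], "notional": swap["notional"]},
        {"instrument_id": swap["swap_id"], "leg": "receive",
         "rate": swap["receive_leg"]["rate"], "notional": swap["notional"]},
    ]

txn = normalise_transaction("vendor_a", {"id": "US0378331005", "type": "BY", "qty": 100})
legs = flatten_swap({"swap_id": "IRS-001", "notional": 1_000_000,
                     "pay_leg": {"rate": "FIXED-3.2"},
                     "receive_leg": {"rate": "SOFR+0.5"}})
```

In practice the mapping tables and canonical schema are far larger, but the design choice is the same: translate every feed into one agreed vocabulary and structure at the point of ingestion, rather than reconciling differences downstream in each business process.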

In practical terms, the solution comes down to implementing new, cloud-first business models, applications, and processes. Our starting point is digitalized data, in the cloud.

Unfortunately, many applications in our industry were designed for overnight batch-based mainframe processing and are siloed by asset class and business function. We will only be able to take full advantage of digitalization and the cloud if we move away from this siloed legacy approach and design and develop a new set of business models, applications, and processes built on cloud technology.

The biggest opportunity for asset managers will be moving to a solution that is truly cloud native, built with multi-asset data considerations as the foundation. The biggest challenge will be resisting the approach we have used for the past 20 years: attempting to stitch together data from siloed, legacy applications. Our colleagues in online web-based industries have embraced cloud-enabled technology for decades. It’s time for us to do the same in asset management.

Reworking the data management model may appear daunting but it should be embraced. The alternative—limping along, stitching together disparate data sets extracted on a batch basis from legacy applications—is no longer an option. The future is digital.

Chris Farrell: I believe David has hit the nail on the head here. Consumer technology has made rapid advances, changing the very way we live, but the Financial Services industry has been far slower in its adoption of technology and transformation.

While the evolution of the mainframe has served global capital markets for some time, it’s important to understand that it was only ever intended to answer specific questions at a specific point in time. We now live in a world of big data, including complex data sets like private markets and digital assets. Investment managers are now struggling to trust or understand their data because the current infrastructure can no longer keep up.

As a result of mainframe inefficiency, firms are now spending far too much time (in some cases, 50% or more of their day) on manual reconciliation, locating data at source, or attempting to figure out positions and exposure from siloed data. This leaves the front, middle and back office unable to respond promptly to market events, to the needs of data-hungry investors, and to increasing scrutiny from regulators.

The biggest opportunity capital markets firms have is to address data at the organizational level, in order to eliminate the operational, reputational and human cost of inadequate data processes. What we have found is that much of the focus on front-to-back systems, data lakes, warehouses and even meshes only proves value after complicated and time-consuming migrations. Even then there is still a gap, because the focus to date has been on chasing an elusive golden source of data, not on how you translate and interpret across all disparate data sets, or confidently reproduce data for client and regulatory reporting.

With the inherent data problems of trust and timeliness unresolved by incumbent providers, the door has been left wide open for SaaS technologies like ours to investigate the barriers and break through them. What we have designed is a future-state data stack that makes sense of the technology investment made to date and solves the final piece of the puzzle: understanding and gaining value from data, to deliver efficiency, transparency and, ultimately, alpha.

Q: What common mistakes should asset managers be cognizant of when embarking on operational change?

David Yardley: The operational change needed to take advantage of digitalization and the cloud is in theory pretty simple: take control of our data. Organizationally, we need to have a structure that allows us to provide business users with the data that they need, when they need it. If we don’t have that data, we need to have an agreed process to get it. Once we have it, we need to govern it and deliver it.

The reality is that when we speak with most asset managers, they’re likely to tell us the same thing, “we have the same data fields provided by three different vendor applications and in four separate data warehouses—and the numbers are all different.”

To achieve successful operational change in the digitalized world, we must go back to basics and rework the core service model we are using to support data and information delivery across the organization.

As part of this rework, we need to examine how we want to support our operations.

Do we want to manage everything in house using our own data team, data owners, data stewards, running our own master data management toolset? Or do we go DaaS and SaaS? Or a hybrid model with some elements of both? Alongside these questions, we must define our preferred target state operating, data, and information delivery model.

Once we have agreed on the target state, we need to develop a roadmap to get there. What businesses need to remember, and get comfortable with, is that there may be multiple “interim states” so that we achieve delivery over time. This is not a revolution; it’s an evolution.

Chris Farrell: So, when we talk to capital markets firms, the core challenges are commonly around discrepancies in data, which stem from the multitude of formats and languages used across the investment ecosystem. Our view here is that firms need to own and control their organizational data if they are to unlock its value.

Christopher Farrell, FINBOURNE Technology

To date, capital markets firms’ options have been largely limited to best-of-breed solutions, outsourcing, or big-bang, front-to-back, multi-year transformations. Over the years, this cycle has led to severe infrastructure complexity and, in many cases, a single-vendor dependency that is now constraining competitive growth and innovation. These traditional models have also contributed to a misconception that operational change is more costly than value-generating.

Interestingly, what many firms don’t always recognize is that the data models mandated by single-platform vendors often compound the data issues faced across a number of workflows, from performance, attribution and exposure calculation, to accounting and reporting.

This is why we believe firms should do business their way, in the language and format they choose to use. It is why we’ve chosen to deliver the means to migrate between data models, so firms have the optionality to understand what they own and how much it is worth at any given point in time, the way they want to see it.

This has significant implications for mission-critical workflows: for example, garnering granular ESG insights and historical analysis to create the secret sauce, or eliminating inefficiencies so firms can interpret and speak the same language as the entities in their ecosystem. It also removes the need for manual daily adjustments to confidently strike the same shadow NAV as your custodian.
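The shadow-NAV check mentioned above can be sketched as a simple position-by-position comparison against the custodian’s valuations, flagging anything that disagrees beyond a tolerance. The figures, field names, and tolerance here are invented for illustration only.

```python
# Illustrative sketch of a shadow-NAV reconciliation: compare in-house
# valuations against the custodian's, flagging breaks beyond a tolerance.
# All values and identifiers are hypothetical.

TOLERANCE = 0.01  # acceptable absolute difference per position, in base currency

def reconcile_nav(in_house: dict, custodian: dict) -> list[str]:
    """Return the instrument ids whose valuations disagree beyond tolerance."""
    breaks = []
    for instrument, value in in_house.items():
        cust_value = custodian.get(instrument)
        if cust_value is None or abs(value - cust_value) > TOLERANCE:
            breaks.append(instrument)
    # Positions the custodian holds that we do not are also breaks.
    breaks += [i for i in custodian if i not in in_house]
    return breaks

in_house = {"BOND-1": 1_000_000.00, "EQTY-7": 250_000.00}
custodian = {"BOND-1": 1_000_000.00, "EQTY-7": 250_150.00}
breaks = reconcile_nav(in_house, custodian)  # ["EQTY-7"]
```

When both sides are already translated into the same data model, a check like this runs automatically on every valuation cycle; the manual work only begins if the breaks list is non-empty.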

Q: How does SaaS differ from the path the industry has taken to date when trying to achieve operational innovation?

David Yardley: Definitions for SaaS vary, but I think that there is a cloud and subscription-based SaaS ideal to which many of us are aspiring. This SaaS offers us a world where we rarely have to plan for a lengthy upgrade process for our core accounting or front office systems or worry that our software version is no longer supported by our vendor.

Some estimates put the cost savings associated with SaaS versus installed applications at 50%, covering IT and infrastructure support and maintenance, reduced platform footprint, removal of upgrade and testing requirements, and reduced operational support.

SaaS, often used as part of an outsourced model, is one of the core options for any organization looking to undertake operational innovation and change. Over the past few years, we have seen a significant move in the asset management industry towards outsourced and SaaS-supported business operating models.  The days of in-house supported systems and large IT empires are being replaced by outsourced service models and increasingly outsourced IT departments.

Unfortunately, the availability of SaaS in the asset management industry is nowhere near the level of other industries. SaaS, in an ideal sense, is software that is accessed on a subscription basis online and in the cloud. The reality in asset management is that many outsourced service models still run on legacy mainframe systems that are 20 to 30 years old. Whilst we are seeing a move to outsourced service delivery models, many of these fall short of what we’d consider true SaaS.

Despite these challenges, there are a number of vendors that are starting to provide true, cloud-native SaaS models in the asset management industry.  The asset management organizations that redesign their target state business, data, and information delivery models to take advantage of SaaS and DaaS solutions will be the winners as the industry reaches cloud maturity.

Chris Farrell: On that last point, reaching the desired future-state model is absolutely what firms should start thinking about now, and that starts with getting data right at the foundational level. It’s why we have inverted the model used by incumbent providers in the industry (where investment functionality is addressed first and data management second), so that firms can interpret data in a way that delivers meaning and feeds cleanly and instantaneously into the investment functionality built on top.

However, tackling the fundamental problem of understanding, accessing and controlling the data is something the industry needs now, not in five years’ time. We need to stop with the big bang revolutionary thinking and start taking an evolutionary approach that is empathetic to the fact that firms are not operating from a blank slate.

A safer SaaS-native path to technology innovation enables businesses to focus on the core of their business rather than the distraction of a multi-year transformation. It means starting small with mission critical areas, realising the value quickly and building the business case for further change. It’s operational change on your terms and it’s what we believe will drive the process of improving efficiency and transparency in capital markets.

Challenging traditional constructs, we have designed a Modern Financial Data Stack, in the form of a cloud-native, interoperable data store that reduces the complexity within the existing operational estate and addresses the industry’s low appetite for risk by opening up existing ‘closed’ systems.

What we do not believe in is creating yet another monolithic data store. Leveraging a host of SaaS capabilities, the Modern Financial Data Stack forms a data hub that can adapt as needs change and the markets evolve. It also builds a bridge to external innovation and emerging technologies, via open APIs, so firms can augment investment data processes and outsource non-differentiating activities to form a hybrid model.

Effectively, achieving safer, better and faster data through a SaaS-native and interoperable approach is what we believe will enable firms to decommission and upgrade technology on their terms, winning back control, productivity and operating margins.

Q: What are some of the practical first steps firms need to take to be change-ready?

David Yardley: The cloud-enabled digital future starts with data. Data is the foundation upon which the rest of the business functions will be built. Central to this foundation is a comprehensive data management model that acts as “the single source of truth” and enables data provision to the rest of the organization. A well-designed data management model will supply dependable and verified data across business processes, simplify operations, remove manual processing, improve accuracy, and reduce operational risk.

One of the first key decisions for the data management model is whether to go in-house or DaaS. Or, as is increasingly the case, a hybrid of the two. Historically, asset management organizations have spent substantial resources building in-house master data management systems and support functions.  This is still an option, but increasingly we are seeing asset managers looking to vendors and service providers who can deliver DaaS and SaaS based solutions. These options come with pre-defined templates, configurable business and data workflows, and add-on service options supporting data gathering, mastering, transformation, and reconciliation.

Given that data is the cornerstone of the organization, most asset managers will lean towards some element of in-house data ownership which makes a hybrid data management model an appealing option. The hybrid model uses DaaS and SaaS elements to support data delivery but still has data ownership and data stewardship as in-house functions. Over time this model can be tweaked to increase or decrease the elements of the model supported internally or externally.

Chris Farrell: The echoing themes around multiple sources of truth, the difficulty of achieving high-quality data, and how to interpret and gain value from complex data sets, resonate strongly with the mission we are on, to transition global Financial Services out of the past and make it future-fit.

To do this in the quickest possible time, and without ripping it all out and starting again, there are some first steps firms need to take in the journey to being future-ready:

Firstly, we believe leveraging clean, interoperable data should be a clear business priority. To address data-led change and to make the transition from cost centre to value generation, firms will need to start asking the uncomfortable questions about the costs they bear and the capabilities they are missing, not at the operational level but at the business level, with senior leadership buy-in:

  • What is the opportunity cost being missed today?
  • Does the budget split between keeping the lights on and discretionary spend support the organization’s needs?
  • Are the data and operations ‘good enough’ to survive today and deliver resiliency tomorrow?

Secondly, I think it’s important to acknowledge that while we are dealing with technology decisions, there are humans behind those decisions. Inaction is one of the biggest issues we face when it comes to operational change and migration scoping. We recently asked buy-side participants at a US workshop what kind of risk appetite they had for operational change; only 12% were willing to take on high-risk change. As vendors, we need to address this low appetite for risk if we are to support firms in addressing the inefficiencies impacting them and becoming future-ready.

De-risking the process of change, through Software as a Service (SaaS) investment data management technology, empowers firms to focus on the core of their business rather than the distraction of a multi-year transformation. By delivering instantaneous access to trusted data and the ability to interpret data on their own terms, we can break through the traditional cycle and future-proof the industry – while reducing operational, reputational and human cost.

Securing this future, however, requires actionable change today, and recognizing that what was once good is no longer good enough.
