Why capital markets needs a next-generation operating model
Capital markets has the opportunity to solve problems that have hampered it for decades. This is an innovative industry, yet many firms’ operating models still rely on outdated technology and manual work. The core challenges of financial data are stubborn and have resisted repeated attempts at automation.
These problems can’t be solved with something as simple as a piece of software. They require a much more holistic approach: rethinking the operating model.
The ideal operating model is a much-discussed topic, whether at industry panels or our own Innovation Day event. But why is this such an important question? Can’t firms simply make their existing technology work harder, or increase headcount?
Unfortunately not. Some of the challenges of financial data are exacerbated, if not created, by legacy on-premise technology. There comes a point where it’s not realistic to handle the volume and complexity with the current operating model.
It could well be argued that the industry passed that point a long time ago.
Let’s explore some of the biggest challenges facing financial firms today to understand why a next-gen operating model isn’t a dream, but a necessity.
The cost problem
Cost-cutting will always be on the corporate agenda. The positive revenue outlook for 2025 didn’t change that. According to Morgan Stanley and Oliver Wyman research, that outlook actually led investors to price in higher investment in efficiency. The market expects to see firms tackle the expensive complexity within their operations.
Capital markets firms have traditionally ended up with an operating model built around managing the exceptions caused by bad data. Most firms work on the assumption that bad data is a fact of life and that processes will break. Even leading firms employ thousands of Operations workers to manually compensate for poor data quality.
These ‘Human APIs’ perform tasks such as managing shared inboxes, dual-keying information between systems in the correct schema and reconciling data in spreadsheets.
This manual work is necessary because of the reliance on legacy technology, which is becoming increasingly unfit for purpose as time goes on. These systems can’t innovate or adapt fast enough to meet the needs of modern financial businesses.
Oliver Wyman research in 2023 found that the global spend on operations across investment banking, capital markets and asset management was $35-45bn. Operations departments accounted for 5-6% of total costs for investment banks and capital markets firms, and 8% for asset managers. This reflects the large number of full-time employees (FTEs) in these departments, who account for 11-13% of the industry workforce.
In other words, the current operating model in capital markets is expensive, inefficient and outdated. It’s costing the industry billions of dollars each year.
Regulatory pressure
Firms have navigated some big regulatory changes in the last few years, including rewrites to CFTC, EMIR, ASIC and MAS rules, as well as the shift to T+1 settlement in North America.
There are also new changes coming down the line, not least the switch to T+1 in the UK and Europe planned for October 2027.
All of these changes expose the weakness of an operating model that relies on inflexible technology and on teams who have to wait for things to break before taking action.
A key shift in recent regulatory updates is that a lot more weight is placed on data quality and the controls around it. Both the CFTC and EMIR rewrites made the rules around data quality much more prescriptive, while greatly increasing the number of data points firms needed to report.
Firms are also on the hook for demonstrating that they have robust controls in place to ensure the accuracy of their reporting data.
This is difficult, if not impossible, under the current model. It’s one where opaque manual work fills the automation gaps between legacy systems. No one trusts the data they’re working with, so each team transforms and reconciles it again to suit their purposes and ensure its accuracy.
When it comes to sourcing all this data for reporting, no one’s sure where it is, where it came from, or what’s happened to it while it’s been inside your organisation.
The war for talent
So far, we’ve explored some of the challenges facing firms from a technology or operating model perspective. But what about the impact of these challenges on the people who have to deal with them every day?
This year, digitally native Gen Z is expected to account for 27% of the workforce. These are people who have grown up with technology in the palm of their hand. It’s not surprising that performing the role of ‘Human API’ isn’t an enticing career prospect.
“No one wants to come to work and do the same thing day in and day out and then go home at night and feel like ‘What I did today, I’m not even sure if it’s adding value’,” Charles Juneau, AVP, Operations at Manulife Investment Management said on-stage when we spoke to him at CIBC Fintech Showcase. “I think everyone wants to come in and feel that they’re making a difference.”
Financial firms need to upgrade the employee experience if they hope to attract and retain the best talent. Today’s workers want their jobs to have meaning. Firms have to offer them the chance to harness the latest technology and make an impact on their organisations.
As Stefanie Coleman, Principal, People Advisory Services at Ernst & Young, LLP, United States, says, “you can’t put tomorrow’s talent in yesterday’s jobs. The next generation of workers expect to be digitally enabled in their roles and to do work that they find rewarding – creative, strategic and interesting.”
None of this is going to happen in an operating model at the lower end of the maturity scale. It just can’t function without all that manual work.
The data problem
The challenges we’ve explored above are very different in nature, but a common thread links them all: poor data. Costs are so high because expensive legacy technology fails to provide accurate data, so Operations headcount swells to handle the manual work needed to compensate.
These processes, spread across legions of workers, are opaque and unauditable. There is little, if any, explainability behind them. Where did this data come from? How has it been transformed, enriched or altered? These are questions regulators want answers to – but the answers aren’t there.
And it’s because of poor data quality that Operations teams spend their days dealing with the same issues again and again.
It’s a cliché that data is the new oil. People use the phrase to talk about data’s worth as a commodity – something that can be mined and turned into value.
But there is another way of thinking about data, one that capital markets is beginning to realise: data is like engine oil. The organisational engine is kept alive and working at maximum efficiency by the smooth flow of data. Poor quality data, like poor quality engine oil, causes problems. Parts stop working.
Enter a new operating model
The challenges outlined above are insurmountable with the current operating model. But there is hope. The traditional operating model is based on the capabilities and limitations of technology that was developed several decades ago. But data management platforms have moved on.
No-code, artificial intelligence and cloud computing are all technologies that promise to create the foundations of something new: an operating model that enables firms to rethink how they use data and take a proactive approach to data quality. It is scalable where the traditional model is inflexible, agile where its forebear is not, and it can harness human decision-making and analytical skills.
The change is so dramatic that it’s impossible to mistake it for the operating model firms have relied on for years. The next generation is upon us, ready to meet the enormous need the industry has, and Duco will be here to help guide you every step of the way.
Download our Reconciliation Maturity Model whitepaper below to begin your own operational transformation.