7 May 2026

How to stay operational as private and public assets collide

Duco’s James Maxfield reflects upon the rising popularity of private market assets, and the lessons firms can learn from trending products of the past.

The capital markets industry has always been a hotbed of innovation, with new products evolving rapidly to support client demand or capture market opportunities.

Now it’s the turn of private markets, which are growing fast. Staying operational is going to be the major challenge for COOs and Operations leaders at the forefront of this growth. But I am old enough to remember various iterations of previous product innovations, and I have inherited a few scars along the way.

Two (out of many) stand out to me and create an interesting parallel with the recent growth of alternatives or private assets.

Firstly, let’s recall the world of ‘Petro-credit’ through the ’90s.

Though the Emerging Markets Trading Association (EMTA) pushed some standardisation for Brady bonds (ultimately pushing them towards becoming Eurobonds), Central and South American debt was still very paper-heavy in the mid-’90s. Long documents, bespoke terms and non-standardised reference data made trade capture, pricing and settlement complex and error-prone.

Even with Euroclear providing a common settlement repository, fails were common, overdraft costs punitive and it took a lot of people to settle a low volume of trades.

And this was complicated enough before you threw oil warrants (or obligations as they were known) into the equation. These were usually faxes sent alongside a deal confirmation from the broker, whose real purpose was not really understood by the Operations teams processing them.

Those who remember that time were not surprised by the market turmoil created when the oil price jumped over a decade later. It made those scraps of paper extremely valuable, but subsequently required teams of Operations people to pore over settlement chains to piece together who owned what 10 years earlier, and figure out who owed whom.

(I am personally convinced there are still some of these documents at the bottom of old fashioned filing cabinets in the Square Mile in London…)

Secondly, let’s take a look at the world of credit derivatives (quickly followed by derivatives on anything you could think of, from equities to the weather).

These were latterly referred to (in some quarters) as ‘the weapons of mass destruction’ within financial services.

A similar pattern played out in the early 2000s, when OTC derivatives emerged as a new asset class. Bespoke legal terms and non-standardised data points meant an army of Operations people had to become experts in financial and legal terminology quickly.

This explosion of complexity continued until the crisis of 2008 brought home the reality around non-standardised transactions, where pricing, collateralisation and default provisions subsequently became of great interest to regulators worldwide. It ultimately sparked a decade of punitive regulation that drove standardisation around what had previously been a murky and bespoke world of trading and settlement.

The new (old) problem of private assets

Both of the experiences outlined above had common themes that typically characterise the emergence of a new asset class. This brings useful parallels for COOs and Operations leaders as they try to make sense of the new world of private assets.

1. Bespoke terms. Lack of standardisation means the devil is often in the detail, meaning your Operations teams need to be savvy on legal and financial structuring. Knowing what to extract, where to record it, and how to validate it becomes an important (and expensive) skill to embed across teams - as missing a default provision or a coupon calculation can be extremely costly. This isn’t a matter of simply going to a Bloomberg terminal and sourcing reference data; it requires being able to calculate and model these terms on spreadsheets.

2. Lack of scalability. New products typically don’t fit into a trade capture or settlement system (because they are new, and nothing has yet been designed for them), requiring a lot of manual effort. This is usually around trade processing (booking, confirmation, settlement, collateral management), but it also creates material work within the control environment. In the absence of public data, models are created and checked (and double checked) to ensure cash flows are correct and that valuations have fed into finance and risk correctly (with independent valuations also often added in as a control layer).

Matching or affirmation processes with counterparties become critical as a validation of your valuations (not just to ensure you are agreeing what currency and where to pay). All of this adds layers of effort to an operating model - even at low processing volumes - which isn’t easy to scale.

3. Low processing maturity. Every transaction is potentially new and bespoke, making a lot of market standard processing capabilities irrelevant (think about netting of payments, pricing conventions to drive settlement, indices for benchmarking or third parties who could be a point of reference). This exacerbates the point around lack of scalability.

4. Low MI maturity. The final theme here is actually a by-product of the prior points, where a lack of credible or referenceable data makes oversight challenging. MI and risk metrics become manually intensive to create, and oversight committees (investment, audit or risk) spend as much time debating what the data point is as they do what it tells them. This is the real hidden cost within these assets, as the management overhead to create risk frameworks, manage them and then ultimately report on them is significant.


I recall one exercise I undertook to estimate the amount of management time spent on non-standardised products and the MI cost alone across the team was more than 30%...

An extremely expensive use of senior managers’ time.

All of these factors dictate the operating model design and how processing or controls happen on a daily basis. This is a long way from the high STP, low-touch processing that typically characterises exchange traded business in the public assets world - and is a lot more challenging and costly to manage.

History provides important context for thinking around how to drive automation and control in the absence of maturity and standardisation. And these lessons learned are important reference points for organisations investing in growth markets to capitalise on the upside without losing out to the downside, or to the immaturity of the market. These assets will progress towards maturity and eventually look and feel much like those traded on the public side - but until then, operational leaders will need to recognise the challenges these assets create and flex their models accordingly.

It is within this context that many COOs and Operations leaders find themselves today, as the explosion of growth in private assets in recent years creates challenges for operating models that have predominantly focussed on public markets. We now need to cater for both.

The unique challenges of private assets for your operating model

Let's consider some of the key challenges for operating models that have been traditionally focussed on public assets - and now need to flex to a more hybrid approach.

Asset acquisition

The bidding process for acquiring assets - be these packages of assets or assessing liabilities as part of a pension buy out - is time sensitive and high risk. Being able to extract the key data attributes from long bond confirmations, term sheets or commercial mortgage agreements that are embedded in a 300 page PDF is not a trivial task.

The knowledge required to extract things such as clean up calls, interest triggers and covenants is typically specialised and in short supply. Success here is accurate extraction of the critical data points to support winning the bid process - but failure often results in mis-priced assets and risk in the portfolio that comes back to haunt you further down the line.

There are no easy reference data or market data providers to help compile this; pricing within the buying process has to be built bottom-up.

Data hygiene

Data quality is universally recognised as the key challenge in the private markets space - but the diverse nature of the data sets and the formats they reside in makes a common challenge unique to every player. Be this GP statements, specific capital calls or remittance notices, these data points need to be extracted and stored in the relevant repository to drive processing and risk management.

This is as much a data ingestion and validation challenge as it is one of creating a consolidated ‘golden source’ for data internally. As an example, a remittance notice isn’t just ‘we will pay you $1m’ - it contains data such as tax calculated, capital vs interest and performance fees, all of which need to be accurately captured and reconciled to record financial performance.
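To make that point concrete, here is a minimal sketch of what ‘capturing’ a remittance notice actually means once it is more than a headline figure. The field names and validation rules are hypothetical - every GP and administrator formats these notices differently - but the principle of structuring the components and checking they tie out is the same:

```python
from dataclasses import dataclass

@dataclass
class RemittanceNotice:
    # Hypothetical field set for illustration; real notices vary by GP and format
    gross_amount: float
    capital: float
    interest: float
    performance_fees: float
    withholding_tax: float

    def net_payable(self) -> float:
        # Net cash expected: income components less fees and tax
        return round(self.capital + self.interest
                     - self.performance_fees - self.withholding_tax, 2)

    def validate(self) -> list:
        """Return a list of breaks; empty means the notice is internally consistent."""
        breaks = []
        if round(self.capital + self.interest, 2) != round(self.gross_amount, 2):
            breaks.append("capital + interest does not equal gross amount")
        if self.net_payable() <= 0:
            breaks.append("net payable is not positive")
        return breaks

notice = RemittanceNotice(gross_amount=1_000_000.00, capital=750_000.00,
                          interest=250_000.00, performance_fees=50_000.00,
                          withholding_tax=25_000.00)
assert notice.validate() == []
print(notice.net_payable())  # → 925000.0
```

The value here is not the arithmetic - it is that a break surfaces at ingestion, before the cash flow hits the books.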

This is also a critical part of the onboarding journey, where a myriad of file formats and shapes need to be ingested and recorded accurately from brokers and trustees (for an annuity buy out, as an example). There is a risk premium to both speed and accuracy here, to ensure portfolios can be onboarded and managed.

This type of process can take days or weeks to complete - a far cry from simply buying a package of derivatives in public markets, which are cleared and neatly placed in your clearing account on trade date.

Data integrity

Reconciliation is typically multi-layered, often becoming a chain of compensatory controls to try and prevent inaccurate data slipping through the gaps. Cash flow reconciliation is a major pain point, with data provided in different shapes, formats and conventions (such as interest vs principal), driving high levels of manual intervention.
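To illustrate why conventions such as interest vs principal drive manual effort, here is a minimal sketch of a cash flow reconciliation that normalises split lines into a net amount before matching. The record shape, keys and tolerance are assumptions for illustration, not a prescribed schema:

```python
from collections import defaultdict

def reconcile(internal, counterparty, tolerance=0.01):
    """Match cash flows by (trade_id, value_date) after normalising
    split interest/principal lines into a single net amount per key."""
    def net_by_key(flows):
        totals = defaultdict(float)
        for f in flows:
            totals[(f["trade_id"], f["value_date"])] += f["amount"]
        return totals

    ours, theirs = net_by_key(internal), net_by_key(counterparty)
    breaks = []
    for key in ours.keys() | theirs.keys():
        diff = ours.get(key, 0.0) - theirs.get(key, 0.0)
        if abs(diff) > tolerance:
            breaks.append((key, round(diff, 2)))
    return breaks

# We book one net line; the counterparty reports principal and interest separately
internal = [{"trade_id": "T1", "value_date": "2026-06-30", "amount": 1_050_000.00}]
counterparty = [
    {"trade_id": "T1", "value_date": "2026-06-30", "amount": 1_000_000.00},  # principal
    {"trade_id": "T1", "value_date": "2026-06-30", "amount": 50_000.00},     # interest
]
assert reconcile(internal, counterparty) == []
```

The normalisation step is the point: without it, two perfectly correct sets of books appear to disagree on every line.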

Shadow NAV controls are also commonplace, with Excel files and manual calculations providing an additional layer of control over third-party valuations from administrators. Or, in extreme cases, full-blown independent price verification (IPV) processes that are typically reserved for highly structured products within investment banks.

The absence of trusted data sources - exchanges, index providers, clearing houses - creates high levels of ‘mistrust’ with private assets, leading to a high cost of ensuring data integrity.

Alongside the risk of financial loss (serious enough on its own) is the reputational risk associated with poor controls and manual errors. Strong reputations for investment managers are underpinned by good controls, trust in valuations and an ability to manage risk as much as they are by good performance.

Continual NAV restatements or erroneous cash flows (‘Sorry, we accidentally paid you a remittance for unrealised P&L - can we have our $5m back?’) do not paint a picture of good governance. And will ultimately lose you business.

Trading mandates and compliance

Lack of standardised data (as you would see within an index or maybe an ETF) creates additional headaches around risk and supervision. Where trading mandates and restrictive covenants limit certain activities, these can be challenging to enforce - certainly where these clauses are contained within side-letters or similar legal documents that require extraction to codify.

Where an investor prohibits allocation into assets such as tobacco or arms, this becomes a major compliance risk if this data can’t be captured accurately.
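A restriction like this only becomes enforceable once the side-letter terms have been extracted and codified as data. A minimal sketch of the downstream check, assuming that extraction has already happened (all field names and sector tags here are hypothetical):

```python
def check_mandate(positions, restrictions):
    """Flag holdings that breach codified side-letter restrictions.
    Each restriction carries the investor it applies to and a set of
    prohibited sector tags extracted from the legal documents."""
    breaches = []
    for p in positions:
        for r in restrictions:
            if p["sector"] in r["prohibited_sectors"]:
                breaches.append((r["investor"], p["asset_id"], p["sector"]))
    return breaches

positions = [{"asset_id": "A1", "sector": "tobacco"},
             {"asset_id": "A2", "sector": "infrastructure"}]
restrictions = [{"investor": "Fund X", "prohibited_sectors": {"tobacco", "arms"}}]

breaches = check_mandate(positions, restrictions)
```

The check itself is trivial; the hard (and expensive) part is everything upstream of it - getting the clauses out of the PDFs and into the `prohibited_sectors` set accurately.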

How to stay operational

So, as COOs and Operations leaders grapple with managing private and public assets in a scalable and controlled manner, here are five things to consider to stay ahead of the curve.

1. Compensate, not solve, for bad data

It is ‘Op model 101’ to talk about fixing data at source and driving for standardised inputs and consistency around settlement conventions. This is how the public world has evolved, and there continues to be heavy focus on utility adoption and industry standardisation by the market.

This isn’t relevant here - the market participants are different (trustees, lawyers), the value opportunity is partly driven by pricing the opaque, and the value proposition of standardisation is not relevant to everyone in the processing chain. A lot of maturing is needed over time, and it isn’t going to happen quickly.

So what does this mean in practice? Invest in compensatory tooling or capabilities that ingest, transform, normalise and validate data in hours and days - not weeks or months.
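In code terms, a compensatory approach looks less like fixing the source and more like mapping, validating and quarantining. A minimal sketch, with the column mapping and validation rules as stand-in assumptions for whatever a given broker or trustee actually sends:

```python
def ingest(rows, mapping, validators):
    """Compensating pipeline: map each source row to a canonical schema,
    then validate; bad rows are quarantined for review rather than
    blocking the feed (compensate for bad data, don't wait for the
    source to be fixed)."""
    accepted, quarantined = [], []
    for row in rows:
        record = {canonical: row.get(source) for canonical, source in mapping.items()}
        errors = [name for name, check in validators.items() if not check(record)]
        (quarantined if errors else accepted).append((record, errors))
    return accepted, quarantined

# Hypothetical broker file with its own column names
rows = [{"TradeRef": "T1", "Ccy": "USD", "Amt": "1000000"},
        {"TradeRef": "T2", "Ccy": "", "Amt": "250000"}]
mapping = {"trade_id": "TradeRef", "currency": "Ccy", "amount": "Amt"}
validators = {"has_currency": lambda r: bool(r["currency"]),
              "positive_amount": lambda r: float(r["amount"] or 0) > 0}

accepted, quarantined = ingest(rows, mapping, validators)
```

The design choice is the quarantine: the good rows flow in hours, and the exceptions get human attention, rather than the whole feed waiting weeks for a ‘clean’ source that may never arrive.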

This can be technology, it can be people (outsourcing), it can be a mix of both - but focus on dealing with the now, not demanding the future.

2. Out vs insourcing

At a conference last year I heard that middle office and Operations staff who are proficient in alternative assets are attracting a 30% price premium. This signals the market is stressed and should already be driving some thinking that typically happens further down the maturity cycle.

For public assets, ‘let’s mature our operating model and technology stack, then look to outsource all of it’ has been a common theme driving tech-enabled outsourcing by asset managers to custodians.

Here, the discussion needs to be how to scale as the market grows - and how to scale with a partner that can help. This isn’t about having mature outsourcing offerings (they don’t really exist at scale), this is much more about building this scale with a partner over time and recognition that this market is still evolving. This may be people led at the outset, but will typically converge over time with technology solutions that compensate - and evolve - with the market.

3. AI vs snake oil

Agentic AI is rapidly making an appearance within the CEO budget process, with significant potential for investments to unlock stubbornly people-heavy processes. But just as RPA failed to deliver the promised land a decade ago, caution should also be taken with AI being seen as a panacea.

Specialist vendors may provide out of the box templates for common use cases, but what happens when there is a new document type, changes to documents or the onboarding of a new servicing provider? How does your process adapt to, and adopt, these?

AI may solve for document ingestion or deal capture, but what happens throughout the remainder of the lifecycle? Bringing autonomous agents into the process may bring automation value, but how scalable is your agentic framework?

AI may provide value in certain parts of the lifecycle, but these are expensive tools for higher-volume work (such as reconciliation). How does all of this come together within your processing ecosystem? And how do these AI (and increasingly agentic) solutions integrate?

Caution is needed to avoid solving only part of the problem, or creating more work integrating point solutions than the human-in-the-loop was doing previously. Choosing the right tool for the job is the real challenge for leaders to appraise here.

4. Orchestration

Process, workflow, tooling and data are all relevant to orchestration here - so mixing third parties, solution providers, internal technology and people is critical to get right. Again, this is seen as ‘Op model 101’ to a certain extent, but the sheer variety and bespoke nature of alternative assets makes this quite hard to do in practice.

That’s certainly the case where an organisation brings on new trading desks or portfolio managers focussed on new sub-groups (retail, infrastructure), which essentially creates a new type of product each time. The start point is typically people and spreadsheets, but this needs to evolve quickly to become even moderately scalable.

Best in class here is having a narrow set of solutions or capabilities - which doesn’t have to be automation; it could be people or third parties - to apply to each new product.

Governance then becomes important to ensure consistency where possible and ultimately progress to a consistent set of tooling. And although data maturity (lineage, governance) is a long way away, Operations leaders can drive consistency around things such as control frameworks and MI standards to push for commonality around oversight. This piece is key: fragmentation is the enemy, and it needs to be avoided at all costs.

I see this area as one of the most powerful, immediate actions leaders can take to stay in control: being deliberate with decisions, and enforcing them consistently as the business evolves.

5. Architect - pragmatically - for success

Golden sources, target architecture, data lineage, etc, are all important here. But they have a time and a place. Many of the larger F2B platforms - such as Aladdin - have invested heavily in catering for the growth in private assets and can bring automation and scale.

Coupled with a coherent data strategy and the right sourcing models, they should be a clear part of the architecture vision - but visions need work to become operational reality. And the highly bespoke nature of many of the asset classes that are attracting investment will still drive a high degree of unstructured data and low automation.

These architectures are not cheap or fast to deliver either and are not a one size fits all for every organisation.

Have an architecture and a vision, but recognise the challenges of changing the tyres whilst driving the car. Success here is being incremental and pragmatic, converging your operating models for both private and public assets over time. Not crashing them together.

Stay ahead of private markets

The growth of alternative assets is projected to reach $32tn by 2030, roughly double where it is today. Given the timeline for asset classes to mature, these won’t move into the public domain anytime soon, so this hybrid operating model challenge is one that will persist.

This problem isn’t going away anytime soon for the industry, and the upside for proactively managing the framework is immense…

Discover how we can help you automate complex private market data.