How to conquer your data integrity challenges as derivatives volumes soar
By Rewan Tremethick, Content Manager.
Derivatives trading volumes have soared worldwide this year. The first ten months of the year saw 37.68 billion contracts traded – up 30.7% from the same period in 2019.
October saw a year-on-year jump of 34.7% to 4.01 billion contracts. September saw volume of 4.24 billion contracts, 41.2% higher than a year earlier.
Uncertainty has been the key theme of 2020. The Covid-19 pandemic caused unprecedented moves in the market. Rising case numbers, second wave fears, and vaccine hopes have dominated the headlines.
Central banks threw open the liquidity taps and rates plunged to ultra-low levels. Oil futures went negative and elevated stockpiles still hang over the market. Gold hit record highs on bets of an inflation surge from monetary and fiscal stimulus.
Throw in the ongoing Brexit talks and of course the US Presidential Election and it’s no wonder firms have seen volumes climb.
Why data integrity is key to handling surging volumes
With increased volumes comes increased complexity, which shines a light on the need to improve data integrity.
Doing so allows you to better manage your risk exposure and meet regulatory requirements. For investment management companies in the US, for example, the SEC has just adopted Rule 18f-4. It mandates that a fund’s risk management program must identify and manage:
- Leverage risk
- Market risk
- Counterparty risk
- Liquidity risk
- Operational risk
- Legal risk
This requires accurate data. Poor quality data creates risk, rather than helping avoid it.
You also want to be able to react fast to changes in your mark-to-market exposure, responding to big shifts in the market before collateral calls come in.
A key data integrity challenge for firms comes from the difficulty in reconciling ETD and OTC derivatives trades. These products are numerous and complex. Traditional systems were struggling to keep up even before the pandemic sent trading volumes soaring.
In fact, we spoke to a customer earlier this year whose legacy on-premises solution was only able to match across six fields. This left them undertaking extensive manual work to identify and investigate exceptions.
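To make the matching problem concrete, here is a minimal sketch of reconciling two sets of trade records across several fields. The field names, record values and the `reconcile` helper are all invented for illustration; a real reconciliation platform matches on far more attributes and applies tolerances rather than exact equality.

```python
# Hypothetical multi-field trade matching sketch. Trades that agree on every
# chosen field are paired; everything else surfaces as an exception ("break").

def match_key(trade, fields):
    """Build a composite key from the chosen match fields."""
    return tuple(trade.get(f) for f in fields)

def reconcile(internal, external, fields):
    """Pair trades whose match fields agree; return matches and exceptions."""
    index = {match_key(t, fields): t for t in external}
    matched, exceptions = [], []
    for trade in internal:
        other = index.pop(match_key(trade, fields), None)
        if other is not None:
            matched.append((trade, other))
        else:
            exceptions.append(trade)
    # Anything left unclaimed in the index is an external-only break.
    exceptions.extend(index.values())
    return matched, exceptions

# Illustrative data: one trade agrees on all six fields, one does not.
fields = ["trade_id", "legal_entity", "currency",
          "trade_date", "underlying", "notional"]

internal = [{"trade_id": "T1", "legal_entity": "Fund A", "currency": "USD",
             "trade_date": "2020-11-02", "underlying": "WTI",
             "notional": 1_000_000}]
external = [{"trade_id": "T1", "legal_entity": "Fund A", "currency": "USD",
             "trade_date": "2020-11-02", "underlying": "WTI",
             "notional": 1_000_000},
            {"trade_id": "T2", "legal_entity": "Fund A", "currency": "EUR",
             "trade_date": "2020-11-02", "underlying": "Brent",
             "notional": 500_000}]

matched, exceptions = reconcile(internal, external, fields)
```

Even in this toy form, the limitation the customer hit is visible: with only a handful of match fields, any record that differs on a single attribute falls out as an exception and has to be investigated by hand.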
How firms can evolve to meet the derivatives challenge
So how can you handle the increasing complexity of derivatives markets in order to focus on the opportunities the market presents?
It’s clear that agility is the key. Traditional systems can’t respond fast enough to new products and regulatory requirements. Take reconciliation for example. According to Aite Group, it takes 74 days on average to set up a new reconciliation process on legacy technology. And until it’s built, you need to rely on manual work.
To become more agile you need a data integrity platform that is flexible. There’s a lot of data to track when it comes to derivatives, such as legal entity names, currency, underlying product, trade date, end date, the mark-to-market currency and date, and so on. It comes in many different formats and from different systems.
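One common way to cope with data arriving in many formats from many systems is to normalise each feed into a single canonical schema before matching. The sketch below illustrates the idea with two invented source systems and invented field names; it is not any vendor's actual API.

```python
# Hypothetical normalisation step: map each source system's field names
# onto one canonical schema so records can be compared like-for-like.

FIELD_MAPS = {
    "clearing_broker": {"CPTY_NAME": "legal_entity", "CCY": "currency",
                        "UNDERLYING": "underlying", "TRD_DT": "trade_date",
                        "MTM_CCY": "mtm_currency"},
    "internal_oms":    {"entity": "legal_entity", "ccy": "currency",
                        "product": "underlying", "date": "trade_date",
                        "mtm_ccy": "mtm_currency"},
}

def normalise(record, source):
    """Rename a record's fields to the canonical schema, dropping extras."""
    mapping = FIELD_MAPS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

# The same economic trade, as each system reports it.
broker_row = {"CPTY_NAME": "Fund A", "CCY": "USD", "UNDERLYING": "WTI",
              "TRD_DT": "2020-11-02", "MTM_CCY": "USD"}
oms_row = {"entity": "Fund A", "ccy": "USD", "product": "WTI",
           "date": "2020-11-02", "mtm_ccy": "USD"}

canonical_broker = normalise(broker_row, "clearing_broker")
canonical_oms = normalise(oms_row, "internal_oms")
```

The flexibility the text calls for amounts to being able to add a new entry to a mapping like this without rebuilding the whole process.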
Speed is also essential. You need your end users to be able to build processes within hours, not weeks. No-code solutions and intuitive design make this possible: data integrity projects no longer need business requirements documents and extensive change controls to go live.
It’s important to think ahead and not just address the current challenges. Taking a new approach involves thinking of the opportunities the future may bring. Any data integrity solution needs to be able to work together with other technology, such as RPA.
Combined, these capabilities pave the way for increased operational efficiency, stronger risk management and greater scalability.
Struggling with high volumes? Download our OTC and ETD Clearing guide to find out how you can meet your data integrity challenges with Duco.