Derivatives reporting regulations are going through a period of immense change. The CFTC Rewrite and EMIR Refit are just two of the updates keeping capital markets firms on their toes. They bring with them a number of changes, from new critical data elements (CDEs) to stricter reconciliation requirements.
How can you navigate this sea of change? We brought together industry experts for an exclusive panel on the challenges ahead. Julia Schieffer, founder and editor of DerivSource, moderated a discussion between Virginie O’Shea, founder of Firebrand Research, Ewen Crawford, Senior Product Owner at Standard Chartered Bank, Loren Schwartz, Senior Director of Regulatory Technology at CIBC Capital Markets, and Duco CEO Christian Nentwich.
All panellists shared their own personal views, which do not necessarily reflect the views of their respective firms, unless otherwise stated.
Here are five key takeaways from their conversation. You can watch the full recording of the event here to get all the insight.
1. There’s huge emphasis on data quality for regulatory reporting
Regulators have been trying to build a clearer picture of systemic risk since the events of the 2008 financial crisis. The first wave of regulation to come after that turbulent period was very rules-based in its approach. This meant the focus was on what was reported and not necessarily how.
As Virginie noted, this has caused a lot of data quality issues, from data in the wrong formats to data missing altogether. This means regulators are now “trying to redress this by rewriting the regulation to make data quality, first and foremost, a better market practice out there.”
“The CFTC initially was less prescriptive, more rules based,” she explained. But the Rewrite “is very much more prescriptive. It’s much more in the vein of what EMIR mark one was all about, and EMIR mark two, certainly, obviously also very prescriptive. So, it’s interesting to see the CFTC take a different tack, but I think they felt that the data quality needed to be improved by being much more prescriptive.”
But ensuring data quality isn’t an easy ask. When asked by our moderator, Julia Schieffer, for the main stumbling block when it comes to data quality, Loren Schwartz replied, “If there were only one.”
“There are a number of key challenges,” he said. “One, of course, is inconsistency around data, so different data sources have different ways of looking at the data. Obviously, that poses a challenge to getting a nice, clean set of outputs.”
Data availability is also a big problem, Loren said. This could be when whole records are missing – for instance, when a system is taken offline. Or it could be that a particular data point is unavailable, such as a Unique Transaction Identifier (UTI) for a trade that needs backfilling.
Then there’s “upstream sources that are outside of our immediate network, industry platforms and markets, etc, that also have a lot of regulatory requirements to meet and may not necessarily cater to everything that we do need.”
Loren concluded: “To do data consistently, with high availability, and you’re measuring everything all in near-real time, it takes a very heavy commitment and investment.”
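The checks Loren describes, completeness across records and consistency across sources, can be sketched in a few lines. This is only an illustration: the field names (`uti`, `notional`, `currency`) stand in for whatever subset of critical data elements a firm actually validates.

```python
# Minimal data-quality sketch: flag missing fields in a trade record,
# and disagreements between two upstream sources for the same trade.
# Field names are hypothetical, not from any regime's schema.

REQUIRED_FIELDS = ["uti", "notional", "currency"]

def quality_issues(trade: dict) -> list:
    """Return completeness issues for one trade record."""
    return ["missing %s" % f for f in REQUIRED_FIELDS if not trade.get(f)]

def cross_source_breaks(source_a: dict, source_b: dict) -> list:
    """Flag fields where two sources disagree on the same trade."""
    return [
        "%s: %r != %r" % (f, source_a.get(f), source_b.get(f))
        for f in REQUIRED_FIELDS
        if source_a.get(f) != source_b.get(f)
    ]

booking = {"uti": "UTI-123", "notional": 1_000_000, "currency": "USD"}
settlement = {"uti": "UTI-123", "notional": 1_000_000, "currency": "EUR"}

print(quality_issues({"uti": None, "notional": 500}))
print(cross_source_breaks(booking, settlement))
```

In practice each rule, and the definition of "agreement" between sources, is far more involved, but the shape of the problem is the same: validate every record, every field, across every source.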
2. Regulatory reporting is a data problem
The renewed focus on data quality was the top concern for attendees at our event, chosen by 55% over things such as the potential for further change, the pressure on manual processes and the cost of compliance.
Christian, our CEO, however, welcomed the shift in perspective: “I find it really refreshing that these days we talk about data quite as much as we do, because I think one of the issues we’ve had in financial services for a few decades now is that we’ve seen everything as a very, very vertical problem.”
“So, we’ve seen these things as regulatory problems and we’re building solutions for specific regulations and those become incredibly fragile. If you want to touch them, you’re doing a $10 million project somewhere.”
Regulatory reporting is “just a very large data exercise”, Loren agreed. And an expensive one, having “taken the industry billions of dollars to get here so far. And I think there are probably billions more to come in our journey before this is all really over,” he said.
“So, it’s just a fundamentally large data problem and the complexity of capital markets certainly means it’s not going to stop.”
3. Adoption of global standards doesn’t mean you can take a standard approach
A move towards adopting global standards is one of the focuses of the CFTC Rewrite and the EMIR Refit. Embracing critical data elements, such as the Unique Transaction Identifier and Unique Product Identifier (UPI), is core to many upcoming regulations, not just the Rewrite and the Refit.
However, as Virginie explained, “there are many changes that are similar in tone but different in terms of implementation.”
“From looking at the different elements, the CDEs are the core of most of the incoming regimes, but the alignment is just not there. So just under half, around about 48%, have been adopted in the same manner by CFTC and ESMA.”
“And that’s just looking at those two pieces of regulation. The rest are not aligned and that can be in format, in definition. And some have just not adopted the same things as each other. They’ve just gone with different options.”
4. Reconciliation is more important than ever – but it’s also getting more difficult
With the heightened focus on data quality comes a renewed emphasis on reconciliation. These vital controls allow you to reduce the number of errors that are reported to the trade repositories, or identify and fix them afterwards.
But it’s something lots of firms are going to struggle with. “The reconciliation part is the really big industry challenge facing us, come December 5, and really for a while thereafter,” Loren said.
This is because, as Christian noted, the new requirements make an already difficult task even harder. “The reconciliations in this space were already very difficult, before any of this happened. They are wide. They are large. Now, you’re increasing the number of fields. The number of firms that we see out there that have squeezed complex regulatory reconciliations into systems that were built for cash and securities is crazy.”
“So, then you’re already breaking up a single process probably into 10 processes just to fit the number of fields in, and your people are logging in and they’re looking in 10 different places. Now we’re going to add a few more fields and they’re going to log into 20 different places, presumably, to see everything that is going on. So, the root cause analysis is hard.”
“Now you have to answer the regulator in seven days exactly on what went wrong and what you’re going to do, so that’s going to be no fun to investigate when you’re looking in 20 different places.”
Under the new CFTC rules, firms will have to report any errors and omissions to the DTCC within seven days and include a remediation plan if they aren’t also able to correct them within the same timeframe.
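Conceptually, the reconciliation the panel describes boils down to a two-sided match keyed on the UTI, with field-by-field comparison on top: trades reported but not booked, trades booked but not reported, and matched trades that break on individual fields. A minimal sketch (the field names and record structure are hypothetical):

```python
# Sketch of a trade-repository reconciliation keyed on UTI.
# Field names are illustrative, not drawn from any regime's schema.

FIELDS_TO_MATCH = ["notional", "currency", "effective_date"]

def reconcile(internal: dict, repository: dict) -> dict:
    """Compare internal trades to repository records, both keyed by UTI.

    Returns omissions (records missing on either side) and field-level
    breaks -- the raw material for an errors-and-omissions report.
    """
    result = {
        "not_reported": sorted(internal.keys() - repository.keys()),
        "not_booked": sorted(repository.keys() - internal.keys()),
        "breaks": {},
    }
    for uti in internal.keys() & repository.keys():
        diffs = [f for f in FIELDS_TO_MATCH
                 if internal[uti].get(f) != repository[uti].get(f)]
        if diffs:
            result["breaks"][uti] = diffs
    return result

internal = {
    "UTI-1": {"notional": 1_000_000, "currency": "USD",
              "effective_date": "2022-12-05"},
    "UTI-2": {"notional": 250_000, "currency": "EUR",
              "effective_date": "2022-12-06"},
}
repository = {
    "UTI-1": {"notional": 1_000_000, "currency": "GBP",
              "effective_date": "2022-12-05"},
}

print(reconcile(internal, repository))
```

Christian's point is that real versions of this run wide rather than deep: hundreds of fields per trade, split across many systems, which is exactly what makes root cause analysis within a seven-day window so painful.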
But regardless of how difficult it is, regulators are watching closely, so it’s important to ensure that your reporting is accurate. As Virginie warned: “don’t forget the regulators have also been investing in what they call suptech [supervisory technology]. And that is with the intent of catching people out. They want to be much more active on enforcement.”
Ewen Crawford, however, put a positive spin on it, noting that “the commonality, ideally, would be uniform across regimes, but we do have roughly half of the content kind of matching up in form, which let’s take that as a slight positive – a baby step in the right direction – in a glass half full way.”
5. The legacy technology stack is reaching breaking point
As suggested above, one of the reasons why data quality is such a difficult goal for firms is because of their legacy technology stack.
As Christian explained, a key challenge we’re seeing is the inability of legacy tech, which is notoriously inflexible, to deal with these problems. “If you have hundreds of systems built on Oracle databases with hard coded schemas, and somebody shows up and goes ‘I need another 90 columns in there and it needs to go over your messaging system, and it needs to flow out’, then suddenly you’ve got big, big, big, lineage and change projects on your hands.”
Firms are using systems to try and achieve things they were never designed for. While sometimes it’s possible to cobble together a solution to a particular problem, the pace and depth of current regulatory change is pushing these systems to breaking point. “Some of the systems are starting to reach fundamental limitations,” Christian said. “I think people are still trying to squeeze trade reporting requirements through systems not built for them at all, and it leads to very, very complex tech projects. Simply, the legacy stack just makes it hard.”
Loren agreed. “One-size-fits-all does not fit in this environment,” he said. “And if people think that ‘Oh, I’ve got a very robust cash management, reconciliation solution’, that it’s going to be able to work for derivatives, that’s going to be a big shock and surprise.”
“Because there’s nothing like a complex swap to break everybody’s brains and processes at the same time. It’s definitely probably the biggest challenge outside of the very large challenge of all the new data elements.”
It’s not just the types of systems, but the way that the process flow is structured that creates problems, Ewen said. The “decade old GTR [global trade repository] engine”, implemented usually for global firms on a global basis, “has EMIR kind of meshed in with CFTC, as well as adding a sprinkle of HKMA, and MAS, ASIC, JFSA potentially.”
Reporting is often system interdependent, with data being drawn from legacy systems upstream and “messaging is particularly dependent on elements that take place across other regimes”.
It’s at times like these, where firms need to deal with a huge amount of change quickly and accurately, that new technology comes into its own. As Christian explained: “Some of the things that have happened in recent years, where we started to build new companies in the data space, in the cloud, as SaaS companies, relying more on machine learning and wrapping around existing architectures without requiring system change – with these events that are happening here, they really come into their own.”
“It gives us an opportunity to work with the industry. We have hundreds of clients on a single version, so we can work with our clients and push best practice out to everybody in one go. Obviously, none of that would have worked in the traditional installed proprietary system world where every bank and asset manager is on their own version.”
Top advice from our panellists
Julia ended the discussion by asking our panellists for a piece of advice, or a final thought, they’d like to share with the audience.
Virginie stressed the importance of making time for testing. “Testing is probably the most important bit of some of this stuff,” she said. “Make sure you carve out enough time with regards to testing with trade repositories and things. I know it’s crunch time but that’s generally when I see the most stress going on out there, is the testing phase.”
Finding time for testing could be especially difficult if our audience polls are indicative of the industry’s general state of preparedness. Only 34% of our audience said they were at the implementation and testing phase for the CFTC Rewrite, despite the proximity of the December 5 deadline.
Ewen reminded everyone to keep their eyes and ears open. “This stuff is changing all the time. There’s a highway of change running into 2025 I think so far around major changes. So it’s essential to plan.”
Loren, meanwhile, highlighted the importance of transparency. “You need to stay transparent with your stakeholders and your business. Building that level of transparency with the business sphere, it’s certainly difficult to do. But it’s really the best defence against incurring regulatory fines or issues down the road. Sharing that information internally with all the key stakeholders is a very important part of the process.”
And finally, our CEO Christian: “Check your assumptions. Technology’s changed a hell of a lot in the last five years, so things don’t have to be done the way they’ve always been done. There’s a lot out there. Get some RFIs out and see what you find.”
Watch the full webinar recording here to catch up on all the great insight from our panel.