Data quality and control are key differentiators that will become increasingly important for financial institutions as margins are squeezed and regulatory requirements intensify, according to a panel of experts.
Speaking at a Duco-hosted session at the FIA International Derivatives Expo, Nick Solinger, President of FIA Tech, estimated that around “50-70% of the headcount in futures operations” are engaged in manual reconciliation processes that could be automated using smarter technology.
“T0 with your client. End-of-day recs against the exchange. T+1 portfolio reconciliations. Doing CCP end-of-day position balance. Fee and commission recs at the end of the month. It's the same trade, over and over again,” added Solinger.
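The reconciliations Solinger lists all boil down to the same operation: matching the same trades across two sources and flagging the differences. As a minimal sketch (the field names, tolerance-free comparison, and sample records are illustrative assumptions, not any vendor's actual data model):

```python
# Hypothetical sketch of a basic two-way reconciliation: match records
# by trade ID across an internal book and an exchange feed, and report
# any missing trades or mismatched fields as "breaks" to investigate.

def reconcile(internal, exchange, key="trade_id", fields=("qty", "price")):
    """Match records by key and report mismatched fields as breaks."""
    exchange_by_key = {rec[key]: rec for rec in exchange}
    breaks = []
    for rec in internal:
        other = exchange_by_key.get(rec[key])
        if other is None:
            breaks.append((rec[key], "missing at exchange"))
            continue
        for f in fields:
            if rec[f] != other[f]:
                breaks.append((rec[key], f"{f}: {rec[f]} != {other[f]}"))
    return breaks

internal = [{"trade_id": "T1", "qty": 100, "price": 99.5},
            {"trade_id": "T2", "qty": 50, "price": 101.0}]
exchange = [{"trade_id": "T1", "qty": 100, "price": 99.5},
            {"trade_id": "T2", "qty": 55, "price": 101.0}]

print(reconcile(internal, exchange))  # [('T2', 'qty: 50 != 55')]
```

The logic is trivial; the point the panel makes is that firms run variants of it by hand, against every counterparty and venue, at every stage of the trade lifecycle.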
One of the main problems is that the systems used at the majority of firms do not appear to be flexible enough to handle modern-day data complexity.
“Data in some cases is locked away in archaic technology and is not freely available and accessible,” said David Pearson, Head of Post Trade at Fidessa. “You've got technology actually acting as an inhibitor to be able to transform data and push data out.”
Christian Nentwich, CEO of Duco, added: “We see a lot of complaints about legacy systems failing across the board. They were built for a world where 12 columns of data or 20 columns of data was the norm. Not 160, or 1600 as we had in one case.”
Banks and other large financial institutions have long been trying to consolidate their data to create one “golden” source, though the panel agreed this was still a long way off.
“Now, as ten years ago, there isn't a single golden source for data,” said Nentwich. “That means as you need to report more and more, and as your internal information flows out the door more and more … you have to be very careful about what you send out.”
Solinger agreed: “With some exchanges, you may see the execution engine using different reference data for the same products from the clearing end of the exchange. Often one can't even get the same data out of different sides of the same exchange! It's really impossible for individual firms to cope with this morass of reference data management and mapping.”
Traditionally, investment in post trade technology has been low compared with the billions spent on execution and algorithmic trading, but there are signs that the problems are now too big to ignore.
“Ten years ago you could always justify the spend in technology on the front office against a revenue return,” said Pearson. “I think that table has completely turned. Now we're starting to see technology and innovation in the post trade space.”
“You’ve got the buy side who previously were prepared to accept the technology that they were given on the basis they weren't paying for it ... That's no longer feasible or tenable,” Pearson added. “All of a sudden, when you're starting to pay for your own technology, you get very interested in value for money and whether it actually improves your current technology capability and lowers your operational cost and risk.”
In turn, that puts pressure back on the brokers to match the demands of their clients.
Having looked at the data quality issues that abound in post trade, and identified that more investment in technology is needed, the panel moved on to discuss the types of solution that could transform the industry.
“Post trade is very different from execution … there’s a lot of manual and repetitive tasks that happen in our industry,” said Nentwich. “In the middle and back office, there's scope for robotisation and it's almost inevitable that will happen at some point. If it's not something that requires creative decision making, in principle it can be automated.”
David Pearson argued that there was still a great deal to do in cleaning up the data and removing manual processes, before robotisation and artificial intelligence could start to make a difference.
“I think that before you apply machine-learning technology, you've actually got to apply some technology in the first place,” Pearson commented. “You've got to have this information and data transmitted electronically, read electronically, and managed so it becomes an exceptions management process. Then you can start applying really smart technologies.”
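Pearson's sequencing can be sketched concretely. Before anything “really smart” is applied, the first layer of automation simply auto-clears whatever agrees and routes the remainder to humans. A minimal illustration (the record layout and the price tolerance are hypothetical assumptions):

```python
# Illustrative sketch of an exceptions-management step: given pairs of
# (our price, their price), auto-match anything within tolerance so
# that only the genuine exceptions reach a human reviewer.

def split_exceptions(pairs, tolerance=0.01):
    """Partition (ours, theirs) price pairs into auto-matched pairs and exceptions."""
    matched, exceptions = [], []
    for ours, theirs in pairs:
        target = matched if abs(ours - theirs) <= tolerance else exceptions
        target.append((ours, theirs))
    return matched, exceptions

pairs = [(99.50, 99.50), (101.00, 101.02), (250.00, 249.995)]
matched, exceptions = split_exceptions(pairs)
print(len(matched), len(exceptions))  # 2 1
```

Only once a pipeline like this exists, Pearson argues, is there a clean, structured exception stream on which machine-learning techniques could usefully be trained.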
Finally, the panel turned their attention to managed and cloud-based services, and whether the financial industry is prepared to embrace these new models.
“I think that people are ready, and they're moving that way,” said Solinger. “Today Amazon, and to some extent Microsoft, have tailored their offerings for the financial end-user. They offer security which I think is hard for a technology provider to offer on their own ... the efficiency and cost savings are profound.”
Pearson wholeheartedly agreed: “The economics of the industry are such that actually service-based technology provision is really the only way to go. Whether that's through managed services being offered by vendors, or it's pure Amazon-type cloud services, or a combination of the two. The reality is, that's the way forward.”