By Erica Borghi
We had the pleasure of hosting our Fall Forum in Baltimore this September, gathering industry leaders from across the securities operations ecosystem. From the general sessions to the working group meetings, the conference fostered productive discussions about where the financial services industry is headed and how we can best position our firms for the future. After reflecting on my biggest takeaways, it became clear that everything boiled down to one thing: the importance of data standards.
Why data standardization is important
We live in a new digital world, with 90% of the world’s data having been created in the past two years. Given this mounting growth of data year after year, we need to develop standards in order to make use of the information available to us. In the Future of Reference Data session at the event, panelists discussed the growing need for data, with moderator Genevy Dimitrion of State Street Corporation commenting that people will always want access to this extensive information in a usable format; otherwise, it’s ineffective. Putting standards in place organizes the data and streamlines operations to make it accessible.
In the same panel, John Bottega of EDM Council pointed out that the World Health Organization is able to have the whole world agree on one set of medical standards, yet the U.S. financial services industry is still unable to come to a consensus. The lack of uniform data standards creates serious operational, structural, and communication issues that inhibit progress.
Walk before you run: Fintech’s need for high-quality data
Everyone always talks about how emerging technologies like blockchain, DLT, and AI will revolutionize the financial services industry, but we can’t get there without high-quality data in place. And how do we get high-quality data? Standardization. In the Washington Politics and Financial Services Infrastructure: Making Data Standards Sexy session, Hudson Hollister of Data Coalition said, “Blockchain-based reporting will only happen if there’s a consistent data structure in place.”
Panelists in the Future of Reference Data session agreed. “The focus is on delivery mechanisms like blockchain, but first we need to focus on structuring the data,” said Bottega.
Scott Preiss, CUSIP Global Services, added, “We have to apply new technologies without forgetting the fundamentals that are so critical.”
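Those fundamentals can be as concrete as identifier standards. A U.S. security’s 9-character CUSIP, for instance, becomes a globally interoperable 12-character ISIN (ISO 6166) by prefixing a country code and appending a modulus-10 “double-add-double” check digit. The sketch below is illustrative only (the function name and example value are mine, not from the event):

```python
def cusip_to_isin(cusip: str, country: str = "US") -> str:
    """Build an ISO 6166 ISIN from a 9-character CUSIP.

    The check digit uses the standard modulus-10 double-add-double
    (Luhn-style) scheme: letters expand to two digits (A=10 .. Z=35),
    every other digit counting from the right is doubled, and the
    digits of the products are summed.
    """
    body = country + cusip.upper()
    # Expand each character to its numeric value, e.g. "U" -> "30", "3" -> "3".
    digits = "".join(str(int(c, 36)) for c in body)
    total = 0
    # Double every other digit, starting from the rightmost one.
    for i, d in enumerate(reversed(digits)):
        n = int(d)
        if i % 2 == 0:
            n *= 2
        total += n // 10 + n % 10  # sum the digits of two-digit products
    check = (10 - total % 10) % 10
    return body + str(check)

# Apple's CUSIP 037833100 corresponds to the well-known ISIN US0378331005.
print(cusip_to_isin("037833100"))  # -> US0378331005
```

When every firm derives and validates identifiers the same way, reference data can move between counterparties without the reconciliation work that inconsistent formats create.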
How regulators can help
The thesis of the Washington Politics and Financial Services Infrastructure: Making Data Standards Sexy session was that it is imperative to get regulatory agencies involved in developing data standards. Hollister summed it up nicely when he said, “If we can’t get regulatory agencies to adopt data standards, we won’t be able to get reference data operating efficiently.” Hollister’s passion for the issue was evident as he described his years of trying to get regulators on board, and although he’s had significant success, he made it clear that we still have a long way to go.
Daniel Gorfine, Chief Innovation Officer for the CFTC, also weighed in on the topic during the Investment Manager working group session. It was interesting to get a regulator’s point of view, as he highlighted the CFTC’s work with fintech firms. According to him, the CFTC has met with about 200 fintech firms in the last year to discuss regulatory issues and opportunities to promote innovation. Proactively engaging with regulators to ensure high-quality data will advance the industry at large and streamline operations as our firms implement new methods and technologies.
Interested in learning more about ISITC’s perspective on data? View our joint webinar with EDM Council, which details how to best navigate data management.