Reference Data: Reliability Needed
Reference data management has matured over the past decade. Just five years ago, the notion of organizing data centrally was relatively new and the technology was in its infancy. Today, firms need to take actionable steps to extend the flexibility and comprehensiveness of their reference data operations.
“The regulatory environment is challenging – more transparency is firmly needed,” Marty Williams, vice president of reference data product development at Interactive Data, told Markets Media. “As regulators look for a more eclectic view of reference data and firms look to benchmark risk and performance across asset classes, reliable reference data has never been more important.”
Success in today’s markets demands constant innovation: expansion across asset classes and geographies, the creation of new investment and trading strategies, and improvements in operations, reporting and client service, he added. At the same time, firms must address regulatory obligations and improve risk management processes.
Financial firms are struggling to determine the right scope for their governance programs, and especially how to address near-term issues such as inconsistent data usage, a lack of standardized definitions, and a growing list of requirements. Many firms have elected to appoint a Chief Data Officer (CDO) responsible for large data initiatives and for data governance across silos.
“While many firms, especially on the buy-side, are looking to a centralized governance model, it is acknowledged that in order to drive adoption throughout the organization, some level of ownership is necessary within the business units,” Williams said. “To be successful, the CDO needs to help those individual business units understand the importance and value of that data to their bottom line to create ‘data stewards’ across the organization.”
Technologies exist to parse the various message formats. While important, these systems are entirely dependent upon a strong business analysis function to map data among sources and onto the systems that ultimately consume the information. Firms also need to validate their data and, to ensure consistency, enrich it by adding context.
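The map-validate-enrich flow described above can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; the field names and mapping table are hypothetical.

```python
# Hypothetical mapping from one source's field names onto the
# consuming system's schema.
SOURCE_FIELD_MAP = {"Ticker": "symbol", "CpnRate": "coupon", "Mat": "maturity"}

def normalize(record: dict) -> dict:
    """Map a source record's fields onto the target schema."""
    return {SOURCE_FIELD_MAP.get(k, k): v for k, v in record.items()}

def validate(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    for field in ("symbol", "coupon", "maturity"):
        if record.get(field) in (None, ""):
            errors.append(f"missing {field}")
    return errors

def enrich(record: dict, context: dict) -> dict:
    """Add contextual reference data (e.g. asset class) keyed by symbol."""
    return {**record, **context.get(record.get("symbol"), {})}
```

In practice the mapping step is where the business analysis function earns its keep: each source's fields must be reconciled to one agreed schema before validation and enrichment are meaningful.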
Interactive Data enables its clients to bring data into their complex organizations through Apex, a content delivery platform. “We approach each reference data client on a case-by-case basis and tailor our data sets to the client’s unique reference data needs,” Williams said.
Best practice includes periodically reviewing, and revising as needed, validation routines based upon the most recently updated governance policies, whether related to legal entity identifiers (LEIs) or corporate actions requirements. “Technology also plays a role; as much of the validation process is automated, the flexibility of this technology to address evolving governance needs is a key feature,” said Williams.
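As an example of an automatable validation routine, an LEI under ISO 17442 is a 20-character alphanumeric code whose final two digits are check digits computed per ISO 7064 MOD 97-10; a well-formed LEI, converted letter-by-letter to digits (A=10 … Z=35), leaves a remainder of 1 modulo 97. A minimal check:

```python
def lei_checksum_ok(lei: str) -> bool:
    """Validate an LEI's check digits per ISO 7064 MOD 97-10 (ISO 17442)."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    # Map '0'-'9' to 0-9 and 'A'-'Z' to 10-35, concatenate, then test mod 97.
    digits = "".join(str(int(c, 36)) for c in lei.upper())
    return int(digits) % 97 == 1
```

Checks like this catch transcription errors at the point of entry, but they verify only the identifier's form, not whether it refers to the intended legal entity; the governance policies the article describes still determine which records are accepted.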