09.24.2019

S/he Who Masters the Data, Masters FRTB

By Gil Shefi, Managing Director for Financial Risk Analytics, IHS Markit

“Data is King” has become something of a cliché.

Banks have an abundance of data, but the recent fines levied by the FCA against prominent UK banks for their failure to report transactions accurately under MiFID show that volume of data does not guarantee the fitness to deploy it. These MiFID reporting failures were centered on simple cash instruments; just imagine the problems that might occur when it comes to the reporting of derivatives.

The European Central Bank (ECB) has been concerned about this issue for some time, and although the data quality principles of BCBS 239 were designed to improve the situation, in practice they have yet to take hold in many institutions. In its May 2018 “Report on the Thematic Review on effective risk data aggregation and risk reporting”, the ECB stated that “Sound and robust risk data aggregation capabilities and risk reporting practices have become even more important since the global financial crisis […] The outcome of the Thematic Review shows that the implementation status of the BCBS 239 principles within the sample of significant institutions is unsatisfactory, which is a source of concern.”

Interestingly, these BCBS 239 principles have made their way into the final version of the Fundamental Review of the Trading Book (FRTB) text, in the form of the so-called “Annex D.” In their frustration at banks’ lack of compliance with BCBS 239, it seems regulators are finding other routes to enforce the same principles.

It is clear that most banks are struggling with their large and disorganised databases of derivative transaction data, and consequently with the Risk Factor Eligibility Test (RFET) component of FRTB. Data, or to be more precise the quality, consistency, transparency and volume of data, is a near-universal topic in our discussions with major banks, but the responses from our clients vary considerably. To paraphrase comments made by a Bank of England official at our FRTB event earlier this year: “While banks are concerned about what they CAN do, regulators are more concerned about what they SHOULD do.” In the context of the regulatory requirements for non-modellable risk factors (NMRFs), I would argue that rather than focusing on what they CAN do with in-house data to reduce NMRFs, banks should ask themselves what they SHOULD do: look for the best available data sources, both internally and externally, in order to reduce the size of the NMRF capital burden.
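To ground the discussion, recall what the RFET actually asks. Under the final FRTB text (BCBS, January 2019), a risk factor is modellable only if, over the preceding 12 months, it has at least 24 real price observations with no 90-day period containing fewer than four of them, or at least 100 observations in total. The Python sketch below is a minimal illustration of that counting logic only; the function name is mine, and it deliberately ignores the practical complications (what qualifies as a real price, committed quotes versus transactions, bucketing conventions) that make the test demanding in practice.

```python
from datetime import date, timedelta

def is_modellable(observation_dates, as_of):
    """Rough check of a risk factor against the RFET criteria in the
    final FRTB text (January 2019): over the preceding 12 months,
    either (a) at least 24 real price observations with no 90-day
    period containing fewer than four of them, or (b) at least 100
    real price observations. Same-day observations count once.
    """
    window_start = as_of - timedelta(days=365)
    obs = sorted({d for d in observation_dates if window_start <= d <= as_of})

    if len(obs) >= 100:          # alternative criterion (b)
        return True
    if len(obs) < 24:            # fails the basic count of criterion (a)
        return False

    # Slide a 90-day window day by day across the 12-month period;
    # every window must contain at least four observations.
    day = window_start
    while day + timedelta(days=90) <= as_of:
        in_window = sum(1 for d in obs if day <= d < day + timedelta(days=90))
        if in_window < 4:
            return False
        day += timedelta(days=1)
    return True

# Example: fortnightly observations clear both limbs of criterion (a).
dates = [date(2019, 9, 24) - timedelta(days=14 * i) for i in range(26)]
print(is_modellable(dates, date(2019, 9, 24)))   # True: 26 obs, no sparse 90-day window
```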

Another way to think about it is by echoing Ursula K. Le Guin’s “The Left Hand of Darkness”, where it is said that “To learn which questions are unanswerable, and not to answer them: this skill is most needful in times of stress and darkness.”

While FRTB does not represent a time of darkness, of course, its implementation does create a degree of stress for many banks. Ironically, those banks that openly admit that data is a problem are frequently in a better position than those that do not. Unfortunately for the industry, quite a few banks seem to remain in the second camp: either they assume that their in-house data will be complete and robust enough to answer the RFET question, or they assume that time is still on their side and therefore have not examined the question at all.

I would argue both assumptions are wrong.

As illustrated by numerous exercises conducted by IHS Markit and others, few banks, whether large or small, can achieve the same level of modellability with their own data as with a pool of real price observations (RPO) data; and this assumes the in-house data is of sufficient quality to begin with, which is not always the case. As for time, it is clearly not on the banks’ side. Assuming the January 2022 go-live date and a two-year IMA approval process by supervisory teams, including six to twelve months of parallel run, banks will need to have a plan, finalised vendor agreements and be well into the FRTB/IMA implementation work by the end of 2019 or early 2020 (after the publication of the Fed’s long-awaited Notice of Proposed Rulemaking (NPR)).
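To make the pooling effect concrete, here is a deliberately simplified, hypothetical illustration; the observation dates are invented for the example. A risk factor whose in-house trades are too sparse to clear the 24-observation hurdle can pass comfortably once observations contributed by other pool participants are merged in.

```python
from datetime import date, timedelta

as_of = date(2019, 9, 24)

# Hypothetical in-house observation dates: roughly one trade a month.
in_house = {as_of - timedelta(days=30 * i) for i in range(12)}

# Hypothetical dates contributed by other participants in an RPO pool.
pool = {as_of - timedelta(days=10 * i) for i in range(30)}

merged = in_house | pool  # same-day observations count once

# Only the observation-count limb of the RFET is checked here; see the
# is_modellable sketch above for the 90-day gap condition as well.
print(len(in_house), len(in_house) >= 24)   # 12 False -> NMRF on own data alone
print(len(merged), len(merged) >= 24)       # 32 True  -> modellable when pooled
```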

So, how do banks become masters of their data?

Banks need to learn which questions are “unanswerable” and prioritize accordingly, deciding what they do in-house versus what they outsource. Core functions should be kept in-house, while non-core functions are candidates for outsourcing. Program leaders should also bear in mind that this issue is not specific to FRTB: the bank’s overall market data architecture and IT infrastructure need to span multiple use cases across risk analysis, financial reporting and capital requirements. Choosing the right partner to help close the loop between Annex D and BCBS 239 is a winning formula here, as partnering can bring an injection of domain-specific expertise, improved operational efficiency and significantly reduced costs.

Solutions exist to provide banks with the broadest possible set of RPO data and enriched reference data, allowing maximum modellability and better proxying, as well as adherence to Annex D. They also relieve banks of the operationally intensive, but extremely important, work of validating and normalising data to make it fit for purpose.

FRTB. The balance-sheet fair value hierarchy. IPV back-testing. Prudent Valuation reporting. All are governed by the imperative for high-quality data. In the coming years, supervisors will review banks’ overall data governance with an eye on reliability, timeliness, comprehensiveness, accuracy and consistency. Some of these qualities can be defined by clear metrics, while others will require more regulatory judgment; but as FRTB and data management shift into supervisory focus, the winners will be those who acknowledged the gaps in their data, sought the right help and became the masters of their data.
