
Bank of England Invests In Data Analysis

11.29.2017
Shanny Basar

The Bank of England is investing in a new data architecture to improve the analysis of information from multiple trade repositories.

The UK central bank said in its latest Financial Stability Report yesterday that it is investing in its capability and technology to collect, process and store data to enhance its ability to query and analyse information from trade repositories.

Mark Carney, Governor of the Bank of England

“The new data architecture will solve some of the existing technological impediments to analysing trade repository data, most notably by combining the data available from multiple trade repositories into one integrated data set and automating the identification of duplicate copies of reported transactions,” said the report. “By improving data collection, processing and storage, the Bank will be able to analyse larger volumes of data, across multiple trade repositories and across time, significantly faster.”
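The report does not spell out how that integration is implemented, but the core step it describes, pooling records from several repositories and collapsing duplicate copies of the same transaction, can be sketched in a few lines of Python. Everything below is illustrative rather than a description of the Bank's system: the field names, repository names and the rule of keeping the latest submission are assumptions.

```python
# Illustrative sketch only: field names and the "keep the latest submission"
# rule are assumptions, not the Bank's published design.
from dataclasses import dataclass
from typing import Dict, Iterable

@dataclass(frozen=True)
class TradeReport:
    repository: str    # trade repository that supplied the record
    uti: str           # unique transaction identifier shared by both counterparties' reports
    notional: float
    reported_at: str   # ISO 8601 submission timestamp

def integrate(feeds: Iterable[Iterable[TradeReport]]) -> Dict[str, TradeReport]:
    """Combine records from several repositories into one data set,
    keeping a single copy (the most recently reported) per transaction."""
    merged: Dict[str, TradeReport] = {}
    for feed in feeds:
        for report in feed:
            current = merged.get(report.uti)
            # Two submissions sharing a UTI are treated as duplicate copies
            # of one transaction, e.g. the two counterparties' reports.
            if current is None or report.reported_at > current.reported_at:
                merged[report.uti] = report
    return merged

# The same trade (UTI-001) reported to two different repositories:
feed_a = [TradeReport("Repo-A", "UTI-001", 10_000_000, "2017-11-28T09:00:00Z")]
feed_b = [TradeReport("Repo-B", "UTI-001", 10_000_000, "2017-11-28T09:00:05Z"),
          TradeReport("Repo-B", "UTI-002", 5_000_000, "2017-11-28T10:30:00Z")]
print(len(integrate([feed_a, feed_b])))  # 2 distinct transactions, not 3
```

Under EMIR both counterparties to a trade must report it, so the same transaction routinely appears more than once and possibly at different repositories; matching on a shared unique transaction identifier is the usual way to spot those copies.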

After the financial crisis in 2008, the leaders of the G20 pushed for a shift of standardised derivatives from the over-the-counter market into central clearing in order to improve transparency. Standardised products would be traded on exchanges or electronic trading platforms and centrally cleared, with uncleared transactions subject to higher capital requirements and mandatory margin requirements. Details of each transaction would be reported to trade repositories.

The European Market Infrastructure Regulation, which covers central clearing, came into force in December 2014 and introduced the reporting requirements for derivatives in the region. There are now eight authorised trade repositories in the European Union.

The Bank of England said transaction-level data from repositories has increased the transparency of derivatives markets to authorities. However, regulators lack a global view of derivatives markets because many authorities can only access local data.

“The Bank will work with other authorities internationally to promote the faster progress required at international level to resolve barriers to data quality, standardisation and sharing,” said the report. “In particular, no international work is currently under way to decide on how a cross-border data aggregation mechanism should work in practice.”

The regulator continued that it has already used trade repository data to support its objectives in a number of ways, including monitoring activity and positioning in derivatives markets around significant market events, such as in the run-up to and immediately following the United Kingdom’s referendum on EU membership last year; assessing the market impact of policy shocks, such as the implications of the Swiss franc’s depeg from the euro in 2015; and understanding the structure of key derivatives markets to inform policymaking and supervisory decision-making.

In addition, the Bank’s upcoming in-depth assessment of the role of leverage in the non-bank financial system will draw heavily on repository data to analyse non-banks’ use of derivatives.

The Bank continued that the increase in central clearing has reduced the number and size of derivatives exposures generated by a given trading volume, and made the system more resilient under stress.

“There has been a marked increase in rates of central clearing in some of the largest asset classes underlying OTC derivatives,” said the report. “For example, the minimum percentage of outstanding single-currency OTC interest rate derivatives globally that are centrally cleared has increased from 24% at end-2008 to 62% at end-June 2017; for credit default swaps, it has increased from 5% at end-June 2010 to 34% at end-June 2017.”
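The mechanism behind that claim is multilateral netting: once trades are novated to a central counterparty, each dealer holds one net position against the CCP rather than a gross claim on every counterparty. The toy calculation below, with invented dealer names and amounts that are not drawn from the report, shows how the same set of trades produces fewer and smaller exposures when netted through a CCP.

```python
# Toy multilateral-netting example; dealers and amounts are invented.
from collections import defaultdict

# (payer, receiver) -> amount owed under their bilateral trades
bilateral = {("A", "B"): 10, ("B", "A"): 3, ("B", "C"): 6, ("C", "A"): 4}

print("bilateral:", len(bilateral), "exposures, gross",
      sum(bilateral.values()))                     # 4 exposures, gross 23

# Novation to a CCP: every trade now faces the CCP, so each dealer is left
# with a single net position against the CCP instead of one per counterparty.
net_vs_ccp = defaultdict(int)
for (payer, receiver), amount in bilateral.items():
    net_vs_ccp[payer] -= amount
    net_vs_ccp[receiver] += amount

print("cleared:  ", len(net_vs_ccp), "positions, gross",
      sum(abs(v) for v in net_vs_ccp.values()))    # 3 positions, gross 6
```

How much the exposures shrink depends on how offsetting the portfolio is, but the direction is the same: clearing concentrates and nets exposures at the CCP.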

Clarus Financial Technology, the derivatives analytics provider, said in a blog post today that analysis of US swap data repositories suggests that uncleared markets see little new trading activity.

“Notional outstanding of uncleared trades has been constant for the past year,” added Clarus.

In addition, the gross market value of both cleared and uncleared positions has decreased by around 40% in the past year.

“These gross market values have moved hand-in-hand between both cleared and uncleared positions,” said Clarus. “Compression still appears to have a lot of work to do in uncleared markets to shrink legacy portfolios.”

Compression is a process in which clients “tear up” offsetting trades to reduce the notional outstanding and the number of line items in their portfolios while maintaining the same risk exposure. Use of compression services has increased following the introduction of stricter capital requirements, which have led banks to shrink their balance sheets and made capital efficiency increasingly important.
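As a rough sketch of that idea, and not a description of how any particular compression service works, netting away offsetting line items in the same instrument can be shown in a few lines of Python; the instrument names and notionals below are invented.

```python
# Simplified illustration of compression; instruments and notionals are
# invented, and real tear-up cycles involve many more constraints
# (counterparties, coupons, maturities) than shown here.
from collections import defaultdict

# (instrument, signed notional): positive and negative notionals offset.
portfolio = [
    ("IRS_USD_5Y", +100), ("IRS_USD_5Y", -80), ("IRS_USD_5Y", +30),
    ("IRS_EUR_10Y", +50), ("IRS_EUR_10Y", -50),
]

def compress(trades):
    """Tear up offsetting trades, leaving one net line item per instrument."""
    net = defaultdict(int)
    for instrument, notional in trades:
        net[instrument] += notional
    return [(instrument, n) for instrument, n in net.items() if n != 0]

def gross_notional(trades):
    return sum(abs(n) for _, n in trades)

compressed = compress(portfolio)
print(len(portfolio), gross_notional(portfolio))    # 5 line items, gross 310
print(len(compressed), gross_notional(compressed))  # 1 line item, gross 50
# Net risk is unchanged: USD 5Y still nets to +50 and EUR 10Y to 0.
```

The number of line items and the gross notional fall while the net position in each instrument, and hence the risk exposure, is unchanged.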
