With Hubert De Jesus, Global Head of Market Structure and Electronic Trading, BlackRock
Briefly outline your professional background and current role / responsibilities at BlackRock.
I started as a programmer analyst developing trading systems at Goldman Sachs more than 29 years ago. Since then, I’ve held a variety of positions on the buy side. I worked for Long-Term Capital Management and a spin-off called JWM Partners as a strategist focused on fixed income relative value trading and building risk management and trading tools. In 2003, I joined Barclays Global Investors (BGI), which later merged with BlackRock, where I traded equities and FX and learned about market microstructure.
These experiences were all instrumental in preparing me for my current role as the Global Head of Market Structure and Electronic Trading. At BlackRock, my team is responsible for driving investment performance by overseeing the firm’s electronic trading initiatives, adapting trading solutions in response to market structure evolution, and delivering execution research across all asset classes. The past decade has been a particularly exciting time to be in this space, as the industry has experienced dramatic upheaval from advancements in technology and the adoption of new regulations, which have necessitated a wholesale transformation and electronification of the trading process.
How would you describe the current landscape for trading data, broadly speaking?
In general, real-time (or near real-time) trading data is much more widely available compared with 10-20 years ago, particularly for corners of the market which have traditionally been less transparent, like fixed income securities and over-the-counter instruments. Given that trading data is the lifeblood of vibrant capital markets, this has markedly improved liquidity and price discovery.
However, the constant proliferation of trading venues and transaction data repositories has created a fragmented ecosystem with a complex menu of different market data offerings. As a result, many market participants have an incomplete or inconsistent view of liquidity, while some firms have higher quality and faster data due to their ability to navigate the costs, technology, and operational challenges of consolidating disparate data feeds.
While market participants will naturally have varying data requirements, widespread availability of and equitable access to prompt and accurate market data is crucial to maintaining public confidence in our markets and enabling investors to achieve best execution. As such, there’s room for improvement, for instance through the development and enrichment of a consolidated tape, to deliver more equitable access to trading data and level the playing field.
What are the key challenges in terms of institutional buy-side firms sourcing the data they need, quickly, efficiently and at a fair cost?
The principal challenge for institutions has been obtaining the data they need on a commercially reasonable basis. This problem has two dimensions – content and cost.
Demand for information is constantly changing as markets develop, so it’s important for market data content to keep pace with the evolving needs of market participants. For instance, equity market structure changes have elevated the significance of auction data, depth-of-book information, and odd-lot quotations (especially in the US), making their addition to core data feeds essential for investors to be able to participate effectively in the market. Additionally, as automation and electronification further increase the speed of trading, delays in the dissemination of transaction information should also be minimized to the fullest extent possible without adversely impacting liquidity provision.
The other dilemma faced by investors is the excessive cost of acquiring market data. Fees for trading data are largely unconstrained by competition because the information from each execution venue is unique and market participants need a comprehensive view of the marketplace to make informed trading decisions. Since demand for market data is inelastic, pricing power is disproportionately skewed towards data providers, who set fees according to the value of the information to users instead of the cost for venues to produce it. This dynamic is exacerbated by the imposition of price structures that drive up costs by licensing trading data according to various categories of usage.
As a result, a single end-user may be charged multiple times to have the same piece of data accessible across different applications, incorporated into derived analytics, or consumed by back-end processes that do not subsequently display the information. High fees may lead cost-conscious investment firms to limit their use of market data, subscribing only to a subset of feeds to the detriment of best execution for end investors. Market participants also shoulder the cumbersome administrative burden of complying with onerous licensing terms and monitoring their market data usage.
How do institutional buy-side firms address these challenges?
BlackRock believes that many of these concerns would be mitigated by establishing a consolidated tape (with content relevant to today’s market structure) which is overseen by a diverse set of market participants. A robust governance model would drive investment in data quality and ensure that the public data feed is operated effectively and impartially with broader input on policies and fees.
In Europe, a consolidated tape would also standardize the data being reported and provide an aggregated pre-trade and post-trade view of the market that strengthens best execution, increases transparency, and improves competition. This would augment market resiliency by furnishing the industry with trading data which would facilitate the routing of order flow to alternate venues in the event of marketplace outages.
Regulators should also review exchange licensing terms, in addition to the level of fees, to ensure that trading data costs remain fair and reasonable for market participants. We would encourage the adoption of enterprise licensing models to simplify the administrative burden for subscribers. Additionally, given the inelastic demand for market data, we believe that policymakers should ensure that market data fees bear some relation to the cost of producing the data.
Sell-side banks and exchanges have had a long-standing feud with regard to trading data. How do you view this?
It’s very easy to be swayed to take one side or the other given how heated and controversial the trading data debate has been. But we feel that it’s helpful to take a step back and assess these issues from the perspective of who the markets should be serving – end investors and the public. Viewed through this lens, the specific reforms which policymakers should address come clearly into focus.
What regulatory or legal frameworks are important with regard to trading data?
Impediments currently exist in some jurisdictions that prevent regulators from pursuing essential market data reforms. For instance, a recent court ruling found that the Exchange Act did not grant the SEC authority to include non-SROs (self-regulatory organizations) on the operating committee of national market system plans. Legislators should take appropriate action to modify the law where the legal framework prevents regulators from broadening governance of market data plans to include entities such as the buy side, sell side, issuers, and vendors.

Governance is more effective when everyone has a seat at the table. When the dissemination of market data is controlled by a concentrated set of market participants, outcomes are less fair and not representative of the needs of the industry. Until this is resolved, we will continue to see a lack of investment in data content and unfair and unreasonable fee filings.
This article first appeared in the Q4 2022 issue of GlobalTrading, a Markets Media Group publication.