05.22.2014

For Block Traders, ‘Big Data’ is Double-Edged Sword

For U.S. institutional equity traders seeking to efficiently execute large buy and sell orders, the good news is that there is a wealth of data and information available. The bad news is that in today’s high-speed electronic market, sifting through that wealth of data and information can be a Herculean challenge.

“These traders have data coming in from all sources at a very fast pace — it’s high-frequency data, if you will,” said Kapil Phadnis, global head of quantitative research and algorithmic trading at Bloomberg Tradebook, the agency-brokerage unit of Bloomberg LP. “Everything is coming too fast to handle, digest and make sense of.”

Years ago, a stock trader would ‘work’ a large order in the so-called upstairs market of the New York Stock Exchange. The trader’s relationships and savvy were the primary determinants of execution quality, i.e. getting an order done at the best possible price with the least possible information leakage.

The ‘electronification’ and fragmentation of markets that have transpired over the past decade-plus have redrawn the landscape. Now, traders must select a trading venue from among a dozen exchanges plus a few dozen alternative trading systems; choose one or more algorithms from a vast universe of algos; and make sure the transaction will pass muster with compliance — often, while under pressure to get something done quickly.

The task would be substantial in a static setting; the complexity multiplies on a desk with multiple screens, each of which may be updating in real time with a heavy volume of high-velocity numbers from a variety of sources.

“‘Big Data’ and systematic feedback for the trader are the two main themes that manifest themselves in the form of pre-trade analysis, or pre-trade characterization,” Phadnis told Markets Media. “We are crunching Big Data using different models, trying to systematically see if there’s a pattern that can help the trader reduce his or her implicit cost. We’re looking at the market level, the order level, and also the trader-behavior level.”
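
The “implicit cost” Phadnis describes is most commonly measured as implementation shortfall: the slippage of an order’s fills against the price prevailing when the order arrived on the desk. As a minimal illustration of that standard yardstick, the Python sketch below computes shortfall in basis points from a set of fills; it is a generic textbook calculation, not Bloomberg Tradebook’s model, and the function name and sample fills are hypothetical.

# Illustrative only: implementation shortfall is the standard measure of the
# "implicit cost" of an order -- the slippage of its fills versus the price
# prevailing at arrival. A textbook calculation, not Tradebook's model.
from dataclasses import dataclass
from typing import List


@dataclass
class Fill:
    price: float   # execution price of this fill
    shares: int    # shares executed at that price


def implementation_shortfall_bps(arrival_price: float, side: str,
                                 fills: List[Fill]) -> float:
    """Slippage of the volume-weighted fill price versus the arrival price,
    in basis points. Positive means the order cost money versus the benchmark."""
    total_shares = sum(f.shares for f in fills)
    if total_shares == 0:
        return 0.0
    avg_price = sum(f.price * f.shares for f in fills) / total_shares
    sign = 1.0 if side == "buy" else -1.0   # selling below arrival also costs money
    return sign * (avg_price - arrival_price) / arrival_price * 10_000


# Example: a buy order that arrived with the stock at $50.00
fills = [Fill(50.02, 20_000), Fill(50.05, 30_000)]
print(round(implementation_shortfall_bps(50.00, "buy", fills), 1))  # ~7.6 bps

Pre-trade characterization, in effect, tries to forecast this number before the order is routed, so the trader can weigh venue and algorithm choices against it.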

Allianz Global Investors deploys pre-trade analytics primarily for so-called basket or program trades, which include multiple securities, according to Joe Rodela, head of U.S. trading.

Rodela noted that pre-trade analytics has evolved from a facility used to understand portfolio characteristics such as liquidity, market capitalization, and sector breakdowns, to an execution-consulting tool used to evaluate different implementation options as well as risk and cost profile.
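
As a rough illustration of the cost-profile piece of such a tool, the sketch below uses the widely cited square-root market-impact approximation, in which expected cost scales with the stock’s volatility and with the square root of the order’s share of average daily volume, plus half the quoted spread. It is a generic model offered only as an example, not Allianz’s or any broker’s production analytics, and the coefficient and sample inputs are hypothetical.

# Illustrative only: a generic pre-trade cost estimate built on the square-root
# market-impact approximation. Not any firm's production model; the coefficient
# and inputs are hypothetical placeholders.
import math


def pretrade_cost_bps(order_shares: float,
                      adv_shares: float,
                      daily_vol_bps: float,
                      spread_bps: float,
                      impact_coeff: float = 1.0) -> float:
    """Rough expected cost of an order, in basis points.

    order_shares  -- size of the order
    adv_shares    -- average daily volume of the stock
    daily_vol_bps -- daily price volatility, in basis points
    spread_bps    -- quoted bid-ask spread, in basis points
    impact_coeff  -- empirical constant, normally calibrated from past fills
    """
    participation = order_shares / adv_shares          # order as share of daily volume
    impact = impact_coeff * daily_vol_bps * math.sqrt(participation)
    return spread_bps / 2.0 + impact                   # half-spread plus market impact


# 500k shares in a name trading 5M shares a day, 150 bps daily vol, 4 bps spread
print(round(pretrade_cost_bps(500_000, 5_000_000, 150, 4), 1))  # ~49.4 bps

In practice the impact coefficient is calibrated against large histories of executed orders, which is where the accumulated transaction data Rodela describes comes into play.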

“At the heart of an effective pre-trade cost-estimation model is the order data that is used as its foundation – this was ‘Big Data’ even before the term became commonplace,” Rodela said. “Our sell-side providers benefit from routinely handling transactions in numbers so huge that everyone would agree they constitute ‘Big Data’.”

“The benefit that the buy side receives comes in the form of effective cost and trade modeling based on the amount of historical data that is available and continually analyzed,” Rodela continued. “By way of the many cost models and the liquidity that our sell-side partners are able to provide, we as clients benefit from their vast amount of past experience.”

The most advanced pre-trade analytics is designed for the larger orders of institutional investors and large hedge funds, which are the most delicate in that they can move markets and thus prove the most costly if poorly executed. “The goal is to help traders who have a large percentage of volume to trade,” Phadnis said. “Guys trading 5,000 shares of Microsoft probably don’t care about this. It’s the guys at hedge funds and mutual funds with larger orders, who spend 80% of their time on 20% of their largest orders, trying to extract liquidity from brokers and markets.”
