Unlocking Big Data

04.05.2012
Terry Flanagan

Islands of automation complicate the task of distributing information in real time.

The presence of silos, or buckets of asset classes, complicates the task of managing the ever-growing streams of real-time data flowing through markets.

For capital markets firms, valuable data lives in silos across multiple lines of business, geographies or product desks, where it cannot readily be normalized, related and interpreted for analysis.

“In this post-securities master world, the challenge is to not only create a ‘golden copy’ of instrument data but also integrate related datasets, including positions, transactions, customers and counterparties,” said Michael Chow, director of corporate development at GoldenSource Corp.


“The ultimate goal is to unlock the value at the intersection of these domains; this is exactly the kind of problem Big Data can solve,” Chow said.
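
As an illustration of the cross-domain intersection Chow describes, the sketch below relates a golden copy of instrument data to positions and counterparties so they can be analyzed as one enriched view. All records, field names and values here are hypothetical, not GoldenSource's data model.

    # Hypothetical golden-copy domains; none of this reflects a vendor schema.
    instruments = {"US4592001014": {"name": "IBM", "asset_class": "equity"}}
    counterparties = {"CP-9": {"name": "Acme Fund", "rating": "A"}}
    positions = [{"isin": "US4592001014", "counterparty": "CP-9", "quantity": 500}]

    # Enrich each position with the related instrument and counterparty records.
    enriched = [
        {**pos,
         "instrument": instruments[pos["isin"]]["name"],
         "counterparty_rating": counterparties[pos["counterparty"]]["rating"]}
        for pos in positions
    ]
    print(enriched)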

These silos tend to develop not organically but around specific business initiatives, such as an investment firm's decision to launch a new trading division.

Product-centric data repositories are a fact of life. Data silos exist both vertically (e.g., by asset class) and horizontally (front, middle and back office), and they're unlikely to go away soon.

Instead, this data needs to be integrated at a system level, which means getting it into one place in a rationalized, readily operable format.
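
A minimal sketch of that rationalization step, assuming two hypothetical silo feeds whose records are mapped onto one common position shape (all field names are illustrative):

    # Two silos deliver the same concept, a position, in different shapes.
    equities_feed = [{"acct_no": "A-100", "ticker": "IBM", "shares": 500, "last_px": 200.0}]
    fx_feed = [{"client_id": "A-100", "pair": "EUR/USD", "base_amount": 1_000_000}]

    def from_equities_silo(rec):
        # Map the equities silo's fields onto the common schema.
        return {"account": rec["acct_no"], "asset_class": "equity",
                "symbol": rec["ticker"], "notional": rec["shares"] * rec["last_px"]}

    def from_fx_silo(rec):
        # Map the FX silo's fields onto the same schema.
        return {"account": rec["client_id"], "asset_class": "fx",
                "symbol": rec["pair"], "notional": rec["base_amount"]}

    # One rationalized list that downstream systems can operate on uniformly.
    positions = ([from_equities_silo(r) for r in equities_feed] +
                 [from_fx_silo(r) for r in fx_feed])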

“This data lives across dozens of disparate systems across the firm, and the difficulty of connecting those should not be underestimated,” said Joshua Walsky, chief technology officer at Broadway Technology.

“Then, once you’ve got it in the same place, you’ve got to make it fast enough to meet the demands of the market,” he said. “Those two issues, although easy to explain and put on a PowerPoint, are tremendously difficult to actually accomplish.”

Firms cannot do this from siloed pools of data and disparate systems, as there is no element of centralized control.

“Our industry is built around multiple products and services for each asset class and geographic region,” said Tony Scianna, executive vice president of product management and marketing at SunGard’s brokerage and clearance business.

That creates all sorts of data synchronization issues, especially as many processing systems still run in batch mode.

“You can’t have a batch environment in one part of the world wait for one in another part of the world to complete,” Scianna said. “You need to be able to capture and have access to information in real time.”

To capture value from Big Data, organizations must have a data-driven culture and deploy new technologies.

“Transformative insights can only come when companies integrate information from multiple data sources rapidly,” said Matt Benati, vice president of global marketing at Attunity. “However, legacy systems, incompatible data formats, and very large volumes of data are all obstacles to timely information access.”

Experts recommend creating an enterprise data strategy that articulates the organization’s data models and architectures for integration and analytics. An important part of this strategy is evaluating whether traditional integration models, such as extract, transform and load (ETL), should be augmented with newer approaches such as change data capture and data replication.

“Certain platforms are better suited than others; however, when large numbers of users access a typical organization’s data warehouse for reporting purposes, it puts unnecessary load on the server, degrades performance, and business users experience unacceptable delays for reports,” said Benati.

One alternative solution is data distribution using replication and change data capture technology. For this approach to be effective, however, rapid updates from the data warehouse to the reporting servers are essential.
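
A minimal sketch of the idea follows, using a polling query against a last-modified timestamp, with SQLite standing in for the warehouse and the reporting replica. Production change data capture tools typically read the database transaction log instead, and the table and column names here are assumptions made for illustration.

    import sqlite3

    def capture_changes(source, watermark):
        # Pull only rows modified since the last sync, not the whole table.
        rows = source.execute(
            "SELECT id, symbol, quantity, updated_at FROM positions "
            "WHERE updated_at > ? ORDER BY updated_at", (watermark,)).fetchall()
        return rows, (rows[-1][3] if rows else watermark)

    def apply_changes(replica, rows):
        # Upsert the captured deltas into the reporting replica.
        replica.executemany(
            "INSERT INTO positions VALUES (?,?,?,?) ON CONFLICT(id) DO UPDATE "
            "SET quantity=excluded.quantity, updated_at=excluded.updated_at", rows)
        replica.commit()

    source = sqlite3.connect(":memory:")
    replica = sqlite3.connect(":memory:")
    ddl = ("CREATE TABLE positions (id INTEGER PRIMARY KEY, symbol TEXT, "
           "quantity INTEGER, updated_at TEXT)")
    source.execute(ddl)
    replica.execute(ddl)
    source.execute("INSERT INTO positions VALUES (1, 'IBM', 500, '2012-05-04T09:00')")
    changes, watermark = capture_changes(source, "")
    apply_changes(replica, changes)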

“Using an approach such as EEL (extract, expedite and load) for data distribution, instead of ETL, can be a better model for organizations that need their data fast, without a lot of transformations. Using this approach, business users have constant access to up-to-date information and fast reporting,” Benati said.
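
The contrast can be sketched as follows, under stated assumptions (the trade records and the normalization step are invented for illustration): ETL reshapes every record in flight, while EEL-style distribution loads records as-is and defers transformation to read time.

    def extract():
        # Hypothetical raw trade records from a source system.
        yield {"trade_id": 1, "px": "101.25", "qty": "500", "ccy": "usd"}
        yield {"trade_id": 2, "px": "99.80", "qty": "250", "ccy": "eur"}

    def transform(rec):
        # ETL-style in-flight normalization: typed fields, canonical codes.
        return {"trade_id": rec["trade_id"], "px": float(rec["px"]),
                "qty": int(rec["qty"]), "ccy": rec["ccy"].upper()}

    def run_etl(store):
        # Pay the transformation cost before anything lands in the store.
        store.extend(transform(rec) for rec in extract())

    def run_eel(store):
        # Load immediately so reports see fresh rows; transform on read if needed.
        store.extend(extract())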

SunGard’s StreamCore functions as a centralized data management framework.

“StreamCore can aggregate market, reference and transaction data from multiple systems across the enterprise to feed downstream systems,” Scianna said. “This approach can help create holistic views of activity in order to monitor firm-wide risk exposure and develop new business opportunities.”

StreamCore can also help prepare for new regulations that will require real-time reporting, Scianna said.

“It’s a real-time app that captures transactional activity from multiple sources,” he said. “A client might have four or five account numbers in siloed applications for individual asset classes. StreamCore provides a single view of client activity across all asset classes.”
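
The single-view concept can be pictured as a mapping from siloed account numbers to one firm-wide client identifier, with activity grouped under it. The sketch below illustrates the idea only; it is not StreamCore's actual interface, and all identifiers are hypothetical.

    from collections import defaultdict

    # Hypothetical mapping from per-silo account numbers to one client id.
    ACCOUNT_TO_CLIENT = {"EQ-123": "C-1", "FI-456": "C-1", "FX-789": "C-1"}

    def client_view(events):
        # Group raw per-account events into a single view per client.
        view = defaultdict(list)
        for event in events:
            view[ACCOUNT_TO_CLIENT.get(event["account"], event["account"])].append(event)
        return dict(view)

    events = [{"account": "EQ-123", "asset_class": "equity", "quantity": 500},
              {"account": "FX-789", "asset_class": "fx", "quantity": 1_000_000}]
    print(client_view(events))  # both events appear under client "C-1"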

Many firms are well equipped to process structured data within a business unit, but they may not have the ability to efficiently process data across the entire enterprise.

“When there is a need to consolidate data from multiple proprietary systems and external sources, it can be difficult to reconcile the data structures and definitions used in each source system,” said Sean Hennigar, capital markets practice lead at SWI, a software consultancy.

“It may also be difficult to determine which system is the authoritative source where there is duplication of data,” Hennigar said. “These issues can pose a challenge when implementing enterprise-wide risk management, compliance monitoring or business analytics systems.”
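
One common way to resolve such duplication is a simple survivorship rule: rank the candidate systems and let the highest-precedence source win. The ranking below is a hypothetical example, not a recommendation from Hennigar or SWI.

    # Lower number = more authoritative; the ranking itself is illustrative.
    SOURCE_PRECEDENCE = {"securities_master": 0, "trading_system": 1, "spreadsheet": 2}

    def authoritative(records):
        # Given duplicate records tagged with their source system, pick the winner.
        return min(records, key=lambda rec: SOURCE_PRECEDENCE[rec["source"]])

    duplicates = [
        {"source": "trading_system", "isin": "US4592001014", "coupon": 2.5},
        {"source": "securities_master", "isin": "US4592001014", "coupon": 2.625},
    ]
    print(authoritative(duplicates))  # the securities_master record wins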

One approach to addressing this problem is to implement multi-asset class enterprise services that support multiple business lines. Where there is common shared infrastructure, it is easier to implement consistent auditing and reporting services.

To successfully implement this strategy, it’s necessary to segregate core data structures from extensions needed to support a given business unit.

“If core data structures are designed around open industry standards, it’s much easier to apply them across the enterprise,” Hennigar said. “This approach also facilitates integration with commercial vendor systems and external entities.”
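A minimal sketch of that segregation, assuming invented field names: shared enterprise services depend only on the core fields, while desk-specific attributes live in an extension map.

    from dataclasses import dataclass, field
    from typing import Any, Dict

    @dataclass
    class CoreTrade:
        # Core fields every business line shares; identifiers follow open
        # standards where possible (e.g., an ISIN for the instrument).
        trade_id: str
        instrument_id: str
        quantity: float
        price: float
        currency: str
        # Desk-specific attributes are segregated here, so enterprise-wide
        # auditing and reporting never depend on them.
        extensions: Dict[str, Any] = field(default_factory=dict)

    equity = CoreTrade("T-1", "US4592001014", 500, 101.25, "USD",
                       extensions={"exchange": "NYSE"})
    swap = CoreTrade("T-2", "EUR-SWAP-10Y", 10_000_000, 0.0215, "EUR",
                     extensions={"fixed_leg_daycount": "30/360"})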
