01.20.2016

What do banks need to do to survive in the new risk era? (By Neil Vernon, Gresham UK)

With high-profile internal security breaches, stringent new regulation, and regulatory fines on the rise, banks are under greater pressure than ever before to both guarantee and evidence the integrity of their data.

But with existing banking IT infrastructures – with their patchworked systems, siloed processes and fragmented organisations – only serving to increase risk, how can banks get control and implement a framework that reflects the realities of the dangers they’re facing?

A data control framework fit for purpose

Many banks know they have holes in their systems that are only getting bigger as the number of datasets they deal with evolves and grows. Aside from the risks inherent in an inability to red-flag exceptions and aggregate data to identify potential external threats, they’re leaving themselves open to internal abuse from unscrupulous individuals who have identified how to exploit these holes to their own ends.

The only way to keep pace with the multi-faceted data integrity challenge is with a flexible, adaptable control framework. One that can on-board new regulation and implement it across multiple data feeds simultaneously – that isn’t constrained by complex data and can adapt to new controls without a heavy reliance on IT.
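As a rough illustration of what such a framework means in practice, the Python sketch below shows one way controls could be registered at runtime and applied across several data feeds at once, with exceptions surfaced as they are found. All class and field names here (ControlFramework, trade_id and so on) are invented for illustration; this is not a description of Gresham's product or API.

# A minimal, hypothetical sketch (not Gresham's actual product or API): a
# rule-based control framework where new controls can be registered at any
# time and applied across several data feeds at once, flagging exceptions.
from dataclasses import dataclass
from typing import Callable, Iterable, Optional

Record = dict                                   # one row or message from a feed
Control = Callable[[Record], Optional[str]]     # returns an exception reason, or None

@dataclass
class FlaggedException:
    feed: str
    record: Record
    reason: str

class ControlFramework:
    def __init__(self) -> None:
        self.controls: list[Control] = []

    def register(self, control: Control) -> None:
        # Onboard a new control (for example, one driven by new regulation).
        self.controls.append(control)

    def run(self, feeds: dict[str, Iterable[Record]]) -> list[FlaggedException]:
        # Apply every registered control to every record in every feed.
        flagged = []
        for feed_name, records in feeds.items():
            for record in records:
                for control in self.controls:
                    reason = control(record)
                    if reason:
                        flagged.append(FlaggedException(feed_name, record, reason))
        return flagged

# One control, applied to two feeds simultaneously.
framework = ControlFramework()
framework.register(lambda r: "missing trade ID" if not r.get("trade_id") else None)
print(framework.run({
    "swaps_feed": [{"trade_id": "T1"}, {"trade_id": ""}],
    "fx_feed":    [{"trade_id": "T2"}],
}))

The point of structuring things this way is that adding a control becomes a registration step rather than an IT change to each individual feed.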

Luckily for banks, implementing a framework like this is no longer a case of wholesale system replacement. Agile ‘plug in’ data integrity platforms are specifically designed to adapt to change. Controls are stringent, exceptions are highlighted in real time, and the platform keeps pace with frequently changing and evolving regulatory requirements, with the flexibility to accommodate new legislation quickly and efficiently as it’s introduced.

Designed to consume complex streams of near real-time data, these platforms can handle data in multiple formats, of any width and structure. Instead of the data having to work with the technology, the technology accommodates the data, meaning it can be fully implemented in a matter of weeks, without the additional cost and time required to relabel and change processes.
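To make the idea of the technology accommodating the data concrete, here is a hypothetical ingestion sketch: the same kind of record might arrive as CSV or JSON, be normalised into a common shape, and then pass through identical checks. The formats, field names and the ingest helper are all assumptions for the example, not a description of any particular platform.

# Hypothetical sketch of format-agnostic ingestion: the same kind of record may
# arrive as a CSV line or a JSON message of any width, and is normalised into a
# plain dict so identical checks can run over it. Field names are invented.
import csv
import io
import json

def ingest(payload: str, fmt: str) -> list[dict]:
    # Accept a raw feed payload in whichever format it happens to arrive in.
    if fmt == "json":
        data = json.loads(payload)
        return data if isinstance(data, list) else [data]
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload)))
    raise ValueError(f"unsupported format: {fmt}")

# The same downstream check works regardless of how each feed was delivered.
records = ingest("trade_id,notional\nT1,1000000\n,500000", "csv") \
        + ingest('[{"trade_id": "T2", "notional": 250000}]', "json")
missing_ids = [r for r in records if not r.get("trade_id")]
print(missing_ids)   # the CSV row with a blank trade_id is flagged

Normalising at the point of ingestion, rather than forcing every source system onto one schema, is what keeps implementation measured in weeks rather than months.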

A new approach to operational risk management

Just a few short years ago, implementing a new data control framework would have been a mammoth task, sapping resource from other critical operational and IT functions. As we head into 2016, it can be a simple augmentation project – live and operational within a few weeks. It’s a smarter route to data integrity. A route that empowers banks to take control without fearing what lies under the hood.

Neil Vernon is Chief Technology Officer at Gresham Computing.
