Addressing Real-Time Regulatory Compliance with In-Memory Computing

By Nikita Ivanov | 7 October 2016

Amid one of the slowest economic recoveries in recorded history, financial institutions face several challenges. The overall forecast for both developed and emerging markets is pessimistic, and unresolved geopolitical issues, such as Brexit, continue to create headwinds. De-leveraging pressure has pushed banks to reduce debt and risky assets on their balance sheets. Ongoing penalties and compliance costs related to the post-crash era have required these same institutions to set aside funds to cover any similar future costs. It’s no wonder that return on equity (ROE) for the top 10 global banks sits below the typical 10 percent benchmark, and that “easy money” is generally no longer available.

In response to these pressures, banks are cutting costs, restructuring, optimizing certain business lines by selling non-core assets, and exiting less profitable activities. All this tumult has created a tremendous operational challenge for these institutions: complying with the tighter regulations being imposed around the world. These regulations were created in the hope of ensuring that banks, and the broader economies they serve, are stable enough to avoid another 2008-style meltdown.

These new regulations are putting significant pressure on institutions to update their technology. For example, some of the regulations focus on the risk level of the assets banks hold in their portfolios; they require institutions to track the constantly changing makeup of those assets, weigh the risk level of each asset, and evaluate the total risk against an acceptable level of exposure. In the EU, these regulations include Basel III and IV, MiFID II, and the Net Stable Funding Ratio. In the US, they include CCAR, IHC requirements, the Enhanced Prudential Standards, Dodd-Frank living wills, and Basel III. Other regulations, such as the culture and ethics standards for banking in the EU and the enhanced consumer protection rules in the US, require banks to apply analytics to their transactions to look for ethical violations.

Complying with both types of regulation – those focused on risk levels and those focused on ethics – is forcing banks and other financial institutions to monitor, collect, and analyze vast amounts of data from multiple, disparate sources in real time. This in turn is creating significant disruption as vendors compete to leverage the latest technologies to deliver extremely fast, scalable, and cost-effective data management and processing solutions. Some of the major trends include:

  • Fintech – A new category of vendor has emerged to take over many core banking processes, including electronic payments, personal finance management, lending, investments, and even core banking itself.
  • Digital transformation, agile architectures, big data, and advanced analytics – Batch and manual processes are being replaced with automated digital ones. Legacy systems are being replaced with agile platforms that improve time-to-market for new channels and products. To handle today’s huge datasets, banks are employing predictive analytics, data mining, big data, fast data, simulation, optimization, and location-based intelligence. Banks are also looking at how to apply big-data controls at the transaction level without impacting performance.
  • Cloud – Now that most security concerns have been addressed, banks are leveraging cloud services for scalability, cost savings, better access to data, ease of moving and reusing data, the removal of data silos, and more.
  • Distributed ledgers and blockchains – Banks are using distributed-ledger technologies, such as the blockchain pioneered by Bitcoin, to reduce the time required to execute and clear trades and to update the registry of trades.
  • Open source software and in-memory computing – Banks are making strategic investments in open source software and other technologies to drive down costs. They are partnering with technology firms and creating innovation labs to test new technologies. In particular, in-memory computing stores data in RAM across a distributed cluster of computers and processes it in parallel. It can operate thousands of times faster than traditional disk-based computing, and additional nodes can be added to the cluster in real time as the volume of data that must be held in memory grows. Banks are using in-memory computing, including in-memory data grids, to eliminate the performance and scalability problems caused by disk-based databases (a minimal sketch of this approach follows this list).
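
As a rough illustration of the in-memory data grid concept described above, the following is a minimal sketch using Apache Ignite, the open source project at the core of GridGain's platform. The cache name, keys, and values are hypothetical, and a production deployment would involve explicit cluster and memory configuration.

    import org.apache.ignite.Ignite;
    import org.apache.ignite.IgniteCache;
    import org.apache.ignite.Ignition;

    public class RiskCacheSketch {
        public static void main(String[] args) {
            // Start (or join) a cluster node. Nodes started the same way on other
            // machines discover each other, and cached data rebalances across them.
            try (Ignite ignite = Ignition.start()) {
                // A key-value cache partitioned in RAM across all nodes in the cluster.
                IgniteCache<String, Double> exposures =
                    ignite.getOrCreateCache("counterpartyExposure");

                // Writes are distributed across the cluster, so the dataset can grow
                // beyond the memory of any single machine by adding nodes.
                exposures.put("ACME", 1_250_000.0);
                exposures.put("GLOBEX", 310_000.0);

                System.out.println("ACME exposure: " + exposures.get("ACME"));
            }
        }
    }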

With today’s tight regulatory environment and cost pressures, banks need fast data technologies that accelerate risk management, monitoring, and compliance. Further, as larger institutions continue to produce and collect massive amounts of data, they need to run real-time analytics directly on transactional data, an approach commonly known as Hybrid Transactional/Analytical Processing (HTAP), without a significant impact on customers. Finally, they need to be able to acquire these capabilities quickly and cost-effectively, especially given the current financial environment.
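
To make the HTAP idea concrete, the hedged sketch below, again assuming Apache Ignite, writes trades into a distributed cache and runs an SQL aggregate directly over that same live data, with no separate batch extract into a warehouse. The Trade class, field names, and query are hypothetical.

    import java.util.List;

    import org.apache.ignite.Ignite;
    import org.apache.ignite.IgniteCache;
    import org.apache.ignite.Ignition;
    import org.apache.ignite.cache.query.SqlFieldsQuery;
    import org.apache.ignite.cache.query.annotations.QuerySqlField;
    import org.apache.ignite.configuration.CacheConfiguration;

    public class HtapSketch {
        /** A transactional record that is also queryable through Ignite's SQL engine. */
        public static class Trade {
            @QuerySqlField(index = true)
            private String counterparty;

            @QuerySqlField
            private double notional;

            public Trade(String counterparty, double notional) {
                this.counterparty = counterparty;
                this.notional = notional;
            }
        }

        public static void main(String[] args) {
            try (Ignite ignite = Ignition.start()) {
                CacheConfiguration<Long, Trade> cfg = new CacheConfiguration<>("trades");
                cfg.setIndexedTypes(Long.class, Trade.class); // expose Trade to SQL

                IgniteCache<Long, Trade> trades = ignite.getOrCreateCache(cfg);

                // Transactional side: trades are written as they arrive.
                trades.put(1L, new Trade("ACME", 1_000_000));
                trades.put(2L, new Trade("ACME", 250_000));
                trades.put(3L, new Trade("GLOBEX", 500_000));

                // Analytical side: an aggregate query runs over the same in-memory data.
                List<List<?>> totals = trades.query(new SqlFieldsQuery(
                    "SELECT counterparty, SUM(notional) FROM Trade GROUP BY counterparty"))
                    .getAll();

                totals.forEach(row -> System.out.println(row.get(0) + " -> " + row.get(1)));
            }
        }
    }

Because both the writes and the query work against the same in-memory copy of the data, compliance checks and risk aggregations of this kind can run continuously rather than in overnight batches.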

For a deeper dive into these evolving trends, how the key new regulations affect technology requirements in the financial industry, and how financial institutions are responding, download Achieving Real-Time Financial Regulatory Compliance with In-Memory Computing, a GridGain Systems whitepaper.

By Nikita Ivanov, Founder & CTO, GridGain Systems.

 
