The evolving demands on financial services firms: Is your technology infrastructure ready?

By Nikita Ivanov | 26 September 2017

No other industry has been impacted by evolving technology more than financial services. Mobile banking, 24-hour trading, and the dramatic rise in the volume of data that must be processed have put tremendous pressure on system performance and scale. Meanwhile, evolving regulations, the expanding scope of financial fraud, and the continued growth of business opportunities such as high-frequency trading mean financial services firms must be able to analyze ever-growing amounts of data in real time.

The In-Memory Computing for Financial Services eBook, published by GridGain Systems, will help organizations understand how in-memory computing platforms such as GridGain, which is built on Apache® Ignite, can address the performance, scale, and availability demands of today’s financial services firms. Part 1 of the eBook, for example, looks in detail at high-frequency trading, fraud prevention, and real-time regulatory compliance.

The challenge of high-frequency trading

High-frequency securities trading uses high-speed, rules-driven strategies based on quantitative models to drive computers that execute many trades in real time. These computer programs analyze the market, develop trading strategies, and decide when to buy or sell financial instruments. If this analysis is accurate and fast enough, a trader can gain a highly profitable advantage.

The tasks required for high-frequency trading include obtaining real-time and historical market information, processing this information to create predictive algorithms, and executing trades based on the results. These transaction-intensive tasks require online transaction processing (OLTP) at the fastest possible speeds. A high-frequency trading system must also be able to rapidly fine-tune its predictive algorithms based on how they perform, which requires very robust online analytical processing (OLAP). For high-frequency trading to be successful, OLTP and OLAP must run concurrently on the same data, which places tremendous demands on computing resources.
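
To make this concrete, here is a minimal sketch that runs both sides of that workload against the same in-memory data, using the open-source Apache Ignite Java API on which GridGain is built. The trades cache, the Trade class, and the query are illustrative assumptions, not a trading-system design.

import java.util.List;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.SqlFieldsQuery;
import org.apache.ignite.cache.query.annotations.QuerySqlField;
import org.apache.ignite.configuration.CacheConfiguration;

public class HybridTradingSketch {
    /** Illustrative trade record; @QuerySqlField exposes fields to SQL. */
    public static class Trade {
        @QuerySqlField final String symbol;
        @QuerySqlField final int quantity;
        @QuerySqlField final double price;

        Trade(String symbol, int quantity, double price) {
            this.symbol = symbol;
            this.quantity = quantity;
            this.price = price;
        }
    }

    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            CacheConfiguration<Long, Trade> cfg = new CacheConfiguration<>("trades");
            cfg.setIndexedTypes(Long.class, Trade.class); // enable SQL over the cache
            IgniteCache<Long, Trade> trades = ignite.getOrCreateCache(cfg);

            // OLTP side: record executed trades in memory at transaction speed.
            trades.put(1L, new Trade("ACME", 100, 42.17));
            trades.put(2L, new Trade("ACME", 250, 42.31));

            // OLAP side: run analytical SQL over the same live data set.
            List<List<?>> rows = trades.query(new SqlFieldsQuery(
                "SELECT symbol, AVG(price) FROM Trade GROUP BY symbol")).getAll();
            rows.forEach(r -> System.out.println(r.get(0) + " avg price: " + r.get(1)));
        }
    }
}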

The challenge of fraud prevention

Financial fraud costs companies billions annually and damages corporate reputations. Despite this impact, financial fraud continues to grow because preventing it requires a tremendous effort.

Fraud is widespread, appearing as unauthorized checks, stolen credit cards, manipulated mortgages and corporate financial statements, fraudulent computer banking, illegally traded securities, diverted payment scams, identity theft, tax evasion, and a variety of forged documents. This means firms must implement multiple compute-intensive strategies for automating fraud detection, including statistical and multi-channel analysis, models and probability distributions, analyzing user profiles, real-time algorithmic analysis, data clustering and classification, and artificial intelligence and machine learning. Performing these activities in real time on extremely large datasets requires extremely fast, highly scalable technologies.
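
As a simple illustration of the statistical techniques on that list, the following sketch scores a transaction against a per-customer spending profile using a z-score. The Profile fields and the three-sigma threshold are illustrative assumptions, not a production fraud model; real systems combine many such signals.

public class FraudScoreSketch {
    /** Rolling statistics for a single customer's historical spending. */
    static class Profile {
        final double meanAmount;   // average historical transaction amount
        final double stdDevAmount; // standard deviation of those amounts
        Profile(double mean, double stdDev) { meanAmount = mean; stdDevAmount = stdDev; }
    }

    /** Flags a transaction whose amount deviates strongly from the profile. */
    static boolean isSuspicious(Profile p, double amount) {
        if (p.stdDevAmount == 0) return amount != p.meanAmount;
        double zScore = (amount - p.meanAmount) / p.stdDevAmount;
        return Math.abs(zScore) > 3.0; // ~3 sigma: rare under normal behavior
    }

    public static void main(String[] args) {
        Profile customer = new Profile(85.0, 20.0);
        System.out.println(isSuspicious(customer, 90.0));  // false: typical spend
        System.out.println(isSuspicious(customer, 950.0)); // true: extreme outlier
    }
}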

The challenge of real-time regulatory compliance

Nearly ten years after the 2008 financial meltdown, financial services firms still face regulatory tightening from governing bodies. In the E.U., key regulations include Basel III and IV, MiFID II, the Net Stable Funding Ratio, and the Culture and Ethics Standard in Banking. In the U.S., the regulations include CCAR, IHC, Enhanced Prudential Standards, Dodd-Frank Living Wills, Basel III, and Enhanced Consumer Protection. Some of these regulations involve tracking constantly changing assets, weighting them by risk level, and evaluating them against acceptable levels of exposure. Other regulations involve monitoring transactions and applying analytics to look for ethical violations.

Further, banks are implementing or strengthening a wide range of controls related to money laundering, know-your-customer (KYC) requirements, cybercrime, fraud, corruption, bribery, supervisor accountability, internal ethics, real-time trade compliance, and monitoring of high-risk countries.

Once again, all these processes are placing a tremendous burden on companies to implement extremely fast, scalable, and cost-effective data technology.

Meeting these challenges with in-memory computing

In-memory computing platforms provide parallel distributed processing across the RAM deployed on a compute cluster. This enables much faster transaction processing compared to solutions built on disk-based databases. They can also be easily scaled out by adding nodes to the cluster. Until recently, the high cost of RAM limited in-memory computing to only the highest value applications. But a steady drop in the price of RAM has made in-memory computing platforms economical for a much wider range of use cases. Gartner now projects that the in-memory technology market will grow to $10 billion by the end of 2019, a 22 percent compound annual growth rate.
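
In practice, scaling out looks like the hedged sketch below, again assuming the open-source Apache Ignite Java API: every machine that runs the program joins the cluster and contributes its RAM, and a partitioned cache spreads entries across all joined nodes automatically. The cache name and backup count are illustrative.

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.CacheMode;
import org.apache.ignite.configuration.CacheConfiguration;

public class ScaleOutSketch {
    public static void main(String[] args) throws InterruptedException {
        Ignite ignite = Ignition.start(); // joins (or forms) the cluster

        CacheConfiguration<Long, String> cfg = new CacheConfiguration<>("positions");
        cfg.setCacheMode(CacheMode.PARTITIONED); // shard data across node RAM
        cfg.setBackups(1);                       // keep one replica for availability
        IgniteCache<Long, String> positions = ignite.getOrCreateCache(cfg);

        positions.put(ignite.cluster().localNode().order(), "loaded by this node");
        System.out.println("Cluster size: " + ignite.cluster().nodes().size());

        Thread.currentThread().join(); // keep the node running so others can join
    }
}

Running this same program on another machine adds a node, and its RAM, to the cluster with no code changes; Ignite rebalances the partitioned data automatically.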

For high-frequency trading, moving to an in-memory computing platform delivers performance that is 1,000 to 1,000,000 times faster than an approach built on a disk-based database. It also benefits from parallel processing across the cluster’s server nodes. In-memory computing platforms can turn Big Data into Fast Data.

For fraud detection, an in-memory computing platform can eliminate the need to export data from the operational system to an offline analytical system, enabling faster and more accurate fraud detection. The distributed nature of an in-memory computing platform means it can process transactions and perform the required analysis in real time.
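
As a hedged sketch of how that works, the example below uses Apache Ignite’s affinityRun(), which ships a computation to the node that holds a given key, so the fraud check runs against local RAM rather than exported data. The cache name, key, and check logic are illustrative assumptions.

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;

public class ColocatedFraudCheckSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            IgniteCache<Long, Double> lastAmounts = ignite.getOrCreateCache("lastAmounts");
            long accountId = 42L;
            lastAmounts.put(accountId, 120.0);

            // Run the check on whichever node owns this account's entry.
            ignite.compute().affinityRun("lastAmounts", accountId, () -> {
                // Local read: the entry is already in this node's RAM.
                Double amount = Ignition.localIgnite()
                    .<Long, Double>cache("lastAmounts").localPeek(accountId);
                System.out.println("Checked account " + accountId
                    + " in place, last amount = " + amount);
            });
        }
    }
}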

The benefits for regulatory compliance are similar. An in-memory computing platform accelerates risk management, monitoring, and compliance processes despite the need to analyze huge volumes of data in real time. Further, the ease with which the platform can be scaled out means financial services firms can cost-effectively add capacity to ensure ongoing compliance as the regulatory environment evolves and data volumes continue to grow.

No matter which technologies a financial services firm deploys, the one constant is the need to process more data, faster and more cost-effectively. In-memory computing is the key. To satisfy customer demand for real-time responses while combating fraud and ensuring regulatory compliance, financial services firms need to explore their in-memory computing options.