ActiveViam, Head Office
6th floor, Shaftesbury House, 151 Shaftesbury Avenue, London
+44 20 894 5013
Xavier HM Bellouard
[email protected]

Data volume and velocity driving adoption of in-memory analytics in banking, Quartet FS research finds

In-memory analytics will become predominant architecture in investment banking within next 3 years

The rise of ‘Big Data’, and in particular high-volume, high-velocity data, is driving the adoption of in-memory analytics by the UK’s financial institutions, according to research conducted for Quartet FS. With 100% of respondents agreeing that Big Data is a significant problem for their institution, the majority (67%) believe that in-memory analytics will become the predominant architecture within the next three years to help tackle the problem.

The research, which surveyed IT managers and architects at UK sell-side institutions, found that all respondents are currently struggling with Big Data: 55% say the problem is ‘very’ significant and 20% admit it is ‘extremely’ significant. More specifically, 47% say the main challenge is the volume of data; 37% the velocity; and 17% the variety.

At the same time, there is a growing requirement for faster data analysis, with 75% of organisations highlighting a need for real-time reporting in the front office; 57% in the back office; and 50% for market risk analysis.

These dual challenges of growing data volumes and the need for real-time, low-latency analytics were cited by 68% and 55% of respondents respectively as significant factors driving the adoption of in-memory analytics. A quarter of organisations are already using in-memory analytics to help overcome these problems, and a further 50% intend to adopt it in the future. In-memory analytics will be rolled out across many parts of the organisation, with most respondents likely to deploy it in four main business divisions: front office, market risk management, CVA and client reporting.

Today, however, the majority of financial institutions continue to rely on older technologies to analyse their data, with spreadsheets the most popular method, used by 68% of respondents. Reporting and querying software is still used by 67%, followed by data warehousing (62%) and data mining (53%).

Commenting on the findings, Georges Bory, co-founder and managing director of Quartet FS, said, “With the majority of the UK’s sell-side still relying on spreadsheets to analyse their data, it’s no surprise that they are struggling with the ‘Three Vs’ of Big Data: Volume, Velocity and Variety. It seems however that we are on the cusp of a tipping point within the financial services industry with in-memory analytics evolving from discussion to widespread adoption over the coming years. Our research shows that nine in ten organisations foresee in-memory analytics becoming the predominant architecture for all users, uses and data in the future and two-thirds (67%) predict this will occur within the next three years.”

Respondents identified a range of operational and business benefits to adopting in-memory analytics, including better management of Big Data (52%), analysis of fast-moving data (50%), real-time visualisation of data (45%) and ability to make better business decisions, faster (43%).