Why data empowerment is the future of financial businesses

With senior business leaders increasingly needing real-time, comprehensive data for decision-making, a smart data fabric can help firms address the growing challenges of large-scale data management

July 22, 2022 | InterSystems and bobsguide

Historically, financial firms have been bricks-and-mortar businesses – their business model has been to serve customers and support the distribution, exchange and storage of cash and investments.  

Operating required an architecture characterised by large, monolithic applications and servers housed in data centres owned by each bank.  

Broadly, this model is still in operation. Over time, banks have layered in new channels, starting with ATMs, then telephone banking, followed by web and, more recently, mobile. Until recently, these added channels were supported by the same underlying processes.  

In practice, this has resulted in a very fragmented and siloed data architecture, with a lack of transparency over lineage and quality, varying levels of data integrity, and key data locked inside core applications.  

As financial firms have increasingly become data-driven organisations, this model is no longer fit for purpose.  

In 2022, the amount invested by banks and large financial institutions into their data architecture has never been higher. Data and analytics budgets continue to increase, with Deloitte’s annual capital markets report indicating that 84% of surveyed institutions plan to make further investments into artificial intelligence, and 62% plan to spend more on advanced analytics.  

The market is vast and growing, largely because financial institutions are in the midst of a piecemeal transition to digital architectures.  

According to a recent survey, commissioned by InterSystems, 86% of business leaders at financial services companies aren’t confident that their data can be used for decision-making.  

The study, which surveyed more than 550 business leaders across 12 countries, revealed that the problem may stem from disconnected systems and data sources, with almost all (98%) respondents saying that there are data and application silos within their organisation. 

According to Joe Lichtenberg, head of product and marketing at data management technology company, InterSystems, one of the main problems that the piecemeal approach creates is a lack of granularity in mission-critical data. 

“Senior business leaders need the self-service analytics tools, and they also need the granularity of the data. Taking aggregations or approximations is insufficient,” says Lichtenberg.  

“You have to have the scalability so that if something looks interesting or it looks problematic, for example if a portfolio exposure creates a potentially unacceptable level of risk, it’s easy to drill down into that data.” 
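The pattern Lichtenberg describes, an aggregate view that can be unpacked into the underlying records on demand, can be sketched in a few lines. This is an illustrative example only, not InterSystems code; the desks, instruments, and exposure figures are invented for the sake of the sketch.

```python
# Illustrative sketch: aggregated exposure with position-level drill-down.
# All data and names below are hypothetical.
from collections import defaultdict

# Position-level records; in practice these would stream in from
# trading and risk systems rather than a hard-coded list.
positions = [
    {"desk": "rates", "instrument": "UST10Y", "exposure": 40.0},
    {"desk": "rates", "instrument": "BUND10Y", "exposure": 25.0},
    {"desk": "credit", "instrument": "ITRAXX", "exposure": 90.0},
]

def exposure_by_desk(rows):
    """Aggregate exposure per desk -- the dashboard-level view."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["desk"]] += row["exposure"]
    return dict(totals)

def drill_down(rows, desk):
    """Return the granular positions behind one aggregate figure."""
    return [r for r in rows if r["desk"] == desk]

totals = exposure_by_desk(positions)
# If one total looks problematic, drill into its underlying positions:
flagged = drill_down(positions, "credit")
```

The point of keeping the granular rows reachable, rather than storing only the aggregates, is exactly the one made above: an approximation cannot be drilled into.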

Silos mean outdated information 

According to the survey, commissioned by InterSystems, only 5% of the data used by business leaders, including heads of trading, heads of risk and c-suite leaders, to make decisions is less than an hour old. 18% is up to five hours old or intraday, while 63% is more than 24 hours old. 

In businesses that process large volumes of data daily, including high-frequency trading firms and global investment banks, this can create substantial backlogs and create difficulties identifying irregularities or addressing excessive risk exposures.  

“Our customers need the ability to bring together larger volumes of data from more sources on demand, eliminate latencies and incorporate analytics into the fabric,” explains Lichtenberg.

“One of the major use cases is around providing information to business heads for decision support. Be that the global head of risk or head of markets, this not only requires real-time data, but also the ability for business users to be able to drill into live, dynamic data.” 

A low-code solution 

A smart data fabric is one solution: it not only patches gaps in a firm’s data architecture, but does so in a gradual, sustainable manner, as opposed to a big bang implementation.  

Unlike data centres, private clouds or even data lakes, which create multiple storage instances of various data assets, a data fabric gives all end users access to a real-time, single source of truth.  

This architecture-driven approach provides a low-code, or no-code solution, which can be implemented in stages and used alongside existing solutions.  

“Any and every application and data source, including external streams of market data, internal feeds, and even previous data management implementations like data lakes and data marts can all act as sources into the smart data fabric,” explains Lichtenberg.   

“Implementation is non-disruptive. A smart data fabric is complementary to everything already within an organization’s infrastructure. There’s no rip and replace required. It’s an incremental approach to get more value out of everything that a firm already has in place.” 
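The overlay idea Lichtenberg describes, where existing sources keep their data and a thin layer queries them in place rather than replacing them, can be sketched as follows. This is a hedged, minimal illustration of the concept, not InterSystems’ implementation; the class, source names and records are all hypothetical.

```python
# Minimal sketch of the fabric-as-overlay concept: sources are
# registered, not migrated, and queried on demand in one place.
class DataFabric:
    def __init__(self):
        self._sources = {}

    def register(self, name, fetch):
        """Plug in any source (market feed, data lake, legacy app)
        via a fetch callable -- no rip and replace, no data copying."""
        self._sources[name] = fetch

    def query(self, predicate):
        """Pull matching records from every registered source."""
        results = []
        for name, fetch in self._sources.items():
            for record in fetch():
                if predicate(record):
                    results.append({"source": name, **record})
        return results

fabric = DataFabric()
# Each existing system is exposed as-is; real adapters would call
# the system's own API rather than return a hard-coded list.
fabric.register("market_feed", lambda: [{"symbol": "XYZ", "price": 101.5}])
fabric.register("data_lake", lambda: [{"symbol": "XYZ", "price": 99.8}])
quotes = fabric.query(lambda r: r["symbol"] == "XYZ")
```

Because each source is added by registration alone, new systems can be attached incrementally, which is the non-disruptive property the quote above emphasises.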

Steady progress wins the race 

As with any solution, success with a data fabric relies on forward planning. The best practice approach is an incremental implementation, according to Lichtenberg.  

The most successful implementations of a data fabric involve forward-thinking technology leaders, who identify stakeholders that have a business challenge – whether it’s around risk, operational reporting or client analytics, among others. 

The second key is to keep the scope of the initial implementation manageable.  

“Once there is a stakeholder and a specific business challenge, it’s time to identify source systems and business metrics or KPIs. It’s best to start with something where you can get quantitative value within three months or so,” Lichtenberg explains.  

“This may start with three or four KPIs, seven or eight source systems, a set of analytics, whether it’s machine learning or business rules, or just SQL queries. The key is to start small and achieve success quickly.” 
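A first iteration of the kind described above, a handful of source systems feeding a SQL-derived KPI, might look like the following sketch. The tables, columns and figures are invented for illustration; real source systems would feed the fabric rather than an in-memory database.

```python
# Hypothetical "start small" first iteration: two source systems
# (trades and limits) joined by a plain SQL query to produce one KPI.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (desk TEXT, notional REAL)")
conn.execute("CREATE TABLE limits (desk TEXT, max_notional REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?)",
                 [("rates", 40.0), ("rates", 30.0), ("credit", 90.0)])
conn.executemany("INSERT INTO limits VALUES (?, ?)",
                 [("rates", 100.0), ("credit", 80.0)])

# KPI: limit utilisation per desk -- no machine learning required,
# just a SQL query over the connected sources.
kpi = conn.execute("""
    SELECT t.desk, SUM(t.notional) / l.max_notional AS utilisation
    FROM trades t JOIN limits l ON l.desk = t.desk
    GROUP BY t.desk
""").fetchall()
```

A utilisation above 1.0 (here, the credit desk at 90/80) is the kind of quick, quantitative win the incremental approach aims for before more systems and KPIs are layered on.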

After this initial stage, Lichtenberg recommends an iterative approach to expanding the coverage of the data fabric and delivering value over time.  

“From there any of the systems that you’ve exposed into the data fabric serve as a foundation. This allows users to add systems incrementally and attack additional use cases for additional stakeholders across the organization, as opposed to chasing a ‘big bang’ implementation that takes years and is fraught with challenges.” 

Read the full InterSystems survey, The Top Data Challenges in Financial Services.  
