Analysing financial data: Best practices for banks

By Michael Salisbury | 7 August 2017

It is widely accepted that banking is one of the most data-driven industries out there: online banking and electronic trading produce millions of data points every day.

Regulatory requirements are challenging banks more and more to capture, store, and analyse data over multiple years, departments and regions… a true multi-dimensional exercise. Often this data is scattered across a host of branches, geographies and applications, and quality can vary significantly. If your organisation captures and analyses banking data, here are some points to consider.

Use the right data. Sometimes using all the data available to you can over-complicate your analysis and increase the likelihood of errors and anomalies in your data set and the resulting model. To ensure you have a single source of truth in your analysis, it is important that the analysis application has a dynamic link to the data source and that copying and pasting of data is avoided at all costs.

Asking the right question of the data is an important part of the analysis. Don’t assume that a complex methodology is the only way to find the answer. Complexity often hides the final answer, requiring the model user to peel back many layers to truly understand the results. It’s often best to arrive at the primary answer using the simplest methods possible first. The model author can then refine the model methodology as needed. Don’t forget to sanity check your assumptions: “Does this answer make sense given today’s economic outlook?”
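To illustrate “simplest method first” with a hedged example (the deposit figures are invented), a one-line trend projection plus an explicit sanity check will often answer the question before a complex model is justified:

```python
# Hypothetical year-end deposit totals, in GBP millions.
deposits = [410.0, 432.0, 455.0, 481.0]

# Simplest method first: average year-on-year growth, projected one year out.
growth_rates = [b / a - 1 for a, b in zip(deposits, deposits[1:])]
avg_growth = sum(growth_rates) / len(growth_rates)
projection = deposits[-1] * (1 + avg_growth)

# Sanity check the assumption before refining the methodology further:
# does this answer make sense given today's economic outlook?
assert 0 < avg_growth < 0.15, "double-digit growth would need justification"
print(round(projection, 1))
```

Only once this baseline answer looks sensible is it worth layering on seasonality, segmentation or other refinements.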

The data needs to feed a well-structured model. We live in a multi-dimensional world and most data sets will have multiple dimensions to analyse, often with a need to ask the critical “what-if” questions of the data. Will you be opening a new branch or introducing a new product next year? Your model needs to be structured and flexible enough to let users ask those questions and generate quick, accurate results.
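As a minimal sketch of that kind of structure (branch, product and year are hypothetical dimensions with invented figures), keying every value by all of its dimensions turns a “what-if” such as opening a new branch into a data change rather than a model rewrite:

```python
# Revenue keyed by (branch, product, year) -- hypothetical figures, GBP thousands.
revenue = {
    ("London", "mortgages", 2017): 900.0,
    ("London", "savings",   2017): 350.0,
    ("Leeds",  "mortgages", 2017): 420.0,
}

def total(data, branch=None, product=None, year=None):
    """Sum across any combination of dimensions; None means 'all values'."""
    return sum(
        v for (b, p, y), v in data.items()
        if branch in (None, b) and product in (None, p) and year in (None, y)
    )

# What-if: open a new branch next year with assumed volumes.
scenario = dict(revenue)
scenario[("York", "mortgages", 2018)] = 300.0

print(total(revenue, product="mortgages"))  # 1320.0
print(total(scenario, year=2018))           # 300.0
```

The same `total` function answers questions along any dimension, and the scenario is simply the base data plus the assumed new rows.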

The business logic in the model is a key part of any analytical exercise. The model logic must be well documented and easy to understand for anyone reviewing the model. Having the ability to explain calculations in plain English, rather than through someone else’s complex VBA (Visual Basic for Applications) code, will go a long way towards future-proofing your model for both yourself and others in your organisation.
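The point about explainable logic can be sketched as follows: each step of a calculation (here a hypothetical net interest margin figure, with invented inputs) is named and commented in plain English, so a reviewer can follow it without decoding anyone’s macros:

```python
def net_interest_margin(interest_income, interest_expense, earning_assets):
    """Net interest margin = (interest income - interest expense) / earning assets.

    All inputs are plain numbers in the same currency; the result is a ratio.
    """
    net_interest_income = interest_income - interest_expense
    return net_interest_income / earning_assets

# Hypothetical figures, GBP millions: 52.0 income, 18.0 expense, 1,000.0 assets.
print(net_interest_margin(52.0, 18.0, 1000.0))  # 0.034
```

A reviewer who has never seen the model before can read the docstring and the named intermediate value and immediately state what the calculation does.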

It may sound simple, but using the right tool for the job is key. The right tool alleviates ‘efficiency killers’ such as manual formula writing, error checking and auditing through professional features including natural-language formula writing, a built-in audit trail and a visual dependency inspector. Multi-dimensional modelling tools are at the forefront of financial analysis, and the nature of today’s global businesses means the questions asked of analysts often sit in a multi-dimensional problem space.

Along with selecting the right tool you must consider the platform the application runs on. Cloud is all the rage in today’s world, but should you only go with a cloud-based solution or can a hybrid solution (cloud and desktop) work?

Today’s tools can still leverage the power of the modern desktop and laptop computer. Technology continues to advance rapidly, with smaller, slimmer and more powerful desktop and laptop machines. With 64-bit technology and fast processors, what was once only achievable on an expensive server is now available through an affordable laptop – meaning complex financial calculations involving hundreds of millions, or even billions, of cells can be made at any desk, anywhere.

Despite efforts to maintain consistent HTML standards, browsers such as Internet Explorer, Chrome and Firefox still differ significantly in how they render web pages. For financial applications where high fidelity and consistency are imperative, a rendered web page that isn’t ‘pixel perfect’ can frustrate during the creation process. Applications developed for the desktop, by contrast, have rich UIs available that suit a high-fidelity user experience.

It is important for deployments to be flexible. To combat the threat of hackers, some companies have strict policies regarding what systems can and can’t be deployed outside their firewalls – and if a vendor cannot offer an in-house serviced solution, it may count against them during the decision-making process. The best hybrid solutions offer the security of a desktop application together with a web service that can be deployed within your firewall. Alternatively, the vendor can provide the hosting service for you if they meet your IT security standards.

Despite all the advantages of the desktop, the web and cloud should still form a very important part of your strategy. Your consumers want their information quickly and in an easy-to-use format. Many customers simply don’t need the horsepower of a desktop computer to consume and utilise financial data – they just need a convenient way to access your system. An intuitive web interface is cost effective, offers high adoption rates, and is easily scalable in your environment, or in the environment of a trusted hosting provider.

Even if you want to host all your financial data behind your firewall, you will still have many users needing to access your data across a variety of devices. A best-in-breed hybrid solution can offer interfaces to the data supporting all devices including computers, tablets and smart phones. A good hybrid solution should also offer excellent workflow management and communication capabilities to keep all team members informed, wherever they are based.

In conclusion, asking the right questions of the data and using the right tools are key parts of your overall strategy for collecting, analysing and reporting financial data in banking institutions. Use these guidelines to gain new insights into your data and be a hero in your organisation!