The data we all produce on a yearly basis is doubling in size every two years, according to EMC's digital universe study. EMC reports that by 2020, our digital universe will reach 44 trillion gigabytes, nearly as many digital bits as there are stars in the real universe. That large volume of data – structured and unstructured – will inundate organisations around the world. In treasury, big data will continue to change the way finance professionals work – how they use data to create value for their businesses.
Big Data in Treasury
In general, data is expanding on three fronts at an increasing rate. First, the sheer volume of information is growing. Second, data velocity is constantly increasing and information is often available in real time. Finally, data variety – the number of sources and formats – is growing. To get a better idea of how data management has changed over time, let's review some examples of what data flows looked like in treasury ten years ago and what they look like now.
• In the early days, mergers and acquisitions were based on the CEO’s intuition. Today, finance teams pull together volumes of data from different departments, business units and sources to put together detailed cash forecasts to fund company growth.
• Every morning, finance professionals used to queue up in front of their CFO's office to get a physical signature on upcoming payment runs before entering the individual payments into multiple electronic payment platforms. Today, approval requests are sent online to a number of signatories before worldwide payments are executed automatically in different formats through the cheapest channel, taking local business hours into account.
• Derivatives were traded via phone and fax, but now state-of-the-art treasury technology connects seamlessly with trading and matching platforms, automating the entire workflow. After a trade is executed, integrated market data is used for valuations that are again fully automated and in real time.
Presumably, there are still finance functions that are stuck in the past, but nearly all leading companies have started to transform their treasuries with modern technology. They use cloud-based treasury and risk management systems that capture global financial data from different sources, in real time.
Value of Big Data Analysis
Working with spreadsheets and many disparate systems in treasury is no longer an option. First and foremost, it makes it impossible to manage global cash and risk positions thoroughly. Not knowing your cash flows and FX exposures in today's volatile markets, and in times of increasing cyber-crime, is like walking a tightrope blindfolded. Not having all financial data in a single platform also keeps finance teams from creating value with big data analysis. With it, they can:
• Avoid Re-Statements through Automated Fair Value Calculations
Companies reporting under US GAAP or IFRS need to consider the effects of credit risk when determining a fair value measurement. They need to calculate debit valuation adjustments (DVA) and credit valuation adjustments (CVA) on their derivatives.
After the risk experts have decided upon appropriate valuation techniques, they are challenged to source good credit data. Many companies look to the bond or CDS markets or to credit ratings to determine an appropriate spread over LIBOR that represents their counterparties' and their own credit. Next, they have to calculate the actual fair values for their derivative portfolio, and finally, monitor the effectiveness of their hedges throughout their lifecycle.
Overall, correct fair value measurement consumes and produces a great deal of data. By automating CVA and DVA calculations, financial managers can save much manual spreadsheet work, reduce risk, and avoid re-statements for incorrect valuations and missing audit documentation.
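To make the mechanics concrete, here is a minimal, illustrative sketch of a unilateral CVA calculation. It uses a flat hazard rate derived from a credit spread and a handful of expected-exposure points; all names, figures, and simplifications (flat spread, annual intervals, continuous discounting) are hypothetical assumptions for illustration, not a description of any particular system's method.

```python
import math

def cva(expected_exposures, times, spread, recovery, risk_free_rate):
    """Approximate CVA = (1 - R) * sum of discounted expected exposure
    times the marginal default probability in each interval."""
    hazard = spread / (1.0 - recovery)  # flat hazard rate implied by the spread
    total = 0.0
    prev_t = 0.0
    for ee, t in zip(expected_exposures, times):
        surv_prev = math.exp(-hazard * prev_t)   # survival prob. at interval start
        surv_now = math.exp(-hazard * t)         # survival prob. at interval end
        pd_interval = surv_prev - surv_now       # marginal default probability
        df = math.exp(-risk_free_rate * t)       # discount factor to time t
        total += df * ee * pd_interval
        prev_t = t
    return (1.0 - recovery) * total

# Illustrative inputs: expected exposures (in EUR) at years 1..5,
# a 200 bp credit spread, 40% recovery, 1% risk-free rate
ees = [20_000, 35_000, 30_000, 20_000, 10_000]
times = [1, 2, 3, 4, 5]
adjustment = cva(ees, times, spread=0.02, recovery=0.4, risk_free_rate=0.01)
```

In a production system this loop would run across an entire derivative portfolio, with exposures coming from valuation models rather than hard-coded points – which is exactly the data volume the automation argument above is about.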
• Mitigate Commodity Risk with Advanced Regression Analysis
Finance and procurement teams are increasingly working together to mitigate commodity risk. Close cooperation helps prevent over-hedging or hedging with inappropriate derivatives. That said, risk managers are challenged to capture all commodity-related cash flows and include them in their risk analyses. They need to source market data or calculate appropriate curves when no public market data is available.
With advanced market data analysis, finance groups can identify the effects of correlation between commodity exposures and derivatives. As some commodities are riskier than others, and some commodity prices often move in opposite directions, deciding whether to hedge an exposure, with which derivative, and in what ratio often depends on this analysis. Once hedges are in place, advanced regression analysis establishes the historical and ongoing relationship needed to demonstrate hedge effectiveness.
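One common way to carry out such a regression is an ordinary least-squares fit of exposure price changes against hedge instrument price changes: the slope suggests a hedge ratio and the R² indicates how effective the hedge has been. The sketch below is a minimal, self-contained illustration; the price series (a jet-fuel exposure hedged with crude-oil futures) is made up for demonstration.

```python
def ols(x, y):
    """Simple ordinary least-squares regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx                    # candidate hedge ratio
    r_squared = sxy ** 2 / (sxx * syy)   # fraction of variance explained
    return slope, r_squared

# Illustrative weekly price changes: jet-fuel exposure vs. crude-oil futures
fuel = [1.2, -0.8, 0.5, 2.1, -1.5, 0.9, -0.3, 1.8]
crude = [1.0, -0.6, 0.4, 1.9, -1.2, 0.7, -0.2, 1.5]
ratio, r2 = ols(crude, fuel)
```

A high R² (commonly above 0.8) is one typical threshold for regarding a hedge relationship as effective; the slope tells the risk manager how many units of the derivative to hold per unit of exposure.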
• Save Hedging Cost with Cash Flow at Risk
Cash Flow at Risk (CFaR) helps model risk over short- or long-term horizons by measuring potential cash flow shortfalls. CFaR considers correlations between exposures across different asset classes and their related volatilities. CFaR results allow risk managers to tailor hedging and, rather than simply hedging all their exposures 100%, test their hedging assumptions. Taking account of correlations, they can adapt their hedging decisions, often hedging less and saving as much as 40-60% of transaction costs.
From a data perspective, treasury technology captures cash flows and market data for all asset classes in order to calculate valuations, correlations, stress tests and scenarios in real time. Fast algorithms and modern storage techniques allow many millions of data points to be analyzed and compressed into multi-factor models that simulate future cash flows.
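The simulation idea above can be sketched as a tiny Monte Carlo CFaR model with two correlated risk factors, say an FX rate and a commodity price. Everything here – the volatilities, correlation, exposure sizes, and the linear cash-flow model – is an illustrative assumption, far simpler than a real multi-factor model, but it shows how correlation feeds into the shortfall estimate.

```python
import math
import random

def cfar(expected_cf, exposures, vols, corr, confidence=0.95,
         n_sims=20_000, seed=42):
    """Monte Carlo CFaR: potential shortfall vs. the expected cash flow
    at the given confidence level, for two correlated risk factors."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_sims):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        # Cholesky factorisation of a 2x2 correlation matrix:
        e1 = z1
        e2 = corr * z1 + math.sqrt(1 - corr ** 2) * z2
        cf = (expected_cf
              + exposures[0] * vols[0] * e1    # FX shock on exposure 1
              + exposures[1] * vols[1] * e2)   # commodity shock on exposure 2
        outcomes.append(cf)
    outcomes.sort()
    worst_case = outcomes[int((1 - confidence) * n_sims)]  # lower-tail quantile
    return expected_cf - worst_case

# Illustrative inputs: 10M planned cash flow, FX and commodity exposures
shortfall = cfar(expected_cf=10_000_000,
                 exposures=[4_000_000, 2_000_000],
                 vols=[0.10, 0.25],
                 corr=-0.3)
```

Note how the negative correlation reduces the combined shortfall relative to treating each exposure in isolation – the diversification effect behind the claim that correlation-aware hedging lets treasurers hedge less.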
Big data can bring big problems without technology that can help financial professionals make sense of it. Capturing global financial data in a single cloud platform and using robust analytics and visualization can bring significant cost savings, while also providing a solid basis for business decisions.
By Philip Pettinato, Chief Technology Officer, Reval.