Why the Smart Money is on Bitemporal

By Rupert Brown | 18 May 2015

Since 2008, the need to mitigate systemic risk across the globally interconnected financial services ecosystem has raised expectations for data management. Heightened political scrutiny of financial institutions has increased the pressure on organisations to ensure the highest quality of data on which to base their risk and capital adequacy calculations. Financial institutions need to keep pace with an evolving family of regulations, led by the Basel Committee on Banking Supervision (BCBS) and enforced through legislation. Specifically, they are expected to answer detailed questions from auditors and regulators about the aggregation methodology and lineage of every element of their trading and risk analysis data sets.

Failure to implement accurate and efficient data supply chain management imposes heavy financial burdens on institutions. In 2012, ING was fined a whopping $169 million. But without correct data and repeatable computation methodologies, the disadvantages go beyond monetary fines. Non-compliance can result in the closure of a business unit and the loss of top talent; it can also damage the brand and relationships with customers and partners. Other, even more painful consequences can ultimately include the revocation of a banking licence.

Financial institutions have generally evolved their data supply chains via a mixture of batch and message-based integration techniques with varying degrees of temporal support. In fact, the majority of information in applications is mono-temporal, i.e. it tracks data only ‘as it was recorded’.

However, data requirements have changed as the integrated world of interdependent financial services has evolved, and this approach is no longer viable. The mono-temporal approach is flawed because subsequent data adjustments overwrite what was previously recorded. For example, consider two trades, trade1 and trade2. Trade2’s execution time is amended after an administrative error is found: the trade was in fact executed earlier in the day, at 10:00. The Valid End of trade1 is adjusted to 09:59 and the Valid Start of trade2 is amended to the correct value of 10:00. The adjusted historical data is in the database, but it is now impossible to determine what the organisation thought its position was at 12:30, before the correction was applied.
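
To make this concrete, here is a minimal mono-temporal sketch in Python. The trade identifiers, the originally recorded execution time of 12:45 and the table layout are illustrative assumptions rather than details from any real system; the point is simply that the correction overwrites the only copy of the valid-time columns.

    from datetime import time

    # Mono-temporal position table: one row per trade, keyed on valid time only.
    # Trade2 was originally recorded as executing at 12:45 (illustrative value).
    positions = {
        "trade1": {"valid_start": time(9, 0),   "valid_end": time(12, 45)},
        "trade2": {"valid_start": time(12, 45), "valid_end": None},
    }

    # The administrative correction overwrites the valid-time columns in place:
    # trade2 actually executed at 10:00, so trade1's validity now ends at 09:59.
    positions["trade1"]["valid_end"] = time(9, 59)
    positions["trade2"]["valid_start"] = time(10, 0)

    # Only the corrected history remains. The pre-correction rows are gone, so
    # "what did we believe the position was at 12:30?" cannot be answered --
    # the query below can only return the corrected view.
    as_of = time(12, 30)
    held = [trade for trade, p in positions.items()
            if p["valid_start"] <= as_of
            and (p["valid_end"] is None or as_of < p["valid_end"])]
    print(held)  # ['trade2'] -- yet at 12:30 the firm believed it held trade1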

This dichotomy between “System Time” and “Valid Time” is known as bitemporality, and financial institutions are now retrofitting this capability along their data and computation pipelines. Bitemporality is as fundamental a capability for database platforms today as high availability was in the 1990s. The technology handles data along two different timelines, making it possible to rewind the information “as it actually was” in combination with “as it was recorded” at a chosen point in time. For example, a bitemporal query can ask: “What was my credit at 12:30, as I knew it at 12:30?” The answer might be: at 12:30 we thought we had credit, because the debit had not yet hit the account, but in fact we had none.

Following are a few examples of the differences between temporal and bitemporal:

Temporal – commonly used for handling data involving time ‘as it was recorded.’ The problem is that it only provides partial answers to business-critical questions.

  • Where did John Thomas live on August 20th?
  • What was the information we had on that Blue Van on October 12th?

Bitemporal – handles data along two different timelines, making it possible to rewind the information “as it actually was” in combination with “as it was recorded” at a chosen point in time. (A minimal sketch of such a query follows the examples below.)

  • Where did John Thomas live on August 20th as we knew it on September 1st?
  • What was the information we had on that Blue Van on October 12th as we knew it on October 23rd?
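
The following is a minimal Python sketch of how a bitemporal store can answer the second pair of questions. The two-interval row layout, the dates and the as_of helper are illustrative assumptions rather than any particular vendor’s API; real platforms expose the same idea through their query languages.

    from datetime import date

    # Each fact carries two intervals: valid time (when it was true in the world)
    # and system time (when the database believed it). None means "still open".
    address_history = [
        # Originally recorded: John Thomas lived at 12 High St from 1 June onwards.
        {"address": "12 High St", "valid_from": date(2014, 6, 1), "valid_to": None,
         "system_from": date(2014, 6, 1), "system_to": date(2014, 10, 5)},
        # Correction recorded on 5 October: he had in fact moved to 7 Oak Ave on 1 August.
        {"address": "12 High St", "valid_from": date(2014, 6, 1), "valid_to": date(2014, 8, 1),
         "system_from": date(2014, 10, 5), "system_to": None},
        {"address": "7 Oak Ave", "valid_from": date(2014, 8, 1), "valid_to": None,
         "system_from": date(2014, 10, 5), "system_to": None},
    ]

    def as_of(rows, valid_time, system_time):
        """Return the facts true at valid_time, as the database knew them at system_time."""
        def covers(start, end, t):
            return start <= t and (end is None or t < end)
        return [r["address"] for r in rows
                if covers(r["valid_from"], r["valid_to"], valid_time)
                and covers(r["system_from"], r["system_to"], system_time)]

    # "Where did John Thomas live on August 20th, as we knew it on September 1st?"
    print(as_of(address_history, date(2014, 8, 20), date(2014, 9, 1)))    # ['12 High St']
    # "... and as we knew it on October 23rd, after the correction?"
    print(as_of(address_history, date(2014, 8, 20), date(2014, 10, 23)))  # ['7 Oak Ave']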

A bitemporal feature in a database platform makes it possible to go back in time and explore data, manage historical data across systems, ensure data integrity, and conduct complex temporal analysis. Bitemporal data gives the business the following (a sketch of how a correction is applied follows the list):

  • A complete history (audit trail) of what you knew and when you knew it.
  • Full support for corrections (history of corrections to past, current & future business time).
  • Reproducible business perspective history as you knew it at any point in time.
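
One way such a correction could be applied, continuing the earlier sketch: the helper below is a hypothetical illustration of the append-only principle (close the superseded row’s system-time interval and insert replacements), not any specific product’s update mechanism.

    from datetime import date

    def correct(rows, superseded, replacements, today):
        """Apply a correction without destroying history: close the superseded
        row's system-time interval and append replacement rows. Nothing is
        updated in place, so every past belief remains queryable for audit."""
        superseded["system_to"] = today
        for fact in replacements:
            rows.append({**fact, "system_from": today, "system_to": None})

    # Example: the 5 October address correction from the earlier sketch.
    rows = [{"address": "12 High St", "valid_from": date(2014, 6, 1), "valid_to": None,
             "system_from": date(2014, 6, 1), "system_to": None}]
    correct(rows, rows[0],
            [{"address": "12 High St", "valid_from": date(2014, 6, 1), "valid_to": date(2014, 8, 1)},
             {"address": "7 Oak Ave", "valid_from": date(2014, 8, 1), "valid_to": None}],
            date(2014, 10, 5))
    # rows now holds three entries: the superseded belief plus the two corrected facts.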

Financial institutions like Broadridge Financial are innovative in how they use their data, creating new revenue-driving services, mitigating risk and improving business processes. Here’s how:

Regulation and Audits

Financial and insurance firms need to address increasing regulatory pressure. Bitemporal data helps them do so: it captures how the information used in decisions (including algorithmic ones) evolved, prevents information from being silently altered after the event, and supports forensic reconstruction of the business state at any point in time.

Transaction History

All transaction history needs to be preserved to clear audits. When trades are reconciled with counterparties and closed, updates can occur. Bitemporal helps ensure investment banks can always go back and see when updates occurred.

Risk Management

Risk assessment models need to factor in all history, which is a primary function of bitemporal. Institutions can also better monitor traders’ current and historical positions, and eliminate uncleared margins.

Business Analysis

With a holistic and historical view of data, organisations can better understand the business, taking snapshots at any system time.

Reduced Costs

Bitemporal support reduces the cost and operational risk of maintaining redundant data stores.

Savvy financial companies are demanding built-in bitemporal support from their database platform providers, and it’s easy to see why. Implement bitemporal today so your organisation can cut costs, achieve faster time to value and mitigate risk.


By Rupert Brown, CTO Financial Services, MarkLogic
