At the Real-Time Technology Roundtable, held at BT’s headquarters in the City of London last week on 19 January, the focus was once again on ‘big data’. It is expected to be one of the top technology topics of 2012, in what is predicted to be a tough 12 months for IT spending.
Organised by the Financial Information Services Division of the Software & Information Industry Association, the SIIA FISD Real-Time Technology Roundtable focused on how the huge increase in data volumes, and in the analytical computing power available to interrogate them, will impact the market data and reference data segments – not to mention the financial markets as a whole. The lively discussions reminded me of a comment made by Mike Lynch, the founder of Autonomy, at a Bloomberg Tech Summit I attended last December in my old job: that “massive change is imminent” in the IT sector. Big data is at the heart of this inflection point in the industry.
Speaking at the SIIA FISD Real-Time Technology Roundtable, Darren Lewis, head of CityVision Development at Arcontech, discussed how technology approaches will in future be shaped by the need to accommodate big data flows and analytical techniques. He also pointed out that the big data trend strengthens the case for standardisation and common Application Programming Interfaces (APIs). Chris Pickles, one of the other four speakers to give a short 10-minute presentation at the event – before the half-hour panel discussion at the end – added that market data is probably the biggest dump of data going into a bank’s dealing rooms. The head of industry initiatives at BT’s financial markets and wholesale banking unit argued that by combining this with smaller data flows from post-trade payments, for instance, and placing it all in one unified comms ‘pipe’, firms could save money by consolidating their big data processing needs.
The relevance or otherwise of cloud computing; social media interactions, especially the move to mobile; and the value of unstructured data were all discussed at the event. Robin Manicom, Equinix’s head of financial services in Europe, is no doubt hoping that the sheer amount of data now being created by businesses across all sectors will drive a surge in business for his co-location and data centre operating firm. Analysing that data also requires the kind of computing power that large data centres provide.
Martin Millstam, a solutions consultant with Exegy, a firm focused on delivering faster processing in the high-frequency trading space, talked about how market infrastructures, such as exchanges, will have to scale up to handle increased data flows, with Nasdaq, BATS and others already moving to increase their connectivity and capacity.
During the panel session at the end of the roundtable – which was really more of a short conference – Michael Le Lion, executive vice president of Panopticon Software, explained that he wanted to add another V – for visualisation – to the four that Andrew Delaney, editor-in-chief of the A Team Group, had already spelled out as the key issues in the field of big data: namely, value, volume, velocity and variety. As Le Lion works for a data visualisation firm, his addition is not that surprising, but he is quite right to stress the need for a good overview of the real-time data flowing around firms and the financial markets.
Stuart Grant, Sybase’s FS EMEA business development manager and the final member of the panel, highlighted how data can now be analysed in flight, before it is even stored, as well as the large part cloud computing will play in how organisations view and use data in the future. The fast-rising new job title of Chief Data Officer, and his or her role in handling this data and analytical explosion, was also discussed at the event – truly a more welcome CDO than the one financial services has given us previously.
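To make the in-flight idea concrete, here is a minimal sketch of analysing a tick stream as it arrives, before anything is persisted. All names and the tick format are illustrative assumptions, not Sybase’s actual product API:

```python
# Hypothetical sketch: update running statistics per tick, in flight,
# rather than writing everything to a database first and querying later.

class InFlightAnalyser:
    """Maintains running statistics over a stream of (symbol, price, size) ticks."""

    def __init__(self):
        self.count = 0        # ticks seen so far
        self.notional = 0.0   # cumulative price * size
        self.volume = 0       # cumulative size

    def on_tick(self, symbol, price, size):
        # State is updated as each tick arrives -- no storage round-trip.
        self.count += 1
        self.notional += price * size
        self.volume += size

    @property
    def vwap(self):
        # Volume-weighted average price over everything seen so far.
        return self.notional / self.volume if self.volume else 0.0


analyser = InFlightAnalyser()
for tick in [("VOD.L", 100.0, 50), ("VOD.L", 102.0, 150)]:
    analyser.on_tick(*tick)

print(round(analyser.vwap, 2))  # 101.5
```

Only the running aggregates need to live in memory; the raw ticks can be stored later, or not at all.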