Going with the flow: Rewriting the economic rulebook with high frequency trading

18 August 2011

By Eric Schwartz, president, EMEA, Equinix

Last year, high frequency trading (HFT) and the need for speed were all the rage. Then two things happened: first, the flash crash stunned a public consciousness already reeling from crisis and recession; second, an influx of market participants, including many small start-ups, rushed eagerly to join in.

A year on, the talk is all about increasing competition between high frequency traders, their diminishing returns and strategy diversification. Investment in high frequency technologies remains strong, as does global demand for open, network neutral data centres in close proximity to markets and their financial ecosystems.

As such, the technology arms race continues, with an even greater focus on strong partners to provide the necessary scope and agility to stay in the game.

We see four key trends driving the technology arms race:

• Commoditisation and innovation.
• Real-time risk and compliance management.
• Shorter technology life cycles.
• Diversification of data sources.

Commoditisation and innovation

What was bleeding-edge technology last year has now become a commodity, yet the inexorable race to zero latency continues. This year, traders are turning to exotic hardware such as field programmable gate arrays (FPGAs) and graphics processing units (GPUs), or to clever use of traditional multi-core processors, network cards and 10 Gigabit Ethernet connectivity. They rely on Moore’s Law to keep them ahead of the competition, with 2 GHz FPGAs and 40 Gigabit Ethernet in the pipeline, each promising 4x upgrades.

Consequently, component latencies have fallen into the low microseconds, and some are even talking about nanoseconds for feed handlers. This is up to a thousand times faster than some of the mainstream legacy software solutions, especially during a microburst. However, this emphasis on unfamiliar hardware increases complexity, risks and, above all, costs. Learning curves inevitably ramp up, and, as always, advanced technology demands constant investment to keep pace with global competition. Today’s market leaders can easily become tomorrow’s followers.

Choosing the right partners is therefore vital in order to share investment costs and shorten or avoid the delays associated with the latest technology. Trading firms are increasingly colocating their servers in the most popular network neutral data centres to gain access to the wealth of colocated financial technology, network and other service providers. These firms are already deploying advanced technologies, such as warp-speed FPGAs, reflective memories and microsecond time-stamping on network cards, prepositioning them in network neutral data centres, where demand is mushrooming close to the markets.

Real-time risk and compliance management

The regulators are also raising the bar for high frequency traders. In July, for example, the Securities and Exchange Commission’s market access rule came into force. It mandates pre-trade risk and compliance checks and post-trade real-time surveillance reports across all access channels and orders, even those of in-house prop traders.

By the end of November, broker-dealers will also have to apply aggregate risk and compliance checks by customer, cross-asset and cross-market, at the same time. Interestingly, solutions for risk analytics and low latency pre-trade checks to address these requirements are already appearing in colocation data centres, using technologies such as data clouds and event stream processing or, once again, FPGAs.
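
To make the idea concrete, a minimal sketch of such an aggregate pre-trade check is given below, assuming orders that carry a customer identifier, asset class, market and notional value; the RiskGate class, field names and limit figure are purely illustrative and do not describe any particular vendor's product.

```python
# Minimal sketch of an aggregate pre-trade risk check (hypothetical example).
# Assumes each order carries customer, asset class, market and notional value.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Order:
    customer: str
    asset_class: str   # e.g. "equity", "future", "option"
    market: str        # e.g. "NYSE", "BATS"
    notional: float    # exposure the order would add if filled

class RiskGate:
    def __init__(self, per_customer_limit: float):
        self.limit = per_customer_limit
        # Running gross exposure per customer, aggregated across assets and markets.
        self.exposure = defaultdict(float)

    def check(self, order: Order) -> bool:
        """Return True if the order may be sent, False if it must be rejected."""
        projected = self.exposure[order.customer] + abs(order.notional)
        if projected > self.limit:
            return False          # breach: block before the order reaches the market
        self.exposure[order.customer] = projected
        return True

gate = RiskGate(per_customer_limit=50_000_000)
print(gate.check(Order("fund_a", "equity", "NYSE", 2_500_000)))  # True: within limit
```

The real challenge, of course, is keeping a gate like this inside the latency budget at the message peaks described next.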

To put some of these increased data volumes into context: while normal intraday volatility is down and markets are generally thin, trading flow can be very 'bursty' in terms of volume, and the growing number and firepower of high frequency traders means the peaks are ever higher. The market sell-off in August 2011 saw them exceed 5.25 million messages per second across all US equity, futures and options markets, according to marketdatapeaks.com (1). During such peaks, over five million pre- and post-trade risk and compliance checks will need to be carried out and reported per second across the market. Europe is still an order of magnitude lower, with fewer mandated checks, but that could change with the implementation of MiFID II and once electronic options trading takes off.

In 2003, one US trading firm reportedly went bust in 16 seconds because of a rogue algorithm, but only discovered it nearly an hour later. These kinds of errors would be prevented by the pre-trade checks now being implemented. As in Formula 1 racing, the faster we go, the more we need to focus on safety and the management of risk.

Just over the horizon are the new over-the-counter (OTC) trading rules that will mandate central counterparty clearing and repository reporting along with electronic swaps trading and real-time collateral and margin calculations. Large investment houses could be chasing daily margin calls for potentially 20 or 30 global central counterparties, bilateral counterparties and clearing agents. Swap margin calls can significantly amplify market turmoil when liquidity dries up, so risk managers should beware and focus on agility. Interoperability for equity clearing in Europe will only add to the complexity.

Post-trade service providers, including exchanges as well as data vendors, are starting to take the lead with real-time risk and margin data-feeds. As such, the business opportunities for colocation data centres right now seem never ending.

Shorter technology life cycles

An interesting dynamic is emerging: the closer we get to zero latency, the shorter the technology life cycle becomes and the greater the range of specialist knowledge needed to stay in the race. Traders are increasingly turning to component outsourcing for data feed handlers, analytics, benchmarking and risk management services, algorithmic development and back testing, and indeed for trading engine hosting. These choices have to be made for each geographic region, as offerings will differ. Few firms have the internal resources to keep up when everything is changing at once and volumes are spiking so dramatically. If the macroeconomic situation worsens, things could become even more challenging, so the combination of strong partners and in-house agility will be crucial.

Once again, high frequency traders, or brokers who compete with them for liquidity, are colocating their trading engines in network neutral data centres close to the markets, where low-latency services and connectivity are only a cross-connect away. Even low frequency, long-only traders who want to poll a range of dark pools on the way to market may find low-latency access and liquidity tracking services a distinct advantage.

The old paradigm of proprietary, fortress-style data centres required all technology suppliers to be individually linked in. This simply cannot keep pace with ever-diminishing technology life cycles. Instead, colocation has rewritten the economic rulebook for global trading systems.

Diversification of data sources

The fourth trend is the growing diversity of data feeds used in trading decisions. As competitors have piled in, the alpha potential of single markets has been rapidly eroded, so traders have turned to cross-market, cross-asset strategies, emerging markets and global strategies. In parallel, high frequency statistical arbitrage methods have become crucial to keep fragmented markets and asset classes aligned. Suddenly, market data requirements have exploded.

We have also seen liquidity moving intraday from one market to another as fashions change, so traders need to be sensitive to these shifts and follow them. And if traders feel that a particular venue has attracted ‘toxic’ liquidity, which is trying to game and front-run their trading strategy, then that venue must be avoided.

Trading algorithms are becoming smarter by monitoring a wide range of technical and liquidity metadata, including latencies, quote and fill statistics, update gaps, and various deviations against a consolidated book. By comparing trade timestamps with order timestamps, algorithms can estimate, for example, how close they were to capturing the trade and therefore preference a venue accordingly.
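
As a rough illustration of that kind of metadata-driven venue preferencing, the sketch below combines a venue's fill ratio with the average gap between its order and trade timestamps; the weighting, function name and figures are assumptions made for illustration, not a description of any real trading algorithm.

```python
# Hypothetical sketch: score venues from fill statistics and timestamp gaps.
from statistics import mean

def venue_score(fills: int, orders: int, order_to_trade_gaps_us: list[float]) -> float:
    """Higher is better: combine the fill ratio with how close orders were
    to the trades they targeted (a smaller gap means closer to capturing liquidity)."""
    if orders == 0 or not order_to_trade_gaps_us:
        return 0.0
    fill_ratio = fills / orders
    avg_gap_us = mean(order_to_trade_gaps_us)   # microseconds behind the trade
    # Assumed weighting: penalise venues where orders consistently arrive late.
    return fill_ratio / (1.0 + avg_gap_us / 100.0)

# Example: venue A fills more often, with smaller gaps, than venue B.
print(venue_score(fills=80, orders=100, order_to_trade_gaps_us=[40, 55, 60]))    # ~0.53
print(venue_score(fills=55, orders=100, order_to_trade_gaps_us=[250, 300, 280])) # ~0.15
```

An execution algorithm could re-rank venues on scores like these intraday, steering flow away from venues where it keeps arriving too late.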

Fading quotes and liquidity mirages are common complaints that can only be addressed with faster infrastructure and more intelligent, data-rich strategies. Colocation is an important way to speed up infrastructure for any class of trader. With network providers constantly leapfrogging each other in terms of latencies and value-added information, network neutral data centres enable traders to draw on the widest range of data sources with a minimum of plumbing. Some vendors even offer free trials of their services through a cross-connect, enabling traders to evaluate their options at low cost.

Indeed, we are now seeing hedge funds trading from multiple colocation sites and accessing multiple network carriers, perhaps with a centralised risk and execution management server. Each trading engine receives a wide range of direct and consolidated market data feeds, possibly filtered, and does its own aggregation locally, which keeps latency to a minimum. A common strategy is then to colocate the analytics processes with the relevant data source and forward only short alpha or risk signals to the trading engines, rather than the whole data feed. This saves both cost and significant latency.
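
A minimal sketch of that 'forward the signal, not the feed' pattern follows, assuming a colocated analytics process that consumes local ticks and publishes only a compact alpha message to remote trading engines; the message fields and the print-based publish transport are placeholders for whatever low-latency messaging layer a firm actually uses.

```python
# Hypothetical sketch: colocate analytics with the data source and forward
# only a compact signal, not the raw feed, to remote trading engines.
import json, time

def compute_signal(ticks: list[dict]) -> dict:
    """Reduce a burst of local ticks to a short alpha/risk signal."""
    last = ticks[-1]
    momentum = last["price"] - ticks[0]["price"]
    return {
        "symbol": last["symbol"],
        "alpha": 1 if momentum > 0 else -1,   # toy directional signal
        "ts": time.time_ns(),                 # for downstream latency accounting
    }

def publish(signal: dict) -> None:
    # Placeholder transport: a real system would use a low-latency messaging layer.
    print(json.dumps(signal))

ticks = [{"symbol": "XYZ", "price": 100.00}, {"symbol": "XYZ", "price": 100.05}]
publish(compute_signal(ticks))   # a few hundred bytes instead of the full feed
```

The payload is a few hundred bytes per decision rather than a full depth-of-book feed, which is where the cost and latency savings come from.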

News feeds and various kinds of event streams, such as corporate actions, are another rich source of alpha or risk signalling. Market sentiment towards a name can be gauged directly from news articles or via constructed news analytics. These analytics have been found to be particularly useful, for example, in predicting volume spikes or surges of volatility. Meanwhile, low-latency macroeconomic data direct from government lock-up rooms is much in demand to anticipate directional moves in the market. There also appears to be growing interest in using Twitter feeds and other extracts from the blogosphere to beat the official news agencies to the story.
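
As a deliberately naive illustration of how a news story can be reduced to a trading signal, the sketch below scores a headline against small keyword lists; commercial news analytics engines are far more sophisticated, and the word lists and scoring here are invented purely to show the shape of the output.

```python
# Hypothetical sketch: naive headline sentiment as a trading signal input.
POSITIVE = {"beats", "upgrade", "record", "growth"}
NEGATIVE = {"misses", "downgrade", "default", "probe", "lawsuit"}

def headline_sentiment(headline: str) -> int:
    """Return +1, -1 or 0 from a simple keyword count; real news analytics
    use far richer models, this only shows the shape of the signal."""
    words = {w.strip(".,").lower() for w in headline.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return (score > 0) - (score < 0)

print(headline_sentiment("ACME beats estimates, raises guidance"))   # 1
print(headline_sentiment("Regulator opens probe into ACME"))         # -1
```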

As data sources diversify and disperse, so traders are moving their analytics to the data rather than bringing the bulky and expensive data to the analytics. This may be done by buying in the analytics or else by colocating one’s own software at the data vendor’s hubs. Once again, market trends are driving traders and their suppliers to cluster together in network neutral data centres, featuring rich and mature multi-asset financial services ecosystems.

Meeting the technology arms race challenge

Speed, scale and electronic connectivity drive financial markets. As globalisation revs up, financial companies need to be everywhere: to access global, fragmented markets and information services, and to integrate with multi-asset clearing houses, a host of new services, industry utilities and a growing range of counterparties. Their processing must be flexibly embedded across the network of liquidity and able to respond to incessant technical innovation and increasing regulation. Thus, to increase agility and reduce costs, traders are clustering with their specialist partners in open, network neutral data centres that are rapidly expanding around the markets.

Because Equinix works with a wide range of trading venues, buy- and sell-side firms, low latency networks and financial services providers, it has developed a unique perspective on the emerging trends in HFT and has found that openness in a neutral facility is a key trading component. As any organisation can colocate in a neutral data centre, business opportunities are maximised and participants enjoy wide choice. This contrasts with offerings from network operator-owned data centres, where opportunities are more limited because they cannot mobilise flows outside their own network. Network neutral data centres have now become the preferred option.

Whereas the first generation of HFT focused purely on colocation with markets to achieve low-latency execution, HFT 2.0 is all about shortening the end-to-end value chain and sharing costs across the whole community of traders, markets, intermediaries and their financial technology or other service providers.

We have seen how HFT not only accelerates the race to zero latency with exotic new hardware and software solutions, but also forces the pace on real-time risk management, shortens technology life cycles and turns traditional trading architectures inside out. Instead of bringing the data to the trading algorithm, the algos are rewriting the economic rulebook by venturing out into the network neutral data centres and across the whole market fabric. We call this going with the flow.

1. Opening Cross: The Cost of a Ride on the Wall Street Rollercoaster
