Keeping up with the machines: Direct Market Access and the move to real-time post-trade risk management.

8 September 2010

By James Tomlin,
director of Risk Systems,
Patsystems

Over the last few years, a growing proportion of volume on the financial markets has come from automated trading engines, capable of thousands of trades per second. Exchanges have introduced co-location facilities and other measures to facilitate this activity, and clearing firms increasingly depend on hedge funds and other algorithmic traders for large proportions of their revenue. In order to gain an advantage over competitors, these traders require direct, unmediated access to the markets, presenting a unique and pressing challenge to the clearing firms: how can they compete effectively for this lucrative business while maintaining adequate risk controls?

High frequency trading of the sort practiced by Direct Market Access (DMA) traders has come under increasing scrutiny since the so-called “flash crash” of May 2010, when financial markets suffered a brief but severe drop in value. The root cause of the problem is still the subject of investigation, but at the time it was widely thought to have been exacerbated by the combined action of automated trading engines. The SEC has subsequently proposed a rule mandating the application of pre-trade risk controls by brokers. This would effectively prohibit naked access to exchanges and alternative trading systems in the equity and equity options markets. Meanwhile, the Commodity Futures Trading Commission (CFTC) has reactivated its Technology Advisory Committee specifically to look at issues surrounding DMA in the futures markets. CFTC chairman Gary Gensler has already said there may be a role for regulatory action in this area.

Paring back latency is a constant priority for DMA traders. Their whole set-up, from locating as close to an exchange as possible, to deploying the fastest hardware and software, is geared to quickly and efficiently identifying and exploiting market opportunities. Major resources are devoted to refining their infrastructure and removing barriers to speedy interaction with the exchange.

Pre-trade risk management is one such barrier. At black box trading speeds, the fraction of a second needed to determine whether or not a trade is within agreed risk limits can be enough to close the window of opportunity for the trade itself. Since DMA traders are competing with each other to be fastest to the exchange, clearing firms compete to facilitate that speed. Consequently, the ability and willingness to forego pre-trade risk management has become a point of competition between clearing firms.

Leaving the trading to a machine may eliminate literal ‘fat-finger’ errors, but sophisticated trading algorithms are ultimately only as good - and as fallible - as the people who create them. Software glitches, unforeseen market situations and human intervention all have the potential to trigger trades in excess of agreed limits. What is more, the speed so carefully engineered into the system makes it possible for a huge volume of bad trades to be made in a very short space of time.

All of which leaves clearing firms with a real conundrum: the clients for whom pre-trade risk management would be most vital are the very clients for whom it is impractical. The result is that DMA trading activity is visible to the clearing firm - the entity ultimately responsible for the trades - only after it has taken place.

Rapid response

If errant trades cannot be prevented from reaching the exchange, then the next best thing is to identify them immediately and act swiftly. Once an overnight process, post-trade risk management is increasingly becoming a front-line defence against trading losses, demanding risk calculations not merely intra-day but in real time. To make this work, two things are required:

1. Timely, reliable data. Using back office data as the only basis for risk management is no longer a viable option; by the time an errant trade becomes visible via the back office, untold further damage can already have been done. Major exchanges provide a direct clearing feed that notifies members of trades and price changes as they happen; these should be the primary data sources driving risk management.

2. A system that can monitor and react to this data as close to real-time as possible. It is no use having access to a constant stream of near real-time data if that data is only processed periodically. Many risk systems rely on batch processing that takes a snapshot of the situation once every ten or fifteen minutes. When you consider the enormous volume of trades a typical algo black box is able to execute in that time, it becomes clear that batch processing of this kind provides massively inadequate protection.

The SEC illustrates its recommendations with the following scenario: “Today, order placement rates can exceed 1,000 orders per second with the use of high-speed, automated algorithms. If, for example, an algorithm such as this malfunctioned and placed repetitive orders with an average size of 300 shares and an average price of $20, a two-minute delay in the detection of the problem could result in the entry of, for example, 120,000 orders valued at $720 million.” (1)
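The figures in the SEC's scenario can be checked directly. The short sketch below simply reproduces that arithmetic, using only the numbers quoted above:

```python
# Parameters from the SEC's illustrative scenario (not real trade data).
orders_per_second = 1_000    # order placement rate of the malfunctioning algorithm
delay_seconds = 120          # two-minute delay in detecting the problem
avg_shares_per_order = 300
avg_price = 20.0             # dollars per share

orders_entered = orders_per_second * delay_seconds
notional_value = orders_entered * avg_shares_per_order * avg_price

print(orders_entered)   # 120000 orders
print(notional_value)   # 720000000.0, i.e. $720 million
```

The point of the exercise is how steeply exposure scales with detection delay: every additional minute of blindness adds another $360 million of notional value at these rates.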

A similar error in a leveraged instrument like futures or options could result in losses of many times that amount - enough to bankrupt the clearing member in whose name the trades were made and place severe strain on the clearing house that ultimately guarantees the integrity of the market.

In a post-trade risk management environment, catching a problem before it escalates into a disaster requires the same attention to speed as is given to the trading itself. Calculations cannot wait for batch processes; they must be triggered immediately by every significant new piece of data. Every trade fill and price change should prompt a fresh calculation, enabling a firm to know - and to take automatic action - the moment a client breaches its limits. Only with such event-driven processing can risk management systems keep pace with the trading algorithms they police.
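A minimal sketch of such an event-driven approach might look as follows. All class names, limits and the simple gross-exposure measure are hypothetical illustrations, not a description of any particular vendor's system:

```python
from dataclasses import dataclass


@dataclass
class AccountRisk:
    """Tracks one client's position and recalculates exposure on every event."""
    limit: float            # maximum permitted gross exposure (hypothetical)
    position: int = 0       # net contracts/shares held
    last_price: float = 0.0
    breached: bool = False

    def on_fill(self, qty: int, price: float) -> None:
        # Every fill triggers an immediate recalculation, not a batch snapshot.
        self.position += qty
        self.last_price = price
        self._recalculate()

    def on_price(self, price: float) -> None:
        # Price moves also change exposure, so they trigger recalculation too.
        self.last_price = price
        self._recalculate()

    def _recalculate(self) -> None:
        exposure = abs(self.position) * self.last_price
        if exposure > self.limit and not self.breached:
            self.breached = True
            # In a real system this would trigger automatic action,
            # e.g. pulling resting orders or cutting the client's session.
            print(f"LIMIT BREACH: exposure {exposure:,.0f} exceeds {self.limit:,.0f}")


# Usage: a runaway algorithm repeatedly filling 300-lot orders at $20.
acct = AccountRisk(limit=1_000_000)
for _ in range(200):
    acct.on_fill(qty=300, price=20.0)
```

The design point is that the breach is detected on the fill that causes it, rather than at the next scheduled snapshot - the gap a ten- or fifteen-minute batch cycle leaves open.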

Risk management: responsibility and opportunity

With market forces working in the opposite direction, the widespread imposition of pre-trade risk management by brokers on DMA traders remains unlikely without the introduction of a regulatory obligation. Even with such an obligation, there would inevitably be a tendency toward the minimum compliance threshold, if that were also the lowest latency option. Whatever the regulators decide, there is likely to remain a gap between the level of pre-trade risk management applied and the overall level of risk management needed to provide an adequate level of comfort to brokers.

An alternative solution, proposed by the Futures Industry Association (FIA), is pre-trade risk management at the exchange level. An FIA working group, made up of trading firms, brokerages and exchanges, has recommended that exchanges “provide basic risk management tools, and construct them in such a manner that latency is identical for all direct access firms, no matter how clearing firms utilize such tools.”(2)

The aim is to encourage clearing firms to use these tools in the most responsible fashion, “without fear that it will lose business to other clearing firms that do not act so responsibly”.(3)

Exchange-side pre-trade risk management would be a welcome development. If implemented across the board, it should increase risk protection while maintaining a level playing field for DMA traders, and go some way towards negating the systemic risks posed by the competitive erosion of risk management among clearing firms. However, the burden of responsibility for bad trades will ultimately remain with the member firms, and it is their bottom lines that remain at risk. This being the case, clearing firms will continue to require more than the “basic” risk management tools proposed by exchanges.

Furthermore, exchange-side measures are inherently limited to a view of trading activity on that particular exchange. Many DMA traders pursue multi-market strategies; with trades spread over a number of venues, it is possible for them to exceed overall limits without raising an alarm at any individual exchange. There is, therefore, a growing trend for clearing firms to take a holistic view of their clients’ risk exposure across multiple markets. Once again, competition for speed makes doing this on a pre-trade basis impractical; real-time, post-trade risk management is the best available option.

Without real-time controls in place, clearing firms risk exposing themselves to catastrophic losses. In an effort to mitigate that risk, some firms may require high margin payments from their clients. Not only is that an imperfect risk mitigation strategy, it also puts the firm at a competitive disadvantage compared to firms with lower margin requirements. Real-time risk management, using standard margining methodologies such as SPAN, enables brokers to make lower margin calls while still being confident that those margins are sufficient. The broker is therefore better equipped both to attract more DMA business, and to take on that additional business without increasing its risk profile unduly. In short, a robust, real-time risk management regime is not only essential for catastrophe avoidance, but also enables brokerage firms to grow their DMA business.
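The margining idea can be illustrated with a toy scenario scan. This is in the spirit of the scanning-risk component that underlies scenario-based methodologies such as SPAN, but it is emphatically not the actual SPAN algorithm; the scenario grid, the 6% scan range and the client position are all hypothetical, and contract multipliers are ignored:

```python
def scan_margin(position: int, price: float, scan_fraction: float) -> float:
    """Worst-case loss over a grid of price-move scenarios.

    A simplified, hypothetical scenario scan: only the underlying price
    is shifted, up and down in thirds of the scan range. Real scenario
    margining (e.g. SPAN) also scans volatility, spreads and more.
    """
    scan_range = price * scan_fraction
    scenarios = [f * scan_range for f in (-1.0, -2/3, -1/3, 0.0, 1/3, 2/3, 1.0)]
    # Loss from the position holder's perspective: a price move against
    # the position produces a positive loss figure.
    losses = [-(position * move) for move in scenarios]
    return max(losses + [0.0])  # margin is never negative


# Hypothetical client: long 100 contracts at $50, with a 6% scan range.
margin = scan_margin(position=100, price=50.0, scan_fraction=0.06)
print(margin)
```

Because the scan is symmetric, a short position of the same size attracts the same margin; the worst case is simply a full scan-range move against the position. Recalculating such a figure on every fill and price event, rather than relying on a blanket margin buffer, is what allows the broker to keep margin calls both low and sufficient.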

1. SEC rule proposal, “Risk Management Controls for Brokers or Dealers with Market Access” (page nine)
2, 3. FIA, “Market Access Risk Management Recommendations” (page three)
