The Quantitative Services Group LLC (QSG) Publishes Analysis Revealing The Costs of Algorithmic Trading

NAPERVILLE, IL, Jan. 4, 2004 – In its recently published research study, "The Implementation Costs of Algorithmic Trading," the Quantitative Services Group LLC (QSG) reports that significant cost savings can be achieved by increasing the sophistication of an algorithm's order allocation process. The analysis examined over $1 billion in stock transactions from a large institutional investor, covering more than 120,000 unique stock executions. QSG used its T Cost Pro® transaction cost analysis platform to examine the component costs of the trading algorithms employed. The results indicated that front running, or 'information leakage,' was a major contributor to the markedly higher costs realized by one of the automated trading techniques.
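For readers unfamiliar with the convention, execution costs of this kind are typically expressed in basis points (hundredths of a percent) of the fill price relative to a benchmark such as the interval VWAP. The sketch below is illustrative only; `cost_in_bps` is a hypothetical helper, not part of QSG's T Cost Pro platform, and the prices are invented to echo the 26-basis-point figure reported in the study.

```python
def cost_in_bps(exec_price, benchmark_price, side):
    """Execution cost versus a benchmark price (e.g. interval VWAP),
    in basis points. Positive values mean underperformance.
    side: +1 for a buy order, -1 for a sell order."""
    return side * (exec_price - benchmark_price) / benchmark_price * 10_000

# Illustrative: a buy filled at 25.065 against a 25.00 VWAP
# costs roughly 26 basis points.
buy_cost = cost_in_bps(25.065, 25.00, +1)
```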

The equity manager who contributed to the study used two different automated stock trading algorithms to guide its trading decisions. The first, referred to as Non-Random, was a commonly available algorithm designed to deliver the VWAP (volume-weighted average price) for the trading period. The second, referred to as Random, was a custom algorithm that used a randomization technique to distribute orders toward the same goal. The Non-Random algorithm generated 26 basis points in total trading costs, compared with 2 basis points for the Random technique, a difference of 24 basis points.
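The distinction between the two approaches can be illustrated with a simplified sketch. This is not QSG's analysis or either manager's actual algorithm; both functions and their parameters (`n_slices`, `jitter`) are hypothetical, and real VWAP algorithms would also weight child orders by the expected intraday volume profile.

```python
import random

def fixed_interval_schedule(total_shares, n_slices):
    """Non-Random style: equal child orders at regular intervals.
    The uniform size and timing pattern is what makes the flow
    easier for other market participants to detect and exploit."""
    sizes = [total_shares // n_slices] * n_slices
    sizes[-1] += total_shares - sum(sizes)  # absorb rounding remainder
    return sizes

def randomized_schedule(total_shares, n_slices, jitter=0.5, seed=None):
    """Random style: child-order sizes drawn with random jitter,
    then rescaled so they still sum to the parent order."""
    rng = random.Random(seed)
    raw = [1 + rng.uniform(-jitter, jitter) for _ in range(n_slices)]
    scale = total_shares / sum(raw)
    sizes = [int(w * scale) for w in raw]
    sizes[-1] += total_shares - sum(sizes)  # absorb rounding remainder
    return sizes
```

Both schedules execute the same parent order; the randomized version simply removes the repeating signature that regular slicing leaves in the tape.
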

In addition, the report examines each algorithmic trading strategy's costs by trading venue. The study identified that the Random strategy, relative to the Non-Random strategy, reduced Liquidity Charges® by nearly 50% at every trading venue except NASDAQ, where the reduction was 30%. The study also found significantly higher Timing Consequences® for the Non-Random strategy, especially for NYSE executions.

The report highlights recent academic studies suggesting that when the demands of other investors are known, informed traders will engage in front running behavior. The data collected in QSG's study confirms the suspicion that algorithmic techniques that distribute orders in regular intervals and amounts often invite identification and exploitation by other market participants. "Improvements in trading technology and decreases in bid/ask spreads have effectively lowered the hurdle for those seeking to profit by reverse engineering these techniques. Increases in the use of algorithms, huge trading volume and an increasing number of trading venues have increased the opportunity for sophisticated ‘parasites’ to practice their trade," said John Wightkin, QSG Managing Partner.

The report suggests that users of automated trading algorithms must first have a measurement system that can systematically and accurately track the costs of these techniques, and that thorough consideration should be given to the cost/benefit of any 'off-the-shelf' algorithmic solution. Moreover, the report argues that a successful trading strategy should be an outgrowth of the stock selection process itself; that is, the signals that determine buy/sell candidates must be considered when deriving the appropriate automated execution strategy. Finally, QSG found that a successful solution must be more thoughtful, thorough, and dynamic than the one it replaces. Any change that oversimplifies the trading process will only reward market opportunists and betray the value of selection decisions.
