Using XSEDE shared-memory resources, researchers have begun to show how the speed of computerized stock trading may have little-understood, non-beneficial effects on the market

Strange things have been happening on Wall Street, and some of them are related to the increasing role of computers in stock trading. Earlier this year, on May 18, came the much-discussed Facebook IPO (initial public offering) on the NASDAQ exchange. After technical difficulties delayed the offering, a huge influx of orders to buy, sell and cancel overwhelmed NASDAQ’s software, causing a 17-second blackout in trading.

Photo: Mao Ye, University of Illinois, Urbana-Champaign. Ye credits XSEDE resources and consulting assistance: “Without XSEDE and shared memory, we wouldn’t be able to effectively study these large amounts of data produced by high-frequency trading.”

Suspicion immediately fell on “high-frequency trading” (HFT) — a catch-all term for the practice of using high-powered computers to execute trades at very high speed, thousands or even millions of times per second. Since the U.S. Securities and Exchange Commission (SEC) authorized electronic trades in 1998, trading firms have developed the speed and sophistication of HFT, and over the last few years it has come to dominate the market.

With HFT, profits accrue in fractions of a penny. A stock might, for instance, momentarily be priced slightly lower in New York than London, and with an algorithm in charge, an HFT trader can almost instantaneously buy and sell for risk-free profit. With HFT, traders typically move in and out of positions quickly and liquidate their entire portfolios daily. They compete on the basis of speed.
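
To make the arithmetic concrete, here is a minimal sketch in Python of the cross-market arbitrage described above; the prices and share count are hypothetical, chosen only to show how a fraction-of-a-penny gap becomes a profit.

```python
# Minimal sketch of the cross-market arbitrage arithmetic; all prices and
# quantities are hypothetical.

def arbitrage_profit(buy_price: float, sell_price: float, shares: int) -> float:
    """Gross profit from buying `shares` at one venue and selling them at another."""
    return (sell_price - buy_price) * shares

# A stock momentarily quoted at $25.0000 in New York and $25.0002 in London:
profit = arbitrage_profit(buy_price=25.0000, sell_price=25.0002, shares=10_000)
print(f"Gross profit on a 0.02-cent price gap: ${profit:.2f}")  # $2.00
```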

In June, The Wall Street Journal reported that trading had entered the nanosecond age. A London firm called Fixnetix announced a microchip that “prepares a trade in 740 billionths of a second,” noted the WSJ, and investment banks and trading firms are spending millions to shave infinitesimal slivers of time off their “latency” to get to picoseconds, trading in trillionths of a second.

Does faster equal better? HFT has happened so quickly that regulators and academics are barely beginning to delve into the complex implications. In theory, increased trade volume and improved liquidity — the ease of buying and selling — make markets more accurate and efficient. But HFT is a different beast from traditional investing, which places a premium on fundamental analysis, information and knowledge about the businesses in which you invest.

Many questions arise about fairness and about what can go wrong (computer glitches, for example) to the detriment of the market. One of the first problems researchers face, however, is that with HFT the amount of data has exploded almost beyond the means to study it — a problem highlighted by the “flash crash” of May 6, 2010. The Dow Jones Industrial Average dropped nearly 1,000 points, about 9 percent of its value, in roughly 20 minutes, the biggest intraday point drop in its history to that date. Analysis eventually revealed HFT-related glitches as the culprit, but it took the SEC five months to analyze the data and arrive at answers.

“Fifteen years ago, trade was done by humans,” says Mao Ye, assistant professor of finance at the University of Illinois, Urbana-Champaign (UIUC), “and you didn’t need supercomputing to understand and regulate the markets. Now the players in the trading game are superfast computers. To study them you need the same power. The size of trading data has increased exponentially, and the raw data of a day can be as large as ten gigabytes.”

To directly address the data problem and a number of other questions related to HFT, Ye and colleagues at UIUC and Cornell turned to XSEDE, specifically the shared-memory resources of Blacklight at the Pittsburgh Supercomputing Center (PSC) and Gordon at the San Diego Supercomputer Center (SDSC). Anirban Jana of PSC and XSEDE’s Extended Collaborative Support Services worked with Ye to use these systems effectively.

In a study they reported in July 2011, the researchers — Ye, Chen Yao of UIUC and Maureen O’Hara of Cornell — processed prodigious quantities of NASDAQ historical market data, two years of trading, to look at how a lack of transparency in odd-lot trades (trades of fewer than 100 shares) may skew perceptions of the market. Their paper, “What’s Not There: The Odd-Lot Bias in TAQ Data,” was published in The Journal of Finance, the top journal in the field. The study has received wide attention, and in September, as a result, the Financial Industry Regulatory Authority (FINRA), which oversees the securities exchanges, reported plans to reconsider the odd-lots policy.

In more recent work, Ye and UIUC colleagues Yao and Jiading Gai examined the effects of increasing trading speed from microseconds to nanoseconds. Their calculations with Gordon and Blacklight, processing 55 days of NASDAQ trading data from 2010, looked at the ratio of orders cancelled to orders executed and found evidence of a manipulative practice called “quote stuffing” — in which HFT traders place an order only to cancel it within a millisecond or less, with the aim of generating congestion. Their analysis provides justification for regulatory changes, and in September their study was described as “ground-breaking” in expert testimony on computerized trading before the U.S. Senate Subcommittee on Securities, Insurance and Investment.

Beyond Flash Crash: Odd Lots

Ye and colleagues recruited Blacklight’s shared memory to take on their study of “odd-lot” trades — specifically, the absence of them, since trades of fewer than 100 shares aren’t reported in the “consolidated tape” of trade and quote (TAQ) data. The TAQ aggregates trade data across the 13 U.S. stock exchanges as well as several off-exchange trading venues that don’t display “bid” and “ask” prices.

Volume of trades not reported to trade-and-quote (TAQ) data as a percentage of total volume, showing that the total missing odd-lot volume of about 2.25 percent in January 2008 rose to 4 percent by the end of 2009.

To assess the implications of the exclusion of odd lots, the researchers relied on two datasets — NASDAQ TotalView-ITCH and NASDAQ high-frequency trading data — that are more comprehensive than TAQ. With Blacklight (augmented this year by Gordon), the researchers analyzed a large cross-section (7,000 stocks) and a long time series of data: two years, January 2008 to January 2010. For the TotalView-ITCH data, Blacklight’s shared memory could store all the files, a total of 7.5 terabytes, at one time — saving considerable time, says Ye, for some of the analyses. Ye accessed the NASDAQ high-frequency data, about 15 gigabytes, from the research server at the Wharton School of Business.

In a series of computations, the researchers compared their comprehensive data with the commonly used (but incomplete) TAQ. They found that, driven by HFT, odd-lot trades increased from about 2.25 percent of volume in January 2008 to 4 percent by the end of 2009. The median proportion of missing trades per stock was 19 percent, and for some stocks missing trades ran as high as 66 percent of total transactions. For the two-year period they studied, they found, furthermore, that 30 percent of “price discovery” — the amount of price change during a day of trading — was due to odd-lot trades. “This is huge,” says Ye.
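
As a rough illustration of the measurement involved, here is a minimal sketch, not the researchers’ code and using hypothetical trade sizes, of how the odd-lot share of volume can be computed from a list of trade sizes.

```python
# Minimal sketch: the share of total volume coming from odd-lot trades
# (trades of fewer than 100 shares). Trade sizes below are hypothetical.

def odd_lot_share(trade_sizes: list[int]) -> float:
    """Fraction of total share volume contributed by trades of < 100 shares."""
    total = sum(trade_sizes)
    odd = sum(size for size in trade_sizes if size < 100)
    return odd / total if total else 0.0

# Mostly round lots, with a few 50-share slices mixed in:
sizes = [100, 200, 50, 50, 300, 50, 100]
print(f"Odd-lot share of volume: {odd_lot_share(sizes):.1%}")  # 17.6%
```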

Many odd-lot trades, their analysis showed, are the result of informed traders splitting orders. Suppose you want to trade 10,000 shares, but you slice it — through HFT — into 200 trades of 50 shares. “If you trade in large lots,” explains Ye, “people will guess something has happened and they can follow you. If you trade quietly through slicing into small lots, it looks to other people like no trade has happened.”
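
A minimal sketch of that slicing follows, using the numbers from Ye’s example; the parent order and child size are illustrative, not drawn from the study’s data.

```python
# Minimal sketch of order slicing: a 10,000-share parent order split into
# 200 child orders of 50 shares, each one an odd lot absent from TAQ.

def slice_order(total_shares: int, child_size: int) -> list[int]:
    """Split a parent order into equal child orders (any remainder goes last)."""
    full, remainder = divmod(total_shares, child_size)
    return [child_size] * full + ([remainder] if remainder else [])

children = slice_order(10_000, 50)
print(len(children), all(size < 100 for size in children))  # 200 True
```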

Prior to HFT, odd lots were a much smaller fraction of market activity, less than 1 percent of New York Stock Exchange volume in the 1990s, and their omission from TAQ wasn’t of major consequence. “Because odd-lot trades are more likely to arise from high-frequency traders,” the researchers write in their paper, “we argue that their exclusion from TAQ raises important regulatory issues.”

Ye and colleagues’ findings aroused discussion and were reported, among other places, in Bloomberg Businessweek. Motivated by their study, the Consolidated Tape Association, a group of stock exchange executives that administers price and quote reporting, appointed a subcommittee to look at the implications of the truncated odd-lot data. Bloomberg reported in September that FINRA planned to vote in November on whether to include odd-lot trades in the consolidated tape.

Fleeting Orders & Quote Stuffing

Relying again on the NASDAQ TotalView-ITCH data, Ye, Gai and Yao this year looked at the effects on the market of increasing the speed of trading from microseconds to nanoseconds. For this work, which mainly used SDSC’s Gordon, with some of the analysis done on Blacklight, the researchers analyzed “fleeting orders” — orders that are cancelled within 50 milliseconds of being placed.

Fleeting Orders
On August 30, 2011, about three million orders were submitted to the NASDAQ exchange to trade the SPDR S&P 500 Trust (ticker symbol SPY). This image shows that 18.3 percent of the orders were cancelled within one millisecond, and 42.5 percent of orders had a lifespan of less than 50 milliseconds, less time than it takes to transfer a signal between New York and California. More than 40 percent of orders, in other words, disappeared before a trader in California could react.
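
The percentages above come from pairing each order’s placement and cancellation times. A minimal sketch of that lifespan calculation follows; the five lifespans used here are hypothetical, not the SPY data shown in the figure.

```python
# Minimal sketch: share of orders cancelled within 1 millisecond and within
# 50 milliseconds, given each order's lifespan in nanoseconds (hypothetical).

def fleeting_shares(lifespans_ns: list[int]) -> tuple[float, float]:
    """Return (share cancelled within 1 ms, share cancelled within 50 ms)."""
    n = len(lifespans_ns)
    within_1ms = sum(1 for t in lifespans_ns if t <= 1_000_000) / n
    within_50ms = sum(1 for t in lifespans_ns if t <= 50_000_000) / n
    return within_1ms, within_50ms

# Five orders living 0.4 ms, 0.9 ms, 20 ms, 60 ms, and 2 seconds:
print(fleeting_shares([400_000, 900_000, 20_000_000, 60_000_000, 2_000_000_000]))
# (0.4, 0.6): 40 percent gone within 1 ms, 60 percent within 50 ms
```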

Studies of the May 2010 “flash crash” have suggested that HFT speed in executing and cancelling orders may have contributed to the sudden price drop. As a result, regulatory proposals have suggested a minimum quote life or a cancellation fee, which could be based on the ratio of order cancellations to transactions.

Processing data files that contain the order instructions for stocks, Ye and colleagues did an “event study,” analyzing order messages from two periods in 2010 (a total of 55 trading days between March 19 and June 7) surrounding two moments when trading speed rapidly increased. Both of these speed upgrades, which the researchers term “technology shocks,” occurred on weekends, when, the researchers note, “it is more convenient for exchanges or traders to test their technology enhancement.”

Because of this XSEDE-supported research, the Financial Industry Regulatory Authority is reconsidering the policy of excluding odd-lot trades from the consolidated tape.

Their paper, the researchers write, is “the first paper to explore the impact of high frequency trading in a nanosecond environment.” They found that as trading speed increased from microseconds to nanoseconds, the ratio of order cancellations to executions rose dramatically, from 26:1 to 32:1. Their analysis found no impact on liquidity, price efficiency or trading volume, but it did find evidence consistent with quote stuffing — a high volume of orders aimed at congesting the market.
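
For illustration, here is a minimal sketch of how such a cancellation-to-execution ratio can be tallied from a stream of order messages; the simplified event labels are assumptions for the example, not the actual TotalView-ITCH message types.

```python
# Minimal sketch: the cancellation-to-execution ratio, tallied from a list of
# order messages labeled 'add', 'cancel', or 'execute' (simplified labels,
# not the actual TotalView-ITCH message types).

from collections import Counter

def cancel_to_execute_ratio(message_kinds: list[str]) -> float:
    """Number of cancellations per execution in a stream of order messages."""
    counts = Counter(message_kinds)
    executions = counts["execute"]
    return counts["cancel"] / executions if executions else float("inf")

# 32 cancellations for every execution, the ratio observed after the
# nanosecond upgrade:
messages = ["add"] * 33 + ["cancel"] * 32 + ["execute"]
print(f"{cancel_to_execute_ratio(messages):.0f}:1")  # prints 32:1
```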

The increase in speed from seconds to milliseconds, say the researchers, may have social benefit by creating new trading opportunities, but they doubt whether such benefits will continue as speed goes from microseconds to nanoseconds or, possibly, to picoseconds. Their analysis gives justification for regulatory changes, such as a speed limit on orders or a fee for order cancellations. “While it is naive to eliminate high frequency traders,” they write, “it is equally naive to let the arms race of speed proceed without any restriction.”

