Floor to Electronic Trading

Alexandra Coym and Karl Ward

We all have an image of trading floors: hundreds of people yelling at each other while maniacally waving little sheets of paper in the air. Though this may have been the case 40 years ago, 82% of today’s trading is done electronically. The shift to electronic trading began in the late 1980s and 1990s, first with phone trading, then with electronic trading done in so-called “upstairs” offices close to the exchange. The new style really took off in 1992 with Globex, the first global electronic futures trading platform. Traditional trading-floor exchanges such as the New York Stock Exchange implemented their own electronic trading systems to compete with the heavily computerized and decentralized NASDAQ exchange.

Exchanges exist to bring potential buyers and sellers of securities together (either physically or virtually) and, by facilitating the process, reduce the risk of investing. People trade on two kinds of markets: the primary and the secondary market. On the primary market, securities are created (e.g. through an IPO); on the secondary market, already-created securities are traded. There are still trading floors around the world, but they mainly handle primary market trading and large institutional secondary trades, whereas most secondary market trading is done digitally.

During the ascendance of electronic trading, the traditional stock market roles of broker and specialist changed dramatically: most markets minimized the role of brokers, and some eliminated specialists altogether. Specialists (also called “designated market makers”) are similar to brokers but work a single stock, with the additional responsibility of reducing volatility in that stock when supply or demand becomes unbalanced. They do this by selling from their own inventory of the stock when demand spikes, or buying the stock when supply spikes.
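The specialist’s stabilizing behavior amounts to leaning against the prevailing imbalance. A minimal toy sketch, assuming an invented imbalance threshold and a simplified view of order flow (this is illustrative, not any exchange’s actual logic):

```python
def stabilize(buy_volume, sell_volume, inventory, imbalance_threshold=0.6):
    """Toy designated-market-maker rule: lean against order-flow imbalance.

    Returns the specialist's action: sell from inventory when demand
    dominates, buy for inventory when supply dominates, else hold.
    Threshold and rule are invented for illustration.
    """
    total = buy_volume + sell_volume
    if total == 0:
        return "hold"
    buy_share = buy_volume / total
    if buy_share > imbalance_threshold and inventory > 0:
        return "sell"   # demand-heavy: add supply from own inventory
    if buy_share < 1 - imbalance_threshold:
        return "buy"    # supply-heavy: absorb excess into inventory
    return "hold"
```

A real specialist’s obligations are far more detailed, but the core idea is this counter-cyclical rule.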

What works? What doesn’t?
As with many machine integrations that replace human activity, the advantages and disadvantages of the change are less than clear. Proponents of electronic trading tout its role in reducing execution costs via “straight-through processing” (i.e. removing the middleman). They also cite the increased liquidity it creates by bringing many more potential buyers and sellers into the market. Because electronic trading is accessible globally, trades have become geographically independent, more competitive, and more anonymous. Arguably, there is also increased transparency in pricing (due to real-time quoting and trade data) and accountability for nefarious practices by brokers (due to the electronic audit trail), though the accuracy of these claims is discussed later. Last but not least, tighter spreads (the difference between the ask and the bid price) make successful trades easier, since prices do not have to move far in either direction to turn a profit.
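The spread arithmetic is simple enough to show directly. With hypothetical quotes (the prices and share count below are invented), the spread is the round-trip cost of buying at the ask and immediately selling at the bid:

```python
def spread_cost(bid, ask, shares):
    """Cost of buying `shares` at the ask and immediately selling at the bid."""
    return (ask - bid) * shares

# A tighter spread means a smaller price move is needed to break even.
wide = spread_cost(bid=100.00, ask=100.10, shares=1000)   # 10-cent spread
tight = spread_cost(bid=100.00, ask=100.01, shares=1000)  # penny spread
```

With the penny spread, the stock only has to rise one cent for the buyer to break even; with the dime spread, it has to rise ten.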

Critics of electronic trading tend to express several systemic concerns. First, the incredible volume of electronic trades by high-frequency trading (HFT) algorithms tends to increase liquidity when markets are stable, but exacerbates volatility to a dangerous extent when markets encounter a destabilizing event. One typical HFT strategy is to withdraw all bids and offers when the market becomes unpredictable, which translates into a sudden disappearance of demand and usually a precipitous price drop. Second, HFT traders flood the quote and order systems with fake quotes in order to influence the price of stocks or to fool potential traders into believing the market is moving in a particular direction; more than 90% of all quotes are currently fake. Third, HFT computers are so fast, and so close to the data coming out of the exchanges’ computers, that they have an asymmetric advantage over human traders. These advantages make it trivial for them to jump ahead of a human trader who has put in even a modestly lucrative bid, beating that trader to the purchase and exposing them to what is known as adverse selection. As Dennis Dick from Bright Trading noted:

HFT’s are the new market makers without the traditional affirmative obligation of designated market makers to keep markets orderly. When uncertainty enters the picture, they cancel their orders and liquidity disappears. Without traditional market makers to step in and be the buyer of last resort, prices can fall quickly as we saw in the flash crash in May 2010.
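One crude way to surface the fake-quote behavior described above is a quote-to-trade ratio: a participant who posts vastly more quotes than it ever executes is a candidate for review. The threshold below is invented for illustration, not a real regulatory number:

```python
def quote_to_trade_ratio(quotes_posted, trades_executed):
    """Ratio of quotes posted to trades actually executed."""
    if trades_executed == 0:
        return float("inf")
    return quotes_posted / trades_executed

def flag_for_review(quotes_posted, trades_executed, max_ratio=100):
    """Flag a participant if fewer than 1 in `max_ratio` quotes ever trades.

    `max_ratio` is a hypothetical threshold chosen for this sketch.
    """
    return quote_to_trade_ratio(quotes_posted, trades_executed) > max_ratio
```

A firm posting 100,000 quotes against 50 executions stands out immediately under such a metric, even before examining its intent.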

With technological advancement there is always the danger of unforeseen events exceeding human control. The 1987 “Black Monday” crash was the first hint of the pitfalls of electronic trading, when the Dow Jones index dropped 23% in a single day. “Program trading” (an early name for algorithmic and electronic trading) bore the brunt of the blame, and the SEC responded by establishing a system for halting trades when the market displays excessive volatility. A more recent and dramatic example of electronic trading risk is the 2010 “Flash Crash,” in which the Dow dropped 9% within minutes. There are many theories and explanations for how this happened, but in the end they all point to computer error: one algorithm ‘decided’ to sell a large block, which caused other computers to react by panic selling and buying, or by aggressively short selling. It is hard to say how the situation might have played out if humans had been involved in the trades, but most likely the trades would not have occurred instantaneously, and multiple brokers and specialists on the floor would have caught the error before it was posted as an offer. Computerized trading works at such high speed and volume that the severity of damage possible within minutes or even seconds is hard to fathom. The algorithms are set to search for certain patterns and react in certain ways, which leaves little room for the kind of judgment calls that would have helped in the Flash Crash incident. And trading algorithms and proprietary trading platforms are closely kept secrets, which makes it difficult to regulate the practices appropriately.
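The halt system the SEC established can be sketched as a circuit breaker: compare the current price to a reference price and halt when the drop exceeds a threshold. The 7% default below roughly mirrors the first tier of today’s US market-wide breakers, but the code is a toy, not an exchange’s actual mechanism:

```python
def should_halt(reference_price, current_price, drop_threshold=0.07):
    """Toy market-wide circuit breaker: halt on a large intraday drop.

    Real US breakers use tiered thresholds (roughly 7%, 13%, 20%) and
    timed halts; this sketch shows only the core comparison.
    """
    drop = (reference_price - current_price) / reference_price
    return drop >= drop_threshold
```

The critics’ point stands even with such breakers in place: a check that runs after the drop is visible cannot keep pace with algorithms that crash the price in milliseconds.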

Who gets left behind?
With so much of trading dependent on technological solutions, smaller companies that could not keep up with the latest multi-million-dollar technology were pushed out of the market. Even for the large players, there is a constant drive to develop new and faster technology for handling trades, all of it proprietary and secret. It goes as far as trading firms paying to place their servers in the same building as the exchanges’ computers, or even on the same floor. Computerized trading also enables firms to post buy and sell prices they do not intend to honor, misleading potential investors and increasing instability in the market. The same goes for traders rapidly selling and buying back and forth between two entities they control, making it appear as though there is a lot of interest in a stock.
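That last pattern, one party trading with itself through two accounts, leaves a detectable fingerprint: matched trades whose buyer and seller share a beneficial owner. A minimal sketch, with an invented record schema (the field names are assumptions for illustration):

```python
def find_wash_trades(trades):
    """Flag trades where buyer and seller share a beneficial owner.

    `trades` is a list of dicts with hypothetical 'buyer_owner' and
    'seller_owner' fields identifying who ultimately controls each side.
    """
    return [t for t in trades if t["buyer_owner"] == t["seller_owner"]]

suspect = find_wash_trades([
    {"buyer_owner": "FirmA", "seller_owner": "FirmB"},  # normal trade
    {"buyer_owner": "FirmC", "seller_owner": "FirmC"},  # same owner on both sides
])
```

Real surveillance must also unwind layers of shell entities to establish common ownership, which is exactly what makes the practice hard to police.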

A big problem that has arisen from electronic trading is the so-called ‘Dark Pools of Liquidity’. In essence, these are large trades arranged anonymously between big financial institutions, away from the public exchanges. People use dark pools to make large trades without revealing their strategy and stirring up the market. The problem is that investors who are not participating in those trades are disadvantaged: they never see the trade beforehand, and so cannot take part in the price discovery and auction occurring behind closed doors among the pool’s participants. This brings us back to the earlier assertion that computerized trading has not necessarily made the market more transparent; instead it has created more opportunities for making it opaque.

Some would argue that the machines and their algorithms remove the bias and emotion of human participation that could affect the process; others say exactly the opposite, that this bias and emotion is necessary for successful trading beyond the short term. The fact that so many regulations on computerized trading have been introduced in recent years demonstrates that its harmful potential is larger (and weirder) than originally anticipated. With technology advancing so quickly, it is nearly impossible to control what is being used on the market unless you standardize the process or tightly regulate the network and information flow, in which case the competitive advantage of trading firms would disappear. On the other hand, as more and more restrictions are applied, the restrictions themselves can become loopholes that certain players use to their advantage.

One restriction that makes sense is a ban on dark pool trading, to restore the market transparency essential to make computerized trading fair for all players.

To contain the potential damage done by errant electronic trading and HFT, regulators might enforce a limit on trading volumes, or (better) an automatic “rate limit” that slows down trading as volatility increases. The current system of halts is not fast enough to prevent incidents like the Flash Crash, which have nearly instant global market repercussions.
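The “rate limit” idea can be sketched as an order-acceptance throttle whose capacity shrinks as realized volatility rises, so trading slows continuously instead of stopping at a hard halt. Everything here, the volatility measure, the base rate, and the scaling curve, is an assumption chosen for illustration:

```python
import statistics

def max_orders_per_second(recent_returns, base_rate=10000, vol_floor=0.001):
    """Scale the permitted order rate inversely with realized volatility.

    `recent_returns`: recent per-interval price returns.
    While volatility (standard deviation of returns) stays at or below
    `vol_floor`, the full `base_rate` is allowed; above it, the rate
    falls in proportion. All parameters are hypothetical.
    """
    vol = statistics.pstdev(recent_returns)
    if vol <= vol_floor:
        return base_rate
    return int(base_rate * vol_floor / vol)
```

Unlike a halt, which flips the market off after the damage is visible, this kind of throttle applies increasing friction as conditions deteriorate.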

To deal with the volatility problems caused by fake quotes, there should be a small penalty on canceled orders, as recommended by the SEC in its report on the Flash Crash.
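Such a penalty could be as simple as a small per-cancel fee above a free allowance, making mass quote-stuffing expensive while leaving ordinary order management free. The fee and allowance below are invented for illustration, not figures from the SEC report:

```python
def cancellation_fee(orders_placed, orders_canceled,
                     free_cancel_ratio=0.5, fee_per_cancel=0.001):
    """Charge a small fee on cancellations beyond an allowed fraction.

    `free_cancel_ratio` and `fee_per_cancel` are hypothetical values:
    here, cancels up to half of placed orders are free, and each
    excess cancel costs a tenth of a cent.
    """
    allowed = int(orders_placed * free_cancel_ratio)
    excess = max(0, orders_canceled - allowed)
    return excess * fee_per_cancel
```

An ordinary trader canceling a modest share of orders pays nothing; a firm canceling nearly everything it posts, the 90%-fake-quote pattern described earlier, pays on every excess cancel.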

