This week has been mostly spent finishing research for my paper and starting to write the rough draft. I had a Skype call with Piero La Mura, who is one of the leading experts on quantum computing and the financial system. His research and our talk were very interesting, but I’m not entirely sure whether I can incorporate it into my work at this moment, or whether it will lead to projects in the future.
High Frequency Geography: Mapping the Materiality of the Global Financial System
This paper focuses on recent directions in my artistic practice exploring the relationship between finance and geography. In the past 15-20 years, the global financial system has come to rely more and more on computer-mediated trading practices, known as algorithmic trading. Within this larger field, one area that has generated particular interest is high frequency trading, where profits hinge on the speed with which algorithms can react to fluctuations in market prices. In a global economic system where activities in Tokyo might affect algorithmic decisions in New Jersey, high frequency trading has come up against the fundamental limit of physical reality – the speed of light. This paper describes how trading activities have shifted away from human actors on trading floors towards algorithmic actors inside data centers, with a focus on the underlying infrastructure that these electronic trading exchanges rely on. I frame my research within a discussion of previous works by artists addressing the complex global financial system.
Two serendipitous things happened today.
Toby sent me an article on a plan in the works where three different companies – one Russian, one Canadian and one American – are all investing heavily in laying down high speed fiber optic cables that would traverse the arctic circle and provide much faster connections between the US, Western Europe and Japan. Despite the huge investment and undertaking (each cable is estimated to cost between $600 million and $1.5 billion, and will reduce latency between London and Tokyo by 30%), this is only possible because global warming has caused the polar ice caps to recede significantly in the last few years.
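As a sanity check on that 30% figure, here is a rough back-of-the-envelope latency calculation. The route lengths (roughly 21,000 km for existing London-Tokyo routes, roughly 15,000 km for an arctic shortcut) and the two-thirds-of-c speed for light in fiber are my own assumptions for illustration, not numbers from the article:

```python
# Back-of-the-envelope latency estimate for the proposed arctic route.
# Route lengths and speeds are illustrative assumptions, not measurements.

C_FIBER_KM_S = 200_000  # light in optical fiber travels at roughly 2/3 of c

def one_way_latency_ms(route_km: float) -> float:
    """One-way propagation delay over a fiber route, ignoring equipment delays."""
    return route_km / C_FIBER_KM_S * 1000

# Existing London-Tokyo routes run roughly 21,000 km; an arctic cable could
# shorten that to roughly 15,000 km (hypothetical figures).
current = one_way_latency_ms(21_000)
arctic = one_way_latency_ms(15_000)
print(f"current ~{current:.0f} ms, arctic ~{arctic:.0f} ms, "
      f"reduction ~{(1 - arctic / current) * 100:.0f}%")
```

With those assumed distances, propagation delay drops from about 105 ms to about 75 ms one way – a reduction of roughly 29%, in the same ballpark as the article’s claim.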
I believe that this cable, this story, really brings together the connections between the financial system and the environment that I’ve been trying to deal with in more metaphorical ways (with the Unity landscape). Currently, I’m trying to figure out how to relate the speed gained by investing billions into these cables (an idea only made possible by human impact on our planet) to the effects that the financial system, and the infrastructural feats we are willing to undertake in its name, will continue to have on the environment. Here is the map of the planned cable.
I also met with Heather earlier in the day about my paper topic for her research studio class and my thesis project. Since I haven’t actually completed the projects I want to make, I can’t necessarily write the artist’s paper she had in mind for me. I still want to frame what I’ve been researching in that vein, but I may have to focus on other artists who have tried to do similar things.
So that got me looking at a book I’ve had for a while called Else/Where: Mapping New Cartographies of Networks and Territories. Flipping through, I came across a diagram that immediately resonated with me in light of the previous map.
The diagram, called “Centers and Peripheries,” was originally made by geographer Denis Retaillé in 1992, but included in a 1994 volume on the “globalization of capital” by the economist François Chesnais. In his chapter “Counter Cartographies” Brian Holmes discusses the map.
This map shows three things. First, a circuit linking the United States, Western Europe and Japan, the so-called “Triad” regions, which form a “global oligopoly” accounting for the majority of industrial and financial exchanges. Second, the major nodes of the world network, represented by densely outlined circles. And third, the hierarchical relations between the regions, as described with these categories: center; periphery integrated to the center; annexed periphery; exploited periphery; abandoned periphery. Chesnais performs a Marxist analysis, showing how globally fragmented production lines are coordinated through the computerized circuits of the financial sphere. His map describes the hierarchy of social relations in a post-national era, when no political formation can erect any substantial barrier to the dictates of capital. And it reveals the near-perfect correlation between the graph of virtual flows and the geography of human exploitation.
I need to think about the relationship of these diagrams a bit more, but it’s as if one is predicting the existence of the other.
This past week has been a whirlwind of speaking to experts and consultants about my research. In chronological order, these are the people I’ve spoken to with a few notes from our conversations.
Meeting with Nancy was wonderful. She immediately got my concept and was really good about offering references that she thought might be relevant. The first thing we discussed were waterlots, and a project we’d both seen at the CCA Curatorial MFA show in 2010 (I guess we’re both from the Bay Area). Sandra Nakamura makes installations with pennies that represent larger values connected with the price of land. For this piece, she took a grant from CCA for the amount it would have cost, during the Gold Rush, to buy the waterlot the school sits on, and converted that amount into pennies.
She also referenced the Propeller Group, a collective from Vietnam who were in the last Triennial at the New Museum. They did a project where they re-branded Communism, overlaying two opposing forces in a way that makes the viewer confront the absurdity of the capitalist machine.
She told me to look at Carsten Höller’s work, since his approach to making art is grounded in his scientific background. She told me that the slides are about doubt, which is an interesting and non-interactive take on them.
Find a poetics about the technical
Try to make connections; 5-6 thought experiments
Do something that confronts the body – where the emotion lies
Look at Xavier Le Roy (choreographer) and Cassie Thornton, who has an excellent project where she turns people’s debts (bank statements, bills) into nuggets of paper mache gold, or bling.
Exercise to examine the core mechanics of a prop – light, heavy, bouncy. Emotion as an end goal.
Sean McIntyre, a first year who does a lot of work with mesh networks, had a previous life as a high frequency trading programmer. It’s sort of the best case scenario, since he gets what we do here at ITP, and he’s not under an NDA like all current high frequency traders. He was nice enough to sit down with me last week and tell me a little bit about how the system worked (from his experience). He worked at Virtu, and did a lot of quality assurance, which was basically making sure that the algorithms worked properly before they “unleashed the beasts” (his words). He confirmed that they colocated their algorithms in three locations – Carteret, Weehawken and Secaucus – most likely so that they could communicate with the trading exchanges nearby. At this point (2008-2010), NYSE didn’t have their Mahwah data center built yet.
GETCO was the company to beat
Bankruptcy in seconds (if things went bad)
Arbitrage across data centers
Citigroup stock was consistently in the top 5 for volume. Volume was the biggest indicator for HFT, much more important than closing price. They liked Citigroup because it was relatively cheap and predictable.
In terms of liquidity rebates and transaction fees (other factors beside bid and ask spread that affect HFT algo decision-making), these are negotiated individually between the exchange and each trading company. This is one of the reasons that Virtu poached Chris Concannon, a former VP of Nasdaq, due to his connections and ability to negotiate better prices for their company.
Rebates are tiered according to a firm’s performance. More volume, lower fees
Chronos – their name for a time-based strategy
Algos usually incorporated multiple strategies
Not so sure about the HFT algos that lure others by buying and cancelling – you can easily piss off an exchange by spamming them with buy/cancel orders
HFT has a data processing problem
Nasdaq exchange protocol = FIX protocol
Every exchange has a different standard of sending messages, need to figure out how to get them all to talk to each other
Messages from data center in 1 of 2 formats
Whole book or stock specific
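Sean’s point about every exchange having its own message standard is essentially a normalization problem: each feed needs an adapter that translates it into one internal representation. A toy sketch of the idea – the pipe-delimited format is entirely invented, and while the tag numbers in the second parser are real FIX tags (55=Symbol, 54=Side, 44=Price, 38=OrderQty), the framing is drastically simplified:

```python
# Toy illustration of feed normalization: two hypothetical exchange formats
# are parsed into the same internal dict. Not real exchange protocols.

def parse_pipe_feed(msg: str) -> dict:
    """Hypothetical exchange A: 'SYMBOL|SIDE|PRICE|SIZE'."""
    symbol, side, price, size = msg.split("|")
    return {"symbol": symbol, "side": side,
            "price": float(price), "size": int(size)}

def parse_fixlike_feed(msg: str) -> dict:
    """Hypothetical exchange B: FIX-style 'tag=value' pairs separated by ';'."""
    fields = dict(pair.split("=") for pair in msg.split(";"))
    return {"symbol": fields["55"],
            "side": "B" if fields["54"] == "1" else "S",
            "price": float(fields["44"]), "size": int(fields["38"])}

# Both feeds now produce the same internal structure:
a = parse_pipe_feed("C|B|4.85|1000")
b = parse_fixlike_feed("55=C;54=1;44=4.85;38=1000")
assert a == b
```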
Last Thursday I spoke with Petter Kolm from Courant’s Mathematical Finance department, which has direct connections to Wall Street firms.
He told me that HFT algorithms are actually not that complex, just operate really fast
He suggested that I might model one simple system – and change parameters over time
If volatility in the market goes up, algorithms become more aggressive
Trading – sell-side activity, a service to customers to minimize transaction costs – “agency algorithms, sell-side algos”
Aggressive HFT – one strategy is to pick off those large orders, and buy ahead in order to sell them the stock they want at a profit
Passive – place limit orders in the book
Limit – spread based on supply/demand
Aggressive HFT – instead of providing liquidity, you take it
Prop Trading Firms, Hedge Funds & HFT firms all employ different strategies at different frequencies
Market Making – they can be at the top of the limit order, buy low, sell high
colocated latency, 3-4 ms in exchange, longer if outside
Dark Pools – 30 dark pools exist
-they’re listed on the web
-just “another form of electronic trading”
-allow people who want to trade larger amounts of shares at once to execute them in one go, without the market seeing it and changing the price of the trades – “slippage”
-in dark pools, trades are marked at the midpoint between the bid and ask prices, and don’t have to reach the ask.
-in reality, the avg size of a trade in dark pools isn’t as large as what they were designed to accommodate
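Petter’s suggestion to model one simple system and vary its parameters over time could be sketched roughly like this: a mean-reverting price series, a toy “aggressiveness” parameter that grows with measured volatility, and the dark-pool midpoint rule from the notes above. All the numbers, and the linear volatility-to-aggressiveness mapping, are my own placeholder assumptions, not anything Petter specified:

```python
# A minimal sketch: simulate a mean-reverting (Ornstein-Uhlenbeck-style)
# price series, then scale a toy agent's "aggressiveness" with recent
# volatility. All parameters are invented for illustration.
import random

def simulate(steps=1000, mean=100.0, reversion=0.05, vol=0.5, seed=42):
    random.seed(seed)
    price, prices = mean, []
    for _ in range(steps):
        price += reversion * (mean - price) + random.gauss(0, vol)
        prices.append(price)
    return prices

def rolling_vol(prices, window=20):
    """Standard deviation of the most recent window of prices."""
    recent = prices[-window:]
    m = sum(recent) / len(recent)
    return (sum((p - m) ** 2 for p in recent) / len(recent)) ** 0.5

def aggressiveness(sigma, base=1.0, sensitivity=2.0):
    # "If volatility in the market goes up, algorithms become more
    # aggressive": here aggressiveness simply grows linearly with volatility.
    return base + sensitivity * sigma

def dark_pool_price(bid, ask):
    # Dark pools mark trades at the midpoint between bid and ask.
    return (bid + ask) / 2

prices = simulate()
sigma = rolling_vol(prices)
print(f"vol={sigma:.3f}, aggressiveness={aggressiveness(sigma):.3f}")
print(f"dark pool midpoint for 4.84/4.86: {dark_pool_price(4.84, 4.86):.2f}")
```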
Very helpful, offered to put me in touch with Knight Capital Group, who he did dataviz for. They gave him a full day of intraday trading data for various stocks to visualize. He said I could use his name and perhaps they’d offer something similar.
He said the map sounded really interesting, referenced “They Rule”, Kevin Slavin, etc. Something he would “tweet”
He said to look at the Nanex crop circles and pick them apart with someone who might know what’s going on – they could be an opportunity to visualize
Went to office hours with Tom for his advice on the “understanding networks” angle to my research and project. He had some great ideas in terms of distilling my concept and how to proceed
Some notes from our meeting:
Not just actual locations, but distances/speeds in relation to how fast packets can travel. For instance, different fiber cables will connect the same two locations at different speeds.
How has HFT changed the daily workflow for traders? (Trying to see if it has effects on human actors, or if things have changed over time because of it)
In addition to talking to quants, reach out to fund managers to see if this has changed the way they manage their team?
How might it affect a fund manager vs a specific stock trader?
If HFT injects liquidity it also injects volatility
Look at the GPS spoofer article again with the faster-than-light neutrino debunking in mind – basically, researchers thought they had found particles traveling faster than the speed of light, but it turned out to be an error in their GPS signal data
What do I need to ask a quant vs a trader?
Have the principles behind shorting changed (because of HFT?)
How have human trading patterns changed since HFT
Arbitrage – pure inter market arbitrage, other strategies
Tell me about some of the different methods that traders use, pattern of those methods over time, both manually and algorithmically
In addition to these first person sources, I have also been reading about human perception of time, and the time it takes to process actions and become conscious of them. I read a chapter from The User Illusion, by Tor Norretranders, that described an experiment done by Benjamin Libet in which he attempted to determine the timing and order of people’s consciousness of their own actions. Essentially, people react before their conscious brains do: the impulse to act originates in the body, and our brains explain it after the fact by saying they “wanted” to do something.
Another interesting thing the reading referenced is Wilhelm Wundt’s complexity clock – a clock whose hand takes about 2.56 seconds to make a full rotation. People can still visually register the 3 o’clock, 7 o’clock (etc.) positions around the dial, so they can pinpoint smaller amounts of time more easily than by just trying to sense what time it was when they made a decision.
I’m also reading up on whitepapers about GPS and different high frequency trading strategies, which I will summarize in another blog post.
Here are some thoughts on three papers related to the qualities of high frequency trading when analyzing them in terms of small increments of time at incredibly high speeds.
High frequency trading has been in the news as of late. People have been forwarding me a few insightful articles that led me to new journal articles. This well-written article referenced this paper:
Hasbrouck, J., & Saar, G. (2011). Low-Latency Trading. Working paper (September 2011).
It gave me a better understanding of the difference between what they call Agency Algorithms (which I believe other papers have referred to as ‘passive’ algorithms) and Proprietary Algorithms. Agency algorithms are used by large institutions when buying or selling many orders at once, in order to time them so as to “reduce slippage” and keep as much profit on the order as they can. These algos still look to larger market trends, and might suggest to a human trader which stock(s) to buy or sell, but the trader would most likely determine volume, then execute the order via Agency Algos.
Proprietary Algorithms are what actually qualify as “low-latency algos,” or aggressive high frequency trading. These try to game the speed of the system itself, baiting other algorithms to place an order so that they can pounce and do it first. The patterns of these algorithms are a lot of buy-cancel-execute orders within millisecond periods of time, in an attempt to confuse the other algorithms out there and profit before they can.
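The agency side is easier to sketch than the proprietary side. A minimal TWAP-style slicer, which splits a large parent order into evenly sized child orders so the full size never hits the book at once – fixed, equal slices are an assumption here, since real agency algorithms adapt slice sizes to volume and price:

```python
# Toy "agency algorithm": slice a parent order into equal child orders to
# reduce slippage. Real implementations adapt to market conditions.

def twap_slices(total_shares: int, num_slices: int) -> list[int]:
    """Split a parent order into evenly sized child orders, spreading any remainder."""
    base, remainder = divmod(total_shares, num_slices)
    return [base + (1 if i < remainder else 0) for i in range(num_slices)]

# A 100,000-share parent order worked over 8 intervals:
print(twap_slices(100_000, 8))   # → eight child orders of 12,500 shares
assert sum(twap_slices(100_000, 8)) == 100_000
```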
Another article that many people have emailed me is Wired’s article on how HFT could negatively affect markets. The article was primarily a synopsis of this paper:
Johnson, N., Zhao, G., Hunsader, E., Meng, J., Ravindar, A., Carran, S., & Tivnan, B. (n.d.). Financial black swans driven by ultrafast machine ecology. Physics.
I found this paper incredibly compelling. The authors look at periods of time less than 650 ms, which is the threshold of human response time. As an example, they cite that 650 ms is the time it takes a chess Grandmaster to realize they are in trouble. This gives context to the transition away from “traditional human-machine systems,” where human oversight is possible if changes to the system are observable within human response time.
They describe the global financial system as governed by “the self-organized activity of a global collective of trading agents, including both humans and machine algorithms.” Since this system operates without much oversight or a “real-time controller,” the study urges researchers to develop a “scientific theory for the underlying human-machine ecology on these ultrafast timescales.”
They use the term “black swan” to describe events in the market that reflect extreme volatility, or jumps in pricing. Their definition: the stock price had to tick down (or up) at least ten times in a row before ticking up (or down), and the price change had to exceed 0.8%. They reference Francis Bacon in their interest in studying these “black swan events,” as “it is in such moments that a complex system offers glimpses into the true nature of the underlying fundamental forces that drive it.” They also determine that the nature of black swan events changes fundamentally as “the duration threshold is reduced beyond typical human reaction times.” The paper reflects a shift away from a market made up of mixed decisions between humans and machines, in which humans have time to assess information, to a system primarily governed by ultrafast machines dictating pricing.
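Their black swan definition is concrete enough to sketch in code. This is my reading of it (a run of at least ten same-direction ticks whose cumulative change exceeds 0.8%), not the authors’ own implementation:

```python
# Scan a tick series for "black swan" runs per my reading of the paper's
# definition: >= 10 consecutive same-direction ticks, cumulative move > 0.8%.

def find_black_swans(ticks, min_run=10, min_change=0.008):
    """Return (start, end) index pairs of runs matching the definition."""
    events = []
    i, n = 0, len(ticks)
    while i < n - 1:
        direction = 1 if ticks[i + 1] > ticks[i] else -1
        j = i
        while j < n - 1 and (ticks[j + 1] - ticks[j]) * direction > 0:
            j += 1
        run_length = j - i                    # number of same-direction ticks
        change = abs(ticks[j] - ticks[i]) / ticks[i]
        if run_length >= min_run and change > min_change:
            events.append((i, j))
        i = j if j > i else i + 1
    return events

# A crafted series: twelve consecutive 0.1-point down-ticks from 100 (a 1.2%
# drop), then a reversal – one qualifying event.
series = [100 - 0.1 * k for k in range(13)] + [99.0]
print(find_black_swans(series))   # → [(0, 12)]
```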
These articles have made me intrigued by the “quantum” properties of HFT at the sub-second time level. In the same way that Newtonian physics gives way to different behaviors at the subatomic level, financial systems (generated from the interplay of programmed agents collectively displaying complex behaviors) seem to have analogous properties at the sub-second level – namely at time periods below the threshold of human perception: 650 ms. As Wired states, “While market behavior tends to rise and fall in patterns that repeat themselves, fractal-style, in periods of days, weeks, months and years, ‘that only holds down to the time scale at which humans stop being able to respond,’ said Johnson. ‘The fractal gets broken.’”
These areas of inquiry led me back to the first paper I read on this topic, when my interests were primarily on colocation and the effect of distance on high frequency trading. I went back and re-read it; some things were made clearer, and others not. There is a lot of math there, but here’s what the paper describes, in my understanding.
Wissner-Gross, a., & Freer, C. (2010). Relativistic statistical arbitrage. Physical Review E, 82(5), 1-7. doi:10.1103/PhysRevE.82.056104
1. The speed of HFT – with typical trade latencies below 500 microseconds – has made the time it takes information to travel over distance relevant. Basically, firms are bumping up against a fundamental physical constant, the speed of light.
2. The paper calculates optimal nodes for communication with multiple exchanges.
3. “Within financial markets, the relevant time series are typically the logarithms of the prices (log-prices) of financial instruments.” (look into log prices)
4. They use the Vasicek model to describe the behavior of “correlated financial instruments.” Based on Brownian motion.
5. The optimal intermediate location simplifies to a midpoint between the two trading centers, weighted by their speeds of mean reversion, which in turn scale with market turnover velocities.
6. Note that while some nodes are in regions with dense fiber-optic networks, many others are in the ocean or other sparsely connected regions, perhaps ultimately motivating the deployment of low-latency trading infrastructure at such remote but well-positioned locations.
7. “Such slowing or stopping of the propagation of pricing information due to arbitrage is somewhat analogous to the refraction and scattering of light by a dielectric medium, but novel in an econophysical context…This result also raises the possibility of establishing arbitrage analogs of other concepts from optics and acoustics, such as reflection and diffraction.” Perhaps there’s something here I can tap for an installation.
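Point 5 above can be sketched very loosely in one dimension: treat the two exchanges as points on a line and compute the reversion-weighted midpoint. This is my simplification for intuition’s sake, not the paper’s full relativistic calculation:

```python
# Loose 1-D sketch of the weighted-midpoint idea from Wissner-Gross & Freer:
# the intermediate node is a weighted average of the two exchange positions,
# with the mean-reversion speeds of the traded instruments as weights.
# (My simplification, not the paper's actual derivation.)

def optimal_node(x1: float, x2: float, reversion1: float, reversion2: float) -> float:
    """Reversion-weighted midpoint between two exchange positions."""
    return (reversion1 * x1 + reversion2 * x2) / (reversion1 + reversion2)

# Equal reversion speeds give the plain midpoint...
print(optimal_node(0, 1000, 1.0, 1.0))   # → 500.0
# ...while a faster-reverting market pulls the node toward itself:
print(optimal_node(0, 1000, 3.0, 1.0))   # → 250.0
```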
I’m still trying to determine how to tie what I’d like to build for Thesis, with the research, programming, and writing I’ll be doing in Algorithms, Research Studio.
This week has been productive in some ways, but I’m still feeling a bit behind where I’d like to be in terms of determining the best direction to take in regards to “visualizing” high frequency trading, and conveying its qualities to a non-financial audience.
I submitted a proposal for an article to Triple Canopy magazine that would explore the colocation aspect of High Frequency Trading. I will see whether they’d like to work with me to pursue this research by early March. If they don’t, I intend to use the paper I write in Research Studio to reflect the direction I proposed to Triple Canopy. Hopefully this will become settled once I really determine the focus of my research.
After I was able to get the Bloomberg API running in Eclipse, I realized (with Dan Shiffman and Heather’s guidance) that I wouldn’t be able to access any of the data without an active subscription. Apparently the API for Bloomberg software is open, but the data itself is still commodified, and very expensive to access. I wrote to them inquiring about student discounts, but received a reply asking me if real-time data was totally crucial to my project. At this point it’s not, but having access to the Bloomberg API would allow me to call so much data so easily that it’s a shame I can’t work that way. Instead, I’ve resorted to downloading CSV files of historical data from Yahoo Finance, which gives me the bare minimum of datasets, but which are probably sufficient for now. I can access Date, Open, Close, High, Low, Volume and Adjusted Close. From looking into the metrics of the stock market a bit more, I realized that the Close value and the Volume are what everyone watches.
If I were to write my own HFT algorithm, I would need access to a given stock’s buy and sell pricing, as well as the share volumes available at those prices. As far as I know, this data is not available in the free historical datasets. I will have to keep looking into alternatives, or make the case to the Bloomberg people I’ve been corresponding with that access to the API is essential to the project.
I began using the historical datasets to see if I could visualize the data in 3D space. One aim is to generate a 3D landscape from financial data. I worked with Miguel Bermudez, who wrote a CSV parser in C++ (thanks Miguel!), and then we brought the data into OpenFrameworks. We debated various avenues, but decided that feeding the data into a noise function would be an easy way to generate nice looking topography. It’s definitely not the most accurate, as 2D noise essentially turns the data into a lot of structured noise, but the effect makes for a nice way to extrude the texture into 3D space, mapping brightness to the y axis. Miguel found a nice tutorial on noise in C++ that we followed, in order to generate some “financial noise.”
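We built this in openFrameworks/C++, but the idea fits in a short Python re-sketch: each closing price indexes into a noise function, and the noise value becomes the terrain height (what we mapped to brightness, then extruded). The stdlib has no Perlin noise, so a simple hash-based value noise stands in for it here; the sample closing prices are made up:

```python
# Re-sketch of the "financial noise" terrain idea: closing prices seed a
# noise lookup whose value becomes terrain height. The real project used
# openFrameworks in C++ with a proper Perlin noise function.
import math

def value_noise(x: float, seed: int = 0) -> float:
    """Smoothly interpolated hash-based pseudo-random noise in [0, 1]."""
    def hash01(i: int) -> float:
        h = (i * 374761393 + seed * 668265263) & 0xFFFFFFFF
        h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
        return (h & 0xFFFF) / 0xFFFF
    i, frac = int(math.floor(x)), x - math.floor(x)
    t = frac * frac * (3 - 2 * frac)          # smoothstep interpolation
    return hash01(i) * (1 - t) + hash01(i + 1) * t

def terrain_heights(closes, scale=0.1, amplitude=100.0):
    """Map a series of closing prices to terrain heights via noise."""
    return [amplitude * value_noise(c * scale) for c in closes]

closes = [27.4, 27.9, 28.3, 27.1, 26.8]       # made-up closing prices
heights = terrain_heights(closes)
assert all(0 <= h <= 100 for h in heights)
```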
Next step is to map the y coordinates to the brightness of the image. Then it should be a 3D model. My next goal is to have “real-time” data animating through the 3D stock terrain. I’m having trouble determining which part of the data makes sense for this and how, but hopefully with more thought I’ll get it.
I also tried experimenting with generating a 3D mesh straight from the data, essentially mapping x to time, y to normalized volume, and z to normalized close. Instead of a plane, it generates more of a line. I’m trying to map the mesh onto an image so that it looks as if it’s coming out of a plane, but haven’t quite finished that yet. The first image is from before I normalized the data by mapping the min and max values to 0 and 1000.
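The normalization step mentioned above (mapping each column’s min and max onto 0 and 1000) is simple enough to write down; the sample volumes here are made up:

```python
# Min-max normalization onto an arbitrary range, as used before building
# the mesh vertices (sample data is invented for illustration).

def normalize(values, lo=0.0, hi=1000.0):
    """Linearly map a column's min..max range onto lo..hi."""
    vmin, vmax = min(values), max(values)
    if vmax == vmin:
        return [lo for _ in values]           # a flat column maps to the low end
    return [lo + (v - vmin) / (vmax - vmin) * (hi - lo) for v in values]

volumes = [1_200_000, 3_400_000, 900_000, 2_100_000]
print(normalize(volumes))   # → [120.0, 1000.0, 0.0, 480.0]
```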
I’m going to continue pursuing the 3D landscape generation from the data for another week. Then I plan to assess whether this direction makes sense with my larger goals, and if so, how I can improve my visualization algorithms to better reflect qualities in the data sets.
Our first assignment in Research Studio Algorithms was to research and summarize relevant research pertaining to our topics. I had been reading a few papers about High Frequency Trading over break, but took this as an opportunity to delve more deeply into the topic. My sources are a combination of academic papers in finance, economics, math, physics, law, as well as news articles, and general books on the topic.
For a more in depth description on High Frequency Trading and the qualities I will be looking into, download my Literature Review here.