Algorithmic trading is a method of executing orders using automated pre-programmed trading instructions accounting for variables such as time, price, and volume. This type of trading attempts to leverage the speed and computational resources of computers relative to human traders. In the twenty-first century, algorithmic trading has been gaining traction with both retail and institutional traders. A study in 2019 showed that around 92% of trading in the Forex market was performed by trading algorithms rather than humans.
It is widely used by investment banks, pension funds, mutual funds, and hedge funds that may need to spread out the execution of a larger order or perform trades too fast for human traders to react to. However, it is also available to private traders using simple retail tools.
The term algorithmic trading is often used synonymously with automated trading system. These encompass a variety of trading strategies, some of which are based on formulas and results from mathematical finance, and often rely on specialized software.
Examples of strategies used in algorithmic trading include systematic trading, market making, inter-market spreading, arbitrage, or pure speculation, such as trend following. Many fall into the category of high-frequency trading (HFT), which is characterized by high turnover and high order-to-trade ratios. HFT strategies utilize computers that make elaborate decisions to initiate orders based on information that is received electronically, before human traders are capable of processing the information they observe. As a result, in February 2012, the Commodity Futures Trading Commission (CFTC) formed a special working group that included academics and industry experts to advise the CFTC on how best to define HFT. Algorithmic trading and HFT have resulted in a dramatic change of the market microstructure and in the complexity and uncertainty of the market macrodynamic, particularly in the way liquidity is provided.
History
= Early developments =
Computerization of the order flow in financial markets began in the early 1970s, when the New York Stock Exchange introduced the "designated order turnaround" system (DOT). SuperDOT was introduced in 1984 as an upgraded version of DOT. Both systems allowed for the routing of orders electronically to the proper trading post. The "opening automated reporting system" (OARS) aided the specialist in determining the market clearing opening price.
With the rise of fully electronic markets came the introduction of program trading, which is defined by the New York Stock Exchange as an order to buy or sell 15 or more stocks valued at over US$1 million total. In practice, program trades were pre-programmed to automatically enter or exit trades based on various factors. In the 1980s, program trading became widely used in trading between the S&P 500 equity and futures markets in a strategy known as index arbitrage.
At about the same time, portfolio insurance was designed to create a synthetic put option on a stock portfolio by dynamically trading stock index futures according to a computer model based on the Black–Scholes option pricing model.
Both strategies, often simply lumped together as "program trading", were blamed by many people (for example by the Brady report) for exacerbating or even starting the 1987 stock market crash. Yet the impact of computer-driven trading on stock market crashes remains unclear and is widely discussed in the academic community.
= Refinement and growth =
The financial landscape was changed again with the emergence of electronic communication networks (ECNs) in the 1990s, which allowed for trading of stocks and currencies outside of traditional exchanges. In the U.S., decimalization changed the minimum tick size from 1/16 of a dollar (US$0.0625) to US$0.01 per share in 2001. This may have encouraged algorithmic trading, as it changed the market microstructure by permitting smaller differences between the bid and offer prices, decreasing the market-makers' trading advantage and thus increasing market liquidity.
This increased market liquidity led to institutional traders splitting up orders according to computer algorithms so they could execute orders at a better average price. These benchmark average prices are computed as the time-weighted average price (TWAP) or, more usually, the volume-weighted average price (VWAP).
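As a rough illustration of the two benchmarks, here is a minimal Python sketch assuming a small set of hypothetical trade prices and sizes over the execution window:

```python
import numpy as np

# Hypothetical trade prices and sizes over the execution window.
prices = np.array([50.10, 50.12, 50.08, 50.11, 50.09])
volumes = np.array([200, 500, 300, 400, 100])

# TWAP: a simple average of prices sampled at regular time intervals.
twap = prices.mean()

# VWAP: each price weighted by the volume traded at that price.
vwap = (prices * volumes).sum() / volumes.sum()

print(f"TWAP: {twap:.4f}  VWAP: {vwap:.4f}")
```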
A further encouragement for the adoption of algorithmic trading in the financial markets came in 2001, when a team of IBM researchers published a paper at the International Joint Conference on Artificial Intelligence showing that, in experimental laboratory versions of the electronic auctions used in the financial markets, two algorithmic strategies (IBM's own MGD and Hewlett-Packard's ZIP) could consistently out-perform human traders. MGD was a modified version of the "GD" algorithm invented by Steven Gjerstad and John Dickhaut in 1996/7; the ZIP algorithm had been invented at HP by Dave Cliff in 1996. In their paper, the IBM team wrote that the financial impact of their results showing MGD and ZIP outperforming human traders "...might be measured in billions of dollars annually"; the IBM paper generated international media coverage.
In 2005, the Regulation National Market System was put in place by the SEC to strengthen the equity market. This changed the way firms traded with rules such as the Trade Through Rule, which mandates that market orders must be posted and executed electronically at the best available price, thus preventing brokerages from profiting from the price differences when matching buy and sell orders.
As more electronic markets opened, other algorithmic trading strategies were introduced. These strategies are more easily implemented by computers, as they can react rapidly to price changes and observe several markets simultaneously.
Many broker-dealers offered algorithmic trading strategies to their clients – differentiating them by behavior, options and branding. Examples include Chameleon (developed by BNP Paribas), Stealth (developed by the Deutsche Bank), Sniper and Guerilla (developed by Credit Suisse). These implementations adopted practices from the investing approaches of arbitrage, statistical arbitrage, trend following, and mean reversion.
In modern global financial markets, algorithmic trading plays a crucial role in achieving financial objectives. For nearly 30 years, traders, investment banks, investment funds, and other financial entities have utilized algorithms to refine and implement trading strategies. The use of algorithms in financial markets has grown substantially since the mid-1990s, although the exact contribution to daily trading volumes remains imprecise.
Technological advancements and algorithmic trading have facilitated increased transaction volumes, reduced costs, improved portfolio performance, and enhanced transparency in financial markets. According to an April 2019 report on foreign exchange activity, foreign exchange markets had a daily turnover of US$6.6 trillion, a significant increase from US$5.1 trillion in 2016.
= Case studies =
Profitability projections by the TABB Group, a financial services industry research firm, put 2014 pre-expense profits for the US equities HFT industry at US$1.3 billion, down significantly from the 2008 peak of US$21 billion earned by the roughly 300 securities firms and hedge funds then specializing in this type of trading; the authors had at the time called those profits "relatively small" and "surprisingly modest" compared to the market's overall trading volume. In March 2014, Virtu Financial, a high-frequency trading firm, reported that over five years the firm as a whole was profitable on 1,277 out of 1,278 trading days, losing money on just one day, illustrating the benefit of trading millions of times, across a diverse set of instruments, every trading day.
A third of all European Union and United States stock trades in 2006 were driven by automatic programs, or algorithms. As of 2009, studies suggested HFT firms accounted for 60–73% of all US equity trading volume, with that number falling to approximately 50% in 2012. In 2006, at the London Stock Exchange, over 40% of all orders were entered by algorithmic traders, with 60% predicted for 2007. American markets and European markets generally have a higher proportion of algorithmic trades than other markets, and estimates for 2008 range as high as an 80% proportion in some markets. Foreign exchange markets also have active algorithmic trading, measured at about 80% of orders in 2016 (up from about 25% of orders in 2006). Futures markets are considered fairly easy to integrate into algorithmic trading, with about 40% of options trading done via trading algorithms in 2016. Bond markets are moving toward more access to algorithmic traders.
Algorithmic trading and HFT have been the subject of much public debate since the U.S. Securities and Exchange Commission and the Commodity Futures Trading Commission said in reports that an algorithmic trade entered by a mutual fund company triggered a wave of selling that led to the 2010 Flash Crash. The same reports found HFT strategies may have contributed to subsequent volatility by rapidly pulling liquidity from the market. As a result of these events, the Dow Jones Industrial Average suffered its second largest intraday point swing ever to that date, though prices quickly recovered. (See List of largest daily changes in the Dow Jones Industrial Average.) A July 2011 report by the International Organization of Securities Commissions (IOSCO), an international body of securities regulators, concluded that while "algorithms and HFT technology have been used by market participants to manage their trading and risk, their usage was also clearly a contributing factor in the flash crash event of May 6, 2010." However, other researchers have reached a different conclusion. One 2010 study found that HFT did not significantly alter trading inventory during the Flash Crash. Separately, some algorithmic trading ahead of index fund rebalancing transfers profits from passive investors to active traders, as discussed below.
Strategies
= Trading ahead of index fund rebalancing =
Most retirement savings, such as private pension funds or 401(k) and individual retirement accounts in the US, are invested in mutual funds, the most popular of which are index funds, which must periodically "rebalance" or adjust their portfolio to match the new prices and market capitalization of the underlying securities in the stock or other index that they track. Profits are transferred from passive index investors to active investors, some of whom are algorithmic traders specifically exploiting the index rebalance effect. The magnitude of these losses incurred by passive investors has been estimated at 21–28 basis points per year for the S&P 500 and 38–77 basis points per year for the Russell 2000. John Montgomery of Bridgeway Capital Management says that the resulting "poor investor returns" from trading ahead of mutual funds is "the elephant in the room" that "shockingly, people are not talking about".
= Pairs trading =
Pairs trading or pair trading is a long-short, ideally market-neutral strategy enabling traders to profit from transient discrepancies in the relative value of close substitutes. Unlike in classic arbitrage, in pairs trading the law of one price cannot guarantee convergence of prices. This is especially true when the strategy is applied to individual stocks; these imperfect substitutes can in fact diverge indefinitely. In theory, the long-short nature of the strategy should make it work regardless of the stock market direction. In practice, execution risk, persistent and large divergences, as well as a decline in volatility can make this strategy unprofitable for long periods of time (e.g. 2004–2007). It belongs to wider categories of statistical arbitrage, convergence trading, and relative value strategies.
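A minimal sketch of the usual mechanics, assuming a unit hedge ratio and illustrative entry/exit thresholds on the z-score of the log-price spread (all parameter values are assumptions, not fixed rules):

```python
import numpy as np

def pairs_signal(prices_a, prices_b, entry_z=2.0, exit_z=0.5):
    """Signal from the z-score of the log-price spread between two substitutes.

    Assumes a hedge ratio of 1; entry/exit thresholds are illustrative.
    """
    spread = np.log(prices_a) - np.log(prices_b)
    z = (spread[-1] - spread.mean()) / spread.std()
    if z > entry_z:
        return "short A / long B"   # spread unusually wide: bet on convergence
    if z < -entry_z:
        return "long A / short B"
    if abs(z) < exit_z:
        return "close positions"    # spread has reverted: take profit
    return "hold"

# Synthetic example: two price series sharing a common driver.
rng = np.random.default_rng(42)
base = rng.normal(0, 0.01, 250).cumsum() + 4.0
a, b = np.exp(base + rng.normal(0, 0.005, 250)), np.exp(base)
print(pairs_signal(a, b))
```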
= Delta-neutral strategies =
In finance, delta-neutral describes a portfolio of related financial securities whose value remains unchanged under small changes in the value of the underlying security. Such a portfolio typically contains options and their corresponding underlying securities such that positive and negative delta components offset, resulting in the portfolio's value being relatively insensitive to changes in the value of the underlying security.
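As a sketch of how such an offset is constructed, the snippet below computes a Black–Scholes call delta and the share position that neutralizes it; the position size and all market inputs are illustrative assumptions:

```python
from math import log, sqrt
from statistics import NormalDist

def bs_call_delta(spot, strike, rate, vol, t):
    """Black-Scholes delta of a European call on a non-dividend-paying stock."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * t) / (vol * sqrt(t))
    return NormalDist().cdf(d1)

# Illustrative position: long 10 call contracts, 100 shares per contract.
contracts, multiplier = 10, 100
delta = bs_call_delta(spot=100.0, strike=105.0, rate=0.03, vol=0.25, t=0.5)

# Short this many shares so small moves in the underlying roughly cancel out.
hedge_shares = round(contracts * multiplier * delta)
print(f"call delta = {delta:.3f}; short {hedge_shares} shares to stay delta-neutral")
```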
= Arbitrage =
In economics and finance, arbitrage is the practice of taking advantage of a price difference between two or more markets: striking a combination of matching deals that capitalize upon the imbalance, the profit being the difference between the market prices. As used by academics, an arbitrage is a transaction that involves no negative cash flow at any probabilistic or temporal state and a positive cash flow in at least one state; in simple terms, it is the possibility of a risk-free profit at zero cost. Example: one of the most popular arbitrage trading opportunities is played with the S&P futures and the S&P 500 stocks. During most trading days these two develop pricing disparities, when the prices of the stocks, mostly traded on the NYSE and NASDAQ markets, get ahead of or behind the S&P futures, which are traded on the CME market.
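A minimal sketch of the fair-value comparison behind such index arbitrage, using the standard cost-of-carry model; the rates, prices, and transaction-cost figure are illustrative assumptions:

```python
from math import exp

def futures_fair_value(spot, rate, div_yield, t):
    """Cost-of-carry fair value of an index future (continuous compounding)."""
    return spot * exp((rate - div_yield) * t)

spot = 4500.0          # cash index level (illustrative)
fut = 4525.0           # observed futures price (illustrative)
fair = futures_fair_value(spot, rate=0.05, div_yield=0.015, t=0.25)

cost = 2.0             # assumed round-trip transaction cost, in index points
if fut - fair > cost:
    print("futures rich: sell futures, buy the stock basket")
elif fair - fut > cost:
    print("futures cheap: buy futures, sell the stock basket")
else:
    print("no arbitrage after costs")
```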
Conditions for arbitrage
Arbitrage is possible when one of three conditions is met:
The same asset does not trade at the same price on all markets (the "law of one price" is temporarily violated).
Two assets with identical cash flows do not trade at the same price.
An asset with a known price in the future does not today trade at its future price discounted at the risk-free interest rate (or, the asset does not have negligible costs of storage; as such, for example, this condition holds for grain but not for securities).
Arbitrage is not simply the act of buying a product in one market and selling it in another for a higher price at some later time. The long and short transactions should ideally occur simultaneously to minimize the exposure to market risk, or the risk that prices may change on one market before both transactions are complete. In practical terms, this is generally only possible with securities and financial products which can be traded electronically, and even then, when the first leg(s) of the trade is executed, the prices in the other legs may have worsened, locking in a guaranteed loss. Missing one of the legs of the trade (and subsequently having to open it at a worse price) is called 'execution risk' or, more specifically, 'leg-in and leg-out risk'.
In the simplest example, any good sold in one market should sell for the same price in another. Traders may, for example, find that the price of wheat is lower in agricultural regions than in cities, purchase the good, and transport it to another region to sell at a higher price. This type of price arbitrage is the most common, but this simple example ignores the cost of transport, storage, risk, and other factors.
"True" arbitrage requires that there be no market risk involved. Where securities are traded on more than one exchange, arbitrage occurs by simultaneously buying in one and selling on the other. Such simultaneous execution, if perfect substitutes are involved, minimizes capital requirements, but in practice never creates a "self-financing" (free) position, as many sources incorrectly assume following the theory. As long as there is some difference in the market value and riskiness of the two legs, capital would have to be put up in order to carry the long-short arbitrage position.
= Mean reversion =
Mean reversion is a mathematical methodology sometimes used for stock investing, but it can be applied to other processes. In general terms the idea is that both a stock's high and low prices are temporary, and that a stock's price tends toward an average price over time. An example of a mean-reverting process is the Ornstein–Uhlenbeck process.
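For reference, the Ornstein–Uhlenbeck process takes the standard form

$$dx_t = \theta\,(\mu - x_t)\,dt + \sigma\,dW_t,$$

where $\mu$ is the long-run average toward which the level reverts, $\theta$ is the speed of reversion, $\sigma$ is the volatility, and $W_t$ is a Wiener process.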
Mean reversion involves first identifying the trading range for a stock, and then computing the average price using analytical techniques relating to assets, earnings, and so on.
When the current market price is less than the average price, the stock is considered attractive for purchase, with the expectation that the price will rise. When the current market price is above the average price, the market price is expected to fall. In other words, deviations from the average price are expected to revert to the average.
The standard deviation of the most recent prices (e.g., the last 20) is often used as a buy or sell indicator.
Stock reporting services (such as Yahoo! Finance, MS Investor, Morningstar, etc.) commonly offer moving averages for periods such as 50 and 100 days. While reporting services provide the averages, identifying the high and low prices for the study period is still necessary.
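A minimal sketch of the indicator described above, assuming a 20-price window and a two-standard-deviation band (both illustrative choices, not fixed rules):

```python
import numpy as np

def mean_reversion_signal(prices, window=20, band=2.0):
    """Compare the last price to a moving average plus/minus `band`
    standard deviations of the most recent `window` prices.
    """
    recent = np.asarray(prices[-window:], dtype=float)
    avg, sd = recent.mean(), recent.std()
    last = recent[-1]
    if last < avg - band * sd:
        return "buy"    # price well below average: expect reversion upward
    if last > avg + band * sd:
        return "sell"   # price well above average: expect reversion downward
    return "hold"

# Synthetic example: a flat series with a sharp drop at the end.
prices = [100.0] * 19 + [95.0]
print(mean_reversion_signal(prices))  # -> "buy"
```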
= Scalping =
Scalping is liquidity provision by non-traditional market makers, whereby traders attempt to earn (or make) the bid-ask spread. This procedure allows for profit as long as price moves are less than this spread; it normally involves establishing and liquidating a position quickly, usually within minutes or less.
A market maker is essentially a specialized scalper, also referred to as a dealer. The volume a market maker trades is many times larger than that of the average individual scalper, and market makers make use of more sophisticated trading systems and technology. However, registered market makers are bound by exchange rules stipulating their minimum quote obligations. For instance, NASDAQ requires each market maker to post at least one bid and one ask at some price level, so as to maintain a two-sided market for each stock represented.
= Transaction cost reduction =
Most strategies referred to as algorithmic trading (as well as algorithmic liquidity-seeking) fall into the cost-reduction category. The basic idea is to break down a large order into small orders and place them in the market over time. The choice of algorithm depends on various factors, the most important being the volatility and liquidity of the stock. For example, for a highly liquid stock, matching a certain percentage of the overall orders of stock (called volume inline algorithms) is usually a good strategy, but for a highly illiquid stock, algorithms try to match every order that has a favorable price (called liquidity-seeking algorithms).
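A minimal sketch of a volume inline (percentage-of-volume) slicer; the participation rate and minimum clip size are illustrative assumptions:

```python
def pov_child_order(remaining_qty, observed_market_volume, participation=0.10,
                    min_clip=100):
    """Size the next child order as a fixed fraction of observed market volume.

    All parameters are illustrative; real schedulers also consider price,
    urgency, and venue constraints.
    """
    qty = int(observed_market_volume * participation)
    qty = max(qty, min_clip)           # avoid dust-sized orders
    return min(qty, remaining_qty)     # never exceed what is left to trade

# e.g. 50,000 shares left to execute, 20,000 shares just traded in the market:
print(pov_child_order(50_000, 20_000))  # -> 2000
```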
The success of these strategies is usually measured by comparing the average price at which the entire order was executed with the average price achieved through a benchmark execution for the same duration. Usually, the volume-weighted average price is used as the benchmark. At times, the execution price is also compared with the price of the instrument at the time of placing the order.
A special class of these algorithms attempts to detect algorithmic or iceberg orders on the other side (i.e. if you are trying to buy, the algorithm will try to detect orders for the sell side). These algorithms are called sniffing algorithms. A typical example is "Stealth".
Some examples of algorithms are VWAP, TWAP, Implementation shortfall, POV, Display size, Liquidity seeker, and Stealth. Modern algorithms are often optimally constructed via either static or dynamic programming.
= Strategies that only pertain to dark pools =
As of 2009, HFT, which comprises a broad set of buy-side as well as market-making sell-side traders, has become more prominent and controversial. These algorithms or techniques are commonly given names such as "Stealth" (developed by Deutsche Bank), "Iceberg", "Dagger", "Monkey", "Guerrilla", "Sniper", "BASOR" (developed by Quod Financial), and "Sniffer". Dark pools are alternative trading systems that are private in nature, and thus do not interact with public order flow, and seek instead to provide undisplayed liquidity to large blocks of securities. In dark pools, trading takes place anonymously, with most orders hidden or "iceberged". Gamers or "sharks" sniff out large orders by "pinging" small market orders to buy and sell. When several small orders are filled the sharks may have discovered the presence of a large iceberged order.
"Now it's an arms race," said Andrew Lo, director of the Massachusetts Institute of Technology's Laboratory for Financial Engineering in 2006. "Everyone is building more sophisticated algorithms, and the more competition exists, the smaller the profits."
= Market timing =
Strategies designed to generate alpha are considered market timing strategies. These types of strategies are designed using a methodology that includes backtesting, forward testing and live testing. Market timing algorithms will typically use technical indicators such as moving averages but can also include pattern recognition logic implemented using finite-state machines.
Backtesting the algorithm is typically the first stage and involves simulating the hypothetical trades through an in-sample data period. Optimization is performed to determine the best inputs. Steps taken to reduce the chance of over-optimization can include modifying the inputs +/- 10%, shmooing the inputs in large steps, running Monte Carlo simulations, and ensuring slippage and commission are accounted for.
Forward testing the algorithm is the next stage and involves running the algorithm through an out-of-sample data set to ensure the algorithm performs within backtested expectations.
Live testing is the final stage of development and requires the developer to compare actual live trades with both the backtested and forward tested models. Metrics compared include percent profitable, profit factor, maximum drawdown and average gain per trade.
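A toy illustration of the in-sample/out-of-sample split described above, using a moving-average crossover on synthetic prices; the window lengths, split ratio, and price series are all assumptions, and costs and slippage are ignored:

```python
import numpy as np

def backtest_ma_cross(prices, fast=10, slow=50):
    """Toy backtest of a moving-average crossover; ignores costs and slippage."""
    prices = np.asarray(prices, dtype=float)
    fast_ma = np.convolve(prices, np.ones(fast) / fast, mode="valid")
    slow_ma = np.convolve(prices, np.ones(slow) / slow, mode="valid")
    n = min(len(fast_ma), len(slow_ma))
    # Long (+1) while the fast average is above the slow one, flat otherwise;
    # the position at bar i earns the return from bar i to bar i+1.
    pos = (fast_ma[-n:] > slow_ma[-n:]).astype(float)[:-1]
    rets = np.diff(prices[-n:]) / prices[-n:-1]
    return float((pos * rets).sum())

prices = np.random.default_rng(0).normal(0.02, 0.5, 500).cumsum() + 100.0
split = int(len(prices) * 0.7)          # 70% in-sample, 30% out-of-sample
print("in-sample return:     ", backtest_ma_cross(prices[:split]))
print("out-of-sample return: ", backtest_ma_cross(prices[split:]))
```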
High-frequency trading
As noted above, high-frequency trading (HFT) is a form of algorithmic trading characterized by high turnover and high order-to-trade ratios. Although there is no single definition of HFT, among its key attributes are highly sophisticated algorithms, specialized order types, co-location, very short-term investment horizons, and high cancellation rates for orders.
In the U.S., high-frequency trading (HFT) firms represent 2% of the approximately 20,000 firms operating today, but account for 73% of all equity trading volume. As of the first quarter of 2009, total assets under management for hedge funds with HFT strategies were US$141 billion, down about 21% from their high. The HFT strategy was first made successful by Renaissance Technologies.
High-frequency funds started to become especially popular in 2007 and 2008. Many HFT firms are market makers and provide liquidity to the market, which has lowered volatility and helped narrow bid–offer spreads making trading and investing cheaper for other market participants. HFT has been a subject of intense public focus since the U.S. Securities and Exchange Commission and the Commodity Futures Trading Commission stated that both algorithmic trading and HFT contributed to volatility in the 2010 Flash Crash. Among the major U.S. high frequency trading firms are Chicago Trading Company, Optiver, Virtu Financial, DRW, Jump Trading, Two Sigma Securities, GTS, IMC Financial, and Citadel LLC.
There are four key categories of HFT strategies: market-making based on order flow, market-making based on tick data information, event arbitrage and statistical arbitrage. All portfolio-allocation decisions are made by computerized quantitative models. The success of computerized strategies is largely driven by their ability to simultaneously process volumes of information, something ordinary human traders cannot do.
= Market making =
Market making involves placing a limit order to sell (or offer) above the current market price or a buy limit order (or bid) below the current price on a regular and continuous basis to capture the bid-ask spread. Automated Trading Desk, which was bought by Citigroup in July 2007, has been an active market maker, accounting for about 6% of total volume on both NASDAQ and the New York Stock Exchange.
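A minimal sketch of symmetric quoting around the mid-price; the half-spread and tick size are illustrative, and real market makers also skew quotes for inventory and adverse-selection risk:

```python
def make_quotes(mid, half_spread=0.05, tick=0.01):
    """Place a bid below and an offer above the mid-price, rounded to ticks."""
    bid = round(round((mid - half_spread) / tick) * tick, 2)
    ask = round(round((mid + half_spread) / tick) * tick, 2)
    return bid, ask

bid, ask = make_quotes(mid=50.00)
print(f"quote {bid:.2f} x {ask:.2f}")  # earn the spread if both sides fill
```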
= Statistical arbitrage =
Another set of HFT strategies, in the classical arbitrage vein, might involve several securities, such as covered interest rate parity in the foreign exchange market, which gives a relation between the prices of a domestic bond, a bond denominated in a foreign currency, the spot price of the currency, and the price of a forward contract on the currency. If the market prices differ enough from those implied in the model to cover transaction costs, then four transactions can be made to guarantee a risk-free profit. HFT allows similar arbitrages using models of greater complexity involving many more than four securities. The TABB Group estimates that annual aggregate profits of low-latency arbitrage strategies currently exceed US$21 billion.
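A minimal sketch of the parity check, using simple annualized interest rates; the spot, forward, and rate quotes are illustrative assumptions:

```python
def cip_forward(spot, r_domestic, r_foreign, t):
    """Covered-interest-parity fair forward rate (domestic per foreign unit),
    using simple annualized rates over horizon t in years.
    """
    return spot * (1 + r_domestic * t) / (1 + r_foreign * t)

spot = 1.1000            # e.g. domestic currency per foreign unit (illustrative)
fwd_market = 1.1080      # quoted 1-year forward (illustrative)
fwd_fair = cip_forward(spot, r_domestic=0.05, r_foreign=0.03, t=1.0)

# If the quoted forward differs from fair value by more than transaction
# costs, the four legs (borrow, convert spot, invest, sell forward) lock in
# a risk-free profit.
print(f"fair forward {fwd_fair:.4f}, market {fwd_market:.4f}")
```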
A wide range of statistical arbitrage strategies have been developed whereby trading decisions are made on the basis of deviations from statistically significant relationships. Like market-making strategies, statistical arbitrage can be applied in all asset classes.
= Event arbitrage =
Event arbitrage is a subset of risk, merger, convertible, or distressed securities arbitrage that counts on a specific event, such as a contract signing, regulatory approval, or judicial decision, to change the price or rate relationship of two or more financial instruments and permit the arbitrageur to earn a profit.
Merger arbitrage, also called risk arbitrage, is an example of this. Merger arbitrage generally consists of buying the stock of a company that is the target of a takeover while shorting the stock of the acquiring company. Usually the market price of the target company is less than the price offered by the acquiring company. The spread between these two prices depends mainly on the probability and the timing of the takeover being completed, as well as the prevailing level of interest rates. The bet in a merger arbitrage is that such a spread will eventually be zero, if and when the takeover is completed. The risk is that the deal "breaks" and the spread massively widens.
= Spoofing =
One strategy that some traders have employed, which has been proscribed yet likely continues, is called spoofing. It is the act of placing orders to give the impression of wanting to buy or sell shares, without ever having the intention of letting the order execute, in order to temporarily manipulate the market and buy or sell shares at a more favorable price. This is done by creating limit orders outside the current bid or ask price to change the reported price to other market participants. The trader can subsequently place trades based on the artificial change in price, then cancel the limit orders before they are executed.
Suppose a trader desires to sell shares of a company with a current bid of $20 and a current ask of $20.20. The trader would place a buy order at $20.10, still some distance from the ask so it will not be executed, and the $20.10 bid is reported as the National Best Bid and Offer best bid price. The trader then executes a market order for the sale of the shares they wished to sell. Because the best bid price is the trader's own artificial bid, a market maker fills the sale order at $20.10, allowing for a $0.10 higher sale price per share. The trader subsequently cancels the limit order they never intended to complete.
= Quote stuffing =
Quote stuffing is a tactic employed by malicious traders that involves quickly entering and withdrawing large quantities of orders in an attempt to flood the market, thereby gaining an advantage over slower market participants. The rapidly placed and canceled orders cause the market data feeds that ordinary investors rely on to delay price quotes while the stuffing is occurring. HFT firms benefit from proprietary, higher-capacity feeds and the most capable, lowest-latency infrastructure. Researchers have shown that high-frequency traders are able to profit from the artificially induced latencies and arbitrage opportunities that result from quote stuffing.
Low latency trading systems
Network-induced latency, a synonym for delay, measured in one-way delay or round-trip time, is normally defined as how much time it takes for a data packet to travel from one point to another. Low latency trading refers to the algorithmic trading systems and network routes used by financial institutions connecting to stock exchanges and electronic communication networks (ECNs) to rapidly execute financial transactions. Most HFT firms depend on low latency execution of their trading strategies. Joel Hasbrouck and Gideon Saar (2013) measure latency based on three components: the time it takes for (1) information to reach the trader, (2) the trader's algorithms to analyze the information, and (3) the generated action to reach the exchange and get implemented. In a contemporary electronic market (circa 2009), low latency trade processing time was qualified as under 10 milliseconds, and ultra-low latency as under 1 millisecond.
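A back-of-the-envelope budget over those three components makes the thresholds concrete; all figures below are assumptions, in microseconds:

```python
# Illustrative one-way latency budget over the three Hasbrouck-Saar components.
to_trader = 250      # (1) market data reaches the trading system
decision = 50        # (2) algorithms analyze the information and decide
to_exchange = 300    # (3) the order reaches the exchange and is implemented

total_us = to_trader + decision + to_exchange
print(f"one-way budget: {total_us} us ({total_us / 1000:.2f} ms)")
# Under 1 ms would qualify as ultra-low latency by the circa-2009 yardstick.
```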
Low-latency traders depend on ultra-low latency networks. They profit by providing information, such as competing bids and offers, to their algorithms microseconds faster than their competitors. The revolutionary advance in speed has led to the need for firms to have a real-time, colocated trading platform to benefit from implementing high-frequency strategies. Strategies are constantly altered to reflect the subtle changes in the market as well as to combat the threat of the strategy being reverse engineered by competitors. This is due to the evolutionary nature of algorithmic trading strategies – they must be able to adapt and trade intelligently, regardless of market conditions, which involves being flexible enough to withstand a vast array of market scenarios. As a result, a significant proportion of net revenue from firms is spent on the R&D of these autonomous trading systems.
Strategy implementation
Most of the algorithmic strategies are implemented using modern programming languages, although some still implement strategies designed in spreadsheets. Increasingly, the algorithms used by large brokerages and asset managers are written to the FIX Protocol's Algorithmic Trading Definition Language (FIXatdl), which allows firms receiving orders to specify exactly how their electronic orders should be expressed. Orders built using FIXatdl can then be transmitted from traders' systems via the FIX Protocol. Basic models can rely on as little as a linear regression, while more complex game-theoretic and pattern recognition or predictive models can also be used to initiate trading. More complex methods such as Markov chain Monte Carlo have been used to create these models.
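As a sketch of the simplest case mentioned above, the snippet below fits an ordinary-least-squares model of next-period return on two synthetic features and turns the forecast into a trade direction; the feature names, data, and decision rule are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))                  # assumed features, e.g. momentum, spread
y = 0.3 * X[:, 0] - 0.1 * X[:, 1] + rng.normal(0, 0.5, 500)  # synthetic target

X1 = np.column_stack([np.ones(len(X)), X])     # add an intercept column
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)  # ordinary least squares fit

latest = np.array([1.0, 0.8, -0.2])            # intercept term plus latest features
forecast = latest @ beta
print("buy" if forecast > 0 else "sell", f"(forecast={forecast:+.3f})")
```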
Issues and developments
Algorithmic trading has been shown to substantially improve market liquidity among other benefits. However, improvements in productivity brought by algorithmic trading have been opposed by human brokers and traders facing stiff competition from computers.
= Cyborg finance =
Technological advances in finance, particularly those relating to algorithmic trading, have increased financial speed, connectivity, reach, and complexity while simultaneously reducing its humanity. Computers running software based on complex algorithms have replaced humans in many functions in the financial industry. Finance is essentially becoming an industry where machines and humans share the dominant roles, transforming modern finance into what one scholar has called "cyborg finance".
= Concerns =
While many experts laud the benefits of innovation in computerized algorithmic trading, other analysts have expressed concern with specific aspects of computerized trading.
"The downside with these systems is their black box-ness," Mr. Williams said. "Traders have intuitive senses of how the world works. But with these systems you pour in a bunch of numbers, and something comes out the other end, and it's not always intuitive or clear why the black box latched onto certain data or relationships."
"The Financial Services Authority has been keeping a watchful eye on the development of black box trading. In its annual report the regulator remarked on the great benefits of efficiency that new technology is bringing to the market. But it also pointed out that 'greater reliance on sophisticated technology and modelling brings with it a greater risk that systems failure can result in business interruption'."
UK Treasury minister Lord Myners has warned that companies could become the "playthings" of speculators because of automatic high-frequency trading. Lord Myners said the process risked destroying the relationship between an investor and a company.
Other issues include the technical problem of latency (the delay in getting quotes to traders), security, and the possibility of a complete system breakdown leading to a market crash.
"Goldman spends tens of millions of dollars on this stuff. They have more people working in their technology area than people on the trading desk...The nature of the markets has changed dramatically."
On August 1, 2012 Knight Capital Group experienced a technology issue in their automated trading system, causing a loss of $440 million.
This issue was related to Knight's installation of trading software and resulted in Knight sending numerous erroneous orders in NYSE-listed securities into the market. This software has been removed from the company's systems. ... Clients were not negatively affected by the erroneous orders, and the software issue was limited to the routing of certain listed stocks to NYSE. Knight has traded out of its entire erroneous trade position, which has resulted in a realized pre-tax loss of approximately $440 million.
Algorithmic and high-frequency trading were shown to have contributed to volatility during the May 6, 2010 Flash Crash, when the Dow Jones Industrial Average plunged about 600 points only to recover those losses within minutes. At the time, it was the second largest point swing, 1,010.14 points, and the biggest one-day point decline, 998.5 points, on an intraday basis in Dow Jones Industrial Average history.
= Recent developments =
Financial market news is now being formatted by firms such as Need To Know News, Thomson Reuters, Dow Jones, and Bloomberg, to be read and traded on via algorithms.
"Computers are now being used to generate news stories about company earnings results or economic statistics as they are released. And this almost instantaneous information forms a direct feed into other computers which trade on the news."
The algorithms do not simply trade on simple news stories but also interpret more difficult-to-understand news. Some firms are also attempting to automatically assign sentiment (deciding if the news is good or bad) to news stories so that automated trading can work directly on the news story.
"Increasingly, people are looking at all forms of news and building their own indicators around it in a semi-structured way," as they constantly seek out new trading advantages said Rob Passarella, global director of strategy at Dow Jones Enterprise Media Group. His firm provides both a low latency news feed and news analytics for traders. Passarella also pointed to new academic research being conducted on the degree to which frequent Google searches on various stocks can serve as trading indicators, the potential impact of various phrases and words that may appear in Securities and Exchange Commission statements and the latest wave of online communities devoted to stock trading topics.
"Markets are by their very nature conversations, having grown out of coffee houses and taverns," he said. So the way conversations get created in a digital society will be used to convert news into trades, as well, Passarella said.
"There is a real interest in moving the process of interpreting news from the humans to the machines" says Kirsti Suutari, global business manager of algorithmic trading at Reuters. "More of our customers are finding ways to use news content to make money."
An example of the importance of news reporting speed to algorithmic traders was an advertising campaign by Dow Jones (appearances included page W15 of The Wall Street Journal, on March 1, 2008) claiming that their service had beaten other news services by two seconds in reporting an interest rate cut by the Bank of England.
In July 2007, Citigroup, which had already developed its own trading algorithms, paid $680 million for Automated Trading Desk, a 19-year-old firm that trades about 200 million shares a day. Citigroup had previously bought Lava Trading and OnTrade Inc.
In late 2010, the UK Government Office for Science initiated a Foresight project investigating the future of computer trading in the financial markets, led by Dame Clara Furse, ex-CEO of the London Stock Exchange. In September 2011 the project published its initial findings as a three-chapter working paper available in three languages, along with 16 additional papers providing supporting evidence. All of these findings were authored or co-authored by leading academics and practitioners and were subjected to anonymous peer review. Released in 2012, the Foresight study acknowledged issues related to periodic illiquidity, new forms of manipulation, and potential threats to market stability due to errant algorithms or excessive message traffic. However, the report was also criticized for adopting "standard pro-HFT arguments" and for advisory panel members being linked to the HFT industry.
System architecture
A traditional trading system consists primarily of two blocks: one that receives the market data and one that sends the order request to the exchange. However, an algorithmic trading system can be broken down into three parts:
Exchange
The server
Application
Exchange(s) provide data to the system, which typically consists of the latest order book, traded volumes, and the last traded price (LTP) of the scrip. The server receives the data and simultaneously acts as a store for the historical database. The data is analyzed on the application side, where trading strategies are fed in by the user and can be viewed on the GUI. Once an order is generated, it is sent to the order management system (OMS), which in turn transmits it to the exchange.
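A minimal sketch of that three-part flow; all class names, fields, and the toy strategy are illustrative, not any actual system's API:

```python
# Exchange data -> server/store -> application -> OMS (illustrative flow).
class Server:
    def __init__(self):
        self.history = []          # acts as the historical data store

    def on_market_data(self, tick):
        self.history.append(tick)  # persist, then hand to the application
        return tick

class Application:
    def on_tick(self, tick, strategy):
        order = strategy(tick)     # user-supplied trading strategy
        if order:
            self.send_to_oms(order)

    def send_to_oms(self, order):
        print("OMS -> exchange:", order)  # the OMS would route this out

# A trivial strategy: buy when the last traded price drops below a level.
def strategy(tick):
    return {"side": "BUY", "qty": 100} if tick["ltp"] < 99.0 else None

server, app = Server(), Application()
for tick in [{"ltp": 100.2}, {"ltp": 98.7}]:
    app.on_tick(server.on_market_data(tick), strategy)
```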
Gradually, the old-school, high-latency architecture of algorithmic systems is being replaced by newer, state-of-the-art, low-latency networks. The complex event processing (CEP) engine, the heart of decision-making in algo-based trading systems, is used for order routing and risk management.
With the emergence of the FIX (Financial Information Exchange) protocol, connecting to different destinations has become easier and the time to market when connecting with a new destination has been reduced. With the standard protocol in place, integrating third-party vendors for data feeds is no longer cumbersome.
Effects
One of the more ironic findings of academic research on algorithmic trading might be that individual traders introduce algorithms to make communication simpler and more predictable, while markets end up more complex and more uncertain. Since trading algorithms follow local rules that either respond to programmed instructions or learned patterns, on the micro-level their automated and reactive behavior makes certain parts of the communication dynamic more predictable. However, on the macro-level, it has been shown that the overall emergent process becomes both more complex and less predictable. This phenomenon is not unique to the stock market, and has also been detected with editing bots on Wikipedia.
Though its development may have been prompted by decreasing trade sizes caused by decimalization, algorithmic trading has reduced trade sizes further. Jobs once done by human traders are being switched to computers. The speeds of computer connections, measured in milliseconds and even microseconds, have become very important.
More fully automated markets such as NASDAQ, Direct Edge and BATS (formerly an acronym for Better Alternative Trading System) in the US, have gained market share from less automated markets such as the NYSE. Economies of scale in electronic trading have contributed to lowering commissions and trade processing fees, and contributed to international mergers and consolidation of financial exchanges.
Competition is developing among exchanges for the fastest processing times for completing trades. For example, in June 2007, the London Stock Exchange launched a new system called TradElect that promises an average 10 millisecond turnaround time from placing an order to final confirmation and can process 3,000 orders per second. Since then, competitive exchanges have continued to reduce latency with turnaround times of 3 milliseconds available. This is of great importance to high-frequency traders, because they have to attempt to pinpoint the consistent and probable performance ranges of given financial instruments. These professionals are often dealing in versions of stock index funds like the E-mini S&Ps, because they seek consistency and risk-mitigation along with top performance. They must filter market data to work into their software programming so that there is the lowest latency and highest liquidity at the time for placing stop-losses and/or taking profits. With high volatility in these markets, this becomes a complex and potentially nerve-wracking endeavor, where a small mistake can lead to a large loss. Absolute frequency data play into the development of the trader's pre-programmed instructions.
In the U.S., spending on computers and software in the financial industry increased to $26.4 billion in 2005.
Algorithmic trading has caused a shift in the types of employees working in the financial industry. For example, many physicists have entered the financial industry as quantitative analysts. Some physicists have even begun to do research in economics as part of doctoral research. This interdisciplinary movement is sometimes called econophysics. Some researchers also cite a "cultural divide" between employees of firms primarily engaged in algorithmic trading and traditional investment managers. Algorithmic trading has encouraged an increased focus on data and a decreased emphasis on sell-side research.
Communication standards
Algorithmic trades require communicating considerably more parameters than traditional market and limit orders. A trader on one end (the "buy side") must enable their trading system (often called an "order management system" or "execution management system") to understand a constantly proliferating flow of new algorithmic order types. The R&D and other costs to construct complex new algorithmic order types, along with the execution infrastructure and marketing costs to distribute them, are fairly substantial. What was needed was a way for marketers (the "sell side") to express algo orders electronically such that buy-side traders could simply drop the new order types into their system and be ready to trade them without constantly coding custom new order-entry screens each time.
FIX Protocol is a trade association that publishes free, open standards in the securities trading area. The FIX language was originally created by Fidelity Investments, and the association's members include virtually all large and many midsized and smaller broker-dealers, money center banks, institutional investors, mutual funds, and others. This institution dominates standard setting in the pre-trade and trade areas of security transactions. In 2006–2007, several members got together and published a draft XML standard for expressing algorithmic order types. The standard is called FIX Algorithmic Trading Definition Language (FIXatdl).
See also
2010 Flash Crash
Algorithmic tacit collusion
Alpha generation platform
Alternative trading system
Artificial intelligence
Best execution
Complex event processing
Electronic trading platform
Mirror trading
Quantitative investing
Technical analysis