An algorithm is a predetermined sequence of steps followed to produce a desired end state. The rules that make up an algorithm are defined to make certain decisions on behalf of its creator, allowing it to be executed without human interaction once defined. In the world of trading, where rules are applied and decisions are made on multiple assets daily, the introduction of algorithms designed to trade on those rules was inevitable.
This is a re-blog post originally posted by Laithan Morisco-Tarr and published with kind permission.
Computer algorithms created to make trading decisions that mimic the desires of the programmer can search the markets and make rule-based decisions quickly and efficiently. Trading algorithms have been around since the 1990s for listed derivatives, before being introduced to equities following a number of market events, including the decimalisation of quote prices, the sub-penny rule and the growth of the FIX protocol.
The recent growth in electronic trading has seen a shift in the priorities of the typical investor. It is no longer necessary for a trader to analyse the fundamentals of a stock in order to hold it for a long period. Instead, the ability to snipe out favourably priced stocks relative to selected factors using algorithms has reduced the need for investors to hold their stocks for the long term. The first algorithmic traders were able to make the most of the market while the practice was small; however, the growth in algorithmic trading has seen financial firms competing for the best systems, improving hardware to reduce latency (executing at the best prices before everyone else). Hedge funds are typically the most effective at this, due to their heavy expenditure on technology, though investment banks also fare well. The trading systems used rarely reveal the methodologies that define them, instead carrying names such as "Stealth" or "Iceberg" to hide a firm's trading strategies from its rivals.
So you want to create your own algorithm?
There are several key parameters underpinning a typical algorithm that you will need to assess in order to make these trading decisions:
Price
This is the trigger for the algorithm to either buy or sell the particular asset. The algorithm will "scour" the market for the best possible price, and execute if the defined price is met.
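The price trigger described above can be sketched in a few lines. This is a minimal illustration, not a real trading system; the function names, quotes and limit price are all hypothetical:

```python
# Minimal sketch of a limit-price trigger. The algorithm watches a stream
# of quotes and only executes when the defined price is met.

def should_buy(quote: float, limit_price: float) -> bool:
    """Buy trigger: execute when the market offers at or below our limit."""
    return quote <= limit_price

def should_sell(quote: float, limit_price: float) -> bool:
    """Sell trigger: execute when the market bids at or above our limit."""
    return quote >= limit_price

# "Scour" a stream of quotes for the first executable price
quotes = [101.2, 100.7, 99.9, 100.1]
buy_limit = 100.0
first_fill = next((q for q in quotes if should_buy(q, buy_limit)), None)
```

A real system would of course work on live market data rather than a fixed list, but the decision rule itself is this simple comparison.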
Transaction Costs
Before executing the trade at the price found in the market, it is necessary to identify the transaction cost of carrying it out. A rule of thumb is that fewer, larger trades are better than frequent small trades (in order to minimise the total cost). The costs of a transaction include not only taxes, commissions and fees, but also latency costs such as market impact (a large buy order would increase the market price) and opportunity costs.
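The rule of thumb above can be made concrete with a toy cost model. This is a hedged sketch under the simplifying assumption of a fixed fee per trade plus a proportional commission; the rates are illustrative, not real exchange fees:

```python
# Toy transaction-cost model: a fixed fee per trade plus a commission
# proportional to the notional value. Rates here are purely illustrative.

def total_cost(order_size, n_trades, fee_per_trade=1.50,
               commission_rate=0.001, price=100.0):
    """Cost of splitting `order_size` shares across `n_trades` trades."""
    notional = order_size * price
    fees = fee_per_trade * n_trades          # per-trade fees scale with trade count
    commission = commission_rate * notional  # commission scales with notional only
    return fees + commission

# Fewer, larger trades beat frequent small trades on total cost,
# because the per-trade fee is paid once per child order.
few_large = total_cost(10_000, 2)
many_small = total_cost(10_000, 50)
```

Under this model `few_large` is cheaper than `many_small`; in practice the trade-off is complicated by market impact, which pushes the other way for very large child orders.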
Size and Liquidity
If an algorithm is designed to purchase as many assets as possible at a certain price, it must do so without disturbing the market (as explained above). An algorithm can be designed to measure the liquidity of the market to assess whether purchasing the stock in the quantity it desires is feasible. For a large buy order, it is sometimes necessary to purchase in heterogeneous "waves" to prevent rival trading systems from identifying the algo's strategy.
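Slicing a parent order into heterogeneous waves can be sketched as below. This is an illustration only: the wave sizes are randomised around an equal split so the pattern is harder for rival systems to detect, and all names and parameters are hypothetical:

```python
import random

# Sketch of slicing a large parent order into heterogeneous "waves".
# Sizes are jittered around an equal split to mask the trading pattern.

def slice_order(parent_size: int, n_waves: int, jitter: float = 0.3, seed=42):
    rng = random.Random(seed)  # fixed seed for a reproducible illustration
    weights = [1 + rng.uniform(-jitter, jitter) for _ in range(n_waves)]
    scale = parent_size / sum(weights)
    waves = [round(w * scale) for w in weights]
    waves[-1] += parent_size - sum(waves)  # absorb rounding error in last wave
    return waves

waves = slice_order(50_000, 8)
```

The waves always sum to the parent order exactly, but no two runs with different seeds (or different jitter) look alike.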
Timing of Execution
Apart from executing a trade quickly (in order to give the algorithm the best chance of obtaining the asset at the desired price), the timing of typical orders into the market needs to be measured too. Large buy side orders are typically conducted early in the trading day, as well as late in the day. A Time Weighted Average Price (TWAP) for an execution needs to be established for each particular time period.
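A TWAP for a period is simply the average of prices sampled at equal time intervals. A minimal sketch, with hypothetical hourly samples:

```python
# Time Weighted Average Price: the simple average of prices sampled at
# equally spaced time intervals over the chosen period.

def twap(prices):
    """TWAP over equally spaced price samples."""
    return sum(prices) / len(prices)

# e.g. hourly samples across one (hypothetical) trading morning
morning = [100.0, 100.4, 100.2, 100.6]
benchmark = twap(morning)
```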
Fill Ratio
This ratio outlines the likelihood that an order of a specific size will be fulfilled. Some assets are heavily traded, and the likelihood of fulfilling a large order is high, whereas others may be relatively illiquid, so satisfying the trade will take a little more time (or the order may not be filled at all).
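One crude way to estimate this, sketched below under the assumption that only displayed liquidity counts, is the fraction of the order the visible book could satisfy. The order-book figures are illustrative:

```python
# Rough fill-ratio estimate: what fraction of an order could the visible
# order book satisfy? Book depth figures here are illustrative.

def fill_ratio(order_size: int, book_depth: list[int]) -> float:
    """Fraction of the order coverable by displayed liquidity (capped at 1)."""
    available = sum(book_depth)
    return min(1.0, available / order_size)

liquid = fill_ratio(1_000, [800, 700, 500])   # heavily traded asset
illiquid = fill_ratio(1_000, [50, 30])        # thin book: mostly unfilled
```

Real fill probability also depends on hidden liquidity and how the market reacts, so this is only a first-order guide.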
The combination of all of these factors establishes the execution criteria. The algorithm may determine that the market it will operate in is more suited to an aggressive strategy (looking to profit from every market movement, in bull and bear markets) or a passive one (less aggressive, looking to buy or sell assets at a few ticks either side of the bid-offer spread).
There are a number of tasks that the algorithm needs to perform simultaneously in order to successfully execute the requirements of the programmer. These tasks include:
There is a vast amount of information relating to each asset's price at each point in time. The algorithm will need to source all relevant information, and base its trading decisions on it. Higher processing speeds are a huge factor for success here, as the market may already have adjusted to reflect the fair price that the information implies.
In order for a trade to be completed, a “match” needs to be found. If an algo tries to sell an asset at a particular price, it must find an investor willing to purchase at the price that the algo has defined.
Once an algorithm has found an asset which meets the execution criteria, it must be able to enter the market and purchase or sell the asset at the determined price, taking into account the method of execution given the size of the order.
The factors required for the algorithm to execute a trade need to be predetermined, either by the programmer or by the algorithm itself (if it is learning to identify certain patterns in the market). This quantitative analysis needs to be carried out alongside its trading activities.
Smart Order Routing
With the constant growth in trading venues and exchanges, the algorithm must know to check all of them for the best possible price.
There are multiple varieties of algorithms which are used in the industry, but they can be broken down into three main types:
Impact Driven Algorithms
The aim of this type of algorithm is to reduce the potential market impact of any trade, and also to limit the ability of competing algorithms to work out the strategy it is employing. The most common way the algorithm achieves this is by using the Volume Weighted Average Price (VWAP), which is the average price paid for a stock across all trading of that stock in the day, weighted by volume. It is defined as:

VWAP = Σ (Price × Volume) / Σ Volume

where the sums run over every trade in the stock that day.
The figure established by this formula is used as a rough guide for executing a trade. For an algorithm trying to buy a stock, if the buy price is below the VWAP, the trade is good; if it is higher, the trade is bad. In order to make the most of this strategy, it is important to make the most of market liquidity: if the market is not very liquid, even the smallest purchases can have a large impact on the price.
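The VWAP benchmark and the buy-side decision rule above can be sketched directly. The trade data is illustrative:

```python
# VWAP: total notional traded divided by total volume traded, i.e. the
# volume-weighted average price across the day's trades.

def vwap(trades):
    """trades: list of (price, volume) pairs."""
    notional = sum(p * v for p, v in trades)
    volume = sum(v for _, v in trades)
    return notional / volume

day = [(100.0, 500), (101.0, 300), (99.5, 200)]
benchmark = vwap(day)

def good_buy(fill_price, benchmark):
    # For a buyer, any fill below the VWAP counts as a good trade
    return fill_price < benchmark
```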
Another impact driven strategy is the Time Weighted Average Price (TWAP) / Sensitivity strategy. This looks to either buy or sell the order in equal split sizes throughout the day. The size of the orders can be changed slightly to account for slight differences in the market liquidity, and also to mask the trading activity from rival algorithms.
A Percent of Volume strategy looks to trade at no more than a certain predefined percentage of the market's available liquidity. If there are 10,000 stocks in the market and the trader currently holds 800, a 5% volume strategy means the trader would sell 300 of their shares to come down to the 5% market liquidity level of 500.
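The worked example above reduces to a simple cap calculation, sketched here with the same illustrative numbers:

```python
# Percent of Volume cap: if the holding exceeds the allowed percentage of
# market liquidity, sell down to the cap. Figures match the example above.

def shares_to_sell(market_liquidity: int, holding: int, cap: float) -> int:
    """Shares to sell (if any) to get back under the percentage cap."""
    limit = int(market_liquidity * cap)
    return max(0, holding - limit)

# 5% of 10,000 = 500; holding 800 means selling 300
to_sell = shares_to_sell(10_000, 800, 0.05)
```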
Cost Driven Algorithms
The aim of this type of strategy is to minimise the transaction costs of each trade (fees, taxes, commissions, opportunity costs and market impact).
The Implementation Shortfall strategy looks to identify where the average execution price deviates from a determined benchmark (e.g. the VWAP or the strike price), the benchmark being the price at which the transaction could in theory be carried out without any cost. To minimise the gap between the average execution price and the cost-free strike price, the investor finds the optimal trading horizon, and the size and frequency of the trades needed to execute the transaction, with market impact and risk in mind.
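Measuring the shortfall itself is straightforward: it is the gap between the average execution price actually achieved and the benchmark price. A minimal sketch, with illustrative fills:

```python
# Implementation shortfall per share: average achieved execution price
# minus the benchmark (decision) price. Fill data is illustrative.

def avg_execution_price(fills):
    """fills: list of (price, size) child executions."""
    return sum(p * s for p, s in fills) / sum(s for _, s in fills)

def shortfall_per_share(fills, benchmark_price, side="buy"):
    gap = avg_execution_price(fills) - benchmark_price
    # A buyer loses by paying above the benchmark; a seller by receiving below it
    return gap if side == "buy" else -gap

fills = [(100.10, 400), (100.25, 600)]
cost = shortfall_per_share(fills, benchmark_price=100.00)
```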
The Market Close strategy benchmarks against the closing price of the market. This means that the algorithm will execute its trades close to the end of the day.
Opportunistic Algorithms
These types of strategy aim to profit from any favourable prices, liquidity or volatility of the stocks in the market.
The Price Inline algorithm is an adjustment of the impact driven algorithms (based on the VWAP or Percent of Volume strategies), but with the addition of a price adaptive function. The adaptive element looks to analyse where the price of the stock diverges from the set benchmark.
Liquidity Driven algorithms were initially created for investors to trade in traditionally illiquid markets. The existence of multiple trading venues means that the market is fragmented, and the algorithm will search all venues for hidden liquidity (and when liquidity becomes available) in order to fulfil its order.
Finally, the Pairs Trading algorithm looks to find tradable pairs in the market, buying and selling simultaneously in order to make a near risk-free profit (statistical arbitrage). The main assumption is that any divergence between the two assets is likely to revert back to the average.
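A common way to express this mean-reversion assumption, sketched below, is a z-score on the spread between the two assets: trade when the spread is unusually far from its historical average. The spread history and threshold are illustrative:

```python
import statistics

# Pairs-trading signal: compute a z-score of the current spread against its
# history, and trade when the divergence is extreme, betting on reversion.

def pairs_signal(spread_history, current_spread, threshold=2.0):
    """Return 'short_spread', 'long_spread' or 'hold' from a z-score."""
    mean = statistics.mean(spread_history)
    sd = statistics.stdev(spread_history)
    z = (current_spread - mean) / sd
    if z > threshold:
        return "short_spread"   # spread rich: sell the expensive leg, buy the cheap one
    if z < -threshold:
        return "long_spread"    # spread cheap: buy the expensive leg's pair
    return "hold"

history = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95]
signal = pairs_signal(history, current_spread=1.4)
```

In practice the "risk-free" label is optimistic: the profit depends entirely on the reversion assumption holding, which is exactly what fails in a regime change.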
Outside of these three algorithm types, there are a variety of other strategies which have gained prominence. The Arrival Price strategy aims to analyse the difference between the current market price of the asset, and the midpoint of the bid-offer spread. Once the stock is trading at or below the midpoint, then the stock will be bought. Conversely, if the stock is trading above the midpoint, then the stock will be sold. Furthermore, the Target Volume strategy looks to obtain and retain a fixed percentage of the active market traded volume. As the availability of assets in the market adjusts, the shares are either bought or sold to return to the defined percentage.
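The Arrival Price midpoint rule above can be sketched as a simple comparison. The quotes are illustrative:

```python
# Arrival Price rule: compare the traded price to the midpoint of the
# bid-offer spread; buy at or below the midpoint, sell above it.

def midpoint(bid: float, offer: float) -> float:
    return (bid + offer) / 2

def arrival_decision(last_price: float, bid: float, offer: float) -> str:
    mid = midpoint(bid, offer)
    if last_price <= mid:
        return "buy"    # trading at or below the midpoint
    return "sell"       # trading above the midpoint

decision = arrival_decision(last_price=99.9, bid=99.8, offer=100.2)
```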
High Frequency Trading is a form of algorithmic trading, but the two are not the same thing. It looks to exploit price mismatches in the market, without any regard for the fundamentals of a stock or for market conditions. The mismatches do not last long, so the trading systems execute the trades almost instantaneously to capture the profit.
So what are the risks of creating your own algorithm?
Putting high-value trading decisions in the hands of a computer is immediately a daunting prospect. Defining how the system will react to certain market events only builds a very primitive system, one which is theoretically unable to deal with any shocks or "black swan" events. Putting faith in a system to deal with unpredictable shocks is a huge risk, and ensuring that controls are in place to handle exceptional events is incredibly important. By implementing sophisticated detection techniques that analyse data sources such as the media or Twitter, or that can identify the signs of malpractice, the risk could be identified before it materialises, and the potential loss minimised.
Implementing sophisticated algorithms that are sensitive to market changes and that calculate risk comes at a high cost. Any addition to an algorithm's processing takes away from the hardware's power to carry out its main function. Given that the performance of the algorithm can suffer from any risk-mitigating code, many favour profit-making instead; it is partly because of this that many believe algorithms have made the market far more volatile than before. If you decide that creating an algorithm to trade your spare funds is something you wish to do, just be sure that you understand the risks of putting a computer in control of your finances.