Editor's note: On July 30, 2016, high-frequency trader Lio, the subject of this trading column, gave a seminar on quantitative finance and high-frequency trading at the invitation of the Hong Kong alumni association of Jiao Tong University.
[Figure 1]
[Figure 2]
[Figure 3]
The Market-Making Strategy
The main purpose of a market-making strategy is to provide liquidity: post orders on both sides of the book, narrow the bid/ask spread, and earn the difference in the middle. In other words, some market makers simply do this better than others. There is a lot to work out here, such as how to control your inventory and your risk, and there is also a lot of prediction involved, both volatility prediction and price prediction. The IT cost is high, because everyone is competing and everyone wants to be faster, from co-location to FPGAs, and now microwave links. For the average investor, the existence of market makers means being able to buy and sell at a better price.

[Figure 4]

This is how one of my strategies performed on August 12th last year in the newly listed SSE 50 stock-index futures. On that day the total market volume was 225,000 lots, and my strategy accounted for 4.1% of it (9,180 lots). The P&L was good and the drawdown was small. The capital requirement was also low: only 500,000 was needed for the full day, earning over 210,000, a return of 43.5%. In July last year, because of the stock market crash, regulators began to restrict some participants in stock-index futures. As you can see, over a few days in July the bid/ask spread showed signs of widening; by September 7th, regulators moved against speculators, raising margin to 40%, increasing the settlement fee to 0.23%, and capping new positions at 10 lots per instrument per day.

[Figure 5]
[Figure 6]

In short, market-making strategies increase market liquidity, narrow the bid/ask spread, and reduce slippage when buy/sell volume is heavy. A market-making strategy needs a rough estimate of what a reasonable price is. For index futures, for example, some people use a basket of stocks to estimate the fair price of the future.
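The core loop described above can be sketched in a few lines. This is a minimal illustration, not the strategy from the talk: the function names, the fixed half-spread, and the per-lot inventory skew are all assumptions chosen for clarity. It quotes symmetrically around a fair-price estimate and skews the quotes against current inventory, which is one common way to control the position risk mentioned above.

```python
from dataclasses import dataclass

@dataclass
class Quote:
    bid: float
    ask: float

def make_quotes(fair_price: float, half_spread: float,
                inventory: int, skew_per_lot: float = 0.01) -> Quote:
    """Quote around a fair-price estimate, shifting the quotes
    down when long (to sell more easily) and up when short."""
    mid = fair_price - inventory * skew_per_lot
    return Quote(bid=round(mid - half_spread, 2),
                 ask=round(mid + half_spread, 2))

# With 10 lots of long inventory, both quotes are skewed down:
q = make_quotes(fair_price=100.0, half_spread=0.05, inventory=10)
```

A real market maker would of course derive `fair_price` from a predictive model (for index futures, perhaps from the underlying basket) and would re-quote continuously as the book moves.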
Statistical Arbitrage

Each of these is a big topic, and I can only touch on them. Statistical arbitrage involves probability, data mining, modeling, trade execution, and data cleaning. Data handling is very important, and it can be a real headache when you don't do it properly. There is a classic saying: garbage in, garbage out. Many quants spend a lot of time processing data. One of the simplest statistical-arbitrage models is the historical price spread with execution bands on both sides. For example, you buy milk powder for 100 in Hong Kong and sell it for 120 on the mainland; in between you spend 10 on fares and end up making 10. Gold, for example, has standard contracts in both the domestic and foreign markets that are theoretically worth the same. But prices fluctuate, and if we track the spread and find that it deviates from its historical statistical range — around Brexit, for instance, we found that Chinese gold was cheaper and American gold more expensive — there is a trade.
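The "historical spread with bands" idea above can be sketched as a z-score rule. This is a generic illustration under assumed entry/exit thresholds (2.0 and 0.5 standard deviations), not thresholds given in the talk: when the spread between the two gold contracts is rich relative to its history, sell the expensive leg and buy the cheap one; flatten when the spread returns toward its mean.

```python
import statistics

def zscore(spread_history, current_spread):
    """How many standard deviations the current spread sits
    from its historical mean."""
    mu = statistics.fmean(spread_history)
    sigma = statistics.stdev(spread_history)
    return (current_spread - mu) / sigma

def band_signal(z, entry=2.0, flat=0.5):
    """Execution bands on both sides of the historical spread."""
    if z > entry:
        return "sell spread"   # spread rich: sell expensive leg, buy cheap leg
    if z < -entry:
        return "buy spread"    # spread cheap: do the opposite
    if abs(z) < flat:
        return "flatten"       # back inside the band: take profit
    return "hold"

z = zscore([1.0, 1.2, 0.8, 1.1, 0.9], current_spread=1.5)
```

The hard part in practice is exactly what the text warns about: cleaning the two price series so the spread you compute is real, and deciding how much history defines "normal."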
Prediction

By comparing past market data with the current market environment, you predict future price movement, for example with a linear factor model of the form: predicted move = a·factor1 + b·factor2 + c·factor3. The "future" can be the next second, the next minute, the next trading day, the next week, or the next month. If your model predicts accurately, it is impressive no matter which horizon it works on.

[Figure 7]

The basic process is to gather the data and figure out what is actually moving the market. You can start quickly — fit a straight line and get results fast — but how long will your model stay stable? It takes constant tuning, a constant cycle. Of course there are a lot of factors now; some people just throw in 500 factors and let the model tell them which are useful and which are not, removing the highly correlated factors automatically. But I'm still learning that approach and don't have much experience with it. The real secret is simplicity, though not in the sense of being easy: the simplest prediction model is that price reverts to its moving average. What period that moving average should use is something you have to polish yourself. Most of the complexity in between comes from the data. Both the data and the factors need constant polishing.
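The "simplest prediction model" named above — reversion to a moving average — can be written in a few lines. This is a sketch, not the speaker's model: the window of 5 and the reversion speed `kappa` are illustrative parameters, exactly the kind of thing the text says you have to polish yourself.

```python
def predict_next(prices, window=5, kappa=0.3):
    """Mean-reversion sketch: forecast the next price as a partial
    reversion of the last price toward a rolling average.

    kappa is the assumed fraction of the gap closed per step;
    window is the moving-average period, which must be tuned."""
    ma = sum(prices[-window:]) / window
    last = prices[-1]
    return last + kappa * (ma - last)

# A price spike above the average is forecast to partially revert:
forecast = predict_next([10.0, 10.0, 10.0, 10.0, 20.0])
```

A multi-factor version would replace the single `(ma - last)` term with the weighted sum of factors described above, with the weights fit by regression and the correlated factors pruned.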
In both kinds of strategy, IT is important, and IT problems can lose you a lot of money (you may have made a lot, only to give it back through mistakes).

[Figure 8]

The IT system is mainly divided into four parts. Price data is relatively simple; fundamental data, and especially unstructured data, are more complex and require a lot of programmer work: how to collect it, format it, unify it, and access it. As a quant, I want to be able to pull a day's worth of data and draw a chart. We are basically there now: it is easy to take a pile of data and do a pile of things with it, and the quant at the other end writes very little code. Of course you can't afford mistakes; your tolerance for errors must be low and your ability to catch them high. We have seen it before: the backtest looks great, we make money every day, and then the results turn out to be wrong — a very stupid mistake. Execution is all kinds of APIs, all kinds of market access, all kinds of controls. In the high-frequency domain, speed is critical: a lot of data is public and many people can see it, and when many people see the same opportunity, only the fastest gets it. For backtesting, sometimes a quant comes up with something your backtesting system doesn't support yet, and you need to change the framework of the backtester. Visualization is important too. You can't say, "Generate me a bunch of numbers, I can't see anything in that" — a chart makes things obvious. We've spent a lot of time drawing charts in Scala and in R, because a pile of charts and a pile of numbers are not the same thing. The speed of backtesting also matters. If backtesting one strategy over a year of data takes a week, whoever is waiting on your results would find a minute far more acceptable. Parameters in a strategy go through an iterative process — say I want to see how a parameter behaves from 1 to 100 — and we have made a lot of optimizations here, such as how data is fetched and cached, to improve performance in between.
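The parameter sweep and data-caching optimization mentioned above can be sketched as follows. Everything here is hypothetical — the toy data loader, the momentum rule, and the thresholds are stand-ins — but it shows the pattern: cache the expensive data load once, then iterate the cheap backtest over the parameter grid.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def load_data(year: int):
    """Stand-in for an expensive data load (disk, database, network).
    The cache means a 100-point parameter sweep loads the data once."""
    return tuple(100.0 + 0.1 * t for t in range(1000))

def backtest(year: int, threshold: float) -> float:
    """Toy momentum rule: 'buy' whenever the one-step move
    exceeds the threshold, and book that move as P&L."""
    prices = load_data(year)  # served from cache after the first call
    pnl = 0.0
    for prev, cur in zip(prices, prices[1:]):
        if cur - prev > threshold:
            pnl += cur - prev
    return pnl

# Sweep the parameter instead of re-running data loading each time:
results = {th: backtest(2015, th) for th in (0.05, 0.2)}
```

Distributing this is the natural next step: each machine gets a slice of the parameter grid, which is exactly the cloud experiment described below.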
At my last company I ran some cloud-computing experiments, distributing the backtesting engine across many servers, so that one request comes in and many machines run at the same time. The other part is monitoring, and there is a lot of automation in it. With many strategies running, how to monitor risk and how to alert is also a very important link. Our current strategies are automated: every strategy is monitored, the risk level of each strategy cannot exceed its limit, and anything beyond the limit raises an alarm. We also trade at night, so it is not realistic to have programmers staying up late all the time. When you trade many instruments, it is basically impossible for a person to watch everything, so you have to do a lot of monitoring.
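The per-strategy risk limit and alarm described above reduces to a simple check run on a schedule. This is a minimal sketch with made-up strategy names and limits; a production system would also page someone, persist the alerts, and watch many more metrics than raw exposure.

```python
def check_risk(exposures: dict, limits: dict) -> list:
    """Compare each strategy's absolute exposure against its
    configured limit and return an alert line for every breach."""
    alerts = []
    for name, exposure in exposures.items():
        limit = limits.get(name, 0.0)  # unknown strategy -> limit 0, always alerts
        if abs(exposure) > limit:
            alerts.append(f"ALERT {name}: exposure {exposure} exceeds limit {limit}")
    return alerts

# Hypothetical strategies: the market maker is over its limit, the stat-arb is not.
alerts = check_risk({"mm_strategy": 120.0, "statarb_gold": -30.0},
                    {"mm_strategy": 100.0, "statarb_gold": 50.0})
```

Run unattended every few seconds, a check like this is what lets the overnight sessions trade without a programmer watching the screen.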
Flash Boys
Quantitative Trading: How to Build Your Own Algorithmic Trading Business
The Quants: How a New Breed of Math Whizzes Conquered Wall Street and Nearly Destroyed It
The Problem of HFT - Collected Writings on High Frequency Trading & Stock Market Structure Reform
Inside the Black Box: A Simple Guide to Quantitative and High Frequency Trading
Algorithmic Trading: Winning Strategies and Their Rationale
Quantitative Trading with R: Understanding Mathematical and Computational Tools from a Quant's Perspective
http://numericalmethod.com/courses/introduction-to-algorithmic-tradingstrategies-2011-2013/
https://www.quantstart.com/articles/beginners-guide-to-quantitative-trading
https://www.zhihu.com/publications/nacl/19550372