9/23/2019

Introduction

Objective

The objective of this project is to understand how human traders make buy/sell decisions in limit order books by examining variations in the volumes and prices at different levels. Hypothesis: the resulting model can be used for transfer learning in other limit order book prediction tasks. The secondary objective (and the most challenging task) was to scale up the preprocessing of the financial data

Example Image

Data source

The data was privately acquired; only results will be shown

Data / feature engineering

Features, filters and labels are engineered as in the previous analysis. The final volumes at different prices were converted into a matrix (image) whose shape matches the price levels. For each day, the ticker with the most data was chosen for modeling
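The conversion of book volumes into an image can be sketched as below. The function name and the exact level layout (bids in one column, asks in the other, best price first) are assumptions for illustration; the original arrangement is not specified here.

```python
import numpy as np

def snapshot_to_image(bid_volumes, ask_volumes, n_levels=10):
    """Stack bid and ask volumes at each price level into a 2-D array.

    bid_volumes / ask_volumes: sequences of volumes, best price first.
    Returns an (n_levels, 2) array: column 0 = bids, column 1 = asks.
    Missing levels stay at zero.
    """
    img = np.zeros((n_levels, 2))
    img[:len(bid_volumes), 0] = bid_volumes[:n_levels]
    img[:len(ask_volumes), 1] = ask_volumes[:n_levels]
    return img

frame = snapshot_to_image([500, 300, 200], [450, 350, 100], n_levels=5)
# frame.shape == (5, 2); rows 3-4 remain zero
```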

Hypothesis

The image shown on the previous slide suggests that a change in level-1 volume is likely to change the VWAP. The objective is to test whether a stack of images of volume states can capture the signal of an increase or decrease in VWAP. As mentioned above, appropriate filters are applied to the returns to remove noise
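Labeling from filtered returns could look like the sketch below. The moving-average filter and the threshold are stand-ins chosen for illustration; the actual filter used in the analysis is not described here.

```python
import numpy as np

def label_vwap_moves(vwap, window=30, threshold=1e-4):
    """Label each time step by the smoothed forward change in VWAP.

    A simple moving-average filter (one possible noise filter) smooths
    the series, then the forward relative difference is thresholded
    into up (+1), flat (0), or down (-1).
    """
    kernel = np.ones(window) / window
    smooth = np.convolve(vwap, kernel, mode="same")
    fwd = np.diff(smooth, append=smooth[-1]) / smooth
    return np.where(fwd > threshold, 1, np.where(fwd < -threshold, -1, 0))
```

On a steadily rising VWAP series this produces mostly +1 labels away from the window edges.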

Problem

In the previous analysis the data set was relatively small: fewer than 100,000 examples for training and testing, because the total data used for modeling covered one day (86,400 seconds). Preprocessing took ~6 minutes per day. The current setting has ~220 daily files, implying a total preprocessing time of ~22 hours. Moreover, some hyperparameters are tied to the preprocessing, so preprocessing may have to be repeated for each experiment, which is not acceptable. Multiprocessing and MPI (over the local network) were therefore used to reduce the computing time per experiment
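Because each daily file is independent, the per-day work parallelizes trivially. A minimal sketch with the standard-library `multiprocessing` pool is below; `preprocess_day` is a hypothetical stand-in for the real ~6-minute job, and the same pattern extends to MPI ranks across machines.

```python
from multiprocessing import Pool
import time

def preprocess_day(path):
    """Stand-in for the per-day feature engineering (hypothetical name)."""
    time.sleep(0.01)  # placeholder for the ~6 minutes of real work
    return path, "features"

def preprocess_all(paths, workers=8):
    # Daily files are independent, so ~220 of them can be farmed out
    # across cores; wall time drops roughly by the worker count.
    with Pool(workers) as pool:
        return dict(pool.map(preprocess_day, paths))

if __name__ == "__main__":
    results = preprocess_all([f"day_{i}.csv" for i in range(20)], workers=4)
```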

Deep learning and transfer learning

Convolutional neural network

A convolutional neural network was applied to the images produced by the multiprocessing pipeline. It applies filters to the stack of volume images from previous time steps to model the direction of change. Volumes were scaled to remove the effect of switching tickers, which makes the network weights transferable to other limit order book problems
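A minimal PyTorch sketch of this idea is shown below. The architecture, input shapes, and per-sample max scaling are assumptions for illustration, not the model actually used; past snapshots are treated as input channels.

```python
import torch
import torch.nn as nn

class OrderBookCNN(nn.Module):
    """Small CNN over a stack of order-book 'images' (a sketch).

    Input: (batch, T, levels, sides) — T past snapshots as channels.
    Output: logits for down / flat / up.
    """
    def __init__(self, t_steps=10, n_levels=10, n_sides=2, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(t_steps, 16, kernel_size=3, padding=1),
            nn.BatchNorm2d(16),  # normalization also eases weight transfer
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.head = nn.Linear(32 * n_levels * n_sides, n_classes)

    def forward(self, x):
        # Scale volumes per sample so the weights do not depend on the
        # typical book size of any one ticker.
        x = x / (x.amax(dim=(1, 2, 3), keepdim=True) + 1e-8)
        h = self.features(x)
        return self.head(h.flatten(1))
```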

Sampling

The data set is arranged in chronological (ascending) order and is split for deep learning as follows:

  • Training: 0%-60%
  • Validation: 60%-80%
  • Test: 80%-100%
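The split above can be expressed as a small helper. Because the data is time-ordered, the split is done without shuffling so the test set lies strictly in the future of the training set (the function name is illustrative).

```python
def chrono_split(examples, train=0.6, val=0.2):
    """Split a chronologically ordered sequence into train/val/test
    contiguously, without shuffling, to avoid look-ahead leakage."""
    n = len(examples)
    i = int(n * train)
    j = int(n * (train + val))
    return examples[:i], examples[i:j], examples[j:]

tr, va, te = chrono_split(list(range(100)))
# lengths 60 / 20 / 20, with the test set entirely after validation
```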

Modeling scheme

  • Single layer and multi-layer convolutional network, single channel and multi-channel convolutions
  • Batch normalization was applied to support transfer learning. Hyperparameters were tuned using randomized grid search
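The randomized grid search mentioned above can be sketched as follows: sample a fixed number of configurations uniformly from the grid instead of enumerating every combination. The grid values and scoring callback here are hypothetical placeholders.

```python
import random

def random_search(param_grid, train_and_score, n_trials=20, seed=0):
    """Randomized grid search: evaluate n_trials random configurations
    drawn from the grid and keep the best-scoring one."""
    rng = random.Random(seed)
    best_score, best_cfg = float("-inf"), None
    for _ in range(n_trials):
        cfg = {k: rng.choice(v) for k, v in param_grid.items()}
        score = train_and_score(cfg)
        if score > best_score:
            best_score, best_cfg = score, cfg
    return best_cfg, best_score

grid = {"lr": [1e-2, 1e-3, 1e-4], "filters": [8, 16, 32], "layers": [1, 2]}
cfg, score = random_search(grid, lambda c: -c["lr"], n_trials=10)
```

This caps the number of (expensive) training runs per experiment regardless of grid size.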

Results and conclusion

  • Evaluation metric: accuracy, compared with the majority guess and benchmarked against logistic regression
  • Results: the CNN significantly outperformed the majority guess on the test set in case 7, where the buy- and sell-side volumes were arranged appropriately. All other combinations of preprocessing steps and models performed worse than the majority guess
  • Scope for improvement: all models were trained locally even though preprocessing ran on a server, because the server has no GPU while my laptop does
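The majority-guess benchmark used above is simply the accuracy of always predicting the most common label, which any model must beat to demonstrate signal. A minimal sketch:

```python
from collections import Counter

def majority_baseline_accuracy(labels):
    """Accuracy of always predicting the most frequent class —
    the benchmark a model must beat to show any predictive signal."""
    counts = Counter(labels)
    return counts.most_common(1)[0][1] / len(labels)

acc = majority_baseline_accuracy([1, 1, 0, -1, 1, 0])
# 3 of 6 labels are +1, so the baseline accuracy is 0.5
```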