
Axial-LOB: High-Frequency Trading with Axial Attention

Damian Kisiel and Denise Gorse

Papers from arXiv.org

Abstract: Previous attempts to predict stock price movements from limit order book (LOB) data have mostly been based on deep convolutional neural networks. Although convolutions offer efficiency by restricting their operations to local interactions, this comes at the cost of potentially missing long-range dependencies. Recent studies address this problem by employing additional recurrent or attention layers, which increase computational complexity. In this work, we propose Axial-LOB, a novel fully attentional deep learning architecture for predicting stock price movements from LOB data. By utilizing gated position-sensitive axial attention layers, our architecture is able to construct feature maps that incorporate global interactions while significantly reducing the size of the parameter space. Unlike previous works, Axial-LOB does not rely on hand-crafted convolutional kernels; it therefore has stable performance under input permutations and the capacity to incorporate additional LOB features. The effectiveness of Axial-LOB is demonstrated on a large benchmark dataset containing time-series representations of millions of high-frequency trading events, where our model establishes a new state of the art, achieving excellent directional classification performance at all tested prediction horizons.
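For readers who want a concrete picture of the core building block: axial attention factorizes full 2-D self-attention into two 1-D attentions, one along each axis of the 2-D LOB representation, with learned relative-position terms whose contribution is controlled by learnable gates. The sketch below is a minimal PyTorch rendering of that idea under the general gated position-sensitive axial attention formulation from the literature; the class name, gate initialization, and tensor layout are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedAxialAttention(nn.Module):
    """Self-attention restricted to one axis of a (B, C, H, W) feature map,
    with gated relative-position terms added to queries, keys and values."""

    def __init__(self, dim, heads=4, axis_len=40, axis="width"):
        super().__init__()
        assert dim % heads == 0
        self.heads, self.dh, self.axis = heads, dim // heads, axis
        self.to_qkv = nn.Conv1d(dim, dim * 3, kernel_size=1, bias=False)
        # One relative-position embedding table each for q, k, v: (head_dim, 2L-1).
        self.rel = nn.Parameter(torch.randn(3, self.dh, 2 * axis_len - 1) * 0.02)
        # Learnable gates on the positional terms; a small initial value keeps
        # the layer close to plain content-based attention early in training.
        self.gates = nn.Parameter(torch.full((3,), 0.1))
        # Lookup table mapping each (query, key) position pair to a relative offset.
        pos = torch.arange(axis_len)
        self.register_buffer("rel_idx", pos[None, :] - pos[:, None] + axis_len - 1)

    def forward(self, x):                      # x: (B, C, H, W)
        if self.axis == "height":              # fold W into batch, attend over H
            x = x.permute(0, 3, 1, 2)
        else:                                  # fold H into batch, attend over W
            x = x.permute(0, 2, 1, 3)
        b, g, c, L = x.shape
        q, k, v = (t.reshape(b * g, self.heads, self.dh, L)
                   for t in self.to_qkv(x.reshape(b * g, c, L)).chunk(3, dim=1))
        rq, rk, rv = (self.rel[i][:, self.rel_idx] for i in range(3))  # (dh, L, L)
        logits = torch.einsum("nhdi,nhdj->nhij", q, k)                 # content term
        logits = logits + self.gates[0] * torch.einsum("nhdi,dij->nhij", q, rq)
        logits = logits + self.gates[1] * torch.einsum("nhdj,dij->nhij", k, rk)
        attn = F.softmax(logits / self.dh ** 0.5, dim=-1)
        out = torch.einsum("nhij,nhdj->nhdi", attn, v)
        out = out + self.gates[2] * torch.einsum("nhij,dij->nhdi", attn, rv)
        out = out.reshape(b, g, c, L)
        return out.permute(0, 2, 3, 1) if self.axis == "height" else out.permute(0, 2, 1, 3)


# Illustrative usage: a 100-snapshot x 40-feature LOB "image" (the 40-feature
# layout is an assumption matching common LOB benchmark conventions).
attn_w = GatedAxialAttention(dim=32, heads=4, axis_len=40, axis="width")
attn_h = GatedAxialAttention(dim=32, heads=4, axis_len=100, axis="height")
x = torch.randn(8, 32, 100, 40)
y = attn_h(attn_w(x))                # global context along both axes, same shape
```

Stacking the two single-axis layers gives every output position a global receptive field while reducing the attention cost from O((HW)^2) for full 2-D self-attention to O(HW(H + W)), which is what makes a fully attentional architecture practical at LOB input sizes.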

Date: 2022-12
New Economics Papers: this item is included in nep-big, nep-cmp and nep-mst
Handle: RePEc:arx:papers:2212.01807