Towards Robust Representation of Limit Order Books for Deep Learning Models
Yufei Wu, Mahmoud Mahfouz, Daniele Magazzeni and Manuela Veloso
Papers from arXiv.org
Abstract:
The success of deep learning-based limit order book forecasting models is highly dependent on the quality and robustness of the input data representation. A significant body of the quantitative finance literature focuses on utilising different deep learning architectures without taking into consideration the key assumptions these models make with respect to the input data representation. In this paper, we highlight the issues associated with the commonly used representations of limit order book data from both theoretical and practical perspectives. We also show the fragility of these representations under adversarial perturbations and propose two simple modifications to the existing representations that match the theoretical assumptions of deep learning models. Finally, we show experimentally how our proposed representations lead to state-of-the-art performance in both accuracy and robustness utilising very simple neural network architectures.
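To make the representation issue concrete, the sketch below contrasts a "raw" limit order book feature vector (absolute price and volume levels, a common convention in the LOB forecasting literature) with a simple shift-invariant alternative that expresses prices relative to the mid-price. This is a hypothetical illustration of the general idea, not the paper's specific modifications; the level count, tick size, and transforms are assumptions for the example.

```python
import numpy as np

# Hypothetical sketch (not the paper's exact scheme): a common "raw" LOB
# representation stacks the top-N price/volume levels per side into one
# feature vector, e.g. [p_ask1, v_ask1, p_bid1, v_bid1, ...].
N_LEVELS = 10
TICK = 0.01
rng = np.random.default_rng(0)

mid = 100.0
ask_prices = mid + TICK * np.arange(1, N_LEVELS + 1)   # best ask outward
bid_prices = mid - TICK * np.arange(1, N_LEVELS + 1)   # best bid outward
ask_vols = rng.integers(1, 500, N_LEVELS).astype(float)
bid_vols = rng.integers(1, 500, N_LEVELS).astype(float)

# Raw representation: absolute prices dominate the feature scale, and a
# uniform price shift (which leaves the book's shape unchanged) moves
# every price feature while the volume features stay put.
raw = np.column_stack([ask_prices, ask_vols, bid_prices, bid_vols]).ravel()

def relative_features(ask_p, ask_v, bid_p, bid_v):
    """Prices in ticks relative to the mid-price; volumes in log scale."""
    m = (ask_p[0] + bid_p[0]) / 2.0
    return np.column_stack([
        (ask_p - m) / TICK, np.log1p(ask_v),
        (bid_p - m) / TICK, np.log1p(bid_v),
    ]).ravel()

rel = relative_features(ask_prices, ask_vols, bid_prices, bid_vols)
# A level shift of all prices leaves the relative features unchanged,
# unlike the raw representation.
shifted = relative_features(ask_prices + 5.0, ask_vols,
                            bid_prices + 5.0, bid_vols)
assert np.allclose(rel, shifted)
assert not np.allclose(raw[:1], raw[:1] + 5.0)
```

Because the relative features are invariant to a uniform price shift, a model trained on them does not need to relearn the book's shape every time the price level drifts, which is one flavour of the stationarity mismatch the paper highlights.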
Date: 2021-10, Revised 2022-12
New Economics Papers: this item is included in nep-big, nep-cmp, nep-fmk and nep-mst
Downloads: http://arxiv.org/pdf/2110.05479 Latest version (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:2110.05479