Multi-Transformer: A New Neural Network-Based Architecture for Forecasting S&P Volatility
Eduardo Ramos-Pérez,
Pablo J. Alonso-González and
José Javier Núñez-Velázquez
Papers from arXiv.org
Abstract:
Events such as the Financial Crisis of 2007-2008 and the COVID-19 pandemic caused significant losses to banks and insurance entities. They also demonstrated the importance of using accurate equity risk models and of having a risk management function able to implement effective hedging strategies. Stock volatility forecasts play a key role in the estimation of equity risk and, thus, in the management actions carried out by financial institutions. This paper therefore aims to propose more accurate stock volatility models based on novel machine and deep learning techniques. It introduces a neural network-based architecture, called Multi-Transformer, a variant of the Transformer models that have already been applied successfully in natural language processing. The paper also adapts traditional Transformer layers for use in volatility forecasting models. The empirical results suggest that hybrid models based on Multi-Transformer and Transformer layers are more accurate and, hence, lead to more appropriate risk measures than other autoregressive algorithms or hybrid models based on feed-forward layers or long short-term memory (LSTM) cells.
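The core mechanism behind the Transformer layers mentioned in the abstract is scaled dot-product self-attention applied to a window of past observations (e.g. lagged returns) instead of recurrent LSTM cells. The snippet below is a minimal NumPy sketch of a single self-attention layer of this kind; the window length, embedding size, weight initialization, and function names are illustrative assumptions, not the authors' actual Multi-Transformer specification.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a window of features.

    X: (T, d) array, e.g. embedded lagged returns for a T-day window.
    Returns the attended output (T, d) and the attention matrix (T, T).
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])  # (T, T) similarity logits
    A = softmax(scores, axis=-1)            # each row is a prob. dist. over days
    return A @ V, A

rng = np.random.default_rng(0)
T, d = 30, 8                        # 30-day lookback, 8-dim embedding (assumed)
X = rng.standard_normal((T, d))     # stand-in for an embedded return series
Wq, Wk, Wv = (0.1 * rng.standard_normal((d, d)) for _ in range(3))
out, A = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (30, 8)
```

In a full forecasting model, several such layers (or, in the Multi-Transformer variant, several attention mechanisms whose outputs are combined) would feed a final layer that outputs the volatility estimate for the next period.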
Date: 2021-09
New Economics Papers: this item is included in nep-big, nep-cmp, nep-for, nep-ias and nep-rmg
Citations: 8 (tracked in EconPapers)
Published in Mathematics 2021, 9, 1794
Downloads: http://arxiv.org/pdf/2109.12621 (latest version, application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:2109.12621