Recurrent Neural Networks with more flexible memory: better predictions than rough volatility
Damien Challet and Vincent Ragel (vincent.ragel@centralesupelec.fr)
Additional contact information
Vincent Ragel: MICS - Mathématiques et Informatique pour la Complexité et les Systèmes - CentraleSupélec - Université Paris-Saclay
Working Papers from HAL
Abstract:
We extend recurrent neural networks to include several flexible timescales for each dimension of their output, which mechanically improves their ability to account for processes with long memory or with highly disparate timescales. We compare the ability of vanilla and extended long short-term memory (LSTM) networks to predict asset price volatility, which is known to have long memory. Generally, the number of epochs needed to train the extended LSTMs is halved, while the variation of validation and test losses among models with the same hyperparameters is much smaller. We also show that the model with the smallest validation loss systematically outperforms rough volatility predictions by about 20% when trained and tested on a dataset with multiple time series.
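For context, rough volatility models forecast log-variance with a fractional kernel governed by a Hurst exponent H (empirically close to 0.1). Assuming the benchmark is the classical predictor of Gatheral, Jaisson and Rosenbaum, it reads, in LaTeX notation,

\mathbb{E}\left[\log\sigma^2_{t+\Delta}\mid\mathcal{F}_t\right] = \frac{\cos(H\pi)}{\pi}\,\Delta^{H+1/2}\int_{-\infty}^{t}\frac{\log\sigma^2_s}{(t-s+\Delta)(t-s)^{H+1/2}}\,\mathrm{d}s,

though the paper itself should be consulted for the exact variant used.

The abstract's "several flexible timescales for each dimension of the output" can be illustrated with a minimal sketch: a recurrent cell that keeps several exponential moving averages of its candidate state, each with its own learnable decay in (0, 1), and mixes them into a single hidden state. Everything below (class name, mixing layer, hyperparameters) is an illustrative assumption written in PyTorch, not the authors' implementation:

import torch
import torch.nn as nn

class MultiTimescaleRNNCell(nn.Module):
    """Toy recurrent cell with n_scales learnable timescales per hidden unit."""
    def __init__(self, input_size: int, hidden_size: int, n_scales: int = 3):
        super().__init__()
        # Candidate state, as in a plain recurrent cell.
        self.candidate = nn.Linear(input_size + hidden_size, hidden_size)
        # One unconstrained parameter per (scale, unit); a sigmoid maps it
        # into (0, 1), so each decay -- i.e. each timescale -- is learned.
        self.decay_logits = nn.Parameter(torch.randn(n_scales, hidden_size))
        # Mixes the n_scales smoothed states back into one hidden state.
        self.mix = nn.Linear(n_scales * hidden_size, hidden_size)

    def forward(self, x, state):
        # state: (batch, n_scales, hidden) -- EMAs of past candidate states.
        h_prev = torch.tanh(self.mix(state.flatten(1)))
        h_tilde = torch.tanh(self.candidate(torch.cat([x, h_prev], dim=-1)))
        alpha = torch.sigmoid(self.decay_logits)          # (n_scales, hidden)
        state = alpha * state + (1.0 - alpha) * h_tilde.unsqueeze(1)
        return torch.tanh(self.mix(state.flatten(1))), state

# One step on a toy batch: 4 series, 1 input feature, 8 hidden units, 3 scales.
cell = MultiTimescaleRNNCell(input_size=1, hidden_size=8, n_scales=3)
x = torch.randn(4, 1)
state = torch.zeros(4, 3, 8)
h, state = cell(x, state)

Decays learned close to 1 give slowly forgetting components (long memory), while decays near 0 give fast ones, which is how such a cell can accommodate highly disparate timescales; the paper's extension targets LSTMs, so the actual mechanism presumably lives in the gated cell state rather than in this toy cell.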
Keywords: Time series; Long memory; Recurrent Neural Networks; Rough Volatility; Volatility modelling
Date: 2023-07-18
Note: View the original document on HAL open archive server: https://hal.science/hal-04165354
Downloads: https://hal.science/hal-04165354/document (application/pdf)
Related works:
Working Paper: Recurrent Neural Networks with more flexible memory: better predictions than rough volatility (2023) 
Persistent link: https://EconPapers.repec.org/RePEc:hal:wpaper:hal-04165354