Neural stochastic differential equations for conditional time series generation using the Signature-Wasserstein-1 metric

Pere Díaz Lozano, Toni Lozano Bagén and Josep Vives

Journal of Computational Finance

Abstract: (Conditional) generative adversarial networks (GANs) have had great success in recent years, owing to their ability to approximate (conditional) distributions over extremely high-dimensional spaces. However, they are highly unstable and computationally expensive to train, especially in the time series setting. Recently, the use of a key object in rough path theory, the signature of a path, has been proposed: it converts the min–max formulation of the (conditional) GAN framework into a classical minimization problem. However, this method is extremely costly in terms of memory, which can sometimes become prohibitive. To overcome this, we propose the use of conditional neural stochastic differential equations, which are designed to have a memory cost that is constant in depth and are therefore more memory-efficient than traditional deep learning architectures. We empirically compare our proposed model with other classical approaches, in terms of both memory cost and computational time, and show that it usually outperforms them according to several metrics.
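To make the "signature of a path" concrete: for a piecewise-linear path, the low-depth signature has a closed form, and segments combine via Chen's identity. The sketch below is a minimal illustration in plain NumPy (the function name and setup are our own, not from the paper); it computes the depth-2 signature, whose invariance to resampling of a straight line is a standard sanity check.

```python
import numpy as np

def signature_depth2(path):
    """Depth-2 signature of a piecewise-linear path.

    path: array of shape (T, d) -- T sample points in d dimensions.
    Returns (level1, level2): the total increment (shape (d,)) and the
    matrix of second-order iterated integrals (shape (d, d)).

    A single linear segment with increment D has signature (D, D (x) D / 2);
    consecutive segments are combined with Chen's identity:
        S2(a concat b) = S2(a) + S2(b) + S1(a) (x) S1(b).
    """
    d = path.shape[1]
    s1 = np.zeros(d)
    s2 = np.zeros((d, d))
    for k in range(len(path) - 1):
        inc = path[k + 1] - path[k]
        # Chen's identity: cross term between what we have so far and
        # the new segment, plus the new segment's own level-2 term.
        s2 += np.outer(s1, inc) + np.outer(inc, inc) / 2.0
        s1 += inc
    return s1, s2

# A straight line from (0, 0) to (1, 1), sampled at 11 points: the
# signature depends only on the path, not on the sampling, so it equals
# the single-segment closed form (1, 1) and 0.5 * ones((2, 2)).
t = np.linspace(0.0, 1.0, 11)[:, None]
line = np.hstack([t, t])
s1, s2 = signature_depth2(line)
```

Truncating the signature at a fixed depth yields a finite feature vector for each path; the Signature-Wasserstein-1 approach referenced in the title compares distributions of paths through such (expected) signature features, which is what turns the adversarial min–max problem into a plain minimization.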


Downloads: (external link)
https://www.risk.net/journal-of-computational-fina ... wasserstein-1-metric (text/html)



Persistent link: https://EconPapers.repec.org/RePEc:rsk:journ0:7957441


More articles in Journal of Computational Finance
Bibliographic data for series maintained by Thomas Paine.

 
Page updated 2025-03-19
Handle: RePEc:rsk:journ0:7957441