Meta-Learning Neural Process for Implied Volatility Surfaces with SABR-induced Priors

Jirong Zhuang and Xuan Wu

Papers from arXiv.org

Abstract: Constructing the implied volatility surface (IVS) is reframed as a meta-learning problem: training across trading days, the model learns a general process that reconstructs a full IVS from a few quotes, eliminating daily recalibration. We introduce the Volatility Neural Process, an attention-based model trained in two stages: pre-training on SABR-generated surfaces to encode a financial prior, followed by fine-tuning on market data. On S&P 500 options (2006-2023; out-of-sample 2019-2023), our model outperforms SABR, SSVI, Gaussian Process, and an ablation trained only on real data. Relative to the ablation, the SABR-induced prior reduces RMSE by about 40% and dominates in mid- and long-maturity regions where quotes are sparse. The learned prior suppresses large errors, providing a practical, data-efficient route to stable IVS construction with a single deployable model.
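
As a rough illustration of the pre-training data described above, the sketch below uses the Hagan et al. (2002) lognormal SABR approximation to generate synthetic implied-volatility surfaces. It is a minimal example under stated assumptions: the abstract confirms only that the pre-training surfaces are SABR-generated; the function names, per-maturity parameter sampling, and parameter ranges here are illustrative assumptions, not the authors' implementation.

    # Minimal sketch (assumptions noted above), not the authors' code.
    import numpy as np

    def sabr_implied_vol(F, K, T, alpha, beta, rho, nu):
        """Hagan et al. (2002) lognormal SABR implied-volatility approximation."""
        F, K, T = np.broadcast_arrays(np.asarray(F, float),
                                      np.asarray(K, float),
                                      np.asarray(T, float))
        log_fk = np.log(F / K)
        fk_beta = (F * K) ** ((1.0 - beta) / 2.0)           # (F*K)^((1-beta)/2)
        # Denominator expansion in log-moneyness
        denom = fk_beta * (1.0
                           + ((1.0 - beta) ** 2 / 24.0) * log_fk ** 2
                           + ((1.0 - beta) ** 4 / 1920.0) * log_fk ** 4)
        # Maturity correction term
        term = 1.0 + (((1.0 - beta) ** 2 / 24.0) * alpha ** 2 / fk_beta ** 2
                      + rho * beta * nu * alpha / (4.0 * fk_beta)
                      + (2.0 - 3.0 * rho ** 2) / 24.0 * nu ** 2) * T
        z = (nu / alpha) * fk_beta * log_fk
        x = np.log((np.sqrt(1.0 - 2.0 * rho * z + z ** 2) + z - rho) / (1.0 - rho))
        # z/x -> 1 at the money (z ~ 0); guard the 0/0 case
        ratio = np.divide(z, x, out=np.ones_like(z), where=np.abs(z) > 1e-12)
        return alpha / denom * ratio * term

    def sample_sabr_surface(rng, moneyness, maturities):
        """Draw one synthetic IVS on a moneyness-by-maturity grid by sampling
        SABR parameters per maturity slice (illustrative ranges)."""
        surface = np.empty((len(maturities), len(moneyness)))
        for i, T in enumerate(maturities):
            alpha = rng.uniform(0.05, 0.5)
            beta = rng.uniform(0.3, 1.0)
            rho = rng.uniform(-0.9, 0.0)
            nu = rng.uniform(0.1, 1.5)
            surface[i] = sabr_implied_vol(1.0, moneyness, T, alpha, beta, rho, nu)
        return surface

    rng = np.random.default_rng(0)
    moneyness = np.linspace(0.7, 1.3, 13)                   # K / F
    maturities = np.array([0.08, 0.25, 0.5, 1.0, 2.0])      # years
    iv_surface = sample_sabr_surface(rng, moneyness, maturities)

In a neural-process setup, a few (moneyness, maturity, vol) points from each synthetic surface would serve as the context set and the remaining grid as targets; the pre-trained model is then fine-tuned on market quotes as the abstract describes.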

Date: 2025-09
New Economics Papers: this item is included in nep-cmp and nep-rmg

Downloads: http://arxiv.org/pdf/2509.11928 (latest version, application/pdf)

Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:2509.11928

Handle: RePEc:arx:papers:2509.11928