Gradient boosting-based numerical methods for high-dimensional backward stochastic differential equations
Long Teng
Applied Mathematics and Computation, 2022, vol. 426, issue C
Abstract:
In this work we propose new algorithms for solving high-dimensional backward stochastic differential equations (BSDEs). Based on the general theta-discretization of the time integrands, we show how to efficiently use eXtreme Gradient Boosting (XGBoost) regression to approximate the resulting conditional expectations in quite high dimensions. A rigorous analysis of the convergence and time complexity is provided. Numerical results illustrate the efficiency and accuracy of our proposed algorithms for solving very high-dimensional (up to 10,000 dimensions) nonlinear BSDEs. Notably, our new algorithms also work quite well on problems with highly complex structures in high dimensions, which cannot be tackled with most state-of-the-art numerical methods.
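To make the regression step concrete: in a time-discretized BSDE scheme, the value Y at time t_n is recovered from simulated samples at t_{n+1} by regressing Y_{n+1} on the forward state X_n, so that the fitted model approximates the conditional expectation E[Y_{n+1} | X_n]. The sketch below is not the paper's implementation; it uses a minimal hand-rolled least-squares boosting with decision stumps as a lightweight stand-in for XGBoost, on a hypothetical 1-D toy problem (Brownian forward state, payoff max(x, 0), one backward step, no driver term). All names (`fit_boosted_stumps`, `predict`) are illustrative assumptions.

```python
import numpy as np

def fit_boosted_stumps(x, y, n_rounds=200, lr=0.2):
    """Least-squares gradient boosting with decision-stump base learners (1-D).

    A lightweight stand-in for an XGBoost regressor: each round fits a
    single threshold split to the current residuals and adds a damped
    correction to the running prediction.
    """
    pred = np.full(y.shape, y.mean())
    model = [y.mean()]
    splits = np.quantile(x, np.linspace(0.05, 0.95, 19))  # candidate thresholds
    for _ in range(n_rounds):
        resid = y - pred
        best = None
        for s in splits:
            left = x <= s
            if left.all() or not left.any():
                continue
            lval, rval = resid[left].mean(), resid[~left].mean()
            sse = ((resid - np.where(left, lval, rval)) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, s, lval, rval)
        _, s, lval, rval = best
        pred = pred + lr * np.where(x <= s, lval, rval)
        model.append((s, lr * lval, lr * rval))
    return model

def predict(model, x):
    """Evaluate the boosted-stump ensemble at the points x."""
    pred = np.full(x.shape, model[0], dtype=float)
    for s, lval, rval in model[1:]:
        pred += np.where(x <= s, lval, rval)
    return pred

# One backward step on the toy problem: regress Y_{n+1} on X_n, so the
# fitted model approximates the conditional expectation E[Y_{n+1} | X_n]
# that the theta-scheme needs at each time level.
rng = np.random.default_rng(0)
dt = 0.1
x_n = rng.normal(size=5000)                          # samples of X at t_n
x_np1 = x_n + np.sqrt(dt) * rng.normal(size=5000)    # Euler step of the forward SDE
y_np1 = np.maximum(x_np1, 0.0)                       # terminal-type payoff g(X)
model = fit_boosted_stumps(x_n, y_np1)
y_n = predict(model, x_n)                            # ≈ E[g(X_{n+1}) | X_n]
```

In a full scheme this regression is repeated backward through all time levels, with a driver term added at each step; tree ensembles scale this regression to high-dimensional X where grid- or basis-function methods become infeasible.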
Keywords: Backward stochastic differential equations (BSDEs); XGBoost; High-dimensional problem; Regression
Date: 2022
Citations: 2 (tracked in EconPapers)
Downloads (external link): http://www.sciencedirect.com/science/article/pii/S009630032200203X (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:apmaco:v:426:y:2022:i:c:s009630032200203x
DOI: 10.1016/j.amc.2022.127119
Applied Mathematics and Computation is currently edited by Theodore Simos
More articles in Applied Mathematics and Computation from Elsevier
Bibliographic data for series maintained by Catherine Liu.