The efficient application of automatic differentiation for computing gradients in financial applications

Wei Xu, Xi Chen and Thomas F. Coleman

Journal of Computational Finance

Abstract: Automatic differentiation (AD) is a practical field of computational mathematics that is of growing interest across many industries, including finance. Reverse-mode AD is particularly attractive, since it computes the gradient of an objective function in time proportional to a single evaluation of the function itself. However, its memory requirements can be excessive: depending on the function's complexity and the available RAM, this can make reverse-mode AD infeasible in some cases and slower than expected in others, owing to the use of secondary memory and nonlocalized memory references. It turns out, though, that many complex (expensive) functions in finance exhibit a natural substitution structure. In this paper, we illustrate this structure as it arises in computational finance, in calibration and inverse problems, and in determining Greeks in a Monte Carlo setting. In these cases, the required memory is a small fraction of that required by straight reverse-mode AD, while the computing-time complexity is the same. In fact, our results indicate a significant realized speedup compared with straight reverse-mode AD.
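The key property the abstract relies on, that reverse-mode AD delivers the whole gradient in roughly one extra pass over the recorded computation, at the cost of storing that record in memory, can be illustrated with a toy tape-based sketch. This is a generic illustration, not the authors' implementation; the `Var` class and `backward` function are hypothetical names invented here.

```python
import math

class Var:
    """Minimal reverse-mode AD node: holds a value, its parents on the
    tape, and the local partial derivatives with respect to each parent."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # sequence of (parent_node, local_gradient)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def exp(self):
        v = math.exp(self.value)
        return Var(v, [(self, v)])  # d exp(x)/dx = exp(x)

def backward(output):
    """One reverse sweep over the tape: every node visited once, so the
    cost is proportional to the forward evaluation, but the whole tape
    must be kept in memory (the limitation the paper addresses)."""
    output.grad = 1.0
    order, seen = [], set()
    def visit(node):               # topological order via DFS
        if id(node) in seen:
            return
        seen.add(id(node))
        for parent, _ in node.parents:
            visit(parent)
        order.append(node)
    visit(output)
    for node in reversed(order):   # chain rule, accumulated backwards
        for parent, local in node.parents:
            parent.grad += node.grad * local

# f(x, y) = x*y + exp(x); df/dx = y + exp(x), df/dy = x
x, y = Var(2.0), Var(3.0)
f = x * y + x.exp()
backward(f)
print(x.grad, y.grad)  # both partials from one reverse sweep
```

Note that the tape (`order`) grows with every elementary operation in the forward pass; for the expensive pricing functions discussed in the paper, it is exactly this storage that the substitution structure reduces.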


Downloads: (external link)
https://www.risk.net/journal-of-computational-fina ... nancial-applications (text/html)



Persistent link: https://EconPapers.repec.org/RePEc:rsk:journ0:2439194

