Stochastic Algorithmic Differentiation of (Expectations of) Discontinuous Functions (Indicator Functions)
Christian Fries
Papers from arXiv.org
Abstract:
In this paper, we present a method for the accurate estimation of the derivative (a.k.a. sensitivity) of expectations of functions involving an indicator function by combining stochastic algorithmic differentiation with a regression. The method is an improvement of the approach presented in [Risk Magazine April 2018]. The finite difference approximation of a partial derivative of a Monte-Carlo integral of a discontinuous function is known to exhibit a high Monte-Carlo error. The issue is evident, since the Monte-Carlo approximation of a discontinuous function is just a finite sum of discontinuous functions and, as such, not even differentiable. The algorithmic differentiation of a discontinuous function is therefore problematic. A natural approach is to replace the discontinuity by a continuous function. This is equivalent to replacing a path-wise automatic differentiation by a (local) finite difference approximation. We present an improvement (in terms of variance reduction) by decoupling the integration of the Dirac delta from the remaining conditional expectation and estimating the two parts by separate regressions. For the algorithmic differentiation, we derive an operator that can be injected seamlessly - with minimal code changes - into the algorithm, yielding the exact result.
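The core difficulty described in the abstract can be illustrated with a minimal sketch (not the paper's regression-based method): a naive path-wise derivative of an indicator is zero almost everywhere, while replacing the indicator by a smoothed (sigmoid) version recovers a usable derivative of the expectation. The sample distribution, threshold, and smoothing width below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)  # Monte-Carlo samples of the underlying (assumed standard normal)
k = 0.5                     # threshold of the indicator 1_{x > k}

# Naive path-wise algorithmic differentiation: d/dk 1_{x > k} = 0 almost
# everywhere, so differentiating through the indicator gives 0 on every
# path -- not the derivative of the expectation E[1_{x > k}] = P(x > k).
naive_derivative = 0.0

# Smoothed indicator: replace 1_{x > k} by sigmoid((x - k) / eps).
# Its path-wise derivative w.r.t. k, averaged over paths, approximates
# -delta(x - k) integrated against the density, i.e. minus the density
# of x at k -- the true derivative of the expectation.
eps = 0.05  # smoothing width (illustrative choice; trades bias vs. variance)
s = 1.0 / (1.0 + np.exp(-(x - k) / eps))
smoothed_derivative = np.mean(-s * (1.0 - s) / eps)

# True derivative of P(x > k) w.r.t. k is minus the normal density at k.
true_derivative = -np.exp(-k**2 / 2) / np.sqrt(2 * np.pi)
```

Shrinking `eps` reduces the smoothing bias but inflates the Monte-Carlo variance, which is the trade-off motivating the variance-reduction technique (separate regressions for the Dirac delta and the conditional expectation) proposed in the paper.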
Date: 2018-11, Revised 2019-11
Downloads: http://arxiv.org/pdf/1811.05741 Latest version (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:1811.05741