Finite-Sample Guarantees for High-Dimensional DML
Victor Quintas-Martinez
Papers from arXiv.org
Abstract:
Debiased machine learning (DML) offers an attractive way to estimate treatment effects in observational settings, where identification of causal parameters requires a conditional independence or unconfoundedness assumption, since it allows the researcher to control flexibly for a potentially very large number of covariates. This paper gives novel finite-sample guarantees for joint inference on high-dimensional DML, bounding how far the finite-sample distribution of the estimator is from its asymptotic Gaussian approximation. These guarantees are useful to applied researchers because they indicate how far the coverage of joint confidence bands can be from the nominal level. There are many settings where high-dimensional causal parameters may be of interest, such as the ATE of many treatment profiles, or the ATE of a treatment on many outcomes. We also cover infinite-dimensional parameters, such as impacts on the entire marginal distribution of potential outcomes. The finite-sample guarantees in this paper complement the existing results on consistency and asymptotic normality of DML estimators, which are either asymptotic or treat only the one-dimensional case.
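To fix ideas, the DML estimator of a single ATE under unconfoundedness combines cross-fitted nuisance estimates with the doubly robust (AIPW) score, and inference rests on the Gaussian approximation to the averaged score that the paper's finite-sample bounds quantify. The sketch below is a minimal, hedged illustration of this standard construction, not the paper's specific procedure: it uses simple parametric nuisance fits (OLS outcome models and a Newton-step logistic propensity) on simulated data, where the paper's setting would allow flexible machine learners and many target parameters.

```python
import numpy as np

def fit_logit(X, d, iters=50):
    """Propensity model e(x) = P(D=1|X=x) via Newton-Raphson logistic regression."""
    Z = np.column_stack([np.ones(len(X)), X])
    b = np.zeros(Z.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Z @ b))
        H = Z.T * (p * (1 - p)) @ Z + 1e-6 * np.eye(Z.shape[1])  # Hessian + ridge
        b += np.linalg.solve(H, Z.T @ (d - p))
    return lambda Xn: 1.0 / (1.0 + np.exp(-(np.column_stack([np.ones(len(Xn)), Xn]) @ b)))

def fit_ols(X, y):
    """Outcome model mu(x) = E[Y|X=x] via least squares with intercept."""
    Z = np.column_stack([np.ones(len(X)), X])
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return lambda Xn: np.column_stack([np.ones(len(Xn)), Xn]) @ b

def dml_ate(X, d, y, n_folds=2, seed=0):
    """Cross-fitted AIPW estimate of the ATE with a plug-in standard error."""
    n = len(y)
    folds = np.random.default_rng(seed).integers(0, n_folds, n)
    psi = np.empty(n)
    for k in range(n_folds):
        tr, te = folds != k, folds == k          # fit nuisances off-fold
        e = np.clip(fit_logit(X[tr], d[tr])(X[te]), 0.01, 0.99)
        mu1 = fit_ols(X[tr][d[tr] == 1], y[tr][d[tr] == 1])(X[te])
        mu0 = fit_ols(X[tr][d[tr] == 0], y[tr][d[tr] == 0])(X[te])
        # doubly robust score evaluated on the held-out fold
        psi[te] = (mu1 - mu0
                   + d[te] * (y[te] - mu1) / e
                   - (1 - d[te]) * (y[te] - mu0) / (1 - e))
    return psi.mean(), psi.std(ddof=1) / np.sqrt(n)

# Simulated example with true ATE = 2 (all numbers here are illustrative)
rng = np.random.default_rng(42)
n = 4000
X = rng.normal(size=(n, 2))
d = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-0.5 * X[:, 0]))).astype(int)
y = X[:, 0] + X[:, 1] + 2.0 * d + rng.normal(size=n)
ate, se = dml_ate(X, d, y)
print(f"ATE estimate: {ate:.3f} (SE {se:.3f})")
```

The Gaussian approximation `(ate - 2) / se ~ N(0, 1)` underlying the confidence interval is exactly the kind of approximation whose finite-sample accuracy, extended to many parameters jointly, the paper bounds.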
Date: 2022-06
New Economics Papers: this item is included in nep-big and nep-ecm
Downloads: http://arxiv.org/pdf/2206.07386 Latest version (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:2206.07386