Recovering Best Statistical Guarantees via the Empirical Divergence-Based Distributionally Robust Optimization
Henry Lam
Henry Lam: Department of Industrial Engineering and Operations Research, Columbia University, New York, New York 10027
Operations Research, 2019, vol. 67, issue 4, 1090-1105
We investigate the use of distributionally robust optimization (DRO) as a tractable tool to recover the asymptotic statistical guarantees provided by the central limit theorem, for maintaining the feasibility of an expected value constraint under ambiguous probability distributions. We show that using empirically defined Burg-entropy divergence balls to construct the DRO can attain such guarantees. These balls, however, are not reasoned from the standard data-driven DRO framework because, by themselves, they can have low or even zero probability of covering the true distribution. Rather, their superior statistical performances are endowed by linking the resulting DRO with empirical likelihood and empirical processes. We show that the sizes of these balls can be optimally calibrated using χ²-process excursion. We conduct numerical experiments to support our theoretical findings.
Keywords: distributionally robust optimization; empirical likelihood; empirical process; chi-square process; central limit theorem
Persistent link: https://EconPapers.repec.org/RePEc:inm:oropre:v:67:y:2019:i:4:p:1090-1105