Generalization Bounds in the Predict-Then-Optimize Framework
Othman El Balghiti,
Adam N. Elmachtoub,
Paul Grigas and
Ambuj Tewari
Additional contact information
Othman El Balghiti: Department of Industrial Engineering and Operations Research, Columbia University, New York, New York 10027
Adam N. Elmachtoub: Department of Industrial Engineering and Operations Research, Columbia University, New York, New York 10027
Paul Grigas: Department of Industrial Engineering and Operations Research, University of California, Berkeley, Berkeley, California 94720
Ambuj Tewari: Department of Statistics, University of Michigan, Ann Arbor, Michigan 48109
Mathematics of Operations Research, 2023, vol. 48, issue 4, 2043-2065
Abstract:
The predict-then-optimize framework is fundamental in many practical settings: predict the unknown parameters of an optimization problem and then solve the problem using the predicted values of the parameters. A natural loss function in this setting measures the cost of the decisions induced by the predicted parameters rather than the prediction error of the parameters themselves. This loss function is referred to as the smart predict-then-optimize (SPO) loss. In this work, we seek to provide bounds on how well the performance of a prediction model fit on training data generalizes out of sample in the context of the SPO loss. Because the SPO loss is nonconvex and non-Lipschitz, standard results for deriving generalization bounds do not apply. We first derive bounds based on the Natarajan dimension that, in the case of a polyhedral feasible region, scale at most logarithmically in the number of extreme points but, in the case of a general convex feasible region, have linear dependence on the decision dimension. By exploiting the structure of the SPO loss function and a key property of the feasible region, which we call the strength property, we can dramatically improve the dependence on the decision and feature dimensions. Our approach and analysis rely on placing a margin around problematic predictions that do not yield unique optimal solutions and then providing generalization bounds in the context of a modified margin SPO loss function that is Lipschitz continuous. Finally, we characterize the strength property and show that the modified SPO loss can be computed efficiently for both strongly convex bodies and polytopes with an explicit extreme point representation.
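As a concrete illustration (not taken from the paper itself), the minimal Python sketch below computes the SPO loss for a linear objective minimized over a polytope given by an explicit extreme point representation, one of the two settings the abstract mentions; the function name spo_loss and the two-route toy example are illustrative assumptions, not the authors' code.

    import numpy as np

    def spo_loss(c_hat, c_true, extreme_points):
        # SPO loss = true cost of the decision induced by the predicted
        # cost vector, minus the true optimal cost. For a linear objective
        # over a polytope, an optimum is attained at an extreme point, so
        # we can optimize by enumerating the extreme points directly.
        c_hat, c_true = np.asarray(c_hat, float), np.asarray(c_true, float)
        W = np.asarray(extreme_points, float)  # one extreme point per row
        w_hat = W[np.argmin(W @ c_hat)]    # decision induced by the prediction
        w_star = W[np.argmin(W @ c_true)]  # full-information optimal decision
        # Note: ties in the argmin are broken arbitrarily here; predictions
        # without a unique optimal solution are exactly the problematic cases
        # the paper's margin SPO loss is built around.
        return float(c_true @ w_hat - c_true @ w_star)

    # Toy example: choose the cheaper of two routes. The prediction ranks
    # the routes incorrectly, so the SPO loss is the excess true cost.
    W = np.array([[1.0, 0.0],   # decision: take route 1
                  [0.0, 1.0]])  # decision: take route 2
    print(spo_loss([1.0, 2.0], [3.0, 1.0], W))  # prints 2.0 (= 3.0 - 1.0)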
Keywords: Primary: 90B99; 68Q32; generalization bounds; prescriptive analytics; regression; predict-then-optimize
Date: 2023
Downloads: http://dx.doi.org/10.1287/moor.2022.1330 (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:inm:ormoor:v:48:y:2023:i:4:p:2043-2065
More articles in Mathematics of Operations Research from INFORMS.
Bibliographic data for series maintained by Chris Asher.