A scalable global optimization algorithm for stochastic nonlinear programs
Yankai Cao and
Victor M. Zavala
Additional contact information
Yankai Cao: University of Wisconsin-Madison
Victor M. Zavala: University of Wisconsin-Madison
Journal of Global Optimization, 2019, vol. 75, issue 2, No 5, 393-416
Abstract:
We present a global optimization algorithm for two-stage stochastic nonlinear programs (NLPs). The algorithm uses a tailored reduced-space spatial branch-and-bound (BB) strategy to exploit the nearly decomposable structure of the problem. At each node in the BB scheme, a lower bound is constructed by relaxing the so-called non-anticipativity constraints, and an upper bound is constructed by fixing the first-stage variables to the current candidate solution. A key advantage of this approach is that both lower and upper bounds can be computed by solving individual scenario subproblems. Another key property is that the scheme only needs to branch on the first-stage variables to guarantee convergence (branching on the second-stage variables is performed implicitly during the computation of the lower and upper bounds). Notably, the convergence results for this scheme also hold for two-stage stochastic MINLPs with mixed-integer first-stage variables and continuous recourse variables. We present a serial implementation of the algorithm in Julia, which we call SNGO. The implementation is interfaced to the structured modeling language Plasmo.jl, which facilitates benchmarking and model processing. Our implementation incorporates typical features that help accelerate the BB search, such as LP-based lower bounding techniques, local search-based upper bounding techniques, and relaxation-based bounds tightening techniques. These strategies require solving extensive forms of the stochastic program, which can potentially be handled using structured interior-point solvers (when the problem is an NLP). Numerical experiments are performed for a controller tuning formulation, a parameter estimation formulation for microbial growth models, and a stochastic test set from GLOBALlib. We compare the computational results against SCIP and demonstrate that the proposed approach achieves significant speedups.
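The bounding principle described in the abstract can be illustrated on a toy problem. The sketch below is not the SNGO implementation; it is a minimal, assumed example with quadratic scenario objectives f_s(x, y) = (x - a_s)^2 + (y - b_s)^2, where x is the first-stage variable and y the recourse variable. Relaxing non-anticipativity lets each scenario choose its own first-stage copy (giving a lower bound), while fixing x to a candidate and optimizing only the recourse gives an upper bound; both bounds decompose by scenario.

```python
# Toy illustration (hypothetical data, closed-form minimization) of
# scenario-wise lower/upper bounding for a two-stage stochastic program.
# Scenario s objective: f_s(x, y) = (x - a_s)^2 + (y - b_s)^2.

def scenario_lower(a_s, b_s):
    # Non-anticipativity relaxed: each scenario uses its own copy x_s,
    # so the unconstrained minimum is attained at x_s = a_s, y_s = b_s.
    return 0.0

def scenario_upper(x_bar, a_s, b_s):
    # First stage fixed at candidate x_bar; only the recourse y is
    # optimized (closed form: y = b_s), leaving the mismatch in x.
    return (x_bar - a_s) ** 2

def bounds(scenarios, x_bar):
    # Both bounds are sums of independent scenario subproblem values.
    lb = sum(scenario_lower(a, b) for a, b in scenarios)
    ub = sum(scenario_upper(x_bar, a, b) for a, b in scenarios)
    return lb, ub

scenarios = [(0.0, 1.0), (2.0, -1.0), (4.0, 3.0)]  # assumed (a_s, b_s) pairs
x_bar = sum(a for a, _ in scenarios) / len(scenarios)  # candidate first stage
lb, ub = bounds(scenarios, x_bar)
assert lb <= ub  # the relaxation value sandwiches the true optimum
```

In the actual algorithm, the gap between these bounds is closed by branching only on the first-stage variables, which is what keeps the scheme scalable in the number of scenarios.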
Keywords: Stochastic NLP; Global optimization; Scalable
Date: 2019
Downloads: http://link.springer.com/10.1007/s10898-019-00769-y (abstract, text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:spr:jglopt:v:75:y:2019:i:2:d:10.1007_s10898-019-00769-y
DOI: 10.1007/s10898-019-00769-y
Journal of Global Optimization is currently edited by Sergiy Butenko