
Parallel Bayesian Global Optimization of Expensive Functions

Jialei Wang, Scott C. Clark, Eric Liu and Peter I. Frazier
Additional contact information
Jialei Wang: SensesAI, Beijing 100016, China
Scott C. Clark: SigOpt, San Francisco, California 94104
Eric Liu: Yelp, Inc., San Francisco, California 94105
Peter I. Frazier: School of Operations Research and Information Engineering, Cornell University, Ithaca, New York 14853

Operations Research, 2020, vol. 68, issue 6, 1850-1865

Abstract: We consider parallel global optimization of derivative-free expensive-to-evaluate functions, and propose an efficient method based on stochastic approximation for implementing a conceptual Bayesian optimization algorithm proposed by Ginsbourger in 2008. At the heart of this algorithm is maximizing the information criterion called the “multipoints expected improvement,” or the q-EI. To accomplish this, we use infinitesimal perturbation analysis (IPA) to construct a stochastic gradient estimator and show that this estimator is unbiased. We also show that the stochastic gradient ascent algorithm using the constructed gradient estimator converges to a stationary point of the q-EI surface, and therefore, as the number of multiple starts of the gradient ascent algorithm and the number of steps for each start grow large, the one-step Bayes-optimal set of points is recovered. We show in numerical experiments using up to 128 parallel evaluations that our method for maximizing the q-EI is faster than methods based on closed-form evaluation using high-dimensional integration when considering many parallel function evaluations, and is comparable in speed when considering few. We also show that the resulting one-step Bayes-optimal algorithm for parallel global optimization finds high-quality solutions with fewer evaluations than a heuristic based on approximately maximizing the q-EI. A high-quality open-source implementation of this algorithm is available in the Metrics Optimization Engine (MOE).
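
The key computational step described in the abstract (estimating the gradient of the q-EI from Monte Carlo sample paths and running multistart stochastic gradient ascent) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' MOE implementation: posterior_mean and posterior_cov are hypothetical stand-ins for a fitted Gaussian-process posterior, and JAX automatic differentiation through the reparameterized samples plays the role of the paper's analytically derived IPA gradient estimator.

import jax
import jax.numpy as jnp

def posterior_mean(X):
    # Hypothetical placeholder for a GP posterior mean at q candidate points X, shape (q, d).
    return -jnp.sum(X ** 2, axis=1)

def posterior_cov(X):
    # Hypothetical placeholder for a GP posterior covariance (squared-exponential kernel plus jitter).
    sqdist = jnp.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return jnp.exp(-0.5 * sqdist) + 1e-6 * jnp.eye(X.shape[0])

def qei_estimate(X, best_so_far, Z):
    # Reparameterized Monte Carlo estimate of the multipoints expected improvement
    # E[max(0, max_i Y_i - best_so_far)], with Y ~ N(mu(X), Sigma(X)).
    mu = posterior_mean(X)
    L = jnp.linalg.cholesky(posterior_cov(X))
    Y = mu[None, :] + Z @ L.T  # correlated samples, shape (n_samples, q)
    improvement = jnp.maximum(jnp.max(Y, axis=1) - best_so_far, 0.0)
    return jnp.mean(improvement)

# Pathwise (IPA-style) gradient of the Monte Carlo q-EI estimate with respect to X.
qei_grad = jax.grad(qei_estimate)

def maximize_qei(key, q, d, best_so_far, n_steps=200, n_samples=64, step_size=0.05):
    # One start of stochastic gradient ascent; fresh normal draws give an unbiased
    # gradient estimate at every step.
    key, k_init = jax.random.split(key)
    X = jax.random.uniform(k_init, (q, d), minval=-1.0, maxval=1.0)
    for _ in range(n_steps):
        key, k_draw = jax.random.split(key)
        Z = jax.random.normal(k_draw, (n_samples, q))
        X = X + step_size * qei_grad(X, best_so_far, Z)
    return X

X_next = maximize_qei(jax.random.PRNGKey(0), q=4, d=2, best_so_far=0.0)

A practical implementation would restart maximize_qei from many random initial point sets and keep the set with the highest estimated q-EI, mirroring the multistart gradient ascent described in the abstract.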

Keywords: Bayesian optimization; parallel optimization; parallel expected improvement; infinitesimal perturbation analysis
Date: 2020

Downloads: https://doi.org/10.1287/opre.2019.1966 (application/pdf)


Persistent link: https://EconPapers.repec.org/RePEc:inm:oropre:v:68:y:2020:i:6:p:1850-1865

More articles in Operations Research from INFORMS.
Bibliographic data for series maintained by Chris Asher.

 
Handle: RePEc:inm:oropre:v:68:y:2020:i:6:p:1850-1865