# A derivative-free algorithm for linearly constrained optimization problems

E. Gumma, M. Hashim and Majid Ali

Computational Optimization and Applications, 2014, vol. 57, issue 3, 599-621

Abstract: Based on the NEWUOA algorithm, a new derivative-free algorithm is developed, named LCOBYQA. The main aim of the algorithm is to find a minimizer $x^{*} \in\mathbb{R}^{n}$ of a non-linear function whose derivatives are unavailable, subject to linear inequality constraints. The algorithm is based on a model of the given function constructed from a set of interpolation points. LCOBYQA is iterative: at each iteration it constructs a quadratic approximation (model) of the objective function that satisfies the interpolation conditions while leaving some freedom in the model. The remaining freedom is resolved by minimizing the Frobenius norm of the change to the second derivative matrix of the model. The model is then minimized within a trust region, using the conjugate gradient method, to obtain a new iterate. At times the new iterate is instead found from a model iteration, designed to improve the geometry of the interpolation points. Numerical results are presented which show that LCOBYQA works well and is very competitive with available model-based derivative-free algorithms. Copyright Springer Science+Business Media New York 2014
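The trust-region subproblem mentioned in the abstract — minimizing a quadratic model $q(s) = g^{\top}s + \tfrac{1}{2}s^{\top}Hs$ subject to $\|s\| \le \Delta$ by conjugate gradients — is commonly handled with a Steihaug-Toint truncated CG iteration. The sketch below illustrates that standard technique; it is not the paper's implementation (LCOBYQA additionally handles the linear inequality constraints and the interpolation-based model), and the function names are my own.

```python
import numpy as np

def boundary_step(s, d, delta):
    """Return tau * d with tau >= 0 chosen so that ||s + tau d|| = delta."""
    a = d @ d
    b = 2.0 * (s @ d)
    c = s @ s - delta ** 2
    tau = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return tau * d

def trust_region_cg(g, H, delta, tol=1e-8, max_iter=100):
    """Steihaug-Toint truncated CG: approximately minimize
    q(s) = g.s + 0.5 s.H.s subject to ||s|| <= delta."""
    s = np.zeros_like(g)
    r = g.copy()            # gradient of q at s = 0
    if np.linalg.norm(r) < tol:
        return s
    d = -r
    for _ in range(max_iter):
        Hd = H @ d
        dHd = d @ Hd
        if dHd <= 0.0:
            # Negative curvature: follow d to the trust-region boundary.
            return s + boundary_step(s, d, delta)
        alpha = (r @ r) / dHd
        s_next = s + alpha * d
        if np.linalg.norm(s_next) >= delta:
            # Step leaves the region: truncate at the boundary.
            return s + boundary_step(s, d, delta)
        r_next = r + alpha * Hd
        if np.linalg.norm(r_next) < tol:
            return s_next
        beta = (r_next @ r_next) / (r @ r)
        d = -r_next + beta * d
        s, r = s_next, r_next
    return s
```

For a convex model with an inactive trust region (e.g. $H = I$, large $\Delta$) the iteration reduces to plain CG and returns the unconstrained minimizer $-H^{-1}g$; when $\Delta$ is small, the returned step lies on the boundary $\|s\| = \Delta$.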

Keywords: Derivative-free optimization; Linearly constrained problem; Least Frobenius norm method; Trust-region subproblem; Conjugate gradient method
Date: 2014
Citations: 2 (in EconPapers)

http://hdl.handle.net/10.1007/s10589-013-9607-y (text/html)



Ordering information: This journal article can be ordered from
http://www.springer.com/math/journal/10589