Secant penalized BFGS: a noise robust quasi-Newton method via penalizing the secant condition

Brian Irwin (The University of British Columbia) and Eldad Haber (The University of British Columbia)

Computational Optimization and Applications, 2023, vol. 84, issue 3, No 1, 702 pages

Abstract: In this paper, we introduce a new variant of the BFGS method designed to perform well when gradient measurements are corrupted by noise. We show that treating the secant condition with a penalty method approach motivated by regularized least squares estimation generates a parametric family with the original BFGS update at one extreme and no update of the inverse Hessian approximation at the other. Furthermore, we find that the curvature condition is relaxed as the family moves towards not updating the inverse Hessian approximation, and disappears entirely at that extreme. These developments lead to a method we refer to as Secant Penalized BFGS (SP-BFGS), which relaxes the secant condition based on the amount of noise in the gradient measurements. SP-BFGS incrementally updates the inverse Hessian approximation with a controlled amount of bias towards the previous approximation, replacing the overwriting nature of the original BFGS update with an averaging nature that resists the destructive effects of noise and can cope with negative curvature measurements. We discuss the theoretical properties of SP-BFGS, including convergence when minimizing strongly convex functions in the presence of uniformly bounded noise. Finally, we present extensive numerical experiments using over 30 problems from the CUTEst test problem set that demonstrate the superior performance of SP-BFGS compared to BFGS in the presence of both noisy function and gradient evaluations.
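The abstract describes a parametric family whose extremes are the full BFGS inverse update and no update at all. The sketch below is an illustrative stand-in, not the paper's exact SP-BFGS formula: it blends the standard inverse BFGS update with the previous approximation via a weight `lam` in [0, 1], so `lam = 1` recovers ordinary BFGS and `lam = 0` leaves the approximation untouched. The function name, the weight `lam`, and the curvature skip threshold are assumptions made for this example.

```python
import numpy as np

def sp_bfgs_like_update(H, s, y, lam):
    """Interpolated inverse-Hessian update (illustrative sketch only).

    H   : current inverse Hessian approximation (n x n, symmetric)
    s   : step, x_{k+1} - x_k
    y   : gradient difference, g_{k+1} - g_k
    lam : blend weight in [0, 1]; 1 -> standard BFGS update,
          0 -> no update. This linear blend merely illustrates the
          two extremes of the family in the abstract; it is not the
          penalty-parameterized SP-BFGS formula from the paper.
    """
    sy = float(s @ y)
    if sy <= 1e-12:
        # Curvature condition fails; skip the update entirely.
        return H.copy()
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # Standard BFGS inverse update: H+ = V H V^T + rho s s^T,
    # which satisfies the secant equation H+ y = s.
    H_bfgs = V @ H @ V.T + rho * np.outer(s, s)
    return (1.0 - lam) * H + lam * H_bfgs
```

With `lam = 1` the result satisfies the secant equation exactly (`H_new @ y == s`); with `lam = 0` the previous approximation is returned unchanged, mirroring the "averaging versus overwriting" trade-off the abstract attributes to noisy gradients.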

Keywords: Quasi-Newton methods; Secant condition; Penalty methods; Least squares estimation; Measurement error; Noise robust optimization
Date: 2023

DOI: 10.1007/s10589-022-00448-x

Computational Optimization and Applications is currently edited by William W. Hager

Bibliographic data for this series is maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

Handle: RePEc:spr:coopap:v:84:y:2023:i:3:d:10.1007_s10589-022-00448-x