
Computational aspects of column generation for nonlinear and conic optimization: classical and linearized schemes

Renaud Chicoisne (Université Clermont-Auvergne)

Computational Optimization and Applications, 2023, vol. 84, issue 3, No 5, 789-831

Abstract: Solving large-scale nonlinear optimization problems requires either significant computing resources or the development of specialized algorithms. For Linear Programming (LP) problems, decomposition methods can take advantage of problem structure, gradually constructing the full problem by generating variables or constraints. We first present a direct adaptation of the Column Generation (CG) methodology to nonlinear optimization problems, such that when optimizing over a structured set $\mathcal{X}$ plus a moderate number of complicating constraints, we solve a succession of (1) restricted master problems on a smaller set $\mathcal{S} \subset \mathcal{X}$ and (2) pricing problems that are Lagrangean relaxations with respect to the complicating constraints. The former provides feasible solutions and feeds dual information to the latter. In turn, the pricing problem identifies a variable of interest that is then incorporated into an updated subset $\mathcal{S}' \subset \mathcal{X}$. Our approach is valid whenever the master problem has zero Lagrangean duality gap with respect to the complicating constraints, and not only when $\mathcal{S}$ is the convex hull of the generated variables, as in CG for LPs, but also for a variety of subsets such as the conic hull, the linear span, and a special variable-aggregation set. We discuss how the structure of $\mathcal{S}$ and its update mechanism influence the number of iterations required to reach near-optimality and the difficulty of solving the restricted master problems, and we present linearized schemes that alleviate the computational burden of solving the pricing problem. We test our methods on synthetic portfolio optimization instances with up to 5 million variables, including nonlinear objective functions and second-order cone constraints.
We show that some CGs with linearized pricing are 2–3 times faster than solving the complete problem directly and provide solutions within 1% of optimality in 6 hours for the larger instances, whereas solving the complete problem runs out of memory.
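The master/pricing alternation described in the abstract can be sketched on a toy instance. The following is a minimal, hypothetical illustration (not the paper's implementation): we minimize ||x||^2 subject to a single complicating constraint sum(x) = 1, take X = R^n, and optimize the restricted master over the *linear span* of the generated columns, one of the subset choices the paper discusses. The master returns a primal solution and a dual multiplier; the pricing step solves the Lagrangean relaxation and supplies both a new column and a lower bound.

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def master(columns):
    """Restricted master over the linear span S = span(V):
    min_w ||V w||^2  s.t.  1^T V w = 1.
    KKT system: 2 V^T V w - mu V^T 1 = 0,  (V^T 1)^T w = 1."""
    k, dim = len(columns), len(columns[0])
    G = [[2.0 * sum(u[t] * v[t] for t in range(dim)) for v in columns] for u in columns]
    c = [sum(v) for v in columns]
    A = [G[i] + [-c[i]] for i in range(k)] + [c + [0.0]]
    sol = solve_linear(A, [0.0] * k + [1.0])
    w, mu = sol[:k], sol[k]
    x = [sum(w[j] * columns[j][t] for j in range(k)) for t in range(dim)]
    return x, mu

def pricing(mu, n):
    """Lagrangean relaxation wrt the complicating constraint:
    min_x ||x||^2 - mu (1^T x - 1), whose minimizer is x = (mu/2) 1."""
    return [mu / 2.0] * n

n = 4
columns = [[1.0] + [0.0] * (n - 1)]  # initial column e_1
for _ in range(10):
    x, mu = master(columns)
    ub = sum(t * t for t in x)                            # master (primal) value
    col = pricing(mu, n)
    lb = sum(t * t for t in col) - mu * (sum(col) - 1.0)  # Lagrangean bound
    if ub - lb <= 1e-9:                                   # zero duality gap reached
        break
    columns.append(col)

print(round(ub, 6))  # converges to the optimum 1/n = 0.25
```

Here pricing happens to produce the optimal direction after one master solve, so the loop stops after two iterations; on realistic instances the pricing problem is itself a large nonlinear program, which motivates the linearized pricing schemes the paper proposes.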

Keywords: Nonlinear optimization; Conic programming; Column generation; Lagrangean duality; Portfolio optimization
Date: 2023

Access to the full text of the articles in this series is restricted.



DOI: 10.1007/s10589-022-00445-0


Computational Optimization and Applications is currently edited by William W. Hager


Handle: RePEc:spr:coopap:v:84:y:2023:i:3:d:10.1007_s10589-022-00445-0