A dual adaptive algorithm for matrix optimization with sparse group lasso regularization
Jinji Yang,
Jiang Hu and
Chungen Shen
Additional contact information
Jinji Yang: University of Shanghai for Science and Technology
Jiang Hu: University of California
Chungen Shen: University of Shanghai for Science and Technology
Journal of Global Optimization, 2025, vol. 92, issue 3, No 9, 737-774
Abstract:
Matrix optimization has a wide range of applications in finance, statistics, and engineering. In this paper, we derive the Lagrangian dual of the matrix optimization problem with sparse group lasso regularization and develop an adaptive gradient/semismooth Newton algorithm for this dual. The algorithm adaptively switches between semismooth Newton and gradient descent iterations based on the decrease of the residual or of the dual objective value. Specifically, the algorithm starts with the gradient iteration and switches to the semismooth Newton iteration when the residual falls below a given threshold. If the trial step size for the semismooth Newton iteration has been shrunk several times, or the residual does not decrease sufficiently, the algorithm switches back to the gradient iteration and reduces the threshold for invoking the semismooth Newton iteration. Under mild conditions, global convergence of the proposed algorithm is proved. Moreover, local superlinear convergence is achieved under either of two scenarios: when the constraint nondegeneracy condition holds, or when the strict complementarity and local error bound conditions are both satisfied. Numerical results on synthetic and real data sets demonstrate the efficiency and robustness of the proposed algorithm.
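The switching rule described in the abstract can be sketched schematically. The code below is only an illustration of the control flow (gradient phase, residual threshold, backtracked Newton phase, fallback with threshold reduction): the toy quadratic objective, the fixed step size, the threshold values, and the helper names are all assumptions for this sketch, not the paper's actual semismooth Newton method, which operates on the nonsmooth dual of the sparse-group-lasso problem.

```python
import numpy as np

# Toy smooth stand-in for the dual objective: f(x) = 0.5 x'Ax - b'x.
# (The paper's dual is nonsmooth; this only illustrates the switching logic.)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):
    return A @ x - b

def newton_dir(g):
    # For the toy quadratic, the (semismooth) Newton system is A d = -g.
    return np.linalg.solve(A, -g)

def adaptive_solve(x0, tol=1e-10, theta=1e-1, max_ls=5, max_iter=1000):
    x, mode = x0.copy(), "gradient"
    for _ in range(max_iter):
        g = grad(x)
        res = np.linalg.norm(g)
        if res <= tol:
            break
        if mode == "gradient":
            x = x - 0.2 * g                  # gradient descent iteration
            if res <= theta:
                mode = "newton"              # residual hit the threshold
        else:
            d = newton_dir(g)
            t, shrinks = 1.0, 0
            # Backtrack until the residual decreases sufficiently.
            while (np.linalg.norm(grad(x + t * d)) > 0.9 * res
                   and shrinks <= max_ls):
                t *= 0.5
                shrinks += 1
            if shrinks > max_ls:
                mode = "gradient"            # fall back to the gradient phase
                theta *= 0.5                 # and lower the Newton threshold
            else:
                x = x + t * d
    return x
```

On the toy quadratic the Newton phase is exact, so once the gradient phase drives the residual below the threshold, a single accepted Newton step lands on the minimizer; in the paper's setting the analogous local phase is where superlinear convergence occurs.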
Keywords: Sparse group lasso regularization; Semismooth Newton iteration; Gradient descent iteration; Constraint nondegeneracy condition; Local error bound condition (search for similar items in EconPapers)
Date: 2025
Downloads: (external link)
http://link.springer.com/10.1007/s10898-025-01492-7 Abstract (text/html)
Access to the full text of the articles in this series is restricted.
Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.
Persistent link: https://EconPapers.repec.org/RePEc:spr:jglopt:v:92:y:2025:i:3:d:10.1007_s10898-025-01492-7
Ordering information: This journal article can be ordered from
http://www.springer. ... search/journal/10898
DOI: 10.1007/s10898-025-01492-7
Journal of Global Optimization is currently edited by Sergiy Butenko
More articles in Journal of Global Optimization from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.