A refined convergence analysis of $$\hbox{pDCA}_e$$ with applications to simultaneous sparse recovery and outlier detection

Tianxiang Liu, Ting Kei Pong and Akiko Takeda
Additional contact information
Tianxiang Liu: RIKEN Center for Advanced Intelligence Project
Ting Kei Pong: The Hong Kong Polytechnic University
Akiko Takeda: RIKEN Center for Advanced Intelligence Project

Computational Optimization and Applications, 2019, vol. 73, issue 1, No 3, 69-100

Abstract: We consider the problem of minimizing a difference-of-convex (DC) function, which can be written as the sum of a smooth convex function with Lipschitz gradient, a proper closed convex function and a continuous, possibly nonsmooth, concave function. We refine the convergence analysis in Wen et al. (Comput Optim Appl 69:297–324, 2018) for the proximal DC algorithm with extrapolation ($$\hbox{pDCA}_e$$) and show that the whole sequence generated by the algorithm is convergent without imposing differentiability assumptions on the concave part. Our analysis is based on a new potential function, which we assume to be a Kurdyka–Łojasiewicz (KL) function. We also establish a relationship between our KL assumption and the one used in Wen et al. (2018). Finally, we demonstrate how the $$\hbox{pDCA}_e$$ can be applied to a class of simultaneous sparse recovery and outlier detection problems arising from robust compressed sensing in signal processing and least trimmed squares regression in statistics. Specifically, we show that the objectives of these problems can be written as level-bounded DC functions whose concave parts are typically nonsmooth. Moreover, for a large class of loss functions and regularizers, the KL exponent of the corresponding potential function is shown to be 1/2, which implies that the $$\hbox{pDCA}_e$$ is locally linearly convergent when applied to these problems. Our numerical experiments show that the $$\hbox{pDCA}_e$$ usually outperforms the proximal DC algorithm with nonmonotone linesearch (Liu et al., Math Program, 2018, https://doi.org/10.1007/s10107-018-1327-8, Appendix A) in both CPU time and solution quality for this particular application.
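For orientation: the $$\hbox{pDCA}_e$$ studied here, introduced in Wen et al. (2018), targets problems of the form $$\min_x F(x) = f(x) + g(x) - h(x)$$ with $$f$$ smooth convex with $$L$$-Lipschitz gradient, $$g$$ proper closed convex and $$h$$ continuous convex, and its iteration takes the form

$$ \xi^k \in \partial h(x^k), \qquad y^k = x^k + \beta_k\,(x^k - x^{k-1}), \qquad x^{k+1} = \operatorname{prox}_{g/L}\!\left( y^k - \tfrac{1}{L}\big( \nabla f(y^k) - \xi^k \big) \right), $$

with extrapolation parameters $$\beta_k \in [0,1)$$. The sketch below instantiates this update in Python for $$\ell_{1-2}$$-regularized least squares, a standard DC model whose concave part $$-\lambda\|x\|_2$$ is nonsmooth at the origin; it is meant only as an illustration of the update, not as the paper's implementation or experimental setup, and the function name pdca_e, the fixed extrapolation parameter beta and the stopping rule are our own choices.

```python
import numpy as np

def pdca_e(A, b, lam, beta=0.5, max_iter=1000, tol=1e-8):
    """Illustrative pDCA_e for min 0.5*||Ax - b||^2 + lam*(||x||_1 - ||x||_2).

    DC structure: f(x) = 0.5*||Ax - b||^2 (smooth convex, Lipschitz gradient),
    g(x) = lam*||x||_1 (proper closed convex), h(x) = lam*||x||_2
    (continuous convex, nonsmooth at the origin).
    """
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
    x_prev = np.zeros(n)
    x = np.zeros(n)
    for _ in range(max_iter):
        # Subgradient of h at x^k; at the origin we may take xi = 0.
        nx = np.linalg.norm(x)
        xi = (lam / nx) * x if nx > 0 else np.zeros(n)
        # Extrapolation with beta_k in [0, 1) (held fixed here for simplicity).
        y = x + beta * (x - x_prev)
        # Gradient step on f at y, shifted by the subgradient of h.
        u = y - (A.T @ (A @ y - b) - xi) / L
        # Proximal step: prox of (lam/L)*||.||_1 is soft-thresholding.
        x_new = np.sign(u) * np.maximum(np.abs(u) - lam / L, 0.0)
        x_prev, x = x, x_new
        if np.linalg.norm(x - x_prev) <= tol * max(1.0, np.linalg.norm(x_prev)):
            break
    return x

# Hypothetical usage on a synthetic sparse-recovery instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 200))
x_true = np.zeros(200)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true
x_hat = pdca_e(A, b, lam=0.05)
```

Taking the subgradient of the concave part to be 0 at the origin is what allows the iteration to run with no differentiability assumption on that part, which is precisely the setting covered by the refined analysis.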

Keywords: Difference-of-convex optimization; Kurdyka–Łojasiewicz property; Sparse recovery; Outlier detection
Date: 2019
References: available in EconPapers; complete reference list from CitEc
Citations: 8 (tracked in EconPapers)

Downloads: http://link.springer.com/10.1007/s10589-019-00067-z (abstract, text/html)
Access to the full text of the articles in this series is restricted.

Persistent link: https://EconPapers.repec.org/RePEc:spr:coopap:v:73:y:2019:i:1:d:10.1007_s10589-019-00067-z

Ordering information: This journal article can be ordered from
http://www.springer.com/math/journal/10589

DOI: 10.1007/s10589-019-00067-z

Computational Optimization and Applications is currently edited by William W. Hager

More articles in Computational Optimization and Applications from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-03-20
Handle: RePEc:spr:coopap:v:73:y:2019:i:1:d:10.1007_s10589-019-00067-z