Proximal methods for reweighted lQ-regularization of sparse signal recovery
Bo Peng and Hong-Kun Xu
Applied Mathematics and Computation, 2020, vol. 386, issue C
Abstract:
To recover a sparse signal from a noisy linear measurement system Ax = b + e, convex lp regularization methods (1 ≤ p < 2, in particular p = 1) are commonly used under certain conditions. Recently, however, more attention has been paid to nonconvex lq regularization methods (0 < q < 1, in particular q = 1/2) for recovering a sparse signal. In this paper, we use proximal methods to study both convex and nonconvex reweighted lQ regularization for recovering a sparse signal. Convex lQ regularization was introduced by S. Voronin and I. Daubechies [19]. We extend it to the nonconvex case, and our results therefore supplement those of Voronin and Daubechies [19]. We also study Nesterov’s acceleration method for the nonconvex case. Our numerical experiments show that nonconvex lQ regularization recovers sparse signals more effectively.
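To make the two ingredients named in the abstract concrete (a proximal method and Nesterov's acceleration), the following is a minimal sketch of proximal gradient descent (ISTA) with an optional FISTA-style momentum step for the convex l1-regularized least-squares baseline. It is not the authors' reweighted lQ algorithm; the function names (soft_threshold, prox_gradient) and all parameter choices are illustrative assumptions. For the nonconvex l_{1/2} case, the soft-thresholding step would be replaced by the closed-form half-thresholding proximal operator from the l_{1/2} regularization literature.

```python
# Illustrative sketch only: ISTA / FISTA for
#   min_x  0.5 * ||A x - b||_2^2 + lam * ||x||_1
# (convex l1 case; not the paper's reweighted lQ method).
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (component-wise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_gradient(A, b, lam, n_iter=500, accelerate=True):
    """Proximal gradient (ISTA), with Nesterov/FISTA momentum if accelerate=True."""
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the smooth gradient
    step = 1.0 / L
    x = np.zeros(n)
    y = x.copy()                             # extrapolation point (FISTA)
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)             # gradient of 0.5*||Ay - b||^2
        x_new = soft_threshold(y - step * grad, lam * step)
        if accelerate:                       # Nesterov momentum (FISTA update)
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)
            t = t_new
        else:
            y = x_new
        x = x_new
    return x

# Small synthetic test: recover a sparse x from noisy measurements b = A x + e.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n, k = 80, 200, 5
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    b = A @ x_true + 0.01 * rng.standard_normal(m)
    x_hat = prox_gradient(A, b, lam=0.01)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```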
Keywords: Proximal method; Sparsity; Noise; Signal recovery; lQ-Regularization; Reweight; Nonconvex optimization
Date: 2020
Downloads: http://www.sciencedirect.com/science/article/pii/S0096300320303702 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:apmaco:v:386:y:2020:i:c:s0096300320303702
DOI: 10.1016/j.amc.2020.125408