Matrix Factorization Techniques in Machine Learning, Signal Processing, and Statistics
Ke-Lin Du,
M. N. S. Swamy,
Zhang-Quan Wang and
Wai Ho Mow
Additional contact information
Ke-Lin Du: Department of Electrical and Computer Engineering, Concordia University, Montreal, QC H3G 1M8, Canada
M. N. S. Swamy: Department of Electrical and Computer Engineering, Concordia University, Montreal, QC H3G 1M8, Canada
Zhang-Quan Wang: College of Information Science and Technology, Zhejiang Shuren University, Hangzhou 310015, China
Wai Ho Mow: Department of Electronic and Computer Engineering, Hong Kong University of Science and Technology, Hong Kong SAR, China
Mathematics, 2023, vol. 11, issue 12, 1-50
Abstract:
Compressed sensing is an alternative to Shannon/Nyquist sampling for acquiring sparse or compressible signals. Sparse coding represents a signal as a sparse linear combination of atoms, the elementary signals that make up a predefined dictionary. Compressed sensing, sparse approximation, and dictionary learning are closely related to sparse coding. Matrix completion is the process of recovering a data matrix from a subset of its entries; it extends the principles of compressed sensing and sparse approximation. Nonnegative matrix factorization is a low-rank matrix factorization technique for nonnegative data. All of these low-rank matrix factorization techniques are unsupervised learning methods and can be used for data analysis tasks such as dimension reduction, feature extraction, blind source separation, data compression, and knowledge discovery. In this paper, we survey several emerging matrix factorization techniques that are receiving wide attention in machine learning, signal processing, and statistics. The topics treated are compressed sensing, dictionary learning, sparse representation, matrix completion and matrix recovery, nonnegative matrix factorization, the Nyström method, and CUR matrix decomposition within the machine learning framework. Related topics, such as matrix factorization using metaheuristics or neurodynamics, are also introduced, and a few directions for future investigation are suggested.
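To make one of the surveyed techniques concrete, the sketch below implements nonnegative matrix factorization with the classic Lee–Seung multiplicative updates for the Frobenius-norm objective. This is a minimal illustration of the general technique named in the abstract, not the specific algorithms covered in the paper; the rank, iteration count, and toy matrix are illustrative choices.

```python
import numpy as np

def nmf(V, rank, n_iter=500, eps=1e-10, seed=0):
    """Factor a nonnegative matrix V (m x n) as V ~ W @ H, with
    W (m x rank) and H (rank x n) nonnegative, using Lee-Seung
    multiplicative updates for the Frobenius-norm objective."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Positive random initialization; the updates preserve nonnegativity.
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # Elementwise multiplicative updates (eps avoids division by zero).
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy example: a nonnegative matrix of rank 2.
V = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 1.0, 1.0]])
W, H = nmf(V, rank=2)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because each update multiplies the current factors by nonnegative ratios, no projection step is needed to keep W and H nonnegative, which is what distinguishes NMF from an unconstrained low-rank factorization such as the truncated SVD.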
Keywords: compressed sensing; dictionary learning; sparse approximation; matrix completion; nonnegative matrix factorization (search for similar items in EconPapers)
JEL-codes: C (search for similar items in EconPapers)
Date: 2023
Citations: View citations in EconPapers (2)
Downloads:
https://www.mdpi.com/2227-7390/11/12/2674/pdf (application/pdf)
https://www.mdpi.com/2227-7390/11/12/2674/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:11:y:2023:i:12:p:2674-:d:1169553
Mathematics is currently edited by Ms. Emma He
Bibliographic data for series maintained by MDPI Indexing Manager.