Principal Component Analysis
Ke-Lin Du and
M. N. S. Swamy
Additional contact information
Ke-Lin Du: Concordia University, Department of Electrical and Computer Engineering
M. N. S. Swamy: Concordia University, Department of Electrical and Computer Engineering
Chapter 13 in Neural Networks and Statistical Learning, 2019, pp 373-425 from Springer
Abstract:
Subspace learning techniques project high-dimensional data onto low-dimensional spaces and are typically unsupervised. Well-known subspace learning algorithms include PCA, ICA, locality-preserving projection, and NMF. Discriminant analysis is a supervised subspace learning method that uses class label information. PCA is a classical statistical method for signal processing and data analysis; in the neural network setting it acts as a feature extractor, and it is closely related to eigenvalue decomposition and singular value decomposition. This chapter introduces PCA and associated methods such as minor component analysis, generalized eigenvalue decomposition, singular value decomposition, factor analysis, and canonical correlation analysis.
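As the abstract notes, PCA is closely tied to singular value decomposition. A minimal sketch of that connection, using synthetic data (the data matrix and component count here are illustrative assumptions, not from the chapter):

```python
import numpy as np

# Illustrative data: 200 samples, 5 features (synthetic, for demonstration only)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))

# Center each feature, since PCA operates on mean-removed data
Xc = X - X.mean(axis=0)

# SVD of the centered data: rows of Vt are the principal directions
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2                                  # keep the top-2 principal components
scores = Xc @ Vt[:k].T                 # project data onto the leading subspace
explained = s[:k] ** 2 / np.sum(s ** 2)  # fraction of variance explained
```

The singular values squared are proportional to the eigenvalues of the sample covariance matrix, which is why the eigenvalue-decomposition and SVD views of PCA coincide.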
Date: 2019
Persistent link: https://EconPapers.repec.org/RePEc:spr:sprchp:978-1-4471-7452-3_13
Ordering information: This item can be ordered from
http://www.springer.com/9781447174523
DOI: 10.1007/978-1-4471-7452-3_13