Convergence rate of Krasulina estimator

Jiangning Chen

Statistics & Probability Letters, 2019, vol. 155, issue C

Abstract: Principal component analysis (PCA) is one of the most commonly used statistical procedures, with a wide range of applications. Let $X_1, X_2, \ldots, X_n$ be vectors drawn i.i.d. from a distribution with mean zero and unknown covariance $\Sigma$, and let $A_n = X_n X_n^T$, so that $E[A_n] = \Sigma$. This paper considers the problem of finding the smallest eigenvalue of $\Sigma$ and its associated eigenvector. A classical estimator of this type is due to Krasulina (1969). We state Krasulina's convergence proof for the smallest eigenvalue and the corresponding eigenvector, and then derive their rate of convergence.
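
The abstract does not reproduce the recursion itself. For orientation, the sketch below is a minimal Python illustration of a Krasulina-type stochastic update, written here (as an assumption, since the paper's exact formulation is not quoted on this page) to descend toward the smallest eigenpair of the covariance, with a hypothetical step-size schedule $\gamma_n = c/n$.

```python
import numpy as np

def krasulina_smallest(X, c=1.0, seed=0):
    """Krasulina-type stochastic iteration for an extreme eigenpair of the
    covariance of the rows of X. The classical scheme ascends toward the
    largest eigenvector; the minus sign below descends toward the smallest
    (an assumption about how the paper's setting might be implemented)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.normal(size=d)
    w /= np.linalg.norm(w)                   # random unit-norm start
    for t, x in enumerate(X, start=1):
        gamma = c / t                        # hypothetical step size gamma_n = c / n
        Aw = x * (x @ w)                     # A_n w, with A_n = X_n X_n^T
        rayleigh = (w @ Aw) / (w @ w)        # w^T A_n w / ||w||^2
        w = w - gamma * (Aw - rayleigh * w)  # Krasulina update, sign flipped for the smallest eigenpair
    w /= np.linalg.norm(w)
    # Eigenvalue estimate: Rayleigh quotient against the sample covariance.
    S = X.T @ X / n
    return w, float(w @ S @ w)

# Toy check: smallest eigenpair of a diagonal covariance.
rng = np.random.default_rng(1)
Sigma = np.diag([5.0, 2.0, 0.5])
X = rng.multivariate_normal(np.zeros(3), Sigma, size=20000)
w_hat, lam_hat = krasulina_smallest(X)
print(lam_hat)        # should be near 0.5
print(np.abs(w_hat))  # should be near the basis vector e_3
```

The $1/n$ step size and the final Rayleigh-quotient readout are illustrative choices only; the paper's rate analysis imposes its own conditions on the estimator.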

Keywords: PCA; Incremental; Online updating; Covariance matrix; Rate of convergence; Adaptive estimation
Date: 2019
References: View complete reference list from CitEc

Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0167715219302081
Full text for ScienceDirect subscribers only



Persistent link: https://EconPapers.repec.org/RePEc:eee:stapro:v:155:y:2019:i:c:6

Ordering information: This journal article can be ordered from
http://www.elsevier.com/wps/find/supportfaq.cws_home/regional

DOI: 10.1016/j.spl.2019.108562


Statistics & Probability Letters is currently edited by Somnath Datta and Hira L. Koul

More articles in Statistics & Probability Letters from Elsevier
Bibliographic data for series maintained by Catherine Liu.

 
Handle: RePEc:eee:stapro:v:155:y:2019:i:c:6