Common Principal Components for Dependent Random Vectors
Beat E. Neuenschwander and Bernard D. Flury
Journal of Multivariate Analysis, 2000, vol. 75, issue 2, 163-183
Abstract:
Let the kp-variate random vector X be partitioned into k subvectors X_i of dimension p each, and let the covariance matrix Ψ of X be partitioned analogously into submatrices Ψ_ij. The common principal component (CPC) model for dependent random vectors assumes the existence of an orthogonal p × p matrix β such that β^T Ψ_ij β is diagonal for all (i, j). After a formal definition of the model, normal theory maximum likelihood estimators are obtained. The asymptotic theory for the estimated orthogonal matrix is derived by a new technique of choosing proper subsets of functionally independent parameters.
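The covariance structure described in the abstract can be illustrated with a short NumPy sketch (this is only an illustration of the model's patterned covariance matrix, not the authors' maximum likelihood estimators or asymptotic theory; all names below, such as k, p, beta, C, and psi, are hypothetical choices for the example). It builds a kp × kp covariance matrix whose p × p submatrices Ψ_ij are all diagonalized by one common orthogonal β, then verifies that property.

import numpy as np

rng = np.random.default_rng(0)
k, p = 3, 4  # k dependent subvectors, each of dimension p

# A random orthogonal p x p matrix beta (QR factor of a Gaussian matrix).
beta, _ = np.linalg.qr(rng.standard_normal((p, p)))

# For each of the p common components, draw a k x k positive definite matrix
# C[l] holding the covariances of that component across the k subvectors.
C = np.empty((p, k, k))
for l in range(p):
    A = rng.standard_normal((k, k))
    C[l] = A @ A.T + k * np.eye(k)

# Assemble Psi block by block: Psi_ij = beta @ diag(C[:, i, j]) @ beta.T,
# so every submatrix shares the same eigenvectors beta.
psi = np.zeros((k * p, k * p))
for i in range(k):
    for j in range(k):
        lam_ij = np.diag(C[:, i, j])  # diagonal block, as in the CPC model
        psi[i*p:(i+1)*p, j*p:(j+1)*p] = beta @ lam_ij @ beta.T

# Check: beta^T Psi_ij beta is diagonal for every pair (i, j).
for i in range(k):
    for j in range(k):
        D = beta.T @ psi[i*p:(i+1)*p, j*p:(j+1)*p] @ beta
        assert np.allclose(D, np.diag(np.diag(D)))

# By construction Psi is a valid (positive definite) covariance matrix.
assert np.all(np.linalg.eigvalsh(psi) > 0)
print("beta^T Psi_ij beta is diagonal for all", k * k, "pairs (i, j).")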
Keywords: asymptotic distribution; eigenvalue; eigenvector; entropy; maximum likelihood estimation; multivariate normal distribution; patterned covariance matrices
Date: 2000
Full text (ScienceDirect subscribers only): http://www.sciencedirect.com/science/article/pii/S0047-259X(00)91908-0
Persistent link: https://EconPapers.repec.org/RePEc:eee:jmvana:v:75:y:2000:i:2:p:163-183