Initializing the EM algorithm in Gaussian mixture models with an unknown number of components
Volodymyr Melnykov and Igor Melnykov
Computational Statistics & Data Analysis, 2012, vol. 56, issue 6, 1381-1395
Abstract:
An approach is proposed for initializing the expectation–maximization (EM) algorithm in multivariate Gaussian mixture models with an unknown number of components. Because the EM algorithm is often sensitive to the choice of the initial parameter vector, efficient initialization is an important preliminary step for the convergence of the algorithm to the best local maximum of the likelihood function. We propose a strategy that initializes mean vectors by choosing points with higher concentrations of neighbors and uses a truncated normal distribution for the preliminary estimation of dispersion matrices. The suggested approach is illustrated with examples and compared with several other initialization methods.
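As a rough illustration of the idea described in the abstract (not the authors' exact procedure), the sketch below seeds a Gaussian mixture by picking the points with the most neighbors inside a fixed radius as initial means and estimating each component's covariance from that local neighborhood, then passes the result to a standard EM fit. The radius `r`, the local-covariance estimate (a stand-in for the paper's truncated-normal estimation), and the use of scikit-learn's GaussianMixture for the EM iterations are assumptions made for this sketch.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def neighbour_based_init(X, n_components, r):
    """Return initial means, covariances, and weights for a GMM (illustrative only)."""
    n, d = X.shape
    # Pairwise squared distances and neighbour counts within radius r.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    counts = (d2 <= r ** 2).sum(axis=1)

    means, covs, chosen = [], [], np.zeros(n, dtype=bool)
    for _ in range(n_components):
        # Most crowded point not yet absorbed by a previous component.
        idx = np.argmax(np.where(chosen, -1, counts))
        neigh = d2[idx] <= r ** 2
        means.append(X[idx])
        # Local covariance, lightly regularised (stand-in for the truncated-normal estimate).
        covs.append(np.cov(X[neigh].T) + 1e-6 * np.eye(d))
        chosen |= neigh  # keep later seeds away from this neighbourhood
    weights = np.full(n_components, 1.0 / n_components)
    return np.array(means), np.array(covs), weights

# Example: initialise EM on synthetic data with two well-separated clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(5, 1, (200, 2))])
means, covs, weights = neighbour_based_init(X, n_components=2, r=1.0)
gmm = GaussianMixture(n_components=2, weights_init=weights, means_init=means,
                      precisions_init=np.linalg.inv(covs)).fit(X)
print(gmm.means_)
```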
Keywords: Gaussian mixture model; Initialization; EM algorithm; Eigenvalue decomposition; Truncated normal distribution
Date: 2012
Full text (ScienceDirect, subscription required): http://www.sciencedirect.com/science/article/pii/S0167947311003963
Persistent link: https://EconPapers.repec.org/RePEc:eee:csdana:v:56:y:2012:i:6:p:1381-1395
DOI: 10.1016/j.csda.2011.11.002