Global Convergence of the EM Algorithm for Unconstrained Latent Variable Models with Categorical Indicators
Alexander Weissman
Psychometrika, 2013, vol. 78, issue 1, 134-153
Abstract:
Convergence of the expectation-maximization (EM) algorithm to a global optimum of the marginal log likelihood function is established for unconstrained latent variable models with categorical indicators. Sufficient conditions under which global convergence of the EM algorithm is attainable are derived in an information-theoretic context by interpreting the EM algorithm as alternating minimization of the Kullback–Leibler divergence between two convex sets. It is shown that these conditions are satisfied by an unconstrained latent class model, yielding an optimal bound against which more highly constrained models may be compared. Copyright The Psychometric Society 2013
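The alternating-minimization view of EM described in the abstract can be made concrete with a small numerical sketch. The following is a minimal EM implementation for an unconstrained latent class model with categorical indicators; it is not the paper's derivation, and the dimensions (N, J, C, K), the synthetic data, and the stopping tolerance are all illustrative assumptions. The E-step and M-step correspond, respectively, to the two Kullback–Leibler minimizations of the alternating scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: N respondents answering J categorical items with C
# categories each, generated from K latent classes. The ground-truth
# parameters are used only to simulate data (all values illustrative).
N, J, C, K = 500, 4, 3, 2
true_pi = np.array([0.6, 0.4])
true_theta = rng.dirichlet(np.ones(C), size=(K, J))  # P(response c | class k, item j)
z = rng.choice(K, size=N, p=true_pi)
X = np.array([[rng.choice(C, p=true_theta[z[i], j]) for j in range(J)]
              for i in range(N)])

# Random initialization of the unconstrained model parameters.
pi = np.full(K, 1.0 / K)
theta = rng.dirichlet(np.ones(C), size=(K, J))

prev_ll = -np.inf
for it in range(500):
    # E-step: log responsibilities, log r[i,k] = log pi_k + sum_j log theta[k,j,x_ij]
    # up to normalization. In the alternating-minimization view, this picks the
    # completed-data distribution closest in KL divergence to the current model.
    log_r = np.tile(np.log(pi), (N, 1))
    for j in range(J):
        log_r += np.log(theta[:, j, X[:, j]]).T

    # Marginal log likelihood via a numerically stable log-sum-exp over classes.
    m = log_r.max(axis=1, keepdims=True)
    ll = float(np.sum(m[:, 0] + np.log(np.exp(log_r - m).sum(axis=1))))

    r = np.exp(log_r - m)
    r /= r.sum(axis=1, keepdims=True)

    # M-step: closed-form updates (weighted relative frequencies). This is the
    # second KL minimization, taken over the unconstrained model family.
    pi = r.mean(axis=0)
    for j in range(J):
        for c in range(C):
            theta[:, j, c] = r[X[:, j] == c].sum(axis=0)
    theta = np.clip(theta, 1e-12, None)        # guard against empty categories
    theta /= theta.sum(axis=2, keepdims=True)

    # EM never decreases the marginal log likelihood; stop at stationarity.
    if ll - prev_ll < 1e-10:
        break
    prev_ll = ll

print(f"stopped after {it + 1} iterations, marginal log likelihood = {ll:.3f}")
```

The monotone increase of the marginal log likelihood holds for any starting point; the paper's contribution concerns when this ascent reaches the global optimum for the unconstrained model, so a single run of a sketch like this should not be taken as evidence of global convergence in general.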
Keywords: EM algorithm; latent variable models; latent class models; information theory; Kullback–Leibler divergence; relative entropy; variational calculus; convex optimization; optimal bounds
Date: 2013
Downloads: http://hdl.handle.net/10.1007/s11336-012-9295-z (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:spr:psycho:v:78:y:2013:i:1:p:134-153
Ordering information: This journal article can be ordered from http://www.springer. ... gy/journal/11336/PS2
DOI: 10.1007/s11336-012-9295-z
Psychometrika is currently edited by Irini Moustaki