Clustering sparse binary data with hierarchical Bayesian Bernoulli mixture model
Mao Ye,
Peng Zhang and
Lizhen Nie
Computational Statistics & Data Analysis, 2018, vol. 123, issue C, 32-49
Abstract:
Sparsity in features poses a major technical challenge to existing clustering methods for categorical data. The hierarchical Bayesian Bernoulli mixture model (HBBMM) incorporates constrained empirical Bayes priors for the model parameters, so that the resulting Expectation Maximization (EM) search for the estimator is confined to a proper region. The EM algorithm yields the maximum a posteriori (MAP) estimate, in which cluster labels are assigned simultaneously. Three criteria are proposed to identify the defining features of individual clusters, leading to an understanding of the underlying data structure. An information-based model selection criterion is applied to determine the number of clusters. Estimation consistency and the performance of the model selection criteria are investigated. Two real-world sparse categorical datasets are analyzed with the proposed method.
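As a rough illustration of the approach the abstract describes, the sketch below implements EM for a plain (non-hierarchical) Bernoulli mixture with independent Beta(alpha, beta) priors on the success probabilities. This is a simplified stand-in for the paper's constrained empirical Bayes priors, not the authors' method: the prior choice, the `alpha`/`beta` values, and the function name are assumptions for illustration. The Beta prior keeps the MAP updates strictly inside (0, 1), which mirrors how the paper's priors confine the search to a proper region when the data are sparse; cluster labels fall out of the final responsibilities.

```python
import numpy as np

def bernoulli_mixture_em(X, K, alpha=2.0, beta=2.0, n_iter=50, seed=0):
    """MAP EM for a Bernoulli mixture with Beta(alpha, beta) priors
    on the per-feature success probabilities (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(K, 1.0 / K)                       # mixing weights
    theta = rng.uniform(0.25, 0.75, size=(K, d))   # Bernoulli parameters
    for _ in range(n_iter):
        # E-step: responsibilities from log-likelihoods (stabilized)
        log_p = (X @ np.log(theta).T
                 + (1.0 - X) @ np.log(1.0 - theta).T
                 + np.log(pi))
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: MAP updates; the Beta prior pseudo-counts keep
        # theta in (0, 1) even for features that are all-zero in a cluster
        Nk = r.sum(axis=0)
        pi = Nk / n
        theta = (r.T @ X + alpha - 1.0) / (Nk[:, None] + alpha + beta - 2.0)
    labels = r.argmax(axis=1)                      # simultaneous cluster labels
    return labels, pi, theta

# Usage on synthetic sparse binary data with two planted clusters
rng = np.random.default_rng(1)
X = (rng.random((80, 10)) < 0.05).astype(float)   # mostly-zero background
X[:40, :3] = (rng.random((40, 3)) < 0.8)          # cluster 1 defining features
X[40:, 7:] = (rng.random((40, 3)) < 0.8)          # cluster 2 defining features
labels, pi, theta = bernoulli_mixture_em(X, K=2)
```

With `alpha = beta = 2` each update adds one pseudo-success and one pseudo-failure per component, so `theta` never reaches 0 or 1 and the log-likelihood stays finite even on very sparse features.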
Keywords: Bayes factor; Categorical data; Defining features; Model selection
Date: 2018
Downloads: http://www.sciencedirect.com/science/article/pii/S016794731830029X (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:csdana:v:123:y:2018:i:c:p:32-49
DOI: 10.1016/j.csda.2018.01.020
Computational Statistics & Data Analysis is currently edited by S.P. Azen
More articles in Computational Statistics & Data Analysis from Elsevier
Bibliographic data for series maintained by Catherine Liu ().