The Kullback information criterion for mixture regression models
Bezza Hafidi and Abdallah Mkhadri
Statistics & Probability Letters, 2010, vol. 80, issue 9-10, 807-815
Abstract:
We consider the problem of jointly selecting the number of components and variables in finite mixture regression models. The classical model selection criteria, AIC and BIC, may not be satisfactory in this setting, especially when the sample size is small or the number of variables is large: they tend to fit too many components and retain too many variables. An alternative mixture regression criterion, called MRC, which simultaneously determines the number of components and variables in mixture regression models, was proposed by Naik et al. (2007). In the same setting, we propose a new information criterion for the simultaneous determination of the number of components and predictors. The new criterion is based on the Kullback symmetric divergence instead of the Kullback directed divergence used for MRC. We show that the new criterion performs better than MRC in a small simulation study.
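The distinction the abstract draws is between the directed Kullback-Leibler divergence, which is asymmetric in its two arguments, and the Kullback symmetric (J) divergence, which sums the two directed divergences. As a minimal sketch (not the paper's criterion itself), the closed-form expressions for univariate Gaussians illustrate the difference; the function names and example parameters below are illustrative choices, not from the paper:

```python
import math

def kl_gauss(m_p, s_p, m_q, s_q):
    """Directed Kullback-Leibler divergence KL(p || q) between the
    univariate Gaussians p = N(m_p, s_p^2) and q = N(m_q, s_q^2),
    using the standard closed-form expression."""
    return math.log(s_q / s_p) + (s_p**2 + (m_p - m_q)**2) / (2 * s_q**2) - 0.5

def j_divergence(m_p, s_p, m_q, s_q):
    """Kullback symmetric (J) divergence: the sum of the two directed
    divergences, so it treats p and q interchangeably."""
    return kl_gauss(m_p, s_p, m_q, s_q) + kl_gauss(m_q, s_q, m_p, s_p)

# The directed divergence depends on the order of its arguments;
# the symmetric divergence does not.
d1 = kl_gauss(0.0, 1.0, 1.0, 2.0)   # KL(p || q)
d2 = kl_gauss(1.0, 2.0, 0.0, 1.0)   # KL(q || p), generally different from d1
```

A criterion built from the symmetric divergence penalizes discrepancy in both directions between the candidate model and the truth, which is the motivation the abstract gives for preferring it over the directed divergence underlying MRC.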
Date: 2010
Citations: 3 (tracked by EconPapers)
Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0167-7152(10)00026-X
Full text for ScienceDirect subscribers only
Persistent link: https://EconPapers.repec.org/RePEc:eee:stapro:v:80:y:2010:i:9-10:p:807-815
Statistics & Probability Letters is currently edited by Somnath Datta and Hira L. Koul