A note on the identifiability of the conditional expectation for the mixtures of neural networks
Jean-Pierre Stockis, Joseph Tadjuidje-Kamgaing and Jürgen Franke
Statistics & Probability Letters, 2008, vol. 78, issue 6, 739-742
Abstract:
We consider a generalized mixture of nonlinear AR models, a hidden Markov model in which the autoregressive functions are single-layer feedforward neural networks. The nontrivial problem of identifiability, which is usually postulated for hidden Markov models, is addressed here.
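For illustration only (this sketch is not taken from the paper, and every function name and parameter value in it is hypothetical), the conditional expectation of such a mixture can be written as a regime-probability-weighted sum of single-hidden-layer networks, roughly as follows in Python:

import numpy as np

# Illustrative sketch (not the authors' code): a two-regime mixture of
# nonlinear AR(1) models whose autoregressive functions are single-hidden-layer
# feedforward networks, as in the model class described in the abstract.
# All parameter values are arbitrary placeholders.

def nn_ar(x, w_in, b_in, w_out, b_out):
    """Single-hidden-layer feedforward network f(x) with tanh activation."""
    hidden = np.tanh(w_in * x + b_in)        # hidden layer (vector)
    return float(w_out @ hidden + b_out)     # scalar output

# Regime-specific network parameters (hypothetical values).
params = [
    dict(w_in=np.array([0.8, -0.3]), b_in=np.array([0.1, 0.0]),
         w_out=np.array([1.2, 0.5]), b_out=0.0),
    dict(w_in=np.array([-0.5, 0.9]), b_in=np.array([0.0, 0.2]),
         w_out=np.array([0.7, -1.1]), b_out=0.1),
]

def conditional_expectation(x_prev, regime_probs):
    """E[X_t | X_{t-1} = x_prev] = sum_i P(regime i) * f_i(x_prev)."""
    return sum(p * nn_ar(x_prev, **theta)
               for p, theta in zip(regime_probs, params))

print(conditional_expectation(0.5, regime_probs=[0.6, 0.4]))

Here regime_probs is only a placeholder for the predictive probabilities of the hidden Markov regimes given the past, which the paper treats formally.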
Keywords: Mixture models; Neural networks; Identifiability
Date: 2008
Citations: 3 (in EconPapers)
Downloads: http://www.sciencedirect.com/science/article/pii/S0167-7152(07)00326-4 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:stapro:v:78:y:2008:i:6:p:739-742
Ordering information: This journal article can be ordered from
http://www.elsevier.com/wps/find/supportfaq.cws_home/regional
https://shop.elsevie ... _01_ooc_1&version=01
Statistics & Probability Letters is currently edited by Somnath Datta and Hira L. Koul
Bibliographic data for series maintained by Catherine Liu.