Bayesian optimum designs for discriminating between models with any distribution
C. Tommasi and J. López-Fidalgo
Computational Statistics & Data Analysis, 2010, vol. 54, issue 1, 143-150
Abstract:
The Bayesian KL-optimality criterion is useful for discriminating between any two statistical models in the presence of prior information. If the rival models are not nested then, depending on which model is true, two different Kullback-Leibler distances may be defined. The Bayesian KL-optimality criterion is a convex combination of the expected values of these two possible Kullback-Leibler distances between the competing models. The expectations are taken over the prior distributions of the parameters, and the weights of the convex combination are the prior probabilities of the models. Concavity of the Bayesian KL-optimality criterion is proved, so that classical results of optimal design theory can be applied. A standardized version of the proposed criterion is also given in order to account for the possibly different magnitudes of the two Kullback-Leibler distances. Some illustrative examples are provided.
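As a hedged sketch of the criterion the abstract describes (all symbols here are illustrative assumptions, not notation taken from the paper): with rival models $f_1(y, x, \theta_1)$ and $f_2(y, x, \theta_2)$, prior model probabilities $p_1$ and $p_2$, and parameter priors $\pi_1$ and $\pi_2$, the convex combination of expected Kullback-Leibler distances might be written as

```latex
% Bayesian KL-optimality criterion (notation assumed for illustration only)
\[
  B(\xi)
  \;=\;
  p_1 \int I_{2,1}(\xi,\theta_1)\,\pi_1(\theta_1)\,d\theta_1
  \;+\;
  p_2 \int I_{1,2}(\xi,\theta_2)\,\pi_2(\theta_2)\,d\theta_2,
\]
% where I_{j,i} is the KL distance from the true model f_i to the closest
% member of the rival family f_j under design \xi, e.g.
\[
  I_{2,1}(\xi,\theta_1)
  \;=\;
  \min_{\theta_2}
  \int\!\! f_1(y,x,\theta_1)\,
  \log\frac{f_1(y,x,\theta_1)}{f_2(y,x,\theta_2)}\,dy\;\xi(dx).
\]
```

Under this reading, concavity of $B(\xi)$ in the design measure $\xi$ is what allows the classical equivalence-theorem machinery of optimal design theory to be applied.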
Date: 2010
Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0167-9473(09)00265-5
Full text for ScienceDirect subscribers only.
Persistent link: https://EconPapers.repec.org/RePEc:eee:csdana:v:54:y:2010:i:1:p:143-150