Discriminating Between Weibull and Log-Normal Distributions Based on Kullback-Leibler Divergence
Ali Akbar Bromideh (Shahid Beheshti University)
Istanbul University Econometrics and Statistics e-Journal, 2012, vol. 16, issue 1, 44-54
The Weibull and Log-Normal distributions are frequently used in reliability analysis of lifetime (or failure-time) data. The ratio of maximized likelihoods (RML) has been widely used to choose between the two distributions. The Kullback-Leibler divergence (KLD) measures the discrepancy between two densities, and an advantage of the KLD as a discrimination criterion is that it incorporates the entropy of each model. We examine the use of the KLD in discriminating between the Weibull and Log-Normal distributions, illustrate its applicability with a real data set, and establish its consistency with the RML.
Keywords: Model discrimination; Weibull distribution; Log-Normal distribution; Kullback-Leibler divergence; Ratio of maximized likelihood
Persistent link: https://EconPapers.repec.org/RePEc:ist:ancoec:v:16:y:2012:i:1:p:44-54
More articles in Istanbul University Econometrics and Statistics e-Journal from the Department of Econometrics, Faculty of Economics, Istanbul University.
Series data maintained by Kutluk Kagan Sumer.