Testing Parametric Distribution Family Assumptions via Differences in Differential Entropy
Ron Mittelhammer, George Judge, and Miguel Henry
Papers from arXiv.org
Abstract:
We introduce a broadly applicable statistical procedure for testing which parametric distribution family generated a random sample of data. The method, termed the Difference in Differential Entropy (DDE) test, provides a unified framework applicable to a wide range of distributional families, with asymptotic validity grounded in established maximum likelihood, bootstrap, and kernel density estimation principles. The test is straightforward to implement, computationally efficient, and requires no tuning parameters or specialized regularity conditions. It compares an MLE-based estimate of differential entropy under the null hypothesis with a nonparametric bootstrapped kernel density estimate, using their divergence as an information-theoretic measure of model fit.
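The abstract's description can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy version of the DDE idea, not the authors' procedure: it takes a normal-family null (so the MLE-based differential entropy has the closed form 0.5·ln(2πeσ̂²)), estimates entropy nonparametrically by averaging plug-in estimates from bootstrapped Gaussian kernel density fits, and reports the difference. All function names and choices (bandwidth rule, bootstrap size `B`) are illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=500)  # sample drawn from the null family

# MLE-based differential entropy under a normal null: H = 0.5 * ln(2*pi*e*sigma^2)
mu_hat, sigma_hat = x.mean(), x.std()  # normal-family MLEs
h_mle = 0.5 * np.log(2 * np.pi * np.e * sigma_hat**2)

def kde_entropy(sample):
    """Plug-in entropy estimate: -E[log f(X)] approximated at the sample points."""
    kde = gaussian_kde(sample)  # Gaussian KDE with scipy's default bandwidth
    return -np.mean(np.log(kde(sample)))

# Average the KDE entropy over bootstrap resamples of the data
B = 50
h_boot = np.mean([kde_entropy(rng.choice(x, size=x.size, replace=True))
                  for _ in range(B)])

dde = h_mle - h_boot  # small |DDE| is consistent with the hypothesized family
print(f"H_MLE={h_mle:.3f}  H_KDE={h_boot:.3f}  DDE={dde:.3f}")
```

Under the null, both estimates target the same differential entropy, so the difference should concentrate near zero; a misspecified family drives the MLE-based entropy away from the nonparametric one. Calibrating a critical value for the DDE statistic is where the paper's asymptotic and bootstrap theory comes in, and is not attempted here.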
Date: 2025-12
Download: http://arxiv.org/pdf/2512.11305 (latest version, application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:2512.11305