Entropy-based test for generalised Gaussian distributions
Mehmet Siddik Cadirci, Dafydd Evans, Nikolai Leonenko and Vitalii Makogin
Computational Statistics & Data Analysis, 2022, vol. 173, issue C
Abstract:
A proof of L2 consistency is given for the kth nearest neighbour distance estimator of the Shannon entropy, for an arbitrary fixed k≥1. Based on a maximum entropy principle, a non-parametric goodness-of-fit test is constructed for a class of generalised multivariate Gaussian distributions introduced in the paper. The theoretical results are followed by numerical studies on simulated samples, which show that increasing k improves the power of the proposed goodness-of-fit tests and support the asymptotic normality of the test statistics.
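The abstract refers to the kth nearest neighbour distance estimator of Shannon entropy. The paper's exact estimator and test statistic are not reproduced on this page, so the Python sketch below only illustrates the standard Kozachenko–Leonenko type k-NN entropy estimator; the function name knn_entropy and the use of SciPy are illustrative assumptions, not the authors' code.

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma, gammaln

    def knn_entropy(x, k=1):
        """Kozachenko-Leonenko type k-NN estimate of Shannon entropy (in nats).

        Illustrative sketch only; the estimator studied in the paper may differ in details.
        """
        x = np.asarray(x, dtype=float)
        n, d = x.shape
        tree = cKDTree(x)
        # distances to the k-th nearest neighbour of each point (column 0 is the point itself)
        dist, _ = tree.query(x, k=k + 1)
        rho = dist[:, k]
        # log-volume of the d-dimensional unit ball
        log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
        return d * np.mean(np.log(rho)) + log_vd + digamma(n) - digamma(k)

    # Example: for a standard d-variate Gaussian sample the estimate should be close to
    # the closed-form entropy 0.5 * d * log(2 * pi * e) (about 4.257 for d = 3).
    rng = np.random.default_rng(0)
    sample = rng.standard_normal((2000, 3))
    print(knn_entropy(sample, k=5))

A maximum-entropy goodness-of-fit test of the kind described would then compare such a non-parametric entropy estimate with the maximal entropy attained by the fitted generalised Gaussian family; the specific test statistic is defined in the article itself.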
Keywords: Maximum entropy principle; Generalised Gaussian distribution; Shannon entropy; Nearest neighbour estimator of entropy; Goodness-of-fit test
Date: 2022
Downloads: http://www.sciencedirect.com/science/article/pii/S0167947322000822 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:csdana:v:173:y:2022:i:c:s0167947322000822
DOI: 10.1016/j.csda.2022.107502