Asymptotic Properties of a Statistical Estimator of the Jeffreys Divergence: The Case of Discrete Distributions
Vladimir Glinskiy,
Artem Logachov,
Olga Logachova,
Helder Rojas,
Lyudmila Serga and
Anatoly Yambartsev
Additional contact information
Vladimir Glinskiy: Department of Business Analytics, Siberian Institute of Management—Branch of the Russian Presidential Academy of National Economy and Public Administration, Novosibirsk State University of Economics and Management, 630102 Novosibirsk, Russia
Artem Logachov: Department of Business Analytics, Siberian Institute of Management—Branch of the Russian Presidential Academy of National Economy and Public Administration, Novosibirsk State University of Economics and Management, 630102 Novosibirsk, Russia
Olga Logachova: Department of Higher Mathematics, Siberian State University of Geosystems and Technologies (SSUGT), 630108 Novosibirsk, Russia
Helder Rojas: Escuela Profesional de Ingeniería Estadística, Universidad Nacional de Ingeniería, Lima 00051, Peru
Lyudmila Serga: Department of Business Analytics, Siberian Institute of Management—Branch of the Russian Presidential Academy of National Economy and Public Administration, Novosibirsk State University of Economics and Management, 630102 Novosibirsk, Russia
Anatoly Yambartsev: Department of Statistics, Institute of Mathematics and Statistics, University of São Paulo (USP), São Paulo 05508-220, Brazil
Mathematics, 2024, vol. 12, issue 21, 1-16
Abstract:
We investigate the asymptotic properties of the plug-in estimator for the Jeffreys divergence, the symmetric variant of the Kullback–Leibler (KL) divergence. This study focuses specifically on the divergence between discrete distributions. Traditionally, estimators rely on two independent samples corresponding to two distinct conditions. However, we propose a one-sample estimator where the condition results from a random event. We establish the estimator’s asymptotic unbiasedness (law of large numbers) and asymptotic normality (central limit theorem). Although the results are expected, the proofs require additional technical work due to the randomness of the conditions.
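The abstract's one-sample setting can be illustrated with a short sketch: each observation carries a random binary condition label, the two empirical distributions are formed by splitting on that label, and the Jeffreys divergence J(P,Q) = Σ (p(x) − q(x)) log(p(x)/q(x)) is evaluated at the plug-in (empirical) frequencies. This is an illustrative reconstruction under those assumptions, not the paper's exact estimator or its regularity conditions.

```python
import math
from collections import Counter

def plug_in_jeffreys(sample):
    """Plug-in estimator of the Jeffreys divergence from one sample.

    `sample` is an iterable of (condition, value) pairs, where `condition`
    is a random 0/1 label splitting observations between the two discrete
    distributions being compared.  Illustrative sketch only; the paper's
    exact estimator and assumptions are not reproduced here.
    """
    values0 = [v for c, v in sample if c == 0]
    values1 = [v for c, v in sample if c == 1]
    n0, n1 = len(values0), len(values1)
    counts_p = Counter(values0)
    counts_q = Counter(values1)
    support = set(counts_p) | set(counts_q)
    j = 0.0
    for x in support:
        p_hat = counts_p[x] / n0
        q_hat = counts_q[x] / n1
        if p_hat == 0.0 or q_hat == 0.0:
            # The plug-in estimate is infinite off the common support.
            return math.inf
        # Jeffreys divergence: J(P, Q) = sum (p - q) * log(p / q) >= 0.
        j += (p_hat - q_hat) * math.log(p_hat / q_hat)
    return j
```

For example, empirical frequencies (3/4, 1/4) versus (1/4, 3/4) give J = (1/2)·log 3 + (1/2)·log 3 = log 3, and identical empirical distributions give J = 0, matching the symmetry and nonnegativity of the Jeffreys divergence.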
Keywords: KL divergence estimation; Jeffreys divergence; central limit theorem (CLT)
JEL-codes: C
Date: 2024
Downloads: (external link)
https://www.mdpi.com/2227-7390/12/21/3319/pdf (application/pdf)
https://www.mdpi.com/2227-7390/12/21/3319/ (text/html)
Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:12:y:2024:i:21:p:3319-:d:1504748
Mathematics is currently edited by Ms. Emma He