Non-asymptotic sub-Gaussian error bounds for hypothesis testing
Yanpeng Li and Boping Tian
Statistics & Probability Letters, 2022, vol. 189, issue C
Abstract:
Using the sub-Gaussian norm of the Bernoulli random variable, this paper presents explicit and informative non-asymptotic lower bounds on the error of binary and multiple hypothesis testing in terms of the KL divergence. Some numerical comparisons are also provided.
Keywords: Pinsker’s bound; KL divergence; Sub-Gaussian; Fano’s inequality
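As background for the keywords above, the classical Pinsker inequality bounds total variation distance by the KL divergence, TV(P, Q) ≤ sqrt(KL(P‖Q)/2). A minimal illustrative sketch for two Bernoulli distributions (this is the textbook bound, not the paper's new sub-Gaussian bounds; function names are chosen here for illustration):

```python
import math

def kl_bernoulli(p, q):
    """KL divergence KL(Ber(p) || Ber(q)) in nats, for p, q in (0, 1)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def tv_bernoulli(p, q):
    """Total variation distance between Ber(p) and Ber(q)."""
    return abs(p - q)

# Check Pinsker's inequality: TV <= sqrt(KL / 2)
p, q = 0.3, 0.5
kl = kl_bernoulli(p, q)
pinsker_bound = math.sqrt(kl / 2)
print(tv_bernoulli(p, q) <= pinsker_bound)  # prints True
```

For p = 0.3 and q = 0.5, the total variation distance is 0.2 while the Pinsker bound evaluates to about 0.203, so the inequality holds with little slack; the paper's contribution is sharper non-asymptotic bounds of this flavor.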
Date: 2022
Full text (ScienceDirect subscribers only): http://www.sciencedirect.com/science/article/pii/S0167715222001377
Persistent link: https://EconPapers.repec.org/RePEc:eee:stapro:v:189:y:2022:i:c:s0167715222001377
DOI: 10.1016/j.spl.2022.109586
Statistics & Probability Letters is currently edited by Somnath Datta and Hira L. Koul