On the tight constant in the multivariate Dvoretzky–Kiefer–Wolfowitz inequality
Statistics & Probability Letters, 2021, vol. 173, issue C
We derive the tight constant in the multivariate version of the Dvoretzky–Kiefer–Wolfowitz inequality. The inequality is leveraged to construct the first fully non-parametric test for multivariate probability distributions, including a simple formula for the test statistic. We also generalize the test under appropriate α-mixing conditions and describe applications of the tests to machine learning and representative sampling.
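For intuition, the classical univariate Dvoretzky–Kiefer–Wolfowitz inequality with Massart's tight constant C = 2 already yields a distribution-free goodness-of-fit test of this kind: reject when the sup-distance between the empirical CDF and the hypothesized CDF exceeds the DKW band. The sketch below illustrates that univariate mechanism only; the function names are illustrative and the paper's multivariate constant is not reproduced here.

```python
import numpy as np

def dkw_band(n, alpha=0.05, C=2.0):
    """Half-width of a uniform confidence band for the empirical CDF.

    Solves C * exp(-2 n eps^2) = alpha for eps, per the DKW inequality
    P(sup_x |F_n(x) - F(x)| > eps) <= C * exp(-2 n eps^2).
    C = 2 is Massart's tight constant in the univariate case.
    """
    return np.sqrt(np.log(C / alpha) / (2.0 * n))

def dkw_test(sample, cdf, alpha=0.05, C=2.0):
    """Reject H0: sample ~ cdf if sup_x |F_n(x) - F(x)| exceeds the band."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    F = cdf(x)
    # The empirical CDF is a step function, so the sup-distance is attained
    # just before or at a jump point; check both one-sided deviations.
    d_plus = np.max(np.arange(1, n + 1) / n - F)
    d_minus = np.max(F - np.arange(0, n) / n)
    stat = max(d_plus, d_minus)
    return stat, stat > dkw_band(n, alpha, C)
```

For a sample close to the hypothesized law (here Uniform(0,1)), the statistic stays well inside the band and the test does not reject.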
Keywords: Machine learning; Empirical process; Hypothesis test; Non-parametric
Persistent link: https://EconPapers.repec.org/RePEc:eee:stapro:v:173:y:2021:i:c:s016771522100050x
Statistics & Probability Letters is currently edited by Somnath Datta and Hira L. Koul