On the Convergence of Hypergeometric to Binomial Distributions
Upul Rupassara and
Bishnu Sedai
Computer and Information Science, 2023, vol. 16, issue 3, 15
Abstract:
This study presents a measure-theoretic approach to estimating an upper bound on the total variation distance between the hypergeometric and binomial distributions using the Kullback-Leibler information divergence. The binomial distribution gives the probabilities associated with binomial experiments, but when sampling is done without replacement and the sample size is large relative to the population size, the experiment is not binomial and the binomial distribution is a poor model for the associated probabilities. In that case the hypergeometric distribution is the appropriate probability model. An upper bound on the total variation distance between the hypergeometric and binomial distributions is derived using only the sample and population sizes, and this bound is used to show that the hypergeometric distribution converges uniformly to the binomial distribution as the population size grows relative to the sample size.
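
As a rough numerical illustration of the convergence the abstract describes (not the paper's KL-based bound, whose derivation is not reproduced here), the Python sketch below computes the exact total variation distance, TV = (1/2) Σ_k |P_hyper(k) − P_binom(k)|, between a hypergeometric distribution and the matching binomial distribution for a fixed sample size as the population size grows. The parameter values, function name, and use of SciPy are illustrative assumptions, not taken from the paper.

    # Illustrative sketch only: computes the exact total variation distance
    # numerically rather than the paper's Kullback-Leibler upper bound.
    # Parameter choices (n = 10, success proportion 1/2) are arbitrary.
    import numpy as np
    from scipy.stats import binom, hypergeom

    def tv_distance(N, K, n):
        """TV distance between Hypergeometric(N, K, n) and Binomial(n, K/N).

        N: population size, K: successes in the population, n: sample size.
        """
        k = np.arange(n + 1)
        # SciPy's hypergeom parameters are (k, population, successes, draws).
        p_hyper = hypergeom.pmf(k, N, K, n)
        p_binom = binom.pmf(k, n, K / N)
        return 0.5 * np.abs(p_hyper - p_binom).sum()

    n = 10  # fixed sample size
    for N in (20, 50, 100, 1_000, 10_000):
        K = N // 2  # keep the success proportion fixed at 1/2
        print(f"N = {N:>6}  TV(hypergeometric, binomial) = {tv_distance(N, K, n):.6f}")

With n fixed, the computed distance shrinks toward zero as N grows, which is the uniform convergence the paper quantifies with a bound depending only on the sample and population sizes.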
Date: 2023
Downloads:
https://ccsenet.org/journal/index.php/cis/article/download/0/0/49045/52882 (application/pdf)
https://ccsenet.org/journal/index.php/cis/article/view/0/49045 (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:ibn:cisjnl:v:16:y:2023:i:3:p:15
More articles in Computer and Information Science from the Canadian Center of Science and Education.