On the Convergence of Hypergeometric to Binomial Distributions

  •  Upul Rupassara    
  •  Bishnu Sedai    


This study presents a measure-theoretic approach to deriving an upper bound on the total variation distance between the hypergeometric and binomial distributions using the Kullback-Leibler information divergence. The binomial distribution gives the probabilities associated with binomial experiments, but when the sample size is large relative to the population size, sampling without replacement departs from the binomial model, and the binomial distribution is no longer a good choice for computing the probabilities of the experiment. In that case, the hypergeometric distribution is the appropriate probability model. An upper bound on the total variation distance between the hypergeometric and binomial distributions is derived in terms of the sample and population sizes alone. This bound is then used to show that the hypergeometric distribution converges uniformly to the binomial distribution as the population size grows relative to the sample size.
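As an illustrative sketch (not the paper's derivation), the total variation distance between the two distributions can be computed numerically. The helper functions and parameter choices below are assumptions for illustration: a population of size N with K successes, a sample of size n drawn without replacement (hypergeometric), compared against a binomial with success probability p = K/N.

```python
from math import comb

def hypergeom_pmf(N, K, n, k):
    # P(X = k) when drawing n items without replacement from a
    # population of N items containing K successes.
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

def binom_pmf(n, p, k):
    # P(X = k) for n independent trials with success probability p.
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def total_variation(N, K, n):
    # Total variation distance: half the sum of absolute pointwise
    # differences between the two probability mass functions.
    p = K / N
    return 0.5 * sum(
        abs(hypergeom_pmf(N, K, n, k) - binom_pmf(n, p, k))
        for k in range(n + 1)
    )

# With n = 10 fixed and p = 0.5, the distance shrinks as N grows,
# illustrating the convergence the abstract describes.
for N in (100, 1000, 10000):
    print(N, total_variation(N, N // 2, 10))
```

Holding the sample size fixed while the population grows drives the total variation distance toward zero, which is the uniform-convergence behavior the derived upper bound quantifies.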

This work is licensed under a Creative Commons Attribution 4.0 License.
  • ISSN(Print): 1913-8989
  • ISSN(Online): 1913-8997
  • Started: 2008
  • Frequency: semiannual
