Negative hypergeometric distribution explained

In probability theory and statistics, the negative hypergeometric distribution describes probabilities when sampling without replacement from a finite population in which each element can be classified into one of two mutually exclusive categories, such as Pass/Fail or Employed/Unemployed. As random selections are made from the population, each draw shrinks the remaining population, so the probability of success changes from draw to draw. Unlike the standard hypergeometric distribution, which describes the number of successes in a sample of fixed size, in the negative hypergeometric distribution samples are drawn until r failures have been found, and the distribution describes the probability of finding k successes in such a sample. In other words, the negative hypergeometric distribution describes the likelihood of k successes in a sample with exactly r failures.

Definition

There are N elements, of which K are defined as "successes" and the rest are "failures". Elements are drawn one after the other, without replacement, until r failures are encountered. Then the drawing stops and the number k of successes is counted. The negative hypergeometric distribution, NHG_{N,K,r}(k), is the discrete distribution of this k.[1]

The negative hypergeometric distribution is a special case of the beta-binomial distribution[2] with parameters \alpha = r and \beta = N-K-r+1, both being integers (and n = K).
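This special-case relationship can be checked numerically. The following sketch (plain Python with the beta-binomial pmf written out via the Gamma function; the function names are illustrative) compares the two pmfs for one arbitrary choice of parameters:

```python
from math import comb, gamma

def nhg_pmf(k, N, K, r):
    # Negative hypergeometric pmf NHG_{N,K,r}(k), closed form given below.
    return comb(k + r - 1, k) * comb(N - r - k, K - k) / comb(N, K)

def betabinom_pmf(k, n, a, b):
    # Beta-binomial pmf: C(n,k) * B(k+a, n-k+b) / B(a, b),
    # with the Beta function expressed via the Gamma function.
    def beta(x, y):
        return gamma(x) * gamma(y) / gamma(x + y)
    return comb(n, k) * beta(k + a, n - k + b) / beta(a, b)

# Special case: alpha = r, beta = N - K - r + 1, n = K.
N, K, r = 15, 6, 4
a, b = r, N - K - r + 1
for k in range(K + 1):
    assert abs(nhg_pmf(k, N, K, r) - betabinom_pmf(k, K, a, b)) < 1e-9
```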

The outcome requires that we observe k successes in the first (k+r-1) draws and that the (k+r)-th draw is a failure. The probability of the former can be found by direct application of the hypergeometric distribution, HG_{N,K,k+r-1}(k), and the probability of the latter is simply the number of failures remaining, N-K-(r-1), divided by the size of the remaining population, N-(k+r-1). The probability of having exactly k successes up to the r-th failure (i.e. the drawing stops as soon as the sample includes the predefined number of r failures) is then the product of these two probabilities:

\frac{\binom{K}{k}\binom{N-K}{r-1}}{\binom{N}{k+r-1}} \cdot \frac{N-K-(r-1)}{N-(k+r-1)} = \frac{\binom{k+r-1}{k}\binom{N-r-k}{K-k}}{\binom{N}{K}}.
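The product construction above can be verified directly. A minimal Python sketch (function names are illustrative; it assumes only the standard library's `math.comb`):

```python
from math import comb

def hypergeom_pmf(k, N, K, n):
    # HG_{N,K,n}(k): probability of k successes in n draws without
    # replacement from a population of N containing K successes.
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

def nhg_pmf(k, N, K, r):
    # Closed-form negative hypergeometric pmf.
    return comb(k + r - 1, k) * comb(N - r - k, K - k) / comb(N, K)

def nhg_pmf_product(k, N, K, r):
    # Product form: k successes in the first k+r-1 draws,
    # times the probability that draw k+r is a failure.
    p_first = hypergeom_pmf(k, N, K, k + r - 1)
    p_last = (N - K - (r - 1)) / (N - (k + r - 1))
    return p_first * p_last

N, K, r = 12, 7, 3
for k in range(K + 1):
    assert abs(nhg_pmf(k, N, K, r) - nhg_pmf_product(k, N, K, r)) < 1e-12
```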

X follows the negative hypergeometric distribution if its probability mass function (pmf) is given by

f(k; N, K, r) \equiv \Pr(X = k) = \frac{\binom{k+r-1}{k}\binom{N-r-k}{K-k}}{\binom{N}{K}} \quad \text{for } k = 0, 1, 2, \ldots, K,

where N is the population size, K is the number of success states in the population, r is the number of failures, k is the number of observed successes, and \binom{a}{b} is a binomial coefficient. By design the probabilities sum to 1. However, if we want to show this explicitly we have:
\sum_{k=0}^{K} \Pr(X=k) = \sum_{k=0}^{K} \frac{\binom{k+r-1}{k}\binom{N-r-k}{K-k}}{\binom{N}{K}} = \frac{1}{\binom{N}{K}} \sum_{k=0}^{K} \binom{k+r-1}{k}\binom{N-r-k}{K-k} = \frac{1}{\binom{N}{K}} \binom{N}{K} = 1,

where we have used that,

\begin{align}
\sum_{j=0}^{k} \binom{j+m}{j}\binom{n-m-j}{k-j} &= \sum_{j=0}^{k} (-1)^{j}\binom{-m-1}{j}(-1)^{k-j}\binom{m+1+k-n-2}{k-j}\\
&= (-1)^{k}\sum_{j=0}^{k} \binom{-m-1}{j}\binom{k-n-2-(-m-1)}{k-j}\\
&= (-1)^{k}\binom{k-n-2}{k}\\
&= (-1)^{k}\binom{k-(n+1)-1}{k}\\
&= \binom{n+1}{k},
\end{align}

which can be derived using the binomial identity,

\binom{n}{k} = (-1)^{k}\binom{k-n-1}{k},

and the Chu–Vandermonde identity,

\sum_{j=0}^{k} \binom{m}{j}\binom{n-m}{k-j} = \binom{n}{k},

which holds for any complex-valued m and n and any non-negative integer k.
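The normalization can also be confirmed numerically, e.g. with a short Python sketch (illustrative names, standard library only):

```python
from math import comb

def nhg_pmf(k, N, K, r):
    # Negative hypergeometric pmf from the definition above.
    return comb(k + r - 1, k) * comb(N - r - k, K - k) / comb(N, K)

# The probabilities over the support k = 0, ..., K sum to 1.
N, K, r = 20, 8, 4
total = sum(nhg_pmf(k, N, K, r) for k in range(K + 1))
assert abs(total - 1.0) < 1e-12
```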

Expectation

When counting the number k of successes before r failures, the expected number of successes is

E[X] = \frac{rK}{N-K+1}

and can be derived as follows.

\begin{align}
E[X] &= \sum_{k=0}^{K} k\Pr(X=k) = \sum_{k=0}^{K} k \frac{\binom{k+r-1}{k}\binom{N-r-k}{K-k}}{\binom{N}{K}}\\
&= \frac{r}{\binom{N}{K}} \left[\sum_{k=0}^{K} \frac{(k+r)}{r}\binom{k+r-1}{r-1}\binom{N-r-k}{K-k}\right] - r\\
&= \frac{r}{\binom{N}{K}} \left[\sum_{k=0}^{K} \binom{k+r}{r}\binom{N-r-k}{K-k}\right] - r = \frac{r}{\binom{N}{K}} \left[\sum_{k=0}^{K} \binom{k+r}{k}\binom{N-r-k}{K-k}\right] - r\\
&= \frac{r}{\binom{N}{K}} \binom{N+1}{K} - r = \frac{rK}{N-K+1},
\end{align}

where we have used the relationship

\sum_{j=0}^{k} \binom{j+m}{j}\binom{n-m-j}{k-j} = \binom{n+1}{k},

which we derived above to show that the negative hypergeometric distribution is properly normalized.
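The expectation formula can be checked exactly against the pmf by using rational arithmetic, e.g. (a Python sketch with illustrative names):

```python
from fractions import Fraction
from math import comb

def nhg_pmf(k, N, K, r):
    # Exact rational pmf, so the check below suffers no rounding error.
    return Fraction(comb(k + r - 1, k) * comb(N - r - k, K - k), comb(N, K))

N, K, r = 18, 10, 5
mean = sum(k * nhg_pmf(k, N, K, r) for k in range(K + 1))
assert mean == Fraction(r * K, N - K + 1)  # E[X] = rK / (N - K + 1)
```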

Variance

The variance can be derived by the following calculation.

\begin{align}
E[X^2] &= \sum_{k=0}^{K} k^2 \Pr(X=k) = \left[\sum_{k=0}^{K} (k+r)(k+r+1)\Pr(X=k)\right] - (2r+1)E[X] - r^2 - r\\
&= \frac{r(r+1)}{\binom{N}{K}} \left[\sum_{k=0}^{K} \binom{k+r+1}{r+1}\binom{N+1-(r+1)-k}{K-k}\right] - (2r+1)E[X] - r^2 - r\\
&= \frac{r(r+1)}{\binom{N}{K}} \binom{N+2}{K} - (2r+1)E[X] - r^2 - r = \frac{rK(N-r+Kr+1)}{(N-K+1)(N-K+2)}
\end{align}

Then the variance is

\mathrm{Var}[X] = E[X^2] - \left(E[X]\right)^2 = \frac{rK(N+1)(N-K-r+1)}{(N-K+1)^2(N-K+2)}.
Related distributions

If the drawing stops after a constant number n of draws (regardless of the number of failures), then the number of successes has the hypergeometric distribution, HG_{N,K,n}(k). The two functions are related in the following way:[1]

NHG_{N,K,r}(k) = 1 - HG_{N,N-K,k+r}(r-1)
The negative hypergeometric distribution (like the hypergeometric distribution) deals with draws without replacement, so that the probability of success differs from draw to draw. In contrast, the negative binomial distribution (like the binomial distribution) deals with draws with replacement, so that the probability of success is constant and the trials are independent. The following table summarizes the four distributions related to drawing items:

                                             With replacements                 No replacements
# of successes in constant # of draws        binomial distribution             hypergeometric distribution
# of successes in constant # of failures     negative binomial distribution    negative hypergeometric distribution

Some authors[3] [4] define the negative hypergeometric distribution to be the number of draws required to get the r-th failure. If we let Y denote this number, then it is clear that Y = X + r, where X is as defined above. Hence the pmf is

\Pr(Y=y) = \frac{\binom{y-1}{r-1}\binom{N-y}{N-K-r}}{\binom{N}{N-K}}.

If we let the number of failures N-K be denoted by M, this means that we have

\Pr(Y=y) = \frac{\binom{y-1}{r-1}\binom{N-y}{M-r}}{\binom{N}{M}}.

The support of Y is the set \{r, r+1, \ldots, N-M+r\}. It is clear that:

E[Y] = E[X] + r = \frac{r(N+1)}{M+1}

and

\mathrm{Var}[X] = \mathrm{Var}[Y].
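These properties of Y can likewise be checked exactly, e.g. (a Python sketch, illustrative names):

```python
from fractions import Fraction
from math import comb

def y_pmf(y, N, M, r):
    # pmf of Y = X + r, the number of draws needed to reach the r-th
    # failure, with M = N - K failures in the population.
    return Fraction(comb(y - 1, r - 1) * comb(N - y, M - r), comb(N, M))

N, M, r = 16, 6, 3
support = range(r, N - M + r + 1)
assert sum(y_pmf(y, N, M, r) for y in support) == 1
mean = sum(y * y_pmf(y, N, M, r) for y in support)
assert mean == Fraction(r * (N + 1), M + 1)  # E[Y] = r(N+1) / (M+1)
```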

Notes and References

  1. Negative hypergeometric distribution. Encyclopedia of Mathematics. https://www.encyclopediaofmath.org/index.php/Negative_hypergeometric_distribution
  2. Johnson, Norman L.; Kemp, Adrienne W.; Kotz, Samuel (2005). Univariate Discrete Distributions. Wiley. ISBN 0-471-27246-9. §6.2.2 (pp. 253–254).
  3. Rohatgi, Vijay K.; Saleh, A. K. Md. Ehsanes (2015). An Introduction to Probability and Statistics. John Wiley & Sons.
  4. Khan, R. A. (1994). A note on the generating function of a negative hypergeometric distribution. Sankhyā: The Indian Journal of Statistics B, 56(3), 309–313.