Skellam distribution explained

The Skellam distribution is the discrete probability distribution of the difference N_1 - N_2 of two statistically independent random variables N_1 and N_2, each Poisson-distributed with respective expected values \mu_1 and \mu_2. It is useful in describing the statistics of the difference of two images with simple photon noise, as well as describing the point spread distribution in sports where all scored points are equal, such as baseball, hockey and soccer.

The distribution is also applicable to a special case of the difference of dependent Poisson random variables, but only the obvious case where the two variables have a common additive random contribution which is cancelled by the differencing: see Karlis & Ntzoufras (2003) for details and an application.

The probability mass function for the Skellam distribution for a difference K = N_1 - N_2 between two independent Poisson-distributed random variables with means \mu_1 and \mu_2 is given by:

p(k;\mu_1,\mu_2) = \Pr\{K=k\} = e^{-(\mu_1+\mu_2)} \left(\frac{\mu_1}{\mu_2}\right)^{k/2} I_k\left(2\sqrt{\mu_1\mu_2}\right)

where I_k(z) is the modified Bessel function of the first kind. Since k is an integer we have that I_k(z) = I_{|k|}(z).
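As a quick illustration (not part of the original article), the following Python sketch evaluates this probability mass function directly from the Bessel-function formula and cross-checks it against SciPy's built-in scipy.stats.skellam; NumPy and SciPy are assumed to be available.

import numpy as np
from scipy.special import iv          # modified Bessel function of the first kind, I_v(z)
from scipy.stats import skellam

def skellam_pmf(k, mu1, mu2):
    # p(k; mu1, mu2) = exp(-(mu1 + mu2)) * (mu1/mu2)**(k/2) * I_|k|(2*sqrt(mu1*mu2))
    k = np.asarray(k)
    return (np.exp(-(mu1 + mu2))
            * (mu1 / mu2) ** (k / 2.0)
            * iv(np.abs(k), 2.0 * np.sqrt(mu1 * mu2)))

mu1, mu2 = 3.0, 1.5                   # arbitrary example means
ks = np.arange(-10, 11)
assert np.allclose(skellam_pmf(ks, mu1, mu2), skellam.pmf(ks, mu1, mu2))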

Derivation

The probability mass function of a Poisson-distributed random variable with mean μ is given by

p(k;\mu) = \frac{\mu^k}{k!}\,e^{-\mu}

for k \ge 0 (and zero otherwise). The Skellam probability mass function for the difference of two independent counts K = N_1 - N_2 is the convolution of two Poisson distributions (Skellam, 1946):

\begin{align}
p(k;\mu_1,\mu_2) &= \sum_{n=-\infty}^{\infty} p(k+n;\mu_1)\,p(n;\mu_2) \\
&= e^{-(\mu_1+\mu_2)} \sum_{n=\max(0,-k)}^{\infty} \frac{\mu_1^{k+n}\,\mu_2^{n}}{n!\,(k+n)!}
\end{align}

Since the Poisson distribution is zero for negative values of the count (p(N<0;\mu)=0), the second sum is only taken for those terms where n \ge 0 and n+k \ge 0. It can be shown that the above sum implies that

p(k;\mu_1,\mu_2) = \left(\frac{\mu_1}{\mu_2}\right)^{k} p(-k;\mu_1,\mu_2)

so that:

p(k;\mu_1,\mu_2) = e^{-(\mu_1+\mu_2)} \left(\frac{\mu_1}{\mu_2}\right)^{k/2} I_{|k|}\left(2\sqrt{\mu_1\mu_2}\right)

where I_k(z) is the modified Bessel function of the first kind. The special case for \mu_1 = \mu_2 (= \mu) is given by Irwin (1937):

p\left(k;\mu,\mu\right) = e^{-2\mu} I_{|k|}(2\mu).

Using the limiting values of the modified Bessel function for small arguments, we can recover the Poisson distribution as a special case of the Skellam distribution for \mu_2 = 0.
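The derivation can be checked numerically. The sketch below (an illustration with arbitrary parameter values, again assuming SciPy) evaluates a truncated version of the convolution sum over n and compares it with the closed-form Bessel expression, and also shows the Poisson limit as \mu_2 \to 0.

import numpy as np
from scipy.stats import poisson, skellam

def skellam_pmf_by_convolution(k, mu1, mu2, n_max=200):
    # Truncated version of the sum over n = max(0, -k), max(0, -k) + 1, ...
    n = np.arange(max(0, -k), n_max)
    return np.sum(poisson.pmf(k + n, mu1) * poisson.pmf(n, mu2))

mu1, mu2 = 2.5, 4.0
for k in range(-8, 9):
    assert np.isclose(skellam_pmf_by_convolution(k, mu1, mu2),
                      skellam.pmf(k, mu1, mu2))

# mu2 -> 0: the Skellam pmf degenerates to an ordinary Poisson(mu1) pmf.
ks = np.arange(0, 12)
limit = [skellam_pmf_by_convolution(k, 3.0, 1e-12) for k in ks]
assert np.allclose(limit, poisson.pmf(ks, 3.0))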

Properties

As it is a discrete probability function, the Skellam probability mass function is normalized:

\sum_{k=-\infty}^{\infty} p(k;\mu_1,\mu_2) = 1.

We know that the probability generating function (pgf) for a Poisson distribution is:

G\left(t;\mu\right) = e^{\mu(t-1)}.

It follows that the pgf, G(t;\mu_1,\mu_2), for a Skellam probability mass function will be:

\begin{align}
G(t;\mu_1,\mu_2) &= \sum_{k=-\infty}^{\infty} p(k;\mu_1,\mu_2)\,t^k \\[4pt]
&= G\left(t;\mu_1\right) G\left(1/t;\mu_2\right) \\[4pt]
&= e^{-(\mu_1+\mu_2)+\mu_1 t+\mu_2/t}.
\end{align}
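As a sanity check (illustrative only; scipy.stats.skellam is not part of the original text), summing p(k;\mu_1,\mu_2) t^k over a sufficiently wide range of k reproduces this closed form.

import numpy as np
from scipy.stats import skellam

mu1, mu2, t = 2.0, 3.0, 0.7                  # arbitrary example values, 0 < t < 1
ks = np.arange(-60, 61)                      # wide enough to hold essentially all the mass
pgf_from_sum = np.sum(skellam.pmf(ks, mu1, mu2) * t ** ks.astype(float))
pgf_closed_form = np.exp(-(mu1 + mu2) + mu1 * t + mu2 / t)
assert np.isclose(pgf_from_sum, pgf_closed_form)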

Notice that the form of the probability-generating function implies that the distribution of the sums or the differences of any number of independent Skellam-distributed variables is again Skellam-distributed. It is sometimes claimed that any linear combination of two Skellam-distributed variables is again Skellam-distributed, but this is clearly not true since any multiplier other than \pm 1 would change the support of the distribution and alter the pattern of moments in a way that no Skellam distribution can satisfy.
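A small numerical sketch of this closure property, with arbitrary parameter values: if X ~ Skellam(a_1, a_2) and Y ~ Skellam(b_1, b_2) are independent, the pgf argument gives X + Y ~ Skellam(a_1 + b_1, a_2 + b_2) (and, likewise, X - Y ~ Skellam(a_1 + b_2, a_2 + b_1)). Below, the pmf of X + Y is built by direct discrete convolution and compared with that prediction.

import numpy as np
from scipy.stats import skellam

a1, a2, b1, b2 = 1.0, 2.0, 0.5, 3.0
support = np.arange(-40, 41)                 # truncated support holding essentially all the mass
px = skellam.pmf(support, a1, a2)
py = skellam.pmf(support, b1, b2)

pxy = np.convolve(px, py)                    # pmf of X + Y on the grid below
xy_support = np.arange(2 * support[0], 2 * support[-1] + 1)

assert np.allclose(pxy, skellam.pmf(xy_support, a1 + b1, a2 + b2), atol=1e-10)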

The moment-generating function is given by:

M\left(t;\mu_1,\mu_2\right) = G\left(e^t;\mu_1,\mu_2\right) = \sum_{k=0}^{\infty} \frac{t^k}{k!}\,m_k

which yields the raw moments m_k. Define:

\Delta \stackrel{\mathrm{def}}{=} \mu_1 - \mu_2,

\mu \stackrel{\mathrm{def}}{=} (\mu_1 + \mu_2)/2.

Then the raw moments m_k are

m_1 = \Delta

m_2 = 2\mu + \Delta^2

m_3 = \Delta(1 + 6\mu + \Delta^2)

The central moments M_k are

M_2 = 2\mu,

M_3 = \Delta,

M_4 = 2\mu + 12\mu^2.

The mean, variance, skewness, and kurtosis excess are respectively:

\begin{align}
\operatorname{E}(n) &= \Delta, \\[4pt]
\sigma^2 &= 2\mu, \\[4pt]
\gamma_1 &= \Delta/(2\mu)^{3/2}, \\[4pt]
\gamma_2 &= 1/(2\mu).
\end{align}
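These expressions can be compared directly against SciPy's implementation (an illustrative cross-check; scipy.stats.skellam reports the mean, variance, skewness and excess kurtosis).

import numpy as np
from scipy.stats import skellam

mu1, mu2 = 5.0, 2.0
delta, mu = mu1 - mu2, (mu1 + mu2) / 2.0

mean, var, skew, excess_kurt = skellam.stats(mu1, mu2, moments='mvsk')
assert np.isclose(mean, delta)                       # E(n)     = Delta
assert np.isclose(var, 2.0 * mu)                     # sigma^2  = 2*mu
assert np.isclose(skew, delta / (2.0 * mu) ** 1.5)   # gamma_1  = Delta/(2*mu)^(3/2)
assert np.isclose(excess_kurt, 1.0 / (2.0 * mu))     # gamma_2  = 1/(2*mu)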

The cumulant-generating function is given by:

K(t;\mu_1,\mu_2) \stackrel{\mathrm{def}}{=} \ln\left(M(t;\mu_1,\mu_2)\right) = \sum_{k=1}^{\infty} \frac{t^k}{k!}\,\kappa_k

which yields the cumulants:

\kappa_{2k} = 2\mu

\kappa_{2k+1} = \Delta.

For the special case when μ1 = μ2, an asymptotic expansion of the modified Bessel function of the first kind yields for large μ:

p(k;\mu,\mu) \sim \frac{1}{\sqrt{4\pi\mu}}\left[1 + \sum_{n=1}^{\infty} (-1)^n \frac{\{4k^2-1^2\}\{4k^2-3^2\}\cdots\{4k^2-(2n-1)^2\}}{n!\,2^{3n}(2\mu)^n}\right].

(Abramowitz & Stegun 1972, p. 377). Also, for this special case, when k is also large, and of order of the square root of 2μ, the distribution tends to a normal distribution:

p(k;\mu,\mu) \sim \frac{e^{-k^2/4\mu}}{\sqrt{4\pi\mu}}.

These special results can easily be extended to the more general case of different means.
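The sketch below (illustrative only, using scipy.stats.skellam) compares the exact pmf with the Gaussian form above for a fairly large \mu and k within a few standard deviations of zero.

import numpy as np
from scipy.stats import skellam

mu = 400.0
sd = np.sqrt(2 * mu)                              # standard deviation of Skellam(mu, mu)
ks = np.arange(-int(3 * sd), int(3 * sd) + 1)
exact = skellam.pmf(ks, mu, mu)
gaussian = np.exp(-ks**2 / (4.0 * mu)) / np.sqrt(4.0 * np.pi * mu)
print(np.max(np.abs(exact - gaussian) / exact))   # small relative error, on the order of 1e-3 here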

Bounds on weight above zero

If X \sim \operatorname{Skellam}(\mu_1, \mu_2), with \mu_1 < \mu_2, then

\frac{\exp\left(-(\sqrt{\mu_1}-\sqrt{\mu_2})^2\right)}{(\mu_1+\mu_2)^2} - \frac{e^{-(\mu_1+\mu_2)}}{2\sqrt{\mu_1\mu_2}} - \frac{e^{-(\mu_1+\mu_2)}}{4\mu_1\mu_2} \leq \Pr\{X \geq 0\} \leq \exp\left(-(\sqrt{\mu_1}-\sqrt{\mu_2})^2\right)

Details can be found in Poisson distribution#Poisson races
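An illustrative numerical check of the upper bound (the sketch below assumes scipy.stats.skellam, which is not part of the original text); the weight above zero is obtained from the cumulative distribution function.

import numpy as np
from scipy.stats import skellam

for mu1, mu2 in [(1.0, 4.0), (2.0, 8.0), (0.5, 10.0)]:     # arbitrary pairs with mu1 < mu2
    weight_above_zero = 1.0 - skellam.cdf(-1, mu1, mu2)     # Pr{X >= 0}
    upper_bound = np.exp(-(np.sqrt(mu1) - np.sqrt(mu2)) ** 2)
    assert weight_above_zero <= upper_bound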
