Zeta distribution explained

In probability theory and statistics, the zeta distribution is a discrete probability distribution. If X is a zeta-distributed random variable with parameter s, then the probability that X takes the integer value k is given by the probability mass function

f_s(k) = \frac{k^{-s}}{\zeta(s)},

where ζ(s) is the Riemann zeta function (which is undefined for s = 1).

The multiplicities of distinct prime factors of X are independent random variables.

Since the Riemann zeta function is the sum of the terms k^{-s} over all positive integers k, it appears as the normalization of the Zipf distribution. The terms "Zipf distribution" and "zeta distribution" are often used interchangeably, but while the zeta distribution is a probability distribution in its own right, it is not the same as Zipf's law with the same exponent, which is supported on only finitely many integers.

Definition

The zeta distribution is defined for positive integers k \geq 1, and its probability mass function is given by

P(X = k) = \frac{1}{\zeta(s)} k^{-s},

where s > 1 is the parameter and \zeta(s) is the Riemann zeta function.
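As a quick numerical check, the following is a minimal Python sketch of this probability mass function, assuming SciPy is available; the helper name zeta_pmf is illustrative, not a library function.

import numpy as np
from scipy.special import zeta  # Riemann zeta function for real s > 1

def zeta_pmf(k, s):
    """P(X = k) = k**(-s) / zeta(s) for integer k >= 1 and parameter s > 1."""
    k = np.asarray(k, dtype=float)
    return k ** (-s) / zeta(s)

# Example: for s = 2, zeta(2) = pi^2/6, so P(X = 1) = 6/pi^2 ≈ 0.6079.
print(zeta_pmf([1, 2, 3], 2.0))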

The cumulative distribution function is given by

P(X \leq k) = \frac{H_{k,s}}{\zeta(s)},

where H_{k,s} is the generalized harmonic number

H_{k,s} = \sum_{i=1}^{k} \frac{1}{i^s}.
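Continuing the sketch above (same assumptions, illustrative name zeta_cdf), the cumulative distribution function can be evaluated through the generalized harmonic number:

import numpy as np
from scipy.special import zeta

def zeta_cdf(k, s):
    """P(X <= k) = H_{k,s} / zeta(s), with H_{k,s} the generalized harmonic number."""
    harmonic = np.sum(1.0 / np.arange(1, k + 1, dtype=float) ** s)
    return harmonic / zeta(s)

# The CDF approaches 1 as k grows; for s = 2:
print([round(zeta_cdf(k, 2.0), 4) for k in (1, 10, 100, 1000)])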

Moments

The nth raw moment is defined as the expected value of X^n:

m_n = E(X^n) = \frac{1}{\zeta(s)} \sum_{k=1}^{\infty} \frac{1}{k^{s-n}}.

The series on the right is just a series representation of the Riemann zeta function, but it only converges for values of s - n that are greater than unity. Thus:

m_n = \begin{cases} \zeta(s-n)/\zeta(s) & \text{for } n < s-1 \\ \infty & \text{for } n \geq s-1 \end{cases}

The ratio of the zeta functions is well-defined, even for n > s - 1 because the series representation of the zeta function can be analytically continued. This does not change the fact that the moments are specified by the series itself, and are therefore undefined for large n.
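As an illustration of the finite case, the following sketch (again assuming SciPy) estimates the mean (n = 1) for s = 3 from a truncated series and compares it with the closed form \zeta(s-n)/\zeta(s):

import numpy as np
from scipy.special import zeta

s, n = 3.0, 1  # a finite moment requires n < s - 1
k = np.arange(1, 200_000, dtype=float)

truncated = np.sum(k ** (n - s)) / zeta(s)   # partial sum of E(X^n)
closed_form = zeta(s - n) / zeta(s)          # zeta(2) / zeta(3)

print(truncated, closed_form)  # the two values agree to several decimal places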

Moment generating function

The moment generating function is defined as

M(t;s) = E(e^{tX}) = \frac{1}{\zeta(s)} \sum_{k=1}^{\infty} \frac{e^{tk}}{k^s}.

The series is just the definition of the polylogarithm, valid for e^t < 1, so that

M(t;s) = \frac{\operatorname{Li}_s(e^t)}{\zeta(s)} \quad \text{for } t < 0.

Since this does not converge on an open interval containing t = 0, the moment generating function does not exist.
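For t < 0 the two expressions above can be checked against each other numerically; this sketch assumes the mpmath library, whose polylog and zeta functions supply the closed form.

import mpmath as mp

def mgf_series(t, s, terms=200):
    """Truncated series (1/zeta(s)) * sum_{k>=1} exp(t*k) / k**s, valid only for t < 0."""
    total = mp.mpf(0)
    for k in range(1, terms + 1):
        total += mp.exp(t * k) / mp.mpf(k) ** s
    return total / mp.zeta(s)

t, s = -0.5, 2.0
print(mgf_series(t, s))
print(mp.polylog(s, mp.exp(t)) / mp.zeta(s))  # closed form via the polylogarithm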

The case s = 1

ζ(1) is infinite, being the harmonic series, and so the case s = 1 is not meaningful. However, if A is any set of positive integers that has a density, i.e. if

\lim_{n\to\infty} \frac{N(A,n)}{n}

exists, where N(A, n) is the number of members of A less than or equal to n, then

\lim_{s\to 1^+} P(X \in A)

is equal to that density.
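For example, take A to be the set of multiples of an integer m; then P(X \in A) = \sum_{k \geq 1} (mk)^{-s}/\zeta(s) = m^{-s}, which tends to the density 1/m as s \to 1^+. A minimal check in Python:

# For A = multiples of m, P(X in A) = m**(-s); as s -> 1+ this tends to the
# natural density 1/m of A (here 1/3).
m = 3
for s in (2.0, 1.5, 1.1, 1.01, 1.001):
    print(s, m ** (-s))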

The latter limit can also exist in some cases in which A does not have a density. For example, if A is the set of all positive integers whose first digit is d, then A has no density, but nonetheless the second limit given above exists and is proportional to

\log(d+1) - \log(d) = \log\left(1 + \frac{1}{d}\right),

which is Benford's law.
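A numerical sketch of this limit, assuming mpmath: it sums k^{-s} over each block of integers with leading digit d via the Hurwitz zeta function and, for s close to 1, compares the result with the Benford proportion log10(1 + 1/d). The function name prob_leading_digit and the truncation to a fixed number of blocks are our own choices.

import mpmath as mp

def prob_leading_digit(d, s, blocks=300):
    """Approximate P(X has leading digit d) under the zeta distribution by summing
    k**(-s) over the blocks [d*10^j, (d+1)*10^j) for j = 0, ..., blocks-1."""
    total = mp.mpf(0)
    for j in range(blocks):
        a, b = mp.mpf(d) * 10 ** j, mp.mpf(d + 1) * 10 ** j
        total += mp.zeta(s, a) - mp.zeta(s, b)  # sum of k**(-s) for a <= k < b
    return total / mp.zeta(s)

s = mp.mpf("1.01")
for d in range(1, 10):
    print(d, mp.nstr(prob_leading_digit(d, s), 3), mp.nstr(mp.log10(1 + mp.mpf(1) / d), 3))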

Infinite divisibility

The zeta distribution can be constructed from a sequence of independent random variables with geometric distributions. Let p be a prime number and X(p^{-s}) be a random variable with a geometric distribution of parameter p^{-s}, namely

P\left(X(p^{-s}) = k\right) = p^{-ks}\left(1 - p^{-s}\right), \qquad k = 0, 1, 2, \ldots

If the random variables (X(p^{-s}))_p, indexed by the prime numbers p, are independent, then the random variable

Z_s = \prod_{p} p^{X(p^{-s})},

where the product runs over all primes, has the zeta distribution:

P\left(Z_s = n\right) = \frac{1}{n^s \zeta(s)}.

Stated differently, the random variable

\log(Z_s) = \sum_{p} X(p^{-s}) \log(p)

is infinitely divisible with Lévy measure given by the following sum of Dirac masses:

\Pi_s(dx) = \sum_{p} \sum_{k \geq 1} \frac{p^{-ks}}{k} \, \delta_{k\log(p)}(dx)
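A simulation sketch of this construction, assuming NumPy and SciPy; truncating the product to a finite list of primes is our simplification, so the empirical frequencies only approximately match the zeta probability mass function (to roughly two decimal places here).

import numpy as np
from scipy.special import zeta

rng = np.random.default_rng(0)
s = 2.0
primes = np.array([2, 3, 5, 7, 11, 13, 17, 19, 23, 29,
                   31, 37, 41, 43, 47, 53, 59, 61, 67, 71,
                   73, 79, 83, 89, 97])  # truncated list of primes

def sample_Z(size):
    """Draw Z_s = prod_p p**X_p, where X_p is geometric with parameter p**(-s),
    i.e. P(X_p = k) = p**(-k*s) * (1 - p**(-s)) for k = 0, 1, 2, ..."""
    Z = np.ones(size, dtype=np.int64)
    for p in primes:
        # NumPy's geometric counts trials >= 1, so subtract 1 to start the support at 0.
        X_p = rng.geometric(1.0 - p ** (-s), size=size) - 1
        Z *= p ** X_p.astype(np.int64)
    return Z

Z = sample_Z(200_000)
for n in (1, 2, 3, 4, 6):
    print(n, round(np.mean(Z == n), 3), round(n ** (-s) / zeta(s), 3))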

See also

Other "power-law" distributions
