Gamma process explained

Also known as the (Moran-)Gamma Process, the gamma process is a random process studied in mathematics, statistics, probability theory, and stochastics. The gamma process is a stochastic or random process with independent gamma-distributed increments, where N(t) represents the number of event occurrences from time 0 to time t. The gamma distribution has shape parameter \gamma and rate parameter \lambda, often written as \Gamma(\gamma, \lambda). Both \gamma and \lambda must be greater than 0. The gamma process is often written as \Gamma(t; \gamma, \lambda), where t represents the time from 0. The process is a pure-jump increasing Lévy process with intensity measure

\nu(x) = \gamma x^{-1} \exp(-\lambda x),

for all positive x. Thus jumps whose size lies in the interval [x, x + dx) occur as a Poisson process with intensity \nu(x)\,dx. The parameter \gamma controls the rate of jump arrivals and the scaling parameter \lambda inversely controls the jump size. It is assumed that the process starts from a value 0 at t = 0, meaning N(0) = 0.

The gamma process is sometimes also parameterised in terms of the mean (\mu) and variance (v) of the increase per unit time, which is equivalent to \gamma = \mu^2/v and \lambda = \mu/v.
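
As a concrete illustration of this reparameterisation, here is a minimal Python sketch (the values of mu and v are arbitrary assumptions, not from the source) that converts a mean/variance specification into shape and rate parameters and checks that the mean and variance per unit time are recovered.

```python
# Hypothetical mean and variance of the increase per unit time.
mu, v = 2.0, 0.5

# Equivalent shape/rate parameterisation: gamma = mu^2 / v, lambda = mu / v.
gamma_ = mu**2 / v
lam = mu / v

# Sanity check: mean gamma/lambda and variance gamma/lambda^2 per unit time.
print(gamma_ / lam)     # 2.0, equals mu
print(gamma_ / lam**2)  # 0.5, equals v
```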

Plain English definition

The gamma process is a process which measures the number of occurrences of independent gamma-distributed variables over a span of time. The image below displays two different gamma processes from time 0 until time 4. The red process has more occurrences in the timeframe compared to the blue process because its shape parameter is larger than the blue shape parameter.
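
Since the figure itself is not reproduced here, the following Python sketch (using NumPy; all parameter values are illustrative assumptions, not taken from the original figure) simulates two such gamma processes on [0, 4] by summing independent gamma-distributed increments, one with a larger shape parameter than the other.

```python
import numpy as np

rng = np.random.default_rng(0)

def gamma_process_path(gamma, lam, t_max=4.0, n_steps=400):
    """Simulate a gamma process on [0, t_max] via independent increments.

    Each increment over a step of length dt is Gamma(shape=gamma*dt, rate=lam);
    NumPy uses a scale parameter, so scale = 1/lam.
    """
    dt = t_max / n_steps
    increments = rng.gamma(shape=gamma * dt, scale=1.0 / lam, size=n_steps)
    times = np.linspace(0.0, t_max, n_steps + 1)
    path = np.concatenate(([0.0], np.cumsum(increments)))  # X_0 = 0
    return times, path

# "Red" process: larger shape parameter, so more jump activity.
t, red = gamma_process_path(gamma=5.0, lam=2.0)
# "Blue" process: smaller shape parameter.
t, blue = gamma_process_path(gamma=1.0, lam=2.0)

print(red[-1], blue[-1])  # the red path typically ends far above the blue one
```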

Properties

We use the Gamma function in these properties, so the reader should distinguish between \Gamma(z) (the Gamma function) and \Gamma(t; \gamma, \lambda) (the Gamma process). We will sometimes abbreviate the process as X_t \equiv \Gamma(t; \gamma, \lambda).

Some basic properties of the gamma process are:

Marginal distribution

The marginal distribution of a gamma process at time t is a gamma distribution with mean \gamma t/\lambda and variance \gamma t/\lambda^2. That is, the probability distribution f of the random variable X_t is given by the density

f(x; t, \gamma, \lambda) = \frac{\lambda^{\gamma t}}{\Gamma(\gamma t)} x^{\gamma t - 1} e^{-\lambda x}.
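
As a quick check of this marginal law, the sketch below (assuming SciPy is available; the values of t, \gamma and \lambda are arbitrary) compares the stated density with scipy.stats.gamma, which uses shape a = \gamma t and scale 1/\lambda, and verifies the mean and variance formulas.

```python
import numpy as np
from scipy.stats import gamma as gamma_dist
from scipy.special import gammaln

t, g, lam = 2.0, 3.0, 1.5   # arbitrary illustrative parameters

def density(x):
    # f(x; t, gamma, lambda) = lambda^(gamma t)/Gamma(gamma t) * x^(gamma t - 1) * exp(-lambda x)
    a = g * t
    return np.exp(a * np.log(lam) - gammaln(a) + (a - 1) * np.log(x) - lam * x)

x = np.linspace(0.1, 10.0, 5)
rv = gamma_dist(a=g * t, scale=1.0 / lam)

print(np.allclose(density(x), rv.pdf(x)))  # True
print(rv.mean(), g * t / lam)              # both equal gamma*t/lambda
print(rv.var(), g * t / lam**2)            # both equal gamma*t/lambda^2
```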

Scaling

Multiplication of a gamma process by a scalar constant \alpha is again a gamma process, with a different mean increase rate:

\alpha \Gamma(t; \gamma, \lambda) \simeq \Gamma(t; \gamma, \lambda/\alpha)
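
A Monte Carlo sanity check of the scaling property (a sketch assuming NumPy; the chosen \alpha, \gamma, \lambda and t are arbitrary): samples of \Gamma(t; \gamma, \lambda) scaled by \alpha should match samples of \Gamma(t; \gamma, \lambda/\alpha) in distribution.

```python
import numpy as np

rng = np.random.default_rng(1)
g, lam, t, alpha = 2.0, 3.0, 1.5, 4.0
n = 200_000

# alpha * X_t, where X_t ~ Gamma(shape=gamma*t, rate=lambda)
scaled = alpha * rng.gamma(shape=g * t, scale=1.0 / lam, size=n)
# Y_t ~ Gamma(shape=gamma*t, rate=lambda/alpha)
direct = rng.gamma(shape=g * t, scale=alpha / lam, size=n)

# Means and variances should agree up to Monte Carlo error.
print(scaled.mean(), direct.mean())  # both approx alpha*gamma*t/lambda
print(scaled.var(), direct.var())    # both approx alpha^2*gamma*t/lambda^2
```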

Adding independent processes

The sum of two independent gamma processes is again a gamma process:

\Gamma(t; \gamma_1, \lambda) + \Gamma(t; \gamma_2, \lambda) \simeq \Gamma(t; \gamma_1 + \gamma_2, \lambda)
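
Similarly, the additivity property can be checked by simulation (a sketch assuming NumPy and SciPy; the parameters are illustrative): the sum of independent Gamma(\gamma_1 t, \lambda) and Gamma(\gamma_2 t, \lambda) samples should follow a Gamma((\gamma_1 + \gamma_2) t, \lambda) law.

```python
import numpy as np
from scipy.stats import kstest, gamma as gamma_dist

rng = np.random.default_rng(2)
g1, g2, lam, t = 1.0, 2.5, 2.0, 1.5
n = 100_000

x = rng.gamma(shape=g1 * t, scale=1.0 / lam, size=n)
y = rng.gamma(shape=g2 * t, scale=1.0 / lam, size=n)

# Kolmogorov-Smirnov test against Gamma((g1+g2)*t, rate=lam); a large p-value is expected.
print(kstest(x + y, gamma_dist(a=(g1 + g2) * t, scale=1.0 / lam).cdf).pvalue)
```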

Moments

The moment function gives the expected values, variances, skewness, and kurtosis of the process:

\operatorname{E}(X_t^n) = \lambda^{-n} \frac{\Gamma(\gamma t + n)}{\Gamma(\gamma t)}, \qquad n \geq 0,

where \Gamma(z) is the Gamma function.
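
The moment formula can be verified numerically (a sketch assuming NumPy and SciPy; t, \gamma, \lambda and the moment order are arbitrary choices) by comparing \lambda^{-n}\,\Gamma(\gamma t + n)/\Gamma(\gamma t) with a Monte Carlo estimate of E(X_t^n).

```python
import numpy as np
from scipy.special import gamma as gamma_fn

rng = np.random.default_rng(3)
g, lam, t = 2.0, 3.0, 1.5
n_moment = 3

# Closed form: E(X_t^n) = lambda^(-n) * Gamma(gamma*t + n) / Gamma(gamma*t)
exact = lam**(-n_moment) * gamma_fn(g * t + n_moment) / gamma_fn(g * t)

# Monte Carlo estimate from the marginal Gamma(gamma*t, rate=lambda).
samples = rng.gamma(shape=g * t, scale=1.0 / lam, size=500_000)
print(exact, (samples**n_moment).mean())  # should agree to within a few percent
```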

Moment generating function

The moment generating function is the expected value of \exp(\theta X_t), where X_t is the random variable:

\operatorname{E}(\exp(\theta X_t)) = \left(1 - \frac{\theta}{\lambda}\right)^{-\gamma t}, \qquad \theta < \lambda
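
The moment generating function can likewise be checked by simulation (a sketch assuming NumPy; \theta must satisfy \theta < \lambda, and all values are illustrative).

```python
import numpy as np

rng = np.random.default_rng(4)
g, lam, t, theta = 2.0, 3.0, 1.5, 1.0   # theta < lam

# Closed form: E(exp(theta * X_t)) = (1 - theta/lambda)^(-gamma*t)
exact = (1.0 - theta / lam) ** (-g * t)

samples = rng.gamma(shape=g * t, scale=1.0 / lam, size=500_000)
print(exact, np.exp(theta * samples).mean())  # close up to Monte Carlo error
```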

Correlation

Correlation displays the statistical relationship between the values of a single gamma process at two different times:

\operatorname{Corr}(X_s, X_t) = \sqrt{\frac{s}{t}}, \qquad s < t,

for any gamma process X(t).
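
The correlation formula can be verified by simulating the process at two times s < t using independent increments (a sketch assuming NumPy; s, t, \gamma and \lambda are arbitrary): take X_s and set X_t = X_s + an independent Gamma(\gamma(t - s), \lambda) increment.

```python
import numpy as np

rng = np.random.default_rng(5)
g, lam = 2.0, 3.0
s, t = 1.0, 4.0
n = 500_000

x_s = rng.gamma(shape=g * s, scale=1.0 / lam, size=n)
increment = rng.gamma(shape=g * (t - s), scale=1.0 / lam, size=n)
x_t = x_s + increment

print(np.corrcoef(x_s, x_t)[0, 1], np.sqrt(s / t))  # both approximately 0.5
```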

The gamma process is used as the distribution for random time change in the variance gamma process.

Literature