Generalized inverse Gaussian distribution explained

In probability theory and statistics, the generalized inverse Gaussian distribution (GIG) is a three-parameter family of continuous probability distributions with probability density function

f(x)=\frac{(a/b)^{p/2}}{2K_p(\sqrt{ab})}\,x^{p-1}e^{-(ax+b/x)/2},\qquad x>0,

where K_p is a modified Bessel function of the second kind, a > 0, b > 0, and p is a real parameter. It is used extensively in geostatistics, statistical linguistics, finance, and other fields. This distribution was first proposed by Étienne Halphen.[1] [2] [3] It was rediscovered and popularised by Ole Barndorff-Nielsen, who called it the generalized inverse Gaussian distribution. Its statistical properties are discussed in Bent Jørgensen's lecture notes.[4]
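As a quick numerical sanity check of the density above, the sketch below (using SciPy's `kv` Bessel function, with illustrative parameter values) confirms that it integrates to 1 over (0, ∞):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import kv  # modified Bessel function of the second kind, K_p

# GIG density exactly as defined above; parameter values are illustrative.
def gig_pdf(x, a, b, p):
    norm = (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b)))
    return norm * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

a, b, p = 2.0, 3.0, -0.5
total, _ = quad(gig_pdf, 0, np.inf, args=(a, b, p))
print(round(total, 6))  # 1.0
```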

Properties

Alternative parametrization

By setting \theta=\sqrt{ab} and \eta=\sqrt{b/a}, we can alternatively express the GIG distribution as

f(x)=\frac{1}{2\eta K_p(\theta)}\left(\frac{x}{\eta}\right)^{p-1}e^{-\theta(x/\eta+\eta/x)/2},

where \theta is the concentration parameter while \eta is the scaling parameter.
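SciPy's `geninvgauss` distribution uses exactly this (\theta, \eta) parametrization, with \theta passed as the shape argument `b` and \eta as `scale`. The sketch below (illustrative parameter values) checks that mapping against the (a, b, p) density:

```python
import numpy as np
from scipy.special import kv
from scipy.stats import geninvgauss

# Illustrative parameters in the (a, b, p) form.
a, b, p = 2.0, 3.0, 1.5
theta, eta = np.sqrt(a * b), np.sqrt(b / a)  # concentration and scaling

# Density in the original (a, b, p) parametrization, for comparison.
x = 1.7
direct = (a / b) ** (p / 2) / (2 * kv(p, theta)) * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

# SciPy's geninvgauss uses the (theta, eta) form: theta is the shape
# argument `b`, eta enters as `scale`.
scipy_val = geninvgauss.pdf(x, p, theta, scale=eta)
print(np.isclose(direct, scipy_val))  # True
```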

Summation

Barndorff-Nielsen and Halgreen proved that the GIG distribution is infinitely divisible.[5]

Entropy

The entropy of the generalized inverse Gaussian distribution is given as

\begin{align}
H=\frac{1}{2}\log\left(\frac{b}{a}\right)&{}+\log\left(2K_p\left(\sqrt{ab}\right)\right)-(p-1)\frac{\left[\frac{dK_\nu\left(\sqrt{ab}\right)}{d\nu}\right]_{\nu=p}}{K_p\left(\sqrt{ab}\right)}\\
&{}+\frac{\sqrt{ab}}{2K_p\left(\sqrt{ab}\right)}\left(K_{p+1}\left(\sqrt{ab}\right)+K_{p-1}\left(\sqrt{ab}\right)\right)
\end{align}

where

\left[\frac{d}{d\nu}K_\nu\left(\sqrt{ab}\right)\right]_{\nu=p}

is the derivative of the modified Bessel function of the second kind with respect to its order \nu, evaluated at \nu=p.
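The order derivative of K_\nu has no simple closed form, but it can be approximated by central differences, which allows a numerical check of the entropy expression against direct integration of -f \log f. A sketch with illustrative parameters:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import kv

a, b, p = 2.0, 3.0, 0.7   # illustrative parameters
w = np.sqrt(a * b)

# Order derivative of K_nu at nu = p, by central differences.
h = 1e-5
dK_dnu = (kv(p + h, w) - kv(p - h, w)) / (2 * h)

# Closed-form entropy from the expression above.
H_formula = (0.5 * np.log(b / a) + np.log(2 * kv(p, w))
             - (p - 1) * dK_dnu / kv(p, w)
             + w / (2 * kv(p, w)) * (kv(p + 1, w) + kv(p - 1, w)))

# Direct numerical check: H = -E[log f(X)], working in log space
# so the integrand underflows cleanly to zero near the origin.
def logpdf(x):
    return ((p / 2) * np.log(a / b) - np.log(2 * kv(p, w))
            + (p - 1) * np.log(x) - (a * x + b / x) / 2)

H_numeric, _ = quad(lambda x: -np.exp(logpdf(x)) * logpdf(x), 0, np.inf)
print(np.isclose(H_formula, H_numeric, atol=1e-4))  # True
```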

Characteristic function

The characteristic function of a random variable X\sim\operatorname{GIG}(p,a,b) is given as (for a derivation of the characteristic function, see the supplementary materials of [6])

E\left(e^{itX}\right)=\left(\frac{a}{a-2it}\right)^{p/2}\frac{K_p\left(\sqrt{(a-2it)b}\right)}{K_p\left(\sqrt{ab}\right)}

for t\in\mathbb{R}, where i denotes the imaginary unit.
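Since SciPy's `kv` accepts complex arguments, the closed form can be compared against a direct numerical Fourier integral of the density. A sketch with illustrative parameters:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import kv  # accepts complex arguments

a, b, p = 2.0, 3.0, -0.5  # illustrative parameters
w = np.sqrt(a * b)

def gig_pdf(x):
    return (a / b) ** (p / 2) / (2 * kv(p, w)) * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

t = 0.8
# Closed form, with principal branches of the square root and power
# (valid since Re(a - 2it) > 0).
phi = (a / (a - 2j * t)) ** (p / 2) * kv(p, np.sqrt((a - 2j * t) * b)) / kv(p, w)

# Direct numerical Fourier integral E[e^{itX}].
re, _ = quad(lambda x: np.cos(t * x) * gig_pdf(x), 0, np.inf)
im, _ = quad(lambda x: np.sin(t * x) * gig_pdf(x), 0, np.inf)
print(np.isclose(phi, re + 1j * im))  # True
```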

Related distributions

Special cases

The inverse Gaussian and gamma distributions are special cases of the generalized inverse Gaussian distribution for p = −1/2 and b = 0, respectively. Specifically, an inverse Gaussian distribution of the form

f(x;\mu,\lambda)=\left[\frac{\lambda}{2\pi x^3}\right]^{1/2}\exp\left(-\frac{\lambda(x-\mu)^2}{2\mu^2x}\right)

is a GIG with a=\lambda/\mu^2, b=\lambda, and p=-1/2. A Gamma distribution of the form

g(x;\alpha,\beta)=\frac{\beta^\alpha}{\Gamma(\alpha)}\,x^{\alpha-1}e^{-\beta x}

is a GIG with a=2\beta, b=0, and p=\alpha.

Other special cases include the inverse-gamma distribution, for a = 0.
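These reductions can be checked numerically against SciPy's `invgauss` and `gamma` distributions; in the sketch below (illustrative parameters), the b \to 0 limit is approximated with a very small b:

```python
import numpy as np
from scipy.special import kv
from scipy.stats import gamma, geninvgauss, invgauss

def gig_pdf(x, a, b, p):
    return (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b))) * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

x = 0.9

# Inverse Gaussian IG(mu, lam) as GIG(a = lam/mu^2, b = lam, p = -1/2);
# mu and lam are illustrative. SciPy's invgauss(mu/lam, scale=lam) is IG(mu, lam).
mu, lam = 1.3, 2.0
ig_check = np.isclose(gig_pdf(x, lam / mu**2, lam, -0.5),
                      invgauss.pdf(x, mu / lam, scale=lam))
print(ig_check)  # True

# Gamma(alpha, beta) as the b -> 0 limit of GIG(a = 2*beta, b, p = alpha),
# approximated here with a very small b.
alpha, beta = 2.5, 1.5
small_b = 1e-12
theta, eta = np.sqrt(2 * beta * small_b), np.sqrt(small_b / (2 * beta))
gamma_check = np.isclose(gamma.pdf(x, alpha, scale=1 / beta),
                         geninvgauss.pdf(x, alpha, theta, scale=eta))
print(gamma_check)  # True
```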

Conjugate prior for Gaussian

The GIG distribution is conjugate to the normal distribution when serving as the mixing distribution in a normal variance-mean mixture.[7] [8] Let the prior distribution for some hidden variable, say z, be GIG:

P(z\mid a,b,p)=\operatorname{GIG}(z\mid a,b,p)

and let there be T observed data points, X=x_1,\ldots,x_T, with normal likelihood function, conditioned on z:

P(X\mid z,\alpha,\beta)=\prod_{i=1}^{T}N(x_i\mid\alpha+\beta z,z)

where N(x\mid\mu,v) is the normal distribution, with mean \mu and variance v. Then the posterior for z, given the data, is also GIG:

P(z\mid X,a,b,p,\alpha,\beta)=\operatorname{GIG}\left(z\mid a+T\beta^2,\,b+S,\,p-\frac{T}{2}\right)

where S=\sum_{i=1}^{T}(x_i-\alpha)^2.[9]
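The conjugacy can be checked numerically: the ratio of the un-normalized posterior (prior times likelihood) to the GIG density with the updated parameters must be constant in z. A sketch with simulated data and illustrative parameters:

```python
import numpy as np
from scipy.stats import geninvgauss, norm

rng = np.random.default_rng(0)

# Illustrative prior and likelihood parameters, and simulated observations.
a, b, p = 2.0, 3.0, 0.5
alpha, beta = 0.3, 1.2
T = 5
z_true = 1.1
X = rng.normal(alpha + beta * z_true, np.sqrt(z_true), size=T)

S = np.sum((X - alpha) ** 2)
# Posterior parameters from the conjugacy result above.
a_post, b_post, p_post = a + T * beta**2, b + S, p - T / 2

def gig_pdf(z, a, b, p):
    # geninvgauss uses the (theta, eta) form: theta = sqrt(ab), eta = sqrt(b/a).
    return geninvgauss.pdf(z, p, np.sqrt(a * b), scale=np.sqrt(b / a))

def unnorm_post(z):
    like = np.prod(norm.pdf(X, alpha + beta * z, np.sqrt(z)))
    return gig_pdf(z, a, b, p) * like

# Ratio of un-normalized posterior to GIG(a_post, b_post, p_post),
# evaluated at two points; it must be the same constant.
z1, z2 = 0.6, 1.8
r1 = unnorm_post(z1) / gig_pdf(z1, a_post, b_post, p_post)
r2 = unnorm_post(z2) / gig_pdf(z2, a_post, b_post, p_post)
print(np.isclose(r1, r2))  # True
```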

Sichel distribution

The Sichel distribution[10] [11] results when the GIG is used as the mixing distribution for the Poisson parameter \lambda.
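As an illustration (a sketch with illustrative GIG parameters), the Sichel probability mass function can be computed by numerically integrating the Poisson pmf against the GIG mixing density; the resulting probabilities sum to 1:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import geninvgauss, poisson

# Illustrative GIG mixing parameters in the (a, b, p) form.
a, b, p = 2.0, 3.0, -0.5
theta, eta = np.sqrt(a * b), np.sqrt(b / a)

def sichel_pmf(k):
    # P(K = k) = integral of Poisson(k | lam) * GIG(lam) over lam.
    integrand = lambda lam: poisson.pmf(k, lam) * geninvgauss.pdf(lam, p, theta, scale=eta)
    val, _ = quad(integrand, 0, np.inf)
    return val

# The pmf sums to 1 (checked on a generous range of k).
total = sum(sichel_pmf(k) for k in range(60))
print(round(total, 4))  # 1.0
```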

Notes and References

  1. Seshadri, V. (1997). "Halphen's laws". In Kotz, S.; Read, C. B.; Banks, D. L. (eds.), Encyclopedia of Statistical Sciences, Update Volume 1. New York: Wiley. pp. 302–306.
  2. Perreault, L.; Bobée, B.; Rasmussen, P. F. (1999). "Halphen Distribution System. I: Mathematical and Statistical Properties". Journal of Hydrologic Engineering 4 (3): 189. doi:10.1061/(ASCE)1084-0699(1999)4:3(189).
  3. Étienne Halphen was the grandson of the mathematician Georges Henri Halphen.
  4. Jørgensen, Bent (1982). Statistical Properties of the Generalized Inverse Gaussian Distribution. Lecture Notes in Statistics 9. New York–Berlin: Springer-Verlag. ISBN 0-387-90665-7.
  5. Barndorff-Nielsen, O.; Halgreen, Christian (1977). "Infinite Divisibility of the Hyperbolic and Generalized Inverse Gaussian Distributions". Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete.
  6. Pal, Subhadip; Gaskins, Jeremy (2022). "Modified Pólya-Gamma data augmentation for Bayesian analysis of directional data". Journal of Statistical Computation and Simulation 92 (16): 3430–3451. doi:10.1080/00949655.2022.2067853.
  7. Karlis, Dimitris (2002). "An EM type algorithm for maximum likelihood estimation of the normal–inverse Gaussian distribution". Statistics & Probability Letters 57: 43–52.
  8. Barndorff-Nielsen, O. E. (1997). "Normal Inverse Gaussian Distributions and stochastic volatility modelling". Scandinavian Journal of Statistics 24: 1–13.
  9. Due to the conjugacy, these details can be derived without solving integrals, by noting that P(z\mid X,a,b,p,\alpha,\beta)\propto P(z\mid a,b,p)\,P(X\mid z,\alpha,\beta). Omitting all factors independent of z, the right-hand side can be simplified to give an un-normalized GIG distribution, from which the posterior parameters can be identified.
  10. Sichel, Herbert S. (1975). "On a distribution law for word frequencies". Journal of the American Statistical Association 70 (351a): 542–547.
  11. Stein, Gillian Z.; Zucchini, Walter; Juritz, June M. (1987). "Parameter estimation for the Sichel distribution and its multivariate extension". Journal of the American Statistical Association 82 (399): 938–944.