Generalized inverse Gaussian distribution explained
In probability theory and statistics, the generalized inverse Gaussian distribution (GIG) is a three-parameter family of continuous probability distributions with probability density function
f(x) = \frac{(a/b)^{p/2}}{2 K_p\left(\sqrt{ab}\right)} x^{p-1} e^{-(ax + b/x)/2}, \qquad x > 0,

where K_p is a modified Bessel function of the second kind, a > 0, b > 0 and p is a real parameter. It is used extensively in geostatistics, statistical linguistics, finance, etc. This distribution was first proposed by Étienne Halphen.[1] [2] [3] It was rediscovered and popularised by Ole Barndorff-Nielsen, who called it the generalized inverse Gaussian distribution. Its statistical properties are discussed in Bent Jørgensen's lecture notes.[4]
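As a quick numerical illustration (not part of the original sources), the density above can be evaluated directly with SciPy's modified Bessel function; the parameter values in the sketch below are arbitrary and serve only to check that the normalizing constant makes the density integrate to one.

import numpy as np
from scipy.integrate import quad
from scipy.special import kv   # modified Bessel function of the second kind, K_nu

def gig_pdf(x, a, b, p):
    # GIG density in the (a, b, p) parametrization used above
    norm = (a / b) ** (p / 2) / (2.0 * kv(p, np.sqrt(a * b)))
    return norm * x ** (p - 1) * np.exp(-(a * x + b / x) / 2.0)

a, b, p = 2.0, 3.0, 0.7                     # arbitrary illustrative parameters
total, _ = quad(gig_pdf, 0.0, np.inf, args=(a, b, p))
print(total)                                # should be close to 1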
Properties
Alternative parametrization
By setting \theta = \sqrt{ab} and \eta = \sqrt{b/a}, we can alternatively express the GIG distribution as

f(x) = \frac{1}{2\eta K_p(\theta)} \left(\frac{x}{\eta}\right)^{p-1} e^{-\theta(x/\eta + \eta/x)/2},

where \theta is the concentration parameter and \eta is the scaling parameter.
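This (\theta, \eta) form lines up with the parametrization used by SciPy's scipy.stats.geninvgauss, whose documented standardized density is x^{p-1} e^{-b(x + 1/x)/2} / (2 K_p(b)); under that assumption, \theta plays the role of SciPy's b argument and \eta is the scale. A sketch of the correspondence, with arbitrary illustrative parameters:

import numpy as np
from scipy.special import kv
from scipy.stats import geninvgauss

a, b, p = 2.0, 3.0, 0.7                     # arbitrary illustrative parameters
theta, eta = np.sqrt(a * b), np.sqrt(b / a) # concentration and scale

# SciPy's geninvgauss(p, b) uses the standardized density
# x**(p-1) * exp(-b*(x + 1/x)/2) / (2*K_p(b)); theta plays the role of b here.
dist = geninvgauss(p, theta, scale=eta)

x = 1.3
manual = (a / b) ** (p / 2) / (2 * kv(p, theta)) * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)
print(dist.pdf(x), manual)                  # the two values should match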
Summation
Barndorff-Nielsen and Halgreen proved that the GIG distribution is infinitely divisible.[5]
Entropy
The entropy of the generalized inverse Gaussian distribution is given as
H = \frac{1}{2}\log\left(\frac{b}{a}\right) + \log\left(2 K_p\left(\sqrt{ab}\right)\right) - (p-1) \frac{\left[\frac{d}{d\nu} K_\nu\left(\sqrt{ab}\right)\right]_{\nu=p}}{K_p\left(\sqrt{ab}\right)} + \frac{\sqrt{ab}}{2 K_p\left(\sqrt{ab}\right)} \left(K_{p+1}\left(\sqrt{ab}\right) + K_{p-1}\left(\sqrt{ab}\right)\right)

where \left[\frac{d}{d\nu} K_\nu\left(\sqrt{ab}\right)\right]_{\nu=p} is the derivative of the modified Bessel function of the second kind with respect to the order \nu, evaluated at \nu = p.
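A numerical check of this expression is sketched below (with arbitrary illustrative parameters): the order derivative of K_\nu is approximated by a central finite difference, which is an approximation introduced here purely for illustration, and the closed-form value is compared with a Monte Carlo estimate of -E[\log f(X)].

import numpy as np
from scipy.special import kv
from scipy.stats import geninvgauss

a, b, p = 2.0, 3.0, 0.7                     # arbitrary illustrative parameters
w = np.sqrt(a * b)

# Order derivative d/dnu K_nu(w) at nu = p, via a central finite difference
eps = 1e-6
dK = (kv(p + eps, w) - kv(p - eps, w)) / (2 * eps)

H = (0.5 * np.log(b / a)
     + np.log(2 * kv(p, w))
     - (p - 1) * dK / kv(p, w)
     + w / (2 * kv(p, w)) * (kv(p + 1, w) + kv(p - 1, w)))

# Monte Carlo estimate of -E[log f(X)] for comparison
dist = geninvgauss(p, w, scale=np.sqrt(b / a))
samples = dist.rvs(size=200_000, random_state=0)
print(H, -np.mean(dist.logpdf(samples)))    # the two values should be close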
Characteristic function
The characteristic function of a random variable X \sim \operatorname{GIG}(a, b, p) is given as (for a derivation of the characteristic function, see the supplementary materials of [6])

E\left(e^{itX}\right) = \left(\frac{a}{a - 2it}\right)^{p/2} \frac{K_p\left(\sqrt{(a - 2it)b}\right)}{K_p\left(\sqrt{ab}\right)}

for t \in \R, where i denotes the imaginary unit.
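The closed form can be checked against a Monte Carlo average of e^{itX}; the sketch below assumes that SciPy's kv accepts complex arguments and uses arbitrary illustrative parameters.

import numpy as np
from scipy.special import kv
from scipy.stats import geninvgauss

a, b, p = 2.0, 3.0, 0.7                     # arbitrary illustrative parameters
t = 0.5

# Closed form: (a/(a - 2it))^(p/2) * K_p(sqrt((a - 2it) b)) / K_p(sqrt(a b))
z = a - 2j * t
phi = (a / z) ** (p / 2) * kv(p, np.sqrt(z * b)) / kv(p, np.sqrt(a * b))

# Monte Carlo estimate of E[exp(itX)]
samples = geninvgauss(p, np.sqrt(a * b), scale=np.sqrt(b / a)).rvs(size=200_000, random_state=1)
print(phi, np.mean(np.exp(1j * t * samples)))   # the two values should roughly agree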
Related distributions
Special cases
The inverse Gaussian and gamma distributions are special cases of the generalized inverse Gaussian distribution for p = −1/2 and b = 0, respectively. Specifically, an inverse Gaussian distribution of the form
f(x; \mu, \lambda) = \left[\frac{\lambda}{2\pi x^3}\right]^{1/2} \exp\left(-\frac{\lambda (x - \mu)^2}{2\mu^2 x}\right)

is a GIG with a = \lambda/\mu^2, b = \lambda, and p = -1/2. A Gamma distribution of the form

g(x; \alpha, \beta) = \frac{\beta^\alpha}{\Gamma(\alpha)} x^{\alpha - 1} e^{-\beta x}

is a GIG with a = 2\beta, b = 0, and p = \alpha.
Other special cases include the inverse-gamma distribution, for a = 0.
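These parameter mappings can be verified numerically; the sketch below assumes SciPy's conventions invgauss(mu = \mu/\lambda, scale = \lambda) for the inverse Gaussian and gamma(\alpha, scale = 1/\beta) for the Gamma distribution, and stands in for the b = 0 limit with a very small b.

import numpy as np
from scipy.special import kv
from scipy.stats import invgauss, gamma

def gig_pdf(x, a, b, p):
    # GIG density in the article's (a, b, p) parametrization
    return (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b))) * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

x = 1.7

# Inverse Gaussian(mu, lam) should match GIG with a = lam/mu**2, b = lam, p = -1/2
mu, lam = 1.2, 2.5
print(invgauss(mu / lam, scale=lam).pdf(x), gig_pdf(x, lam / mu**2, lam, -0.5))

# Gamma(alpha, beta) corresponds to the limit b -> 0 with a = 2*beta, p = alpha;
# a very small b stands in for the limit here.
alpha, beta = 2.0, 1.5
print(gamma(alpha, scale=1 / beta).pdf(x), gig_pdf(x, 2 * beta, 1e-12, alpha))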
Conjugate prior for Gaussian
The GIG distribution is conjugate to the normal distribution when serving as the mixing distribution in a normal variance-mean mixture.[7] [8] Let the prior distribution for some hidden variable, say z, be GIG:

P(z \mid a, b, p) = \operatorname{GIG}(z \mid a, b, p),

and let there be T observed data points, X = x_1, \ldots, x_T, with normal likelihood function, conditioned on z:

P(X \mid z, \alpha, \beta) = \prod_{i=1}^{T} N(x_i \mid \alpha + \beta z, z),

where N(x \mid \mu, v) is the normal distribution with mean \mu and variance v. Then the posterior for z, given the data, is also GIG:

P(z \mid X, a, b, p, \alpha, \beta) = \operatorname{GIG}\left(z \mid a + T\beta^2,\; b + S,\; p - \frac{T}{2}\right),

where S = \sum_{i=1}^{T} (x_i - \alpha)^2.[9]
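The posterior update can be sketched in a few lines; the check below (with arbitrary synthetic data) verifies that the log prior plus log likelihood differs from the posterior log density only by a constant in z, which is the conjugacy argument of note [9].

import numpy as np
from scipy.stats import norm, geninvgauss

def gig_logpdf(z, a, b, p):
    # GIG(a, b, p) log density via SciPy's (p, concentration, scale) parametrization
    return geninvgauss(p, np.sqrt(a * b), scale=np.sqrt(b / a)).logpdf(z)

a, b, p, alpha, beta = 1.0, 2.0, 0.5, 0.3, 0.8   # arbitrary illustrative values
rng = np.random.default_rng(0)
x = rng.normal(size=5)                           # T = 5 synthetic "observations"
T, S = len(x), np.sum((x - alpha) ** 2)

# Posterior parameters from the conjugacy result above
a_post, b_post, p_post = a + T * beta ** 2, b + S, p - T / 2

# log prior + log likelihood should differ from the posterior log density
# only by a constant that does not depend on z
for z in (0.5, 1.0, 2.0):
    lhs = gig_logpdf(z, a, b, p) + np.sum(norm(alpha + beta * z, np.sqrt(z)).logpdf(x))
    rhs = gig_logpdf(z, a_post, b_post, p_post)
    print(lhs - rhs)                             # the same value for each z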
Sichel distribution
The Sichel distribution[10] [11] results when the GIG is used as the mixing distribution for the Poisson parameter \lambda.
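As an illustrative sketch (parameters chosen arbitrarily), Sichel-distributed counts can be simulated by first drawing the Poisson rate from the GIG and then drawing the count:

import numpy as np
from scipy.stats import geninvgauss, poisson

a, b, p = 2.0, 3.0, -0.5                    # arbitrary illustrative GIG parameters
rng = np.random.default_rng(0)

# lambda ~ GIG(a, b, p), then N ~ Poisson(lambda) gives a Sichel-distributed count
lam = geninvgauss(p, np.sqrt(a * b), scale=np.sqrt(b / a)).rvs(size=100_000, random_state=rng)
counts = poisson.rvs(lam, random_state=rng)
print(np.bincount(counts)[:10] / counts.size)   # empirical Sichel probability mass function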
Notes and References
- Seshadri, V. (1997). "Halphen's laws". In Kotz, S.; Read, C. B.; Banks, D. L. (eds.), Encyclopedia of Statistical Sciences, Update Volume 1. New York: Wiley. pp. 302–306.
- Perreault, L.; Bobée, B.; Rasmussen, P. F. (1999). "Halphen Distribution System. I: Mathematical and Statistical Properties". Journal of Hydrologic Engineering 4 (3): 189. doi:10.1061/(ASCE)1084-0699(1999)4:3(189).
- Étienne Halphen was the grandson of the mathematician Georges Henri Halphen.
- Jørgensen, Bent (1982). Statistical Properties of the Generalized Inverse Gaussian Distribution. Lecture Notes in Statistics 9. New York–Berlin: Springer-Verlag. ISBN 0-387-90665-7. MR 0648107.
- Barndorff-Nielsen, O.; Halgreen, Christian (1977). "Infinite Divisibility of the Hyperbolic and Generalized Inverse Gaussian Distributions". Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete.
- Pal, Subhadip; Gaskins, Jeremy (23 May 2022). "Modified Pólya-Gamma data augmentation for Bayesian analysis of directional data". Journal of Statistical Computation and Simulation 92 (16): 3430–3451. doi:10.1080/00949655.2022.2067853. S2CID 249022546. ISSN 0094-9655.
- Karlis, Dimitris (2002). "An EM type algorithm for maximum likelihood estimation of the normal–inverse Gaussian distribution". Statistics & Probability Letters 57: 43–52.
- Barndorff-Nielsen, O. E. (1997). "Normal Inverse Gaussian Distributions and stochastic volatility modelling". Scandinavian Journal of Statistics 24: 1–13.
- Due to the conjugacy, these details can be derived without solving integrals, by noting that P(z \mid X, a, b, p, \alpha, \beta) \propto P(z \mid a, b, p)\, P(X \mid z, \alpha, \beta). Omitting all factors independent of z, the right-hand side can be simplified to give an un-normalized GIG distribution, from which the posterior parameters can be identified.
- Sichel, Herbert S. (1975). "On a distribution law for word frequencies". Journal of the American Statistical Association 70 (351a): 542–547.
- Stein, Gillian Z.; Zucchini, Walter; Juritz, June M. (1987). "Parameter estimation for the Sichel distribution and its multivariate extension". Journal of the American Statistical Association 82 (399): 938–944.