In probability theory and statistics, an inverse distribution is the distribution of the reciprocal of a random variable. Inverse distributions arise in particular in the Bayesian context of prior distributions and posterior distributions for scale parameters. In the algebra of random variables, inverse distributions are special cases of the class of ratio distributions, in which the numerator random variable has a degenerate distribution.
In general, given the probability distribution of a random variable X with strictly positive support, it is possible to find the distribution of the reciprocal, Y = 1 / X. If the distribution of X is continuous with density function f(x) and cumulative distribution function F(x), then the cumulative distribution function, G(y), of the reciprocal is found by noting that
G(y) = \Pr(Y \leq y) = \Pr\left(X \geq \frac{1}{y}\right) = 1 - \Pr\left(X < \frac{1}{y}\right) = 1 - F\left(\frac{1}{y}\right).
Then the density function of Y is found as the derivative of the cumulative distribution function:
g(y) = \frac{1}{y^2} f\left(\frac{1}{y}\right).
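As a numerical sanity check of this change-of-variables density (a minimal NumPy sketch; an exponential X is an illustrative choice, since any continuous distribution with strictly positive support would do), the probability that Y falls in an interval can be computed both from the formula for g and from simulated reciprocals:

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ Exponential(1) has strictly positive support, with f(x) = exp(-x).
x = rng.exponential(1.0, size=1_000_000)
y = 1.0 / x

# Density of Y = 1/X from the change of variables: g(y) = y^(-2) f(1/y).
def g(y_vals):
    y_vals = np.asarray(y_vals, dtype=float)
    return y_vals**-2 * np.exp(-1.0 / y_vals)

# Pr(1 <= Y <= 2): trapezoidal integral of g vs. the empirical frequency.
grid = np.linspace(1.0, 2.0, 10_001)
dy = grid[1] - grid[0]
p_density = np.sum(g(grid[:-1]) + g(grid[1:])) * dy / 2
p_empirical = np.mean((y >= 1.0) & (y <= 2.0))
```

Here Pr(1 ≤ Y ≤ 2) = Pr(1/2 ≤ X ≤ 1) = e^{-1/2} − e^{-1}, so both estimates can also be compared against the exact value.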
The reciprocal distribution has a density function of the form[1]

f(x) \propto x^{-1} \quad \text{for } 0 < a < x < b,

where \propto means "is proportional to". It follows that the inverse distribution in this case is of the form

g(y) \propto y^{-1} \quad \text{for } 0 \leq b^{-1} < y < a^{-1},

which is again a reciprocal distribution.
If the original random variable X is uniformly distributed on the interval (a,b), where a > 0, then the reciprocal variable Y = 1/X has the reciprocal distribution, which takes values in the range (b^{-1}, a^{-1}), and the probability density function in this range is
g(y) = \frac{y^{-2}}{b-a},
and is zero elsewhere.
The cumulative distribution function of the reciprocal, within the same range, is
G(y) = \frac{b - y^{-1}}{b - a}.
For example, if X is uniformly distributed on the interval (0,1), then Y = 1/X has density g(y) = y^{-2} and cumulative distribution function G(y) = 1 - y^{-1} for y > 1.
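This special case is easy to verify by simulation (a minimal NumPy sketch; the sample size and the check points are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# X ~ Uniform(0, 1); Y = 1/X should satisfy G(y) = 1 - 1/y for y > 1.
x = rng.uniform(0.0, 1.0, size=500_000)
y = 1.0 / x

# Compare the empirical CDF of Y with the closed form at a few points.
check_points = [1.5, 2.0, 5.0, 10.0]
max_err = max(abs(np.mean(y <= q) - (1.0 - 1.0 / q)) for q in check_points)
```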
Let X be a t-distributed random variable with k degrees of freedom. Then its density function is
f(x) = \frac{\Gamma\left(\frac{k+1}{2}\right)}{\sqrt{k\pi}\,\Gamma\left(\frac{k}{2}\right)} \left(1 + \frac{x^2}{k}\right)^{-\frac{k+1}{2}}.
The density of Y = 1 / X is
g(y) = \frac{\Gamma\left(\frac{k+1}{2}\right)}{\sqrt{k\pi}\,\Gamma\left(\frac{k}{2}\right)} \frac{1}{y^2} \left(1 + \frac{1}{ky^2}\right)^{-\frac{k+1}{2}}.
With k = 1, the distributions of X and 1 / X are identical (X is then Cauchy distributed (0,1)). If k > 1 then the distribution of 1 / X is bimodal.
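The bimodality for k > 1 can be confirmed numerically (a sketch; k = 3 is an arbitrary choice) by evaluating g on the positive half-line: the density vanishes at y = 0 and has an interior maximum, and by symmetry this gives two modes overall:

```python
import numpy as np
from math import gamma, sqrt, pi

k = 3  # arbitrary choice with k > 1
C = gamma((k + 1) / 2) / (sqrt(k * pi) * gamma(k / 2))

def g(y_vals):
    """Density of Y = 1/X for X t-distributed with k degrees of freedom."""
    y_vals = np.asarray(y_vals, dtype=float)
    return C * y_vals**-2.0 * (1.0 + 1.0 / (k * y_vals**2)) ** (-(k + 1) / 2)

# Scan the positive half-line; g is symmetric about zero.
ys = np.linspace(0.05, 3.0, 5001)
vals = g(ys)
peak = ys[np.argmax(vals)]           # location of the positive mode
dips_at_zero = vals[0] < vals.max()  # density falls off toward y = 0
```

For k = 3 the positive mode sits near y ≈ 0.577, so the full density on the real line is bimodal.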
If variable X follows a normal distribution \mathcal{N}(\mu, \sigma^2), then the inverse or reciprocal variable Y = 1/X follows a reciprocal normal distribution with density

f(y) = \frac{1}{\sqrt{2\pi}\,\sigma y^2} e^{-\frac{\left(\frac{1}{y}-\mu\right)^2}{2\sigma^2}}.
If X follows a standard normal distribution \mathcal{N}(0,1), then Y = 1/X follows a reciprocal standard normal distribution, which is heavy-tailed and bimodal with modes at \pm\tfrac{1}{\sqrt{2}}, with density

f(y) = \frac{1}{\sqrt{2\pi}\,y^2} e^{-\frac{1}{2y^2}}
and the first and higher-order moments do not exist.[2] For such inverse distributions and for ratio distributions, there can still be defined probabilities for intervals, which can be computed either by Monte Carlo simulation or, in some cases, by using the Geary–Hinkley transformation.[3]
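The Monte Carlo approach mentioned above can be sketched as follows (the parameters N(2, 1) and the interval (0.25, 1) are illustrative assumptions): even though 1/X has no mean, the probability that it falls in a given interval is well defined and easy to estimate.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(42)

# X ~ N(2, 1) (illustrative parameters); Y = 1/X has no mean, but interval
# probabilities such as Pr(0.25 < Y < 1) are perfectly well defined.
x = rng.normal(2.0, 1.0, size=1_000_000)
y = 1.0 / x
p_mc = np.mean((y > 0.25) & (y < 1.0))

# Exact value: for these positive bounds, 0.25 < 1/X < 1 iff 1 < X < 4.
def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

p_exact = Phi((4.0 - 2.0) / 1.0) - Phi((1.0 - 2.0) / 1.0)
```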
However, in the more general case of a shifted reciprocal function 1/(p - B), for B = N(\mu, \sigma) following a general normal distribution, mean and variance statistics do exist in a principal value sense, if the difference between the pole p and the mean \mu is real-valued. The mean of this transformed random variable (reciprocal shifted normal distribution) is then a scaled Dawson function:

\frac{\sqrt{2}}{\sigma} D\left(\frac{p-\mu}{\sqrt{2}\,\sigma}\right).

In contrast, if the shift p - \mu is purely complex, the mean exists and is a scaled Faddeeva function, whose exact expression depends on the sign of the imaginary part \operatorname{Im}(p-\mu). In both cases the variance is a simple function of the mean; it therefore has to be considered in a principal value sense if p - \mu is real, while it exists if the imaginary part of p - \mu is non-zero. These means and variances are exact, as they do not rely on a linearisation of the ratio. The exact covariance of two ratios with a pair of different poles p_1 and p_2 is similarly available. The case of the inverse of a complex normal variable B, shifted or not, exhibits different characteristics.
If X is an exponentially distributed random variable with rate parameter \lambda, then Y = 1/X has the cumulative distribution function

F_Y(y) = e^{-\lambda/y}

for y > 0. Note that the expected value of this random variable does not exist.
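A short simulation confirms this distribution function (a sketch; the rate λ = 2 is arbitrary, and note that NumPy parameterises the exponential sampler by its scale 1/λ rather than its rate):

```python
import numpy as np

rng = np.random.default_rng(7)
lam = 2.0  # illustrative rate parameter

# NumPy's exponential sampler takes the scale 1/lambda, not the rate.
x = rng.exponential(1.0 / lam, size=500_000)
y = 1.0 / x

# Empirical CDF of Y vs. the inverse-exponential CDF exp(-lambda / y).
check_points = [0.5, 1.0, 3.0, 10.0]
max_err = max(abs(np.mean(y <= q) - np.exp(-lam / q)) for q in check_points)
```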
If X is a Cauchy distributed (μ, σ) random variable, then 1/X is a Cauchy distributed (μ/C, σ/C) random variable, where C = \mu^2 + \sigma^2.
If X is an F(ν1, ν2) distributed random variable then 1 / X is an F(ν2, ν1) random variable.
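This reciprocal relationship between F distributions can be checked with SciPy (a sketch; the degrees of freedom ν1 = 5, ν2 = 8 are arbitrary) by comparing sample quantiles of 1/X against the theoretical F(ν2, ν1) quantiles:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
d1, d2 = 5, 8  # illustrative degrees of freedom

# Sample X ~ F(d1, d2) and take reciprocals.
x = stats.f.rvs(d1, d2, size=200_000, random_state=rng)
y = 1.0 / x

# Quantiles of 1/X should match the theoretical F(d2, d1) quantiles.
probs = np.array([0.1, 0.25, 0.5, 0.75, 0.9])
empirical_q = np.quantile(y, probs)
theoretical_q = stats.f.ppf(probs, d2, d1)
max_rel_err = np.max(np.abs(empirical_q / theoretical_q - 1.0))
```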
If X is a binomially distributed random variable with parameters n and p, then the expected value of the reciprocal of 1 + X is

E\left[\frac{1}{1+X}\right] = \frac{1}{p(n+1)}\left(1 - (1-p)^{n+1}\right).
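The closed form can be verified against direct summation over the binomial probability mass function (a sketch; n = 10 and p = 0.3 are arbitrary illustrative parameters):

```python
from math import comb

n, p = 10, 0.3  # illustrative parameters

# Exact expectation of 1/(1+X) by summing over the binomial pmf.
exact_sum = sum(
    comb(n, k) * p**k * (1 - p) ** (n - k) / (1 + k) for k in range(n + 1)
)

# Closed form from the text: (1 - (1-p)^(n+1)) / (p (n+1)).
closed_form = (1 - (1 - p) ** (n + 1)) / (p * (n + 1))
```

The two agree to machine precision, since the closed form follows from the identity C(n, k)/(k+1) = C(n+1, k+1)/(n+1).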
An asymptotic approximation for the non-central moments of the reciprocal distribution is known.[7]
E\left[(1+X)^a\right] = O\left((np)^a\right) + o\left(n^{-a}\right)

where O and o are the big and little o order functions and a is a negative number.
For a triangular distribution with lower limit a, upper limit b and mode c, where a < b and a ≤ c ≤ b, the mean of the reciprocal is given by
\mu = \frac{2}{b-a}\left(\frac{b \ln\frac{b}{c}}{b-c} - \frac{a \ln\frac{c}{a}}{c-a}\right)

and the variance by

\sigma^2 = \frac{2}{b-a}\left(\frac{\ln\frac{c}{a}}{c-a} - \frac{\ln\frac{b}{c}}{b-c}\right) - \mu^2.
Both moments of the reciprocal are only defined when the triangle does not cross zero, i.e. when a, b, and c are either all positive or all negative.
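Both moments can be checked by Monte Carlo (a sketch; the limits a = 1, mode c = 2, and b = 3 are arbitrary positive values, so the triangle does not cross zero):

```python
import numpy as np

rng = np.random.default_rng(11)
a, c, b = 1.0, 2.0, 3.0  # lower limit, mode, upper limit (all positive)

# Monte Carlo moments of 1/X for X triangular on (a, b) with mode c.
samples = rng.triangular(a, c, b, size=2_000_000)
inv = 1.0 / samples
mc_mean = inv.mean()
mc_var = inv.var()

# Closed forms for the mean and variance of the reciprocal.
mean_formula = 2.0 / (b - a) * (
    b * np.log(b / c) / (b - c) - a * np.log(c / a) / (c - a)
)
second_moment = 2.0 / (b - a) * (
    np.log(c / a) / (c - a) - np.log(b / c) / (b - c)
)
var_formula = second_moment - mean_formula**2
```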
Other inverse distributions include the inverse-chi-squared distribution and the inverse matrix gamma distribution.
Inverse distributions are widely used as prior distributions in Bayesian inference for scale parameters.