In probability theory, a hyperexponential distribution is a continuous probability distribution whose probability density function, for a random variable X, is given by
f_X(x) = \sum_{i=1}^{n} f_{Y_i}(x)\, p_i,
where each Yi is an exponentially distributed random variable with rate parameter λi, and pi is the probability that X takes the form of the exponential distribution with rate λi.[1] It is called the hyperexponential distribution because its coefficient of variation is greater than that of the exponential distribution, whose coefficient of variation is 1, whereas the hypoexponential distribution has a coefficient of variation smaller than 1. While the exponential distribution is the continuous analogue of the geometric distribution, the hyperexponential distribution is not analogous to the hypergeometric distribution. The hyperexponential distribution is an example of a mixture density.
An example of a hyperexponential random variable arises in telephony: if someone has a modem and a phone, their phone-line usage can be modeled as a hyperexponential distribution in which there is probability p of them talking on the phone with rate λ1 and probability q = 1 − p of them using their internet connection with rate λ2.
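To make the mixture structure concrete, the following Python sketch draws samples from such a two-phase hyperexponential distribution; the specific values p = 0.6, λ1 = 1, and λ2 = 10 are illustrative assumptions, not values from the text.

```python
import numpy as np

def sample_hyperexponential(ps, lambdas, size, rng=None):
    """Sample from a hyperexponential distribution (a mixture of exponentials).

    ps      -- mixing probabilities p_i (must sum to 1)
    lambdas -- rate parameters lambda_i of the exponential phases
    """
    rng = np.random.default_rng() if rng is None else rng
    # First pick which exponential phase each sample belongs to ...
    phases = rng.choice(len(ps), size=size, p=ps)
    # ... then draw from that phase; NumPy parametrizes by the scale 1/lambda_i.
    return rng.exponential(scale=1.0 / np.asarray(lambdas)[phases])

# Phone-line example: talking with rate 1 (prob. 0.6) or online with rate 10 (prob. 0.4).
samples = sample_hyperexponential(ps=[0.6, 0.4], lambdas=[1.0, 10.0], size=100_000)
print(samples.mean())  # close to 0.6/1 + 0.4/10 = 0.64
```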
Since the density of X is a weighted sum of exponential densities, the expected value of a hyperexponential random variable can be computed term by term:
\operatorname{E}[X] = \int_{-\infty}^{\infty} x f(x)\,dx = \sum_{i=1}^{n} p_i \int_{0}^{\infty} x \lambda_i e^{-\lambda_i x}\,dx = \sum_{i=1}^{n} \frac{p_i}{\lambda_i}
and
\operatorname{E}\left[X^2\right] = \int_{-\infty}^{\infty} x^2 f(x)\,dx = \sum_{i=1}^{n} p_i \int_{0}^{\infty} x^2 \lambda_i e^{-\lambda_i x}\,dx = \sum_{i=1}^{n} \frac{2}{\lambda_i^2} p_i,
from which we can derive the variance:[2]
\operatorname{Var}[X] = \operatorname{E}\left[X^2\right] - \operatorname{E}[X]^2 = \sum_{i=1}^{n} \frac{2}{\lambda_i^2} p_i - \left[\sum_{i=1}^{n} \frac{p_i}{\lambda_i}\right]^2 = \left[\sum_{i=1}^{n} \frac{p_i}{\lambda_i}\right]^2 + \sum_{i=1}^{n} \sum_{j=1}^{n} p_i p_j \left(\frac{1}{\lambda_i} - \frac{1}{\lambda_j}\right)^2.
The standard deviation exceeds the mean except in the degenerate case where all the λi are equal, so the coefficient of variation is greater than 1.
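As a quick numerical check of the moment formulas above, the sketch below compares the closed-form mean, variance, and coefficient of variation against a Monte Carlo estimate; the three phases and their rates are arbitrary example values.

```python
import numpy as np

ps = np.array([0.6, 0.3, 0.1])      # example mixing probabilities p_i
lams = np.array([1.0, 5.0, 20.0])   # example rates lambda_i

mean = np.sum(ps / lams)                 # E[X]   = sum_i p_i / lambda_i
second = np.sum(2.0 * ps / lams**2)      # E[X^2] = sum_i 2 p_i / lambda_i^2
var = second - mean**2                   # Var[X] = E[X^2] - E[X]^2
cv = np.sqrt(var) / mean                 # coefficient of variation, > 1 here

# Monte Carlo cross-check: mix exponential phases according to ps.
rng = np.random.default_rng(0)
phases = rng.choice(len(ps), size=1_000_000, p=ps)
x = rng.exponential(scale=1.0 / lams[phases])

print(mean, var, cv)
print(x.mean(), x.var())  # should be close to the closed-form values
```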
The moment-generating function is given by
\operatorname{E}\left[e^{tX}\right] = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx = \sum_{i=1}^{n} p_i \int_{0}^{\infty} e^{tx} \lambda_i e^{-\lambda_i x}\,dx = \sum_{i=1}^{n} \frac{\lambda_i}{\lambda_i - t} p_i.
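This closed form can likewise be checked numerically; note that the moment-generating function is finite only for t below the smallest rate min_i λi. The parameter values in this sketch are again arbitrary.

```python
import numpy as np

ps = np.array([0.6, 0.4])
lams = np.array([1.0, 10.0])
t = 0.5                                        # must satisfy t < min(lams)

mgf_closed = np.sum(ps * lams / (lams - t))    # sum_i p_i * lambda_i / (lambda_i - t)

# Empirical estimate of E[e^{tX}] from simulated hyperexponential samples.
rng = np.random.default_rng(1)
phases = rng.choice(len(ps), size=1_000_000, p=ps)
x = rng.exponential(scale=1.0 / lams[phases])
mgf_mc = np.mean(np.exp(t * x))

print(mgf_closed, mgf_mc)                      # the two values should agree closely
```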
A given probability distribution, including a heavy-tailed distribution, can be approximated by a hyperexponential distribution by fitting recursively to different time scales using Prony's method.[3]