Renewal theory is the branch of probability theory that generalizes the Poisson process for arbitrary holding times. Instead of exponentially distributed holding times, a renewal process may have any independent and identically distributed (IID) holding times that have finite mean. A renewal-reward process additionally has a random sequence of rewards incurred at each holding time, which are IID but need not be independent of the holding times.
A renewal process has asymptotic properties analogous to the strong law of large numbers and central limit theorem. The renewal function m(t) (the expected number of arrivals) and the reward function g(t) (the expected reward value) are of key importance in renewal theory. The renewal function satisfies a recursive integral equation, the renewal equation. The key renewal equation gives the limiting value of the convolution of m'(t) with a suitable non-negative function.
Applications include calculating the best strategy for replacing worn-out machinery in a factory and comparing the long-term benefits of different insurance policies. The inspection paradox relates to the fact that observing a renewal interval at time t gives an interval with average value larger than that of an average renewal interval.
The renewal process is a generalization of the Poisson process. In essence, the Poisson process is a continuous-time Markov process on the positive integers (usually starting at zero) which has independent exponentially distributed holding times at each integer i before advancing to the next integer, i + 1. In a renewal process, the holding times need not have an exponential distribution; they may have any distribution on the positive numbers, so long as they are IID with finite mean.
Let (S_i)_{i \geq 1} be a sequence of positive independent identically distributed random variables with finite, nonzero expected value:

0 < \operatorname{E}[S_i] < \infty.

We refer to the random variable S_i as the "i-th holding time".
Define for each n > 0:

J_n = \sum_{i=1}^{n} S_i,

each J_n referred to as the "n-th jump time", with the intervals [J_n, J_{n+1}] called "renewal intervals".
Then (X_t)_{t \geq 0} given by

X_t = \sum_{n=1}^{\infty} \operatorname{I}_{\{J_n \leq t\}},

where \operatorname{I}_{\{J_n \leq t\}} is the indicator function of the event \{J_n \leq t\}, represents the number of jumps that have occurred by time t, and (X_t)_{t \geq 0} is called a renewal process.
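A renewal process is straightforward to simulate from this definition: accumulate holding times into jump times J_n and count how many fall at or below t. The following sketch is illustrative (the function name `renewal_count` and the choice of uniform holding times are ours, not from the text):

```python
import random

def renewal_count(t, sample_holding):
    """Number of jumps X_t of a renewal process observed by time t.

    sample_holding draws one IID holding time S_i per call.
    """
    jump_time, count = 0.0, 0          # J_0 = 0, X_0 = 0
    while True:
        jump_time += sample_holding()  # J_n = S_1 + ... + S_n
        if jump_time > t:
            return count
        count += 1

# Holding times uniform on (0, 2), so E[S_1] = 1 and E[X_t] is close to t.
random.seed(0)
counts = [renewal_count(100.0, lambda: random.uniform(0.0, 2.0))
          for _ in range(200)]
avg = sum(counts) / len(counts)
```

Averaging over independent runs, `avg` lands near t / E[S_1] = 100, in line with the asymptotics discussed below.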
If one considers events occurring at random times, one may choose to think of the holding times \{S_i : i \geq 1\} as the random time elapsed between two consecutive events.
The Poisson process is the unique renewal process with the Markov property, as the exponential distribution is the unique continuous random variable with the property of memorylessness.
Let W_1, W_2, \ldots be a sequence of IID random variables (rewards) satisfying

\operatorname{E}|W_i| < \infty.

Then the random variable

Y_t = \sum_{i=1}^{X_t} W_i
is called a renewal-reward process. Note that unlike the S_i, the random variables W_i may take negative values as well as positive values.

The random variable Y_t depends on two sequences: the holding times S_1, S_2, \ldots and the rewards W_1, W_2, \ldots. These two sequences need not be independent. In particular, W_i may be a function of S_i.
In the context of the above interpretation of the holding times as the time between successive malfunctions of a machine, the "rewards" W_1, W_2, \ldots (which in this case happen to be negative) may be viewed as the successive repair costs incurred as a result of the successive malfunctions.
An alternative analogy is that we have a magic goose which lays eggs at intervals (holding times) distributed as S_i. Sometimes it lays golden eggs of random weight, and sometimes it lays toxic eggs (also of random weight) which require costly disposal. The "rewards" W_i are the successive (random) financial losses or gains resulting from successive eggs (i = 1, 2, 3, \ldots), and Y_t records the total financial "reward" at time t.
We define the renewal function as the expected value of the number of jumps observed up to some time t:

m(t) = \operatorname{E}[X_t].
The renewal function satisfies

\lim_{t \to \infty} \frac{1}{t} m(t) = \frac{1}{\operatorname{E}[S_1]}.
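This elementary renewal theorem can be checked by Monte Carlo: estimate m(t) by averaging the jump count over many independent runs and compare m(t)/t with 1/E[S_1]. A sketch, with illustrative names and an exponential holding-time distribution chosen by us:

```python
import random

def estimate_m(t, sample_holding, n_runs=2000):
    """Monte Carlo estimate of the renewal function m(t) = E[X_t]."""
    total = 0
    for _ in range(n_runs):
        jump_time, count = 0.0, 0
        while True:
            jump_time += sample_holding()
            if jump_time > t:
                break
            count += 1
        total += count
    return total / n_runs

random.seed(2)
# Exponential holding times with rate 2: E[S_1] = 1/2, so m(t)/t tends to 2.
m50 = estimate_m(50.0, lambda: random.expovariate(2.0))
rate = m50 / 50.0
```

For exponential holding times the process is Poisson and m(t) = 2t exactly, so the estimate of m(t)/t should sit close to 2.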
We define the reward function:

g(t) = \operatorname{E}[Y_t].

The reward function satisfies

\lim_{t \to \infty} \frac{1}{t} g(t) = \frac{\operatorname{E}[W_1]}{\operatorname{E}[S_1]}.
The renewal function satisfies

m(t) = F_S(t) + \int_0^{t} m(t - s) f_S(s) \, ds,

where F_S is the cumulative distribution function of S_1 and f_S is the corresponding probability density function.
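The renewal equation can be solved numerically by discretizing the convolution integral on a grid; using right endpoints for the quadrature makes the recursion explicit (each m[i] depends only on earlier grid values). This is a sketch of one simple scheme, not a production solver:

```python
import math

def solve_renewal_equation(F, f, t_max, h=0.01):
    """Approximate m(t) = F_S(t) + integral_0^t m(t - s) f_S(s) ds on a grid.

    Right-endpoint Riemann sum: m[i] = F(i*h) + h * sum_j m[i-j] * f(j*h),
    so the recursion is explicit with O(n^2) work.
    """
    n = int(round(t_max / h))
    m = [0.0] * (n + 1)
    for i in range(1, n + 1):
        conv = sum(m[i - j] * f(j * h) for j in range(1, i + 1))
        m[i] = F(i * h) + h * conv
    return m

# Sanity check: exponential(1) holding times give m(t) = t exactly.
F = lambda t: 1.0 - math.exp(-t)   # CDF of Exp(1)
f = lambda t: math.exp(-t)         # density of Exp(1)
m = solve_renewal_equation(F, f, 5.0)
m5 = m[-1]  # near the exact value m(5) = 5, up to O(h) discretization error
```

Smaller step sizes h trade running time for accuracy; the O(n^2) cost comes from re-evaluating the convolution sum at each grid point.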
Let X be a renewal process with renewal function m(t) and interrenewal mean \mu. Let g : [0, \infty) \to [0, \infty) be a monotone non-increasing function satisfying

\int_0^{\infty} g(t) \, dt < \infty.

The key renewal theorem states that, as t \to \infty,

\int_0^{t} g(t - x) m'(x) \, dx \to \frac{1}{\mu} \int_0^{\infty} g(x) \, dx.
Considering g(x) = \operatorname{I}_{[0,h]}(x) for any h > 0 gives as a special case

m(t + h) - m(t) \to \frac{h}{\mu} as t \to \infty.
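This special case (the renewal theorem) says the expected number of renewals in a late window (t, t + h] approaches h/μ. A Monte Carlo sketch under assumed uniform holding times (names and parameters ours):

```python
import random

def jumps_in_window(t, h, sample_holding):
    """Count renewals J_n falling in the window (t, t + h]."""
    jump_time, count = 0.0, 0
    while jump_time <= t + h:
        jump_time += sample_holding()
        if t < jump_time <= t + h:
            count += 1
    return count

random.seed(3)
# Uniform(0, 2) holding times: mu = 1, so the expected count tends to h/mu = 0.5.
runs = 4000
est = sum(jumps_in_window(50.0, 0.5, lambda: random.uniform(0.0, 2.0))
          for _ in range(runs)) / runs
```

With t = 50 the process is already close to its stationary regime for this holding-time distribution, so the averaged count sits near 0.5.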
The result can be proved using integral equations or by a coupling argument. Though a special case of the key renewal theorem, it can be used to deduce the full theorem, by considering step functions and then increasing sequences of step functions.
Renewal processes and renewal-reward processes have properties analogous to the strong law of large numbers, which can be derived from the same theorem. If (X_t)_{t \geq 0} is a renewal process and (Y_t)_{t \geq 0} is a renewal-reward process, then

\lim_{t \to \infty} \frac{1}{t} X_t = \frac{1}{\operatorname{E}[S_1]},

\lim_{t \to \infty} \frac{1}{t} Y_t = \frac{1}{\operatorname{E}[S_1]} \operatorname{E}[W_1],

almost surely.
Renewal processes additionally have a property analogous to the central limit theorem:

\frac{X_t - t/\mu}{\sqrt{t \sigma^2 / \mu^3}} \to N(0, 1),

where \mu = \operatorname{E}[S_1] and \sigma^2 = \operatorname{Var}[S_1] denote the mean and variance of the holding times.
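The central-limit scaling can be checked empirically: for large t, samples of (X_t − t/μ)/√(tσ²/μ³) should have mean near 0 and variance near 1. A sketch with uniform holding times (our choice, not from the text):

```python
import math
import random

def normalized_count(t, mu, sigma2, sample_holding):
    """(X_t - t/mu) / sqrt(t * sigma2 / mu^3); approx. N(0, 1) for large t."""
    jump_time, count = 0.0, 0
    while True:
        jump_time += sample_holding()
        if jump_time > t:
            break
        count += 1
    return (count - t / mu) / math.sqrt(t * sigma2 / mu ** 3)

random.seed(4)
# Uniform(0, 2) holding times: mu = 1, sigma^2 = 1/3.
zs = [normalized_count(200.0, 1.0, 1.0 / 3.0, lambda: random.uniform(0.0, 2.0))
      for _ in range(1000)]
mean = sum(zs) / len(zs)
var = sum(z * z for z in zs) / len(zs) - mean ** 2
```

The sample mean and variance come out close to 0 and 1 respectively, as the normal limit predicts.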
A curious feature of renewal processes is that if we wait some predetermined time t and then observe how large the renewal interval containing t is, we should expect it to be typically larger than a renewal interval of average size.
Mathematically the inspection paradox states: for any t > 0 the renewal interval containing t is stochastically larger than the first renewal interval. That is, for all x > 0 and for all t > 0:

\operatorname{P}(S_{X_t + 1} > x) \geq \operatorname{P}(S_1 > x) = 1 - F_S(x),

where F_S is the cumulative distribution function of the IID holding times S_i. A vivid example is the bus waiting time paradox: for a given random distribution of bus arrivals, the average rider at a bus stop observes more delays than the average operator of the buses.
The resolution of the paradox is that our sampled distribution at time t is size-biased (see sampling bias), in that the likelihood an interval is chosen is proportional to its size. However, a renewal interval of average size is not size-biased.
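The size-biasing effect is easy to see in simulation: record the length of the interval that happens to cover a fixed time t and compare its average with E[S_1]. A sketch under assumed uniform holding times (names ours):

```python
import random

def covering_interval_length(t, sample_holding):
    """Length of the renewal interval [J_{X_t}, J_{X_t + 1}] that contains t."""
    prev, jump_time = 0.0, 0.0
    while jump_time <= t:
        prev = jump_time
        jump_time += sample_holding()
    return jump_time - prev

random.seed(5)
draw = lambda: random.uniform(0.0, 2.0)  # E[S_1] = 1, E[S_1^2] = 4/3
runs = 5000
avg_len = sum(covering_interval_length(100.0, draw) for _ in range(runs)) / runs
# The size-biased mean is E[S^2]/E[S] = 4/3, noticeably larger than E[S_1] = 1.
```

The average covering-interval length concentrates near the size-biased mean E[S²]/E[S], not near E[S_1], which is exactly the paradox.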
Unless the renewal process is a Poisson process, the superposition (sum) of two independent renewal processes is not a renewal process. Such processes can, however, be described within a larger class of processes called the Markov-renewal processes.[1] The cumulative distribution function of the first inter-event time in the superposition process is given by[2]
R(t) = 1 - \sum_{k=1}^{K} \frac{\alpha_k}{\sum_{l=1}^{K} \alpha_l} (1 - R_k(t)) \prod_{j=1, j \neq k}^{K} \alpha_j \int_t^{\infty} (1 - R_j(u)) \, du,
where R_k(t) and \alpha_k > 0 are the cumulative distribution function of the inter-event times and the arrival rate of process k, respectively.[3]
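A useful sanity check of this formula: superposing two Poisson processes yields a Poisson process with the summed rate, so R(t) must reduce to 1 − exp(−(α₁ + α₂)t). The sketch below evaluates the formula with closed-form exponential survival functions and tail integrals supplied by the caller (function names and the calling convention are our own):

```python
import math

def superposition_first_event_cdf(t, alphas, survivals, tail_integrals):
    """R(t) = 1 - sum_k (alpha_k / sum_l alpha_l) * (1 - R_k(t))
    * prod_{j != k} alpha_j * integral_t^inf (1 - R_j(u)) du."""
    total_rate = sum(alphas)
    acc = 0.0
    for k in range(len(alphas)):
        term = alphas[k] / total_rate * survivals[k](t)
        for j in range(len(alphas)):
            if j != k:
                term *= alphas[j] * tail_integrals[j](t)
        acc += term
    return 1.0 - acc

# Two Poisson processes with rates a1, a2: for Exp(a), 1 - R(t) = exp(-a t)
# and the tail integral is exp(-a t) / a.
a1, a2 = 1.5, 0.5
survivals = [lambda t, a=a1: math.exp(-a * t), lambda t, a=a2: math.exp(-a * t)]
tails = [lambda t, a=a1: math.exp(-a * t) / a, lambda t, a=a2: math.exp(-a * t) / a]
r = superposition_first_event_cdf(0.7, [a1, a2], survivals, tails)
expected = 1.0 - math.exp(-(a1 + a2) * 0.7)
```

The two values agree to floating-point precision, as the Poisson special case requires.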
Eric the entrepreneur has n machines, each having an operational lifetime uniformly distributed between zero and two years. Eric may let each machine run until it fails with replacement cost €2600; alternatively he may replace a machine at any time while it is still functional at a cost of €200.
What is his optimal replacement policy?
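One way to attack this exercise with the renewal-reward limit above is to consider age-replacement policies: replace a machine at age T (cost €200) or on failure before T (cost €2600), and minimize the long-run average cost E[cycle cost]/E[cycle length]. The sketch below assumes this policy class; the closed-form pieces in the comments follow from L uniform on (0, 2):

```python
def long_run_cost_rate(T):
    """Average cost per year under age-replacement at age T (0 < T <= 2).

    Lifetimes L are uniform on (0, 2): a cycle ends in failure (cost 2600)
    with probability P(L < T) = T/2, otherwise in planned replacement
    (cost 200); the expected cycle length is E[min(L, T)] = T - T**2 / 4.
    """
    p_fail = T / 2.0
    expected_cost = 2600.0 * p_fail + 200.0 * (1.0 - p_fail)
    expected_length = T - T ** 2 / 4.0
    return expected_cost / expected_length

# Grid search over T; setting the derivative to zero gives 3T^2 + T - 2 = 0,
# i.e. T = 2/3 of a year, with a long-run cost of 1800 euros per year.
best_T = min((k / 1000.0 for k in range(1, 2001)), key=long_run_cost_rate)
```

Under these assumptions the optimum is to replace each machine at age 2/3 of a year, for a long-run average cost of €1800 per machine per year.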