Continuity in probability

In probability theory, a stochastic process is said to be continuous in probability, or stochastically continuous, if its values converge in probability whenever the corresponding values in the index set converge.

Definition

Let X = (X_t)_{t \in T} be a stochastic process in \R^n. The process X is continuous in probability when X_r converges in probability to X_s whenever r converges to s in T.
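
Written out explicitly in terms of convergence in probability, the condition reads as follows; this is only a restatement of the definition above, using a norm on \R^n:

    \forall \varepsilon > 0,\ \forall s \in T : \qquad
    \lim_{r \to s} \Pr\bigl( \lVert X_r - X_s \rVert > \varepsilon \bigr) = 0 .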

Examples and Applications

Feller processes are continuous in probability at t = 0.[1] Continuity in probability is sometimes used as one of the defining properties of a Lévy process.[2] Any process that is continuous in probability and has independent increments has a càdlàg version; as a result, some authors define a Lévy process directly as being càdlàg and having independent increments.[3]
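
As a concrete illustration, a Poisson process is a Lévy process whose sample paths jump, yet it is continuous in probability, since P(|N_r - N_s| > ε) = 1 - e^{-λ|r - s|} → 0 as r → s for any tolerance 0 < ε < 1. The sketch below is a minimal Monte Carlo check of this fact (not taken from the cited sources); the rate λ = 1, the index point s = 1, the tolerance ε = 0.5, and the sample size are arbitrary illustrative choices.

    import numpy as np

    # Monte Carlo check that a rate-1 Poisson process N is continuous in
    # probability at s = 1: P(|N_r - N_s| > eps) shrinks to 0 as r -> s,
    # even though every sample path of N has jumps.
    rng = np.random.default_rng(0)

    lam = 1.0          # Poisson rate (illustrative choice)
    s = 1.0            # fixed index point
    eps = 0.5          # tolerance in the definition of convergence in probability
    n_samples = 100_000

    for r in [1.5, 1.1, 1.01, 1.001]:
        # By independent, stationary increments, N_r - N_s ~ Poisson(lam * |r - s|).
        increments = rng.poisson(lam * abs(r - s), size=n_samples)
        estimate = np.mean(np.abs(increments) > eps)   # estimated P(|N_r - N_s| > eps)
        exact = 1.0 - np.exp(-lam * abs(r - s))        # = P(N_r != N_s)
        print(f"r = {r:6.3f}   estimated {estimate:.4f}   exact {exact:.4f}")

Shrinking |r - s| drives both the estimated and the exact probability towards zero, which is exactly the stochastic continuity of the process at s.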

Notes and References

  1. Kallenberg, Olav (2002). Foundations of Modern Probability (2nd ed.). New York: Springer. p. 286.
  2. Applebaum, D. "Lectures on Lévy Processes and Stochastic Calculus, Braunschweig; Lecture 2: Lévy Processes". University of Sheffield. pp. 37–53.
  3. Kallenberg, Olav (2002). Foundations of Modern Probability (2nd ed.). New York: Springer. p. 290.