In the mathematical study of stochastic processes, a Harris chain is a Markov chain where the chain returns to a particular part of the state space an unbounded number of times.[1] Harris chains are regenerative processes and are named after Theodore Harris. The theory of Harris chains and Harris recurrence is useful for treating Markov chains on general (possibly uncountably infinite) state spaces.
Let $\{X_n\}$ be a Markov chain on a general state space $\Omega$ with stochastic kernel $K$, so that $P(X_{n+1} \in C \mid X_n = x) = K(x, C)$ for all $x \in \Omega$ and all measurable $C \subseteq \Omega$. The chain $\{X_n\}$ is a Harris chain if there exist a set $A \subseteq \Omega$, a constant $\varepsilon > 0$, and a probability measure $\rho$ with $\rho(\Omega) = 1$ such that:

1. If $\tau_A := \inf\{n \geq 0 : X_n \in A\}$, then $P(\tau_A < \infty \mid X_0 = x) = 1$ for all $x \in \Omega$.
2. If $x \in A$ and $C \subseteq \Omega$ is measurable, then $K(x, C) \geq \varepsilon \rho(C)$.
The first part of the definition ensures that the chain returns to some state within $A$ with probability 1, regardless of where it starts; it follows that the chain visits $A$ infinitely often (with probability 1). The second part implies that once the Markov chain is in $A$, its next state can be generated with the help of an independent Bernoulli coin flip. To see this, first note that the parameter $\varepsilon$ must lie between 0 and 1 (this can be shown by applying the second part of the definition to the set $C = \Omega$). Now let $x$ be a point in $A$ and suppose $X_n = x$. To choose the next state $X_{n+1}$, independently flip a biased coin with success probability $\varepsilon$. If the flip is successful, choose the next state $X_{n+1} \in \Omega$ according to the probability measure $\rho$. Otherwise (and if $\varepsilon < 1$), choose the next state $X_{n+1}$ according to the measure $P(X_{n+1} \in C \mid X_n = x) = (K(x, C) - \varepsilon\rho(C))/(1 - \varepsilon)$, defined for all measurable subsets $C \subseteq \Omega$.
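This coin-flip (splitting) construction can be sketched directly for a finite state space. Everything below, the 3-state kernel `K`, the set `A`, the measure `rho`, and `eps`, is an illustrative toy chosen so that $K(x, \cdot) \geq \varepsilon\rho(\cdot)$ holds on $A$; none of it comes from the text.

```python
import random

# Toy finite chain: states {0, 1, 2}, row-stochastic kernel K, small set
# A = {0}, with rho and eps chosen so K[x][y] >= eps * rho[y] for x in A.
K = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]
A = {0}
rho = [0.4, 0.4, 0.2]   # probability measure with rho(Omega) = 1
eps = 0.5               # eps * rho = [0.2, 0.2, 0.1] <= K[0] componentwise

def sample(weights):
    """Draw an index with the given probability weights."""
    u, acc = random.random(), 0.0
    for i, w in enumerate(weights):
        acc += w
        if u < acc:
            return i
    return len(weights) - 1

def step(x):
    """One transition, using the coin-flip (splitting) representation on A."""
    if x in A:
        if random.random() < eps:
            return sample(rho)       # coin succeeds: regenerate from rho
        residual = [(K[x][y] - eps * rho[y]) / (1 - eps) for y in range(3)]
        return sample(residual)      # coin fails: draw from the residual kernel
    return sample(K[x])              # off A: ordinary transition
```

Mixing the two branches with weights $\varepsilon$ and $1 - \varepsilon$ recovers exactly $K(x, \cdot)$, so the split chain has the same law as the original; the success branch is the regeneration event the definition buys.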
Two random processes $\{X_n\}$ and $\{Y_n\}$ that have the same probability law and are Harris chains according to the above definition can be coupled as follows: suppose that $X_n = x$ and $Y_n = y$, where $x$ and $y$ are points in $A$. Using the same coin flip to decide the next state of both processes, it follows that the next states are the same with probability at least $\varepsilon$.
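A minimal sketch of this coupling, again on an illustrative finite toy chain (the kernel, `A`, `rho`, and `eps` are assumptions, not from the text): whenever both copies sit in $A$ and the shared coin succeeds, both draw the same $\rho$-sample and coalesce.

```python
import random

# Coupling two copies of the same toy chain; K, A, rho, eps are
# illustrative and satisfy K[x][y] >= eps * rho[y] for x in A.
K = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]
A = {0}
rho = [0.4, 0.4, 0.2]
eps = 0.5

def sample(weights):
    """Draw an index with the given probability weights."""
    u, acc = random.random(), 0.0
    for i, w in enumerate(weights):
        acc += w
        if u < acc:
            return i
    return len(weights) - 1

def residual(x):
    """Residual kernel (K(x, .) - eps * rho) / (1 - eps), used for x in A."""
    return [(K[x][y] - eps * rho[y]) / (1 - eps) for y in range(3)]

def coupled_step(x, y):
    """Advance both copies one step, sharing the coin when both are in A."""
    if x in A and y in A:
        if random.random() < eps:   # shared coin succeeds:
            z = sample(rho)         # both copies take the SAME rho-draw,
            return z, z             # so they coalesce (prob. >= eps)
        return sample(residual(x)), sample(residual(y))
    return sample(K[x]), sample(K[y])  # otherwise, ordinary transitions
```

Each marginal still moves according to $K$; only the joint law is altered, which is what makes this a coupling.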
Let $\Omega$ be a countable state space. The kernel $K$ is defined by the one-step conditional transition probabilities $P[X_{n+1} = y \mid X_n = x]$ for $x, y \in \Omega$. The measure $\rho$ is a probability mass function on the states, so that $\rho(x) \geq 0$ for all $x \in \Omega$, and the sum of the $\rho(x)$ probabilities is equal to one. Suppose the above definition is satisfied for a given set $A \subseteq \Omega$ and a given parameter $\varepsilon > 0$. Then $P[X_{n+1} = c \mid X_n = x] \geq \varepsilon\rho(c)$ for all $x \in A$ and all $c \in \Omega$.
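For a finite state space, a witnessing pair $(\varepsilon, \rho)$ for a chosen $A$ can be read off the kernel: take the componentwise infimum of the rows indexed by $A$ and normalize it. A sketch with an illustrative 3-state matrix (not from the text):

```python
# Toy row-stochastic kernel on states {0, 1, 2} and a candidate set A.
K = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]
A = [0, 1]

# inf_row[c] = min over x in A of K(x, c); any pair with eps * rho <=
# inf_row componentwise satisfies part (2) of the definition, and
# normalizing inf_row gives the largest such eps.
inf_row = [min(K[x][c] for x in A) for c in range(3)]
eps = sum(inf_row)                  # here 0.2 + 0.3 + 0.2 = 0.7
rho = [m / eps for m in inf_row]    # a probability mass function

assert all(K[x][c] >= eps * rho[c] for x in A for c in range(3))
```

The construction works whenever the rows over $A$ have a common componentwise lower bound with positive total mass; a larger $A$ generally forces a smaller $\varepsilon$.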
Let $X_n \in \mathbb{R}^d$ be a Markov chain with a kernel that is absolutely continuous with respect to Lebesgue measure,

$K(x, dy) = K(x, y)\,dy$,

such that $K(x, y)$ is a continuous function.

Pick $(x_0, y_0)$ such that $K(x_0, y_0) > 0$, and let $A$ and $\Omega$ be open sets containing $x_0$ and $y_0$ respectively that are sufficiently small so that $K(x, y) \geq \varepsilon > 0$ on $A \times \Omega$. Letting $\rho(C) = |\Omega \cap C|/|\Omega|$, where $|\Omega|$ is the Lebesgue measure of $\Omega$, we have that (2) in the above definition holds. If (1) holds as well, then $\{X_n\}$ is a Harris chain.
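A numerical version of this recipe, using an assumed AR(1)-type Gaussian kernel (everything here is illustrative, not from the text): bound $K$ below on a small box around $(x_0, y_0)$ and take $\rho$ to be normalized Lebesgue measure on the second interval.

```python
import math

# Assumed kernel density: X_{n+1} = X_n / 2 + standard normal noise, so
# K(x, y) is the N(x/2, 1) density at y -- continuous and strictly positive.
def K(x, y):
    return math.exp(-0.5 * (y - 0.5 * x) ** 2) / math.sqrt(2 * math.pi)

x0, y0 = 0.0, 0.0                 # a point with K(x0, y0) > 0
A = (-0.1, 0.1)                   # small open interval around x0
B = (-0.1, 0.1)                   # small open interval around y0 ("Omega" above)

# Lower-bound K on the grid over A x B; by continuity the true infimum
# over the box is close to (here, equal to) the grid minimum.
n = 51
ax = [A[0] + i * (A[1] - A[0]) / (n - 1) for i in range(n)]
by = [B[0] + j * (B[1] - B[0]) / (n - 1) for j in range(n)]
inf_K = min(K(x, y) for x in ax for y in by)

# With rho = Lebesgue measure restricted to B and normalized,
# K(x, C) >= inf_K * |B| * rho(C), so eps = inf_K * |B| works in part (2).
eps = inf_K * (B[1] - B[0])
```

The grid minimum is only a heuristic stand-in for the infimum in general; for this particular kernel the minimum over the box is attained at a corner, which lies on the grid.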
In the following, let $R := \inf\{n \geq 1 : X_n \in A\}$; that is, $R$ is the first time after time 0 that the process enters the region $A$. Let $\mu$ denote the initial distribution of the Markov chain, i.e. $X_0 \sim \mu$.
Definition: If for all $\mu$, $P[R < \infty \mid X_0 \in A] = 1$, then the Harris chain is called recurrent.
Definition: A recurrent Harris chain $X_n$ is aperiodic if $\exists N$ such that $\forall n \geq N$ and $\forall \mu$, $P[X_n \in A \mid X_0 \in A] > 0$.
Theorem: Let $X_n$ be an aperiodic recurrent Harris chain with stationary distribution $\pi$. If $P[R < \infty \mid X_0 \in A] = 1$, then as $n \to \infty$, $\|\mathcal{L}(X_n \mid X_0) - \pi\|_{TV} \to 0$, where $\|\cdot\|_{TV}$ denotes the total variation norm for signed measures.
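The convergence in the theorem can be observed numerically on a small chain. A sketch, assuming an illustrative 3-state aperiodic kernel (any irreducible aperiodic finite chain is Harris recurrent, e.g. with $A$ a single state):

```python
# Illustrative 3-state chain: row-stochastic, irreducible, aperiodic.
K = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]

def step_dist(mu):
    """One step applied to a distribution: mu -> mu K."""
    return [sum(mu[x] * K[x][y] for x in range(3)) for y in range(3)]

def tv(p, q):
    """Total variation distance between two probability vectors."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

# Approximate the stationary distribution pi by iterating far ahead.
pi = [1.0, 0.0, 0.0]
for _ in range(1000):
    pi = step_dist(pi)

# The law of X_n from a different starting law approaches pi in TV.
mu = [0.0, 0.0, 1.0]
dists = []
for _ in range(20):
    dists.append(tv(mu, pi))
    mu = step_dist(mu)
```

Here `dists` is non-increasing, since applying the kernel can never increase total variation distance to a stationary law, and for this toy kernel it decays geometrically.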