Iterated filtering algorithms are a tool for maximum likelihood inference on partially observed dynamical systems. Stochastic perturbations to the unknown parameters are used to explore the parameter space. Applying sequential Monte Carlo (the particle filter) to this extended model selects the parameter values that are more consistent with the data. Appropriately constructed procedures, iterating with successively diminished perturbations, converge to the maximum likelihood estimate. Iterated filtering methods have so far been used most extensively to study infectious disease transmission dynamics. Case studies include cholera, Ebola virus, influenza, malaria, HIV, pertussis, poliovirus and measles. Other areas that have been proposed as suitable for these methods include ecological dynamics[1] and finance.
The perturbations to the parameter space play several different roles. Firstly, they smooth out the likelihood surface, enabling the algorithm to overcome small-scale features of the likelihood during early stages of the global search. Secondly, Monte Carlo variation allows the search to escape from local maxima of the likelihood. Thirdly, the iterated filtering update uses the perturbed parameter values to construct an approximation to the derivative of the log likelihood, even though this quantity is not typically available in closed form. Fourthly, the parameter perturbations help to overcome numerical difficulties that can arise during sequential Monte Carlo.
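The smoothing role can be illustrated directly. The following sketch is a hypothetical toy example, independent of any particular filtering model: it averages a deliberately jagged log-likelihood-like function over Gaussian parameter perturbations and counts local maxima on a grid; the function, the perturbation scale, and all names are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(10_000)   # common random draws reused at every grid point

def jagged_loglik(theta):
    """Toy log likelihood: a smooth global shape plus small-scale ripples."""
    return -(theta - 2.0) ** 2 + 0.3 * np.sin(40.0 * theta)

def perturbed_loglik(theta, sigma):
    """Monte Carlo average of the log likelihood over Normal(theta, sigma^2)
    parameter perturbations; larger sigma gives a smoother surface."""
    return jagged_loglik(theta + sigma * z).mean()

def count_local_maxima(v):
    v = np.asarray(v)
    return int(np.sum((v[1:-1] > v[:-2]) & (v[1:-1] > v[2:])))

grid = np.linspace(0.0, 4.0, 201)
rough = [jagged_loglik(t) for t in grid]
smooth = [perturbed_loglik(t, sigma=0.2) for t in grid]

# The ripples create many local maxima on the rough surface; after averaging
# over perturbations the surface is nearly unimodal, with its peak still
# close to theta = 2.
print("local maxima, rough surface:   ", count_local_maxima(rough))
print("local maxima, smoothed surface:", count_local_maxima(smooth))
```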
The data are a time series y_1, \ldots, y_N collected at times t_1 < t_2 < \cdots < t_N. The dynamic system is modeled by a Markov process X(t), which is generated by a function f(x, s, t, \theta, W) in the sense that

X(t_n) = f(X(t_{n-1}), t_{n-1}, t_n, \theta, W),

where \theta is a vector of unknown parameters and W is some random quantity that is drawn independently each time f(\cdot) is evaluated. An initial condition X(t_0) at some time t_0 < t_1 is specified by an initialization function, X(t_0) = h(\theta), and the measurements are assumed to be drawn from a measurement density g(y_n | X_n, t_n, \theta).
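To make this specification concrete, here is a minimal sketch in Python of the triple (f, h, g) for a hypothetical one-parameter model: a random walk whose drift is the unknown parameter \theta, observed with unit Gaussian noise. The model and all names are illustrative assumptions, not part of the general framework.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x, s, t, theta):
    """Process simulator: advances the state from time s to time t.
    The random quantity W is drawn freshly inside each call; here it is a
    Gaussian increment of a random walk whose drift is the parameter theta."""
    dt = t - s
    w = rng.standard_normal()               # the random input W
    return x + theta * dt + np.sqrt(dt) * w

def h(theta):
    """Initialization function: the state at time t0 as a function of theta."""
    return 0.0

def g(y, x, t, theta):
    """Measurement density g(y_n | X_n, t_n, theta): unit Gaussian noise."""
    return np.exp(-0.5 * (y - x) ** 2) / np.sqrt(2.0 * np.pi)
```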
Input: A partially observed Markov model specified as above; Monte Carlo sample size J; number of iterations M; cooling parameters 0 < a < 1 and b; covariance matrix \Phi; initial parameter estimate \theta^{(1)}

for m = 1 to M
    draw \Theta_F(t_0, j) \sim Normal(\theta^{(m)}, b a^{m-1} \Phi) for j = 1, \ldots, J
    set X_F(t_0, j) = h(\Theta_F(t_0, j)) for j = 1, \ldots, J
    set \bar\theta(t_0) = \theta^{(m)}
    for n = 1 to N
        draw \Theta_P(t_n, j) \sim Normal(\Theta_F(t_{n-1}, j), a^{m-1} \Phi) for j = 1, \ldots, J
        set X_P(t_n, j) = f(X_F(t_{n-1}, j), t_{n-1}, t_n, \Theta_P(t_n, j), W) for j = 1, \ldots, J
        set w(n, j) = g(y_n | X_P(t_n, j), t_n, \Theta_P(t_n, j)) for j = 1, \ldots, J
        draw k_1, \ldots, k_J such that P(k_j = i) = w(n, i) / \sum_\ell w(n, \ell)
        set X_F(t_n, j) = X_P(t_n, k_j) and \Theta_F(t_n, j) = \Theta_P(t_n, k_j) for j = 1, \ldots, J
        set \bar\theta_i(t_n) to the sample mean of \{\Theta_{F,i}(t_n, j), j = 1, \ldots, J\}, where \Theta_{F,i} is the i-th coordinate of \Theta_F
        set V_i(t_n) to the sample variance of \{\Theta_{P,i}(t_n, j), j = 1, \ldots, J\}
    set \theta_i^{(m+1)} = \theta_i^{(m)} + V_i(t_1) \sum_{n=1}^{N} V_i(t_n)^{-1} (\bar\theta_i(t_n) - \bar\theta_i(t_{n-1}))

Output: Maximum likelihood estimate \hat\theta = \theta^{(M+1)}
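As a concrete illustration, the following is a minimal runnable sketch of the procedure above for a scalar parameter, assuming the same hypothetical toy model as before (a drifting random walk observed with unit Gaussian noise). The model, the simulated data, and all tuning values (J, M, a, b, and the scalar variance phi standing in for \Phi) are illustrative choices, not prescriptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical toy model: random walk with unknown drift theta, observed
# with unit Gaussian noise (vectorized over J particles).
def f(x, s, t, theta):
    dt = t - s
    return x + theta * dt + np.sqrt(dt) * rng.standard_normal(x.shape)

def h(theta):
    return np.zeros_like(theta)

def g(y, x, t, theta):
    return np.exp(-0.5 * (y - x) ** 2) / np.sqrt(2.0 * np.pi)

# Simulate data with true drift 1.5 at times t_0, t_1, ..., t_N (N = 20).
times = np.arange(0.0, 21.0)
true_theta = 1.5
x_true = np.cumsum(true_theta + rng.standard_normal(20))
y = x_true + rng.standard_normal(20)          # y_1, ..., y_N

def iterated_filtering(y, times, J=500, M=30, a=0.9, b=2.0, phi=0.04, theta1=0.0):
    """Sketch of the iterated filtering procedure above for scalar theta;
    phi plays the role of the covariance matrix Phi."""
    theta = theta1
    N = len(y)
    for m in range(1, M + 1):
        cool = a ** (m - 1)                   # geometric cooling factor
        Theta_F = theta + np.sqrt(b * cool * phi) * rng.standard_normal(J)
        X_F = h(Theta_F)
        theta_bar = np.empty(N + 1)           # theta_bar[n] ~ bar-theta(t_n)
        V = np.empty(N + 1)                   # V[n] ~ V(t_n); V[0] unused
        theta_bar[0] = theta
        for n in range(1, N + 1):
            Theta_P = Theta_F + np.sqrt(cool * phi) * rng.standard_normal(J)
            X_P = f(X_F, times[n - 1], times[n], Theta_P)
            w = g(y[n - 1], X_P, times[n], Theta_P)
            k = rng.choice(J, size=J, p=w / w.sum())   # multinomial resampling
            X_F, Theta_F = X_P[k], Theta_P[k]
            theta_bar[n] = Theta_F.mean()     # filtered mean of the parameter
            V[n] = Theta_P.var()              # prediction variance
        # Parameter update: theta^(m+1) = theta^(m)
        #   + V(t_1) * sum_n V(t_n)^{-1} (bar-theta(t_n) - bar-theta(t_{n-1}))
        theta = theta + V[1] * np.sum((theta_bar[1:] - theta_bar[:-1]) / V[1:])
    return theta

print(iterated_filtering(y, times))   # should move toward the MLE, near 1.5
```

Because the perturbation variance is cooled geometrically by a^{m-1}, early iterations explore the parameter space broadly while later iterations refine the estimate.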
A variation of this procedure carries the swarm of perturbed parameter vectors itself from one iteration to the next, in place of the explicit parameter update above.
Input: A partially observed Markov model specified as above; Monte Carlo sample size J; number of iterations M; cooling parameter 0 < a < 1; covariance matrix \Phi; initial parameter vectors \{\Theta_j, j = 1, \ldots, J\}

for m = 1 to M
    set \Theta_F(t_0, j) \sim Normal(\Theta_j, a^{m-1} \Phi) for j = 1, \ldots, J
    set X_F(t_0, j) = h(\Theta_F(t_0, j)) for j = 1, \ldots, J
    for n = 1 to N
        draw \Theta_P(t_n, j) \sim Normal(\Theta_F(t_{n-1}, j), a^{m-1} \Phi) for j = 1, \ldots, J
        set X_P(t_n, j) = f(X_F(t_{n-1}, j), t_{n-1}, t_n, \Theta_P(t_n, j), W) for j = 1, \ldots, J
        set w(n, j) = g(y_n | X_P(t_n, j), t_n, \Theta_P(t_n, j)) for j = 1, \ldots, J
        draw k_1, \ldots, k_J such that P(k_j = i) = w(n, i) / \sum_\ell w(n, \ell)
        set X_F(t_n, j) = X_P(t_n, k_j) and \Theta_F(t_n, j) = \Theta_P(t_n, k_j) for j = 1, \ldots, J
    set \Theta_j = \Theta_F(t_N, j) for j = 1, \ldots, J

Output: Parameter vectors approximating the maximum likelihood estimate, \{\Theta_j, j = 1, \ldots, J\}
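This swarm-based variation admits a similarly compact sketch under the same hypothetical toy-model assumptions as above (drifting random walk, unit Gaussian measurement noise; all names and tuning values illustrative). Here the collection \{\Theta_j\} is simply passed from one iteration to the next, so no derivative approximation is formed.

```python
import numpy as np

rng = np.random.default_rng(3)

# Same hypothetical toy model as above, vectorized over J particles.
def f(x, s, t, theta):
    dt = t - s
    return x + theta * dt + np.sqrt(dt) * rng.standard_normal(x.shape)

def h(theta):
    return np.zeros_like(theta)

def g(y, x, t, theta):
    return np.exp(-0.5 * (y - x) ** 2) / np.sqrt(2.0 * np.pi)

times = np.arange(0.0, 21.0)
x_true = np.cumsum(1.5 + rng.standard_normal(20))
y = x_true + rng.standard_normal(20)

def swarm_iterated_filtering(y, times, J=500, M=30, a=0.9, phi=0.04, init=None):
    """Sketch of the swarm-based variation: the perturbed parameter particles
    Theta_j are carried from one iteration to the next."""
    Theta = np.zeros(J) if init is None else init   # initial {Theta_j}
    N = len(y)
    for m in range(1, M + 1):
        sd = np.sqrt(a ** (m - 1) * phi)            # cooled perturbation scale
        Theta_F = Theta + sd * rng.standard_normal(J)
        X_F = h(Theta_F)
        for n in range(1, N + 1):
            Theta_P = Theta_F + sd * rng.standard_normal(J)
            X_P = f(X_F, times[n - 1], times[n], Theta_P)
            w = g(y[n - 1], X_P, times[n], Theta_P)
            k = rng.choice(J, size=J, p=w / w.sum())
            X_F, Theta_F = X_P[k], Theta_P[k]
        Theta = Theta_F                             # swarm carried to iteration m+1
    return Theta

swarm = swarm_iterated_filtering(y, times)
print(swarm.mean(), swarm.std())   # the swarm should concentrate near the MLE
```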
"pomp: statistical inference for observed Markov processes" : R package.