In computational statistics, reversible-jump Markov chain Monte Carlo (RJMCMC) is an extension of standard Markov chain Monte Carlo (MCMC) methodology, introduced by Peter Green, which allows simulation (the creation of samples) of the posterior distribution on spaces of varying dimensions.[1] Thus, the simulation is possible even when the number of parameters in the model is not known. The "jump" refers to switching from one parameter space to another during the running of the chain. RJMCMC is useful for comparing models of different dimension to see which one fits the data best. It is also useful for predicting new data points, because we do not need to choose and fix a model: RJMCMC can directly predict new values under all the models at the same time. Models that suit the data best will be chosen more frequently than the poorer ones.
Let

n_m \in N_m = \{1, 2, \ldots, I\}

be a model indicator and

M = \bigcup_{n_m = 1}^{I} \mathbb{R}^{d_m}

the parameter space, whose number of dimensions d_m depends on the model n_m. The stationary distribution is the joint posterior distribution of (M, N_m), which takes the values (m, n_m).
The proposal m' can be constructed with a mapping g_{1mm'} of m and u, where u is drawn from a random component U with density q on \mathbb{R}^{d_{mm'}}. The move to state (m', n_{m'}) can thus be formulated as

(m', n_{m'}) = (g_{1mm'}(m, u), n_{m'})
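As a concrete sketch (a hypothetical example, not from the article), a "split" move between a one-parameter model and a two-parameter model can implement such a mapping g_{1mm'} by drawing an auxiliary variable u and sending \theta to (\theta - u, \theta + u); the function and variable names below are illustrative:

```python
import random


def split_proposal(theta, sigma=1.0, rng=random):
    """Dimension-changing proposal g_{1mm'} for a toy split move.

    Maps a one-parameter state theta, together with an auxiliary draw
    u ~ Normal(0, sigma^2), to the two-parameter state
    (theta - u, theta + u).  Returns the proposed state and u, which
    is needed later in the acceptance ratio.
    """
    u = rng.gauss(0.0, sigma)
    return (theta - u, theta + u), u


def merge_inverse(theta1, theta2):
    """Inverse mapping g_{m'm}: recovers (theta, u) from the split state."""
    theta = 0.5 * (theta1 + theta2)
    u = 0.5 * (theta2 - theta1)
    return theta, u
```

The pair of mappings is one-to-one: applying `merge_inverse` to the output of `split_proposal` recovers the original state and auxiliary variable exactly.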
The function

g_{mm'} := \bigl( (m, u) \mapsto (m', u') = (g_{1mm'}(m, u), g_{2mm'}(m, u)) \bigr)

must be one-to-one and differentiable, and have a non-zero support:

\operatorname{supp}(g_{mm'}) \ne \varnothing

so that there exists an inverse function

g_{mm'}^{-1} = g_{m'm}

that is differentiable. Therefore, (m, u) and (m', u') must be of equal dimension, which is the case if the dimension criterion

d_m + d_{mm'} = d_{m'} + d_{m'm}

is met, where d_{mm'} is the dimension of u. This is known as dimension matching.
If

\mathbb{R}^{d_m} \subset \mathbb{R}^{d_{m'}},

then the dimension-matching condition can be reduced to

d_m + d_{mm'} = d_{m'}

with

(m, u) = g_{m'm}(m').
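As a hypothetical illustration (not part of the article), consider jumping between a one-mean model, m = \theta \in \mathbb{R}^1, and a two-mean model, m' = (\theta_1, \theta_2) \in \mathbb{R}^2, via the split move \theta \mapsto (\theta - u, \theta + u). The forward move uses one auxiliary variable and the reverse (merge) move is deterministic, so the dimension criterion is satisfied:

```latex
% Split move: d_m = 1, d_{m'} = 2.
% Forward: g_{1mm'}(\theta, u) = (\theta - u,\, \theta + u) uses one
% auxiliary variable, so d_{mm'} = 1; the reverse merge is
% deterministic, so d_{m'm} = 0.
d_m + d_{mm'} \;=\; 1 + 1 \;=\; 2 + 0 \;=\; d_{m'} + d_{m'm}
```

Since \mathbb{R}^1 \subset \mathbb{R}^2 and d_{m'm} = 0, this is also an instance of the reduced condition d_m + d_{mm'} = d_{m'}.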
The acceptance probability will be given by
a(m, m') = \min\left(1,\; \frac{p_{m'm}\, p_{m'} f_{m'}(m')}{p_{mm'}\, q_{mm'}(m, u)\, p_m f_m(m)} \left| \det\left( \frac{\partial g_{mm'}(m, u)}{\partial (m, u)} \right) \right| \right),
where |\cdot| denotes the absolute value and p_m f_m is the joint posterior probability

p_m f_m = c^{-1}\, p(y \mid m, n_m)\, p(m \mid n_m)\, p(n_m),

where c is the normalising constant.
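The formula above can be sketched numerically once the user supplies the posterior values, the move probabilities, and the proposal density (all names below are hypothetical placeholders). For instance, for a toy split move \theta \mapsto (\theta - u, \theta + u), the Jacobian is [[1, -1], [1, 1]], whose absolute determinant is 2:

```python
def acceptance_prob(post_new, post_old, p_reverse, p_forward, q_u, jac_det):
    """RJMCMC acceptance probability
        a(m, m') = min(1, [p_{m'm} * p_{m'} f_{m'}(m')]
                          / [p_{mm'} * q_{mm'}(m, u) * p_m f_m(m)] * |det J|).

    Arguments are plain (unlogged) densities/probabilities:
      post_new  = p_{m'} f_{m'}(m')   post_old  = p_m f_m(m)
      p_reverse = p_{m'm}             p_forward = p_{mm'}
      q_u       = q_{mm'}(m, u)       jac_det   = det of the Jacobian of g_{mm'}
    """
    ratio = (p_reverse * post_new) / (p_forward * q_u * post_old)
    return min(1.0, ratio * abs(jac_det))


# Example: equal move probabilities, q_u = 1, and |det J| = 2 as for the
# toy split move theta -> (theta - u, theta + u).
a = acceptance_prob(post_new=0.1, post_old=1.0,
                    p_reverse=0.5, p_forward=0.5, q_u=1.0, jac_det=2.0)
```

In a practical sampler one would work with log densities instead to avoid underflow; this direct form is kept only to mirror the formula.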
There is an experimental RJ-MCMC tool available for the open-source BUGS package.
The Gen probabilistic programming system automates the acceptance probability computation for user-defined reversible jump MCMC kernels as part of its Involution MCMC feature.