Scaling particle Markov Chain Monte Carlo methods with local proposals
The iterated conditional Sequential Monte Carlo (cSMC) method is a particle MCMC method commonly used for state inference in non-linear, non-Gaussian state space models. Standard implementations of iterated cSMC provide an efficient way to sample state sequences in low-dimensional state space models. However, scaling iterated cSMC methods to perform well in models with a high-dimensional state remains a challenge. One reason for this is the use of a global proposal, constructed without reference to the current state sequence. In high dimensions, such a proposal will typically be poorly matched to the posterior and will impede efficient sampling. I will describe techniques based on the embedded HMM (Hidden Markov Model) framework to construct efficient proposals in high dimensions that are local relative to the current state sequence.
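To make the idea of a proposal that is "local relative to the current state sequence" concrete, here is a minimal sketch of a single iterated cSMC sweep for a one-dimensional toy state space model. This is not the embedded HMM construction referred to in the abstract; it only illustrates the general contrast with a global proposal by centring the proposal at each time step on the corresponding state of the retained (reference) trajectory. The model choices `f`, `sigma_x`, `sigma_y`, the step size `delta`, and the function name `csmc_local_step` are all illustrative assumptions, not quantities from the abstract.

```python
# Minimal sketch (assumed toy model, not the talk's exact construction):
# one conditional SMC sweep in which the proposal at time t is a Gaussian
# centred on the reference state x_ref[t] rather than drawn blindly from
# the model dynamics (a "global" proposal).
import numpy as np

def f(x):
    # toy state transition mean
    return 0.9 * x + 2.0 * np.tanh(x)

def csmc_local_step(y, x_ref, n_particles=50, sigma_x=1.0, sigma_y=1.0,
                    delta=0.5, rng=None):
    """One conditional SMC sweep; returns a new state trajectory.

    y      : observations, shape (T,)
    x_ref  : current (retained) state trajectory, shape (T,)
    delta  : scale of the local proposal around x_ref
    """
    rng = np.random.default_rng() if rng is None else rng
    T, N = len(y), n_particles
    particles = np.empty((T, N))
    ancestors = np.empty((T, N), dtype=int)
    logw = np.empty(N)

    for t in range(T):
        if t == 0:
            prop_mean = np.zeros(N)          # prior mean at time 0 (assumed)
        else:
            # multinomial resampling of ancestors using previous weights
            w = np.exp(logw - logw.max()); w /= w.sum()
            ancestors[t] = rng.choice(N, size=N, p=w)
            prop_mean = f(particles[t - 1, ancestors[t]])
        # local proposal: Gaussian centred on the reference state x_ref[t]
        particles[t] = x_ref[t] + delta * rng.standard_normal(N)
        # the cSMC condition: one particle is forced to follow the reference path
        particles[t, N - 1] = x_ref[t]
        if t > 0:
            ancestors[t, N - 1] = N - 1
        # weight = transition density * likelihood / proposal density
        log_trans = -0.5 * ((particles[t] - prop_mean) / sigma_x) ** 2 - np.log(sigma_x)
        log_prop  = -0.5 * ((particles[t] - x_ref[t]) / delta) ** 2 - np.log(delta)
        log_lik   = -0.5 * ((y[t] - particles[t]) / sigma_y) ** 2
        logw = log_trans + log_lik - log_prop

    # draw one trajectory index and trace it back through the ancestor indices
    w = np.exp(logw - logw.max()); w /= w.sum()
    k = rng.choice(N, p=w)
    x_new = np.empty(T)
    for t in reversed(range(T)):
        x_new[t] = particles[t, k]
        if t > 0:
            k = ancestors[t, k]
    return x_new
```

In use, the sweep would be iterated, feeding each returned trajectory back in as the next reference (e.g. `x_ref = csmc_local_step(y, x_ref)` inside an MCMC loop); the point of the local centring is that, unlike a global proposal, the proposed states stay close to a trajectory that already has reasonable posterior support, which matters increasingly as the state dimension grows.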