Unbiased inference for discretely observed hidden Markov model diffusions

Neil K. Chada, Jordan Franks, Ajay Jasra, Kody J. Law, Matti Vihola

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

15 Citations (Scopus)

Abstract

We develop a Bayesian inference method for diffusions observed discretely and with noise, which is free of discretization bias. Unlike existing unbiased inference methods, our method does not rely on exact simulation techniques. Instead, our method uses standard time-discretized approximations of diffusions, such as the Euler-Maruyama scheme. Our approach is based on particle marginal Metropolis-Hastings, a particle filter, randomized multilevel Monte Carlo, and an importance sampling type correction of approximate Markov chain Monte Carlo. The resulting estimator leads to inference without a bias from the time-discretization as the number of Markov chain iterations increases. We give convergence results and recommend allocations for algorithm inputs. Our method admits a straightforward parallelization and can be computationally efficient. The user-friendly approach is illustrated on three examples, where the underlying diffusion is an Ornstein-Uhlenbeck process, a geometric Brownian motion, and a 2D nonreversible Langevin equation. © 2021 Society for Industrial and Applied Mathematics and American Statistical Association.
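As a minimal sketch of the kind of time-discretized approximation the abstract refers to (not the authors' full algorithm), the Euler-Maruyama scheme can be applied to an Ornstein-Uhlenbeck process dX = θ(μ − X) dt + σ dW, the first of the three example diffusions. The function and parameter names below are illustrative choices, not from the paper:

```python
import numpy as np

def euler_maruyama_ou(x0, theta, mu, sigma, T, n_steps, rng):
    """Simulate one Ornstein-Uhlenbeck path dX = theta*(mu - X) dt + sigma dW
    with the Euler-Maruyama scheme on a uniform grid of n_steps steps."""
    dt = T / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))  # Brownian increment over one step
        x[k + 1] = x[k] + theta * (mu - x[k]) * dt + sigma * dW
    return x

rng = np.random.default_rng(0)
path = euler_maruyama_ou(x0=1.0, theta=2.0, mu=0.0, sigma=0.5,
                         T=1.0, n_steps=256, rng=rng)
```

Halving the step size (doubling `n_steps`) refines the discretization level; the paper's randomized multilevel Monte Carlo construction combines estimators across such levels so that the discretization bias vanishes in the limit.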
Original language: English
Pages (from-to): 763-787
Journal: SIAM/ASA Journal on Uncertainty Quantification
Volume: 9
Issue number: 2
Online published: 8 Jun 2021
DOIs
Publication status: Published - 2021
Externally published: Yes

Research Keywords

  • diffusion
  • importance sampling
  • Markov chain Monte Carlo
  • multilevel Monte Carlo
  • sequential Monte Carlo

