TY - JOUR
T1 - Unbiased Inference for Discretely Observed Hidden Markov Model Diffusions
AU - Chada, Neil K.
AU - Franks, Jordan
AU - Jasra, Ajay
AU - Law, Kody J.
AU - Vihola, Matti
PY - 2021
Y1 - 2021
N2 - We develop a Bayesian inference method for diffusions observed discretely and with noise, which is free of discretization bias. Unlike existing unbiased inference methods, our method does not rely on exact simulation techniques. Instead, our method uses standard time-discretized approximations of diffusions, such as the Euler-Maruyama scheme. Our approach is based on particle marginal Metropolis-Hastings, a particle filter, randomized multilevel Monte Carlo, and an importance sampling type correction of approximate Markov chain Monte Carlo. The resulting estimator leads to inference without a bias from the time-discretization as the number of Markov chain iterations increases. We give convergence results and recommend allocations for algorithm inputs. Our method admits a straightforward parallelization and can be computationally efficient. The user-friendly approach is illustrated on three examples, where the underlying diffusion is an Ornstein-Uhlenbeck process, a geometric Brownian motion, and a 2d nonreversible Langevin equation. © 2021 Society for Industrial and Applied Mathematics and American Statistical Association.
AB - We develop a Bayesian inference method for diffusions observed discretely and with noise, which is free of discretization bias. Unlike existing unbiased inference methods, our method does not rely on exact simulation techniques. Instead, our method uses standard time-discretized approximations of diffusions, such as the Euler-Maruyama scheme. Our approach is based on particle marginal Metropolis-Hastings, a particle filter, randomized multilevel Monte Carlo, and an importance sampling type correction of approximate Markov chain Monte Carlo. The resulting estimator leads to inference without a bias from the time-discretization as the number of Markov chain iterations increases. We give convergence results and recommend allocations for algorithm inputs. Our method admits a straightforward parallelization and can be computationally efficient. The user-friendly approach is illustrated on three examples, where the underlying diffusion is an Ornstein-Uhlenbeck process, a geometric Brownian motion, and a 2d nonreversible Langevin equation. © 2021 Society for Industrial and Applied Mathematics and American Statistical Association.
KW - diffusion
KW - importance sampling
KW - Markov chain Monte Carlo
KW - multilevel Monte Carlo
KW - sequential Monte Carlo
UR - http://www.scopus.com/inward/record.url?scp=85115835298&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85115835298&origin=recordpage
U2 - 10.1137/20M131549X
DO - 10.1137/20M131549X
M3 - RGC 21 - Publication in refereed journal
SN - 2166-2525
VL - 9
SP - 763
EP - 787
JO - SIAM/ASA Journal on Uncertainty Quantification
JF - SIAM/ASA Journal on Uncertainty Quantification
IS - 2
ER -