Approximate Inference for Generic Likelihoods via Density-Preserving GMM Simplification

Research output: Conference Papers › RGC 32 - Refereed conference paper (without host publication) › peer-review


Detail(s)

Original language: English
Publication status: Published - Dec 2016

Conference

Title: NIPS 2016 Workshop on Advances in Approximate Bayesian Inference
Location: Barcelona
Period: 9 December 2016

Abstract

We consider recursive Bayesian filtering where the posterior is represented as a
Gaussian mixture model (GMM), and the likelihood function as a sum of scaled
Gaussians (SSG). In each iteration of filtering, the number of components increases.
We propose an algorithm for simplifying a GMM into a reduced mixture model
with fewer components, which is based on maximizing a variational lower bound
of the expected log-likelihood of a set of virtual samples. We also propose an
efficient algorithm for approximating an arbitrary likelihood function as an SSG.
Experiments on synthetic 2D GMMs, simulated belief propagation and visual
tracking show that our algorithm can be widely used for approximate inference.
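To make the mixture-reduction idea concrete, here is a minimal sketch of GMM simplification for 1-D mixtures. Note the merge criterion below (greedy merging of the closest pair of means, with moment-preserving merges) is a standard illustrative baseline, not the paper's method, which instead maximizes a variational lower bound on the expected log-likelihood of virtual samples; all function names are hypothetical.

```python
import numpy as np

def merge_components(w, mu, var):
    """Moment-preserving merge of 1-D Gaussian components.

    Returns a single (weight, mean, variance) whose first two moments
    match those of the input sub-mixture.
    """
    w, mu, var = (np.asarray(a, dtype=float) for a in (w, mu, var))
    W = w.sum()                                   # combined weight
    m = (w * mu).sum() / W                        # combined mean
    v = (w * (var + (mu - m) ** 2)).sum() / W     # combined variance
    return W, m, v

def reduce_gmm(w, mu, var, k):
    """Greedily reduce a 1-D GMM to k components.

    Repeatedly merges the pair of components with the closest means
    (a crude dissimilarity; the paper's algorithm instead optimizes a
    variational lower bound) until only k components remain.
    """
    w, mu, var = list(w), list(mu), list(var)
    while len(w) > k:
        # pick the closest pair of component means
        i, j = min(
            ((a, b) for a in range(len(w)) for b in range(a + 1, len(w))),
            key=lambda p: abs(mu[p[0]] - mu[p[1]]),
        )
        W, m, v = merge_components([w[i], w[j]], [mu[i], mu[j]], [var[i], var[j]])
        for idx in (j, i):  # delete higher index first
            del w[idx]; del mu[idx]; del var[idx]
        w.append(W); mu.append(m); var.append(v)
    return w, mu, var
```

In a filtering context, a reduction step like this would be applied after each posterior update to keep the component count bounded; the paper's contribution is a principled, density-preserving criterion for choosing the reduced mixture rather than the greedy heuristic above.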

Citation Format(s)

Approximate Inference for Generic Likelihoods via Density-Preserving GMM Simplification. / YU, Lei; YANG, Tianyu; Chan, Antoni B.
2016. Paper presented at NIPS 2016 Workshop on Advances in Approximate Bayesian Inference.
