Parametric Manifold Learning of Gaussian Mixture Models

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

2 Scopus Citations

Author(s)

Liu, Ziquan; Yu, Lei; Hsiao, Janet H. et al.

Detail(s)

Original language: English
Title of host publication: Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19)
Editors: Sarit Kraus
Place of Publication: Macau
Publisher: International Joint Conferences on Artificial Intelligence
Pages: 3073-3079
ISBN (electronic): 978-0-9992411-4-1
Publication status: Published - Aug 2019

Publication series

Name: IJCAI International Joint Conference on Artificial Intelligence
Volume: 2019-August
ISSN (Print): 1045-0823

Conference

Title: 28th International Joint Conference on Artificial Intelligence (IJCAI-19)
Place: Macao
Period: 10 - 16 August 2019

Abstract

The Gaussian Mixture Model (GMM) is among the most widely used parametric probability distributions for representing data. However, analyzing the relationships among GMMs is difficult because they lie on a high-dimensional manifold. Previous works either perform clustering of GMMs, which learns only a limited, discrete latent representation, or kernel-based embedding of GMMs, which is not interpretable because the inverse mapping is difficult to compute. In this paper, we propose Parametric Manifold Learning of GMMs (PML-GMM), which learns a parametric mapping from a low-dimensional latent space to the high-dimensional GMM manifold. Similar to PCA, the proposed mapping is parameterized by principal axes for the component weights, means, and covariances, which are optimized to minimize a reconstruction loss measured by the Kullback-Leibler divergence (KLD). As the KLD between two GMMs is intractable, we approximate the objective function by a variational upper bound, which is optimized by an EM-style algorithm. Moreover, we derive an efficient solver by alternating optimization of subproblems and exploit Monte Carlo sampling to escape from local minima. We demonstrate the effectiveness of PML-GMM through experiments on synthetic, eye-fixation, flow cytometry, and social check-in data.
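
For context, the building blocks named in the abstract can be stated precisely. The KLD between two GMMs f = \sum_{a=1}^{A} \omega_a f_a and g = \sum_{b=1}^{B} \pi_b g_b has no closed form, but it admits variational upper bounds; one standard bound of this type (cf. Hershey and Olsen, 2007; the paper's exact bound may differ in detail) introduces nonnegative variational parameters \phi_{ab} and \psi_{ab} with \sum_b \phi_{ab} = \omega_a and \sum_a \psi_{ab} = \pi_b, and applies the log-sum inequality:

\mathrm{KL}(f \,\|\, g) \;\le\; \sum_{a,b} \phi_{ab} \left[ \log \frac{\phi_{ab}}{\psi_{ab}} + \mathrm{KL}(f_a \,\|\, g_b) \right],

where each pairwise term is the closed-form KLD between two Gaussians,

\mathrm{KL}\big(\mathcal{N}(\mu_1, \Sigma_1) \,\|\, \mathcal{N}(\mu_2, \Sigma_2)\big) = \frac{1}{2} \left[ \log \frac{\det \Sigma_2}{\det \Sigma_1} - d + \mathrm{tr}\!\left(\Sigma_2^{-1} \Sigma_1\right) + (\mu_2 - \mu_1)^{\top} \Sigma_2^{-1} (\mu_2 - \mu_1) \right].

Alternating closed-form updates of the variational parameters with updates of the model parameters is what gives bounds of this kind their EM-style character.

To make the PCA-like mapping concrete, the following Python sketch maps a latent vector to GMM parameters and estimates the KLD between two GMMs by Monte Carlo. All names (latent_to_gmm, gmm_logpdf, mc_kld) and the Cholesky-factor parameterization of the covariances are illustrative assumptions, not the paper's implementation.

import numpy as np
from scipy.stats import multivariate_normal

def latent_to_gmm(z, base, axes):
    # Map a latent vector z (shape (q,)) to the parameters of a K-component
    # GMM in d dimensions, PCA-style: parameter = base point + axes @ z.
    logits = base["w"] + axes["w"] @ z            # (K,) weight logits
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()                      # softmax -> valid simplex
    means = base["mu"] + axes["mu"] @ z           # (K, d)
    L = np.tril(base["L"] + axes["L"] @ z)        # (K, d, d) lower-triangular
    # L L^T is PSD by construction; tiny jitter guards against singularity.
    covs = L @ np.swapaxes(L, -1, -2) + 1e-6 * np.eye(L.shape[-1])
    return weights, means, covs

def gmm_logpdf(x, weights, means, covs):
    # log density of the GMM at points x (shape (n, d)) via log-sum-exp.
    lp = np.stack([multivariate_normal.logpdf(x, m, c)
                   for m, c in zip(means, covs)])  # (K, n)
    lp = lp + np.log(weights)[:, None]
    m = lp.max(axis=0)
    return m + np.log(np.exp(lp - m).sum(axis=0))

def mc_kld(f, g, n=20000, seed=0):
    # Monte Carlo estimate of KL(f || g): draw x ~ f, average log f - log g.
    rng = np.random.default_rng(seed)
    wf, mf, cf = f
    ks = rng.choice(len(wf), size=n, p=wf)
    x = np.stack([rng.multivariate_normal(mf[k], cf[k]) for k in ks])
    return np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g))

# Example: two GMMs decoded from different 1-D latent codes (toy shapes).
q, K, d = 1, 2, 2
rng = np.random.default_rng(0)
base = {"w": np.zeros(K), "mu": rng.normal(size=(K, d)),
        "L": np.tile(np.eye(d), (K, 1, 1))}
axes = {"w": rng.normal(size=(K, q)), "mu": rng.normal(size=(K, d, q)),
        "L": np.zeros((K, d, d, q))}
f = latent_to_gmm(np.array([0.5]), base, axes)
g = latent_to_gmm(np.array([-0.5]), base, axes)
print(mc_kld(f, g))

Parameterizing covariances through lower-triangular factors L (so that \Sigma = L L^\top) is one simple way to keep every reconstructed covariance positive semidefinite as the latent code moves along the principal axes; the Monte Carlo estimator is the kind of sampling-based quantity that can complement a variational bound when checking for poor local minima.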

Citation Format(s)

Parametric Manifold Learning of Gaussian Mixture Models. / Liu, Ziquan; Yu, Lei; Hsiao, Janet H. et al.
Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19). ed. / Sarit Kraus. Macau: International Joint Conferences on Artificial Intelligence, 2019. p. 3073-3079 (IJCAI International Joint Conference on Artificial Intelligence; Vol. 2019-August).
