Constrained Clustering With Dissimilarity Propagation-Guided Graph-Laplacian PCA
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Detail(s)
| Original language | English |
| --- | --- |
| Article number | 9178787 |
| Pages (from-to) | 3985-3997 |
| Number of pages | 13 |
| Journal / Publication | IEEE Transactions on Neural Networks and Learning Systems |
| Volume | 32 |
| Issue number | 9 |
| Online published | 27 Aug 2020 |
| Publication status | Published - Sept 2021 |
Abstract
In this article, we propose a novel model for constrained clustering, namely, the dissimilarity propagation-guided graph-Laplacian principal component analysis (DP-GLPCA). By fully utilizing a limited amount of weakly supervised information in the form of pairwise constraints, the proposed DP-GLPCA captures both the local and global structures of the input samples and exploits their characteristics for accurate clustering. More specifically, we first formulate a convex semisupervised low-dimensional embedding model by incorporating a new dissimilarity regularizer into GLPCA (i.e., an unsupervised dimensionality reduction model), in which both the similarity and dissimilarity between low-dimensional representations are enforced by the constraints to improve their discriminability. An efficient iterative algorithm based on the inexact augmented Lagrange multiplier is designed to solve it with guaranteed global convergence. Furthermore, we propose to propagate the cannot-link constraints (i.e., dissimilarity) to make the dissimilarity regularizer more informative. The resulting dissimilarity propagation (DP) model is solved iteratively, and we prove that it converges to a Karush-Kuhn-Tucker (KKT) point. Extensive experimental results on nine commonly used benchmark data sets show that the proposed DP-GLPCA produces much higher clustering accuracy than state-of-the-art constrained clustering methods. In addition, the effectiveness and advantage of the proposed DP model are verified experimentally. To the best of our knowledge, this is the first work to investigate DP, in contrast to existing pairwise constraint propagation methods, which propagate similarity. The code is publicly available at https://github.com/jyh-learning/DP-GLPCA.
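To give a rough feel for the idea described above, the sketch below builds a GLPCA-style spectral embedding whose graph term is adjusted by a dissimilarity penalty derived from cannot-link constraints, and then clusters the embedding with k-means. It is a minimal Python/NumPy approximation for intuition only: the trade-off weights `alpha` and `beta`, the simple constraint-matrix construction, and the eigendecomposition-based solver are illustrative assumptions and do not reproduce the paper's convex DP-GLPCA formulation, its ALM solver, or the iterative DP model (see the authors' repository for the actual implementation).

```python
import numpy as np
from scipy.sparse.linalg import eigsh
from sklearn.cluster import KMeans
from sklearn.neighbors import kneighbors_graph


def constrained_glpca_sketch(X, k, must_link, cannot_link,
                             alpha=1.0, beta=1.0, n_neighbors=10):
    """Toy GLPCA-style embedding with a dissimilarity penalty (illustrative only).

    X           : (n_samples, n_features) data matrix
    k           : number of clusters / embedding dimension
    must_link   : list of (i, j) index pairs that should share a cluster
    cannot_link : list of (i, j) index pairs that should not share a cluster
    """
    n = X.shape[0]

    # k-NN similarity graph capturing the local structure of the samples.
    W = kneighbors_graph(X, n_neighbors, mode="connectivity", include_self=False)
    W = 0.5 * (W + W.T).toarray()

    # Strengthen similarity for must-link pairs, and build a separate
    # dissimilarity matrix from cannot-link pairs (a crude stand-in for
    # the propagated dissimilarity regularizer of the paper).
    D = np.zeros((n, n))
    for i, j in must_link:
        W[i, j] = W[j, i] = 1.0
    for i, j in cannot_link:
        D[i, j] = D[j, i] = 1.0

    # Graph Laplacians of the similarity and dissimilarity graphs.
    L_w = np.diag(W.sum(axis=1)) - W
    L_d = np.diag(D.sum(axis=1)) - D

    # Global structure: centered Gram matrix, as in PCA.
    Xc = X - X.mean(axis=0)
    G = Xc @ Xc.T

    # Minimizing tr(Q^T M Q) rewards data variance and cannot-link
    # separation while penalizing roughness on the similarity graph.
    # The k smallest eigenvectors of M give the embedding Q.
    M = -G + alpha * L_w - beta * L_d
    _, Q = eigsh(M, k=k, which="SA")

    return KMeans(n_clusters=k, n_init=10).fit_predict(Q)
```

In this sketch the constraints enter only through fixed entries of the similarity and dissimilarity graphs, whereas DP-GLPCA propagates the cannot-link information before regularizing, which is the paper's key contribution.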
Research Area(s)
- Constrained clustering, convergence, convex relaxation, dissimilarity propagation (DP), Karush–Kuhn–Tucker (KKT)
Citation Format(s)
Constrained Clustering With Dissimilarity Propagation-Guided Graph-Laplacian PCA. / Jia, Yuheng; Hou, Junhui; Kwong, Sam.
In: IEEE Transactions on Neural Networks and Learning Systems, Vol. 32, No. 9, 9178787, 09.2021, p. 3985-3997.