Maximum Entropy Subspace Clustering Network

Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › Publication in refereed journal › peer-review

2 Scopus Citations

Detail(s)

Original language: English
Number of pages: 12
Journal / Publication: IEEE Transactions on Circuits and Systems for Video Technology
Online published: 15 Jun 2021
Publication status: Online published - 15 Jun 2021

Abstract

Deep subspace clustering networks have attracted much attention in subspace clustering, in which an auto-encoder non-linearly maps the input data into a latent space, and a fully connected layer, named the self-expressiveness module, is introduced to learn the affinity matrix via a typical regularization term (e.g., sparse or low-rank). However, the adopted regularization terms ignore the connectivity within each subspace, limiting their clustering performance. In addition, the adopted framework suffers from a coupling issue between the auto-encoder module and the self-expressiveness module, making the network training non-trivial. To tackle these two issues, we propose a novel deep subspace clustering method named Maximum Entropy Subspace Clustering Network (MESC-Net). Specifically, MESC-Net maximizes the entropy of the affinity matrix to promote the connectivity within each subspace, so that its elements corresponding to the same subspace are uniformly and densely distributed. Meanwhile, we design a novel framework to explicitly decouple the auto-encoder module and the self-expressiveness module. Furthermore, we theoretically prove that the learned affinity matrix satisfies the block-diagonal property under the assumption of independent subspaces. Extensive quantitative and qualitative results on commonly used benchmark datasets validate that MESC-Net significantly outperforms state-of-the-art methods. The code is publicly available at https://github.com/ZhihaoPENG-CityU/MESC.
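To make the core idea concrete, below is a minimal NumPy sketch of maximum-entropy self-expressive coding, not the authors' implementation (their code is at the GitHub link above). The auto-encoder and the decoupled training scheme of MESC-Net are omitted; the sketch only shows how an entropy term on the coefficient matrix C encourages dense, uniform within-subspace connections while the self-expressiveness loss keeps X ≈ XC. The function name, step size, and stopping rule are illustrative assumptions.

```python
import numpy as np

def max_entropy_self_expression(X, gamma=0.1, lr=1e-2, n_iter=500, eps=1e-8):
    """Illustrative sketch (not MESC-Net itself): minimize
        0.5 * ||X - X C||_F^2  -  gamma * H(C),
    where H(C) = -sum_ij C_ij log C_ij is the entropy of the
    nonnegative coefficient matrix C, via projected gradient descent.
    Maximizing entropy pushes the coefficients within a subspace
    toward a uniform, dense distribution.
    X has shape (features, samples)."""
    _, n = X.shape
    C = np.full((n, n), 1.0 / n)     # start from a uniform (max-entropy) guess
    np.fill_diagonal(C, 0.0)
    G = X.T @ X                      # Gram matrix reused in every gradient step
    for _ in range(n_iter):
        grad = G @ C - G                            # d/dC of 0.5||X - XC||_F^2
        grad += gamma * (np.log(C + eps) + 1.0)     # d/dC of -gamma * H(C)
        C -= lr * grad
        C = np.clip(C, 0.0, None)    # keep coefficients nonnegative
        np.fill_diagonal(C, 0.0)     # forbid trivial self-representation
    A = 0.5 * (np.abs(C) + np.abs(C).T)  # symmetric affinity for spectral clustering
    return C, A
```

The returned affinity matrix A would typically be fed to spectral clustering to obtain the final subspace labels; in MESC-Net the same role is played by the learned self-expressiveness layer operating on the auto-encoder's latent features.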

Research Area(s)

  • Deep learning, subspace clustering, maximum entropy regularization, decoupling