Accelerated low-rank representation for subspace clustering and semi-supervised classification on large-scale data

Research output: Journal Publications and Reviews - Publication in refereed journal


Original language: English
Pages (from-to): 39-48
Journal / Publication: Neural Networks
Online published: 2 Feb 2018
Publication status: Published - Apr 2018


The scalability of low-rank representation (LRR) to large-scale data remains a major research issue, because solving a singular value decomposition (SVD) in every optimization iteration is extremely time-consuming, especially for large matrices. Several methods have been proposed to speed up LRR, but they remain computationally heavy, and their overall representation quality degrades. In this paper, a novel method called accelerated LRR (ALRR) is proposed for large-scale data. The proposed method integrates matrix factorization with nuclear-norm minimization to find a low-rank representation. The large square matrix of representation coefficients is transformed into a significantly smaller square matrix, on which the SVD can be computed efficiently. The size of the transformed matrix does not depend on the number of data points, so the optimization cost of ALRR grows linearly with the number of data points. The proposed ALRR is convex, accurate, robust, and efficient for large-scale data. ALRR is compared with state-of-the-art methods in subspace clustering and semi-supervised classification on real image datasets, and the obtained results verify the effectiveness and superiority of the proposed method.
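The computational idea described above can be sketched in a minimal form: the SVD-based proximal step of nuclear-norm minimization (singular value thresholding) is applied to a small m-by-m matrix rather than the n-by-n coefficient matrix, so its cost is independent of the number of data points n. This is an illustrative sketch of the general technique, not the authors' full ALRR algorithm; the sizes `n` and `m` and the threshold `tau` below are hypothetical.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of the
    nuclear norm, i.e. the SVD-based step in LRR-style solvers."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)  # shrink singular values toward zero
    return (U * s) @ Vt

rng = np.random.default_rng(0)
n, m = 2000, 50                    # n data points vs. a small m x m core
Z_small = rng.standard_normal((m, m))

# SVD cost here is O(m^3), independent of the number of data points n,
# which is the source of the linear overall scaling claimed for ALRR.
Z_hat = svt(Z_small, tau=0.5)
```

In the full-size formulation the thresholding would be applied to an n-by-n matrix, making each iteration O(n^3); factoring the coefficient matrix so the SVD acts on an m-by-m core (m much smaller than n) removes that bottleneck.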

Research Area(s)

  • Large-scale data, Low-rank representation, Matrix factorization, Nuclear norm, Semi-supervised classification, Subspace clustering