On feature selection with principal component analysis for one-class SVM
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
- Lian, Heng
Detail(s)
| Original language | English |
|---|---|
| Pages (from-to) | 1027-1031 |
| Journal / Publication | Pattern Recognition Letters |
| Volume | 33 |
| Issue number | 9 |
| Publication status | Published - 1 Jul 2012 |
| Externally published | Yes |
Abstract
In this short note, we demonstrate the use of principal component analysis (PCA) as a dimension reduction tool for the one-class support vector machine (one-class SVM). However, unlike almost all other uses of PCA, which extract the eigenvectors associated with the top eigenvalues as the projection directions, here it is the eigenvectors associated with small eigenvalues that are of interest, and in particular the null space, since the null space in fact characterizes the common features of the training samples. Image retrieval examples are used to illustrate the effectiveness of the dimension reduction. © 2012 Elsevier B.V. All rights reserved.
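The record itself contains no code, but a minimal sketch of the idea described in the abstract might look as follows, using NumPy and scikit-learn. The toy data, the number of retained directions `k`, and the SVM settings (`nu`, RBF kernel) are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch: project onto the trailing (small-eigenvalue) PCA directions
# and train a one-class SVM on that projection.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Toy training set: samples share a common low-dimensional structure
# plus a small amount of noise (assumed setup, not from the paper).
X_train = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 10))
X_train += 0.01 * rng.normal(size=X_train.shape)

# Eigendecomposition of the centered sample covariance matrix.
mean = X_train.mean(axis=0)
Xc = X_train - mean
eigvals, eigvecs = np.linalg.eigh(Xc.T @ Xc / len(Xc))  # eigenvalues ascending

# Keep the k eigenvectors with the *smallest* eigenvalues (the near-null
# space) rather than the usual top components.
k = 8
V_small = eigvecs[:, :k]

# Train a one-class SVM on the projections onto the near-null space;
# training samples project close to the origin there.
Z_train = Xc @ V_small
clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(Z_train)

# Score a new sample in the same reduced space.
x_new = rng.normal(size=(1, 10))
z_new = (x_new - mean) @ V_small
print(clf.predict(z_new))  # +1 for inlier, -1 for outlier
```

A sample lying in the common structure of the training data has near-zero coordinates along these trailing directions, whereas an outlier generally does not, which is what the one-class SVM is left to separate in this sketch.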
Research Area(s)
- Dimension reduction, Image retrieval, Support vector machine
Citation Format(s)
On feature selection with principal component analysis for one-class SVM. / Lian, Heng.
In: Pattern Recognition Letters, Vol. 33, No. 9, 01.07.2012, p. 1027-1031.