Progressive Semisupervised Learning of Multiple Classifiers

Zhiwen Yu*, Ye Lu, Jun Zhang, Jane You, Hau-San Wong, Yide Wang, Guoqiang Han

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

42 Citations (Scopus)

Abstract

Semisupervised learning methods are often adopted to handle datasets with a very small number of labeled samples. However, conventional semisupervised ensemble learning approaches have two limitations: 1) most of them cannot obtain satisfactory results on high-dimensional datasets with limited labels and 2) they usually do not consider how to use an optimization process to enlarge the training set. In this paper, we propose the progressive semisupervised ensemble learning approach (PSEMISEL) to address the above limitations and handle datasets with a very small number of labeled samples. Compared with traditional semisupervised ensemble learning approaches, PSEMISEL is characterized by two properties: 1) it adopts the random subspace technique to investigate the structure of the dataset in the subspaces and 2) a progressive training set generation process and a self-evolutionary sample selection process are proposed to enlarge the training set. We also use a set of nonparametric tests to compare different semisupervised ensemble learning methods over multiple datasets. The experimental results on 18 real-world datasets from the University of California, Irvine machine learning repository show that PSEMISEL works well on most of the real-world datasets, and outperforms other state-of-the-art approaches on 10 out of 18 datasets.
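The abstract's two ingredients — base classifiers trained on random feature subspaces, and a progressive, confidence-driven enlargement of the training set — can be illustrated with a minimal self-training sketch. This is not PSEMISEL itself: the synthetic data, the nearest-centroid base learner, the ensemble size, and the promotion rule (top-confidence samples per round) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-class data standing in for a high-dimensional UCI dataset:
# class 0 is centered at -1 and class 1 at +1 on every feature.
n, d = 200, 20
y_true = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, d)) + (2 * y_true[:, None] - 1)

# Semisupervised setting: only 5 labeled samples per class to start with.
pseudo = np.full(n, -1)                        # -1 marks "unlabeled"
for c in (0, 1):
    pseudo[np.where(y_true == c)[0][:5]] = c

def fit_centroids(Xs, ys):
    """Nearest-centroid base learner on one random feature subspace."""
    return np.stack([Xs[ys == c].mean(axis=0) for c in (0, 1)])

def predict_proba(Xs, centroids):
    """Softmax over negative centroid distances as a crude confidence."""
    dist = np.linalg.norm(Xs[:, None, :] - centroids[None, :, :], axis=2)
    e = np.exp(-dist)
    return e / e.sum(axis=1, keepdims=True)

# Progressive enlargement: each round, an ensemble of subspace classifiers
# scores the unlabeled pool, and the most confident samples are promoted
# into the pseudo-labeled training set.
for _round in range(5):
    train = pseudo >= 0
    votes = np.zeros((n, 2))
    for _member in range(10):                  # ensemble size (illustrative)
        feats = rng.choice(d, size=d // 2, replace=False)
        cents = fit_centroids(X[train][:, feats], pseudo[train])
        votes += predict_proba(X[:, feats], cents)
    proba = votes / 10
    pool = ~train
    k = min(20, pool.sum())                    # samples promoted per round
    top = np.argsort(np.where(pool, proba.max(axis=1), -1.0))[::-1][:k]
    pseudo[top] = proba[top].argmax(axis=1)

accuracy = (proba.argmax(axis=1) == y_true).mean()
print(f"training set grew to {(pseudo >= 0).sum()} samples, "
      f"ensemble accuracy {accuracy:.2f}")
```

In a comparison like the one the abstract describes, a nonparametric test (e.g., a Friedman test over per-dataset accuracy ranks) would then be used to judge whether differences between such methods across many datasets are significant.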
Original language: English
Article number: 7827073
Pages (from-to): 689-702
Journal: IEEE Transactions on Cybernetics
Volume: 48
Issue number: 2
Online published: 19 Jan 2017
DOIs
Publication status: Published - Feb 2018

Research Keywords

  • Ensemble learning
  • machine learning
  • optimization
  • random subspace
  • semisupervised learning
