Progressive subspace ensemble learning
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
Yu, Zhiwen; Wang, Daxing; You, Jane et al.
Detail(s)
| Original language | English |
| --- | --- |
| Pages (from-to) | 692-705 |
| Journal / Publication | Pattern Recognition |
| Volume | 60 |
| Online published | 21 Jun 2016 |
| Publication status | Published - Dec 2016 |
Abstract
Few classifier ensemble approaches investigate the data sample space and the feature space at the same time, even though such a multi-pronged strategy helps construct more powerful learning models. For example, AdaBoost investigates only the data sample space, while the random subspace technique focuses only on the feature space. To address this limitation, we propose the progressive subspace ensemble learning approach (PSEL), which takes the data sample space and the feature space into account simultaneously. Specifically, PSEL first adopts the random subspace technique to generate a set of subspaces. It then introduces a progressive selection process, based on new cost functions that incorporate current and long-term information, to select classifiers sequentially. Finally, a weighted voting scheme summarizes the predicted labels and produces the final result. We also adopt a number of non-parametric tests to compare PSEL with its competitors over multiple datasets. The experimental results show that PSEL works well on most of the real datasets and outperforms a number of state-of-the-art classifier ensemble approaches.
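The sketch below illustrates the three-stage pipeline the abstract describes: random subspace generation, progressive (sequential) classifier selection, and weighted voting. It is a minimal illustration only, not the authors' implementation: the dataset, the use of scikit-learn decision trees, the subspace size, and in particular the selection criterion (validation accuracy of the growing ensemble) and the accuracy-based vote weights are assumptions, since the paper's cost functions combining current and long-term information are not specified in the abstract.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)  # illustrative dataset, not from the paper
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

n_subspaces, subspace_size = 20, X.shape[1] // 2  # assumed hyperparameters

# Stage 1: random subspace technique -- train one decision tree per random feature subset.
pool = []
for _ in range(n_subspaces):
    feats = rng.choice(X.shape[1], size=subspace_size, replace=False)
    clf = DecisionTreeClassifier(random_state=0).fit(X_tr[:, feats], y_tr)
    pool.append((clf, feats))

def ensemble_predict(members, X_in):
    # Weighted voting over the selected classifiers (binary labels 0/1 here).
    votes = np.zeros((X_in.shape[0], 2))
    for clf, feats, w in members:
        pred = clf.predict(X_in[:, feats])
        votes[np.arange(X_in.shape[0]), pred] += w
    return votes.argmax(axis=1)

# Stage 2: progressive selection -- add a classifier only if the ensemble's
# validation score does not drop (stand-in for the paper's cost functions).
selected, best_score = [], 0.0
for clf, feats in pool:
    w = clf.score(X_val[:, feats], y_val)  # vote weight = individual accuracy (assumption)
    candidate = selected + [(clf, feats, w)]
    score = (ensemble_predict(candidate, X_val) == y_val).mean()
    if score >= best_score:
        selected, best_score = candidate, score

# Stage 3: weighted voting of the selected classifiers gives the final prediction.
print("selected members:", len(selected), "validation accuracy:", round(best_score, 3))
```

In this sketch the sequential selection is greedy over a fixed pool; any cost function that scores a candidate classifier against the ensemble built so far could be substituted at the marked step.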
Research Area(s)
- AdaBoost, Classifier ensemble, Decision tree, Ensemble learning, Random subspace
Citation Format(s)
Progressive subspace ensemble learning. / Yu, Zhiwen; Wang, Daxing; You, Jane et al.
In: Pattern Recognition, Vol. 60, 12.2016, p. 692-705.