Semantic-Gap-Oriented Feature Selection and Classifier Construction in Multilabel Learning

Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › Publication in refereed journal › Peer-reviewed

Detail(s)

Original language: English
Pages (from-to): 101-115
Number of pages: 15
Journal / Publication: IEEE Transactions on Cybernetics
Volume: 52
Issue number: 1
Online published: 18 Mar 2020
Publication status: Published - Jan 2022

Abstract

Multilabel learning focuses on assigning each instance a set of different labels. In essence, it aims to learn a predictive function from the feature space to the label space. Learning this predictive function can be decomposed into a feature selection procedure and a classifier construction procedure. For feature selection, we extract features for each label based on learned positive and negative feature-label correlations; owing to the semantic gap, these positive and negative relationships indicate which labels can and cannot be well represented by the corresponding features, respectively. For classifier construction, we perform sample-specific and label-specific classifications, combining interlabel and interinstance correlations in both. These two correlations are learned from both the input features and the output labels, since the output labels alone may be too sparse to reveal informative correlations. However, a semantic gap arises when the input and output spaces are combined to mine labelwise relationships; this gap can be bridged by the learned feature-label correlation. Finally, extensive experimental results on several benchmarks from four domains demonstrate the effectiveness of the proposed framework.
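
The abstract couples two steps: selecting features per label from a learned feature-label correlation matrix, and building label-specific classifiers that share information through interlabel correlations. The sketch below is only a rough illustration under assumed simplifications, not the paper's algorithm: a ridge-regression surrogate stands in for the learned feature-label correlation, cosine similarity over label columns stands in for the interlabel correlation, and all function names are hypothetical.

```python
# Hypothetical sketch (not the paper's optimization): label-specific feature
# selection from a learned feature-label correlation matrix, plus per-label
# classifiers whose scores are smoothed by an interlabel correlation matrix.
import numpy as np

def learn_feature_label_correlation(X, Y, lam=1.0):
    # Ridge-regression surrogate: W (d x q) maps features to labels.
    # Positive entries mark features that support a label; negative entries
    # mark features that speak against it (the semantic gap in the abstract).
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

def select_features_per_label(W, k=10):
    # Keep the k features with the largest |correlation| for each label.
    return [np.argsort(-np.abs(W[:, j]))[:k] for j in range(W.shape[1])]

def interlabel_correlation(Y):
    # Cosine similarity between label columns; shares evidence across labels
    # when the raw label matrix is too sparse to be informative on its own.
    Yn = Y / (np.linalg.norm(Y, axis=0, keepdims=True) + 1e-12)
    return Yn.T @ Yn

def train_label_specific_classifiers(X, Y, selected, lam=1.0):
    # One ridge classifier per label, restricted to that label's features.
    models = []
    for j, idx in enumerate(selected):
        Xj = X[:, idx]
        wj = np.linalg.solve(Xj.T @ Xj + lam * np.eye(len(idx)), Xj.T @ Y[:, j])
        models.append((idx, wj))
    return models

def predict(models, X, C, threshold=0.5):
    # Raw per-label scores, then smoothed through the interlabel correlation.
    scores = np.column_stack([X[:, idx] @ w for idx, w in models])
    smoothed = scores @ C / C.sum(axis=0)
    return (smoothed > threshold).astype(int), smoothed

# Toy usage with synthetic data standing in for a multilabel benchmark.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
Y = (rng.random(size=(200, 4)) > 0.7).astype(float)

W = learn_feature_label_correlation(X, Y)
selected = select_features_per_label(W, k=15)
C = interlabel_correlation(Y)
models = train_label_specific_classifiers(X, Y, selected)
labels, scores = predict(models, X, C)
```

The per-label feature subsets and the correlation-based smoothing are meant only to mirror the roles that feature-label and interlabel correlations play in the abstract; the paper's actual formulation and sample-specific component are not reproduced here.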

Research Area(s)

  • Feature-label correlation, interlabel and interinstance correlations, multilabel learning, sample-specific and label-specific classifications, semantic gap