TY - JOUR
T1 - Improving multi-label contrastive learning by leveraging label distribution
AU - Chen, Ning
AU - Lyu, Shen-Huan
AU - Wu, Tian-Shuang
AU - Wang, Yanyan
AU - Tang, Bin
PY - 2025/12/27
Y1 - 2025/12/27
N2 - In multi-label learning, leveraging contrastive learning to learn better representations faces two key challenges: selecting positive and negative samples, and effectively utilizing label information. Previous studies address the former through differing degrees of label overlap between positive and negative samples, while existing approaches typically employ logical labels for the latter. However, directly using logical labels fails to fully exploit inter-label information, as they ignore the varying importance among labels. To address this problem, we propose a novel method that improves multi-label contrastive learning through label distribution. Specifically, the framework first leverages contrastive loss to estimate label distributions from logical labels, and then integrates label-aware information from these distributions into the loss function. We conduct evaluations on multiple widely used multi-label datasets, including image and vector datasets, and additionally validate the feasibility of learning latent label distributions from logical labels using contrastive loss on label distribution datasets. The results demonstrate that our method outperforms state-of-the-art methods on six evaluation metrics. © 2025 Elsevier Ltd.
AB - In multi-label learning, leveraging contrastive learning to learn better representations faces two key challenges: selecting positive and negative samples, and effectively utilizing label information. Previous studies address the former through differing degrees of label overlap between positive and negative samples, while existing approaches typically employ logical labels for the latter. However, directly using logical labels fails to fully exploit inter-label information, as they ignore the varying importance among labels. To address this problem, we propose a novel method that improves multi-label contrastive learning through label distribution. Specifically, the framework first leverages contrastive loss to estimate label distributions from logical labels, and then integrates label-aware information from these distributions into the loss function. We conduct evaluations on multiple widely used multi-label datasets, including image and vector datasets, and additionally validate the feasibility of learning latent label distributions from logical labels using contrastive loss on label distribution datasets. The results demonstrate that our method outperforms state-of-the-art methods on six evaluation metrics. © 2025 Elsevier Ltd.
KW - Multi-label learning
KW - Contrastive learning
KW - Label distribution
UR - https://www.webofscience.com/wos/woscc/full-record/WOS:001662480800001
U2 - 10.1016/j.patcog.2025.113011
DO - 10.1016/j.patcog.2025.113011
M3 - RGC 21 - Publication in refereed journal
SN - 0031-3203
VL - 174
JO - Pattern Recognition
JF - Pattern Recognition
M1 - 113011
ER -