TY - JOUR
T1 - A bi-level metric learning framework via self-paced learning weighting
AU - Yan, Jing
AU - Wei, Wei
AU - Guo, Xinyao
AU - Dang, Chuangyin
AU - Liang, Jiye
PY - 2023/7
Y1 - 2023/7
N2 - Distance metric learning (DML) has achieved great success in many real-world applications. However, most existing DML models characterize the quality of tuples on the tuple level while ignoring the anchor level. Consequently, these models portray the quality of tuples less accurately and tend to overfit when anchors are noisy samples. In this paper, we devise a bi-level metric learning framework (BMLF), which characterizes the quality of tuples more finely on both levels, enhancing the generalization performance of the DML model. Furthermore, we present an implementation of BMLF based on a self-paced learning regularization term and design the corresponding optimization algorithm. By weighting tuples on the anchor level and preferentially training the model with higher-weighted tuples, the side effects of low-quality noisy samples are alleviated. We empirically demonstrate that the proposed method outperforms state-of-the-art methods in effectiveness and robustness on several benchmark datasets. © 2023 Elsevier Ltd. All rights reserved.
AB - Distance metric learning (DML) has achieved great success in many real-world applications. However, most existing DML models characterize the quality of tuples on the tuple level while ignoring the anchor level. Consequently, these models portray the quality of tuples less accurately and tend to overfit when anchors are noisy samples. In this paper, we devise a bi-level metric learning framework (BMLF), which characterizes the quality of tuples more finely on both levels, enhancing the generalization performance of the DML model. Furthermore, we present an implementation of BMLF based on a self-paced learning regularization term and design the corresponding optimization algorithm. By weighting tuples on the anchor level and preferentially training the model with higher-weighted tuples, the side effects of low-quality noisy samples are alleviated. We empirically demonstrate that the proposed method outperforms state-of-the-art methods in effectiveness and robustness on several benchmark datasets. © 2023 Elsevier Ltd. All rights reserved.
KW - Metric learning
KW - Self-paced learning
KW - Adaptive neighborhood
KW - Weighting tuples
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85149434940&origin=recordpage
UR - http://www.scopus.com/inward/record.url?scp=85149434940&partnerID=8YFLogxK
UR - http://gateway.isiknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcAuth=LinksAMR&SrcApp=PARTNER_APP&DestLinkType=FullRecord&DestApp=WOS&KeyUT=000954791900001
U2 - 10.1016/j.patcog.2023.109446
DO - 10.1016/j.patcog.2023.109446
M3 - RGC 21 - Publication in refereed journal
SN - 0031-3203
VL - 139
JO - Pattern Recognition
JF - Pattern Recognition
M1 - 109446
ER -