TY - JOUR
T1 - Metric learning via perturbing hard-to-classify instances
AU - Guo, Xinyao
AU - Wei, Wei
AU - Liang, Jianqing
AU - Dang, Chuangyin
AU - Liang, Jiye
PY - 2022/12
AB - Constraint selection is an effective means of alleviating the problem of a massive number of constraints in metric learning. However, it is difficult to find and handle all constraints associated with the same hard-to-classify instance (i.e., an instance surrounded by dissimilar instances), which negatively affects metric learning algorithms. To address this problem, we propose a new metric learning algorithm from the perspective of instance selection, Metric Learning via Perturbing Hard-to-classify Instances (ML-PHI), which directly perturbs hard-to-classify instances to reduce over-fitting to them. ML-PHI perturbs hard-to-classify instances so that they move closer to similar instances while keeping the positions of the remaining instances as unchanged as possible. As a result, the negative impact of hard-to-classify instances is effectively reduced. Extensive experiments on real data sets show that ML-PHI is effective and outperforms state-of-the-art methods.
KW - Alternating minimization
KW - Hard-to-classify instances
KW - Instance perturbation
KW - Metric learning
UR - http://www.scopus.com/inward/record.url?scp=85135784046&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85135784046&origin=recordpage
DO - 10.1016/j.patcog.2022.108928
M3 - RGC 21 - Publication in refereed journal
SN - 0031-3203
VL - 132
JO - Pattern Recognition
JF - Pattern Recognition
M1 - 108928
ER -