TY - JOUR
T1 - Guide Subspace Learning for Unsupervised Domain Adaptation
AU - Zhang, Lei
AU - Fu, Jingru
AU - Wang, Shanshan
AU - Zhang, David
AU - Dong, Zhaoyang
AU - Philip Chen, C. L.
PY - 2020/9
Y1 - 2020/9
N2 - A prevailing problem in many machine learning tasks is that the training data (i.e., source domain) and test data (i.e., target domain) follow different distributions [i.e., they are not independent and identically distributed (i.i.d.)]. Unsupervised domain adaptation (UDA) was proposed to learn from unlabeled target data by leveraging labeled source data. In this article, we propose a guide subspace learning (GSL) method for UDA, in which an invariant, discriminative, and domain-agnostic subspace is learned via three guidance terms through a two-stage progressive training strategy. First, the subspace-guided term reduces the discrepancy between the domains by moving the source subspace closer to the target subspace. Second, the data-guided term uses coupled projections to map both domains into a unified subspace, where each target sample can be represented by the source samples with a low-rank coefficient matrix that preserves the global structure of the data. In this way, the data from both domains are well interlaced and domain-invariant features can be obtained. Third, to improve the discriminability of the subspace, the label-guided term is constructed for prediction based on source labels and pseudo-target labels. To further improve the model's tolerance to label noise, a label relaxation matrix is introduced. For the solver, a two-stage learning strategy with a teacher-teaching and student-feedback mode is proposed to obtain the discriminative, domain-agnostic subspace. In addition, to handle nonlinear domain shift, a nonlinear GSL (NGSL) framework is formulated with kernel embedding, so that the unified subspace incorporates nonlinearity. Experiments on various cross-domain visual benchmark databases show that our methods outperform many state-of-the-art UDA methods. The source code is available at https://github.com/Fjr9516/GSL. © 2012 IEEE.
AB - A prevailing problem in many machine learning tasks is that the training data (i.e., source domain) and test data (i.e., target domain) follow different distributions [i.e., they are not independent and identically distributed (i.i.d.)]. Unsupervised domain adaptation (UDA) was proposed to learn from unlabeled target data by leveraging labeled source data. In this article, we propose a guide subspace learning (GSL) method for UDA, in which an invariant, discriminative, and domain-agnostic subspace is learned via three guidance terms through a two-stage progressive training strategy. First, the subspace-guided term reduces the discrepancy between the domains by moving the source subspace closer to the target subspace. Second, the data-guided term uses coupled projections to map both domains into a unified subspace, where each target sample can be represented by the source samples with a low-rank coefficient matrix that preserves the global structure of the data. In this way, the data from both domains are well interlaced and domain-invariant features can be obtained. Third, to improve the discriminability of the subspace, the label-guided term is constructed for prediction based on source labels and pseudo-target labels. To further improve the model's tolerance to label noise, a label relaxation matrix is introduced. For the solver, a two-stage learning strategy with a teacher-teaching and student-feedback mode is proposed to obtain the discriminative, domain-agnostic subspace. In addition, to handle nonlinear domain shift, a nonlinear GSL (NGSL) framework is formulated with kernel embedding, so that the unified subspace incorporates nonlinearity. Experiments on various cross-domain visual benchmark databases show that our methods outperform many state-of-the-art UDA methods. The source code is available at https://github.com/Fjr9516/GSL. © 2012 IEEE.
KW - Domain adaptation
KW - subspace learning
KW - transfer learning
UR - http://www.scopus.com/inward/record.url?scp=85086067029&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2019.2944455
DO - 10.1109/TNNLS.2019.2944455
M3 - RGC 21 - Publication in refereed journal
SN - 2162-237X
VL - 31
SP - 3374
EP - 3388
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 9
M1 - 8890009
ER -