TY - GEN
T1 - Online Bagging for Anytime Transfer Learning
AU - CHI, Guokun
AU - JIANG, Min
AU - GAO, Xing
AU - HU, Weizhen
AU - GUO, Shihui
AU - TAN, Kay Chen
PY - 2019/12
Y1 - 2019/12
N2 - Transfer learning techniques are widely used when it is difficult to obtain sufficient labeled data in the target domain but a large amount of auxiliary data is available in a related source domain. However, most existing methods are based on offline data. In practical applications, one often faces online learning problems in which data samples arrive sequentially. In this paper, we apply an ensemble approach to online transfer learning so that it can be used in an anytime setting. More specifically, we propose a novel online transfer learning framework that applies the idea of online bagging to anytime transfer learning problems, constructing strong classifiers through online iterations over multiple weak classifiers. Furthermore, our algorithm provides two extension schemes to reduce the impact of negative transfer. Experiments on three real data sets show the effectiveness of the proposed algorithms.
AB - Transfer learning techniques are widely used when it is difficult to obtain sufficient labeled data in the target domain but a large amount of auxiliary data is available in a related source domain. However, most existing methods are based on offline data. In practical applications, one often faces online learning problems in which data samples arrive sequentially. In this paper, we apply an ensemble approach to online transfer learning so that it can be used in an anytime setting. More specifically, we propose a novel online transfer learning framework that applies the idea of online bagging to anytime transfer learning problems, constructing strong classifiers through online iterations over multiple weak classifiers. Furthermore, our algorithm provides two extension schemes to reduce the impact of negative transfer. Experiments on three real data sets show the effectiveness of the proposed algorithms.
KW - ensemble learning
KW - negative transfer
KW - online bagging
KW - online transfer learning
UR - http://www.scopus.com/inward/record.url?scp=85080921983&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85080921983&origin=recordpage
U2 - 10.1109/SSCI44817.2019.9002755
DO - 10.1109/SSCI44817.2019.9002755
M3 - RGC 32 - Refereed conference paper (with host publication)
T3 - IEEE Symposium Series on Computational Intelligence, SSCI
SP - 941
EP - 947
BT - 2019 IEEE Symposium Series on Computational Intelligence
PB - IEEE
T2 - 2019 IEEE Symposium Series on Computational Intelligence, SSCI 2019
Y2 - 6 December 2019 through 9 December 2019
ER -