Adaptive weighted learning for linear regression problems via Kullback-Leibler divergence

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

8 Scopus Citations

Detail(s)

Original language: English
Pages (from-to): 1209-1219
Journal / Publication: Pattern Recognition
Volume: 46
Issue number: 4
Publication status: Published - Apr 2013

Abstract

In this paper, we propose adaptive weighted learning for linear regression problems via the Kullback-Leibler (KL) divergence. An alternating optimization method is used to solve the proposed model, and we theoretically demonstrate that the solution of the optimization algorithm converges to a stationary point of the model. In addition, we fuse global linear regression with class-oriented linear regression and discuss the problem of parameter selection. Experimental results on face and handwritten numeral databases show that the proposed method is effective for image classification, particularly when the samples in the training and test sets have different characteristics. © 2012 Elsevier Ltd. All rights reserved.
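The abstract above does not give the paper's exact objective, but the general idea of KL-regularized adaptive weighted regression solved by alternating optimization can be sketched as follows. This is an illustrative formulation, not the authors' model: it minimizes a weighted sum of squared residuals plus a KL penalty that keeps the sample weights close to uniform, alternating between a closed-form weighted least-squares step for the regression coefficients and a closed-form softmax step for the weights.

```python
import numpy as np

def adaptive_weighted_regression(X, y, gamma=1.0, n_iter=50):
    """Hypothetical sketch of KL-regularized adaptive weighted regression.

    Objective (illustrative, not the paper's):
        min_{beta, w}  sum_i w_i * r_i(beta)^2 + gamma * sum_i w_i * log(n * w_i)
        s.t.  sum_i w_i = 1,  w_i >= 0,
    where r_i(beta) is the i-th residual. Fixing beta, the optimal weights
    are a softmax of -r_i^2 / gamma; fixing w, beta solves weighted least
    squares. Alternating these two steps decreases the objective.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)                      # start from uniform weights
    for _ in range(n_iter):
        # Weighted least squares: beta = (X^T W X)^{-1} X^T W y
        Xw = X * w[:, None]
        beta = np.linalg.solve(X.T @ Xw + 1e-8 * np.eye(d), Xw.T @ y)
        # Weight update: softmax of negative squared residuals scaled by gamma
        r2 = (X @ beta - y) ** 2
        logits = -r2 / gamma
        logits -= logits.max()                   # numerical stability
        w = np.exp(logits)
        w /= w.sum()
    return beta, w
```

Under this formulation, samples with large residuals (e.g. outliers, or test-like samples that the current fit explains poorly) receive exponentially smaller weights, which is one plausible reading of how adaptive weighting helps when training and test samples differ in character; `gamma` trades off robustness against using all samples equally.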

Research Area(s)

  • Alternative optimization, Image classification, KL divergence, Linear regression, Weighted learning

Citation Format(s)

Adaptive weighted learning for linear regression problems via Kullback-Leibler divergence. / Liang, Zhizheng; Li, Youfu; Xia, Shixiong.
In: Pattern Recognition, Vol. 46, No. 4, 04.2013, p. 1209-1219.
