
Adaptive weighted learning for linear regression problems via Kullback-Leibler divergence

Zhizheng Liang, Youfu Li, Shixiong Xia

    Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

    Abstract

    In this paper, we propose adaptive weighted learning for linear regression problems via the Kullback-Leibler (KL) divergence. An alternating optimization method is used to solve the proposed model, and we theoretically demonstrate that the solution produced by the optimization algorithm converges to a stationary point of the model. In addition, we fuse global linear regression with class-oriented linear regression and discuss the problem of parameter selection. Experimental results on face and handwritten-digit databases show that the proposed method is effective for image classification, particularly when the samples in the training and test sets have different characteristics. © 2012 Elsevier Ltd. All rights reserved.
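    The abstract does not give the model's exact formulation, but the idea of sample weights regularized by a KL divergence and fit by alternating optimization can be illustrated with a common instantiation: weighted least squares whose weights minimize the weighted residuals plus a KL penalty toward the uniform distribution, which yields a closed-form softmax update. This is a hypothetical sketch for intuition, not the paper's actual objective; the function name, the regularization parameter `gamma`, and the uniform reference distribution are all assumptions.

    ```python
    import numpy as np

    def adaptive_weighted_regression(X, y, gamma=1.0, n_iter=20):
        """Illustrative alternating optimization for weighted linear
        regression with KL-regularized sample weights (not the paper's
        exact model). Objective assumed here:

            min_{beta, w}  sum_i w_i * (y_i - x_i @ beta)**2
                           + gamma * KL(w || uniform)
            s.t.           w_i >= 0,  sum_i w_i = 1
        """
        n, d = X.shape
        w = np.full(n, 1.0 / n)      # start from uniform weights
        beta = np.zeros(d)
        for _ in range(n_iter):
            # Step 1: fix w, solve the weighted least-squares problem for beta
            # (small ridge term for numerical stability).
            WX = X * w[:, None]
            beta = np.linalg.solve(X.T @ WX + 1e-8 * np.eye(d), WX.T @ y)
            # Step 2: fix beta, update w in closed form; the KL regularizer
            # gives a softmax over negative squared residuals, so samples
            # with large residuals are downweighted.
            r2 = (y - X @ beta) ** 2
            w = np.exp(-r2 / gamma)
            w /= w.sum()
        return beta, w
    ```

    Because each step decreases the (bounded-below) objective, alternation of this kind converges to a stationary point, which matches the kind of convergence guarantee the abstract describes. On data with an outlier, the softmax update drives that sample's weight toward zero, so the fit tracks the clean majority.
    
    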
    Original language: English
    Pages (from-to): 1209-1219
    Journal: Pattern Recognition
    Volume: 46
    Issue number: 4
    DOIs
    Publication status: Published - Apr 2013

    Research Keywords

    • Alternative optimization
    • Image classification
    • KL divergence
    • Linear regression
    • Weighted learning
