Abstract
In this paper, we propose adaptive weighted learning for linear regression problems via the Kullback-Leibler (KL) divergence. An alternating optimization method is used to solve the proposed model, and we prove theoretically that the iterates of the optimization algorithm converge to a stationary point of the model. In addition, we fuse global linear regression with class-oriented linear regression and discuss parameter selection. Experimental results on face and handwritten digit databases show that the proposed method is effective for image classification, particularly when the samples in the training and testing sets have different characteristics. © 2012 Elsevier Ltd. All rights reserved.
| Original language | English |
|---|---|
| Pages (from-to) | 1209-1219 |
| Journal | Pattern Recognition |
| Volume | 46 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - Apr 2013 |
Research Keywords
- Alternating optimization
- Image classification
- KL divergence
- Linear regression
- Weighted learning
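The abstract describes alternating between fitting a linear regression model and updating per-sample weights under a KL-divergence regularizer. A minimal sketch of this idea, assuming (this specific objective is an illustration, not necessarily the paper's exact formulation) that the weights minimize a weighted residual loss plus `lam` times the KL divergence from the uniform distribution, which gives a closed-form softmax-style weight update:

```python
import numpy as np

def adaptive_weighted_regression(X, y, lam=1.0, n_iter=20):
    """Alternating optimization sketch (illustrative, not the paper's exact model):
    (1) weighted least squares for the coefficients beta;
    (2) closed-form weight update w_i ∝ exp(-r_i^2 / lam), which minimizes
        sum_i w_i * r_i^2 + lam * KL(w || uniform) subject to sum_i w_i = 1."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)  # start from uniform sample weights
    beta = np.zeros(d)
    for _ in range(n_iter):
        # Step 1: beta = argmin sum_i w_i (y_i - x_i^T beta)^2
        Xw = X * w[:, None]
        beta = np.linalg.solve(Xw.T @ X + 1e-8 * np.eye(d), Xw.T @ y)
        # Step 2: KL-regularized weight update (normalized exponential of residuals)
        r2 = (y - X @ beta) ** 2
        w = np.exp(-r2 / lam)
        w /= w.sum()
    return beta, w
```

With this update, samples with large residuals (e.g. outliers, or test-like samples that deviate from the bulk of the training set) receive exponentially down-weighted influence, which matches the abstract's claim of robustness when training and testing samples differ in character.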