Abstract
In this article, we explore the connections of partial least squares (PLS) to other regularized regression algorithms, including the Lasso and ridge regression, and consider a steepest descent alternative to the PLS algorithm. First, PLS latent variable analysis is emphasized and formulated as a standalone procedure. The connections of PLS to the conjugate gradient method, Krylov subspaces, and the Cayley–Hamilton theorem for the matrix pseudo-inverse are explored based on known results in the literature. Comparisons of PLS with the Lasso and ridge regression are given in terms of the different resolutions along their regularization paths, leading to an explanation of why PLS sometimes does not outperform the Lasso and ridge regression. As an attempt to increase the resolution along the regularization path, a steepest descent PLS is formulated as a regularized regression alternative to PLS and is compared with other regularized algorithms via simulations and an industrial case study.
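The regularization-path idea in the abstract can be illustrated with a short sketch. The Python snippet below is a minimal illustration on synthetic data, not the authors' steepest descent PLS algorithm or their industrial case study; the data, the helper name `steepest_descent_path`, and all parameter choices are hypothetical. It runs gradient (steepest) descent on the least-squares loss from a zero start, so each iterate is one point on a regularization path whose resolution is set by the step count rather than by an integer number of latent variables.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, correlated predictors (hypothetical data, not the paper's case study).
n, p = 100, 10
cov = 0.5 * np.eye(p) + 0.5 * np.ones((p, p))   # positive definite covariance
X = rng.standard_normal((n, p)) @ np.linalg.cholesky(cov)
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.0, 0.5]
y = X @ beta_true + 0.1 * rng.standard_normal(n)


def steepest_descent_path(X, y, n_steps=200):
    """Gradient descent on the least-squares loss, started at zero.

    Early stopping acts as regularization: the iterates trace a path
    from the zero vector toward the OLS solution, one model per step,
    so path resolution grows with n_steps (hypothetical helper).
    """
    n, p = X.shape
    # Constant step size below 1/L, where L is the largest eigenvalue
    # of X^T X / n (the Lipschitz constant of the gradient).
    lr = 0.9 / np.linalg.eigvalsh(X.T @ X / n).max()
    beta = np.zeros(p)
    path = [beta.copy()]
    for _ in range(n_steps):
        grad = X.T @ (X @ beta - y) / n   # gradient of (1/2n)||X b - y||^2
        beta = beta - lr * grad
        path.append(beta.copy())
    return np.array(path)


path = steepest_descent_path(X, y)
ols, *_ = np.linalg.lstsq(X, y, rcond=None)
for k in (0, 50, 200):
    print(f"step {k:3d}: ||beta_k - beta_OLS|| = "
          f"{np.linalg.norm(path[k] - ols):.4f}")
```

Every one of the 201 iterates is a candidate model along the path, whereas a PLS fit offers only as many stops as there are latent variables, which is one intuition behind the resolution comparison in the abstract.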
| Original language | English |
|---|---|
| Article number | e17992 |
| Journal | AIChE Journal |
| Volume | 69 |
| Issue number | 4 |
| Online published | 10 Dec 2022 |
| DOIs | https://doi.org/10.1002/aic.17992 |
| Publication status | Published - Apr 2023 |
Funding
The work described in this paper was partially supported by a Collaborative Research Fund grant (Project No. CityU C1115-20G) and a General Research Fund grant (No. 11303421) from the Research Grants Council of the Hong Kong Special Administrative Region, China, a Natural Science Foundation of China grant (U20A20189), and a City University of Hong Kong project (9380123). This support is gratefully acknowledged.
Research Keywords
- conjugate gradient
- latent variable analysis
- partial least squares analysis
- partial least squares regression
- regularized regression
- steepest descent
Publisher's Copyright Statement
- COPYRIGHT TERMS OF DEPOSITED POSTPRINT FILE: This is the peer reviewed version of the following article: Qin, S. J., Liu, Y., & Tang, S. (2022). Partial least squares, steepest descent, and conjugate gradient for regularized predictive modeling. AIChE Journal, 69(4), [e17992], which has been published in final form at https://doi.org/10.1002/aic.17992. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Use of Self-Archived Versions. This article may not be enhanced, enriched or otherwise transformed into a derivative work, without express permission from Wiley or by statutory rights under applicable legislation. Copyright notices must not be removed, obscured or modified. The article must be linked to Wiley’s version of record on Wiley Online Library and any embedding, framing or otherwise making available the article or pages thereof by third parties from platforms, services and websites other than Wiley Online Library must be prohibited.
Projects
- 1 Finished
- GRF: Dimension Reduction Modeling Methods for High Dimensional Dynamic Data in Smart Manufacturing and Operations
  QIN, S. J. (Principal Investigator / Project Coordinator)
  1/09/21 → 6/07/23
  Project: Research