Abstract
Sharpness-aware minimization (SAM) aims to enhance model generalization by minimizing the sharpness of the loss landscape, leading to robust model performance. To protect sensitive information and enhance privacy, prevailing approaches add noise to models. However, additive noise inevitably degrades the generalization and robustness of the model. In this paper, we propose a noise-resistant SAM method based on a noise-resistant parameter update rule. We analyze the convergence and noise-resistance properties of the proposed method under noisy conditions. We present experimental results with several networks on various benchmark datasets to demonstrate the advantages of the proposed method with respect to model generalization and privacy protection. © 2024 Elsevier Ltd.
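For context, the baseline SAM update that the abstract builds on (per Foret et al., not this paper's noise-resistant variant) first perturbs the weights toward the locally worst-case direction within a ρ-ball and then descends using the gradient at that perturbed point. A minimal sketch on a toy quadratic loss, with illustrative hyperparameter values chosen here rather than taken from the paper:

```python
import numpy as np

# Toy loss: L(w) = 0.5 * w^T A w, whose gradient is A w.
A = np.diag([1.0, 10.0])

def grad(w):
    return A @ w

def sam_step(w, lr=0.05, rho=0.05):
    """One generic SAM step: ascend to a nearby sharp point, then
    descend using the gradient evaluated there."""
    g = grad(w)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)  # worst-case perturbation in a rho-ball
    return w - lr * grad(w + eps)                # descent with the perturbed gradient

w = np.array([1.0, 1.0])
for _ in range(100):
    w = sam_step(w)
```

The proposed method modifies the parameter update rule so that this procedure remains effective when the gradients or parameters are corrupted by privacy-preserving additive noise; the sketch above shows only the noise-free baseline.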
Original language | English |
---|---|
Article number | 106829 |
Journal | Neural Networks |
Volume | 181 |
Online published | 24 Oct 2024 |
DOIs | |
Publication status | Published - Jan 2025 |
Funding
This work was supported in part by the National Natural Science Foundation of China under Grant 62176109 and the Hong Kong Research Grants Council under Grant 11203721.
Research Keywords
- Deep neural networks
- Noise resistance
- Sharpness-aware minimization
Projects
GRF: Neurodynamics-driven Optimization and Control of Intelligent Heating, Ventilation and Air Conditioning Systems
WANG, J. (Principal Investigator / Project Coordinator), LIN, J. Z. (Co-Investigator) & LU, W. Z. (Co-Investigator)
1/01/22 → …
Project: Research