Abstract
We analyze the influence of adversarial training on the loss landscape of machine learning models. To this end, we first provide analytical studies of the properties of adversarial loss functions under different adversarial budgets. We then demonstrate that the adversarial loss landscape is less favorable to optimization due to increased curvature and more scattered gradients. Our conclusions are validated by numerical analyses, which show that training under large adversarial budgets impedes the escape from suboptimal random initializations, causes non-vanishing gradients, and leads to sharper minima. Based on these observations, we show that a periodic adversarial scheduling (PAS) strategy can effectively overcome these challenges, yielding better results than vanilla adversarial training while being much less sensitive to the choice of learning rate.
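The abstract's periodic adversarial scheduling can be illustrated with a minimal sketch of a periodically varying adversarial budget. The cosine shape, the function name `adversarial_budget`, and the parameters `period` and `eps_max` are illustrative assumptions, not the authors' exact schedule:

```python
import math

def adversarial_budget(epoch: int, period: int = 10, eps_max: float = 8 / 255) -> float:
    """Hypothetical periodic schedule for the adversarial budget (epsilon).

    Assumption: the budget is cycled smoothly between 0 and eps_max every
    `period` epochs, so early epochs in each cycle train with weak attacks
    and mid-cycle epochs with the full budget.
    """
    phase = (epoch % period) / period  # position within the current cycle, in [0, 1)
    return eps_max * 0.5 * (1 - math.cos(2 * math.pi * phase))
```

Under this sketch the budget starts each cycle at 0, peaks at `eps_max` mid-cycle, and returns toward 0, so the optimizer alternates between easier and harder loss landscapes rather than facing the full adversarial budget from the start.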
| Original language | English |
|---|---|
| Title of host publication | NeurIPS Proceedings |
| Subtitle of host publication | Advances in Neural Information Processing Systems 33 (NeurIPS 2020) |
| Editors | H. Larochelle, M. Ranzato, R. Hadsell, M.F. Balcan, H. Lin |
| Publisher | Neural Information Processing Systems (NeurIPS) |
| Volume | 33 |
| ISBN (Print) | 9781713829546 |
| Publication status | Published - Dec 2020 |
| Externally published | Yes |
| Event | 34th Conference on Neural Information Processing Systems (NeurIPS 2020) - Virtual, Vancouver, Canada |
| Event duration | 6 Dec 2020 → 12 Dec 2020 |
| Event URL | https://nips.cc/Conferences/2020 |
Conference
| Conference | 34th Conference on Neural Information Processing Systems (NeurIPS 2020) |
|---|---|
| Country | Canada |
| City | Vancouver |
| Period | 6/12/20 → 12/12/20 |
| Internet address | https://nips.cc/Conferences/2020 |