Meta Continual Learning Revisited: Implicitly Enhancing Online Hessian Approximation via Variance Reduction

Yichen Wu, Long-Kai Huang*, Renzhen Wang, Deyu Meng, Ying Wei*

*Corresponding author for this work

Research output: Refereed conference paper (with host publication), peer-reviewed (RGC 32)

9 Citations (Scopus)

Abstract

Regularization-based methods have so far been among the de facto choices for continual learning. Recent theoretical studies have revealed that these methods ultimately rely on approximating the Hessian matrix of the model weights. However, they suffer from suboptimal trade-offs between knowledge transfer and forgetting because the Hessian estimates remain fixed throughout training. A seemingly parallel strand of Meta-Continual Learning (Meta-CL) algorithms instead enforces alignment between the gradients of previous tasks and those of the current task. In this work, we revisit Meta-CL and, for the first time, bridge it with regularization-based methods. Concretely, Meta-CL implicitly approximates the Hessian in an online manner, which enjoys the benefit of timely adaptation but meanwhile suffers from high variance induced by random memory-buffer sampling. We are thus motivated to combine the best of both worlds by proposing Variance-Reduced Meta-CL (VR-MCL), which achieves a Hessian approximation that is both timely and accurate. Through comprehensive experiments across three datasets and various settings, we consistently observe that VR-MCL outperforms other SOTA methods, further validating its effectiveness. © 2024 12th International Conference on Learning Representations, ICLR 2024. All rights reserved.
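The variance-reduction idea the abstract alludes to can be illustrated with a generic SVRG-style control variate: a noisy gradient computed on a randomly sampled memory batch is corrected by the difference between the batch gradient and the full-memory gradient, both evaluated at a periodically refreshed anchor point. The sketch below is a hypothetical illustration of this general technique, not the paper's VR-MCL algorithm; `grad_fn`, `anchor_params`, and the memory layout are assumptions for the example.

```python
import numpy as np

def vr_gradient(grad_fn, params, batch, memory, anchor_params):
    """SVRG-style variance-reduced gradient estimate (illustrative sketch).

    grad_fn(p, data) returns the mean gradient over `data` at parameters p.
    The estimator is unbiased: in expectation over the sampled batch,
    the two batch terms cancel, leaving the full-memory gradient.
    """
    g_batch = grad_fn(params, batch)                 # noisy gradient at current params
    g_anchor_batch = grad_fn(anchor_params, batch)   # same batch, at the anchor point
    g_anchor_full = grad_fn(anchor_params, memory)   # full-memory gradient at the anchor
    # Control variate: subtract the correlated noise, add back its mean.
    return g_batch - g_anchor_batch + g_anchor_full
```

For a quadratic loss (so the gradient is linear in the parameters), the batch-dependent terms cancel exactly and the estimator recovers the full-memory gradient regardless of which batch was sampled, which is what makes the variance reduction visible even in this toy setting.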
Original language: English
Title of host publication: The Twelfth International Conference on Learning Representations
Subtitle of host publication: ICLR 2024
Publisher: International Conference on Learning Representations, ICLR
Publication status: Published - 2024
Event: 12th International Conference on Learning Representations (ICLR 2024), Messe Wien Exhibition and Congress Center, Vienna, Austria
Duration: 7 May 2024 – 11 May 2024
https://iclr.cc/Conferences/2024
https://openreview.net/group?id=ICLR.cc/2024/Conference

Publication series

Name: International Conference on Learning Representations, ICLR

Conference

Conference: 12th International Conference on Learning Representations (ICLR 2024)
Country/Territory: Austria
City: Vienna
Period: 7/05/24 – 11/05/24

Bibliographical note

Research Unit(s) information for this publication is provided by the author(s) concerned.
