Mixture of Adversarial LoRAs: Boosting Robust Generalization in Meta-tuning

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review


Author(s)

Yang, Xu; Liu, Chen; Wei, Ying

Related Research Unit(s)

Detail(s)

Original language: English
Title of host publication: NeurIPS 2024
Subtitle of host publication: Proceedings of the 38th Annual Conference on Neural Information Processing Systems
Publisher: Neural Information Processing Systems (NeurIPS)
Publication status: Online published - 26 Sept 2024

Conference

Title: 38th Annual Conference on Neural Information Processing Systems (NeurIPS 2024)
Location: Vancouver Convention Center
Place: Canada
City: Vancouver
Period: 10 - 15 December 2024

Abstract

This paper introduces AMT, an Adversarial Meta-Tuning methodology, to boost the robust generalization of pre-trained models in out-of-domain (OOD) few-shot learning. To address the challenge of transferring knowledge from source domains to unseen target domains, we construct a robust LoRAPool by meta-tuning LoRAs with double perturbations, on both the inputs and the singular values and vectors, at varying robustness levels. On top of that, we introduce a simple yet effective test-time merging mechanism that adaptively merges discriminative LoRAs for test-time task customization. Extensive evaluations demonstrate that AMT brings substantial improvements over previous state-of-the-art methods across a range of OOD few-shot image classification tasks on three benchmarks, confirming the effectiveness of our approach in boosting the robust generalization of pre-trained models.
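To give a concrete sense of the test-time merging idea described above, the sketch below shows a generic weighted combination of low-rank (LoRA) updates. This is an illustrative assumption, not the paper's actual implementation: the pool contents, the per-adapter scores, and the softmax merge rule are all hypothetical stand-ins for whatever selection mechanism AMT uses.

```python
# Illustrative sketch (NOT the paper's implementation): merging a pool of
# LoRA adapters into a single weight update via a weighted combination.
import numpy as np

rng = np.random.default_rng(0)
d, k, r = 8, 8, 2  # frozen weight shape d x k, LoRA rank r

# A pool of LoRA adapters; each is a low-rank pair (A, B) with delta_W = B @ A.
pool = [(rng.normal(size=(r, k)), rng.normal(size=(d, r))) for _ in range(3)]

# Hypothetical per-adapter merge scores (e.g. task affinity estimated at
# test time); normalized with a softmax into merge weights.
scores = np.array([2.0, 0.5, 1.0])
weights = np.exp(scores) / np.exp(scores).sum()

# Merge: weighted sum of low-rank updates applied to the frozen weight W0.
W0 = rng.normal(size=(d, k))
delta = sum(w * (B @ A) for w, (A, B) in zip(weights, pool))
W_merged = W0 + delta

assert W_merged.shape == (d, k)
```

Because each LoRA update is low-rank, the merged update costs only a few small matrix products per adapter, which is what makes per-task customization at test time cheap.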

Citation Format(s)

Mixture of Adversarial LoRAs: Boosting Robust Generalization in Meta-tuning. / Yang, Xu; Liu, Chen; Wei, Ying.
NeurIPS 2024: Proceedings of the 38th Annual Conference on Neural Information Processing Systems. Neural Information Processing Systems (NeurIPS), 2024.
