DEeR: Deviation Eliminating and Noise Regulating for Privacy-preserving Federated Low-rank Adaptation

Meilu Zhu, Axiu Mao, Jun Liu*, Yixuan Yuan*

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

1 Citation (Scopus)

Abstract

Integrating low-rank adaptation (LoRA) with federated learning (FL) has received widespread attention recently, aiming to adapt pretrained foundation models (FMs) to downstream medical tasks via privacy-preserving decentralized training. However, owing to the direct combination of LoRA and FL, current methods generally suffer from two problems: aggregation deviation and a differential privacy (DP) noise amplification effect. To address these problems, we propose a novel privacy-preserving federated finetuning framework called Deviation Eliminating and Noise Regulating (DEeR). Specifically, we first theoretically prove that the necessary condition for eliminating aggregation deviation is guaranteeing the equivalence of LoRA parameters across clients. Based on this theoretical insight, a deviation eliminator is designed that uses an alternating minimization algorithm to iteratively optimize the zero-initialized and non-zero-initialized parameter matrices of LoRA, ensuring that the aggregation deviation remains zero throughout training. Furthermore, we conduct an in-depth analysis of the noise amplification effect and find that it is mainly caused by the "linear relationship" between DP noise and LoRA parameters. To suppress this effect, we propose a noise regulator that exploits two regulator factors to decouple the relationship between DP and LoRA, thereby achieving robust privacy protection and excellent finetuning performance. Additionally, we perform comprehensive ablation experiments to verify the effectiveness of the deviation eliminator and noise regulator. DEeR outperforms state-of-the-art approaches on public medical datasets. The code is available at https://github.com/CUHK-AIM-Group/DEeR. © 2024 IEEE.
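The two phenomena in the abstract can be illustrated with a small numerical sketch. The snippet below is a toy in plain NumPy, not the paper's implementation: all dimensions, initializations, and the least-squares update rule are assumptions for illustration. It alternates updates between LoRA's zero-initialized matrix B and non-zero-initialized matrix A to fit a rank-r weight update, then shows how Gaussian noise added to both factors is amplified by the product B·A through the noise-parameter cross terms.

```python
import numpy as np

# Toy sketch (assumed setup, not DEeR's code): LoRA factorizes a weight
# update as delta_W ~= B @ A, with B zero-initialized and A non-zero-initialized.
rng = np.random.default_rng(0)
d_out, d_in, r = 16, 12, 4
delta_W = rng.standard_normal((d_out, r)) @ rng.standard_normal((r, d_in))  # rank-r target

A = 0.01 * rng.standard_normal((r, d_in))  # non-zero init (LoRA convention)
B = np.zeros((d_out, r))                   # zero init, so B @ A starts at 0

err0 = np.linalg.norm(delta_W - B @ A)
for _ in range(5):
    B = delta_W @ np.linalg.pinv(A)  # fix A, least-squares solve for B
    A = np.linalg.pinv(B) @ delta_W  # fix B, least-squares solve for A
err1 = np.linalg.norm(delta_W - B @ A)
assert err1 < err0  # alternating minimization drives the residual down

# DP noise amplification: perturbing both factors with Gaussian noise gives
# (B + nB) @ (A + nA) = B@A + nB@A + B@nA + nB@nA, so the effective noise in
# the product scales with the magnitudes of A and B ("linear relationship").
sigma = 0.1
nB = sigma * rng.standard_normal(B.shape)
nA = sigma * rng.standard_normal(A.shape)
effective_noise = (B + nB) @ (A + nA) - B @ A
print(np.linalg.norm(effective_noise))  # far larger than sigma alone
```

The closed-form pseudoinverse solves stand in for the gradient-based updates a real finetuning loop would use; the point is only that fitting B and A in alternation converges, while naive per-factor DP noise is magnified by the product.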
Original language: English
Pages (from-to): 1783-1795
Number of pages: 13
Journal: IEEE Transactions on Medical Imaging
Volume: 44
Issue number: 4
Online published: 19 Dec 2024
DOIs
Publication status: Published - Apr 2025

Funding

This work was supported by the Hong Kong Research Grants Council under Grants 11212321, 11217922, and ECS-21212720; by the HKSAR Innovation and Technology Commission (ITC) under ITF Projects MHP/109/19 and ITS/229/22; and by the Science, Technology and Innovation Committee of Shenzhen under Grant SGDX20210823104001011.

Research Keywords

  • Federated Learning
  • Foundation Models
  • Low-rank Adaptation
  • Parameter-efficient Tuning

