SBoRA: Low-Rank Adaptation with Regional Weight Updates

Lai-Man Po*, Yuyang Liu, Haoxuan Wu, Tianqi Zhang, Wing-Yin Yu, Zhuohan Wang, Zeyu Jiang, Kun Li

*Corresponding author for this work

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

Abstract

This paper introduces Standard Basis LoRA (SBoRA), a novel parameter-efficient fine-tuning approach for Large Language Models that builds upon the pioneering works of Low-Rank Adaptation (LoRA) and Orthogonal Adaptation. SBoRA either halves the number of trainable parameters or doubles the rank at a similar parameter count to LoRA, while improving learning performance. By using orthogonal standard basis vectors to initialize one of the low-rank matrices (either A or B), SBoRA enables regional weight updates and memory-efficient fine-tuning. This yields two variants, SBoRA-FA and SBoRA-FB, in which only one of the matrices is updated, producing a sparse update matrix ΔW with predominantly zero rows or columns. Consequently, most of the fine-tuned model's weights (W0 + ΔW) remain unchanged from the pre-trained weights, akin to the modular organization of the human brain, which efficiently adapts to new tasks. Our empirical results demonstrate the superiority of SBoRA-FA over LoRA on various fine-tuning tasks, including commonsense reasoning and arithmetic reasoning. Furthermore, we evaluate the effectiveness of QSBoRA on quantized LLaMA models of varying scales, highlighting its potential for efficient adaptation to new tasks. Code is available at https://github.com/cityuhkai/SBoRA. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2026.
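For context, the following is a minimal PyTorch sketch of the SBoRA-FA idea as described in the abstract: the A matrix is frozen to standard basis (one-hot) rows so that only B is trained, and the update ΔW = BA touches just r columns of the weight matrix. This is not the authors' released implementation (see the GitHub link above); the class name SBoRAFALinear and the random choice of basis indices are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SBoRAFALinear(nn.Module):
    """Sketch: wraps a frozen nn.Linear with an SBoRA-FA style update."""

    def __init__(self, base: nn.Linear, r: int):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # pre-trained weights W0 stay fixed

        # A consists of r standard basis rows of the identity matrix.
        # The random choice of which r indices to use is an assumption
        # here; only the use of one-hot rows comes from the abstract.
        idx = torch.randperm(base.in_features)[:r]
        self.register_buffer("idx", idx)

        # B is the only trainable matrix; zero-init so that ΔW starts at 0.
        self.B = nn.Parameter(torch.zeros(base.out_features, r))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Because A's rows are standard basis vectors, A @ x reduces to
        # selecting r input coordinates, so ΔW = B @ A is nonzero in only
        # r columns of the weight matrix (a "regional" update).
        return self.base(x) + x[..., self.idx] @ self.B.t()
```

Since A merely selects coordinates, the adapter needs no matmul with A and no optimizer state for it, which is where the halved trainable-parameter count relative to LoRA comes from; SBoRA-FB is the mirror case, with one-hot columns in B and only A trained, giving mostly zero rows in ΔW.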
Original language: English
Title of host publication: Neural Information Processing - 31st International Conference, ICONIP 2024, Proceedings, Part XIII
Editors: Mufti Mahmud, Maryam Doborjeh, Kevin Wong, Andrew Chi Sing Leung, Zohreh Doborjeh, M. Tanveer
Publisher: Springer Singapore
Pages: 387-401
Number of pages: 15
ISBN (Electronic): 9789819670086
ISBN (Print): 9789819670079
DOIs
Publication status: Published - 2026
Event: 31st International Conference on Neural Information Processing (ICONIP 2024) - Auckland University of Technology, Auckland, New Zealand
Duration: 2 Dec 2024 - 6 Dec 2024

Publication series

Name: Communications in Computer and Information Science
Volume: 2294
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937

Conference

Conference: 31st International Conference on Neural Information Processing (ICONIP 2024)
Place: New Zealand
City: Auckland
Period: 2/12/24 - 6/12/24

Research Keywords

  • Large Language Models
  • LoRA
  • Parameter-Efficient Fine-Tuning
