
PRILoRA: Pruned and Rank-Increasing Low-Rank Adaptation

Nadav Benedek, Lior Wolf


Abstract
With the proliferation of large pre-trained language models (PLMs), fine-tuning all model parameters becomes increasingly inefficient, particularly when dealing with numerous downstream tasks that entail substantial training and storage costs. Several approaches aimed at achieving parameter-efficient fine-tuning (PEFT) have been proposed. Among them, Low-Rank Adaptation (LoRA) stands out as an archetypal method, incorporating trainable rank decomposition matrices into each target module. Nevertheless, LoRA does not consider the varying importance of each layer. To address these challenges, we introduce PRILoRA, which linearly allocates a different rank for each layer, in an increasing manner, and performs pruning throughout the training process, considering both the temporary magnitude of weights and the accumulated statistics of the input to any given layer. We validate the effectiveness of PRILoRA through extensive experiments on eight GLUE benchmarks, setting a new state of the art.
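The abstract describes two mechanisms: a linearly increasing per-layer rank allocation and a pruning step driven by current weight magnitude combined with accumulated input statistics. The following is a minimal PyTorch sketch of one plausible reading of that description; the function names (linear_rank_schedule, prune_lora_A) and the exact importance score are illustrative assumptions, not the authors' implementation.

```python
import torch


def linear_rank_schedule(num_layers, r_min, r_max):
    """Assumed reading: allocate each layer a rank increasing linearly from r_min to r_max."""
    if num_layers == 1:
        return [r_max]
    return [round(r_min + (r_max - r_min) * i / (num_layers - 1))
            for i in range(num_layers)]


def prune_lora_A(A, input_sq_mean, prune_frac):
    """Zero out the lowest-importance entries of a LoRA A matrix of shape (r, d_in).

    Importance combines the current weight magnitude with accumulated input
    statistics (here a running mean of squared inputs per input dimension) --
    one hypothetical instantiation of the criterion sketched in the abstract.
    """
    importance = A.abs() * input_sq_mean.sqrt()          # shape (r, d_in)
    k = int(prune_frac * importance.numel())
    if k == 0:
        return A
    threshold = importance.flatten().kthvalue(k).values   # k-th smallest score
    return A * (importance > threshold)                    # keep only higher-scoring entries


if __name__ == "__main__":
    ranks = linear_rank_schedule(num_layers=12, r_min=4, r_max=12)
    print("per-layer ranks:", ranks)

    torch.manual_seed(0)
    A = torch.randn(ranks[0], 64)        # LoRA A matrix for the first layer
    input_sq_mean = torch.rand(64)       # accumulated E[x^2] per input dimension
    A_pruned = prune_lora_A(A, input_sq_mean, prune_frac=0.5)
    print("nonzero fraction after pruning:", (A_pruned != 0).float().mean().item())
```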
Anthology ID: 2024.findings-eacl.18
Volume: Findings of the Association for Computational Linguistics: EACL 2024
Month: March
Year: 2024
Address: St. Julian’s, Malta
Editors: Yvette Graham, Matthew Purver
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 252–263
URL: https://aclanthology.org/2024.findings-eacl.18
Cite (ACL): Nadav Benedek and Lior Wolf. 2024. PRILoRA: Pruned and Rank-Increasing Low-Rank Adaptation. In Findings of the Association for Computational Linguistics: EACL 2024, pages 252–263, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal): PRILoRA: Pruned and Rank-Increasing Low-Rank Adaptation (Benedek & Wolf, Findings 2024)
PDF: https://aclanthology.org/2024.findings-eacl.18.pdf
Software: 2024.findings-eacl.18.software.zip
Video: https://aclanthology.org/2024.findings-eacl.18.mp4