
ADEPT: Adapter-based Efficient Prompt Tuning Approach for Language Models

Aditya Shah, Surendrabikram Thapa, Aneesh Jain, Lifu Huang


Anthology ID:
2023.sustainlp-1.8
Volume:
Proceedings of The Fourth Workshop on Simple and Efficient Natural Language Processing (SustaiNLP)
Month:
July
Year:
2023
Address:
Toronto, Canada (Hybrid)
Editors:
Nafise Sadat Moosavi, Iryna Gurevych, Yufang Hou, Gyuwan Kim, Young Jin Kim, Tal Schuster, Ameeta Agrawal
Venue:
sustainlp
Publisher:
Association for Computational Linguistics
Pages:
121–128
URL:
https://aclanthology.org/2023.sustainlp-1.8
DOI:
10.18653/v1/2023.sustainlp-1.8
Cite (ACL):
Aditya Shah, Surendrabikram Thapa, Aneesh Jain, and Lifu Huang. 2023. ADEPT: Adapter-based Efficient Prompt Tuning Approach for Language Models. In Proceedings of The Fourth Workshop on Simple and Efficient Natural Language Processing (SustaiNLP), pages 121–128, Toronto, Canada (Hybrid). Association for Computational Linguistics.
Cite (Informal):
ADEPT: Adapter-based Efficient Prompt Tuning Approach for Language Models (Shah et al., sustainlp 2023)
PDF:
https://aclanthology.org/2023.sustainlp-1.8.pdf
Video:
https://aclanthology.org/2023.sustainlp-1.8.mp4