
Kronecker Decomposition for GPT Compression

Ali Edalati, Marzieh Tahaei, Ahmad Rashid, Vahid Nia, James Clark, Mehdi Rezagholizadeh


Abstract
GPT is an auto-regressive, Transformer-based pre-trained language model that has attracted considerable attention in the natural language processing (NLP) domain. The success of GPT is mostly attributed to its pre-training on a huge amount of data and its large number of parameters. Despite its superior performance, this overparameterization can be prohibitive for deploying the model on devices with limited computational power or memory. The problem can be mitigated with model compression techniques; however, compressing GPT models has received little attention in the literature. In this work, we use Kronecker decomposition to compress the linear mappings of the GPT-2 model. Our Kronecker GPT-2 model (KnGPT2) is initialized from a Kronecker-decomposed version of GPT-2 and then undergoes a very light pre-training on only a small portion of the training data with intermediate-layer knowledge distillation (ILKD). Finally, KnGPT2 is fine-tuned on downstream tasks, again using ILKD. We evaluate our model on both language modeling and the General Language Understanding Evaluation (GLUE) benchmark and show that, with more efficient pre-training and a similar number of parameters, our KnGPT2 significantly outperforms the existing DistilGPT2 model.
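
As a rough illustration of the core idea (not the authors' implementation), the sketch below replaces a dense linear layer with one whose weight matrix is the Kronecker product of two much smaller factors. The class name KroneckerLinear, the factor shapes, and the random initialization are assumptions made for illustration only; the paper instead initializes the factors from a Kronecker decomposition of the pretrained GPT-2 weights and then applies light pre-training with ILKD, which this sketch omits.

import torch
import torch.nn as nn

class KroneckerLinear(nn.Module):
    """Linear layer whose weight is the Kronecker product W = A kron B
    of two small factors (illustrative sketch, hypothetical names/shapes)."""

    def __init__(self, in_features, out_features, a_shape, b_shape):
        super().__init__()
        # Factor shapes must multiply to the full weight shape:
        #   out_features = a_shape[0] * b_shape[0]
        #   in_features  = a_shape[1] * b_shape[1]
        assert a_shape[0] * b_shape[0] == out_features
        assert a_shape[1] * b_shape[1] == in_features
        self.A = nn.Parameter(torch.randn(*a_shape) * 0.02)
        self.B = nn.Parameter(torch.randn(*b_shape) * 0.02)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        # Materialize W = A kron B; the parameter count is |A| + |B|
        # instead of in_features * out_features. (One could also avoid
        # materializing W by using the identity (A kron B) vec(X) = vec(B X A^T).)
        w = torch.kron(self.A, self.B)
        return x @ w.t() + self.bias

# Example: compress a 768x768 GPT-2 projection into two small factors.
layer = KroneckerLinear(768, 768, a_shape=(12, 12), b_shape=(64, 64))
y = layer(torch.randn(4, 768))
print(y.shape)  # torch.Size([4, 768])

With these (assumed) factor shapes, the layer stores 12*12 + 64*64 = 4,240 parameters in place of 768*768 = 589,824, which is where the compression comes from.
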
Anthology ID:
2022.acl-short.24
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
219–226
Language:
URL:
https://aclanthology.org/2022.acl-short.24
DOI:
10.18653/v1/2022.acl-short.24
Bibkey:
Cite (ACL):
Ali Edalati, Marzieh Tahaei, Ahmad Rashid, Vahid Nia, James Clark, and Mehdi Rezagholizadeh. 2022. Kronecker Decomposition for GPT Compression. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 219–226, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Kronecker Decomposition for GPT Compression (Edalati et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-short.24.pdf
Video:
https://aclanthology.org/2022.acl-short.24.mp4
Data
GLUE, OpenWebText, QNLI, WebText, WikiText-103, WikiText-2