
Unpacking Tokenization: Evaluating Text Compression and its Correlation with Model Performance

Omer Goldman, Avi Caciularu, Matan Eyal, Kris Cao, Idan Szpektor, Reut Tsarfaty


Abstract
Despite compression being the cornerstone of BPE, the most common tokenization algorithm, its importance in the tokenization process is still unclear. In this paper, we argue for the theoretical importance of compression, which can be viewed as 0-gram language modeling where equal probability is assigned to all tokens. We also demonstrate the empirical importance of compression for the downstream success of pre-trained language models. We control the compression ability of several BPE tokenizers by varying the number of documents available during their training: from 1 million documents down to a character-based tokenizer, equivalent to no training data at all. We then pre-train English language models based on those tokenizers and fine-tune them on several tasks. We show that there is a correlation between tokenizers’ compression and models’ downstream performance, suggesting that compression is a reliable intrinsic indicator of tokenization quality. These correlations are more pronounced for generation tasks (over classification) and for smaller models (over larger ones). We replicated a representative part of our experiments on Turkish and found similar results, confirming that our findings hold for languages with typological characteristics dissimilar to English. We conclude that building better-compressing tokenizers is a fruitful avenue for further research and for improving overall model performance.
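To make the experimental setup concrete, the following is a minimal sketch (not the authors' exact pipeline) of how one might train BPE tokenizers on corpora of different sizes and measure their compression on held-out text, here taken as characters per token (higher means better compression). It assumes the HuggingFace `tokenizers` library; the corpus, vocabulary size, and document counts are placeholders for illustration only.

```python
# Sketch: compare compression of BPE tokenizers trained on varying amounts of data.
# Assumptions (not from the paper): HuggingFace `tokenizers`, whitespace
# pre-tokenization, a toy corpus, and vocab_size=32_000.

from tokenizers import Tokenizer, models, pre_tokenizers, trainers


def train_bpe(docs, vocab_size=32_000):
    """Train a BPE tokenizer from an iterator of raw documents."""
    tok = Tokenizer(models.BPE(unk_token="[UNK]"))
    tok.pre_tokenizer = pre_tokenizers.Whitespace()
    trainer = trainers.BpeTrainer(vocab_size=vocab_size, special_tokens=["[UNK]"])
    tok.train_from_iterator(docs, trainer=trainer)
    return tok


def chars_per_token(tok, held_out):
    """Compression proxy: total characters divided by total tokens."""
    n_chars = sum(len(doc) for doc in held_out)
    n_tokens = sum(len(tok.encode(doc).ids) for doc in held_out)
    return n_chars / n_tokens


# Placeholder data; in the paper, training-set sizes range from ~1M documents
# down to none (equivalent to a character-level tokenizer).
corpus = ["the quick brown fox jumps over the lazy dog"] * 10_000
held_out = ["a different set of sentences to evaluate compression on"] * 100

for n_docs in (100, 1_000, 10_000):
    tok = train_bpe(corpus[:n_docs])
    print(n_docs, round(chars_per_token(tok, held_out), 2))
```

In the paper's framing, each such tokenizer would then be used to pre-train a language model, and the tokenizer's compression would be correlated with that model's downstream fine-tuning performance.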
Anthology ID: 2024.findings-acl.134
Volume: Findings of the Association for Computational Linguistics: ACL 2024
Month: August
Year: 2024
Address: Bangkok, Thailand
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 2274–2286
URL: https://aclanthology.org/2024.findings-acl.134
DOI: 10.18653/v1/2024.findings-acl.134
Cite (ACL): Omer Goldman, Avi Caciularu, Matan Eyal, Kris Cao, Idan Szpektor, and Reut Tsarfaty. 2024. Unpacking Tokenization: Evaluating Text Compression and its Correlation with Model Performance. In Findings of the Association for Computational Linguistics: ACL 2024, pages 2274–2286, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal): Unpacking Tokenization: Evaluating Text Compression and its Correlation with Model Performance (Goldman et al., Findings 2024)
PDF: https://aclanthology.org/2024.findings-acl.134.pdf