%0 Conference Proceedings
%T MEDs for PETs: Multilingual Euphemism Disambiguation for Potentially Euphemistic Terms
%A Lee, Patrick
%A Chirino Trujillo, Alain
%A Cuevas Plancarte, Diana
%A Ojo, Olumide
%A Liu, Xinyi
%A Shode, Iyanuoluwa
%A Zhao, Yuan
%A Feldman, Anna
%A Peng, Jing
%Y Graham, Yvette
%Y Purver, Matthew
%S Findings of the Association for Computational Linguistics: EACL 2024
%D 2024
%8 March
%I Association for Computational Linguistics
%C St. Julian’s, Malta
%F lee-etal-2024-meds
%X Euphemisms are found across the world’s languages, making them a universal linguistic phenomenon. As such, euphemistic data may have useful properties for computational tasks across languages. In this study, we explore this premise by training a multilingual transformer model (XLM-RoBERTa) to disambiguate potentially euphemistic terms (PETs) in multilingual and cross-lingual settings. In line with current trends, we demonstrate that zero-shot learning across languages takes place. We also show cases where multilingual models perform better on the task than monolingual models by a statistically significant margin, indicating that multilingual data presents additional opportunities for models to learn about cross-lingual, computational properties of euphemisms. In a follow-up analysis, we focus on universal euphemistic “categories,” such as death and bodily functions, among others. To further understand the nature of the cross-lingual transfer, we test whether cross-lingual data from the same domain is more important than within-language data from other domains.
%U https://aclanthology.org/2024.findings-eacl.59/
%P 875-881