
Benefiting from Language Similarity in the Multilingual MT Training: Case Study of Indonesian and Malaysian

Alberto Poncelas, Johanes Effendi


Abstract
The development of machine translation (MT) has been successful in breaking the language barrier for the world’s top 10-20 languages. For the remaining languages, however, delivering acceptable translation quality is still a challenge due to limited resources. To tackle this problem, most studies focus on augmenting data while overlooking the fact that we can borrow high-quality natural data from a closely related language. In this work, we propose an MT model training strategy that increases the number of language directions as a means of augmentation in a multilingual setting. Our experimental results using Indonesian and Malaysian with a state-of-the-art MT model showcase the effectiveness and robustness of our method.
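To make the training strategy concrete, the sketch below shows one common way to combine several translation directions into a single multilingual training set: prepending a target-language token to each source sentence, in the style of Johnson et al. (2017). This is an illustrative assumption, not the paper’s exact implementation; the toy sentence pairs, file-free setup, and the `<2xx>` tag format are hypothetical, while the idea of adding directions between the closely related Indonesian (id) and Malaysian (ms) follows the abstract.

```python
# Hypothetical sketch: build a multilingual MT training set by merging
# several language directions, marking each example with a target-language
# tag so one model can learn all directions jointly.

from typing import Iterable, List, Tuple

def tag_direction(pairs: Iterable[Tuple[str, str]], tgt_lang: str) -> List[Tuple[str, str]]:
    """Prepend a target-language token (e.g. <2id>) to every source sentence."""
    tag = f"<2{tgt_lang}>"
    return [(f"{tag} {src}", tgt) for src, tgt in pairs]

# Toy corpora for illustration; in practice these would come from a mined
# parallel corpus such as CCMatrix (listed under Data below).
en_id = [("Good morning.", "Selamat pagi.")]
en_ms = [("Good morning.", "Selamat pagi.")]
id_ms = [("Terima kasih banyak.", "Terima kasih banyak.")]

train = (
    tag_direction(en_id, "id")
    + tag_direction(en_ms, "ms")
    # The extra directions between the closely related pair act as
    # high-quality natural augmentation, per the abstract's argument.
    + tag_direction(id_ms, "ms")
    + tag_direction([(tgt, src) for src, tgt in id_ms], "id")
)

for src, tgt in train:
    print(f"{src}\t{tgt}")
```

The design choice here is that adding id↔ms directions costs no synthetic data generation: because the two languages are highly similar, their parallel data transfers almost directly, which is the effect the paper exploits.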
Anthology ID:
2022.loresmt-1.11
Volume:
Proceedings of the Fifth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2022)
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Atul Kr. Ojha, Chao-Hong Liu, Ekaterina Vylomova, Jade Abbott, Jonathan Washington, Nathaniel Oco, Tommi A Pirinen, Valentin Malykh, Varvara Logacheva, Xiaobing Zhao
Venue:
LoResMT
Publisher:
Association for Computational Linguistics
Pages:
84–92
URL:
https://aclanthology.org/2022.loresmt-1.11
Cite (ACL):
Alberto Poncelas and Johanes Effendi. 2022. Benefiting from Language Similarity in the Multilingual MT Training: Case Study of Indonesian and Malaysian. In Proceedings of the Fifth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2022), pages 84–92, Gyeongju, Republic of Korea. Association for Computational Linguistics.
Cite (Informal):
Benefiting from Language Similarity in the Multilingual MT Training: Case Study of Indonesian and Malaysian (Poncelas & Effendi, LoResMT 2022)
PDF:
https://aclanthology.org/2022.loresmt-1.11.pdf
Data
CCMatrix