
Syllable-level Neural Language Model for Agglutinative Language

Seunghak Yu, Nilesh Kulkarni, Haejun Lee, Jihie Kim


Abstract
We introduce a novel method to diminish the out-of-vocabulary problem by introducing an embedding method that leverages the agglutinative property of the language. We propose additional embeddings derived from syllables and morphemes to improve the performance of the language model. We apply this method to input prediction tasks and achieve state-of-the-art performance in terms of Key Stroke Saving (KSS) compared to existing device input prediction methods.
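The abstract's core idea — building word representations from sub-word units so unseen surface forms still get embeddings — can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact architecture: the syllable inventory, embedding dimension, and the averaging scheme are all assumptions made for the example.

```python
import numpy as np

# Hypothetical sketch: compose a word embedding from syllable embeddings.
# Vocabulary, dimension, and averaging are illustrative assumptions,
# not the paper's exact method.

rng = np.random.default_rng(0)
EMB_DIM = 8

# Tiny syllable inventory with randomly initialized embeddings.
syllable_vocab = ["학", "교", "에", "서"]
syllable_emb = {s: rng.standard_normal(EMB_DIM) for s in syllable_vocab}

def word_embedding(word):
    """Average the embeddings of a word's known syllables.

    Any word composed of known syllables receives a vector, so unseen
    (out-of-vocabulary) surface forms are still representable.
    """
    vecs = [syllable_emb[s] for s in word if s in syllable_emb]
    return np.mean(vecs, axis=0)

# "학교에서" ("at school") decomposes into four syllables; even if this
# exact agglutinated form never appeared in training, its embedding
# is well defined.
vec = word_embedding("학교에서")
print(vec.shape)  # (8,)
```

Because agglutinative languages like Korean generate many inflected forms from a small syllable inventory, composing word vectors from syllables keeps the effective vocabulary small while still covering novel word forms.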
Anthology ID:
W17-4113
Volume:
Proceedings of the First Workshop on Subword and Character Level Models in NLP
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Manaal Faruqui, Hinrich Schuetze, Isabel Trancoso, Yadollah Yaghoobzadeh
Venue:
SCLeM
Publisher:
Association for Computational Linguistics
Pages:
92–96
URL:
https://aclanthology.org/W17-4113
DOI:
10.18653/v1/W17-4113
Cite (ACL):
Seunghak Yu, Nilesh Kulkarni, Haejun Lee, and Jihie Kim. 2017. Syllable-level Neural Language Model for Agglutinative Language. In Proceedings of the First Workshop on Subword and Character Level Models in NLP, pages 92–96, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Syllable-level Neural Language Model for Agglutinative Language (Yu et al., SCLeM 2017)
PDF:
https://aclanthology.org/W17-4113.pdf
Attachment:
W17-4113.Attachment.zip