
DisSent: Learning Sentence Representations from Explicit Discourse Relations

Allen Nie, Erin Bennett, Noah Goodman


Abstract
Learning effective representations of sentences is one of the core missions of natural language understanding. Existing models either train on a vast amount of text, or require costly, manually curated sentence relation datasets. We show that with dependency parsing and rule-based rubrics, we can curate a high quality sentence relation task by leveraging explicit discourse relations. We show that our curated dataset provides an excellent signal for learning vector representations of sentence meaning, representing relations that can only be determined when the meanings of two sentences are combined. We demonstrate that the automatically curated corpus allows a bidirectional LSTM sentence encoder to yield high quality sentence embeddings and can serve as a supervised fine-tuning dataset for larger models such as BERT. Our fixed sentence embeddings achieve high performance on a variety of transfer tasks, including SentEval, and we achieve state-of-the-art results on Penn Discourse Treebank’s implicit relation prediction task.
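As a rough illustration of the curation procedure sketched in the abstract (dependency parsing plus rule-based rubrics over explicit discourse markers), the following Python snippet extracts (S1, marker, S2) training triples from sentences containing a subordinating connective. The marker list, the spaCy model, and the single dependency pattern used here are simplifying assumptions for illustration only; they are not the authors' DisExtract rules.

import spacy

nlp = spacy.load("en_core_web_sm")

# Hypothetical subset of explicit discourse markers used as labels.
MARKERS = {"because", "but", "although", "so", "when", "while", "if", "then"}

def extract_pair(sentence):
    """Return (s1, marker, s2) if the sentence links two clauses with an
    explicit subordinating discourse marker, else None."""
    doc = nlp(sentence)
    for tok in doc:
        if tok.text.lower() not in MARKERS:
            continue
        # Rule for one subordinating pattern: the marker attaches to the verb
        # of an adverbial clause, e.g. "I stayed home [because it was raining]".
        if tok.dep_ == "mark" and tok.head.dep_ == "advcl":
            clause = {t.i for t in tok.head.subtree}  # tokens of the S2 clause
            s2 = " ".join(t.text for t in doc
                          if t.i in clause and t.i != tok.i)
            s1 = " ".join(t.text for t in doc
                          if t.i not in clause and not t.is_punct)
            if s1 and s2:
                return s1, tok.text.lower(), s2
    return None

# e.g. extract_pair("I stayed home because it was raining.")
# may yield ("I stayed home", "because", "it was raining").

Each extracted triple becomes a supervised example in which the two clauses are encoded separately and a classifier predicts the withheld marker, which is the training signal described in the abstract.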
Anthology ID:
P19-1442
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4497–4510
URL:
https://aclanthology.org/P19-1442
DOI:
10.18653/v1/P19-1442
Cite (ACL):
Allen Nie, Erin Bennett, and Noah Goodman. 2019. DisSent: Learning Sentence Representations from Explicit Discourse Relations. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 4497–4510, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
DisSent: Learning Sentence Representations from Explicit Discourse Relations (Nie et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1442.pdf
Code
 windweller/DisExtract
Data
BookCorpus, MultiNLI, SNLI