
Top-down Text-Level Discourse Rhetorical Structure Parsing with Bidirectional Representation Learning

  • Regular Paper
Journal of Computer Science and Technology

Abstract

Early studies on discourse rhetorical structure parsing mainly adopted bottom-up approaches, which limit the parsing process to local information. Although current top-down parsers better capture global information and have achieved notable success, local and global information matter to different degrees at different levels of discourse parsing. This paper argues that combining local and global information is a more sensible approach to discourse parsing. To demonstrate this, we introduce a top-down discourse parser with bidirectional representation learning capabilities. Existing corpora on Rhetorical Structure Theory (RST) are known to be quite limited in size, which makes discourse parsing very challenging. To alleviate this problem, we leverage boundary features and a data augmentation strategy to tap the potential of our parser. We evaluate with two methods, and the experiments on the RST-DT corpus show that our parser improves performance primarily through the effective combination of local and global information, with the boundary features and the data augmentation strategy also contributing. Based on gold-standard elementary discourse units (EDUs), our parser significantly outperforms the baseline systems in nuclearity detection, with competitive results on the other three metrics (span, relation, and full). Based on automatically segmented EDUs, our parser still outperforms previous state-of-the-art work.
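At a high level, a top-down discourse parser of this kind builds the rhetorical structure tree by recursively choosing a split boundary inside each span of EDUs. The sketch below is illustrative only and is not the authors' model: `split_score` is a toy stand-in for a learned scorer over span representations (the paper learns these bidirectionally), and all function names are hypothetical.

```python
# Illustrative top-down span-splitting parse over a list of EDUs.
# split_score is a placeholder for a learned pointer/boundary scorer.

def split_score(edus, left, right, k):
    """Toy scorer: prefer the midpoint boundary. A real parser would
    score boundary k with a trained model over span representations."""
    mid = (left + right) / 2
    return -abs(k - mid)

def parse(edus, left=0, right=None):
    """Recursively split the half-open span [left, right) into a binary
    discourse tree. Leaves are EDU indices; internal nodes are pairs
    (left_subtree, right_subtree)."""
    if right is None:
        right = len(edus)
    if right - left == 1:
        return left  # a single EDU is a leaf
    # choose the highest-scoring internal boundary k with left < k < right
    k = max(range(left + 1, right),
            key=lambda k: split_score(edus, left, right, k))
    return (parse(edus, left, k), parse(edus, k, right))

tree = parse(["e1", "e2", "e3", "e4"])
print(tree)  # ((0, 1), (2, 3))
```

A real system would also label each internal node with nuclearity and a rhetorical relation; this sketch covers only the span-structure decisions.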



Author information

Corresponding author

Correspondence to Fang Kong.

Supplementary Information

ESM 1 (PDF 147 kb)


About this article


Cite this article

Zhang, LY., Tan, X., Kong, F. et al. Top-down Text-Level Discourse Rhetorical Structure Parsing with Bidirectional Representation Learning. J. Comput. Sci. Technol. 38, 985–1001 (2023). https://doi.org/10.1007/s11390-022-1167-0


