Abstract
With the advent of Deep Learning (DL), Natural Language Processing (NLP) has progressed rapidly over the past few decades. Several DL models have been established for relation extraction and outperform traditional Machine Learning (ML) methods. In this paper, we build four DL models to tackle the task: the Piecewise Convolutional Neural Network (PCNN), the Convolutional Neural Network (CNN), the Recurrent Neural Network (RNN), and the Bidirectional Recurrent Neural Network (Bi-RNN). The PCNN outperforms the other three models, achieving an Area Under the Curve (AUC) of 0.154 with just 24 epochs. We also apply several selector mechanisms to improve the models. Our experimental results show that: (1) the attention mechanism is the most compatible selector across all models, although in some cases max pooling performs better; (2) using only word embeddings degrades model performance considerably.
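To make the abstract's setup concrete, the following is a minimal sketch of a PCNN sentence encoder with piecewise max pooling, plus a bag-level selective-attention selector of the kind the abstract refers to. It assumes PyTorch; all class names, hyper-parameters (embedding sizes, filter count, window width), and the segment-mask convention are illustrative choices, not the authors' exact implementation.

```python
# Sketch only: a PCNN encoder (word + position embeddings, convolution,
# piecewise max pooling over the three segments split by the two entities)
# and a selective-attention selector over the sentences in a bag.
import torch
import torch.nn as nn


class PCNNEncoder(nn.Module):
    def __init__(self, vocab_size, word_dim=50, pos_dim=5, max_len=120,
                 num_filters=230, window=3):
        super().__init__()
        # Word embedding plus two position embeddings (relative distance to each entity).
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.pos1_emb = nn.Embedding(2 * max_len, pos_dim)
        self.pos2_emb = nn.Embedding(2 * max_len, pos_dim)
        in_dim = word_dim + 2 * pos_dim
        self.conv = nn.Conv1d(in_dim, num_filters, kernel_size=window, padding=1)

    def forward(self, tokens, pos1, pos2, seg_mask):
        # tokens, pos1, pos2: (B, L); seg_mask: (B, L) with values in {0, 1, 2}
        # marking the three pieces delimited by the two entity mentions.
        x = torch.cat([self.word_emb(tokens),
                       self.pos1_emb(pos1),
                       self.pos2_emb(pos2)], dim=-1)      # (B, L, D)
        h = self.conv(x.transpose(1, 2))                   # (B, F, L)
        pooled = []
        for seg in range(3):
            # Mask out positions outside the current piece, then max-pool it.
            mask = (seg_mask == seg).unsqueeze(1)          # (B, 1, L)
            seg_h = h.masked_fill(~mask, float('-inf'))
            pooled.append(seg_h.max(dim=2).values)         # (B, F)
        return torch.tanh(torch.cat(pooled, dim=-1))       # (B, 3F)


class AttentionSelector(nn.Module):
    """Bag-level selective attention over sentence representations (sketch)."""

    def __init__(self, hidden_dim, num_relations):
        super().__init__()
        self.rel_mat = nn.Parameter(torch.randn(num_relations, hidden_dim))

    def forward(self, sent_reps, rel_id):
        # sent_reps: (num_sentences_in_bag, H); rel_id: relation index.
        query = self.rel_mat[rel_id]                       # (H,)
        scores = torch.softmax(sent_reps @ query, dim=0)   # (num_sentences,)
        return scores @ sent_reps                          # (H,) bag representation
```

Swapping the attention selector for a simple max over the bag's sentence representations gives the max-pooling selector that, per the abstract, occasionally performs better.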
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Zhang, K. et al. (2019). Chinese Agricultural Entity Relation Extraction via Deep Learning. In: Huang, D.S., Huang, Z.K., Hussain, A. (eds) Intelligent Computing Methodologies. ICIC 2019. Lecture Notes in Computer Science, vol 11645. Springer, Cham. https://doi.org/10.1007/978-3-030-26766-7_48
DOI: https://doi.org/10.1007/978-3-030-26766-7_48
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-26765-0
Online ISBN: 978-3-030-26766-7
eBook Packages: Computer Science (R0)