
Text Sentiment Classification Model Based on Fusion of Dual-Channel Features of CNN and BiLSTM

Published: 17 November 2023

Abstract

Text sentiment analysis is an important task in natural language processing (NLP) that aims to determine people's emotional tendency toward a topic or event by analyzing the language and emotion expressed in text. To address the inability of traditional sentiment classification models to fully capture the implicit semantic information in short text comments, a dual-channel sentiment classification model based on CNN and BiLSTM is proposed. An attention mechanism is introduced to dynamically allocate weights, and a dual-channel neural network architecture fusing BiLSTM and CNN is constructed. The local sentiment features extracted by the CNN channel and the global attention-weighted sentiment features extracted by the BiLSTM channel serve as inputs to a feature fusion layer, where a full-text sentiment feature fusion module integrates the feature information to determine sentiment polarity. Experimental results show that, compared with the best-performing Transformer model, the proposed model (CNN-BiLSTM-AFF) improves accuracy, F1 score, and recall on the public dataset senti_weibo_100k by 1.034%, 1.265%, and 1.045%, respectively.
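The abstract does not specify the internals of the feature fusion module, so the following is only a minimal, illustrative sketch of the general idea of attention-weighted fusion of two channel outputs: a "local" feature vector (as a CNN channel might produce) and a "global" feature vector (as a BiLSTM channel might produce) are combined with softmax-normalized attention weights. All function and variable names here are hypothetical and not taken from the paper.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_fuse(local_feat, global_feat, w_local, w_global):
    """Fuse a local feature vector and a global feature vector using
    scalar attention scores, normalized to weights via softmax."""
    a_local, a_global = softmax([w_local, w_global])
    return [a_local * lf + a_global * gf
            for lf, gf in zip(local_feat, global_feat)]

# Toy example: two 4-dimensional channel outputs.
local_feat  = [0.2, 0.8, 0.1, 0.5]   # stand-in for CNN channel (local n-gram cues)
global_feat = [0.6, 0.4, 0.9, 0.3]   # stand-in for BiLSTM channel (global context)
fused = attention_fuse(local_feat, global_feat, w_local=1.0, w_global=1.0)
print(fused)  # equal scores -> element-wise mean: [0.4, 0.6, 0.5, 0.4]
```

In the paper's actual model the attention scores would be learned and the fused representation would feed a classifier that outputs the sentiment polarity; this sketch only shows the weighting-and-summing step.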


Published In

ADMIT '23: Proceedings of the 2023 2nd International Conference on Algorithms, Data Mining, and Information Technology
September 2023
227 pages
ISBN:9798400707629
DOI:10.1145/3625403
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. dual-channel
  2. feature fusion
  3. self-attention
  4. sentiment analysis

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ADMIT 2023
