
News classifications based on CBA-PreambleCNN Model

Published: 15 March 2023

Abstract

To address the insufficient information extraction and poor classification performance of a single deep learning model, this paper proposes a hybrid multi-neural-network model, CBOW-BiLSTM-Attention-PreambleCNN (CBA-PreambleCNN for short, where PreambleCNN is the name of an improved TextCNN). The model uses Word2Vec as the word embedding layer to obtain vector representations of words, feeds them into a Bidirectional Long Short-Term Memory (BiLSTM) network to capture the global information of the text, and then applies an Attention mechanism to assign different weights to words. Finally, the weighted representations are fed into the improved Text Convolutional Neural Network (a TextCNN fused with preceding information, named PreambleCNN) to extract topic features, and the resulting feature vectors are passed to a softmax function for classification. In comparisons with other models, the proposed model achieves good classification performance, reaching 93.47% and 84.65% accuracy on the Sina news dataset and the Sohu news dataset, respectively. To some extent, it alleviates the problem of insufficient information being captured by a single model.
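The pipeline described in the abstract (Word2Vec embeddings, then BiLSTM for global context, then attention re-weighting, then a TextCNN stage, then softmax classification) can be sketched in PyTorch as below. This is a minimal illustrative sketch, not the authors' implementation: all layer sizes, kernel sizes, and the way attention weights are fused into the CNN stage are assumptions, and the `nn.Embedding` layer stands in for pretrained Word2Vec/CBOW vectors.

```python
import torch
import torch.nn as nn


class CBAPreambleCNN(nn.Module):
    """Hedged sketch of the CBA-PreambleCNN pipeline from the abstract:
    embedding -> BiLSTM -> attention -> CNN over attended features -> classifier.
    Layer sizes and the fusion scheme are illustrative assumptions."""

    def __init__(self, vocab_size=1000, embed_dim=64, hidden=64,
                 num_classes=10, kernel_sizes=(2, 3, 4), channels=32):
        super().__init__()
        # Stand-in for pretrained Word2Vec (CBOW) word vectors.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM captures global (left and right) context.
        self.bilstm = nn.LSTM(embed_dim, hidden,
                              batch_first=True, bidirectional=True)
        # One scalar attention score per token position.
        self.attn = nn.Linear(2 * hidden, 1)
        # Multi-width convolutions, in the spirit of TextCNN.
        self.convs = nn.ModuleList(
            nn.Conv1d(2 * hidden, channels, k) for k in kernel_sizes)
        self.fc = nn.Linear(channels * len(kernel_sizes), num_classes)

    def forward(self, tokens):                  # tokens: (batch, seq_len)
        x = self.embed(tokens)                  # (batch, seq, embed_dim)
        h, _ = self.bilstm(x)                   # (batch, seq, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)  # (batch, seq, 1) weights
        h = h * w                               # re-weight tokens before CNN
        h = h.transpose(1, 2)                   # (batch, 2*hidden, seq)
        # Convolve at each width, then global max-pool over time.
        feats = [torch.relu(c(h)).max(dim=2).values for c in self.convs]
        # Softmax / cross-entropy is applied to these logits at training time.
        return self.fc(torch.cat(feats, dim=1))


model = CBAPreambleCNN()
logits = model(torch.randint(0, 1000, (4, 20)))  # batch of 4, 20 tokens each
print(logits.shape)
```

A forward pass on a batch of 4 sequences of 20 token ids yields a `(4, 10)` logits tensor, one score per class.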



Published In

EITCE '22: Proceedings of the 2022 6th International Conference on Electronic Information Technology and Computer Engineering
October 2022
1999 pages
ISBN:9781450397148
DOI:10.1145/3573428
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  • Attention
  • Convolutional Neural Network
  • Recurrent Neural Network
  • Text classification
  • Word embedding

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

EITCE 2022

Acceptance Rates

Overall Acceptance Rate 508 of 972 submissions, 52%
