CNN-BICIFG-Attention model for text sentiment classification
Time: 2022-11-10

ZOU B R, WANG Y C, WANG W D, et al. CNN-BICIFG-Attention model for text sentiment classification[J]. Journal of Henan Polytechnic University (Natural Science), 2022, 41(6): 155-162.

doi:10.16186/j.cnki.1673-9787.2021060007

Received:2021/06/02

Revised:2021/07/10

Published:2022/11/25


ZOU Borong1, WANG Yicheng2, WANG Weidong1, HOU Qinghua1, WU Huibin1

1. School of Physics and Electronic Information, Henan Polytechnic University, Jiaozuo 454000, Henan, China; 2. School of Electrical Engineering and Automation, Henan Polytechnic University, Jiaozuo 454000, Henan, China

Abstract: To address the incomplete extraction of complex features from text datasets in Chinese short-text sentiment classification tasks, and the insufficient learning of context in common neural networks, a dual-channel composite network model combined with an attention mechanism was proposed. The corpus was preprocessed to form a text vector matrix. In the two channels, a convolutional neural network layer and a bidirectional coupled input and forget gate (BiCIFG) network layer were used to extract local feature information from the sample vectors and to learn the connections between preceding and following word vectors. Then, an attention mechanism layer was added to each channel to assign weights to text information of different emotional densities, strengthening the influence of key information on sentence-level sentiment classification. Finally, the feature vectors of the two channels were merged. The proposed multi-layer hybrid network model was tested on a crawled set of Jingdong product reviews and the SogouCS dataset; the accuracy reached 93.17% and the F-score reached 93.12%. The results verify the effectiveness of the dual-channel composite model.
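The distinctive component named in the abstract is the coupled input and forget gate (CIFG), an LSTM variant in which the input gate is tied to the forget gate as i_t = 1 − f_t, halving the gate parameters. The paper's implementation is not given here; as a minimal NumPy sketch of one CIFG recurrence step under the standard formulation (all weight names and toy dimensions are illustrative assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cifg_step(x_t, h_prev, c_prev, params):
    """One step of a coupled input-forget gate (CIFG) cell.

    Unlike a standard LSTM, the input gate is not learned separately:
    it is coupled to the forget gate as i_t = 1 - f_t.
    """
    Wf, Uf, bf, Wc, Uc, bc, Wo, Uo, bo = params
    f_t = sigmoid(x_t @ Wf + h_prev @ Uf + bf)       # forget gate
    i_t = 1.0 - f_t                                  # coupled input gate
    c_tilde = np.tanh(x_t @ Wc + h_prev @ Uc + bc)   # candidate cell state
    c_t = f_t * c_prev + i_t * c_tilde               # updated cell state
    o_t = sigmoid(x_t @ Wo + h_prev @ Uo + bo)       # output gate
    h_t = o_t * np.tanh(c_t)                         # updated hidden state
    return h_t, c_t

# Toy dimensions: 4-d word vectors, 3-d hidden state.
rng = np.random.default_rng(0)
d_in, d_hid = 4, 3
shapes = [(d_in, d_hid), (d_hid, d_hid), (d_hid,)] * 3
params = tuple(rng.standard_normal(s) * 0.1 for s in shapes)

h, c = np.zeros(d_hid), np.zeros(d_hid)
x = rng.standard_normal(d_in)
h, c = cifg_step(x, h, c, params)
print(h.shape)  # (3,)
```

A bidirectional CIFG layer, as used in the model, would run this recurrence over the word-vector sequence in both directions and concatenate the two hidden states per time step before the attention layer.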

Key words: sentiment classification; convolutional neural network; bidirectional coupled input and forget gate network; attention mechanism; accuracy; F-score

