Bi-GRU Sentiment Classification for Chinese Based on Grammar Rules and BERT
- https://doi.org/10.2991/ijcis.d.200423.001
- Sentiment classification, Grammar rules, BERT, Bi-GRU
Sentiment classification is a fundamental task in NLP that aims to predict the sentiment polarity of a given text. Recent research has shown great interest in modeling Chinese sentiment classification. However, the complexity of Chinese grammar causes existing Chinese sentiment classification models to perform poorly. To address this problem, we propose a sentiment classification method based on grammar rules and bidirectional encoder representations from transformers (BERT). We first preprocess the data with a BERT model. We then combine Chinese grammar rules with a bidirectional gated recurrent unit (Bi-GRU) network in the form of constraints, simulating linguistic functions at the sentence level by regularizing the outputs at adjacent positions. Extensive experiments on two public datasets demonstrate the effectiveness of the proposed method, and our experimental findings provide new insights for the future development of Chinese sentiment classification.
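To make the Bi-GRU component of the pipeline concrete, the following is a minimal sketch of a bidirectional GRU encoder run over pretrained token embeddings (such as BERT outputs). It is not the authors' implementation: the BERT preprocessing and the grammar-rule constraints are omitted, and all parameter shapes, dimensions, and helper names (`gru_step`, `bi_gru`) are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, W, U, b):
    """One GRU step. W, U, b stack the update/reset/candidate parameters
    along their first axis: W is (3, hidden, in_dim), U is (3, hidden, hidden),
    b is (3, hidden)."""
    z = sigmoid(W[0] @ x + U[0] @ h + b[0])        # update gate
    r = sigmoid(W[1] @ x + U[1] @ h + b[1])        # reset gate
    n = np.tanh(W[2] @ x + U[2] @ (r * h) + b[2])  # candidate state
    return (1 - z) * h + z * n                     # interpolate old/new state

def bi_gru(seq, params_f, params_b):
    """Run a forward and a backward GRU over `seq` (a list of input vectors,
    e.g. BERT token embeddings) and concatenate the two hidden states at
    each position, giving one contextual vector per token."""
    d = params_f[2][0].shape[0]          # hidden size, read from the bias
    hf, hb = np.zeros(d), np.zeros(d)
    fwd, bwd = [], []
    for x in seq:                        # left-to-right pass
        hf = gru_step(x, hf, *params_f)
        fwd.append(hf)
    for x in reversed(seq):              # right-to-left pass
        hb = gru_step(x, hb, *params_b)
        bwd.append(hb)
    bwd.reverse()                        # realign backward states with positions
    return [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]

# Usage with toy dimensions (real BERT embeddings are 768-dimensional):
rng = np.random.default_rng(0)
def make_params(in_dim, d):
    return (rng.standard_normal((3, d, in_dim)) * 0.1,
            rng.standard_normal((3, d, d)) * 0.1,
            np.zeros((3, d)))

seq = [rng.standard_normal(8) for _ in range(5)]   # 5 tokens, dim 8
states = bi_gru(seq, make_params(8, 4), make_params(8, 4))
```

A sentence-level classifier would then pool these per-position states (or take the final states of each direction) and feed them to a softmax layer; the paper's grammar-rule constraints would additionally penalize inconsistent outputs at adjacent positions.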
- © 2020 The Authors. Published by Atlantis Press SARL.
- Open Access
- This is an open access article distributed under the CC BY-NC 4.0 license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - JOUR
AU  - Qiang Lu
AU  - Zhenfang Zhu
AU  - Fuyong Xu
AU  - Dianyuan Zhang
AU  - Wenqing Wu
AU  - Qiangqiang Guo
PY  - 2020
DA  - 2020/05
TI  - Bi-GRU Sentiment Classification for Chinese Based on Grammar Rules and BERT
JO  - International Journal of Computational Intelligence Systems
SP  - 538
EP  - 548
VL  - 13
IS  - 1
SN  - 1875-6883
UR  - https://doi.org/10.2991/ijcis.d.200423.001
DO  - https://doi.org/10.2991/ijcis.d.200423.001
ID  - Lu2020
ER  -