Chinese Short Text Summary Generation Model Integrating Multi-Level Semantic Information
- DOI
- 10.2991/ncce-18.2018.66
- Keywords
- Multi-level; Self-attention mechanism; Selective network; Global information; Seq2Seq; Joint semantic vector; Text abstract
- Abstract
Short texts carry little information, and generating comprehension-based summaries for them is currently a hot and difficult problem. We propose an understanding-based short-text summary generation model that integrates multi-level semantic information. Within the encoder-decoder framework, we improve the encoder structure with a self-attention mechanism and a selective network so that the model focuses on multi-level semantic information. The model fully exploits both the high-level global semantics of the text and the shallow semantics of the words inside it, and organically fuses them with the hidden state of the decoder through two different attention mechanisms. The high-level and shallow semantic information adaptively provide the decoder with a joint semantic vector with abstractive characteristics, so that the decoder can focus more accurately on the core content of the article. We select the LCSTS dataset for model training and testing. The experimental results show that, compared with Seq2Seq, Seq2Seq with standard attention, and the Transformer model, the proposed model generates higher-quality Chinese short-text summaries and achieves better ROUGE scores.
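The abstract describes the architecture only at a high level. Below is a minimal PyTorch sketch of the two components it names: a selective gate that filters word-level encoder states with a sentence-level vector, and the fusion of two attention contexts (shallow word-level and high-level global) into a joint semantic vector for the decoder. All class and parameter names here are hypothetical, and the additive (Bahdanau-style) attention form is an assumption, not necessarily the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelectiveGate(nn.Module):
    """Selective network (sketch): gates each word-level encoder state
    with a sentence-level representation to filter out noise."""
    def __init__(self, hidden):
        super().__init__()
        self.w_h = nn.Linear(hidden, hidden, bias=False)
        self.w_s = nn.Linear(hidden, hidden)

    def forward(self, enc_out, sent_vec):
        # enc_out: (B, T, H) word-level states; sent_vec: (B, H) sentence vector
        gate = torch.sigmoid(self.w_h(enc_out) + self.w_s(sent_vec).unsqueeze(1))
        return enc_out * gate  # filtered second-level representation

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention over a memory (assumed form)."""
    def __init__(self, hidden):
        super().__init__()
        self.query = nn.Linear(hidden, hidden, bias=False)
        self.key = nn.Linear(hidden, hidden, bias=False)
        self.score = nn.Linear(hidden, 1, bias=False)

    def forward(self, dec_state, memory):
        # dec_state: (B, H); memory: (B, T, H)
        e = self.score(torch.tanh(self.query(dec_state).unsqueeze(1)
                                  + self.key(memory)))        # (B, T, 1)
        a = F.softmax(e.squeeze(-1), dim=-1)                  # attention weights
        return torch.bmm(a.unsqueeze(1), memory).squeeze(1)   # context: (B, H)

class JointContext(nn.Module):
    """Fuses shallow (word-level) and high-level (self-attended global)
    contexts with the decoder state into one joint semantic vector."""
    def __init__(self, hidden):
        super().__init__()
        self.attn_shallow = AdditiveAttention(hidden)
        self.attn_global = AdditiveAttention(hidden)
        self.fuse = nn.Linear(3 * hidden, hidden)

    def forward(self, dec_state, shallow_mem, global_mem):
        c_shallow = self.attn_shallow(dec_state, shallow_mem)
        c_global = self.attn_global(dec_state, global_mem)
        joint = torch.cat([dec_state, c_shallow, c_global], dim=-1)
        return torch.tanh(self.fuse(joint))  # joint semantic vector
```

In this reading, `shallow_mem` would hold the gated word-level states and `global_mem` the self-attention outputs, letting the two attention heads adaptively weight shallow and global information at each decoding step.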
- Copyright
- © 2018, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
- Cite this article
Guanqin Chen. Chinese Short Text Summary Generation Model Integrating Multi-Level Semantic Information. In: Proceedings of the 2018 International Conference on Network, Communication, Computer Engineering (NCCE 2018), Atlantis Press, May 2018, pp. 414-425. ISSN 1951-6851. DOI: 10.2991/ncce-18.2018.66.