A Hierarchical Neural Abstractive Summarization with Self-Attention Mechanism
WeiJun Yang, ZhiCheng Tang, XinHuai Tang
Available Online May 2018.
- https://doi.org/10.2991/amcce-18.2018.89
- Neural Abstractive Summarization, Self-Attention Mechanism
- Recently, attentional seq2seq models have made remarkable progress on abstractive summarization. However, most of these models do not consider the relations between source sentences, which are an important feature in extractive methods. In this work, we propose a hierarchical neural model to address this problem. First, we use self-attention to discover the relations between source sentences. Second, we use a copy mechanism to solve the out-of-vocabulary (OOV) problem. Experiments demonstrate that our model achieves state-of-the-art ROUGE scores on the LCSTS dataset.
- Open Access
- This is an open access article distributed under the CC BY-NC license.
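The abstract names two components: self-attention to relate source sentences, and a copy mechanism for OOV words. As a minimal illustrative sketch (not the paper's actual architecture; the function names, dimensions, and mixture formulation below are assumptions), scaled dot-product self-attention over sentence embeddings and a pointer/copy mixture can look like:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(S):
    """Scaled dot-product self-attention over sentence embeddings.

    S: (n_sentences, d) matrix, one row per source sentence.
    Returns (n_sentences, d) context vectors in which each sentence
    becomes a weighted mix of the sentences it is most related to."""
    d = S.shape[-1]
    scores = S @ S.T / np.sqrt(d)        # pairwise sentence affinities
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ S

def copy_distribution(p_gen, p_vocab, attn, src_ids):
    """Pointer/copy mixture for OOV handling: with probability p_gen
    generate from the vocabulary distribution, otherwise copy a source
    token in proportion to its attention weight (hypothetical form)."""
    p = p_gen * p_vocab
    # np.add.at accumulates correctly even if src_ids repeats a token.
    np.add.at(p, src_ids, (1.0 - p_gen) * attn)
    return p

rng = np.random.default_rng(0)
S = rng.normal(size=(4, 8))              # 4 sentences, 8-dim embeddings
C = self_attention(S)

p_vocab = softmax(rng.normal(size=10))   # toy 10-word vocabulary
attn = softmax(rng.normal(size=3))       # attention over 3 source tokens
p_final = copy_distribution(0.7, p_vocab, attn, src_ids=np.array([2, 5, 7]))
```

Because `p_vocab` and `attn` each sum to 1, the mixture `p_final` remains a valid probability distribution regardless of `p_gen`.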
Cite this article
TY  - CONF
AU  - WeiJun Yang
AU  - ZhiCheng Tang
AU  - XinHuai Tang
PY  - 2018/05
DA  - 2018/05
TI  - A Hierarchical Neural Abstractive Summarization with Self-Attention Mechanism
BT  - 2018 3rd International Conference on Automation, Mechanical Control and Computational Engineering (AMCCE 2018)
PB  - Atlantis Press
SP  - 514
EP  - 518
SN  - 2352-5401
UR  - https://doi.org/10.2991/amcce-18.2018.89
DO  - https://doi.org/10.2991/amcce-18.2018.89
ID  - Yang2018/05
ER  -