Proceedings of the 2018 3rd International Conference on Automation, Mechanical Control and Computational Engineering (AMCCE 2018)

A Hierarchical Neural Abstractive Summarization with Self-Attention Mechanism

Authors
WeiJun Yang, ZhiCheng Tang, XinHuai Tang
Corresponding Author
WeiJun Yang
Available Online May 2018.
DOI
https://doi.org/10.2991/amcce-18.2018.89
Keywords
Neural Abstractive Summarization, Self-Attention Mechanism
Abstract
Recently, attentional seq2seq models have made remarkable progress on abstractive summarization. However, most of these models do not consider the relations between the original sentences, which are an important feature in extractive methods. In this work, we propose a hierarchical neural model to address this problem. First, we use self-attention to discover the relations between the original sentences. Second, we use a copy mechanism to solve the out-of-vocabulary (OOV) problem. Experiments demonstrate that our model achieves state-of-the-art ROUGE scores on the LCSTS dataset.
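The abstract does not give formulas, but sentence-level self-attention of the kind it describes is commonly realized as scaled dot-product attention over sentence representations. A minimal NumPy sketch under that assumption (all names and shapes are illustrative, not taken from the paper):

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over sentence representations.

    X: (n, d) matrix, one row per encoded sentence.
    Wq, Wk, Wv: (d, d) projection matrices for queries, keys, values.
    Returns (n, d) context vectors that mix information across sentences,
    so each sentence's representation reflects its relations to the others.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])           # (n, n) pairwise relevance
    scores -= scores.max(axis=1, keepdims=True)      # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)    # row-wise softmax
    return weights @ V
```

Each output row is a convex combination of the value vectors, so sentences that score as related contribute more to one another's summary representation.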
Open Access
This is an open access article distributed under the CC BY-NC license.


Proceedings
2018 3rd International Conference on Automation, Mechanical Control and Computational Engineering (AMCCE 2018)
Part of series
Advances in Engineering Research
Publication Date
May 2018
ISBN
978-94-6252-508-5
ISSN
2352-5401

Cite this article

TY  - CONF
AU  - WeiJun Yang
AU  - ZhiCheng Tang
AU  - XinHuai Tang
PY  - 2018/05
DA  - 2018/05
TI  - A Hierarchical Neural Abstractive Summarization with Self-Attention Mechanism
BT  - 2018 3rd International Conference on Automation, Mechanical Control and Computational Engineering (AMCCE 2018)
PB  - Atlantis Press
SN  - 2352-5401
UR  - https://doi.org/10.2991/amcce-18.2018.89
DO  - https://doi.org/10.2991/amcce-18.2018.89
ID  - Yang2018/05
ER  -