Chinese Short Text Summary Generation Model Combining Global and Local Information
- DOI
- 10.2991/ncce-18.2018.64
- Keywords
- Dual encoder; Attention mechanism; Global information; Local information; Text summary; Seq2Seq
- Abstract
Abstractive summary generation for short texts is currently a hot research topic. In this paper, we improve the attention mechanism under the encoder-decoder framework and propose an abstractive short-text summary generation model that integrates global and local semantic information. The model consists of a dual encoder and a decoder. The dual-encoder structure combines global and local semantic information and fully captures the summary-relevant features of the source text. The improved attention mechanism adaptively combines all the information in the short text to provide the decoder with summary-oriented input, so that the decoder can focus more accurately on the core content of the source text. We train and test the model on the LCSTS dataset. The experimental results show that, compared with the plain Seq2Seq model and Seq2Seq with standard attention, the proposed method produces higher-quality summaries with fewer repeated words and achieves better ROUGE scores.
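The combination step described in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's actual implementation: it assumes dot-product attention, toy random encoder states, and a hypothetical sigmoid gate for adaptively mixing the global and local context vectors (the paper's exact scoring and combination rules may differ).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (not taken from the paper).
seq_len, d = 6, 8                          # source length, hidden size
h_global = rng.normal(size=(seq_len, d))   # states from the global encoder (whole text)
h_local = rng.normal(size=(seq_len, d))    # states from the local encoder (e.g. windows)
s_t = rng.normal(size=d)                   # current decoder hidden state

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(states, query):
    """Dot-product attention: a weight per encoder state, then a weighted sum."""
    scores = states @ query          # (seq_len,)
    alpha = softmax(scores)          # attention distribution over source positions
    return alpha @ states, alpha     # context vector (d,), weights (seq_len,)

# Attend separately over each encoder's states.
c_global, a_global = attend(h_global, s_t)
c_local, a_local = attend(h_local, s_t)

# Hypothetical adaptive gate: a sigmoid of the decoder state decides how much
# global vs. local context feeds the decoder at this step.
gate = 1.0 / (1.0 + np.exp(-(s_t @ rng.normal(size=d))))
context = gate * c_global + (1.0 - gate) * c_local   # input to the decoder
```

The gate lets the model lean on the global view when choosing overall topic words and on the local view when copying specific phrases, which is one plausible reading of "adaptively combine all information of the short text."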
- Copyright
- © 2018, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Guanqin Chen
PY  - 2018/05
DA  - 2018/05
TI  - Chinese Short Text Summary Generation Model Combining Global and Local Information
BT  - Proceedings of the 2018 International Conference on Network, Communication, Computer Engineering (NCCE 2018)
PB  - Atlantis Press
SP  - 396
EP  - 407
SN  - 1951-6851
UR  - https://doi.org/10.2991/ncce-18.2018.64
DO  - 10.2991/ncce-18.2018.64
ID  - Chen2018/05
ER  -