Abstractive summarization of long texts by representing multiple compositionalities with temporal hierarchical pointer generator network.

Journal: Neural networks : the official journal of the International Neural Network Society
Published Date:

Abstract

Abstractive summarization of long, multi-sentence texts requires an efficient model that can learn and represent multiple levels of composition. In this paper, we introduce a temporal hierarchical pointer generator network that represents multiple compositionalities in order to handle long text sequences with a deep structure. We demonstrate how a multilayer gated recurrent neural network organizes itself, with the help of adaptive timescales, to represent these compositions. The temporal hierarchical network is implemented as a multiple timescale architecture in which the timescale of each layer is learned during training through error backpropagation through time. We evaluate the proposed model on an Introduction-Abstract summarization dataset built from scientific articles and on the CNN/Daily Mail summarization benchmark dataset. The results show that the multiple timescale with adaptation concept yields an effective summary generation system for long texts, and that our proposed model also improves summary generation on the benchmark dataset.
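The abstract describes layers whose hidden states evolve on different, learned timescales. As a rough illustration of the idea (not the authors' implementation), the sketch below shows a GRU-style cell whose state is leaky-integrated with a timescale parameter `tau`: a large `tau` makes the layer update slowly, capturing longer-range composition, while `tau = 1` recovers a standard GRU update. The class name, weight layout, and the treatment of `tau` as a plain stored parameter are all assumptions for illustration; in the paper the timescales are learned jointly with the other weights via backpropagation through time.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MultipleTimescaleGRUCell:
    """Hypothetical sketch of a GRU cell with a per-layer timescale tau.

    The hidden state is leaky-integrated toward the GRU output:
    tau = 1 gives a plain GRU; larger tau slows the layer down so it
    can track slower (higher-level) structure in the input sequence.
    """

    def __init__(self, input_size, hidden_size, tau=2.0, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # one input and one recurrent matrix per gate:
        # update (z), reset (r), candidate (c)
        self.W = {g: rng.uniform(-s, s, (hidden_size, input_size)) for g in "zrc"}
        self.U = {g: rng.uniform(-s, s, (hidden_size, hidden_size)) for g in "zrc"}
        self.tau = tau  # timescale; learned via BPTT in the paper

    def step(self, x, h_prev):
        z = sigmoid(self.W["z"] @ x + self.U["z"] @ h_prev)
        r = sigmoid(self.W["r"] @ x + self.U["r"] @ h_prev)
        c = np.tanh(self.W["c"] @ x + self.U["c"] @ (r * h_prev))
        gru_out = (1 - z) * h_prev + z * c  # standard GRU update
        # leaky integration with timescale tau (tau = 1 -> plain GRU)
        return (1 - 1 / self.tau) * h_prev + (1 / self.tau) * gru_out
```

In a hierarchy, lower layers would use small timescales (word-level dynamics) and higher layers larger ones (sentence- or discourse-level dynamics), with each layer's `tau` adapted during training.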

Authors

  • Dennis Singh Moirangthem
    School of Electronics Engineering, IT1, Kyungpook National University, 80 Daehakro, Bukgu, Daegu - 41566, South Korea. Electronic address: mdennissingh@gmail.com.
  • Minho Lee
    School of Electronics Engineering, Kyungpook National University, 1370 Sankyuk-Dong, Puk-Gu, Taegu 702-701, Republic of Korea. Electronic address: mholee@gmail.com.