Mutual-Attention Net: A Deep Attentional Neural Network for Keyphrase Generation.

Journal: Computational Intelligence and Neuroscience
Published Date:

Abstract

Neural keyphrase generation (NKG) is a recently proposed approach for automatically producing keyphrases for a document. Unlike traditional keyphrase extraction, NKG can generate keyphrases that do not appear in the document. However, as a supervised method, NKG is hindered by noise in the source document. To address the fact that existing NKG models do not consider denoising the source document, this paper introduces a new denoising architecture, the mutual-attention network (MA-net). Motivated by the structure of documents in popular datasets, multihead attention is applied to capture the relevance between the title and the abstract, which aids denoising. To further improve the generation of high-quality keyphrases, we use multihead attention instead of Bahdanau attention to compute the context vector. Finally, we employ a hybrid network that augments the proposed architecture to address the out-of-vocabulary (OOV) problem: it can not only generate words from the decoder vocabulary but also copy words from the source document. Evaluation on five benchmark datasets shows that our model significantly outperforms current state-of-the-art models.
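As a rough illustration of the two attention components described in the abstract, the following PyTorch sketch shows (a) multihead attention relating the title and abstract encodings for denoising and (b) multihead attention producing the decoder's context vector, combined with a pointer-style generate/copy mixture for OOV words. This is not the authors' released code; all module names, shapes, and the exact wiring are assumptions made for illustration.

```python
# Hypothetical sketch of the ideas summarized in the abstract: mutual multihead
# attention between title and abstract encodings, multihead attention for the
# context vector, and a generate/copy mixture for OOV handling.
# Names and wiring are illustrative, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MutualAttentionSketch(nn.Module):
    def __init__(self, d_model=256, n_heads=4, vocab_size=50000):
        super().__init__()
        # Abstract tokens attend over title tokens: one plausible way to
        # realise "mutual" attention between title and abstract for denoising.
        self.title_to_abstract = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Decoder states attend over the (denoised) source representation,
        # replacing Bahdanau additive attention with multihead attention.
        self.dec_to_source = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.generator = nn.Linear(d_model, vocab_size)
        # Scalar gate (p_gen) that mixes generating from the vocabulary
        # with copying from the source, in the pointer-network style.
        self.copy_gate = nn.Linear(2 * d_model, 1)

    def forward(self, title_enc, abstract_enc, dec_state, src_token_ids, extended_vocab_size):
        # title_enc:     (batch, title_len, d_model)
        # abstract_enc:  (batch, src_len, d_model)
        # dec_state:     (batch, tgt_len, d_model)
        # src_token_ids: (batch, src_len) ids in the extended vocabulary

        # 1) Mutual attention: refine the abstract encoding with title guidance.
        denoised, _ = self.title_to_abstract(abstract_enc, title_enc, title_enc)
        source = abstract_enc + denoised

        # 2) Multihead attention computes the context vector for each decoder step.
        context, attn = self.dec_to_source(dec_state, source, source)  # attn: (batch, tgt_len, src_len)

        # 3) Hybrid generate/copy distribution over the extended vocabulary.
        p_gen = torch.sigmoid(self.copy_gate(torch.cat([dec_state, context], dim=-1)))
        gen_dist = F.softmax(self.generator(context), dim=-1)

        batch, tgt_len, _ = dec_state.shape
        extended = torch.zeros(batch, tgt_len, extended_vocab_size, device=dec_state.device)
        extended[..., : gen_dist.size(-1)] = p_gen * gen_dist
        # Scatter the remaining probability mass onto the source token ids,
        # so OOV words present in the document can still be produced by copying.
        copy_ids = src_token_ids.unsqueeze(1).expand(-1, tgt_len, -1)
        extended.scatter_add_(-1, copy_ids, (1.0 - p_gen) * attn)
        return extended
```

The gate p_gen follows the usual copy-mechanism convention: values near 1 favor generating from the decoder vocabulary, values near 0 favor copying source tokens, which is how the hybrid network can emit out-of-vocabulary words that appear in the document.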

Authors

  • Wenying Duan
    School of Mathematics and Computer Sciences, Nanchang University, Nanchang, Jiangxi 330031, China.
  • Hong Rao
    School of Software, Nanchang University, Nanchang, Jiangxi 330031, China.
  • Longzhen Duan
    School of Mathematics and Computer Sciences, Nanchang University, Nanchang, Jiangxi 330031, China.
  • Ning Wang
    Qilu Hospital of Shandong University Dezhou Hospital, Dezhou, Shandong, China.