Distant supervision for neural relation extraction integrated with word attention and property features.

Journal: Neural networks : the official journal of the International Neural Network Society

Abstract

Distant supervision for neural relation extraction is an efficient approach to extracting large numbers of relations from plain text. However, existing neural methods fail to capture the critical words during sentence encoding and, for some positive training instances, lack useful sentence information. To address these issues, we propose a novel neural relation extraction model. First, we develop a word-level attention mechanism that distinguishes the importance of each individual word in a sentence, increasing the attention weights of the critical words. Second, we exploit the semantic information in the word embeddings of the target entities as a supplementary feature for the extractor. Experimental results show that our model outperforms previous state-of-the-art baselines.
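The word-level attention described above can be illustrated with a minimal sketch: each word's embedding is scored against a query vector, the scores are normalized with a softmax, and the sentence representation is the resulting weighted sum. The embedding dimensions and the query vector `q` here are illustrative assumptions, not the paper's actual parameterization.

```python
import numpy as np

def word_attention(word_vecs, q):
    """Weight each word vector by a softmax attention score and
    return the attended sentence representation.

    word_vecs: (num_words, dim) array of word embeddings
    q:         (dim,) query vector (learned in a real model;
               here just a stand-in for illustration)
    """
    scores = word_vecs @ q                    # one scalar score per word
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()                  # weights now sum to 1
    return weights @ word_vecs                # weighted sum over words

rng = np.random.default_rng(0)
sent = rng.normal(size=(6, 4))  # toy sentence: 6 words, 4-dim embeddings
q = rng.normal(size=4)          # hypothetical query vector
vec = word_attention(sent, q)
print(vec.shape)  # (4,)
```

Words whose embeddings align with the query receive larger weights, so the sentence vector is dominated by the critical words rather than averaging all words equally.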

Authors

  • Jianfeng Qu
    College of Computer Science and Technology, Jilin University, Changchun 130012, China; Key Laboratory of Symbolic Computation and Knowledge Engineering (Jilin University), Ministry of Education, Changchun 130012, China.
  • Dantong Ouyang
    College of Computer Science and Technology, Jilin University, Changchun 130012, China; Key Laboratory of Symbolic Computation and Knowledge Engineering (Jilin University), Ministry of Education, Changchun 130012, China.
  • Wen Hua
    School of Information Technology and Electrical Engineering, The University of Queensland, Australia.
  • Yuxin Ye
    College of Computer Science and Technology, Jilin University, Changchun 130012, China; Key Laboratory of Symbolic Computation and Knowledge Engineering (Jilin University), Ministry of Education, Changchun 130012, China. Electronic address: yeyx@jlu.edu.cn.
  • Ximing Li
    Tianjin Cardiovascular Institute, Tianjin Chest Hospital, Tianjin, China.