Distant supervision for relation extraction with hierarchical selective attention.

Journal: Neural Networks: The Official Journal of the International Neural Network Society

Abstract

Distantly supervised relation extraction is an important task in natural language processing. Most state-of-the-art methods share a notable shortcoming: they take all sentences of an entity pair as input, which incurs a large computational cost, even though a few of the most relevant sentences are enough to recognize the relation of an entity pair. To tackle this problem, we propose a novel hierarchical selective attention network for relation extraction under distant supervision. Our model first selects the most relevant sentences by applying coarse sentence-level attention to all sentences of an entity pair, then employs word-level attention to construct sentence representations and fine sentence-level attention to aggregate these representations. Experimental results on a widely used dataset demonstrate that our method performs significantly better than most existing methods.
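The pipeline in the abstract — word-level attention to build sentence vectors, coarse sentence-level attention to keep only the most relevant sentences, and fine sentence-level attention to aggregate the survivors — can be sketched in dependency-free Python. This is a minimal illustration, not the paper's model: the dot-product scoring, the relation query vector `query`, and the top-`k` selection size are assumptions standing in for the learned attention parameters the paper would train.

```python
import math

def softmax(scores):
    # numerically stable softmax over a list of scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attend(vectors, query):
    # score each vector against the query, normalize, return the
    # attention-weighted sum and the weights themselves
    weights = softmax([dot(v, query) for v in vectors])
    dim = len(vectors[0])
    pooled = [sum(w * v[i] for w, v in zip(weights, vectors))
              for i in range(dim)]
    return pooled, weights

def hierarchical_selective_attention(bag, query, k):
    """bag: list of sentences, each a list of word vectors.
    query: a relation query vector (assumed; learned in the paper).
    k: number of sentences kept by the coarse selection step."""
    # 1) word-level attention: one vector per sentence
    sent_vecs = [attend(words, query)[0] for words in bag]
    # 2) coarse sentence-level attention: keep the top-k sentences
    coarse = softmax([dot(v, query) for v in sent_vecs])
    top = sorted(range(len(sent_vecs)),
                 key=lambda i: coarse[i], reverse=True)[:k]
    # 3) fine sentence-level attention over the selected sentences
    bag_vec, _ = attend([sent_vecs[i] for i in top], query)
    return bag_vec
```

Selecting `k` sentences before the fine attention step is what caps the computational cost: the expensive aggregation runs over at most `k` sentence vectors rather than the whole bag.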

Authors

  • Peng Zhou
    School of International Studies, Zhejiang University, Hangzhou, China.
  • Jiaming Xu
    Institute of Automation, Chinese Academy of Sciences (CAS), Beijing, PR China.
  • Zhenyu Qi
    Institute of Automation, Chinese Academy of Sciences (CAS), China. Electronic address: zhenyu.qi@ia.ac.cn.
  • Hongyun Bao
    Institute of Automation, Chinese Academy of Sciences (CAS), China.
  • Zhineng Chen
    Institute of Automation, Chinese Academy of Sciences (CAS), China.
  • Bo Xu
    State Key Laboratory of Cardiovascular Disease, Fuwai Hospital, National Center for Cardiovascular Diseases, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing 100037, China.