An Attention-Aware Long Short-Term Memory-Like Spiking Neural Model for Sentiment Analysis.

Journal: International Journal of Neural Systems
Published Date:

Abstract

The LSTM-SNP model is a recently developed long short-term memory (LSTM)-like network inspired by the mechanisms of spiking neural P (SNP) systems. In this paper, LSTM-SNP is used to build a novel model for aspect-level sentiment analysis, termed the ALS model. The LSTM-SNP model has three gates: a reset gate, a consumption gate, and a generation gate. Moreover, an attention mechanism is integrated with the LSTM-SNP model so that the ALS model can better capture the sentiment features in the text and compute the correlation between context and aspect words. To validate the effectiveness of the ALS model for aspect-level sentiment analysis, comparison experiments with 17 baseline models are conducted on three real-life data sets. The experimental results demonstrate that the ALS model has a simpler structure and achieves better performance than these baseline models.
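The abstract names the three gates and the attention mechanism but gives no equations. The Python sketch below is purely illustrative: the class LSTMSNPCellSketch, the specific gating formulas, and the aspect_attention function are assumptions made for the sake of a concrete example, not the authors' actual update rules.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class LSTMSNPCellSketch:
        """Hypothetical LSTM-SNP-like cell with reset, consumption and
        generation gates. Standard sigmoid/tanh gating is assumed here,
        since the abstract does not specify the update rules."""

        def __init__(self, input_dim, hidden_dim, rng=None):
            rng = np.random.default_rng(0) if rng is None else rng
            def w():
                return rng.normal(scale=0.1, size=(hidden_dim, input_dim + hidden_dim))
            # One weight matrix per gate: reset, consumption, generation.
            self.W_r, self.W_c, self.W_g = w(), w(), w()
            self.b_r = np.zeros(hidden_dim)
            self.b_c = np.zeros(hidden_dim)
            self.b_g = np.zeros(hidden_dim)

        def step(self, x_t, u_prev):
            z = np.concatenate([x_t, u_prev])
            r = sigmoid(self.W_r @ z + self.b_r)   # reset gate
            c = sigmoid(self.W_c @ z + self.b_c)   # consumption gate
            g = np.tanh(self.W_g @ z + self.b_g)   # generation gate (candidate state)
            # Assumed combination: reset and consume the old state, add generated content.
            u_t = (1.0 - c) * r * u_prev + c * g
            return u_t

    def aspect_attention(hidden_states, aspect_vec):
        """Dot-product attention between context hidden states (T x H) and an
        aspect embedding (H,) -- one plausible form of aspect-level attention."""
        scores = hidden_states @ aspect_vec
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        return weights @ hidden_states   # attended sentence representation

    if __name__ == "__main__":
        cell = LSTMSNPCellSketch(input_dim=4, hidden_dim=3)
        xs = np.random.default_rng(1).normal(size=(5, 4))   # 5 toy context tokens
        u, states = np.zeros(3), []
        for x in xs:
            u = cell.step(x, u)
            states.append(u)
        aspect = 0.1 * np.ones(3)                            # toy aspect embedding
        print(aspect_attention(np.stack(states), aspect))    # attended vector, shape (3,)

The usage block at the end simply runs the cell over a short toy sequence and pools the hidden states with the assumed aspect attention; in the paper, the attended representation would feed a sentiment classifier over the three real-life data sets.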

Authors

  • Qian Liu
    State Key Laboratory of Environmental Chemistry and Ecotoxicology, Research Center for Eco-Environmental Sciences, Chinese Academy of Sciences, Beijing 100085, China.
  • Yanping Huang
    School of Computer and Software Engineering, Xihua University, Chengdu, 610039, China.
  • Qian Yang
    Center for Advanced Scientific Instrumentation, University of Wyoming, Laramie, WY, United States.
  • Hong Peng
Center for Radio Administration and Technology Development, School of Computer and Software Engineering, Xihua University, Chengdu 610039, P. R. China.
  • Jun Wang
    Department of Speech, Language, and Hearing Sciences and the Department of Neurology, The University of Texas at Austin, Austin, TX 78712, USA.