Learning interaction dynamics with an interactive LSTM for conversational sentiment analysis.

Journal: Neural networks : the official journal of the International Neural Network Society
PMID:

Abstract

Conversational sentiment analysis is an emerging yet challenging subtask of sentiment analysis. It aims to discover the affective state and sentiment changes of each speaker in a conversation based on their opinions. Conversations carry a wealth of interaction information that affects speaker sentiment. However, existing sentiment analysis approaches are insufficient for this subtask for two primary reasons: the lack of benchmark conversational sentiment datasets and the inability to model interactions between individuals. To address these issues, in this paper, we first present a new, publicly available conversational dataset, named ScenarioSA, to support the development of conversational sentiment analysis models. Then, we investigate how interaction dynamics are associated with conversations and study the multidimensional nature of interactions, namely understandability, credibility and influence. Finally, we propose an interactive long short-term memory (LSTM) network for conversational sentiment analysis that models interactions between speakers in a conversation by (1) adding a confidence gate before each LSTM hidden unit to estimate the credibility of the previous speakers and (2) combining the output gate with the learned influence scores to incorporate the influences of the previous speakers. Extensive experiments are conducted on ScenarioSA and IEMOCAP, and the results show that our model outperforms a wide range of strong baselines and achieves competitive results with state-of-the-art approaches.
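
As a rough illustration of the two modifications described in the abstract, below is a minimal sketch, assuming a PyTorch-style recurrent cell. The class, gate, and parameter names (e.g., `InteractiveLSTMCell`, `confidence`, `influence`) are hypothetical and not taken from the paper; the exact gating equations used by the authors may differ.

```python
import torch
import torch.nn as nn

class InteractiveLSTMCell(nn.Module):
    """Illustrative LSTM cell with (1) a confidence gate that weighs the
    previous speaker's hidden state by an estimated credibility and
    (2) an output gate modulated by a learned influence score.
    This wiring is an assumption for exposition, not the paper's code."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # Standard LSTM gates: input, forget, cell candidate, output
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        # Confidence gate over the other speaker's hidden state
        self.confidence = nn.Linear(input_size + hidden_size, hidden_size)
        # Scalar influence score applied to the output gate
        self.influence = nn.Linear(input_size + hidden_size, 1)

    def forward(self, x, own_state, other_hidden):
        h_prev, c_prev = own_state
        pair = torch.cat([x, other_hidden], dim=-1)
        # (1) Credibility-weighted injection of the previous speaker's state
        conf = torch.sigmoid(self.confidence(pair))
        h_inter = h_prev + conf * other_hidden
        # Standard LSTM update on the interaction-aware hidden state
        z = self.gates(torch.cat([x, h_inter], dim=-1))
        i, f, g, o = z.chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)
        c = f * c_prev + i * g
        # (2) Modulate the output gate with the learned influence score
        infl = torch.sigmoid(self.influence(pair))
        h = (o * infl) * torch.tanh(c)
        return h, (h, c)
```

In use, one would keep a separate `(h, c)` state per speaker, step the cell on each utterance in turn order, and pass the other speaker's most recent hidden state as `other_hidden`, so that credibility and influence of the previous speakers are re-estimated at every turn.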

Authors

  • Yazhou Zhang
    Software Engineering College, Zhengzhou University of Light Industry, No.136 Science Avenue, Zhengzhou, PR China. Electronic address: yzzhang@zzuli.edu.cn.
  • Prayag Tiwari
    Department of Information Engineering, University of Padova, Italy. Electronic address: prayag.tiwari@dei.unipd.it.
  • Dawei Song
    School of Computer Science and Technology, Beijing Institute of Technology, 5 South Zhongguancun Street, Beijing, PR China. Electronic address: dawei.song2010@gmail.com.
  • Xiaoliu Mao
    Hardware Technology Research Department, Beijing Research Center, Huawei Technologies Co, Ltd, Beijing, PR China. Electronic address: 1017883038@qq.com.
  • Panpan Wang
    College of Intelligence and Computing, Tianjin University, No.135 Yaguan Road, Tianjin, PR China. Electronic address: panpan_tju@tju.edu.cn.
  • Xiang Li
    Department of Radiology, Massachusetts General Hospital and Harvard Medical School, Boston, MA, United States.
  • Hari Mohan Pandey
    Department of Computer Science, Edge Hill University, Ormskirk, Lancashire, UK. Electronic address: pandeyh@edgehill.ac.uk.