Long-term causal effects estimation via latent surrogates representation learning.

Journal: Neural networks : the official journal of the International Neural Network Society
PMID:

Abstract

Estimating long-term causal effects from short-term surrogates is an important but challenging problem in many real-world applications such as marketing and medicine. Most existing methods estimate causal effects in an idealized and simplistic setting, disregarding unobserved surrogates and treating all short-term outcomes as surrogates. However, such methods are not well suited to real-world scenarios, where the partially observed surrogates are mixed among the short-term outcomes with proxies of unobserved surrogates. To address this issue, we develop a flexible method, LASER, to estimate long-term causal effects in the more realistic situation where surrogates are either observed or have observed proxies. In LASER, we employ an identifiable variational autoencoder to learn a latent surrogate representation from all surrogate candidates, without the need to distinguish observed surrogates from proxies of unobserved surrogates. With the learned representation, we further devise a theoretically guaranteed, unbiased estimator of long-term causal effects. Extensive experimental results on real-world and semi-synthetic datasets demonstrate the effectiveness of the proposed method.
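To make the described architecture concrete, below is a minimal sketch (not the authors' released code) of an identifiable-VAE-style model that maps the short-term outcome candidates S (observed surrogates mixed with proxies of unobserved surrogates) to a latent surrogate representation Z, using treatment/covariate information U as the auxiliary variable that the iVAE framework requires for identifiability. All names, layer sizes, and the Gaussian likelihood choice (s_dim, z_dim, u_dim, hidden) are illustrative assumptions, not the paper's specification.

```python
import torch
import torch.nn as nn


class LatentSurrogateVAE(nn.Module):
    """Sketch of an iVAE-style model for latent surrogate representation learning."""

    def __init__(self, s_dim: int, z_dim: int, u_dim: int, hidden: int = 64):
        super().__init__()
        # Encoder q(z | s, u): infers latent surrogates from all candidates.
        self.encoder = nn.Sequential(
            nn.Linear(s_dim + u_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * z_dim),  # outputs mean and log-variance
        )
        # Decoder p(s | z): reconstructs the short-term outcome candidates.
        self.decoder = nn.Sequential(
            nn.Linear(z_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, s_dim),
        )
        # Conditional prior p(z | u): the auxiliary-variable component that
        # distinguishes an identifiable VAE from a vanilla VAE.
        self.prior = nn.Sequential(
            nn.Linear(u_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * z_dim),
        )

    def forward(self, s: torch.Tensor, u: torch.Tensor):
        # Posterior parameters and reparameterized sample of the latent surrogates.
        mu_q, logvar_q = self.encoder(torch.cat([s, u], dim=-1)).chunk(2, dim=-1)
        z = mu_q + torch.randn_like(mu_q) * torch.exp(0.5 * logvar_q)
        s_hat = self.decoder(z)
        mu_p, logvar_p = self.prior(u).chunk(2, dim=-1)
        # Negative ELBO: Gaussian reconstruction error plus KL(q(z|s,u) || p(z|u)).
        recon = ((s - s_hat) ** 2).sum(-1).mean()
        kl = 0.5 * (
            logvar_p - logvar_q
            + (torch.exp(logvar_q) + (mu_q - mu_p) ** 2) / torch.exp(logvar_p)
            - 1.0
        ).sum(-1).mean()
        return recon + kl, z
```

In a full pipeline along the lines the abstract describes, the learned representation z would then feed a downstream estimator that regresses the long-term outcome on z and the treatment; that second stage is omitted here since its exact form is specified in the paper, not in this sketch.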

Authors

  • Ruichu Cai
    Faculty of Computer Science, Guangdong University of Technology, Guangzhou, People's Republic of China. Electronic address: cairuichu@gmail.com.
  • Weilin Chen
    Department of Micro/Nano Electronics, School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai, 200240, China.
  • Zeqin Yang
    School of Computer Science, Guangdong University of Technology, Guangzhou, China.
  • Shu Wan
    Brain Center, Zhejiang Hospital, Hangzhou, China.
  • Chen Zheng
  • Xiaoqing Yang
    Didi Chuxing, Beijing, China.
  • Jiecheng Guo
    Didi Chuxing, Beijing, China.