Representation learning via Dual-Autoencoder for recommendation.

Journal: Neural Networks: the official journal of the International Neural Network Society

Abstract

Recommendation has attracted a vast amount of attention and research in recent decades. Most previous works employ matrix factorization techniques to learn the latent factors of users and items, and many subsequent works incorporate external information, e.g., users' social relationships and items' attributes, to improve recommendation performance under the matrix factorization framework. However, matrix factorization methods may not make full use of the limited information in rating or check-in matrices, and may therefore achieve unsatisfactory results. Recently, deep learning has proven able to learn good representations in natural language processing, image classification, and other domains. Along this line, we propose a new representation learning framework called Recommendation via Dual-Autoencoder (ReDa). In this framework, we simultaneously learn new hidden representations of users and items using autoencoders, while minimizing the deviations of the training data from the predictions made by the learned representations. Based on this framework, we develop a gradient descent method to learn the hidden representations. Extensive experiments conducted on several real-world data sets demonstrate the effectiveness of our proposed method compared with state-of-the-art matrix factorization based methods.
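
To make the objective described above concrete, the following is a minimal sketch of the dual-autoencoder idea in PyTorch. It is an illustration under stated assumptions, not the authors' exact model: the squared reconstruction losses, the masked inner-product fitting term, the sigmoid activations, and all sizes, seeds, and learning rates are placeholders chosen for the sketch.

    # Sketch of a dual-autoencoder for recommendation (assumes PyTorch).
    # All dimensions and the exact loss form are illustrative assumptions.
    import torch

    torch.manual_seed(0)
    n_users, n_items, k = 50, 40, 8
    R = torch.rand(n_users, n_items)            # toy rating matrix
    mask = (torch.rand_like(R) > 0.5).float()   # observed entries only

    # One autoencoder per side: users encode their rating rows,
    # items encode their rating columns.
    enc_u = torch.nn.Linear(n_items, k); dec_u = torch.nn.Linear(k, n_items)
    enc_v = torch.nn.Linear(n_users, k); dec_v = torch.nn.Linear(k, n_users)
    params = [*enc_u.parameters(), *dec_u.parameters(),
              *enc_v.parameters(), *dec_v.parameters()]
    opt = torch.optim.SGD(params, lr=0.05)      # plain gradient descent

    for step in range(500):
        U = torch.sigmoid(enc_u(R))             # user hidden representations
        V = torch.sigmoid(enc_v(R.t()))         # item hidden representations
        # Autoencoder reconstruction losses for both sides.
        loss_ae = ((dec_u(U) - R) ** 2).mean() + ((dec_v(V) - R.t()) ** 2).mean()
        # Deviation of observed ratings from the inner product of the codes.
        loss_fit = (mask * (U @ V.t() - R) ** 2).sum() / mask.sum()
        loss = loss_ae + loss_fit
        opt.zero_grad(); loss.backward(); opt.step()

    print(f"final loss: {loss.item():.4f}")

Coupling the two reconstruction losses with the rating-fit term is what distinguishes this setup from training two independent autoencoders: the hidden codes U and V appear in both objectives, so gradient descent shapes them jointly, matching the abstract's description of simultaneously learning representations while minimizing deviations on the training data.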

Authors

  • Fuzhen Zhuang
    Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, China. Electronic address: zhuangfz@ics.ict.ac.cn.
  • Zhiqiang Zhang
    Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou 215163, China.
  • Mingda Qian
    Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, China. Electronic address: qianmd@ics.ict.ac.cn.
  • Chuan Shi
    Beijing University of Posts and Telecommunications, Beijing, China. Electronic address: shichuan@bupt.edu.cn.
  • Xing Xie
    Microsoft Research, China. Electronic address: xing.xie@microsoft.com.
  • Qing He