Differentially private knowledge transfer for federated learning.

Journal: Nature Communications
Published Date:

Abstract

Extracting useful knowledge from big data is important for machine learning. When data is privacy-sensitive and cannot be directly collected, federated learning is a promising option that extracts knowledge from decentralized data by learning and exchanging model parameters rather than raw data. However, model parameters may encode not only non-private knowledge but also private information about the local data, so transferring knowledge via model parameters is not privacy-secure. Here, we present a knowledge transfer method named PrivateKT, which uses actively selected small public data to transfer high-quality knowledge in federated learning with privacy guarantees. We verify PrivateKT on three different datasets, and the results show that PrivateKT can reduce the performance gap between centralized learning and existing federated learning methods by up to 84% under strict differential privacy restrictions. PrivateKT provides a potential direction toward effective and privacy-preserving knowledge transfer in machine intelligent systems.
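As a rough illustration of the idea described in the abstract (not the paper's exact algorithm), the sketch below shows one common way to transfer knowledge through a small shared public dataset under local differential privacy: each client predicts labels for the public samples, privatizes those predictions with randomized response, and the server aggregates the noisy labels into soft targets that could supervise a global model via distillation. All function names, the toy random predictors, and the choice of randomized response as the mechanism are assumptions for illustration; PrivateKT's actual active selection and training steps are described in the paper itself.

```python
import numpy as np

def randomized_response(label, num_classes, epsilon, rng):
    """Privatize one predicted class label with epsilon-local differential privacy:
    keep the true label with probability p, otherwise report a random other class."""
    p = np.exp(epsilon) / (np.exp(epsilon) + num_classes - 1)
    if rng.random() < p:
        return label
    others = [c for c in range(num_classes) if c != label]
    return int(rng.choice(others))

def client_knowledge(predict_fn, public_x, num_classes, epsilon, rng):
    """A client predicts pseudo-labels on the shared public samples and privatizes them."""
    preds = predict_fn(public_x)
    return [randomized_response(int(y), num_classes, epsilon, rng) for y in preds]

def aggregate(private_label_sets, num_classes):
    """Server turns the clients' noisy labels into per-sample class-frequency vectors,
    usable as soft distillation targets for the global model."""
    counts = np.zeros((len(private_label_sets[0]), num_classes))
    for labels in private_label_sets:
        for i, y in enumerate(labels):
            counts[i, y] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Toy usage: three clients whose "models" are random predictors over 4 classes.
rng = np.random.default_rng(0)
public_x = np.zeros((10, 5))                      # 10 public samples, 5 features
fake_predict = lambda x: rng.integers(0, 4, size=len(x))
noisy = [client_knowledge(fake_predict, public_x, 4, epsilon=1.0, rng=rng)
         for _ in range(3)]
soft_targets = aggregate(noisy, num_classes=4)
print(soft_targets.shape)                         # (10, 4) distillation targets
```

Because only privatized predictions on public data leave each client, the server never sees raw local data or full model parameters, which is the privacy motivation stated in the abstract.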

Authors

  • Tao Qi
    Department of Laboratory Medicine, Nanfang Hospital, Southern Medical University, Guangdong, 510515, China.
  • Fangzhao Wu
    Microsoft Research Asia, Beijing, 100080, China. fangzwu@microsoft.com.
  • Chuhan Wu
    Department of Electronic Engineering, Tsinghua University, Beijing, 100084, China.
  • Liang He
    Cancer Biology Research Center (Key Laboratory of the Ministry of Education), Tongji Medical College, Tongji Hospital, Huazhong University of Science and Technology, Wuhan, China.
  • Yongfeng Huang
    Department of Electronic Engineering, Tsinghua University, Beijing, 100084, China.
  • Xing Xie
    Microsoft Research, China. xing.xie@microsoft.com.