Adaptive Dropout Method Based on Biological Principles

Journal: IEEE Transactions on Neural Networks and Learning Systems

Abstract

Dropout is one of the most widely used methods for preventing overfitting in neural networks. However, it deactivates neurons rigidly and at random according to a fixed probability, which is inconsistent with the activation patterns of neurons in the human cerebral cortex. Inspired by gene theory and the activation mechanism of brain neurons, we propose a more intelligent adaptive dropout, in which a variational autoencoder (VAE) is overlaid on an existing neural network to regularize its hidden neurons by adaptively setting their activities to zero. Through alternating iterative training, the dropout probability of each hidden neuron can be learned from the weights, thus avoiding the shortcomings of the standard dropout method. Experimental results on multiple datasets show that this method suppresses overfitting in various neural networks more effectively than standard dropout does. Additionally, this adaptive dropout technique can reduce the number of neurons and improve training efficiency.
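To make the core idea concrete, the sketch below contrasts standard dropout's single fixed rate with a per-neuron dropout probability. This is a minimal NumPy illustration under assumptions of ours, not the paper's method: the paper learns the probabilities via an auxiliary VAE and alternating training, whereas here `logits` is simply a free parameter vector and `sigmoid(logits)` stands in for the learned per-neuron drop probability.

```python
import numpy as np

def adaptive_dropout(h, logits, rng, train=True):
    """Drop each hidden unit with its own per-neuron probability.

    h      : (batch, n_hidden) hidden activations
    logits : (n_hidden,) free parameters; sigmoid(logits) plays the
             role of the learned drop probability (an illustrative
             stand-in for the VAE-learned probabilities in the paper)
    """
    p_drop = 1.0 / (1.0 + np.exp(-logits))        # per-neuron drop prob
    if not train:
        # At inference, scale by the keep probability (expected value),
        # as in classical (non-inverted) dropout.
        return h * (1.0 - p_drop)
    mask = rng.random(h.shape) >= p_drop          # keep where rand >= p_drop
    # Inverted-dropout rescaling keeps the expected activation unchanged.
    return h * mask / np.maximum(1.0 - p_drop, 1e-8)

rng = np.random.default_rng(0)
h = rng.standard_normal((4, 6))
logits = np.zeros(6)                  # p_drop = 0.5 for every neuron
out = adaptive_dropout(h, logits, rng)
```

Neurons whose logits are driven strongly positive are dropped almost always, which is how a learned per-neuron rate can effectively prune units, consistent with the abstract's claim that adaptive dropout reduces the number of neurons.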

Authors

  • Hailiang Li
    Department of Minimally Invasive Intervention, Henan Cancer Hospital, The Affiliated Cancer Hospital of Zhengzhou University, Zhengzhou, 450008, China.
  • Jian Weng
  • Yijun Mao
    College of Mathematics and Informatics, South China Agricultural University, Guangzhou, 510642, China.
  • Yonghua Wang
    School of Automation, Guangdong University of Technology, Guangzhou, 510006, China.
  • Yiju Zhan
  • Qingling Cai
  • Wanrong Gu
    College of Agriculture, Northeast Agricultural University, Harbin, China.