The Relationship between Sparseness and Energy Consumption of Neural Networks.

Journal: Neural Plasticity

Abstract

About 50–80% of the total energy of a neural network is consumed by signaling. A network with many active neurons consumes much more energy than one with few, so the sparseness of the network, that is, the ratio of active neurons to all neurons, directly affects its energy consumption. Laughlin's studies show that the sparseness of an energy-efficient code depends on the balance between signaling and fixed costs, but they provide neither an exact ratio of signaling to fixed costs nor the ratio of active neurons to all neurons in the most energy-efficient networks. In this paper, we calculated the ratio of signaling costs to fixed costs from physiological experimental data and found it to lie between 1.3 and 2.1. We then calculated the ratio of active neurons to all neurons in the most energy-efficient networks, which lies between 0.3 and 0.4. These results are consistent with data from many relevant physiological experiments, indicating that the model used in this paper may describe neural coding under realistic conditions. The calculations reported here may therefore be helpful to the study of neural coding.
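How a signaling-to-fixed-cost ratio translates into an optimal sparseness can be illustrated with a standard information-per-energy argument. The sketch below is a minimal illustration, not necessarily the exact model of the paper: it assumes a Levy–Baxter-style efficiency measure in which a binary neuron is active with probability p, an active neuron costs r times the fixed (resting) cost, and efficiency is the entropy H(p) divided by the expected energy 1 + r·p. The function name and the sweep over p are illustrative; r is taken from the 1.3–2.1 range reported in the abstract.

```python
# Minimal sketch, assuming a Levy-Baxter-style efficiency measure:
# bits transmitted per unit energy for a binary unit that is active
# with probability p, where an active unit costs r times the fixed cost.
# This is an illustration consistent with the abstract's numbers, not
# necessarily the paper's exact model.
import numpy as np

def efficiency(p, r):
    """Information per unit energy: H(p) / (1 + r * p).

    p : probability that a neuron is active (the sparseness)
    r : ratio of signaling cost to fixed cost (1.3-2.1 per the abstract)
    """
    entropy = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    return entropy / (1.0 + r * p)

p = np.linspace(0.01, 0.99, 9801)  # step of 0.0001, avoiding log2(0)
for r in (1.3, 2.1):
    p_opt = p[np.argmax(efficiency(p, r))]
    print(f"r = {r}: most efficient sparseness ~ {p_opt:.2f}")
```

Under these assumptions the maximizing sparseness is roughly 0.36 for r = 1.3 and 0.31 for r = 2.1, consistent with the 0.3–0.4 range reported in the abstract.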

Authors

  • Guanzheng Wang
    Institute for Cognitive Neurodynamics, School of Science, East China University of Science and Technology, 130 Meilong Road, Shanghai 200237, China.
  • Rubin Wang
  • Wanzeng Kong
    School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou, Zhejiang 310018, China.
  • Jianhai Zhang
    School of Computer Science and Technology, Dalian University of Technology, Dalian, China.