Feature flow regularization: Improving structured sparsity in deep neural networks.

Journal: Neural Networks: The Official Journal of the International Neural Network Society
PMID:

Abstract

Pruning is a model compression method that removes redundant parameters and accelerates the inference speed of deep neural networks (DNNs) while maintaining accuracy. Most available pruning methods impose conditions directly on parameters or features. In this paper, we propose a simple and effective regularization strategy that improves structured sparsity and structured pruning in DNNs from a new perspective: the evolution of features. In particular, we consider the trajectories connecting features of adjacent hidden layers, which we call the feature flow. We propose feature flow regularization (FFR) to penalize the length and the total absolute curvature of these trajectories, which implicitly increases the structured sparsity of the parameters. The principle behind FFR is that short and straight trajectories lead to an efficient network that avoids redundant parameters. Experiments on the CIFAR-10 and ImageNet datasets show that FFR improves structured sparsity and achieves pruning results comparable to or better than those of state-of-the-art methods.
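
To make the idea concrete, below is a minimal sketch of how a penalty on trajectory length and discrete curvature of layer-wise features could be computed. This is an illustrative assumption based only on the abstract, not the paper's actual implementation: the function name feature_flow_penalty, the weights alpha and beta, and the assumption that features from adjacent layers have been flattened or projected to a common shape are all hypothetical.

```python
import torch

def feature_flow_penalty(features, alpha=1.0, beta=1.0):
    # Hypothetical sketch of a feature-flow-style regularizer (not the paper's code).
    # features: list of tensors [f_0, f_1, ..., f_L], one per hidden layer,
    # assumed to be flattened/projected to the same shape (batch, dim).
    # "Length" sums the norms of first differences between adjacent layers;
    # "curvature" uses second differences as a discrete proxy for total
    # absolute curvature of the feature trajectory.
    length = sum(
        torch.norm(features[l + 1] - features[l], dim=1).mean()
        for l in range(len(features) - 1)
    )
    curvature = sum(
        torch.norm(features[l + 2] - 2 * features[l + 1] + features[l], dim=1).mean()
        for l in range(len(features) - 2)
    )
    return alpha * length + beta * curvature

# Example usage (hypothetical): add the penalty to the task loss during training.
# feats = [f.flatten(1) for f in hidden_features]
# loss = criterion(output, target) + lam * feature_flow_penalty(feats)
```

In this reading, driving the penalty down encourages short, straight feature trajectories, which in turn should suppress redundant channels or neurons that can then be pruned structurally.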

Authors

  • Yue Wu
    Key Laboratory of Luminescence and Real-Time Analytical Chemistry (Ministry of Education), College of Pharmaceutical Sciences, Southwest University, Chongqing 400716, China.
  • Yuan Lan
    Department of Mathematics, The Hong Kong University of Science and Technology, Clear Water Bay, Kowloon, Hong Kong.
  • Luchan Zhang
    College of Mathematics and Statistics, Shenzhen University, Shenzhen 518060, China. Electronic address: zhanglc@szu.edu.cn.
  • Yang Xiang
    Department of Biomedical Informatics, The Ohio State University, Columbus, OH 43210, USA.