Efficient inexact proximal gradient algorithms for structured sparsity-inducing norm.

Journal: Neural networks : the official journal of the International Neural Network Society

Abstract

Structured-sparsity regularization is popular in sparse learning because of its flexibility in encoding feature structures. This paper considers a generalized version of structured-sparsity regularization (in particular, the ℓ1/ℓ∞ norm) with arbitrary group overlap. Because of the group overlap, the associated proximal operator is time-consuming to solve. Although Mairal et al. proposed a network-flow algorithm for this proximal operator, it remains expensive, especially in the high-dimensional setting. To address this challenge, we develop a more efficient solution for the ℓ1/ℓ∞ group lasso with arbitrary group overlap using an inexact proximal gradient method. In each iteration, our algorithm only requires an inexact solution to the proximal subproblem, which can be computed efficiently. On the theoretical side, the proposed algorithm enjoys the same global convergence rate as exact proximal methods. Experiments demonstrate that our algorithm is much more efficient than the network-flow algorithm while retaining similar generalization performance.
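The idea of an inexact proximal gradient step can be illustrated with a small NumPy sketch. This is not the paper's algorithm: it is a toy least-squares problem with an ℓ1/ℓ∞ overlapping group penalty, where the (closed-form-free) proximal operator is approximated by a few cyclic passes of each group's individual ℓ∞ prox. The per-group prox uses the standard Moreau decomposition, prox of τ‖·‖∞ equals the input minus its projection onto the ℓ1-ball of radius τ. All function names and the cyclic inner loop are illustrative choices, not the method from the paper.

```python
import numpy as np

def proj_l1_ball(v, radius):
    """Euclidean projection of v onto the l1-ball of the given radius (sort-based)."""
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    # Largest index where the running threshold is still below the sorted entry.
    rho = np.nonzero(u - (css - radius) / (np.arange(len(u)) + 1.0) > 0)[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def prox_linf(v, tau):
    """Prox of tau*||.||_inf via Moreau decomposition: v minus its l1-ball projection."""
    return v - proj_l1_ball(v, tau)

def inexact_group_prox(v, groups, tau, n_inner=3):
    """Cheap approximation of the prox of tau * sum_g ||v_g||_inf with overlapping
    groups: a few cyclic passes applying each group's prox in place. With overlap
    the exact prox has no closed form; this inner loop is only a surrogate."""
    x = v.copy()
    for _ in range(n_inner):
        for g in groups:
            x[g] = prox_linf(x[g], tau)
    return x

def inexact_prox_gradient(A, b, groups, lam, n_iter=200, n_inner=3):
    """Inexact proximal gradient for 0.5*||Ax - b||^2 + lam * sum_g ||x_g||_inf."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = inexact_group_prox(x - grad / L, groups, lam / L, n_inner)
    return x
```

Each outer iteration costs one gradient evaluation plus a handful of per-group prox calls, rather than an exact network-flow solve of the full proximal subproblem, which is the efficiency trade-off the abstract describes.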

Authors

  • Bin Gu
    Department of Critical Care Medicine, The First Affiliated Hospital, Sun Yat-sen University, Guangzhou, China.
  • Xiang Geng
    Department of Radiology and Research Institute of Radiology, The Affiliated Cancer Hospital of Zhengzhou University, Henan Cancer Hospital, Zhengzhou, China.
  • Xiang Li
    Department of Radiology, Massachusetts General Hospital and Harvard Medical School, Boston, MA, United States.
  • Guansheng Zheng
    School of Computer and Software, Nanjing University of Information Science & Technology, Nanjing, PR China.