Sparse Bayesian Learning Based on Collaborative Neurodynamic Optimization.

Journal: IEEE Transactions on Cybernetics

Abstract

Regression in the sparse Bayesian learning (SBL) framework is usually formulated as a global optimization problem with a nonconvex objective function and solved within a majorization-minimization framework, where solution quality and consistency depend heavily on the algorithm's initial values. To address these shortcomings, this article presents an SBL algorithm based on collaborative neurodynamic optimization (CNO) for finding globally optimal solutions to the formulated optimization problem. The CNO system consists of a population of recurrent neural networks (RNNs), each of which converges to a local optimum of the global optimization problem. Repeatedly reinitialized via particle swarm optimization with exchanged local-optima information, the RNNs iteratively improve their search performance until global convergence is reached. The proposed CNO-based SBL algorithm is almost surely convergent to a globally optimal solution of the formulated global optimization problem. Two applications, sparse signal reconstruction and partial differential equation identification, are elaborated with experimental results to substantiate the efficacy and superiority of the proposed method in terms of solution optimality and consistency.
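The search scheme described in the abstract — a population of local optimizers whose starting points are repeatedly reset by particle swarm optimization (PSO) using shared local-optima information — can be sketched on a toy problem. Everything below is an illustrative assumption, not the paper's method: the objective `f` is a generic nonconvex function standing in for the SBL cost, the gradient-descent `local_search` stands in for an RNN's convergent neurodynamics, and the PSO constants are conventional defaults.

```python
import numpy as np

# Toy nonconvex objective with many local minima (assumption: stands in
# for the nonconvex SBL cost; the paper's objective is not reproduced here).
def f(x):
    return x**2 + 10 * np.sin(3 * x)

def grad_f(x):
    return 2 * x + 30 * np.cos(3 * x)

def local_search(x0, lr=0.01, steps=300):
    """Gradient descent as a stand-in for one RNN converging to a local optimum."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad_f(x)
    return x

rng = np.random.default_rng(0)
n_rnn, n_rounds = 8, 20
pos = rng.uniform(-5, 5, n_rnn)        # initial states of the "RNNs"
vel = np.zeros(n_rnn)
pbest = pos.copy()                     # each unit's best local optimum so far
pbest_val = np.full(n_rnn, np.inf)
gbest, gbest_val = 0.0, np.inf         # best local optimum across the population

for _ in range(n_rounds):
    # Each unit converges to a local optimum from its current initial state.
    opt = np.array([local_search(p) for p in pos])
    val = f(opt)
    # Exchange local-optima information: update personal and global bests.
    better = val < pbest_val
    pbest[better], pbest_val[better] = opt[better], val[better]
    i = val.argmin()
    if val[i] < gbest_val:
        gbest, gbest_val = opt[i], val[i]
    # PSO-style reinitialization of the units' initial states.
    r1, r2 = rng.random(n_rnn), rng.random(n_rnn)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(opt + vel, -5, 5)

print(gbest, gbest_val)
```

Each round alternates the two ingredients the abstract names: convergent local dynamics (here, plain gradient descent) and a PSO update that biases the next batch of initial states toward the best local optima found so far, so the population's best value is nonincreasing across rounds.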

Authors

  • Wei Zhou
    Department of Eye Function Laboratory, Eye Hospital, China Academy of Chinese Medical Sciences, Beijing, China.
  • Hai-Tao Zhang
    School of Artificial Intelligence and Automation, MOE Key Lab of Intelligent Control and Image Processing, Huazhong University of Science and Technology, Wuhan 430074, China.
  • Jun Wang
    Department of Speech, Language, and Hearing Sciences and the Department of Neurology, The University of Texas at Austin, Austin, TX 78712, USA.