A global neural network learning machine: Coupled integer and fractional calculus operator with an adaptive learning scheme.

Journal: Neural Networks: the official journal of the International Neural Network Society
Published Date:

Abstract

Finding the global optimal solution of a model is a promising research topic in the computational intelligence community. Relying on analogies to natural processes, evolutionary and swarm intelligence algorithms, guided by fitness values, are widely used for solving global optimization problems. In this paper, we propose an efficient fractional global learning machine (Fragmachine) that uses two stages (descending and ascending) to determine the optimal search path. A neural network model is used to approximate the given fitness value. Specifically, in the descending stage, the integer-order gradient of the network output with respect to the current location is employed to find the next descending point, while in the ascending stage, the fractional-order gradient is used to climb out of and escape from local optima. We further propose an adaptive learning rate during training that relies on both the current (integer- or fractional-order) gradient information and the fitness value. Finally, a series of numerical experiments verify the effectiveness of the proposed algorithm, Fragmachine.
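The alternating descending/ascending search described above can be illustrated with a minimal sketch. This is not the authors' implementation: the neural-network surrogate is replaced by a plain fitness callable, the fractional gradient uses a common Caputo-type approximation of order alpha, and the adaptive learning-rate rule (shrinking with the gradient norm and the current fitness) is an assumed form for illustration only.

```python
import numpy as np
from math import gamma

def fractional_grad(grad, x, x_ref, alpha=0.9, eps=1e-8):
    """Caputo-type fractional gradient approximation of order alpha (assumed form):
    scales the integer gradient by |x - x_ref|^(1 - alpha) / Gamma(2 - alpha)."""
    return grad * np.abs(x - x_ref + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)

def numerical_grad(f, x, h=1e-6):
    """Central-difference gradient of the fitness function f at x
    (stands in for the gradient of the surrogate network output)."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def fragmachine_sketch(f, x0, alpha=0.9, iters=200, lr0=0.05):
    """Alternate a descending stage (integer-order gradient step) with an ascending
    stage (fractional-order gradient step) used to climb out of local minima.
    The learning rate below is a hypothetical adaptive rule combining the
    gradient norm and the current fitness value."""
    x, best_x, best_f = x0.copy(), x0.copy(), f(x0)
    for _ in range(iters):
        g = numerical_grad(f, x)
        lr = lr0 / (1.0 + np.linalg.norm(g) + abs(f(x)))   # assumed adaptive rule
        x = x - lr * g                                      # descending stage
        if f(x) < best_f:
            best_x, best_f = x.copy(), f(x)
        elif np.linalg.norm(g) < 1e-3:                      # stuck near a local minimum
            fg = fractional_grad(g, x, best_x, alpha)
            x = x + lr * fg                                  # ascending stage: climb out
    return best_x, best_f

# Usage: minimize the Rastrigin function, a standard multimodal benchmark.
rastrigin = lambda x: 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
x_opt, f_opt = fragmachine_sketch(rastrigin, np.array([3.0, -2.5]))
```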

Authors

  • Huaqing Zhang
    College of Control Science and Engineering, China University of Petroleum (East China), Qingdao, 266580, China; College of Science, China University of Petroleum (East China), Qingdao, 266580, China.
  • Yi-Fei Pu
College of Computer Science, Sichuan University, Chengdu, 610065, China.
  • Xuetao Xie
    College of Computer Science, Sichuan University, Chengdu, 610065, China. Electronic address: xuetao_xie@163.com.
  • Bingran Zhang
    Department of Mathematics, University College London, London, WC1E 6BT, UK.
  • Jian Wang
    Veterinary Diagnostic Center, Shanghai Animal Disease Control Center, Shanghai, China.
  • Tingwen Huang