A global neural network learning machine: Coupled integer and fractional calculus operator with an adaptive learning scheme.
Journal:
Neural Networks: The Official Journal of the International Neural Network Society
Published Date:
Nov 1, 2021
Abstract
Finding the global optimal solution of a model is a promising research topic in the computational intelligence community. Drawing on analogies to natural processes, evolutionary and swarm intelligence algorithms, guided by fitness values, are widely used for solving global optimization problems. In this paper, we propose an efficient fractional global learning machine (Fragmachine) that uses two stages (descending and ascending) to determine the optimal search path. A neural network model is used to approximate the given fitness value. Specifically, in the descending stage, the integer-order gradient of the network output with respect to the current location is employed to find the next descending point, while in the ascending stage, the fractional-order gradient is used to climb out of and escape from the local optimum. We further propose an adaptive learning rate during training that relies on both the current gradient information (integer or fractional) and the fitness value. Finally, a series of numerical experiments verifies the effectiveness of the proposed algorithm, Fragmachine.
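
The following is a minimal Python sketch of the two-stage idea described in the abstract, under several simplifying assumptions: the gradients are taken on the fitness function directly rather than on a trained surrogate network, the fractional gradient uses a generic truncated Grunwald-Letnikov approximation rather than the paper's exact operator, and the adaptive learning-rate rule (fitness value over squared gradient norm, capped) is only one plausible reading of the abstract. The names fitness, integer_grad, fractional_grad, and fragmachine_sketch are hypothetical and not from the paper.

import numpy as np

# Hypothetical toy objective (not from the paper): the 2-D Rastrigin function,
# which has many local minima and a global minimum of 0 at the origin.
def fitness(x):
    return 10.0 * len(x) + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x))

def integer_grad(f, x, h=1e-5):
    # Central-difference estimate of the ordinary (integer-order) gradient.
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def fractional_grad(f, x, alpha=0.7, h=1e-2, terms=20):
    # Truncated Grunwald-Letnikov approximation of a fractional-order
    # (0 < alpha < 1) gradient; a generic stand-in for the paper's operator.
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        coeff, acc = 1.0, f(x)
        for k in range(1, terms):
            coeff *= (k - 1.0 - alpha) / k   # (-1)^k * binom(alpha, k), computed recursively
            acc += coeff * f(x - k * e)
        g[i] = acc / h ** alpha
    return g

def fragmachine_sketch(f, x0, outer_iters=50, alpha=0.7):
    # Two-stage search: integer-gradient descent toward a local minimum, then a
    # fractional-gradient ascent step to climb out of it. The learning-rate rule
    # below is an illustrative assumption, not the paper's exact scheme.
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for _ in range(outer_iters):
        # Descending stage: ordinary gradient descent until the gradient is small.
        for _ in range(200):
            g = integer_grad(f, x)
            if np.linalg.norm(g) < 1e-3:
                break
            lr = min(0.1, f(x) / (np.linalg.norm(g) ** 2 + 1e-8))
            x = x - lr * g
        if f(x) < best_f:
            best_x, best_f = x.copy(), f(x)
        # Ascending stage: a step along the fractional gradient to escape the local minimum.
        fg = fractional_grad(f, x, alpha=alpha)
        x = x + (0.5 / (np.linalg.norm(fg) + 1e-8)) * fg
    return best_x, best_f

if __name__ == "__main__":
    x_star, f_star = fragmachine_sketch(fitness, x0=[3.0, -2.5])
    print("best point:", x_star, "best fitness:", f_star)

Run as an ordinary script; the alternation of descent and fractional-gradient ascent is meant only to illustrate how the two stages interact, and any quantitative behaviour on the toy objective should not be read as the paper's reported results.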