Fractional-order stochastic gradient descent method with momentum and energy for deep neural networks.
Journal:
Neural networks : the official journal of the International Neural Network Society
PMID:
39447432
Abstract
In this paper, a novel fractional-order stochastic gradient descent with momentum and energy (FOSGDME) approach is proposed. Specifically, to address the difficulty that existing fractional gradient algorithms have in converging to a true extreme point, a novel fractional-order stochastic gradient descent (FOSGD) method is presented by modifying the definition of the Caputo fractional-order derivative. A FOSGD with momentum (FOSGDM) is then established by incorporating momentum information to further accelerate convergence and improve accuracy. In addition, to improve robustness and accuracy, a FOSGD with momentum and energy is established by further introducing an energy term. Extensive experimental results on the CIFAR-10 image classification dataset, obtained with ResNet and DenseNet, demonstrate that the proposed FOSGD, FOSGDM and FOSGDME algorithms are superior to integer-order optimization algorithms and achieve state-of-the-art performance.
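The abstract does not spell out the update rules, but Caputo-style fractional gradient methods commonly scale the gradient by a factor of the form |θ_k − θ_{k−1}|^{1−α} / Γ(2−α) for a fractional order 0 < α < 1. The sketch below is a minimal illustration of such a fractional step combined with a classical momentum buffer; the function name, hyperparameters, and exact update form are assumptions for illustration, not the paper's FOSGDME algorithm.

```python
import math

def fosgd_momentum_step(theta, theta_prev, velocity, grad,
                        lr=0.1, alpha=0.9, beta=0.9, eps=1e-8):
    """One hypothetical Caputo-style fractional SGD step with momentum.

    Scales the gradient by |theta - theta_prev|^(1 - alpha) / Gamma(2 - alpha),
    a factor typical of Caputo fractional-order gradient methods (0 < alpha < 1).
    Illustrative sketch only, not the paper's exact FOSGDME update.
    """
    frac = abs(theta - theta_prev) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
    velocity = beta * velocity + (frac + eps) * grad  # momentum on the fractional gradient
    theta_new = theta - lr * velocity
    return theta_new, theta, velocity  # new point, new "previous" point, updated buffer

# Toy usage: minimize f(x) = x^2, whose gradient is 2x.
theta, theta_prev, velocity = 2.0, 1.9, 0.0
for _ in range(200):
    theta, theta_prev, velocity = fosgd_momentum_step(
        theta, theta_prev, velocity, grad=2.0 * theta)
```

Note that as α → 1 the fractional factor tends to |Δθ|⁰ / Γ(1) = 1, so the sketch reduces to ordinary SGD with momentum, which is one way such methods generalize integer-order optimizers.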