FxTS-Net: Fixed-time stable learning framework for Neural ODEs.

Journal: Neural Networks: The Official Journal of the International Neural Network Society
PMID:

Abstract

Neural Ordinary Differential Equations (Neural ODEs), a novel class of methods for modeling big data, elegantly link traditional neural networks with dynamical systems. However, it is challenging to ensure that the dynamical system reaches a correctly predicted state within a user-defined fixed time. To address this problem, we propose a new method for training Neural ODEs using fixed-time stability (FxTS) Lyapunov conditions. Our framework, called FxTS-Net, is based on a novel FxTS loss (FxTS-Loss) defined on Lyapunov functions, which encourages convergence to accurate predictions within a user-defined fixed time. We also provide an innovative approach for constructing Lyapunov functions that meets the requirements of various tasks and network architectures, achieved by leveraging supervised information during training. By developing a tighter upper bound on the settling time for bounded, non-vanishingly perturbed systems, we prove that minimizing the FxTS-Loss guarantees not only FxTS behavior of the dynamics but also robustness to input perturbations. To optimize the FxTS-Loss, we further propose a learning algorithm in which a simulated perturbation sampling method captures sample points in critical regions to approximate the FxTS-Loss. Experimentally, we find that FxTS-Net delivers better prediction performance and greater robustness under input perturbation.
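To make the core idea concrete, the following is a minimal sketch of an FxTS-style Lyapunov penalty along a sampled trajectory. All names, constants, and the toy dynamics here are illustrative assumptions, not the paper's exact formulation: it uses the standard fixed-time stability condition dV/dt ≤ −αV^p − βV^q (with p > 1, 0 < q < 1), a squared-distance Lyapunov candidate, and a forward-Euler integration of a simple contracting flow in place of a learned Neural ODE.

```python
import numpy as np

def lyapunov(x, target):
    """Candidate Lyapunov function: squared distance to the target state."""
    return np.sum((x - target) ** 2)

def fxts_loss(xs, dt, target, alpha=1.0, beta=1.0, p=1.5, q=0.5):
    """Hinge penalty on the FxTS condition dV/dt <= -alpha*V^p - beta*V^q,
    with dV/dt approximated by finite differences along the trajectory xs."""
    loss = 0.0
    for k in range(len(xs) - 1):
        v_now = lyapunov(xs[k], target)
        v_next = lyapunov(xs[k + 1], target)
        dv_dt = (v_next - v_now) / dt              # finite-difference dV/dt
        bound = -alpha * v_now**p - beta * v_now**q
        loss += max(0.0, dv_dt - bound)            # penalize condition violations
    return loss / (len(xs) - 1)

def simulate(x0, target, dt=0.01, steps=200, gain=3.0):
    """Toy stand-in for a Neural ODE: forward-Euler integration of the
    contracting flow dx/dt = -gain * (x - target)."""
    xs = [np.asarray(x0, dtype=float)]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x + dt * (-gain * (x - target)))
    return xs

target = np.array([0.0, 0.0])
traj = simulate([1.0, -1.0], target)
print(fxts_loss(traj, 0.01, target))
```

In FxTS-Net this penalty would be evaluated on trajectories of the learned dynamics (and on simulated-perturbation samples) and minimized jointly with the task loss; here the dynamics are fixed only so the sketch is self-contained.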

Authors

  • Chaoyang Luo
    Department of Mathematics, Sichuan University, Chengdu, Sichuan, 610064, China.
  • Yan Zou
    National Clinical Research Center of Oral Diseases, Shanghai 200011, China.
  • Wanying Li
Academy of Military Medical Sciences, Beijing 100850, China.
  • Nanjing Huang
    Department of Mathematics, Sichuan University, Chengdu, Sichuan, 610064, China. Electronic address: njhuang@scu.edu.cn.