The butterfly effect in neural networks: Unveiling hyperbolic chaos through parameter sensitivity.

Journal: Neural Networks: The Official Journal of the International Neural Network Society

Abstract

Neural networks often excel in short-horizon tasks, but their long-term reliability is less assured. We demonstrate that even a minimal architecture, trained on near-periodic data, can exhibit hyperbolic chaotic behavior after a small parameter perturbation. Drawing on classical dynamical systems theory, especially Lyapunov exponents and structural stability, we show that borderline-zero exponents do not shield multi-step forecasts from instability when genuine structural stability is absent. A vanishingly small weight change can radically alter long-horizon forecasting, contradicting the notion that strong local metrics ensure global robustness. We propose a simple "pinning" strategy that curbs runaway expansion by constraining certain outputs, yet borderline orbits remain a common pitfall in larger networks. Our findings underscore that short-horizon validation may fail to detect critical multi-step vulnerabilities, and that global diagnostics alongside structural stability are essential for reliable long-term forecasting.
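To make the mechanism concrete, here is a minimal sketch, not the paper's code: since the trained network is unavailable, the logistic map x_{t+1} = r·x·(1−x) with r = 3.9 (a standard chaotic one-dimensional system) stands in for the network's iterated dynamics, and a hypothetical 1e-7 shift in r plays the role of the tiny weight perturbation. The step and lyapunov helpers and all numeric values are our own assumptions.

    # Minimal sketch -- not the paper's code. The logistic map with r = 3.9
    # stands in for the network's iterated dynamics; a hypothetical 1e-7
    # shift in r stands in for the tiny weight perturbation.
    import math

    def step(x, r):
        return r * x * (1.0 - x)

    def lyapunov(x0, r, n=100000, burn=1000):
        # Largest Lyapunov exponent: time average of log|f'(x_t)|,
        # where f'(x) = r*(1 - 2x) for the logistic map.
        x, acc = x0, 0.0
        for t in range(n + burn):
            if t >= burn:
                acc += math.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)
            x = step(x, r)
        return acc / n

    r, dr = 3.9, 1e-7
    print(f"estimated Lyapunov exponent: {lyapunov(0.3, r):.3f}")  # ~0.5 > 0

    # Two runs from the SAME initial state, parameters differing by 1e-7.
    # The gap grows roughly like exp(lambda * t) until it saturates at the
    # size of the attractor: short horizons look fine, long horizons do not.
    xa = xb = 0.3
    for t in range(1, 61):
        xa, xb = step(xa, r), step(xb, r + dr)
        if t % 10 == 0:
            print(f"t={t:2d}  |xa - xb| = {abs(xa - xb):.3e}")

With these assumed values the two trajectories agree to several decimal places for roughly the first dozen steps, yet become uncorrelated within about forty: a short-horizon validation check would pass even though every long-horizon forecast fails, which is exactly the gap between local metrics and global robustness that the abstract describes.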

Authors

  • Jingyi Luo
    School of Mathematical Sciences & Center for Dynamical Systems and Differential Equations, Soochow University, Suzhou, 215006, Jiangsu, China.
  • Jianyu Chen
  • Hong-Kun Zhang
    Department of Mathematics & Statistics, University of Massachusetts, Amherst, MA 01003, USA. Electronic address: hongkunz@umass.edu.