Global exponential stability and dissipativity of generalized neural networks with time-varying delay signals.
Journal:
Neural Networks: The Official Journal of the International Neural Network Society
Published Date:
Dec 23, 2016
Abstract
This paper investigates the problems of exponential stability and dissipativity of generalized neural networks (GNNs) with time-varying delay signals. Novel Lyapunov-Krasovskii functionals (LKFs) with triple integral terms are constructed that exploit more information about the state vectors of the neural networks and the upper bound on the time-varying delay signals. A new integral inequality technique (IIT), a free-matrix-based (FMB) integral inequality approach, and the Wirtinger double integral inequality (WDII) technique, together with the reciprocally convex combination (RCC) approach, are employed to bound the time derivative of the LKFs. Improved exponential stability and strict (Q,S,R)-γ-dissipativity conditions for the addressed systems are expressed as linear matrix inequalities (LMIs). Finally, four numerical examples, including a practical application to a biological network, are presented to verify the usefulness of the proposed method.
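The criteria in the paper are delivered as LMI feasibility problems, which in practice are checked numerically with a semidefinite programming solver. The sketch below is not the paper's condition; it is a minimal illustration, assuming CVXPY with the SCS solver is available and using hypothetical system matrices, of how a much simpler delay-independent stability LMI for a constant-delay linear system dx/dt = A x(t) + Ad x(t - d), obtained from the basic functional V(x_t) = x(t)'P x(t) + integral over [t-d, t] of x(s)'Q x(s) ds, can be tested.

    import numpy as np
    import cvxpy as cp

    # Hypothetical matrices for a small constant-delay linear system
    #   dx/dt = A x(t) + Ad x(t - d)
    # (stand-ins only; not the GNN model or the LMIs derived in the paper)
    n = 2
    A = np.array([[-2.0, 0.2],
                  [0.1, -2.5]])
    Ad = np.array([[-0.3, 0.1],
                   [0.2, -0.4]])

    # LMI decision variables P, Q from
    #   V(x_t) = x(t)'P x(t) + int_{t-d}^{t} x(s)'Q x(s) ds
    P = cp.Variable((n, n), symmetric=True)
    Q = cp.Variable((n, n), symmetric=True)
    eps = 1e-6  # small margin to enforce strict inequalities

    # Delay-independent stability LMI:
    #   [ A'P + PA + Q   P Ad ]
    #   [ Ad'P            -Q  ]  < 0,   P > 0,   Q > 0
    M = cp.bmat([[A.T @ P + P @ A + Q, P @ Ad],
                 [Ad.T @ P,            -Q]])

    constraints = [P >> eps * np.eye(n),
                   Q >> eps * np.eye(n),
                   M << -eps * np.eye(2 * n)]

    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve(solver=cp.SCS)
    print("LMI feasible (stability certified):", prob.status == cp.OPTIMAL)

The paper's conditions play the same role but involve richer LKFs (triple integral terms) and tighter bounding techniques (IIT, FMB, WDII, RCC), which yield LMIs in more decision variables and less conservative delay bounds.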