AIMC Topic: Systems Theory

Showing 1 to 10 of 13 articles

Learning in deep neural networks and brains with similarity-weighted interleaved learning.

Proceedings of the National Academy of Sciences of the United States of America
Understanding how the brain learns throughout a lifetime remains a long-standing challenge. In artificial neural networks (ANNs), incorporating novel information too rapidly results in catastrophic interference, i.e., abrupt loss of previously acquir...
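A minimal sketch of the failure mode and the interleaving remedy the abstract refers to (not the authors' similarity-weighted scheme; the toy network, data, and hyperparameters below are assumptions): a small Python/NumPy regressor forgets the first half of a sine curve when retrained only on the second half, but retains both halves when old and new samples are interleaved.

    import numpy as np

    rng = np.random.default_rng(0)

    def init(h=32):
        # one-hidden-layer tanh regressor, parameters [W1, b1, W2, b2]
        return [rng.normal(0, 0.5, (1, h)), np.zeros(h),
                rng.normal(0, 0.5, (h, 1)), np.zeros(1)]

    def forward(p, x):
        W1, b1, W2, b2 = p
        a = np.tanh(x @ W1 + b1)
        return a, a @ W2 + b2

    def train(p, x, y, lr=0.05, epochs=5000):
        W1, b1, W2, b2 = p
        for _ in range(epochs):
            a, out = forward([W1, b1, W2, b2], x)
            err = (out - y) / len(x)            # gradient of 0.5 * mean squared error
            gW2, gb2 = a.T @ err, err.sum(0)
            da = err @ W2.T * (1 - a ** 2)      # backprop through tanh
            gW1, gb1 = x.T @ da, da.sum(0)
            W1 -= lr * gW1; b1 -= lr * gb1
            W2 -= lr * gW2; b2 -= lr * gb2
        return [W1, b1, W2, b2]

    def mse(p, x, y):
        return float(np.mean((forward(p, x)[1] - y) ** 2))

    xa = np.linspace(-3, 0, 60)[:, None]; ya = np.sin(xa)   # "old" task
    xb = np.linspace(0, 3, 60)[:, None];  yb = np.sin(xb)   # "new" task

    p_seq = train(train(init(), xa, ya), xb, yb)            # old task, then new task only
    print("sequential  | old-task MSE:", round(mse(p_seq, xa, ya), 3))

    xm, ym = np.vstack([xa, xb]), np.vstack([ya, yb])       # interleave old and new samples
    p_mix = train(init(), xm, ym)
    print("interleaved | old-task MSE:", round(mse(p_mix, xa, ya), 3))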

A complementary learning systems approach to temporal difference learning.

Neural Networks: The Official Journal of the International Neural Network Society
Complementary Learning Systems (CLS) theory suggests that the brain uses a 'neocortical' and a 'hippocampal' learning system to achieve complex behaviour. These two systems are complementary in that the 'neocortical' system relies on slow learning of...
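As background for the temporal-difference component (the fast 'hippocampal' and slow 'neocortical' pathways of the paper are not modelled here; the chain task below is an assumed toy example), a tabular TD(0) value update in Python:

    import numpy as np

    n_states, alpha, gamma = 5, 0.1, 0.9
    V = np.zeros(n_states)                      # state-value estimates

    def step(s):
        # deterministic toy chain: always move right, reward 1 on reaching the end
        s_next = min(s + 1, n_states - 1)
        r = 1.0 if s_next == n_states - 1 else 0.0
        return s_next, r, s_next == n_states - 1

    for _ in range(500):
        s, done = 0, False
        while not done:
            s_next, r, done = step(s)
            target = r if done else r + gamma * V[s_next]
            V[s] += alpha * (target - V[s])     # TD(0): move V(s) toward the bootstrapped target
            s = s_next

    print(np.round(V, 3))                       # approaches [0.729, 0.81, 0.9, 1.0, 0.0]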

DynMat, a network that can learn after learning.

Neural Networks: The Official Journal of the International Neural Network Society
To survive in a dynamically evolving world, we accumulate knowledge and improve our skills based on experience. In the process, gaining new knowledge does not disrupt our vigilance to external stimuli. In other words, our learning process is 'accum...

The stability of memristive multidirectional associative memory neural networks with time-varying delays in the leakage terms via sampled-data control.

PLoS ONE
In this paper, we propose a new model of memristive multidirectional associative memory neural networks that includes time-varying delays in the leakage terms and is stabilised via sampled-data control. We use the input delay method to turn the sampling system into...

Resilience of and recovery strategies for weighted networks.

PLoS ONE
The robustness and resilience of complex networks have been widely studied and discussed in both research and industry because, today, the diversity of system components and the complexity of the connections between units are increasingly influencing t...
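For orientation only (the graph, attack rule, and sizes below are assumptions, and the paper's recovery strategies are not reproduced), a short Python/networkx sketch that tracks how the giant component of a weighted network shrinks under targeted node removal:

    import random
    import networkx as nx

    random.seed(0)
    G = nx.gnm_random_graph(100, 300, seed=0)
    for u, v in G.edges:
        G[u][v]["weight"] = random.random()      # random positive edge weights

    def giant_fraction(H):
        if H.number_of_nodes() == 0:
            return 0.0
        return max(len(c) for c in nx.connected_components(H)) / 100

    H = G.copy()
    for removed in range(0, 60, 10):
        print(f"removed {removed:3d} nodes, giant component fraction = {giant_fraction(H):.2f}")
        # attack: drop the 10 nodes with the largest strength (weighted degree)
        strength = dict(H.degree(weight="weight"))
        targets = sorted(strength, key=strength.get, reverse=True)[:10]
        H.remove_nodes_from(targets)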

A New Correntropy-Based Conjugate Gradient Backpropagation Algorithm for Improving Training in Neural Networks.

IEEE Transactions on Neural Networks and Learning Systems
Mean square error (MSE) is the most prominent criterion for training neural networks and has been employed in numerous learning problems. In this paper, we suggest a group of novel robust information-theoretic backpropagation (BP) methods, as correntr...
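To illustrate the contrast the abstract draws (a hedged sketch of the general correntropy idea, not the authors' conjugate-gradient BP algorithm; the toy data and kernel width are assumptions): with a Gaussian kernel, the correntropy criterion down-weights large errors, so gross outliers distort a fit far less than under MSE.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 100)
    y = 2.0 * x + 0.1 * rng.normal(size=100)     # true slope is 2.0
    y[np.argsort(x)[:5]] += 8.0                  # gross outliers at the most negative x

    def fit(grad, lr=0.05, epochs=500):
        w = 0.0
        for _ in range(epochs):
            e = y - w * x                        # residuals of the current fit
            w += lr * np.mean(grad(e) * x)       # ascend the chosen criterion
        return w

    sigma = 1.0
    mse_grad = lambda e: e                                   # every error pulls with full weight
    mcc_grad = lambda e: np.exp(-e**2 / (2 * sigma**2)) * e  # Gaussian kernel shrinks large-error pull

    print("MSE slope        :", round(fit(mse_grad), 2))
    print("correntropy slope:", round(fit(mcc_grad), 2))     # much closer to the true 2.0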

A symbolic network-based nonlinear theory for dynamical systems observability.

Scientific Reports
When the state of the whole reaction network can be inferred just by measuring the dynamics of a limited set of nodes, the system is said to be fully observable. However, as the number of all possible combinations of measured variables and time deriva...
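The paper's symbolic, nonlinear machinery is not reproduced here; as a reference point for the same question in the linear case (which measured variable suffices to reconstruct the full state), a short Python sketch of the Kalman observability rank condition on an assumed three-node chain system:

    import numpy as np

    # assumed chain x3 -> x2 -> x1: information flows downstream, so only the
    # most downstream variable reveals the whole state
    A = np.array([[-1.0,  1.0,  0.0],
                  [ 0.0, -2.0,  1.0],
                  [ 0.0,  0.0, -3.0]])

    def fully_observable(A, C):
        n = A.shape[0]
        # observability matrix O = [C; CA; CA^2; ...; CA^(n-1)]
        O = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])
        return np.linalg.matrix_rank(O) == n

    for i in range(3):
        C = np.zeros((1, 3)); C[0, i] = 1.0      # measure variable x_{i+1} alone
        print(f"measuring x{i+1} only -> fully observable: {fully_observable(A, C)}")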

A New Hybrid BFOA-PSO Optimization Technique for Decoupling and Robust Control of Two-Coupled Distillation Column Process.

Computational Intelligence and Neuroscience
The two-coupled distillation column process is a physically complicated system in many aspects. Specifically, the nested interrelationship between system inputs and outputs constitutes one of the significant challenges in system control design. Mostl...
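Since the abstract centres on a hybrid optimiser, a bare-bones particle swarm optimisation loop in Python may help orient the reader (the bacterial-foraging hybridisation and the actual controller-tuning objective are not reproduced; the Rosenbrock function below is an assumed stand-in cost):

    import numpy as np

    rng = np.random.default_rng(0)

    def cost(p):
        # hypothetical tuning objective (Rosenbrock); minimum 0 at (1, 1)
        x, y = p
        return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

    n, dim, iters = 30, 2, 200
    w, c1, c2 = 0.7, 1.5, 1.5                    # inertia and acceleration weights

    pos = rng.uniform(-2, 2, (n, dim))
    vel = np.zeros((n, dim))
    pbest = pos.copy()
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[np.argmin(pbest_cost)].copy()

    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[np.argmin(pbest_cost)].copy()

    print("best parameters:", np.round(gbest, 3), "cost:", round(cost(gbest), 5))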

Structure Identification of Uncertain Complex Networks Based on Anticipatory Projective Synchronization.

PLoS ONE
This paper investigates a method to identify uncertain system parameters and unknown topological structure in general complex networks with or without time delay. A complex network, which has uncertain topology and unknown parameters, is designed as ...

Robust stability of stochastic fuzzy delayed neural networks with impulsive time window.

Neural Networks: The Official Journal of the International Neural Network Society
The urgent problem of impulsive moments that cannot be determined in advance brings new challenges beyond conventional impulsive systems theory. To solve this problem, the novel concept of an impulsive time window is proposed in this paper...