AIMC Topic: Models, Neurological

Showing 51 to 60 of 1142 articles

Reconstruction of Adaptive Leaky Integrate-and-Fire Neuron to Enhance the Spiking Neural Networks Performance by Establishing Complex Dynamics.

IEEE transactions on neural networks and learning systems
Since digital spiking signals can carry rich information and propagate at low computational cost, spiking neural networks (SNNs) have received great attention from neuroscientists and are regarded as a promising direction for the future development of neural n...
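The abstract is truncated above; as a rough illustration of the general kind of neuron model named in the title, here is a minimal adaptive leaky integrate-and-fire (ALIF) update in Python. The time constants, threshold adaptation rule, and parameter values are generic textbook assumptions, not the reconstruction proposed in the article.

```python
def alif_step(v, a, i_in, dt=1.0, tau_m=20.0, tau_a=200.0,
              v_th=1.0, beta=0.2, v_reset=0.0):
    """One Euler step of a generic adaptive leaky integrate-and-fire neuron.

    v    : membrane potential
    a    : adaptation variable (raises the effective threshold)
    i_in : input current
    Returns the updated (v, a, spike).
    """
    # Leaky integration of the input current.
    v = v + dt / tau_m * (-v + i_in)
    # The adaptation variable decays back toward zero.
    a = a + dt / tau_a * (-a)
    # Spike when the membrane crosses the adapted threshold.
    spike = v >= v_th + beta * a
    if spike:
        v = v_reset   # reset the membrane potential
        a += 1.0      # each spike increments the adaptation variable
    return v, a, spike

# Example: drive the neuron with a constant current and count spikes.
v, a, count = 0.0, 0.0, 0
for _ in range(200):
    v, a, s = alif_step(v, a, i_in=1.5)
    count += int(s)
print(f"spike count over 200 steps: {count}")
```

The adaptation variable makes the firing rate decrease under sustained input, which is the basic "complex dynamics" ingredient that plain LIF neurons lack.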

Memory-Dependent Computation and Learning in Spiking Neural Networks Through Hebbian Plasticity.

IEEE transactions on neural networks and learning systems
Spiking neural networks (SNNs) are the basis for many energy-efficient neuromorphic hardware systems. While there has been substantial progress in SNN research, artificial SNNs still lack many capabilities of their biological counterparts. In biologi...
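For readers unfamiliar with the mechanism named in the title, below is a minimal sketch of a trace-based Hebbian weight update driven by pre- and postsynaptic spikes. The trace formulation, symmetric update, and learning rate are illustrative assumptions, not the plasticity rule studied in the article.

```python
import numpy as np

def hebbian_trace_update(w, pre_spikes, post_spikes, x_pre, x_post,
                         dt=1.0, tau=20.0, lr=0.01, w_max=1.0):
    """Generic trace-based Hebbian update for a weight matrix w (post x pre).

    pre_spikes, post_spikes : binary spike vectors at this time step
    x_pre, x_post           : low-pass filtered spike traces
    """
    # Exponentially decaying eligibility traces of recent spikes.
    x_pre = x_pre + dt / tau * (-x_pre) + pre_spikes
    x_post = x_post + dt / tau * (-x_post) + post_spikes
    # Hebbian term: strengthen synapses where pre and post activity coincide.
    dw = lr * (np.outer(post_spikes, x_pre) + np.outer(x_post, pre_spikes))
    w = np.clip(w + dw, 0.0, w_max)
    return w, x_pre, x_post

# Example with 3 presynaptic and 2 postsynaptic neurons firing at random.
rng = np.random.default_rng(0)
w = rng.uniform(0.0, 0.5, size=(2, 3))
x_pre, x_post = np.zeros(3), np.zeros(2)
for _ in range(100):
    pre = (rng.random(3) < 0.1).astype(float)
    post = (rng.random(2) < 0.1).astype(float)
    w, x_pre, x_post = hebbian_trace_update(w, pre, post, x_pre, x_post)
print(w)
```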

Bio-plausible reconfigurable spiking neuron for neuromorphic computing.

Science advances
Biological neurons use diverse temporal expressions of spikes to achieve efficient communication and modulation of neural activities. Nonetheless, existing neuromorphic computing systems mainly use simplified neuron models with limited spiking behavi...

Spatio-temporal transformers for decoding neural movement control.

Journal of neural engineering
Deep learning tools applied to high-resolution neurophysiological data have significantly progressed, offering enhanced decoding, real-time processing, and readability for practical applications. However, the design of artificial neural networks to...

Schizophrenia recognition based on three-dimensional adaptive graph convolutional neural network.

Scientific reports
Previous deep learning-based brain network research has made significant progress in understanding the pathophysiology of schizophrenia. However, it ignores the three-dimensional spatial characteristics of EEG signals and cannot dynamically learn the...
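As background for the graph-convolutional approach named in the title, the sketch below shows a single graph convolution over EEG channels with a learnable adjacency matrix. The softmax normalization, shapes, and activation are generic assumptions and do not reproduce the paper's three-dimensional adaptive architecture.

```python
import numpy as np

def graph_conv(x, adj_logits, weight):
    """One generic graph-convolution layer over EEG channels.

    x          : (channels, features) per-electrode feature matrix
    adj_logits : (channels, channels) learnable adjacency logits
    weight     : (features, out_features) linear projection
    """
    # A row-wise softmax turns the logits into a normalized adjacency,
    # so the graph structure itself can be learned by gradient descent.
    a = np.exp(adj_logits - adj_logits.max(axis=1, keepdims=True))
    a = a / a.sum(axis=1, keepdims=True)
    return np.maximum(a @ x @ weight, 0.0)  # ReLU activation

# Example: 19 EEG channels, 8 input features, 16 output features.
rng = np.random.default_rng(0)
x = rng.normal(size=(19, 8))
adj_logits = rng.normal(size=(19, 19))
w = rng.normal(size=(8, 16)) * 0.1
print(graph_conv(x, adj_logits, w).shape)  # (19, 16)
```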

Hybrid neural networks for continual learning inspired by corticohippocampal circuits.

Nature communications
Current artificial systems suffer from catastrophic forgetting during continual learning, a limitation absent in biological systems. Biological mechanisms leverage the dual representation of specific and generalized memories within corticohippocampal...

Temporal Pavlovian conditioning of a model spiking neural network for discrimination of sequences of short time intervals.

Journal of computational neuroscience
The brain's ability to learn and distinguish rapid sequences of events is essential for timing-dependent tasks, such as those in sports and music. However, the mechanisms underlying this ability remain an active area of research. Here, we present a P...

The calcitron: A simple neuron model that implements many learning rules via the calcium control hypothesis.

PLoS computational biology
Theoretical neuroscientists and machine learning researchers have proposed a variety of learning rules to enable artificial neural networks to effectively perform both supervised and unsupervised learning tasks. It is not always clear, however, how t...
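As a rough sketch of the calcium control hypothesis the title alludes to (the sign of plasticity is set by where a calcium-like variable falls relative to depression and potentiation thresholds), the toy rule below may help; the thresholds and update magnitudes are placeholder values, not the calcitron's actual parameters.

```python
def calcium_plasticity_step(w, ca, theta_d=0.5, theta_p=1.0,
                            eta_d=0.01, eta_p=0.02, w_min=0.0, w_max=1.0):
    """Toy calcium-threshold plasticity rule for a single synapse.

    ca      : instantaneous calcium-like signal at the synapse
    theta_d : depression threshold
    theta_p : potentiation threshold (theta_p > theta_d)
    Below theta_d nothing happens; between the thresholds the weight is
    depressed; above theta_p it is potentiated.
    """
    if ca >= theta_p:
        w += eta_p          # high calcium -> long-term potentiation
    elif ca >= theta_d:
        w -= eta_d          # moderate calcium -> long-term depression
    return min(max(w, w_min), w_max)

# Example: sweep calcium levels and watch the weight move.
w = 0.5
for ca in [0.2, 0.7, 0.7, 1.2, 1.2, 1.2]:
    w = calcium_plasticity_step(w, ca)
    print(f"ca={ca:.1f} -> w={w:.3f}")
```

Different supervised and unsupervised learning rules can then be recovered by choosing which signals (input, feedback, error) contribute to the calcium variable, which is the general idea the title points at.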

Recurrent neural networks with transient trajectory explain working memory encoding mechanisms.

Communications biology
Whether working memory (WM) is encoded by persistent activity using attractors or by dynamic activity using transient trajectories has been debated for decades in both experimental and modeling studies, and a consensus has not been reached. Even thou...

Low-power artificial neuron networks with enhanced synaptic functionality using dual transistor and dual memristor.

PloS one
Artificial neurons with bio-inspired firing patterns have the potential to significantly improve the performance of neural network computing. A major drawback of artificial neuron circuits, however, is their large energy consumption. Rece...