AIMC Topic: Models, Neurological

Showing 1011 to 1020 of 1163 articles

Replay as a Basis for Backpropagation Through Time in the Brain.

Neural computation
How episodic memories are formed in the brain is a continuing puzzle for the neuroscience community. The brain areas that are critical for episodic learning (e.g., the hippocampus) are characterized by recurrent connectivity and generate frequent off...
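The abstract is truncated above and the paper's replay-based mechanism is not reproduced here. As a reference point only, below is a minimal sketch of standard backpropagation through time (the algorithm replay is argued to approximate) on a toy recurrent network; the network sizes, the one-step-echo task, and the learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)

# tiny tanh RNN trained with plain BPTT on a toy one-step memory task
n_in, n_hid, n_out, T = 3, 16, 3, 20
Wx = rng.normal(0, 0.3, (n_hid, n_in))
Wh = rng.normal(0, 0.3, (n_hid, n_hid))
Wy = rng.normal(0, 0.3, (n_out, n_hid))

xs = np.eye(n_in)[rng.integers(0, n_in, T)]   # random one-hot inputs
targets = np.roll(xs, 1, axis=0)              # toy task: echo the previous input

lr = 0.1
for epoch in range(300):
    # forward pass, storing hidden states for the backward sweep
    hs, ys = [np.zeros(n_hid)], []
    for t in range(T):
        hs.append(np.tanh(Wx @ xs[t] + Wh @ hs[-1]))
        ys.append(Wy @ hs[-1])

    # backward pass through time
    dWx, dWh, dWy = np.zeros_like(Wx), np.zeros_like(Wh), np.zeros_like(Wy)
    dh_next = np.zeros(n_hid)
    for t in reversed(range(T)):
        dy = ys[t] - targets[t]
        dWy += np.outer(dy, hs[t + 1])
        dh = Wy.T @ dy + dh_next
        dz = dh * (1.0 - hs[t + 1] ** 2)      # tanh derivative
        dWx += np.outer(dz, xs[t])
        dWh += np.outer(dz, hs[t])
        dh_next = Wh.T @ dz

    for W, dW in ((Wx, dWx), (Wh, dWh), (Wy, dWy)):
        W -= lr * dW / T

loss = 0.5 * np.mean([(ys[t] - targets[t]) ** 2 for t in range(T)])
print(f"final loss: {loss:.4f}")
```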

Role of short-term plasticity and slow temporal dynamics in enhancing time series prediction with a brain-inspired recurrent neural network.

Chaos (Woodbury, N.Y.)
Typical reservoir networks are based on random connectivity patterns that differ from brain circuits in two important ways. First, traditional reservoir networks lack synaptic plasticity among recurrent units, whereas cortical networks exhibit plasti...
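The paper's brain-inspired architecture is not reproduced here. As a minimal sketch of the reservoir-computing setup the abstract contrasts against, below is an echo state network for one-step-ahead prediction, with a crude Tsodyks-Markram-style short-term depression variable added to the recurrent units; the reservoir size, time constants, and toy signal are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 200, 2000
t = np.arange(T)
u = np.sin(0.2 * t) * np.cos(0.03 * t)     # toy input signal
target = np.roll(u, -1)                    # one-step-ahead prediction target

Win = rng.uniform(-0.5, 0.5, size=N)
W = rng.normal(0, 1, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9

# short-term depression resource variable (Tsodyks-Markram-style, assumed values)
x_res = np.ones(N)
U, tau_rec = 0.2, 30.0

states = np.zeros((T, N))
h = np.zeros(N)
for k in range(T):
    drive = W @ (h * x_res) + Win * u[k]
    h = np.tanh(drive)
    # resources depleted by activity, recovering with time constant tau_rec
    x_res += (1 - x_res) / tau_rec - U * x_res * np.abs(h)
    x_res = np.clip(x_res, 0.0, 1.0)
    states[k] = h

# ridge-regression readout trained on an early segment, tested on the rest
washout, split, lam = 100, 1500, 1e-4
X, y = states[washout:split], target[washout:split]
Wout = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)

pred = states[split:-1] @ Wout
print("test MSE:", np.mean((pred - target[split:-1]) ** 2))
```

Setting U to zero reduces the loop to a plain echo state network with static recurrent weights.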

Combining meta reinforcement learning with neural plasticity mechanisms for improved AI performance.

PloS one
This research explores the potential of combining Meta Reinforcement Learning (MRL) with Spike-Timing-Dependent Plasticity (STDP) to enhance the performance and adaptability of AI agents in Atari game settings. Our methodology leverages MRL to swiftl...
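The meta reinforcement learning wrapper and the Atari environment are omitted here; as an illustration of the STDP component only, below is a textbook pair-based STDP update with exponential traces. The spike trains, trace time constants, and learning rates are assumptions, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pre, n_post = 20, 5
W = rng.uniform(0, 0.5, size=(n_post, n_pre))

# pair-based STDP with exponential eligibility traces (standard textbook form)
tau_plus, tau_minus = 20.0, 20.0   # ms
A_plus, A_minus = 0.01, 0.012
x_pre, x_post = np.zeros(n_pre), np.zeros(n_post)
dt = 1.0                           # ms

for step in range(1000):
    pre_spikes = rng.random(n_pre) < 0.02     # Poisson-like input spikes
    post_spikes = rng.random(n_post) < 0.02   # stand-in for the network's output spikes

    # decay traces
    x_pre *= np.exp(-dt / tau_plus)
    x_post *= np.exp(-dt / tau_minus)

    # potentiate on post spikes (using the pre trace), depress on pre spikes (using the post trace)
    W[post_spikes, :] += A_plus * x_pre
    W[:, pre_spikes] -= A_minus * x_post[:, None]
    np.clip(W, 0.0, 1.0, out=W)

    # add the current spikes to their traces
    x_pre[pre_spikes] += 1.0
    x_post[post_spikes] += 1.0

print("mean weight after STDP:", W.mean())
```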

Intelligent Control to Suppress Epileptic Seizures in the Amygdala: In Silico Investigation Using a Network of Izhikevich Neurons.

IEEE transactions on neural systems and rehabilitation engineering: a publication of the IEEE Engineering in Medicine and Biology Society
Closed-loop electrical stimulation of brain structures is one of the most promising techniques to suppress epileptic seizures in drug-resistant refractory patients who are also ineligible for ablative neurosurgery. In this work, an intelligent controll...
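The paper's intelligent controller and its amygdala network are not reproduced here. The sketch below only illustrates the simulation substrate: a small network of Izhikevich neurons under a placeholder on-off controller that injects suppressive current when the population firing rate is high. The coupling matrix, detection threshold, and controller gain are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100
# regular-spiking Izhikevich parameters (Izhikevich, 2003)
a, b, c, d = 0.02, 0.2, -65.0, 8.0
v = np.full(N, -65.0)
u = b * v
S = 0.5 * rng.random((N, N))     # toy excitatory random coupling, not the paper's network

dt = 0.5                         # ms
rate_window = []

for step in range(int(2000 / dt)):
    I = 5.0 * rng.standard_normal(N)        # background drive
    fired = v >= 30.0
    I += S[:, fired].sum(axis=1)            # synaptic input from last step's spikes
    v[fired], u[fired] = c, u[fired] + d    # spike reset

    # placeholder on-off controller: suppress when the population rate is high
    rate_window.append(fired.mean())
    if len(rate_window) > 40:
        rate_window.pop(0)
    I += -10.0 if np.mean(rate_window) > 0.05 else 0.0

    # two half-steps for the membrane equation, as in Izhikevich's reference code
    for _ in range(2):
        v += 0.5 * dt * (0.04 * v**2 + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)

print("final mean membrane potential:", v.mean())
```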

Characterization of Machine Learning-Based Surrogate Models of Neural Activation Under Electrical Stimulation.

Bioelectromagnetics
Electrical stimulation of peripheral nerves via implanted electrodes has been shown to be a promising approach to restore sensation, movement, and autonomic functions across a wide range of illnesses and injuries. While in principle computational mod...
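The specific surrogate models characterized in the paper are not shown here. As a generic illustration of the idea (replace an expensive biophysical activation model with a cheap learned predictor), the sketch below fits a logistic-regression surrogate to synthetic fiber-activation data generated from an assumed threshold rule.

```python
import numpy as np

rng = np.random.default_rng(3)

# synthetic "ground truth": a fiber is activated when stimulus amplitude exceeds a
# distance- and diameter-dependent threshold (a crude stand-in for the biophysical
# axon model that a surrogate would normally replace)
n = 5000
amp = rng.uniform(0, 5, n)        # stimulation amplitude, mA
dist = rng.uniform(0.5, 5, n)     # electrode-to-fiber distance, mm
diam = rng.uniform(5, 15, n)      # fiber diameter, um
threshold = 0.4 * dist**2 / np.sqrt(diam)
activated = (amp > threshold).astype(float)

X = np.column_stack([amp, dist, diam, np.ones(n)])

# logistic-regression surrogate trained by batch gradient descent
w = np.zeros(X.shape[1])
lr = 0.05
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= lr * X.T @ (p - activated) / n

p = 1.0 / (1.0 + np.exp(-X @ w))
print("surrogate training accuracy:", np.mean((p > 0.5) == activated))
```

A richer surrogate (e.g., a small neural network) would capture the nonlinear threshold boundary better; the linear model is kept only for brevity.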

Toward biologically realistic models of the motor system.

Neuron
In this issue of Neuron, Chiappa et al. describe how neural networks can be trained to perform complex hand motor skills. A key to their approach is curriculum learning, which breaks learning into stages and leads to good control.
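As a schematic of curriculum learning in general, not of Chiappa et al.'s training setup, the sketch below advances an agent to a harder stage once performance on the current stage crosses a threshold; run_curriculum, train_one_episode, and the stage definitions are hypothetical.

```python
import random

def run_curriculum(stages, train_one_episode, advance_at=0.8, max_episodes=1000):
    """Advance to the next, harder stage once the current stage is mostly solved."""
    stage = 0
    for _ in range(max_episodes):
        score = train_one_episode(stages[stage])   # success rate in [0, 1]
        if score >= advance_at and stage < len(stages) - 1:
            stage += 1
    return stage

# toy stand-in for an RL training step: harder stages succeed less often at first;
# the mutable default holds the agent's slowly improving "skill" between calls
def train_one_episode(difficulty, _skill=[0.0]):
    _skill[0] += 0.005
    return max(0.0, min(1.0, _skill[0] - 0.2 * difficulty + random.uniform(-0.1, 0.1)))

print("final stage reached:", run_curriculum([0, 1, 2, 3], train_one_episode))
```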

Sparse-Coding Variational Autoencoders.

Neural computation
The sparse coding model posits that the visual system has evolved to efficiently code natural stimuli using a sparse set of features from an overcomplete dictionary. The original sparse coding model suffered from two key limitations, however: (1) com...
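For context on the inference cost the abstract refers to, the sketch below runs classic sparse coding inference with ISTA on an assumed random overcomplete dictionary; the paper's amortized variational encoder is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(4)

# classic sparse coding inference: minimize 0.5 * ||x - D s||^2 + lam * ||s||_1 over s via ISTA
def ista(x, D, lam=0.1, n_iter=200):
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the quadratic term
    s = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ s - x)
        s = s - grad / L
        s = np.sign(s) * np.maximum(np.abs(s) - lam / L, 0.0)   # soft threshold
    return s

n_pix, n_dict = 64, 128                    # overcomplete dictionary (more atoms than pixels)
D = rng.standard_normal((n_pix, n_dict))
D /= np.linalg.norm(D, axis=0)
# synthesize a patch from a sparse ground-truth code
x = D @ (rng.standard_normal(n_dict) * (rng.random(n_dict) < 0.05))

s_hat = ista(x, D)
print("nonzero coefficients:", int(np.sum(np.abs(s_hat) > 1e-6)))
```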

Fault-tolerant neural networks from biological error correction codes.

Physical review. E
It has been an open question in deep learning if fault-tolerant computation is possible: can arbitrarily reliable computation be achieved using only unreliable neurons? In the grid cells of the mammalian cortex, analog error correction codes have bee...
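The paper's construction is not reproduced here. As a loose illustration of the underlying idea (grid-cell-like modular codes carry enough redundancy that noisy analog phases can be decoded back to a consistent value), the sketch below encodes a scalar position with three assumed module periods and decodes by brute-force search.

```python
import numpy as np

rng = np.random.default_rng(5)

# grid-cell-like modular code: a scalar position is represented by its phase within
# several modules of different spatial periods
periods = np.array([31.0, 43.0, 59.0])

def encode(x):
    return (x % periods) / periods                  # phase in [0, 1) per module

def decode(phases, x_max=2000.0, step=0.25):
    candidates = np.arange(0.0, x_max, step)
    cand_phases = (candidates[:, None] % periods) / periods
    d = np.abs(cand_phases - phases)
    d = np.minimum(d, 1.0 - d).sum(axis=1)          # summed circular phase distance
    return candidates[np.argmin(d)]

x_true = 1234.0
noisy = (encode(x_true) + 0.03 * rng.standard_normal(len(periods))) % 1.0   # analog noise
print("decoded position:", decode(noisy))            # close to 1234 despite phase noise
```

The brute-force decoder stands in for whatever readout network performs the correction; only the redundancy of the code is being illustrated.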

Trainable Reference Spikes Improve Temporal Information Processing of SNNs With Supervised Learning.

Neural computation
Spiking neural networks (SNNs) are next-generation neural networks composed of biologically plausible neurons that communicate through trains of spikes. By modifying the plastic parameters of SNNs, including weights and time delays, SNNs can be t...
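The trainable reference spikes and the supervised learning rule are not shown here; the sketch below only illustrates the plastic parameters the abstract names (weights and per-synapse time delays) in a minimal leaky integrate-and-fire layer with assumed sizes and constants.

```python
import numpy as np

rng = np.random.default_rng(6)

n_in, n_out, T = 30, 5, 100
dt, tau, v_th = 1.0, 10.0, 1.0

W = rng.normal(0.3, 0.2, size=(n_out, n_in))       # synaptic weights
delays = rng.integers(0, 5, size=(n_out, n_in))    # per-synapse delays, in time steps

in_spikes = rng.random((T, n_in)) < 0.05            # Poisson-like input spike trains
v = np.zeros(n_out)
out_spikes = np.zeros((T, n_out), dtype=bool)

for t in range(T):
    # gather delayed presynaptic spikes for each connection
    t_src = t - delays
    valid = t_src >= 0
    delayed = np.zeros_like(W, dtype=bool)
    delayed[valid] = in_spikes[t_src[valid], np.where(valid)[1]]
    I = (W * delayed).sum(axis=1)

    v += dt / tau * (-v) + I                         # leaky integration
    fired = v >= v_th
    out_spikes[t] = fired
    v[fired] = 0.0                                   # reset after spike

print("output spike counts:", out_spikes.sum(axis=0))
```

Training would typically adjust both W and delays with surrogate gradients; only the forward pass is sketched.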

Inference on the Macroscopic Dynamics of Spiking Neurons.

Neural computation
The process of inference on networks of spiking neurons is essential to decipher the underlying mechanisms of brain computation and function. In this study, we conduct inference on parameters and dynamics of a mean-field approximation, simplifying th...
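The paper's specific mean-field equations and inference procedure are not reproduced here. The sketch below illustrates the general workflow on an assumed one-dimensional population-rate model: simulate, add observation noise, then recover a coupling parameter by grid search over squared error.

```python
import numpy as np

rng = np.random.default_rng(7)

# toy mean-field (population-rate) model: dr/dt = (-r + tanh(w*r + I)) / tau
def simulate(w, I=1.0, tau=10.0, T=500, dt=0.1, r0=0.1):
    r = np.empty(T)
    r[0] = r0
    for t in range(1, T):
        drive = w * r[t - 1] + I
        r[t] = r[t - 1] + dt / tau * (-r[t - 1] + np.tanh(max(drive, 0.0)))
    return r

# "observed" population rate: simulated with a known coupling plus observation noise
w_true = 0.8
observed = simulate(w_true) + 0.01 * rng.standard_normal(500)

# grid-search inference over the coupling parameter
w_grid = np.linspace(0.0, 2.0, 201)
errors = [np.sum((simulate(w) - observed) ** 2) for w in w_grid]
w_hat = w_grid[int(np.argmin(errors))]
print(f"true w = {w_true}, recovered w = {w_hat:.2f}")
```

A real inference pipeline would use Bayesian or gradient-based estimation over all parameters; the grid search is only meant to show that the mean-field dynamics constrain the parameter.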