AIMC Topic: Models, Neurological

Showing 31 to 40 of 1142 articles

Interpretable deep learning for deconvolutional analysis of neural signals.

Neuron
The widespread adoption of deep learning to model neural activity often relies on "black-box" approaches that lack an interpretable connection between neural activity and network parameters. Here, we propose using algorithm unrolling, a method for in...
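Algorithm unrolling maps the iterations of a classical optimization algorithm onto the layers of a network, making each layer's parameters interpretable as algorithm quantities. A minimal sketch unrolling ISTA for sparse coding — a standard unrolling target, not necessarily the model used in this article; the dictionary `D`, threshold `lam`, and layer count are illustrative:

```python
import numpy as np

def soft_threshold(z, lam):
    """Proximal operator of the L1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def unrolled_ista(D, x, n_layers=10, lam=0.1):
    """Each 'layer' is one ISTA iteration for min_z 0.5||x - Dz||^2 + lam||z||_1.

    In learned unrolling, the step size and threshold of each layer would
    become trainable parameters instead of fixed algorithm constants.
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    for _ in range(n_layers):
        z = soft_threshold(z + (D.T @ (x - D @ z)) / L, lam / L)
    return z
```

Because every layer mirrors one solver step, inspecting the learned per-layer thresholds and step sizes gives a direct algorithmic reading of what the trained network computes.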

FPGA implementation of a complete digital spiking silicon neuron for circuit design and network approach.

Scientific reports
When attempting to replicate the actions of biological spiking neurons in the human brain, the spiking neuron model methodology and the hardware realization design for the nervous system are crucial considerations. This work provides ...
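As a point of reference for spiking neuron models in general, a minimal leaky integrate-and-fire simulation — a generic textbook model, not the digital spiking silicon neuron implemented in this article; all parameter values are illustrative:

```python
import numpy as np

def lif_simulate(input_current, dt=1e-3, tau=20e-3,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: dv/dt = (-(v - v_rest) + I) / tau,
    with an instantaneous reset to v_reset when v crosses v_thresh."""
    v = v_rest
    trace, spikes = [], []
    for i_t in input_current:
        v += dt * (-(v - v_rest) + i_t) / tau   # leaky integration step
        if v >= v_thresh:                        # threshold crossing -> spike
            spikes.append(True)
            v = v_reset                          # membrane reset
        else:
            spikes.append(False)
        trace.append(v)
    return np.array(trace), np.array(spikes)
```

Digital hardware realizations replace this floating-point update with fixed-point or piecewise-linear arithmetic so the dynamics fit efficiently on an FPGA.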

A unified acoustic-to-speech-to-language embedding space captures the neural basis of natural language processing in everyday conversations.

Nature human behaviour
This study introduces a unified computational framework connecting acoustic, speech and word-level linguistic structures to study the neural basis of everyday conversations in the human brain. We used electrocorticography to record neural signals acr...

Structure of activity in multiregion recurrent neural networks.

Proceedings of the National Academy of Sciences of the United States of America
Neural circuits comprise multiple interconnected regions, each with complex dynamics. The interplay between local and global activity is thought to underlie computational flexibility, yet the structure of multiregion neural activity and its origins i...

A general framework for interpretable neural learning based on local information-theoretic goal functions.

Proceedings of the National Academy of Sciences of the United States of America
Despite the impressive performance of biological and artificial networks, an intuitive understanding of how their local learning dynamics contribute to network-level task solutions remains a challenge to date. Efforts to bring learning to a more...

Temporal Contrastive Learning through implicit non-equilibrium memory.

Nature communications
The backpropagation method has enabled transformative uses of neural networks. Alternatively, for energy-based models, local learning methods involving only nearby neurons offer benefits in terms of decentralized training, and allow for the possibili...
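For contrast with backpropagation, a minimal illustration of a local, two-phase contrastive rule in the spirit of contrastive Hebbian learning — a generic sketch, not the non-equilibrium memory mechanism of this article; the single-layer network, learning rate `eta`, and phase definitions are assumptions:

```python
import numpy as np

def contrastive_hebbian_update(W, x, y_target, eta=0.1):
    """Two-phase local update for a single layer y = tanh(W x).

    Free phase: the model produces its own output.
    Clamped phase: the output is fixed to the target.
    The weight change is the difference of two Hebbian outer products,
    computable from locally available pre- and post-synaptic activity
    (no backpropagated error signal).
    """
    y_free = np.tanh(W @ x)       # free phase response
    y_clamped = y_target          # clamped phase response
    return W + eta * (np.outer(y_clamped, x) - np.outer(y_free, x))
```

For this single layer the rule reduces to a delta-rule-like update, but because each term only uses activity at the synapse's own endpoints, it supports the decentralized training the snippet alludes to.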

Minimal Neural Network Conditions for Encoding Future Interactions.

International journal of neural systems
Space and time are fundamental attributes of the external world. Deciphering the brain mechanisms involved in processing the surrounding environment is one of the main challenges in neuroscience. This is particularly challenging when situations change ra...

Adaptive Synaptic Scaling in Spiking Networks for Continual Learning and Enhanced Robustness.

IEEE transactions on neural networks and learning systems
Synaptic plasticity plays a critical role in the expressive power of brain neural networks. Among diverse plasticity rules, synaptic scaling is indispensable for homeostasis maintenance and synaptic strength regulation. In the current mo...
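The core idea of homeostatic synaptic scaling can be sketched in a few lines — a generic toy model, not the adaptive rule proposed in this article; the rate model `rate = k * sum(w)` and all constants are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.uniform(0.5, 1.5, size=50)   # incoming synaptic weights of one neuron
w0 = w.copy()                        # keep originals to check relative strengths
k = 0.2                              # toy gain: firing rate = k * sum(w)
target = 5.0                         # homeostatic set point for the firing rate
eta = 0.05                           # scaling rate

for _ in range(200):
    rate = k * w.sum()
    # Multiplicative scaling: every weight shares one common factor, so the
    # neuron's rate drifts toward the set point while the *relative* strengths
    # learned by Hebbian plasticity are preserved.
    w *= 1.0 + eta * (target - rate) / target
```

The multiplicative form is what distinguishes scaling from other homeostatic rules: it stabilizes overall activity without erasing the weight ratios that encode learned information.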

Exploring temporal information dynamics in Spiking Neural Networks: Fast Temporal Efficient Training.

Journal of neuroscience methods
BACKGROUND: Spiking Neural Networks (SNNs) hold significant potential in brain simulation and temporal data processing. While recent research has focused on developing neuron models and leveraging temporal dynamics to enhance performance, there is a ...

An accurate and fast learning approach in the biologically spiking neural network.

Scientific reports
Computations adapted from the interactions of neurons in the nervous system have the potential to be a strong foundation for building computers with cognitive functions including decision-making, generalization, and real-time learning. In this contex...