AI Medical Compendium Journal: Neural computation

Hardware-amenable structural learning for spike-based pattern classification using a simple model of active dendrites.

This letter presents a spike-based model that employs neurons with functionally distinct dendritic compartments for classifying high-dimensional binary patterns. The synaptic inputs arriving on each dendritic subunit are nonlinearly processed before ...
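
A minimal numpy sketch of the general idea (a hypothetical neuron with four dendritic subunits, a tanh subunit nonlinearity, and random sparse weights; not the letter's model or its hardware-amenable learning rule):

    import numpy as np

    rng = np.random.default_rng(0)

    def dendritic_neuron(pattern, weights, theta=1.0):
        """Sum each dendritic subunit's input, pass it through a
        saturating nonlinearity, then threshold the somatic sum."""
        subunit_drive = weights @ pattern            # one weighted sum per subunit
        subunit_out = np.tanh(subunit_drive)         # nonlinear dendritic processing
        return float(subunit_out.sum() >= theta)     # somatic spike / no-spike decision

    n_subunits, n_inputs = 4, 100
    weights = rng.choice([0.0, 0.1], size=(n_subunits, n_inputs))  # sparse synapses
    pattern = rng.integers(0, 2, size=n_inputs)                    # high-dimensional binary input
    print(dendritic_neuron(pattern, weights))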

Spontaneous motion on two-dimensional continuous attractors.

Attractor models are simplified models used to describe the dynamics of firing rate profiles of a pool of neurons. The firing rate profile, or the neuronal activity, is thought to carry information. Continuous attractor neural networks (CANNs) descri...

Neural Code Translation With LIF Neuron Microcircuits.

Spiking neural networks (SNNs) provide an energy-efficient alternative to traditional artificial neural networks, leveraging diverse neural encoding schemes such as rate, time-to-first-spike (TTFS), and population-based binary codes. Each encoding me...
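
For illustration, two of the encodings named here can be sketched in a few lines (toy mappings, not the paper's LIF microcircuits): the same stimulus intensity expressed as a Poisson spike count (rate code) or as a single latency (time-to-first-spike):

    import numpy as np

    rng = np.random.default_rng(1)

    def rate_code(x, t_window=0.1, max_rate=200.0):
        """Rate code: draw a Poisson spike count for intensity x in [0, 1]."""
        return rng.poisson(x * max_rate * t_window)

    def ttfs_code(x, t_max=0.1):
        """Time-to-first-spike: stronger inputs fire earlier (linear toy mapping)."""
        return t_max * (1.0 - x) if x > 0 else None  # silent if no drive

    for x in (0.2, 0.9):
        print(f"x={x}: rate count={rate_code(x)}, first spike at {ttfs_code(x):.3f} s")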

Dynamics and Bifurcation Structure of a Mean-Field Model of Adaptive Exponential Integrate-and-Fire Networks.

The study of brain activity spans diverse scales and levels of description and requires the development of computational models alongside experimental investigations to explore integrations across scales. The high dimensionality of spiking networks p...
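
For context, the single-neuron dynamics underlying such networks are the standard adaptive exponential integrate-and-fire (AdEx) equations; a minimal Euler-integration sketch with illustrative parameter values (not the paper's mean-field reduction):

    import numpy as np

    # Standard AdEx parameters (illustrative values)
    C, gL, EL = 200.0, 10.0, -70.0          # pF, nS, mV
    VT, DeltaT, Vr, Vcut = -50.0, 2.0, -58.0, 0.0
    a, b, tau_w = 2.0, 60.0, 300.0          # nS, pA, ms
    dt, I = 0.1, 500.0                      # ms, pA

    V, w, spikes = EL, 0.0, []
    for step in range(int(1000 / dt)):      # 1 s of simulated time
        dV = (-gL * (V - EL) + gL * DeltaT * np.exp((V - VT) / DeltaT) - w + I) / C
        dw = (a * (V - EL) - w) / tau_w
        V, w = V + dt * dV, w + dt * dw
        if V >= Vcut:                        # spike: reset and increment adaptation
            V, w = Vr, w + b
            spikes.append(step * dt)
    print(f"{len(spikes)} spikes in 1 s")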

Dynamics of Continuous Attractor Neural Networks With Spike Frequency Adaptation.

Attractor neural networks consider that neural information is stored as stationary states of a dynamical system formed by a large number of interconnected neurons. The attractor property empowers a neural system to encode information robustly, but it...
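
A minimal 1-D sketch of this class of model (illustrative parameters, not the paper's analysis): a ring CANN with divisive inhibition plus a slow adaptation current; with sufficiently strong adaptation the activity bump can destabilize and travel instead of staying put:

    import numpy as np

    N, tau, tau_v = 128, 1.0, 50.0            # neurons, rate and adaptation time constants
    a, k, m, dt = 0.5, 0.1, 0.3, 0.1          # kernel width, inhibition, adaptation strength, step
    x = np.linspace(-np.pi, np.pi, N, endpoint=False)
    dx = x[1] - x[0]

    # Translation-invariant Gaussian recurrent kernel on a ring
    d = (x[:, None] - x[None, :] + np.pi) % (2 * np.pi) - np.pi
    J = np.exp(-0.5 * d ** 2 / a ** 2) / (np.sqrt(2 * np.pi) * a)

    u = 3.0 * np.exp(-x ** 2)                  # initial activity bump centred at x = 0
    v = 0.05 * np.roll(u, 5)                   # small asymmetric kick for the adaptation current
    for _ in range(20000):
        r = np.maximum(u, 0.0) ** 2
        r /= 1.0 + k * dx * r.sum()            # divisive global inhibition
        du = (-u + dx * (J @ r) - v) / tau
        dv = (-v + m * u) / tau_v              # slow spike-frequency adaptation
        u, v = u + dt * du, v + dt * dv
    print(f"activity peak now at x = {x[np.argmax(u)]:.2f}, amplitude {u.max():.2f}")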

Distributed Synaptic Connection Strength Changes Dynamics in a Population Firing Rate Model in Response to Continuous External Stimuli.

Neural network complexity allows for diverse neuronal population dynamics and realizes higher-order brain functions such as cognition and memory. Complexity is enhanced through chemical synapses with exponentially decaying conductance and greater vari...
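
A minimal sketch of the ingredient named here, an exponentially decaying synaptic conductance driving a population firing-rate variable under a continuous external stimulus (hypothetical parameters, not the paper's full model):

    import numpy as np

    tau_r, tau_g, dt = 10.0, 5.0, 0.1          # rate and conductance time constants (ms), step
    w = 0.8                                     # recurrent synaptic strength (< 1, so stable)

    r, g, trace = 0.0, 0.0, []
    for step in range(int(500 / dt)):
        stim = 1.0 if 100 <= step * dt < 300 else 0.0   # continuous external stimulus pulse
        dg = (-g + w * r + stim) / tau_g                # synaptic drive decays exponentially
        dr = (-r + max(g, 0.0)) / tau_r                 # population rate follows the filtered drive
        g, r = g + dt * dg, r + dt * dr
        trace.append(r)
    print(f"peak rate {max(trace):.2f}, rate 200 ms after stimulus offset {trace[-1]:.2f}")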

Elucidating the Theoretical Underpinnings of Surrogate Gradient Learning in Spiking Neural Networks.

Training spiking neural networks to approximate universal functions is essential for studying information processing in the brain and for neuromorphic computing. Yet the binary nature of spikes poses a challenge for direct gradient-based training. Su...
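
A minimal sketch of the surrogate-gradient idea itself (not the letter's theoretical analysis): the forward pass keeps the non-differentiable hard threshold, while the backward pass substitutes a smooth fast-sigmoid-shaped derivative:

    def spike(v, theta=1.0):
        """Forward pass: hard threshold, non-differentiable."""
        return float(v >= theta)

    def surrogate_grad(v, theta=1.0, beta=10.0):
        """Backward pass: smooth fast-sigmoid stand-in for the Heaviside derivative."""
        return 1.0 / (1.0 + beta * abs(v - theta)) ** 2

    # Toy problem: learn a weight so the neuron spikes for input x (single step, no leak).
    x, target, w, lr = 0.5, 1.0, 0.1, 1.0
    for _ in range(300):
        v = w * x                              # membrane potential
        err = spike(v) - target                # squared-error loss derivative w.r.t. the spike
        w -= lr * err * surrogate_grad(v) * x  # chain rule, surrogate replacing dH/dv
    print(f"learned weight {w:.2f}, output spike {spike(w * x):.0f}")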

The Leaky Integrate-and-Fire Neuron Is a Change-Point Detector for Compound Poisson Processes.

Animal nervous systems can detect changes in their environments within hundredths of a second. They do so by discerning abrupt shifts in sensory neural activity. Many neuroscience studies have employed change-point detection (CPD) algorithms to estim...
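
A minimal sketch of the setting described (illustrative parameters): a leaky integrate-and-fire neuron driven by a Poisson input whose rate jumps mid-trial, so that output spikes appear just after the change point:

    import numpy as np

    rng = np.random.default_rng(2)
    dt, T = 1.0, 2000.0                        # ms
    tau, v_th, w_in = 20.0, 1.0, 0.08          # membrane time constant, threshold, input weight

    t = np.arange(0.0, T, dt)
    rate = np.where(t < 1000.0, 0.1, 1.0)      # aggregate input rate (spikes/ms) jumps at t = 1000 ms
    counts = rng.poisson(rate * dt)            # compound Poisson input: spike count per time bin

    v, out_spikes = 0.0, []
    for ti, c in zip(t, counts):
        v += dt * (-v / tau) + w_in * c        # leaky integration of the input drive
        if v >= v_th:                          # threshold crossing reports the rate change
            out_spikes.append(ti)
            v = 0.0
    print("first output spike at t =", out_spikes[0] if out_spikes else None, "ms")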

Spiking Neuron-Astrocyte Networks for Image Recognition.

From biological and artificial network perspectives, researchers have started acknowledging astrocytes as computational units mediating neural processes. Here, we propose a novel biologically inspired neuron-astrocyte network model for image recognit...

Enhanced EEG Forecasting: A Probabilistic Deep Learning Approach.

Forecasting electroencephalography (EEG) signals, that is, estimating future values of the time series based on the past ones, is essential in many real-time EEG-based applications, such as brain-computer interfaces and closed-loop brain stimulation....
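
To make the forecasting task concrete, here is a toy probabilistic baseline (an autoregressive least-squares fit with a Gaussian predictive interval; not the paper's deep learning approach):

    import numpy as np

    rng = np.random.default_rng(3)
    # Toy "EEG-like" signal: a noisy 10 Hz oscillation sampled at 250 Hz
    fs = 250
    t = np.arange(0, 4, 1 / fs)
    signal = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)

    order, horizon = 20, 1
    # Build a lagged design matrix and fit an AR(order) model by least squares
    X = np.stack([signal[i:i + order] for i in range(signal.size - order - horizon + 1)])
    y = signal[order + horizon - 1:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    pred = X @ coef
    sigma = np.std(y - pred)                 # residual spread = predictive uncertainty
    next_pred = signal[-order:] @ coef       # one-step-ahead forecast from the latest window
    print(f"forecast {next_pred:.2f} +/- {1.96 * sigma:.2f} (95% Gaussian interval)")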