AI Medical Compendium Journal:
Neural computation


Effect of Depth and Width on Local Minima in Deep Learning.

Neural computation
In this paper, we analyze the effects of depth and width on the quality of local minima, without strong overparameterization and simplification assumptions in the literature. Without any simplification assumption, for deep nonlinear neural networks w...

Effective Dimensionality Reduction for Visualizing Neural Dynamics by Laplacian Eigenmaps.

Neural computation
With the development of neural recording technology, it has become possible to collect activities from hundreds or even thousands of neurons simultaneously. Visualization of neural population dynamics can help neuroscientists analyze large-scale neur...
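
As a rough illustration of the technique named in the title, here is a minimal Laplacian-eigenmaps sketch that embeds a simulated population-activity matrix into two dimensions. The use of scikit-learn's SpectralEmbedding, the simulated data, and all parameters are assumptions for illustration, not the pipeline from the paper.

```python
# Minimal sketch: Laplacian-eigenmap embedding of simulated population activity.
# The data shape (time bins x neurons) and all parameters are illustrative only.
import numpy as np
from sklearn.manifold import SpectralEmbedding  # Laplacian eigenmaps

rng = np.random.default_rng(0)

# Simulate 200 neurons whose firing is driven by a slow 2-D latent trajectory.
T, n_neurons = 1000, 200
t = np.linspace(0, 4 * np.pi, T)
latent = np.stack([np.sin(t), np.cos(2 * t)], axis=1)            # (T, 2)
mixing = rng.normal(size=(2, n_neurons))                          # latent -> neurons
rates = latent @ mixing + 0.3 * rng.normal(size=(T, n_neurons))   # noisy activity

# Embed each time bin into 2-D with Laplacian eigenmaps on a k-NN graph.
embedding = SpectralEmbedding(n_components=2, n_neighbors=30)
low_dim = embedding.fit_transform(rates)   # (T, 2) trajectory for visualization
print(low_dim.shape)
```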

A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures.

Neural computation
Recurrent neural networks (RNNs) have been widely adopted in research areas concerned with sequential data, such as text, audio, and video. However, RNNs consisting of sigma cells or tanh cells are unable to learn the relevant information of input da...
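
For reference, a minimal NumPy sketch of a single step of a standard LSTM cell, i.e. the gate equations such a review covers; the weight shapes and initialization are illustrative only.

```python
# Minimal sketch of one LSTM cell step (standard formulation), in NumPy.
# Weight shapes and initialization are illustrative only.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One time step of a vanilla LSTM cell.

    x: (n_in,) input; h_prev, c_prev: (n_hidden,) previous hidden/cell state.
    W: (4*n_hidden, n_in), U: (4*n_hidden, n_hidden), b: (4*n_hidden,).
    """
    z = W @ x + U @ h_prev + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)   # input, forget, output gates
    g = np.tanh(g)                                  # candidate cell update
    c = f * c_prev + i * g                          # new cell state
    h = o * np.tanh(c)                              # new hidden state
    return h, c

n_in, n_hidden = 8, 16
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4 * n_hidden, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hidden, n_hidden))
b = np.zeros(4 * n_hidden)
h = c = np.zeros(n_hidden)
for x in rng.normal(size=(5, n_in)):   # run a short sequence
    h, c = lstm_step(x, h, c, W, U, b)
```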

A Reservoir Computing Model of Reward-Modulated Motor Learning and Automaticity.

Neural computation
Reservoir computing is a biologically inspired class of learning algorithms in which the intrinsic dynamics of a recurrent neural network are mined to produce target time series. Most existing reservoir computing algorithms rely on fully supervised l...
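
To make the class of algorithms concrete, here is a minimal echo state network sketch with a fully supervised ridge-regression readout, i.e. the supervised baseline the abstract contrasts with. The reservoir size, spectral radius, and target signal are illustrative, and this is not the paper's reward-modulated rule.

```python
# Minimal echo state network sketch: a fixed random reservoir driven by an input,
# with a supervised ridge-regression readout trained to reproduce a target series.
# Illustrates the fully supervised baseline, not the reward-modulated rule.
import numpy as np

rng = np.random.default_rng(1)
N, T = 300, 2000                      # reservoir size, number of time steps

# Fixed random recurrent weights, rescaled to spectral radius ~0.9.
W = rng.normal(size=(N, N)) / np.sqrt(N)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-1, 1, size=N)

u = np.sin(np.linspace(0, 20 * np.pi, T))             # input drive
target = np.sin(np.linspace(0, 20 * np.pi, T) * 2)    # target time series

# Collect reservoir states.
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    X[t] = x

# Ridge-regression readout (discard a warm-up period).
warm, lam = 200, 1e-3
A, y = X[warm:], target[warm:]
w_out = np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ y)
print("train MSE:", np.mean((A @ w_out - y) ** 2))
```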

A Computational Perspective of the Role of the Thalamus in Cognition.

Neural computation
The thalamus has traditionally been considered as only a relay source of cortical inputs, with hierarchically organized cortical circuits serially transforming thalamic signals to cognitively relevant representations. Given the absence of local excit...

Decoding Movements from Cortical Ensemble Activity Using a Long Short-Term Memory Recurrent Network.

Neural computation
Although many real-time neural decoding algorithms have been proposed for brain-machine interface (BMI) applications over the years, an optimal, consensual approach remains elusive. Recent advances in deep learning algorithms provide new opportunitie...
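
As an illustration of this kind of decoder, here is a minimal sketch of an LSTM network mapping binned spike counts to 2-D cursor velocity. PyTorch, the array shapes, and the hyperparameters are assumptions for illustration, not the configuration reported in the paper.

```python
# Minimal sketch of an LSTM decoder: binned spike counts -> 2-D cursor velocity.
# Framework, shapes, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class LSTMDecoder(nn.Module):
    def __init__(self, n_neurons, n_hidden=64, n_out=2):
        super().__init__()
        self.lstm = nn.LSTM(n_neurons, n_hidden, batch_first=True)
        self.readout = nn.Linear(n_hidden, n_out)

    def forward(self, spikes):              # spikes: (batch, time, n_neurons)
        h, _ = self.lstm(spikes)
        return self.readout(h)              # (batch, time, 2) velocity estimate

# Synthetic stand-in data: 32 trials, 100 time bins, 96 recorded units.
spikes = torch.poisson(torch.rand(32, 100, 96) * 5.0)
velocity = torch.randn(32, 100, 2)

model = LSTMDecoder(n_neurons=96)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(10):                          # a few illustrative training steps
    opt.zero_grad()
    loss = loss_fn(model(spikes), velocity)
    loss.backward()
    opt.step()
```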

A Geometrical Analysis of Global Stability in Trained Feedback Networks.

Neural computation
Recurrent neural networks have been extensively studied in the context of neuroscience and machine learning due to their ability to implement complex computations. While substantial progress in designing effective learning algorithms has been achieve...

Quantifying Information Conveyed by Large Neuronal Populations.

Neural computation
Quantifying mutual information between inputs and outputs of a large neural circuit is an important open problem in both machine learning and neuroscience. However, evaluation of the mutual information is known to be generally intractable for large s...
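
To illustrate why direct evaluation scales badly, here is a standard plug-in (histogram) estimate of mutual information between a binary stimulus and a small binary population; the joint response alphabet grows as 2^N, which is the intractability the abstract refers to. This is the textbook plug-in estimator, not the method proposed in the paper.

```python
# Plug-in estimate of I(stimulus; population response) for a small binary
# population. The response alphabet grows as 2**n_neurons, which is why this
# direct approach becomes intractable for large populations.
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
n_neurons, n_trials = 8, 50000
stimuli = rng.integers(0, 2, size=n_trials)               # binary stimulus

# Each neuron fires with a probability that depends weakly on the stimulus.
p = 0.3 + 0.2 * stimuli[:, None]                           # (n_trials, 1)
responses = (rng.random((n_trials, n_neurons)) < p).astype(int)

def plug_in_mi(s, r):
    """Plug-in I(S;R) in bits from empirical joint frequencies."""
    joint = Counter(zip(s.tolist(), map(tuple, r.tolist())))
    ps = Counter(s.tolist())
    pr = Counter(map(tuple, r.tolist()))
    n = len(s)
    mi = 0.0
    for (sv, rv), c in joint.items():
        p_sr = c / n
        mi += p_sr * np.log2(p_sr / ((ps[sv] / n) * (pr[rv] / n)))
    return mi

print("I(S;R) ~", plug_in_mi(stimuli, responses), "bits")
```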

Improving the Antinoise Ability of DNNs via a Bio-Inspired Noise Adaptive Activation Function Rand Softplus.

Neural computation
Although deep neural networks (DNNs) have led to many remarkable results in cognitive tasks, they are still far from catching up with human-level cognition in antinoise capability. New research indicates how brittle and susceptible current models are...
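
The truncated abstract does not give the Rand Softplus definition, so the sketch below only illustrates the general idea of a softplus-style activation with a per-unit noise-dependent mixing coefficient; the blending rule is a hypothetical placeholder, not the paper's formula.

```python
# Sketch of a noise-adaptive softplus-style activation. The mixing rule below
# (a per-unit coefficient rho blending ReLU and softplus) is a hypothetical
# illustration, not the Rand Softplus definition from the paper.
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)).
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0)

def noise_adaptive_activation(x, rho):
    """Blend ReLU and softplus per unit; rho in [0, 1] would track noise level."""
    return (1.0 - rho) * np.maximum(x, 0) + rho * softplus(x)

rng = np.random.default_rng(0)
x = rng.normal(size=10)
rho = rng.uniform(0, 1, size=10)       # e.g., an estimated per-unit noise level
print(noise_adaptive_activation(x, rho))
```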

Learning Moral Graphs in Construction of High-Dimensional Bayesian Networks for Mixed Data.

Neural computation
Bayesian networks have been widely used in many scientific fields for describing the conditional independence relationships for a large set of random variables. This letter proposes a novel algorithm, the so-called p-learning algorithm, for learning m...
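
For context, a minimal sketch of what a moral graph is: the undirected graph obtained from a DAG by marrying the parents of each node and dropping edge directions. The example DAG is illustrative, and this is not the learning algorithm proposed in the letter, which recovers the moral graph from data.

```python
# Minimal sketch: constructing the moral graph of a known DAG by "marrying"
# the parents of each node and dropping edge directions. This shows the object
# being learned, not the algorithm that recovers it from data.
from itertools import combinations

def moral_graph(dag):
    """dag: {node: set of parents}. Returns the moral graph as a set of
    undirected edges (frozensets)."""
    edges = set()
    for child, parents in dag.items():
        for p in parents:                       # keep original edges, undirected
            edges.add(frozenset((p, child)))
        for u, v in combinations(parents, 2):   # marry co-parents
            edges.add(frozenset((u, v)))
    return edges

# Example DAG: A -> C, B -> C, C -> D
dag = {"A": set(), "B": set(), "C": {"A", "B"}, "D": {"C"}}
print(sorted(tuple(sorted(e)) for e in moral_graph(dag)))
# [('A', 'B'), ('A', 'C'), ('B', 'C'), ('C', 'D')]
```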