AIMC Journal: Neural computation

Showing 151 to 160 of 203 articles

Neural Information Processing and Computations of Two-Input Synapses.

Information processing in artificial neural networks is largely dependent on the nature of neuron models. While commonly used models are designed for linear integration of synaptic inputs, accumulating experimental evidence suggests that biological n...

Information Geometrically Generalized Covariate Shift Adaptation.

Many machine learning methods assume that the training and test data follow the same distribution. However, in the real world, this assumption is often violated. In particular, the marginal distribution of the data changes, called covariate shift, is...

Probing the Relationship Between Latent Linear Dynamical Systems and Low-Rank Recurrent Neural Network Models.

A large body of work has suggested that neural populations exhibit low-dimensional dynamics during behavior. However, there are a variety of different approaches for modeling low-dimensional neural population activity. One approach involves latent li...

Scalability of Large Neural Network Simulations via Activity Tracking With Time Asynchrony and Procedural Connectivity.

We present a new algorithm to efficiently simulate random models of large neural networks satisfying the property of time asynchrony. The model parameters (average firing rate, number of neurons, synaptic connection probability, and postsynaptic dura...

Permitted Sets and Convex Coding in Nonthreshold Linear Networks.

Hebbian theory proposes that ensembles of neurons form a basis for neural processing. It is possible to gain insight into the activity patterns of these neural ensembles through a binary analysis, regarding neurons as either active or inactive. The f...

Differential Geometry Methods for Constructing Manifold-Targeted Recurrent Neural Networks.

Neural computations can be framed as dynamical processes, whereby the structure of the dynamics within a neural network is a direct reflection of the computations that the network performs. A key step in generating mechanistic interpretations within ...

Invariance, Encodings, and Generalization: Learning Identity Effects With Neural Networks.

Often in language and other areas of cognition, whether two components of an object are identical or not determines if it is well formed. We call such constraints identity effects. When developing a system to learn well-formedness from examples, it i...

Predictive Coding Approximates Backprop Along Arbitrary Computation Graphs.

Backpropagation of error (backprop) is a powerful algorithm for training machine learning architectures through end-to-end differentiation. Recently it has been shown that backprop in multilayer perceptrons (MLPs) can be approximated using predictive...

Hypothesis Test and Confidence Analysis With Wasserstein Distance on General Dimension.

We develop a general framework for statistical inference with the 1-Wasserstein distance. Recently, the Wasserstein distance has attracted considerable attention and has been widely applied to various machine learning tasks because of its excellent p...

Advancements in Algorithms and Neuromorphic Hardware for Spiking Neural Networks.

Artificial neural networks (ANNs) have experienced a rapid advancement for their success in various application domains, including autonomous driving and drone vision. Researchers have been improving the performance efficiency and computational requi...