Neural Computation
Sep 12, 2022
Information processing in artificial neural networks depends largely on the nature of the neuron model. While commonly used models are designed for linear integration of synaptic inputs, accumulating experimental evidence suggests that biological n...
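The contrast between linear and nonlinear synaptic integration can be made concrete with a small sketch. The quadratic term below is one hypothetical way to model nonlinear dendritic integration, not necessarily the model this paper proposes; the sizes and the tanh output nonlinearity are illustrative.

```python
import numpy as np

def linear_neuron(x, w, b):
    """Conventional unit: linear integration of inputs, then a nonlinearity."""
    return np.tanh(w @ x + b)

def quadratic_neuron(x, w, W2, b):
    """Hypothetical nonlinear-integration unit: adds a quadratic term x^T W2 x,
    one simple way to capture nonlinear dendritic integration."""
    return np.tanh(w @ x + x @ W2 @ x + b)

rng = np.random.default_rng(0)
x = rng.normal(size=5)                 # synaptic inputs
w, b = rng.normal(size=5), 0.1         # linear weights and bias
W2 = 0.1 * rng.normal(size=(5, 5))     # pairwise interaction weights
print(linear_neuron(x, w, b), quadratic_neuron(x, w, W2, b))
```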
Neural Computation
Aug 16, 2022
Many machine learning methods assume that the training and test data follow the same distribution. However, in the real world, this assumption is often violated. In particular, a change in the marginal distribution of the data, called covariate shift, is...
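A classical response to covariate shift is importance weighting: reweight each training example by the density ratio p_test(x)/p_train(x) before fitting. A minimal sketch, assuming the two densities are known Gaussians (in practice the ratio must be estimated):

```python
import numpy as np

rng = np.random.default_rng(0)

def p_train(x): return np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)        # N(0, 1)
def p_test(x):  return np.exp(-0.5 * (x - 1)**2) / np.sqrt(2 * np.pi)  # N(1, 1)

x = rng.normal(0.0, 1.0, size=500)           # training inputs ~ p_train
y = np.sin(x) + 0.1 * rng.normal(size=500)   # both domains share p(y|x)

w = p_test(x) / p_train(x)                   # importance weights
# Weighted least squares for a line y = a*x + b under the test distribution:
# minimize sum_i w_i (y_i - a*x_i - b)^2 via sqrt-weighted lstsq.
A = np.stack([x, np.ones_like(x)], axis=1)
a, b = np.linalg.lstsq(A * w[:, None]**0.5, y * w**0.5, rcond=None)[0]
print(a, b)
```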
Neural Computation
Aug 16, 2022
A large body of work has suggested that neural populations exhibit low-dimensional dynamics during behavior. However, there are a variety of approaches to modeling low-dimensional neural population activity. One approach involves latent li...
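A minimal sketch of the latent linear dynamical systems idea: a low-dimensional latent state evolves linearly and is read out into high-dimensional population activity, so most population variance concentrates in a few dimensions. Dimensions, dynamics, and noise levels here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d_latent, d_neurons = 200, 2, 50

theta = 0.1
A = 0.98 * np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])  # slowly rotating latent dynamics
C = rng.normal(size=(d_neurons, d_latent))              # loading onto neurons

z = np.zeros((T, d_latent))
x = np.zeros((T, d_neurons))
for t in range(1, T):
    z[t] = A @ z[t - 1] + 0.1 * rng.normal(size=d_latent)   # latent state
    x[t] = C @ z[t] + 0.05 * rng.normal(size=d_neurons)     # observed population

# Most variance of x is captured by ~d_latent principal components.
evals = np.linalg.eigvalsh(np.cov(x.T))[::-1]
print(evals[:4] / evals.sum())
```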
Neural Computation
Aug 16, 2022
We present a new algorithm to efficiently simulate random models of large neural networks satisfying the property of time asynchrony. The model parameters (average firing rate, number of neurons, synaptic connection probability, and postsynaptic dura...
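Time asynchrony means that, with probability one, no two neurons spike at exactly the same instant, which permits event-driven simulation that jumps from spike to spike. The sketch below is a generic Gillespie-style event loop over rate-based neurons, not the paper's algorithm; the rates, connectivity, and interaction rule are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, base_rate, p_conn = 100, 5.0, 0.05    # neurons, baseline rate (Hz), connection prob.
W = (rng.random((n, n)) < p_conn) * 0.1  # random excitatory weights
drive = np.zeros(n)                      # extra drive from recent presynaptic spikes

t, t_end, spikes = 0.0, 1.0, []
while True:
    rates = base_rate + drive            # per-neuron instantaneous rates
    total = rates.sum()
    t += rng.exponential(1.0 / total)    # time to the next spike anywhere in the network
    if t >= t_end:
        break
    i = rng.choice(n, p=rates / total)   # which neuron spikes
    spikes.append((t, i))
    drive = 0.5 * drive + W[:, i] * base_rate  # crude decay plus postsynaptic kick
print(len(spikes), "spikes in", t_end, "s")
```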
Neural Computation
Aug 16, 2022
Hebbian theory proposes that ensembles of neurons form a basis for neural processing. It is possible to gain insight into the activity patterns of these neural ensembles through a binary analysis, regarding neurons as either active or inactive. The f...
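A minimal sketch of such a binary analysis on synthetic data: threshold each neuron's activity into active/inactive per time bin, then count which binary population patterns recur. The two co-activated groups stand in for ensembles; all parameters are illustrative.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
n_neurons, n_bins = 8, 1000
rates = np.full((n_bins, n_neurons), 0.05)   # baseline probability of being active
ens_a, ens_b = [0, 1, 2], [3, 4, 5]          # two synthetic ensembles
on_a = rng.random(n_bins) < 0.2              # bins where ensemble A is engaged
on_b = rng.random(n_bins) < 0.2
rates[np.ix_(on_a, ens_a)] = 0.9
rates[np.ix_(on_b, ens_b)] = 0.9

binary = (rng.random((n_bins, n_neurons)) < rates).astype(int)  # active/inactive
patterns = Counter(map(tuple, binary))       # frequency of each population pattern
for pat, count in patterns.most_common(5):
    print(pat, count)
```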
Neural Computation
Jul 14, 2022
Neural computations can be framed as dynamical processes, whereby the structure of the dynamics within a neural network is a direct reflection of the computations that the network performs. A key step in generating mechanistic interpretations within ...
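One common way to take that step is to locate fixed points of the network's dynamics, e.g., by minimizing the speed q(x) = ||F(x) - x||^2 where F is the update rule. A minimal sketch for a vanilla tanh network with random weights; the update rule and plain gradient descent are assumptions, not necessarily this paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10
J = 1.2 * rng.normal(size=(n, n)) / np.sqrt(n)  # recurrent weights (gain > 1)

F = lambda x: np.tanh(J @ x)                    # one step of the dynamics

def grad_q(x):
    # Analytic gradient of q(x) = ||F(x) - x||^2 for F(x) = tanh(J x):
    # dq/dx = 2 * (J^T diag(1 - F(x)^2) - I) (F(x) - x)
    r = F(x) - x
    return 2 * ((J.T * (1 - F(x)**2)) @ r - r)

# x = 0 is always a fixed point of tanh(J x); with gain > 1 nontrivial ones can
# exist too, and descent settles into whichever basin the start falls in.
x = rng.normal(size=n)
for _ in range(5000):
    x -= 0.05 * grad_q(x)
print("residual speed:", np.linalg.norm(F(x) - x))
```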
Neural Computation
Jul 14, 2022
Often in language and other areas of cognition, whether two components of an object are identical determines whether the object is well formed. We call such constraints identity effects. When developing a system to learn well-formedness from examples, it i...
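A toy version of the learning problem: two-letter strings are well formed iff the letters are identical, and the learner is tested on letters absent from training. The feature map and min-norm least-squares learner below are illustrative choices; the point is that the encoding gives the model no basis to extend "sameness" to unseen symbols.

```python
import numpy as np

letters = list("abcdefgh")
train_pool, test_pool = letters[:6], letters[6:]   # hold out two letters

def onehot(c):
    v = np.zeros(len(letters))
    v[letters.index(c)] = 1.0
    return v

def features(a, b):
    # One-hot codes plus their elementwise product (an explicit "same letter" cue).
    return np.concatenate([onehot(a), onehot(b), onehot(a) * onehot(b)])

def dataset(pool):
    X = np.array([features(a, b) for a in pool for b in pool])
    y = np.array([1.0 if a == b else 0.0 for a in pool for b in pool])
    return X, y

Xtr, ytr = dataset(train_pool)
w = np.linalg.lstsq(Xtr, ytr, rcond=None)[0]       # min-norm least squares

Xte, yte = dataset(test_pool)
print("train error:", np.abs(Xtr @ w - ytr).max())           # ~0: fits seen letters
print("test preds: ", np.round(Xte @ w, 3), "labels:", yte)  # all ~0: no transfer
```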
Neural Computation
May 19, 2022
Backpropagation of error (backprop) is a powerful algorithm for training machine learning architectures through end-to-end differentiation. Recently it has been shown that backprop in multilayer perceptrons (MLPs) can be approximated using predictive...
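A minimal predictive-coding sketch for a one-hidden-layer network: value nodes first relax to minimize layer-wise prediction errors, then each weight matrix updates locally from the settled errors, which is the sense in which such schemes approximate backprop. Layer sizes, learning rates, and the tanh nonlinearity are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
f, df = np.tanh, lambda u: 1 - np.tanh(u)**2

W1 = 0.5 * rng.normal(size=(4, 3))   # input -> hidden
W2 = 0.5 * rng.normal(size=(2, 4))   # hidden -> output

x0 = rng.normal(size=3)              # input (clamped)
target = np.array([0.5, -0.5])

x1 = W1 @ f(x0)                      # initialize hidden at feedforward values
x2 = target                          # clamp output to the target during learning

for _ in range(100):                 # inference: relax hidden activity
    e1 = x1 - W1 @ f(x0)             # prediction error at the hidden layer
    e2 = x2 - W2 @ f(x1)             # prediction error at the output layer
    x1 += 0.1 * (-e1 + df(x1) * (W2.T @ e2))   # gradient descent on total error energy

# Local weight updates from the settled prediction errors
lr = 0.05
W1 += lr * np.outer(e1, f(x0))
W2 += lr * np.outer(e2, f(x1))
print("output error after one update:",
      np.linalg.norm(W2 @ f(W1 @ f(x0)) - target))
```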
Neural Computation
May 19, 2022
We develop a general framework for statistical inference with the 1-Wasserstein distance. Recently, the Wasserstein distance has attracted considerable attention and has been widely applied to various machine learning tasks because of its excellent p...
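In one dimension the 1-Wasserstein distance between empirical distributions has a closed form: with equal sample sizes it is the mean absolute difference of the sorted samples (the L1 distance between quantile functions). A minimal sketch on synthetic Gaussians:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1000)   # samples from N(0, 1)
y = rng.normal(0.5, 1.0, size=1000)   # samples from N(0.5, 1)

# Empirical 1-Wasserstein distance for 1-D data with equal sample sizes.
w1 = np.mean(np.abs(np.sort(x) - np.sort(y)))
print(w1)   # close to 0.5, the true distance between the shifted Gaussians
```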
Neural Computation
May 19, 2022
Artificial neural networks (ANNs) have advanced rapidly owing to their success in various application domains, including autonomous driving and drone vision. Researchers have been improving the performance efficiency and computational requi...