AI Medical Compendium Journal:
Neural computation

Information-Theoretic Representation Learning for Positive-Unlabeled Classification.

Neural computation
Recent advances in weakly supervised classification allow us to train a classifier from only positive and unlabeled (PU) data. However, existing PU classification methods typically require an accurate estimate of the class-prior probability, a critic...
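
The teaser stops before explaining the method, so as a hedged point of reference only: the sketch below is the standard non-negative PU risk estimator (Kiryo et al., 2017), not this article's information-theoretic approach, and it shows where the class-prior probability pi enters. All names and constants are illustrative.

import numpy as np

def sigmoid_loss(z):
    # Surrogate loss l(z) = 1 / (1 + exp(z)); small when the margin z is large.
    return 1.0 / (1.0 + np.exp(z))

def nn_pu_risk(scores_p, scores_u, pi):
    # scores_p, scores_u: classifier scores f(x) on positive and unlabeled samples.
    # pi: class-prior probability P(y = +1), which must be estimated in practice.
    risk_pos = pi * np.mean(sigmoid_loss(scores_p))                  # pi * R_p^+(f)
    risk_neg = np.mean(sigmoid_loss(-scores_u)) - pi * np.mean(sigmoid_loss(-scores_p))
    return risk_pos + max(0.0, risk_neg)                             # clip the negative-risk term

rng = np.random.default_rng(0)
print(nn_pu_risk(rng.normal(1, 1, 500), rng.normal(0, 1, 500), pi=0.4))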

Resonator Networks, 1: An Efficient Solution for Factoring High-Dimensional, Distributed Representations of Data Structures.

Neural computation
The ability to encode and manipulate data structures with distributed neural representations could qualitatively enhance the capabilities of traditional neural networks by supporting rule-based symbolic reasoning, a central property of cognition. Her...
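
For orientation, a hedged NumPy sketch of the basic resonator iteration the title refers to: a composite bipolar hypervector s = a * b * c (elementwise binding) is factored by repeatedly unbinding the other two factor estimates and cleaning each estimate up through its codebook. Dimensions, codebook sizes, and the random initialization are toy choices of mine, not taken from the article.

import numpy as np

rng = np.random.default_rng(1)
D, K = 2000, 20                                    # hypervector dimension, codewords per factor

def bsign(x):
    # Bipolar sign that never returns zero.
    return np.where(x >= 0, 1, -1)

# Three random bipolar codebooks; the composite binds one codeword from each.
A, B, C = (rng.choice([-1, 1], size=(K, D)) for _ in range(3))
s = A[3] * B[7] * C[11]                            # elementwise (Hadamard) binding

# Start each factor estimate from a random bipolar guess.
ah, bh, ch = (rng.choice([-1, 1], size=D) for _ in range(3))

for _ in range(50):
    # Unbind the other two estimates from s, then clean up through the codebook.
    ah = bsign(A.T @ (A @ (s * bh * ch)))
    bh = bsign(B.T @ (B @ (s * ah * ch)))
    ch = bsign(C.T @ (C @ (s * ah * bh)))

print(np.argmax(A @ ah), np.argmax(B @ bh), np.argmax(C @ ch))   # expect 3 7 11 once converged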

Redundancy-Aware Pruning of Convolutional Neural Networks.

Neural computation
Pruning is an effective way to slim down and speed up convolutional neural networks. Generally, previous work directly pruned neural networks in the original feature space without considering the correlations among neurons. We argue that such a way of pruning ...
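
As a hedged, generic illustration of correlation-aware redundancy (not the article's actual pruning procedure), the sketch below scores convolutional channels by how strongly their activations correlate with other channels and marks the most redundant ones for removal; the threshold-free top-k choice is an assumption of mine.

import numpy as np

def redundant_channels(acts, n_prune):
    # acts: activations of one conv layer with shape (batch, channels, H, W).
    b, c, h, w = acts.shape
    flat = acts.transpose(1, 0, 2, 3).reshape(c, -1)      # one row per channel
    corr = np.corrcoef(flat)                               # channel-by-channel correlation
    np.fill_diagonal(corr, 0.0)
    # A channel whose activations track another channel closely adds little information.
    redundancy = np.abs(corr).max(axis=1)
    return np.argsort(redundancy)[-n_prune:]               # indices of the most redundant channels

acts = np.random.rand(32, 64, 14, 14)                      # toy activations
print(redundant_channels(acts, n_prune=8))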

Resonator Networks, 2: Factorization Performance and Capacity Compared to Optimization-Based Methods.

Neural computation
We develop theoretical foundations of resonator networks, a new type of recurrent neural network introduced in Frady, Kent, Olshausen, and Sommer (2020), a companion article in this issue, to solve a high-dimensional vector factorization problem aris...

Differential Covariance: A New Method to Estimate Functional Connectivity in fMRI.

Neural computation
Measuring functional connectivity from fMRI recordings is important in understanding processing in cortical networks. However, because the brain's connection pattern is complex, currently used methods are prone to producing false functional connectio...
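
As a hedged sketch of the core quantity the title names (assuming differential covariance means the covariance between each signal's temporal derivative and the signals themselves, with the article's further corrections omitted):

import numpy as np

def differential_covariance(X, dt=1.0):
    # X: ROI time series with shape (n_regions, n_timepoints); dt: sampling interval.
    Xc = X - X.mean(axis=1, keepdims=True)           # remove each region's mean
    dX = np.gradient(Xc, dt, axis=1)                  # central-difference temporal derivative
    return dX @ Xc.T / (X.shape[1] - 1)               # C[i, j] ~ cov(dX_i/dt, X_j)

X = np.cumsum(np.random.randn(10, 300), axis=1)       # toy signals: 10 regions, 300 volumes
print(differential_covariance(X, dt=2.0).shape)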

Analyzing and Accelerating the Bottlenecks of Training Deep SNNs With Backpropagation.

Neural computation
Spiking neural networks (SNNs), which transmit information through event-driven spikes, consume ultra-low power on neuromorphic chips. However, training deep SNNs is still challenging compared to training convolutional neural networks (CNNs). The SNN training algo...
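
For context, a hedged PyTorch sketch of the surrogate-gradient trick that backpropagation through spiking neurons commonly relies on (a generic illustration, not the article's bottleneck analysis): the non-differentiable spike is kept in the forward pass, while the backward pass substitutes a smooth pseudo-derivative. The boxcar window and all constants are assumptions.

import torch

class SurrogateSpike(torch.autograd.Function):
    # Heaviside spike in the forward pass, boxcar pseudo-derivative in the backward pass.
    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        return grad_output * (v.abs() < 0.5).float()   # gradient only near the threshold

spike = SurrogateSpike.apply

v = torch.zeros(8, 100, requires_grad=True)            # membrane potentials (toy sizes)
v_new = 0.9 * v + torch.randn(8, 100)                   # leak plus input current
out = spike(v_new - 1.0)                                 # threshold at 1.0
out.sum().backward()                                     # gradients flow through the surrogate
print(v.grad.abs().sum())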

Bicomplex Projection Rule for Complex-Valued Hopfield Neural Networks.

Neural computation
A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory. Its weight parameters require substantial memory resources. Twin-multistate activation functions were introduced to qu...
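
For background only, a hedged NumPy sketch of the conventional projection (pseudo-inverse) learning rule for a complex-valued Hopfield network with a multistate phase-quantizing activation; the bicomplex and twin-multistate construction of the article is not reproduced here, and the sizes below are arbitrary.

import numpy as np

K, N, P = 8, 64, 5                                   # phase states, neurons, stored patterns
rng = np.random.default_rng(0)

def csign(z, K):
    # Multistate activation: quantize the phase of z to the K complex roots of unity.
    k = np.round(np.angle(z) / (2 * np.pi / K)) % K
    return np.exp(2j * np.pi * k / K)

# Stored patterns are random K-state phasor vectors (columns of S).
S = np.exp(2j * np.pi * rng.integers(0, K, size=(N, P)) / K)

# Projection rule: W = S (S^H S)^{-1} S^H, so each stored pattern is a fixed point.
W = S @ np.linalg.inv(S.conj().T @ S) @ S.conj().T

# Recall from a phase-perturbed probe by iterating x <- csign(W x).
x = S[:, 0] * np.exp(1j * 0.2 * rng.standard_normal(N))
for _ in range(10):
    x = csign(W @ x, K)
print(np.abs(np.vdot(S[:, 0], x)) / N)                # overlap close to 1 when recall succeeds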

Reverse-Engineering Neural Networks to Characterize Their Cost Functions.

Neural computation
This letter considers a class of biologically plausible cost functions for neural networks, where the same cost function is minimized by both neural activity and plasticity. We show that such cost functions can be cast as a variational bound on model...
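
The truncated sentence points at casting such cost functions as a variational bound on model evidence. As a hedged reminder of the standard bound in question (generic variational free energy, not necessarily the article's exact formulation):

F(q) = E_{q(s)}[ \ln q(s) - \ln p(o, s) ]
     = D_{KL}[ q(s) \,\|\, p(s \mid o) ] - \ln p(o)
     \ge -\ln p(o),

so a network whose activity and plasticity both descend on F implicitly performs approximate Bayesian inference under the generative model p(o, s).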

A Predictive-Coding Network That Is Both Discriminative and Generative.

Neural computation
Predictive coding (PC) networks are a biologically interesting class of neural networks. Their layered hierarchy mimics the reciprocal connectivity pattern observed in the mammalian cortex, and they can be trained using local learning rules that appr...
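
As a hedged, textbook-style illustration of the mechanism such networks use (not the specific discriminative-and-generative architecture of the article): each layer predicts the layer below, and inference settles the latent activities by locally descending on the summed squared prediction errors. Layer sizes and step counts are arbitrary.

import numpy as np

rng = np.random.default_rng(0)
sizes = [8, 16, 32]                                   # top latent, hidden, observed layer widths
W = [0.1 * rng.standard_normal((sizes[i + 1], sizes[i])) for i in range(2)]   # top-down weights

def pc_infer(obs, W, steps=200, lr=0.05):
    # Clamp the bottom layer to the data; update the other layers by gradient descent.
    x = [0.1 * rng.standard_normal(s) for s in sizes[:-1]] + [obs]
    for _ in range(steps):
        e = [x[i + 1] - W[i] @ x[i] for i in range(len(W))]      # per-layer prediction errors
        for i in range(len(sizes) - 1):
            grad = (e[i - 1] if i > 0 else 0.0) - W[i].T @ e[i]
            x[i] = x[i] - lr * grad
    return x, e

x, e = pc_infer(rng.standard_normal(32), W)
print(sum(float(err @ err) for err in e))                         # residual prediction error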

Active Learning of Bayesian Linear Models with High-Dimensional Binary Features by Parameter Confidence-Region Estimation.

Neural computation
In this letter, we study an active learning problem for maximizing an unknown linear function with high-dimensional binary features. This problem is notoriously complex but arises in many important contexts. When the sampling budget, that is, the num...
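
Only the problem setup is visible in this teaser, so the sketch below is a hedged, generic stand-in rather than the article's confidence-region algorithm: Bayesian linear regression over binary feature vectors, with the next query chosen by an upper-confidence-bound score built from the posterior. The prior, noise level, and candidate pool are all assumptions.

import numpy as np

rng = np.random.default_rng(0)
d = 12                                                    # number of binary features
w_true = rng.standard_normal(d)                            # unknown linear function (toy oracle)

sigma2, A, b = 0.1, np.eye(d), np.zeros(d)                 # noise variance, posterior precision, precision-weighted mean
candidates = rng.integers(0, 2, size=(200, d)).astype(float)   # pool of binary feature vectors

for t in range(30):
    cov = np.linalg.inv(A)
    mean = cov @ b
    # UCB-style acquisition: posterior mean plus posterior standard deviation of x @ w.
    score = candidates @ mean + np.sqrt(np.einsum('ij,jk,ik->i', candidates, cov, candidates))
    x = candidates[np.argmax(score)]
    y = x @ w_true + np.sqrt(sigma2) * rng.standard_normal()       # noisy observation of the query
    A += np.outer(x, x) / sigma2                                    # standard Bayesian linear update
    b += x * y / sigma2

best = candidates[np.argmax(candidates @ (np.linalg.inv(A) @ b))]
print(best @ w_true)                                                # value of the estimated maximizer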