AI Medical Compendium Journal: Neural computation

Showing 141 to 150 of 203 articles

Reducing Catastrophic Forgetting With Associative Learning: A Lesson From Fruit Flies.

Catastrophic forgetting remains an outstanding challenge in continual learning. Recently, methods inspired by the brain, such as continual representation learning and memory replay, have been used to combat catastrophic forgetting. Associative learni...

Mean-Field Approximations With Adaptive Coupling for Networks With Spike-Timing-Dependent Plasticity.

Understanding the effect of spike-timing-dependent plasticity (STDP) is key to elucidating how neural networks change over long timescales and to design interventions aimed at modulating such networks in neurological disorders. However, progress is r...

Sensitivity to Control Signals in Triphasic Rhythmic Neural Systems: A Comparative Mechanistic Analysis via Infinitesimal Local Timing Response Curves.

Similar activity patterns may arise from model neural networks with distinct coupling properties and individual unit dynamics. These similar patterns may, however, respond differently to parameter variations and specifically to tuning of inputs that ...

Modern Artificial Neural Networks: Is Evolution Cleverer?

Machine learning tools, particularly artificial neural networks (ANN), have become ubiquitous in many scientific disciplines, and machine learning-based techniques flourish not only because of the expanding computational power and the increasing avai...

Strong Allee Effect Synaptic Plasticity Rule in an Unsupervised Learning Environment.

Synaptic plasticity, or the ability of a brain to change one or more of its functions or structures at the synaptic level, has generated and is still generating a lot of interest from the scientific community, especially from neuroscientists. These in...

Large Language Models and the Reverse Turing Test.

Large language models (LLMs) have been transformative. They are pretrained foundational models that are self-supervised and can be adapted with fine-tuning to a wide range of natural language tasks, each of which previously would have required a sepa...

Toward Network Intelligence.

This article proposes a conceptual framework to guide research in neural computation by relating it to mathematical progress in other fields and to examples illustrative of biological networks. The goal is to provide insight into how biological netwo...

Progressive Interpretation Synthesis: Interpreting Task Solving by Quantifying Previously Used and Unused Information.

A deep neural network is a good task solver, but it is difficult to make sense of its operation. People have different ideas about how to interpret its operation. We look at this problem from a new perspective where the interpretation of task solving...

Capacity Limitations of Visual Search in Deep Convolutional Neural Networks.

Deep convolutional neural networks (CNNs) follow roughly the architecture of biological visual systems and have shown performance comparable to human observers in object classification tasks. In this study, three deep neural networks pretrained for ...

Recurrent Neural-Linear Posterior Sampling for Nonstationary Contextual Bandits.

An agent in a nonstationary contextual bandit problem should balance between exploration and the exploitation of (periodic or structured) patterns present in its previous experiences. Handcrafting an appropriate historical context is an attractive al...