Recent advances in network science, control theory, and fractional calculus provide us with the mathematical tools necessary for modeling and controlling complex dynamical networks (CDNs) that exhibit long-term memory. Selecting the minimum number of dri...
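As a rough illustration of the fractional-calculus framing (a generic standard form, not necessarily the exact model used in this work), long-term memory is often encoded through a fractional-order state equation:

$$ {}^{C}\!D^{\alpha}_{t}\, x(t) = A\,x(t) + B\,u(t), \qquad 0 < \alpha < 1, $$

where the Caputo derivative

$$ {}^{C}\!D^{\alpha}_{t}\, x(t) = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{\dot{x}(\tau)}{(t-\tau)^{\alpha}}\, d\tau $$

weights the entire past trajectory (the source of the memory effect), $A$ is the network coupling matrix, and $B$ selects the driver nodes that receive the control input $u(t)$.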
Recurrent neural networks have led to breakthroughs in natural language processing and speech recognition. Here we show that recurrent networks, specifically long short-term memory (LSTM) networks, can also capture the temporal evolution of chemical/biophysi...
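For readers who want a concrete picture, the sketch below shows one common way an LSTM can be fit to a multivariate time series such as species concentrations. It assumes PyTorch; the architecture, sizes, and data are illustrative, not the authors' actual setup.

```python
import torch
import torch.nn as nn

# Illustrative only: a small LSTM that maps a window of past states
# to the next time step. The paper's architecture and data may differ.
class ConcentrationLSTM(nn.Module):
    def __init__(self, n_species, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_species, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_species)

    def forward(self, x):             # x: (batch, time, n_species)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # predict the next state from the last hidden state

model = ConcentrationLSTM(n_species=3)
x = torch.randn(8, 50, 3)             # 8 trajectories, 50 time steps, 3 species
next_state = model(x)                  # shape (8, 3)
```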
Attractor neural networks such as the Hopfield model can be used to model associative memory. An efficient associative memory should be able to store a large number of patterns which must all be stable. We study in detail the meaning and definition o...
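To make the stability requirement concrete, here is a minimal NumPy sketch of the standard Hopfield construction: patterns are stored with the Hebbian rule and a pattern counts as stable if it is a fixed point of the sign update. Sizes and the random patterns are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                        # neurons and stored patterns (illustrative sizes)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian storage: W_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with zero self-coupling
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def is_stable(xi, W):
    """A pattern is stable if every neuron keeps its sign under the update rule."""
    return np.all(np.sign(W @ xi) == xi)

print(sum(is_stable(p, W) for p in patterns), "of", P, "patterns are fixed points")
```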
Predicting and modeling human behavior and finding trends within human decision-making processes is a major problem in social science. Rock Paper Scissors (RPS) embodies the fundamental strategic question underlying many game theory problems and real-world competi...
Zebrafish have quickly emerged as a species of choice in preclinical research, holding promise to advance the field of behavioral pharmacology through high-throughput experiments. Besides biological and heuristic considerations, zebrafish also consti...
A crossbar array architecture employing resistive switching memory (RRAM) as a synaptic element accelerates vector-matrix multiplication in a parallel fashion, enabling energy-efficient pattern recognition. To implement the function of the synapse in...
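The parallel vector-matrix multiplication can be seen in an idealized crossbar read-out: row voltages times cross-point conductances give column currents in a single step (Ohm's law plus Kirchhoff's current law). The sketch below is an idealization with illustrative values; real RRAM arrays add wire resistance, device variability, and limited conductance levels.

```python
import numpy as np

# Idealized crossbar read: input voltages on the rows, conductance G_ij at each
# cross-point, and column currents I = G^T V. Values are illustrative.
rows, cols = 4, 3
G = np.random.uniform(1e-6, 1e-4, size=(rows, cols))  # conductances in siemens
V = np.array([0.2, 0.0, 0.2, 0.1])                    # read voltages in volts

I = G.T @ V            # one parallel step performs the full vector-matrix product
print(I)               # column currents encode the multiply-accumulate result
```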
Sequential neural activity has been observed in many parts of the brain and has been proposed as a neural mechanism for memory. The natural world expresses temporal relationships at a wide range of scales. Because we cannot know the relevant scales a...
We study the learning of an external signal by a neural network and the time to forget it when this network is subjected to noise. The presentation of an external stimulus to the recurrent network of binary neurons may change the state of the synapse...
Spiking neural networks (SNN) are computational models inspired by the brain's ability to naturally encode and process information in the time domain. The added temporal dimension is believed to render them more computationally efficient than the con...
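As a small aside on what "encoding information in the time domain" means in practice, the sketch below simulates a single leaky integrate-and-fire neuron, the simplest spiking unit: the output is a set of spike times rather than a static activation. All parameters are illustrative and not drawn from the abstract above.

```python
import numpy as np

# Minimal leaky integrate-and-fire neuron; parameters are illustrative.
dt, T = 1e-3, 0.2                        # time step and duration in seconds
tau, v_thresh, v_reset = 20e-3, 1.0, 0.0

def input_current(t):
    return 1.2 if 0.05 < t < 0.15 else 0.0   # step input during 50-150 ms

v, spikes = 0.0, []
for step in range(int(T / dt)):
    t = step * dt
    v += dt / tau * (-v + input_current(t))  # leaky integration of the input
    if v >= v_thresh:                        # threshold crossing emits a spike
        spikes.append(t)
        v = v_reset

print("spike times (s):", np.round(spikes, 3))
```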
In 1982, John Hopfield published a neural network model for memory retrieval, a model that became a cornerstone in theoretical neuroscience. In a recent paper, Krotov and Hopfield built on these early studies and showed how a network that incorporate...