AIMC Topic: Memory

Showing 61 to 70 of 210 articles

A Fast Spatial Pool Learning Algorithm of Hierarchical Temporal Memory Based on Minicolumn's Self-Nomination.

Computational intelligence and neuroscience
As a new type of artificial neural network model, hierarchical temporal memory (HTM) has become a focus of current research and application. Sparse distributed representation is the basis of the HTM model, but existing spatial pool learning algorithms have high training ...
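For context, the HTM spatial pooler referred to here converts an input bit vector into a sparse distributed representation by scoring each minicolumn's overlap with the input, letting the top-k minicolumns win through inhibition, and reinforcing the winners' synapses. A minimal Python sketch of that baseline follows; the sizes, thresholds, and learning constants are illustrative assumptions, and the paper's minicolumn self-nomination speed-up is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_COLS, N_ACTIVE = 256, 128, 8        # input bits, minicolumns, winners (illustrative sizes)
CONNECTED, P_INC, P_DEC = 0.5, 0.03, 0.015  # permanence threshold and Hebbian increments

# Each minicolumn keeps a permanence value per input bit; a synapse counts as
# "connected" once its permanence crosses the threshold.
perm = rng.uniform(0.3, 0.7, size=(N_COLS, N_IN))

def spatial_pool(x, learn=True):
    """Map a binary input vector to a sparse set of active minicolumns (an SDR)."""
    connected = (perm >= CONNECTED).astype(int)
    overlap = connected @ x                      # active connected synapses per minicolumn
    winners = np.argsort(overlap)[-N_ACTIVE:]    # k-winners-take-all inhibition
    if learn:
        # Hebbian update: winners strengthen synapses to active bits, weaken the rest.
        perm[winners] += np.where(x > 0, P_INC, -P_DEC)
        np.clip(perm, 0.0, 1.0, out=perm)
    sdr = np.zeros(N_COLS, dtype=np.uint8)
    sdr[winners] = 1
    return sdr

x = (rng.random(N_IN) < 0.1).astype(int)         # a sparse random input pattern
print(int(spatial_pool(x).sum()), "active minicolumns")
```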

Anomalous Behavior Detection Framework Using HTM-Based Semantic Folding Technique.

Computational and mathematical methods in medicine
Based on the working principles of the human neocortex, the Hierarchical Temporal Memory (HTM) model has been developed as a proposed theoretical framework for sequence learning. HTM handles both categorical and numerical types of data. Semantic F...
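Semantic folding represents text as long, sparse binary "fingerprints" whose bit overlap measures semantic similarity, which is what makes such data consumable by an HTM anomaly detector. The toy sketch below uses a hash-based stand-in fingerprint (not Cortical.io's retina encoding) purely to illustrate the overlap metric; the vector width and bits-per-token are assumptions.

```python
import hashlib
import numpy as np

FP_BITS = 2048                                   # fingerprint width (illustrative)

def fingerprint(tokens):
    """Toy semantic fingerprint: hash each token onto a handful of positions
    in a long, sparse binary vector."""
    fp = np.zeros(FP_BITS, dtype=np.uint8)
    for tok in tokens:
        h = int(hashlib.md5(tok.encode()).hexdigest(), 16)
        for i in range(8):                       # 8 bit positions per token
            fp[(h >> (i * 16)) % FP_BITS] = 1
    return fp

def overlap(a, b):
    """Similarity of two fingerprints = number of shared active bits."""
    return int(np.sum(a & b))

normal = fingerprint("user login read report logout".split())
unusual = fingerprint("user login export database delete logs".split())
print("self overlap:", overlap(normal, normal))
print("cross overlap:", overlap(normal, unusual))
```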

Quantization Friendly MobileNet (QF-MobileNet) Architecture for Vision Based Applications on Embedded Platforms.

Neural networks : the official journal of the International Neural Network Society
Deep Neural Networks (DNNs) have become popular for various applications in the domain of image and computer vision due to their well-established performance attributes. DNN algorithms involve powerful multilevel feature extractions resulting in an e...
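Quantization friendliness here is about preserving accuracy when weights and activations are mapped to low-precision integers for embedded inference. Below is a minimal sketch of symmetric per-tensor int8 quantization of a weight tensor, with the tensor shape and scaling rule as illustrative assumptions rather than the QF-MobileNet scheme itself.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization: w is approximated by scale * q."""
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map int8 codes back to float for comparison against the original tensor."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.05, size=(32, 3, 3, 3)).astype(np.float32)  # a conv-kernel-shaped tensor
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
print("max abs quantization error:", float(np.max(np.abs(w - w_hat))))
```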

A comprehensive study of class incremental learning algorithms for visual tasks.

Neural networks : the official journal of the International Neural Network Society
The ability of artificial agents to increment their capabilities when confronted with new data is an open challenge in artificial intelligence. The main challenge faced in such cases is catastrophic forgetting, i.e., the tendency of neural networks t...
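One common family of remedies surveyed in class-incremental learning is rehearsal: keep a small exemplar memory of past classes and replay it while training on new classes. A minimal sketch follows, assuming a per-class reservoir buffer; the capacity and sampling policy are illustrative, not the specific methods benchmarked in the study.

```python
import random

class ReplayBuffer:
    """Fixed-size exemplar memory: keep a few samples per class already seen and
    replay them alongside each new task's data to reduce catastrophic forgetting."""

    def __init__(self, per_class=20):
        self.per_class = per_class
        self.store = {}    # class label -> stored exemplars
        self.seen = {}     # class label -> number of samples seen so far

    def add(self, sample, label):
        bucket = self.store.setdefault(label, [])
        self.seen[label] = self.seen.get(label, 0) + 1
        if len(bucket) < self.per_class:
            bucket.append(sample)
        else:
            # Reservoir sampling keeps a uniform subsample of each class's stream.
            j = random.randrange(self.seen[label])
            if j < self.per_class:
                bucket[j] = sample

    def sample(self, k):
        pool = [(s, y) for y, bucket in self.store.items() for s in bucket]
        return random.sample(pool, min(k, len(pool)))

buf = ReplayBuffer(per_class=2)
for i in range(12):
    buf.add(f"img_{i}", label=i % 3)
print(buf.sample(4))
```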

A Brain-Inspired Framework for Evolutionary Artificial General Intelligence.

IEEE transactions on neural networks and learning systems
From the medical field to agriculture, from energy to transportation, every industry is going through a revolution by embracing artificial intelligence (AI); nevertheless, AI is still in its infancy. Inspired by the evolution of the human brain, this...

Image memorability is predicted by discriminability and similarity in different stages of a convolutional neural network.

Learning & memory (Cold Spring Harbor, N.Y.)
The features of an image can be represented at multiple levels, from its low-level visual properties to high-level meaning. What drives some images to be memorable while others are forgettable? We address this question across two behavioral experiment...
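A simple proxy for the kind of feature-space discriminability the study relates to memorability is an image's distinctiveness: its mean distance to other images in a network layer's embedding. The sketch below uses random placeholder features and memorability scores and does not reproduce the paper's layer-wise analysis.

```python
import numpy as np

def distinctiveness(features):
    """Mean Euclidean distance from each image's feature vector to all the others;
    higher values mean the image sits farther from the rest of the set."""
    d = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    return d.sum(axis=1) / (len(features) - 1)

rng = np.random.default_rng(0)
feats = rng.normal(size=(100, 512))       # placeholder layer activations for 100 images
memorability = rng.random(100)            # placeholder behavioral memorability scores
r = np.corrcoef(distinctiveness(feats), memorability)[0, 1]
print(f"r = {r:.3f}")
```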

Echo Memory-Augmented Network for time series classification.

Neural networks : the official journal of the International Neural Network Society
Echo State Networks (ESNs) are efficient recurrent neural networks (RNNs) which have been successfully applied to time series modeling tasks. However, ESNs are unable to capture the history information far from the current time step, since the echo s...
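For reference, a standard leaky echo state network keeps a fixed random reservoir whose state is updated as x(t+1) = (1 - a) x(t) + a tanh(W_in u(t+1) + W x(t)), and only a ridge-regression readout is trained. The sketch below shows that baseline on a toy one-step prediction task; the reservoir size, leak rate, and spectral radius are illustrative, and the paper's echo memory augmentation is not included.

```python
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_RES, LEAK, RIDGE = 1, 200, 0.3, 1e-6       # sizes and constants are illustrative

W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))
W = rng.normal(0, 1, (N_RES, N_RES))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius below 1 (echo state property)

def run_reservoir(u_seq):
    """Collect reservoir states for an input sequence (leaky-integrator ESN update)."""
    x = np.zeros(N_RES)
    states = []
    for u in u_seq:
        pre = W_in @ np.atleast_1d(u) + W @ x
        x = (1 - LEAK) * x + LEAK * np.tanh(pre)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)
W_out = np.linalg.solve(X.T @ X + RIDGE * np.eye(N_RES), X.T @ y)   # ridge-regression readout
print("train MSE:", float(np.mean((X @ W_out - y) ** 2)))
```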

Embracing Change: Continual Learning in Deep Neural Networks.

Trends in cognitive sciences
Artificial intelligence research has seen enormous progress over the past few decades, but it predominantly relies on fixed datasets and stationary environments. Continual learning is an increasingly relevant area of study that asks how artificial sy...
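One widely cited regularization-based strategy in this literature is elastic weight consolidation, which adds a quadratic penalty L_new + (lambda/2) * sum_i F_i (theta_i - theta_i*)^2 so that parameters important to earlier tasks resist drifting. Below is a minimal sketch of just that penalty term, with random placeholder parameters and Fisher estimates; it is one example technique, not the review's own proposal.

```python
import numpy as np

def ewc_penalty(params, old_params, fisher, lam=1.0):
    """EWC-style regularizer: penalize moving parameters that carried high
    Fisher information (importance) for previously learned tasks."""
    return 0.5 * lam * float(np.sum(fisher * (params - old_params) ** 2))

rng = np.random.default_rng(0)
theta_star = rng.normal(size=10)               # parameters after the old task
fisher = rng.random(10)                        # placeholder per-parameter importance estimates
theta = theta_star + rng.normal(scale=0.1, size=10)   # parameters drifting on the new task
print("penalty:", ewc_penalty(theta, theta_star, fisher, lam=5.0))
```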

Capturing human categorization of natural images by combining deep networks and cognitive models.

Nature communications
Human categorization is one of the most important and successful targets of cognitive modeling, with decades of model development and assessment using simple, low-dimensional artificial stimuli. However, it remains unclear how these findings relate t...
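A typical way to combine the two ingredients in the title is to run a classical exemplar (GCM-style) categorization model on top of deep-network image embeddings. The sketch below uses random placeholder embeddings and labels; the similarity kernel and scaling parameter c are illustrative assumptions rather than the paper's fitted model.

```python
import numpy as np

def exemplar_probs(query, exemplars, labels, n_classes, c=1.0):
    """GCM-style exemplar model: a category's probability is proportional to the
    summed exponential-decay similarity to stored exemplars of that category."""
    d = np.linalg.norm(exemplars - query, axis=1)    # distances in embedding space
    sim = np.exp(-c * d)
    scores = np.array([sim[labels == k].sum() for k in range(n_classes)])
    return scores / scores.sum()

rng = np.random.default_rng(0)
emb = rng.normal(size=(200, 64))          # placeholder deep-network embeddings of known images
lab = rng.integers(0, 5, size=200)        # their category labels
query = rng.normal(size=64)               # embedding of a new image to categorize
print(exemplar_probs(query, emb, lab, n_classes=5).round(3))
```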

Overparameterized neural networks implement associative memory.

Proceedings of the National Academy of Sciences of the United States of America
Identifying computational mechanisms for memorization and retrieval of data is a long-standing problem at the intersection of machine learning and neuroscience. Our main finding is that standard overparameterized deep neural networks trained using st...
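The retrieval mechanism studied here is iteration of the trained network itself: training examples become near fixed points of the autoencoder map, and iterating the map from a corrupted input can fall back onto a stored example. The toy sketch below trains a one-hidden-layer autoencoder on three points and then iterates it; whether the points also become attractors, as reported for large SGD-trained networks, is not guaranteed at this scale.

```python
import numpy as np

rng = np.random.default_rng(0)
D, H, N = 20, 512, 3                      # data dim, hidden width (overparameterized), #examples
X = rng.normal(size=(N, D))               # three training points the network will memorize

# One-hidden-layer autoencoder f(v) = W2 tanh(W1 v), trained by plain gradient
# descent to reconstruct the training points.
W1 = rng.normal(scale=1 / np.sqrt(D), size=(H, D))
W2 = rng.normal(scale=1 / np.sqrt(H), size=(D, H))
lr = 0.005
for _ in range(3000):
    A = np.tanh(X @ W1.T)                 # (N, H) hidden activations
    err = A @ W2.T - X                    # reconstruction error
    gW2 = err.T @ A / N
    gW1 = ((err @ W2) * (1 - A ** 2)).T @ X / N
    W2 -= lr * gW2
    W1 -= lr * gW1

def f(v):
    return W2 @ np.tanh(W1 @ v)

print("fixed-point error at training points:",
      np.round(np.linalg.norm(np.tanh(X @ W1.T) @ W2.T - X, axis=1), 6))

# Retrieval: start from a corrupted copy of X[0] and repeatedly apply the trained map.
z = X[0] + 0.5 * rng.normal(size=D)
for _ in range(200):
    z = f(z)
print("distance of the iterate to each stored example:",
      np.round(np.linalg.norm(X - z, axis=1), 3))
```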