AIMC Topic: Learning

Showing 731 to 740 of 1400 articles

On the Post Hoc Explainability of Optimized Self-Organizing Reservoir Network for Action Recognition.

Sensors (Basel, Switzerland)
This work proposes a novel unsupervised self-organizing network, called the Self-Organizing Convolutional Echo State Network (SO-ConvESN), for learning node centroids and interconnectivity maps compatible with the deterministic initialization of Echo...
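
The entry above centers on an Echo State Network reservoir. As a point of reference, the NumPy sketch below shows a generic leaky ESN state update with the usual spectral-radius rescaling; the self-organizing centroid learning and convolutional readout that define SO-ConvESN are not reproduced, and all dimensions and constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: 3 input channels, 200 reservoir neurons.
n_in, n_res = 3, 200
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))

# Rescale the recurrent weights so the spectral radius is below 1,
# the rule of thumb typically used when initializing echo state reservoirs.
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(inputs, leak=0.3):
    """Drive the reservoir with an input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy usage: a random 50-step, 3-channel input sequence.
states = run_reservoir(rng.normal(size=(50, n_in)))
print(states.shape)  # (50, 200): one reservoir state per time step
```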

Synaptic Learning With Augmented Spikes.

IEEE transactions on neural networks and learning systems
Traditional neuron models use analog values for information representation and computation, while spiking models employ all-or-nothing spikes. With a more brain-like processing paradigm, spiking neurons are more promising for improvements ...
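
For the all-or-nothing spikes contrasted with analog values above, the sketch below is a minimal leaky integrate-and-fire neuron in plain Python; the paper's augmented-spike scheme is not shown, and the time constant, threshold, and input level are illustrative assumptions.

```python
def lif_neuron(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: the membrane potential integrates the
    input with a leak, and an all-or-nothing spike is emitted at threshold."""
    v = 0.0
    spikes = []
    for i in input_current:
        v += dt * (-v + i) / tau      # leaky integration toward the input level
        if v >= v_thresh:             # threshold crossing -> binary spike
            spikes.append(1)
            v = v_reset               # reset after spiking
        else:
            spikes.append(0)
    return spikes

# Toy usage: a constant suprathreshold drive yields a regular spike train.
print(sum(lif_neuron([1.5] * 500)))
```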

SMGEA: A New Ensemble Adversarial Attack Powered by Long-Term Gradient Memories.

IEEE transactions on neural networks and learning systems
Deep neural networks are vulnerable to adversarial attacks. More importantly, some adversarial examples crafted against an ensemble of source models transfer to other target models and, thus, pose a security threat to black-box applications (when att...
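
The PyTorch sketch below illustrates the general family this attack belongs to: an iterative ensemble attack that averages gradients over the source models and accumulates them with momentum, so earlier gradient directions keep influencing later steps. It is not the SMGEA algorithm itself, and the step sizes, model list, and clamping to [0, 1] images are assumptions.

```python
import torch
import torch.nn.functional as F

def ensemble_momentum_attack(models, x, y, eps=8 / 255, steps=10, mu=1.0):
    """Momentum iterative FGSM against an ensemble of source models (a generic
    sketch of the attack family, not the SMGEA method). Per-step gradients are
    averaged across models and accumulated, which is what lets old gradient
    directions keep shaping the perturbation."""
    alpha = eps / steps
    x_adv = x.clone().detach()
    g = torch.zeros_like(x)                                   # accumulated gradient "memory"
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = sum(F.cross_entropy(m(x_adv), y) for m in models) / len(models)
        grad, = torch.autograd.grad(loss, x_adv)
        g = mu * g + grad / (grad.abs().mean() + 1e-12)       # momentum over normalized gradients
        x_adv = x_adv.detach() + alpha * g.sign()             # ascend the averaged loss
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps) # project back into the eps-ball
        x_adv = torch.clamp(x_adv, 0.0, 1.0).detach()         # keep a valid image
    return x_adv
```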

A novel convolutional neural network for kidney ultrasound images segmentation.

Computer methods and programs in biomedicine
BACKGROUND AND OBJECTIVE: Ultrasound imaging has been widely used in the screening of kidney diseases. The localization and segmentation of the kidneys in ultrasound images are helpful for the clinical diagnosis of diseases. However, it is a challeng...
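
For the segmentation task described above, the PyTorch sketch below is a deliberately tiny encoder-decoder that maps a single-channel image to a per-pixel foreground logit; the paper's kidney-specific network and training pipeline are not reproduced, and the layer sizes are illustrative.

```python
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Minimal encoder-decoder for binary segmentation (illustrative only)."""

    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                          # downsample 2x
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2),  # upsample back to input size
            nn.ReLU(),
            nn.Conv2d(16, 1, 1),                      # per-pixel foreground logit
        )

    def forward(self, x):
        return self.dec(self.enc(x))

# Toy usage: one single-channel 128x128 image -> a 128x128 logit map.
logits = TinySegNet()(torch.randn(1, 1, 128, 128))
print(logits.shape)  # torch.Size([1, 1, 128, 128])
```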

A deep reinforcement transfer convolutional neural network for rolling bearing fault diagnosis.

ISA transactions
Deep neural networks depend heavily on substantial labeled samples when identifying bearing faults. However, in some practical situations, it is very difficult to collect sufficient labeled samples, which limits the application of deep neural networks ...
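
The transfer part of the approach above can be illustrated with a plain parameter-transfer sketch in PyTorch: a feature extractor assumed to be pretrained on a label-rich source domain is frozen, and only a small classifier head is trained on the scarce labeled target samples. The reinforcement-learning component of the paper is omitted, and the network shapes and class count are illustrative.

```python
import torch
import torch.nn as nn

# Stand-in for a 1-D CNN feature extractor trained on a source domain.
pretrained_extractor = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=64, stride=8), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
)
for p in pretrained_extractor.parameters():
    p.requires_grad = False                    # freeze the transferred layers

head = nn.Linear(16, 4)                        # e.g. 4 bearing health states
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)

signals = torch.randn(8, 1, 2048)              # toy vibration segments
labels = torch.randint(0, 4, (8,))
logits = head(pretrained_extractor(signals))
loss = nn.functional.cross_entropy(logits, labels)
loss.backward()                                # only the head receives gradients
optimizer.step()
```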

Efficient learning rate adaptation based on hierarchical optimization approach.

Neural networks: the official journal of the International Neural Network Society
This paper proposes a new hierarchical approach to learning rate adaptation in gradient methods, called learning rate optimization (LRO). LRO formulates the learning rate adaptation problem as a hierarchical optimization problem that minimizes the loss...
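
The abstract frames learning-rate adaptation as an optimization problem over the learning rate itself. A closely related but simpler idea is hypergradient descent, sketched below on a toy quadratic loss: for a plain SGD step, the chain rule gives d(loss)/d(lr) = -g_t · g_{t-1}, so the learning rate can take its own gradient step. This is not the paper's hierarchical LRO algorithm; the loss function, step sizes, and initial values are assumptions.

```python
import numpy as np

def grad(theta):                                # gradient of f(theta) = 0.5 * ||theta||^2
    return theta

theta = np.array([5.0, -3.0])
lr, beta = 0.01, 0.001                          # inner learning rate and its own step size
g_prev = grad(theta)

for step in range(100):
    theta = theta - lr * g_prev                 # lower level: weight update with current lr
    g = grad(theta)
    lr = lr + beta * float(g @ g_prev)          # upper level: d(loss)/d(lr) = -g . g_prev,
    g_prev = g                                  # so descending on lr adds +beta * g . g_prev

print(lr, 0.5 * float(theta @ theta))           # adapted learning rate and final loss
```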

Deep learning-based quantitative analyses of spontaneous movements and their association with early neurological development in preterm infants.

Scientific reports
This study aimed to develop quantitative assessments of spontaneous movements in high-risk preterm infants based on a deep learning algorithm. Video images of spontaneous movements were recorded in very preterm infants at the term-equivalent age. The...

Superposition mechanism as a neural basis for understanding others.

Scientific reports
Social cognition has received much attention in fields such as neuroscience, psychology, cognitive science, and philosophy. Theory-theory (TT) and simulation theory (ST) provide the dominant theoretical frameworks for research on social cognition. Ho...

Evolving Long Short-Term Memory Network-Based Text Classification.

Computational intelligence and neuroscience
Recently, long short-term memory (LSTM) networks have been extensively utilized for text classification. Compared to feed-forward neural networks, they have feedback connections and thus the ability to learn long-term dependencies. However, the LSTM ...
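
As a baseline for the kind of model being evolved above, the PyTorch sketch below is a minimal LSTM text classifier: embed token ids, run the recurrent layer, and classify from the final hidden state. The evolutionary search over architectures and hyperparameters described in the paper is not shown, and the vocabulary and layer sizes are arbitrary.

```python
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    """Minimal LSTM text classifier (a generic baseline sketch)."""

    def __init__(self, vocab_size=10000, embed_dim=100, hidden_dim=128, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, n_classes)

    def forward(self, token_ids):
        _, (h_n, _) = self.lstm(self.embed(token_ids))
        return self.fc(h_n[-1])          # classify from the last hidden state

# Toy usage: a batch of 4 sequences of 20 token ids.
logits = LSTMClassifier()(torch.randint(0, 10000, (4, 20)))
print(logits.shape)  # torch.Size([4, 2])
```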

Pea-KD: Parameter-efficient and accurate Knowledge Distillation on BERT.

PLoS One
Knowledge Distillation (KD) is one of the widely known methods for model compression. In essence, KD trains a smaller student model based on a larger teacher model and tries to retain the teacher model's level of performance as much as possible. Howe...
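
The sketch below shows the standard (Hinton-style) distillation objective that KD methods build on: the student is trained to match the teacher's temperature-softened output distribution while also fitting the hard labels. Pea-KD's parameter-sharing and initialization additions are not reproduced; the temperature and mixing weight are illustrative.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Standard knowledge-distillation loss: KL divergence between the
    temperature-softened student and teacher distributions, mixed with the
    ordinary cross-entropy on the hard labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * T * T                                  # rescale so gradients match the CE term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage: 8 examples, 3 classes.
s, t = torch.randn(8, 3, requires_grad=True), torch.randn(8, 3)
loss = distillation_loss(s, t, torch.randint(0, 3, (8,)))
loss.backward()
```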