AI Medical Compendium Topic:
Learning

Showing 721 to 730 of 1362 articles

A deep reinforcement transfer convolutional neural network for rolling bearing fault diagnosis.

ISA transactions
Deep neural networks depend heavily on substantial labeled samples when identifying bearing faults. However, in some practical situations, it is very difficult to collect sufficient labeled samples, which limits the application of deep neural networks ...

Efficient learning rate adaptation based on hierarchical optimization approach.

Neural networks : the official journal of the International Neural Network Society
This paper proposes a new hierarchical approach to learning rate adaptation in gradient methods, called learning rate optimization (LRO). LRO formulates the learning rate adaptation problem as a hierarchical optimization problem that minimizes the loss...

Deep learning-based quantitative analyses of spontaneous movements and their association with early neurological development in preterm infants.

Scientific reports
This study aimed to develop quantitative assessments of spontaneous movements in high-risk preterm infants based on a deep learning algorithm. Video images of spontaneous movements were recorded in very preterm infants at the term-equivalent age. The...

Superposition mechanism as a neural basis for understanding others.

Scientific reports
Social cognition has received much attention in fields such as neuroscience, psychology, cognitive science, and philosophy. Theory-theory (TT) and simulation theory (ST) provide the dominant theoretical frameworks for research on social cognition. Ho...

Evolving Long Short-Term Memory Network-Based Text Classification.

Computational intelligence and neuroscience
Recently, long short-term memory (LSTM) networks have been extensively utilized for text classification. Compared to feed-forward neural networks, they have feedback connections and thus the ability to learn long-term dependencies. However, the LSTM ...
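The feedback connection mentioned in this abstract can be made concrete with a minimal single-step LSTM sketch in NumPy (an illustration of the standard LSTM cell, not the evolving method the article itself proposes; all names here are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step. The previous hidden state h_prev feeds back into
    the gate computations, which is what lets the network carry long-term
    context across a token sequence."""
    H = h_prev.size
    z = W @ np.concatenate([x, h_prev]) + b   # all four gates in one matmul
    i = sigmoid(z[:H])           # input gate
    f = sigmoid(z[H:2 * H])      # forget gate
    o = sigmoid(z[2 * H:3 * H])  # output gate
    g = np.tanh(z[3 * H:])       # candidate cell update
    c = f * c_prev + i * g       # cell state accumulates long-range information
    h = o * np.tanh(c)           # new hidden state, fed back at the next step
    return h, c
```

For text classification, such a step would be applied across the embedded tokens of a document and the final hidden state passed to a softmax output layer.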

Pea-KD: Parameter-efficient and accurate Knowledge Distillation on BERT.

PloS one
Knowledge Distillation (KD) is one of the widely known methods for model compression. In essence, KD trains a smaller student model based on a larger teacher model and tries to retain the teacher model's level of performance as much as possible. Howe...
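The teacher-student objective this abstract summarizes can be sketched as a temperature-softened distillation loss (a generic KD sketch in NumPy, not the Pea-KD method itself; the weighting `alpha` and temperature `T` are illustrative):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, true_label, T=2.0, alpha=0.5):
    """Generic KD objective: weighted sum of (1) cross-entropy between the
    student and the hard label and (2) KL divergence between the
    temperature-softened teacher and student outputs."""
    p_student = softmax(student_logits)
    hard_loss = -np.log(p_student[true_label] + 1e-12)
    pt = softmax(teacher_logits, T)
    ps = softmax(student_logits, T)
    soft_loss = np.sum(pt * (np.log(pt + 1e-12) - np.log(ps + 1e-12))) * T * T
    return alpha * hard_loss + (1 - alpha) * soft_loss
```

Minimizing this loss pushes the student both toward the correct labels and toward the teacher's full output distribution, which is how the student retains much of the teacher's performance at a fraction of the size.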

Learning to Reason on Tree Structures for Knowledge-Based Visual Question Answering.

Sensors (Basel, Switzerland)
Collaborative reasoning for knowledge-based visual question answering is challenging but vital for efficiently understanding the features of the images and questions. While previous methods jointly fuse all kinds of features by attention mechanism o...

Employing automatic content recognition for teaching methodology analysis in classroom videos.

PloS one
A teacher plays a pivotal role in grooming a society and paves the way for its social and economic development. Teaching is a dynamic role that demands continuous adaptation. A teacher adopts teaching techniques suitable for a certain discipline and a si...

Flexible Neural Network Realized by the Probabilistic SiO Memristive Synaptic Array for Energy-Efficient Image Learning.

Advanced science (Weinheim, Baden-Wurttemberg, Germany)
The human brain's neural networks are sparsely connected via tunable and probabilistic synapses, which may be essential for performing energy-efficient cognitive and intellectual functions. In this sense, the implementation of a flexible neural netwo...

Weak Disambiguation for Partial Structured Output Learning.

IEEE transactions on cybernetics
Existing disambiguation strategies for partial structured output learning cannot generalize well to cases where some candidates are false positives or similar to the ground-truth label. In this article, we propose a n...