BACKGROUND: Artificial intelligence technology is among the most significant advancements that provide students with effective learning opportunities in this digital era. Therefore, the National League for Nursing states that it is necessary to refra...
Adults struggle to learn non-native speech categories in many experimental settings (Goto, Neuropsychologia, 9(3), 317-323, 1971), but learn them efficiently in a video game paradigm where non-native speech sounds have functional significance (Lim & Holt, ...
IEEE transactions on neural networks and learning systems
Aug 5, 2024
Model-based impedance learning control can provide variable impedance regulation for robots through online impedance learning without interaction force sensing. However, the existing related results only guarantee the closed-loop control systems to b...
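For context on the impedance regulation mentioned above, the following is a minimal Python/NumPy sketch of a conventional fixed-gain task-space impedance law; the matrices M_d, D_d, K_d, the function name, and the example values are illustrative assumptions, and unlike the force-sensing-free learning scheme the abstract describes, this baseline law uses a measured interaction force.

import numpy as np

def impedance_command(x, x_dot, x_des, x_des_dot, x_des_ddot,
                      M_d, D_d, K_d, f_ext):
    # Classical target impedance (illustrative, fixed gains):
    #   M_d*(x_dd - x_des_ddot) + D_d*(x_dot - x_des_dot) + K_d*(x - x_des) = f_ext
    # Solve for the commanded Cartesian acceleration x_dd.
    e = x - x_des
    e_dot = x_dot - x_des_dot
    return x_des_ddot + np.linalg.solve(M_d, f_ext - D_d @ e_dot - K_d @ e)

# Example: 2-DOF point with diagonal impedance parameters (values assumed).
M_d = np.diag([1.0, 1.0])      # desired inertia
D_d = np.diag([20.0, 20.0])    # desired damping
K_d = np.diag([100.0, 100.0])  # desired stiffness
x_dd = impedance_command(np.zeros(2), np.zeros(2),
                         np.array([0.1, 0.0]), np.zeros(2), np.zeros(2),
                         M_d, D_d, K_d, f_ext=np.array([0.0, 1.0]))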
Neural Networks: The Official Journal of the International Neural Network Society
Aug 2, 2024
Contrastive learning has emerged as a cornerstone of unsupervised representation learning. Its primary paradigm is an instance discrimination task using the InfoNCE loss, which has been proven to be a form of mutual information. Consequ...
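As background for the InfoNCE loss referenced above, here is a minimal NumPy sketch of the standard instance-discrimination form; the temperature value, cosine similarity, and function name are illustrative assumptions, not the paper's specific formulation.

import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    # Each anchor's positive is the matching row of `positives`;
    # the other rows in the batch act as negatives.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)   # L2-normalise
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                                 # (N, N) similarities
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))    # cross-entropy on the diagonal (positive pairs)

# Toy usage: two augmented views of the same 4 samples.
rng = np.random.default_rng(0)
z1 = rng.normal(size=(4, 8))
z2 = z1 + 0.05 * rng.normal(size=(4, 8))
loss = info_nce(z1, z2)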
Advances in artificial intelligence enable neural networks to learn a wide variety of tasks, yet our understanding of the learning dynamics of these networks remains limited. Here, we study the temporal dynamics during learning of Hebbian feedforward...
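The Hebbian feedforward setting above can be illustrated with a small example; the sketch below applies Oja's normalised Hebbian rule to a random input stream (the specific rule variant, learning rate, and data are assumptions for illustration, not the study's exact setup).

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 10))   # stream of 10-dimensional inputs
w = rng.normal(size=10)           # feedforward weights of a single linear unit
eta = 1e-3                        # learning rate

for x in X:
    y = w @ x                     # unit activation
    w += eta * y * (x - y * w)    # Oja's rule: Hebbian term y*x plus a norm-bounding decay

# Over training, w tends to align with the leading principal component of the inputs.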
Our ability to combine simple constituents into more complex conceptual combinations is a fundamental aspect of cognition. Gradable adjectives (e.g., 'tall' and 'light') are a critical example of this process, as their meanings vary depending on the ...
Neural Networks: The Official Journal of the International Neural Network Society
Jul 22, 2024
Person re-identification (ReID) has made good progress in stationary domains. The ReID model must be retrained to adapt to new scenarios (domains) as they emerge unexpectedly, which leads to catastrophic forgetting. Continual learning trains the mode...
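One common ingredient for the catastrophic forgetting mentioned above is rehearsal from a small exemplar memory; the sketch below shows only reservoir-sampling storage and replay, as a generic illustration rather than the ReID method the abstract describes.

import random

class ReservoirBuffer:
    # Fixed-size exemplar memory filled by reservoir sampling, so every sample
    # seen so far has an equal probability of being retained.
    def __init__(self, capacity=200):
        self.capacity = capacity
        self.data = []
        self.n_seen = 0

    def add(self, sample):
        self.n_seen += 1
        if len(self.data) < self.capacity:
            self.data.append(sample)
        else:
            j = random.randrange(self.n_seen)
            if j < self.capacity:
                self.data[j] = sample

    def replay(self, batch_size=32):
        return random.sample(self.data, min(batch_size, len(self.data)))

# During training on a new domain, interleave current batches with replayed
# exemplars from earlier domains to reduce forgetting.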
It has been proposed that, when processing a stream of events, humans divide their experiences in terms of inferred latent causes (LCs) to support context-dependent learning. However, when shared structure is present across contexts, it is still uncl...
The brain may have evolved a modular architecture for daily tasks, with circuits featuring functionally specialized modules that match the task structure. We hypothesize that this architecture enables better learning and generalization than architect...