AIMC Topic: Learning

Showing 1231 to 1240 of 1372 articles

Large Language Models and the Reverse Turing Test.

Neural computation
Large language models (LLMs) have been transformative. They are pretrained foundational models that are self-supervised and can be adapted with fine-tuning to a wide range of natural language tasks, each of which previously would have required a sepa...

Predicting molecular properties based on the interpretable graph neural network with multistep focus mechanism.

Briefings in bioinformatics
Graph neural networks based on deep learning methods have been extensively applied to molecular property prediction because of their powerful feature-learning ability and good performance. However, most of them are black boxes and cannot give the r...
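As a quick, hedged illustration of the message-passing idea such models rest on (not the paper's multistep focus mechanism, which this snippet does not detail), here is a minimal single-step sketch; the adjacency matrix, atom features, and weights are invented placeholders:

    # Minimal sketch of one message-passing step on a molecular graph:
    # each atom's feature vector is updated from the mean of its neighbours.
    # All values below are illustrative placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    adj = np.array([[0, 1, 0],        # a 3-atom toy molecule
                    [1, 0, 1],
                    [0, 1, 0]], dtype=float)
    h = rng.normal(size=(3, 4))       # 3 atoms x 4 input features
    W = rng.normal(size=(4, 4))       # learnable weights (random here)

    deg = adj.sum(axis=1, keepdims=True)
    messages = (adj @ h) / deg        # mean over each atom's neighbours
    h_next = np.tanh(messages @ W)    # updated atom representations

    print(h_next.shape)               # (3, 4)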

Accelerating artificial intelligence: How federated learning can protect privacy, facilitate collaboration, and improve outcomes.

Health informatics journal
Cross-institution collaborations are constrained by data-sharing challenges. These challenges hamper innovation, particularly in artificial intelligence, where models require diverse data to ensure strong performance. Federated learning (FL) solves d...
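For readers unfamiliar with federated learning, a minimal sketch of the federated-averaging (FedAvg) idea follows; the three "sites", their data, and the linear model are invented placeholders, not the paper's setup:

    # Minimal FedAvg sketch: each site trains on its own private data,
    # and only model weights are shared and averaged by the server.
    import numpy as np

    rng = np.random.default_rng(0)

    def local_train(w, X, y, lr=0.1, epochs=5):
        # A few epochs of full-batch gradient descent on one site's data.
        for _ in range(epochs):
            grad = X.T @ (X @ w - y) / len(y)
            w = w - lr * grad
        return w

    # Three sites holding private samples from the same underlying model.
    w_true = np.array([2.0, -1.0])
    sites = []
    for _ in range(3):
        X = rng.normal(size=(50, 2))
        y = X @ w_true + 0.1 * rng.normal(size=50)
        sites.append((X, y))

    w_global = np.zeros(2)
    for _ in range(10):
        # Each site refines the current global model locally...
        local_ws = [local_train(w_global.copy(), X, y) for X, y in sites]
        # ...and the server averages the weights, never seeing raw data.
        w_global = np.mean(local_ws, axis=0)

    print(w_global)  # approaches w_true without pooling any site's data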

CTCNet: A CNN-Transformer Cooperation Network for Face Image Super-Resolution.

IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Recently, face super-resolution methods steered by deep convolutional neural networks (CNNs) have achieved great progress in restoring degraded facial details through joint training with facial priors. However, these methods have some obvious limitations. On t...

A Developed LSTM-Ladder-Network-Based Model for Sleep Stage Classification.

IEEE transactions on neural systems and rehabilitation engineering : a publication of the IEEE Engineering in Medicine and Biology Society
Sleep staging is crucial for diagnosing sleep-related disorders. The heavy, time-consuming task of manual staging can be relieved by automatic techniques. However, an automatic staging model would have relatively poor performance when working o...

Finding Hierarchical Structure in Binary Sequences: Evidence from Lindenmayer Grammar Learning.

Cognitive science
In this article, we explore the extraction of recursive nested structure in the processing of binary sequences. Our aim was to determine whether humans learn the higher-order regularities of a highly simplified input where only sequential-order infor...
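The snippet does not name the grammar used in the study; as a hedged illustration of how a Lindenmayer grammar yields hierarchically structured binary sequences, here is the standard Fibonacci L-system (0 -> 01, 1 -> 0):

    # Minimal sketch of generating a binary sequence from a Lindenmayer
    # grammar. The Fibonacci L-system here is a standard illustrative
    # choice, not necessarily the grammar used in the study.
    RULES = {"0": "01", "1": "0"}

    def expand(axiom, depth):
        # Rewrite every symbol in parallel, depth times.
        s = axiom
        for _ in range(depth):
            s = "".join(RULES[c] for c in s)
        return s

    # '0100101001001': each level is the previous level followed by the
    # one before it, so the string nests copies of its own prefixes.
    print(expand("0", 5))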

Extracting Low-Dimensional Psychological Representations from Convolutional Neural Networks.

Cognitive science
Convolutional neural networks (CNNs) are increasingly used in psychology and neuroscience to predict how human minds and brains respond to visual images. Typically, CNNs represent these images using thousands of features that are learned throu...
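As a hedged sketch of the general recipe (the paper's own reduction method is not specified in this snippet), the following compresses high-dimensional CNN feature vectors to a handful of dimensions with plain PCA; the random "features" are placeholders for real network activations:

    # Compress thousands of CNN features per image down to a few
    # dimensions. PCA via SVD stands in for whatever reduction the
    # full paper uses; the feature matrix here is synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    features = rng.normal(size=(100, 4096))   # 100 images x 4096 features

    centered = features - features.mean(axis=0)
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    k = 8
    low_dim = centered @ Vt[:k].T             # 100 images x 8 dimensions

    print(low_dim.shape)                      # (100, 8)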

Progressive Interpretation Synthesis: Interpreting Task Solving by Quantifying Previously Used and Unused Information.

Neural computation
A deep neural network is a good task solver, but it is difficult to make sense of its operation, and people hold different ideas about how to interpret it. We look at this problem from a new perspective where the interpretation of task solving...

Recurrent Neural-Linear Posterior Sampling for Nonstationary Contextual Bandits.

Neural computation
An agent in a nonstationary contextual bandit problem should balance between exploration and the exploitation of (periodic or structured) patterns present in its previous experiences. Handcrafting an appropriate historical context is an attractive al...
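As a hedged illustration of the posterior-sampling core that neural-linear methods build on (the recurrent, nonstationary machinery is omitted and all quantities are invented), here is Thompson sampling for a stationary linear contextual bandit:

    # Thompson sampling for a linear contextual bandit: keep a Bayesian
    # linear-regression posterior per arm, sample weights from it, and
    # act greedily on the sampled beliefs.
    import numpy as np

    rng = np.random.default_rng(0)
    d, n_arms, T = 3, 2, 500
    theta_true = rng.normal(size=(n_arms, d))   # unknown per-arm weights

    # Posterior state per arm: precision matrix A and vector b.
    A = [np.eye(d) for _ in range(n_arms)]
    b = [np.zeros(d) for _ in range(n_arms)]

    for t in range(T):
        x = rng.normal(size=d)                  # observed context
        # Sample a plausible weight vector from each arm's posterior...
        samples = [rng.multivariate_normal(np.linalg.solve(A[a], b[a]),
                                           np.linalg.inv(A[a]))
                   for a in range(n_arms)]
        # ...then pick the arm that looks best under the samples.
        arm = int(np.argmax([x @ s for s in samples]))
        reward = x @ theta_true[arm] + 0.1 * rng.normal()
        A[arm] += np.outer(x, x)                # posterior update
        b[arm] += reward * x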

Multi-Type Neighbors Enhanced Global Topology and Pairwise Attribute Learning for Drug-Protein Interaction Prediction.

Briefings in bioinformatics
MOTIVATION: Accurately identifying the proteins that interact with drugs helps reduce the time and cost of drug development. Most previous methods focused on integrating multisource data about drugs and proteins for predicting drug-target interaction...