AIMC Topic: Learning

Showing 571 to 580 of 1399 articles

Location-aware convolutional neural networks for graph classification.

Neural Networks: The Official Journal of the International Neural Network Society
Graph patterns play a critical role in various graph classification tasks; e.g., chemical patterns often determine the properties of molecular graphs. Researchers have devoted much effort to adapting Convolutional Neural Networks (CNNs) to graph classificat...

Pan-Logical Probabilistic Algorithms Based on Convolutional Neural Networks.

Computational Intelligence and Neuroscience
Universal logic is a new kind of flexible logic system that aims to address a variety of uncertain problems. This study investigates the role of convolutional neural networks in assessing probabilistic pan-logic algorithms. A generic logi...

Named Entity Aware Transfer Learning for Biomedical Factoid Question Answering.

IEEE/ACM Transactions on Computational Biology and Bioinformatics
Biomedical factoid question answering is an important task in biomedical question answering applications. It has attracted much attention because of its reliability. In question answering systems, better representation of words is of great importance...

LGLNN: Label Guided Graph Learning-Neural Network for few-shot learning.

Neural Networks: The Official Journal of the International Neural Network Society
Graph Neural Networks (GNNs) have been employed for few-shot learning (FSL) tasks. The aim of GNN-based FSL is to transform the few-shot learning problem into a graph node classification or edge labeling task, which can thus fully explore the relati...

Tweaking Deep Neural Networks.

IEEE Transactions on Pattern Analysis and Machine Intelligence
Deep neural networks are trained to maximize overall accuracy through a learning process on given training data. It is therefore difficult to adjust them to improve the accuracies of specific problematic classes or classes...

Hierarchical and Self-Attended Sequence Autoencoder.

IEEE Transactions on Pattern Analysis and Machine Intelligence
It is important and challenging to infer stochastic latent semantics for natural language applications. The difficulty in stochastic sequential learning is caused by the posterior collapse in variational inference. The input sequence is disregarded i...

Beneficial Perturbation Network for Designing General Adaptive Artificial Intelligence Systems.

IEEE Transactions on Neural Networks and Learning Systems
The human brain is the gold standard of adaptive learning. It can not only learn and benefit from experience but also adapt to new situations. In contrast, deep neural networks learn only one sophisticated but fixed mapping from inputs to output...

Toward the Optimal Design and FPGA Implementation of Spiking Neural Networks.

IEEE Transactions on Neural Networks and Learning Systems
The performance of a biologically plausible spiking neural network (SNN) largely depends on the model parameters and neural dynamics. This article proposes a parameter optimization scheme for improving the performance of a biologically plausible SNN ...

A Plug-in Method for Representation Factorization in Connectionist Models.

IEEE Transactions on Neural Networks and Learning Systems
In this article, we focus on decomposing latent representations in generative adversarial networks or learned feature representations in deep autoencoders into semantically controllable factors in a semisupervised manner, without modifying the origin...

Optimizing Attention for Sequence Modeling via Reinforcement Learning.

IEEE Transactions on Neural Networks and Learning Systems
Attention has been shown to be highly effective for modeling sequences, capturing the more informative parts when learning a deep representation. However, recent studies show that attention values do not always coincide with intuition in tasks such as m...