AIMC Journal:
IEEE Transactions on Neural Networks and Learning Systems

Showing 391 to 400 of 780 articles

Multi-Output Selective Ensemble Identification of Nonlinear and Nonstationary Industrial Processes.

IEEE Transactions on Neural Networks and Learning Systems
A key characteristic of biological systems is the ability to update memory by learning new knowledge and removing out-of-date knowledge, so that intelligent decisions can be made based on the relevant knowledge retained in memory. Inspired by t...

Constructing Accurate and Efficient Deep Spiking Neural Networks With Double-Threshold and Augmented Schemes.

IEEE Transactions on Neural Networks and Learning Systems
Spiking neural networks (SNNs) are considered a potential candidate for overcoming current challenges, such as the high power consumption of artificial neural networks (ANNs); however, there is still a gap between them with respect to the...

Mind the Remainder: Taylor's Theorem View on Recurrent Neural Networks.

IEEE Transactions on Neural Networks and Learning Systems
Recurrent neural networks (RNNs) have gained tremendous popularity in almost every sequence modeling task. Despite this effort, discrete unstructured data such as text, audio, and video remain difficult to embed in the feat...

Quantifying the Alignment of Graph and Features in Deep Learning.

IEEE Transactions on Neural Networks and Learning Systems
We show that the classification performance of graph convolutional networks (GCNs) is related to the alignment between features, graph, and ground truth, which we quantify using a subspace alignment measure (SAM) corresponding to the Frobenius norm o...

An Efficient Memristor-Based Circuit Implementation of Squeeze-and-Excitation Fully Convolutional Neural Networks.

IEEE Transactions on Neural Networks and Learning Systems
Recently, there has been a surge of interest in applying memristors to hardware implementations of deep neural networks due to various desirable properties of the memristor, such as nonvolatility, multivalue, and nanosize. Most existing neural networ...

Modularizing Deep Learning via Pairwise Learning With Kernels.

IEEE Transactions on Neural Networks and Learning Systems
By redefining the conventional notions of layers, we present an alternative view of finitely wide, fully trainable deep neural networks as stacked linear models in feature spaces, leading to a kernel machine interpretation. Based on this construction...

Global-Guided Selective Context Network for Scene Parsing.

IEEE Transactions on Neural Networks and Learning Systems
Recent studies on semantic segmentation exploit contextual information to address the problems of inconsistent parsing predictions for large objects and the neglect of small objects. However, they utilize multilevel contextual information equally acr...

Subtraction Gates: Another Way to Learn Long-Term Dependencies in Recurrent Neural Networks.

IEEE Transactions on Neural Networks and Learning Systems
Recurrent neural networks (RNNs) can remember temporal contextual information over various time steps. The well-known vanishing/exploding gradient problem restricts the ability of RNNs to learn long-term dependencies. The gate mechanism is a well-dev...

A Survey of End-to-End Driving: Architectures and Training Methods.

IEEE Transactions on Neural Networks and Learning Systems
Autonomous driving is of great interest to industry and academia alike. The use of machine learning approaches for autonomous driving has long been studied, but mostly in the context of perception. In this article, we take a deeper look at the so-cal...

Harvesting Ambient RF for Presence Detection Through Deep Learning.

IEEE Transactions on Neural Networks and Learning Systems
This article explores the use of ambient radio frequency (RF) signals for human presence detection through deep learning. Using Wi-Fi signals as an example, we demonstrate that the channel state information (CSI) obtained at the receiver contains rich...