IEEE Transactions on Neural Networks and Learning Systems
May 2, 2022
A key characteristic of biological systems is the ability to update memory by learning new knowledge and removing out-of-date knowledge, so that intelligent decisions can be made based on the relevant knowledge held in memory. Inspired by t...
IEEE Transactions on Neural Networks and Learning Systems
Apr 4, 2022
Spiking neural networks (SNNs) are considered a potential candidate for overcoming current challenges, such as the high power consumption of artificial neural networks (ANNs); however, there is still a gap between them with respect to the...
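For context on the kind of unit SNNs are typically built from, here is a minimal sketch of a leaky integrate-and-fire neuron; the parameters (tau, v_th, v_reset, dt) and the constant input current are illustrative choices and are not taken from the paper above.

    import numpy as np

    # Minimal leaky integrate-and-fire (LIF) neuron, the unit most SNN work builds on.
    # All parameters are illustrative, not taken from the paper.
    def lif_simulate(input_current, tau=20.0, v_th=1.0, v_reset=0.0, dt=1.0):
        v = 0.0
        spikes = []
        for i_t in input_current:
            v += dt / tau * (-v + i_t)   # leaky integration of the membrane potential
            if v >= v_th:                # threshold crossing emits a spike
                spikes.append(1)
                v = v_reset              # hard reset after firing
            else:
                spikes.append(0)
        return np.array(spikes)

    spike_train = lif_simulate(np.full(100, 1.5))
    print(spike_train.sum(), "spikes over 100 time steps")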
IEEE Transactions on Neural Networks and Learning Systems
Apr 4, 2022
Recurrent neural networks (RNNs) have gained tremendous popularity in almost every sequence modeling task. Despite this effort, discrete unstructured data such as text, audio, and video remain difficult to embed in the feat...
IEEE Transactions on Neural Networks and Learning Systems
Apr 4, 2022
We show that the classification performance of graph convolutional networks (GCNs) is related to the alignment between features, graph, and ground truth, which we quantify using a subspace alignment measure (SAM) corresponding to the Frobenius norm o...
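The abstract is cut off before the full definition of SAM, so the following is only a generic subspace alignment quantity built from projection matrices and a Frobenius norm; it is not necessarily the measure the authors define.

    import numpy as np

    # Generic alignment between two subspaces, measured as the Frobenius norm of the
    # difference of their projection matrices. Illustrative stand-in only; the exact
    # SAM in the paper is not visible in the truncated abstract.
    def subspace_alignment(U, V):
        # U, V: (n, k) matrices with orthonormal columns spanning two subspaces.
        P_u = U @ U.T
        P_v = V @ V.T
        return np.linalg.norm(P_u - P_v, ord="fro")

    rng = np.random.default_rng(0)
    U, _ = np.linalg.qr(rng.normal(size=(50, 5)))
    V, _ = np.linalg.qr(rng.normal(size=(50, 5)))
    print(subspace_alignment(U, V))  # 0 would mean identical subspaces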
IEEE Transactions on Neural Networks and Learning Systems
Apr 4, 2022
Recently, there has been a surge of interest in applying memristors to hardware implementations of deep neural networks due to various desirable properties of the memristor, such as nonvolatility, multivalue storage, and nanoscale size. Most existing neural networ...
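As background on why memristors suit neural network hardware, below is a minimal, idealized sketch of a crossbar computing a matrix-vector product through conductances; the conductance range and the differential-pair mapping are assumptions for illustration, not the paper's design.

    import numpy as np

    # Idealized memristor crossbar: weights are mapped onto device conductances and
    # y = W @ x is computed as summed row currents (Ohm's and Kirchhoff's laws).
    # The conductance range and differential mapping are illustrative choices.
    def crossbar_matvec(W, x, g_min=1e-6, g_max=1e-4):
        w_max = np.abs(W).max()
        G_pos = g_min + (g_max - g_min) * np.clip(W, 0, None) / w_max   # positive part
        G_neg = g_min + (g_max - g_min) * np.clip(-W, 0, None) / w_max  # negative part
        i_pos = G_pos @ x            # row currents when x is applied as input voltages
        i_neg = G_neg @ x
        return (i_pos - i_neg) * w_max / (g_max - g_min)  # rescale back to weight units

    W = np.array([[0.5, -0.2], [0.1, 0.3]])
    x = np.array([1.0, 2.0])
    print(crossbar_matvec(W, x), "vs exact", W @ x)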
IEEE Transactions on Neural Networks and Learning Systems
Apr 4, 2022
By redefining the conventional notions of layers, we present an alternative view of finitely wide, fully trainable deep neural networks as stacked linear models in feature spaces, leading to a kernel machine interpretation. Based on this construction...
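A schematic reading of the "stacked linear models in feature spaces" view: the last layer is a linear model on the learned feature map phi(x), which induces the kernel k(x, x') = phi(x) . phi(x'). The sizes and activation below are illustrative, and the paper's exact construction may differ.

    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(16, 3))   # hidden-layer weights (illustrative sizes)
    w2 = rng.normal(size=16)        # output-layer weights

    def phi(x):                     # features produced by the hidden layer
        return np.tanh(W1 @ x)

    def f(x):                       # network output = linear model on phi(x)
        return w2 @ phi(x)

    def k(x, x_prime):              # kernel induced by the learned feature map
        return phi(x) @ phi(x_prime)

    x1, x2 = rng.normal(size=3), rng.normal(size=3)
    print(f(x1), k(x1, x2))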
IEEE Transactions on Neural Networks and Learning Systems
Apr 4, 2022
Recent studies on semantic segmentation exploit contextual information to address the problem of inconsistent parsing predictions for large objects and the neglect of small objects. However, they utilize multilevel contextual information equally acr...
IEEE Transactions on Neural Networks and Learning Systems
Apr 4, 2022
Recurrent neural networks (RNNs) can remember temporal contextual information over various time steps. The well-known vanishing/exploding gradient problem restricts the ability of RNNs to learn long-term dependencies. The gate mechanism is a well-dev...
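A minimal sketch of a generic gate mechanism (a single GRU-style update gate): the gate mixes the old state with a candidate state, which slows the shrinking of gradients seen in a plain tanh RNN. The specific gating studied in the paper is not visible in the truncated abstract, so all names and shapes here are illustrative.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # One gated recurrent step: the update gate z in (0, 1) decides how much of
    # the old state to keep. Generic illustration, not the paper's design.
    def gated_step(h, x, Wz, Uz, Wh, Uh):
        z = sigmoid(Wz @ x + Uz @ h)          # update gate
        h_tilde = np.tanh(Wh @ x + Uh @ h)    # candidate state
        return (1 - z) * h + z * h_tilde      # convex mix of old and new state

    rng = np.random.default_rng(0)
    d_h, d_x = 4, 3
    Wz, Uz = rng.normal(size=(d_h, d_x)), rng.normal(size=(d_h, d_h))
    Wh, Uh = rng.normal(size=(d_h, d_x)), rng.normal(size=(d_h, d_h))
    h = np.zeros(d_h)
    for x in rng.normal(size=(5, d_x)):       # run five time steps
        h = gated_step(h, x, Wz, Uz, Wh, Uh)
    print(h)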
IEEE Transactions on Neural Networks and Learning Systems
Apr 4, 2022
Autonomous driving is of great interest to industry and academia alike. The use of machine learning approaches for autonomous driving has long been studied, but mostly in the context of perception. In this article, we take a deeper look at the so-cal...
IEEE Transactions on Neural Networks and Learning Systems
Apr 4, 2022
This article explores the use of ambient radio frequency (RF) signals for human presence detection through deep learning. Using Wi-Fi signals as an example, we demonstrate that the channel state information (CSI) obtained at the receiver contains rich...
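A minimal sketch of the kind of deep model one might apply to windows of CSI amplitudes for presence detection; the architecture, input shape, and preprocessing are assumptions for illustration, not the pipeline described in the article.

    import torch
    import torch.nn as nn

    # Tiny classifier over CSI windows (subcarriers x time samples).
    # Layer sizes and the 1-D CNN choice are illustrative assumptions.
    class PresenceNet(nn.Module):
        def __init__(self, n_subcarriers=30):
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv1d(n_subcarriers, 16, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),
            )
            self.head = nn.Linear(16, 2)   # two classes: empty room vs. presence

        def forward(self, csi):            # csi: (batch, subcarriers, time)
            z = self.conv(csi).squeeze(-1)
            return self.head(z)

    model = PresenceNet()
    csi_window = torch.randn(8, 30, 100)   # 8 windows, 30 subcarriers, 100 samples
    print(model(csi_window).shape)         # -> torch.Size([8, 2])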