A Probabilistic Synapse With Strained MTJs for Spiking Neural Networks
Journal:
IEEE Transactions on Neural Networks and Learning Systems
Published Date:
Jun 18, 2019
Abstract
Spiking neural networks (SNNs) are of interest for applications in which conventional computing suffers from the nearly insurmountable memory-processor bottleneck. This paper presents a stochastic SNN architecture built on specialized logic-in-memory synaptic units, yielding a processing system with massively parallel processing power. The proposed synaptic unit consists of strained magnetic tunnel junction (MTJ) devices and transistors. The MTJs in our synapse serve a dual purpose, acting as both random bit generators and general-purpose memory. Our neurons are modeled as integrate-and-fire components with thresholding and refraction. The circuit is implemented in 28-nm CMOS technology compatible with the MTJ technology. Our design shows that the required area for the proposed synapse is only [Formula: see text]. When idle, the synapse consumes 675 pW; when firing, propagating a spike requires 8.87 fJ. We then demonstrate an SNN that learns without supervision and classifies handwritten digits from the MNIST database. Simulation results show that our network maintains high classification efficiency even in the presence of fabrication variability.
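The two behavioral building blocks named in the abstract, an integrate-and-fire neuron with a threshold and refractory period, and a synapse that transmits stochastically (the role the MTJ plays as a random bit generator), can be sketched in software as follows. This is a minimal illustrative model, not the paper's circuit; all parameter values and names here are assumptions chosen for the example.

```python
import random

class IFNeuron:
    """Integrate-and-fire neuron with thresholding and refraction.

    Parameter values are illustrative, not taken from the paper.
    """

    def __init__(self, threshold=1.0, refractory_steps=3):
        self.threshold = threshold          # firing threshold on membrane potential
        self.refractory_steps = refractory_steps
        self.potential = 0.0                # accumulated (integrated) input
        self.refractory_left = 0            # time steps of refraction remaining

    def step(self, input_current):
        """Advance one time step; return True if the neuron fires."""
        # During refraction the neuron ignores input and cannot fire.
        if self.refractory_left > 0:
            self.refractory_left -= 1
            return False
        self.potential += input_current     # integrate
        if self.potential >= self.threshold:
            # Fire, reset the potential, and enter the refractory period.
            self.potential = 0.0
            self.refractory_left = self.refractory_steps
            return True
        return False

def stochastic_synapse(weight, p_transmit, rng=random):
    """Pass the stored weight with probability p_transmit, else nothing.

    This mimics the MTJ's dual role: it stores the weight (memory) and
    supplies the random bit that gates transmission.
    """
    return weight if rng.random() < p_transmit else 0.0
```

For example, a neuron with `threshold=1.0` driven by inputs of 0.6 fires on the second step (0.6 + 0.6 ≥ 1.0), then stays silent for `refractory_steps` steps regardless of input.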