Published on Tue Sep 12 2017

Spatio-temporal Learning with Arrays of Analog Nanosynapses

Christopher H. Bennett, Damien Querlioz, Jacques-Olivier Klein

Nanosynapses perform both on-chip projection and regression operations. The system achieves nearly perfect (99%) accuracy at a sufficient hidden layer size.

Abstract

Emerging nanodevices such as resistive memories are being considered for hardware realizations of a variety of artificial neural networks (ANNs), including highly promising online variants of the learning approaches known as reservoir computing (RC) and the extreme learning machine (ELM). We propose an RC/ELM-inspired learning system built with nanosynapses that performs both on-chip projection and regression operations. To address time-dynamic tasks, the hidden neurons of our system perform spatio-temporal integration and can be further enhanced with variable sampling or multiple activation windows. We detail the system and show its use in conjunction with a highly analog nanosynapse device on a standard task with intrinsic timing dynamics: the TI-46 battery of spoken digits. The system achieves nearly perfect (99%) accuracy at a sufficient hidden layer size, which compares favorably with software results. The model is then extended to a larger dataset, the MNIST database of handwritten digits. By translating the database into the time domain and using variable integration windows, up to 95% classification accuracy is achieved. In addition to an intrinsically low-power programming style, the proposed architecture learns very quickly and can easily be converted into a spiking system with negligible loss in performance, all features that confer significant energy efficiency.
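As a concrete illustration of the pipeline the abstract outlines, the sketch below implements its software analogue: a fixed random projection, hidden neurons that leakily integrate their inputs over time, and a one-shot ridge-regression readout. Every name and parameter here (the layer sizes, the leak rate, the `hidden_features`/`train_readout` helpers) is an illustrative assumption; the paper's actual system realizes these stages with analog nanosynapse arrays.

```python
# Minimal software sketch of the RC/ELM-style pipeline from the abstract:
# a fixed random projection (the "projection" array), hidden neurons that
# leakily integrate inputs over time (spatio-temporal integration), and a
# trained linear readout (the "regression" array). Sizes and the leak rate
# are illustrative assumptions, not the paper's device-level values.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_classes = 64, 500, 10   # assumed layer sizes
leak = 0.9                                # assumed integration leak rate

W_in = rng.uniform(-1.0, 1.0, size=(n_hidden, n_in))  # fixed random projection

def hidden_features(x_seq):
    """Leaky spatio-temporal integration of one input sequence.

    x_seq: array of shape (T, n_in). Returns the hidden state after the
    final time step, used as the feature vector for the readout.
    """
    h = np.zeros(n_hidden)
    for x_t in x_seq:
        h = leak * h + np.tanh(W_in @ x_t)  # integrate, leak, squash
    return h

def train_readout(sequences, labels, ridge=1e-2):
    """Solve the readout weights in one shot by ridge regression (as in ELM)."""
    H = np.stack([hidden_features(s) for s in sequences])  # (N, n_hidden)
    Y = np.eye(n_classes)[labels]                          # one-hot targets
    # Regularized least squares: W_out = (H^T H + ridge*I)^-1 H^T Y
    return np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ Y)

def predict(x_seq, W_out):
    return int(np.argmax(hidden_features(x_seq) @ W_out))
```

The variable sampling and multiple activation windows mentioned in the abstract would, in this sketch, correspond to reading the hidden state out at several intermediate time steps and concatenating those snapshots into a larger feature vector before regression.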

Related Papers

Mon Dec 17 2018
Neural Networks
Deep learning incorporating biologically-inspired neural dynamics
Neural networks have become the key technology of artificial intelligence. Spiking Neural Networks (SNNs) have held great promise because of their temporal dynamics and high power efficiency. Spiking neural units (SNUs) bridge the biologically-inspired SNNs with ANNs and provide a methodology for the seamless inclusion of spiking neurons in deep learning architectures.
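A minimal sketch of the kind of spiking recurrent unit this summary describes: a leaky integrate-and-fire membrane state with spike-triggered reset, written as a standard recurrent update so it can sit inside a deep learning architecture. The exact update rule and parameter names below are assumptions for illustration, not the paper's definitive SNU formulation.

```python
# One time step of an SNU-style cell (illustrative, assumed dynamics):
# the membrane state integrates input, leaks, and resets where a spike
# fired on the previous step; the output spikes on threshold crossing.
import numpy as np

def snu_step(x_t, s_prev, y_prev, W, decay=0.8, threshold=1.0):
    # Membrane state: integrate input, apply leak, reset spiked neurons.
    s_t = np.maximum(W @ x_t + decay * s_prev * (1.0 - y_prev), 0.0)
    # Output: emit a spike wherever the state crosses the threshold.
    y_t = (s_t >= threshold).astype(float)
    return s_t, y_t
```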
Thu Apr 30 2020
Neural Networks
Memristors -- from In-memory computing, Deep Learning Acceleration, Spiking Neural Networks, to the Future of Neuromorphic and Bio-inspired Computing
Machine learning, particularly in the form of deep learning, has driven most of the recent fundamental developments in artificial intelligence. This paper reviews the case for a novel beyond-CMOS hardware technology, memristors, as a potential solution for the implementation of power-efficient in-memory computing.
Tue May 28 2019
Neural Networks
Supervised Learning in Spiking Neural Networks with Phase-Change Memory Synapses
Spiking neural networks (SNNs) are artificial computational models inspired by the brain's ability to naturally encode and process information in the time domain. The added temporal dimension is believed to render them more computationally efficient than conventional artificial neural networks.
Sat Nov 14 2015
Neural Networks
Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines
Synaptic Sampling Machines are a class of neural network models that use stochasticity as a means of Monte Carlo sampling and unsupervised learning. Similar to the original formulation of Boltzmann machines, these models can be viewed as a stochastic counterpart of Hopfield networks.
Fri Jan 11 2019
Neural Networks
Low-Power Neuromorphic Hardware for Signal Processing Applications
Machine learning has emerged as the dominant tool for implementing complex cognitive tasks that require supervised, unsupervised, and reinforcement learning. Most state-of-the-art machine learning solutions are based on memory-less models of neurons. Inspired by the time-encoding mechanism used by the brain, third generation spiking neural networks encode information in the timing of spikes.
Sun Aug 16 2020
Neural Networks
Supervised Learning with First-to-Spike Decoding in Multilayer Spiking Neural Networks
A new method can train multilayer spiking networks to solve classification problems. The proposed learning rule supports multiple spikes fired by stochastic hidden neurons, and it remains stable by relying on first-spike responses generated by a deterministic output layer.
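The first-to-spike decoding this summary refers to can be sketched briefly: the predicted class is the output neuron that fires earliest. The spike-train representation below is an assumption for illustration, not the paper's model.

```python
# First-to-spike decoding sketch: pick the output neuron with the
# earliest spike. Spike trains are assumed to be 0/1 arrays per time step.
import numpy as np

def first_to_spike_class(output_spike_trains):
    """output_spike_trains: array (n_classes, T) of 0/1 spikes.
    Returns the index of the neuron that spikes first (ties broken by
    lower index); neurons that never fire are treated as spiking at T."""
    n_classes, T = output_spike_trains.shape
    first_times = np.where(output_spike_trains.any(axis=1),
                           output_spike_trains.argmax(axis=1), T)
    return int(first_times.argmin())
```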