Challenges and opportunities toward online training acceleration using RRAM-based hardware neural network

Chih Cheng Chang, Jen Chieh Liu, Yu Lin Shen, Teyuh Chou, Pin Chun Chen, I. Ting Wang, Chih Chun Su, Ming Hong Wu, Boris Hudec, Che Chia Chang, Chia-Ming Tsai, Tian-Sheuan Chang, H.-S. Philip Wong, Tuo-Hung Hou

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

13 Scopus citations

Abstract

This paper highlights the feasible routes of using resistive memory (RRAM) for accelerating online training of deep neural networks (DNNs). A high degree of asymmetric nonlinearity in analog RRAMs could be tolerated when weight update algorithms are optimized with reduced training noise. Hybrid-weight Net (HW-Net), a modified multilayer perceptron (MLP) algorithm that utilizes hybrid internal analog and external binary weights, is also proposed. Highly accurate online training could be realized using simple binary RRAMs that have already been widely developed as digital memory.
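The hybrid-weight idea in the abstract can be sketched in software. The sketch below assumes the internal analog weight accumulates small updates while a sign-binarized external weight (the +1/-1 states a binary RRAM would store) is used for the forward pass — a scheme in the spirit of BinaryConnect-style training. All names and details are illustrative assumptions, not the exact HW-Net algorithm from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def binarize(w):
    """Map internal analog weights to external binary weights (+1/-1),
    as a binary RRAM cell would store them."""
    return np.where(w >= 0.0, 1.0, -1.0)

# Toy single-layer perceptron learning a linearly separable rule
# (hypothetical data; the target weights are arbitrary).
X = rng.standard_normal((200, 4))
true_w = np.array([1.0, -2.0, 0.5, 3.0])
y = (X @ true_w > 0).astype(float)

w_internal = rng.standard_normal(4) * 0.1   # analog internal weights
lr = 0.01

for _ in range(50):
    w_ext = binarize(w_internal)            # external binary weights
    pred = (X @ w_ext > 0).astype(float)    # forward pass uses binary weights
    # Perceptron-style update is accumulated into the analog weights only;
    # the binary weights change only when an analog weight crosses zero.
    w_internal += lr * X.T @ (y - pred)

acc = np.mean((X @ binarize(w_internal) > 0).astype(float) == y)
print(f"training accuracy with binary external weights: {acc:.2f}")
```

The design point this illustrates is that the fine-grained update resolution lives in the analog accumulator, so the external (binary-RRAM) weights only need two reliable states.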

Original language: English
Title of host publication: 2017 IEEE International Electron Devices Meeting, IEDM 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 11.6.1-11.6.4
ISBN (Electronic): 9781538635599
DOIs
State: Published - 23 Jan 2018
Event: 63rd IEEE International Electron Devices Meeting, IEDM 2017 - San Francisco, United States
Duration: 2 Dec 2017 - 6 Dec 2017

Publication series

Name: Technical Digest - International Electron Devices Meeting, IEDM
ISSN (Print): 0163-1918

Conference

Conference: 63rd IEEE International Electron Devices Meeting, IEDM 2017
Country: United States
City: San Francisco
Period: 2/12/17 - 6/12/17

