RePIM: Joint Exploitation of Activation and Weight Repetitions for In-ReRAM DNN Acceleration

Chen Yang Tsai, Chin Fu Nien, Tz Ching Yu, Hung Yu Yeh, Hsiang Yun Cheng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

Eliminating redundant computations is a common approach to improving the performance of ReRAM-based DNN accelerators. While existing practical ReRAM-based accelerators eliminate a portion of the redundant computations by exploiting sparsity in inputs and weights or by utilizing the weight patterns of DNN models, they fail to identify all of the redundancy, leaving many unnecessary computations. We therefore propose RePIM, a practical design that is the first to jointly exploit repetition in both inputs and weights. Our evaluation shows that RePIM effectively eliminates these unnecessary computations, achieving an average 15.24× speedup and 96.07% energy savings over the state-of-the-art practical ReRAM-based accelerator.
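To make the idea of repetition-driven computation reuse concrete, here is a minimal software sketch. It is purely illustrative and not RePIM's in-ReRAM hardware design; the function name `dot_with_reuse` and the sample values are hypothetical. The sketch memoizes repeated activation-weight pairs in a toy dot product, so each unique pair pays for only one multiplication, mirroring the intuition that repeated values make many multiplications unnecessary.

```python
# Illustrative sketch only (not RePIM's hardware): reuse results when the
# same (activation, weight) pair recurs, so repeated pairs cost one multiply.

def dot_with_reuse(activations, weights):
    """Toy dot product that memoizes repeated activation-weight products."""
    assert len(activations) == len(weights)
    product_cache = {}   # (activation, weight) -> cached product
    multiplies = 0       # number of actual multiplications performed
    total = 0
    for a, w in zip(activations, weights):
        key = (a, w)
        if key not in product_cache:
            product_cache[key] = a * w
            multiplies += 1  # only unique pairs pay for a multiplication
        total += product_cache[key]
    return total, multiplies

# Quantized values repeat often, so many multiplications can be skipped.
acts = [3, 0, 3, 7, 3, 0, 7, 3]
wts  = [2, 5, 2, 1, 2, 5, 1, 2]
result, mults = dot_with_reuse(acts, wts)
print(result, mults)  # 8 terms accumulated with only 3 unique multiplications
```

In this toy example, 8 multiply-accumulate terms are served by 3 unique multiplications; RePIM pursues the same kind of reuse, but jointly across activation and weight repetitions inside the ReRAM crossbar computation.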

Original language: English
Title of host publication: 2021 58th ACM/IEEE Design Automation Conference, DAC 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 589-594
Number of pages: 6
ISBN (Electronic): 9781665432740
DOIs
State: Published - 5 December 2021
Externally published: Yes
Event: 58th ACM/IEEE Design Automation Conference, DAC 2021 - San Francisco, United States
Duration: 5 December 2021 - 9 December 2021

Publication series

Name: Proceedings - Design Automation Conference
Volume: 2021-December
ISSN (Print): 0738-100X

Conference

Conference: 58th ACM/IEEE Design Automation Conference, DAC 2021
Country/Territory: United States
City: San Francisco
Period: 05/12/21 - 09/12/21

Bibliographical note

Publisher Copyright:
© 2021 IEEE.
