Open Access System for Information Sharing



Hierarchical Approximate Memory for Deep Neural Network Applications

Title
Hierarchical Approximate Memory for Deep Neural Network Applications
Authors
LEE, SUNG GU; LEE, YOUNGJOO; KIM, JEONGHUN; HWANG, SEOKHA; HA, MINHO
Date Issued
2020-11-02
Publisher
IEEE Signal Processing Society
Abstract
Power consumed by a computer memory system can be significantly reduced if a certain level of error is permitted in the data stored in memory. Such an approximate memory approach is viable for applications built on deep neural networks (DNNs) because such applications are typically error-resilient. In this paper, the use of hierarchical approximate memory for DNNs is studied and modeled. Whereas previous research has focused on approximate memory for specific memory technologies, this work considers approximate memory across the entire memory hierarchy of a computer system by budgeting the allowable error for a given target application. This paper proposes a system model based on the error budget (the amount by which the memory error rate is permitted to rise) for a target application and the power usage characteristics of the constituent memory technologies of the hierarchy. Using DNN case studies involving SRAM, DRAM, and NAND, this paper shows that overall memory power consumption can be reduced by up to 43.38% by using the proposed model to optimally divide the available error budget.
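
The error-budget idea summarized in the abstract can be sketched in code: given a power-versus-tolerated-error characteristic for each level of the hierarchy, split a total error budget across SRAM, DRAM, and NAND so that total memory power is minimized. The Python sketch below is an illustration of that idea only; the power curves, the unweighted power sum, and the brute-force grid search are assumptions made for this example, not the paper's actual model, data, or the source of its 43.38% result.

```python
# Toy error-budget allocation across a memory hierarchy (illustrative only).
# Power curves and the grid-search optimizer are assumptions for this sketch.
from itertools import product

BUDGET_SCALE = 1e-3  # hypothetical maximum tolerable bit-error rate per level

# Hypothetical relative-power models: each level saves power as it is allowed
# to tolerate more errors (e.g., lower SRAM voltage, longer DRAM refresh,
# fewer NAND program/verify steps), with diminishing returns (concave curves).
def sram_power(err):
    return 1.0 - 0.5 * min(err / BUDGET_SCALE, 1.0) ** 0.5

def dram_power(err):
    return 1.0 - 0.4 * min(err / BUDGET_SCALE, 1.0) ** 0.5

def nand_power(err):
    return 1.0 - 0.3 * min(err / BUDGET_SCALE, 1.0) ** 0.5

LEVELS = [("SRAM", sram_power), ("DRAM", dram_power), ("NAND", nand_power)]

def best_allocation(total_budget, steps=20):
    """Grid-search the split of a total error budget across the levels that
    minimizes the (unweighted) sum of relative memory power."""
    grid = [total_budget * i / steps for i in range(steps + 1)]
    best = None
    for alloc in product(grid, repeat=len(LEVELS)):
        if sum(alloc) > total_budget * (1 + 1e-9):
            continue  # allocation exceeds the application's error budget
        power = sum(model(e) for (_, model), e in zip(LEVELS, alloc))
        if best is None or power < best[0]:
            best = (power, alloc)
    return best

if __name__ == "__main__":
    power, alloc = best_allocation(total_budget=BUDGET_SCALE)
    for (name, _), e in zip(LEVELS, alloc):
        print(f"{name}: tolerated error rate {e:.2e}")
    print(f"relative power: {power:.3f} (vs. 3.000 with no approximation)")
```

With these assumed concave curves, the search splits the budget across all three levels rather than giving it all to one, which is the intuition behind dividing a single application-level error budget over the whole hierarchy.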
URI
https://oasis.postech.ac.kr/handle/2014.oak/105795
ISSN
1058-6393
Article Type
Conference
Citation
54th Asilomar Conference on Signals, Systems, and Computers, 2020-11-02
Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Researcher

이승구 (LEE, SUNG GU)
Dept. of Electrical Engineering
