Open Access System for Information Sharing


Time-step interleaved weight reuse for LSTM neural network computing

Title
Time-step interleaved weight reuse for LSTM neural network computing
Authors
Park, Naebeom; Kim, Yulhwa; Ahn, Daehyun; Kim, Taesu; Kim, Jae-Joon
Date Issued
2020-08-11
Publisher
Association for Computing Machinery
Abstract
In Long Short-Term Memory (LSTM) neural network models, the weight matrix tends to be loaded repeatedly from DRAM when the on-chip storage of the processor is not large enough to hold the entire matrix. To alleviate the heavy overhead of DRAM accesses for weight loading in LSTM computations, we propose a weight reuse scheme that exploits weight sharing between two adjacent time-step computations. Experimental results show that the proposed weight reuse scheme reduces energy consumption by 28.4-57.3% and increases overall throughput by 110.8% compared to conventional schemes.
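
The underlying observation in the abstract is that the same weight matrix is applied at every time step, so a weight tile fetched from DRAM can serve two adjacent time steps before it is evicted. The following is only a minimal sketch of that amortization idea for the input-to-hidden projection, not the authors' implementation; the tile size, the simulated "load" of each tile, and the function name are illustrative assumptions.

```python
# Minimal sketch (illustrative only, not the paper's scheme): amortize one
# weight-tile load over two adjacent time steps when on-chip storage cannot
# hold the full LSTM weight matrix.
import numpy as np

def input_projection_interleaved(W_x, x_seq, tile_rows=64):
    """Compute W_x @ x_t for every time step, streaming W_x in row tiles.

    A conventional schedule re-fetches every tile once per time step; here
    each tile is fetched once per pair of time steps (t, t+1), halving the
    weight traffic for this projection.
    """
    T = len(x_seq)
    out = np.zeros((T, W_x.shape[0]))
    for t in range(0, T, 2):                       # process time steps in pairs
        for r in range(0, W_x.shape[0], tile_rows):
            tile = W_x[r:r + tile_rows]            # one simulated DRAM load of this tile
            out[t, r:r + tile_rows] = tile @ x_seq[t]
            if t + 1 < T:                          # reuse the same tile for time step t+1
                out[t + 1, r:r + tile_rows] = tile @ x_seq[t + 1]
    return out
```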
URI
https://oasis.postech.ac.kr/handle/2014.oak/106068
Article Type
Conference
Citation
2020 ACM/IEEE International Symposium on Low Power Electronics and Design, ISLPED 2020, 2020-08-11
Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Kim, Jae-Joon
Dept. of Convergence IT Engineering
