Open Access System for Information Sharing

Title
TF-MVP: Novel Sparsity-Aware Transformer Accelerator with Mixed-Length Vector Pruning
Authors
Yoo, Eunji; Park, Gunho; Min, Jung Gyu; Jung Kwon, Se; Park, Baeseong; Lee, Dongsoo; Lee, Youngjoo
Date Issued
2023-07-11
Publisher
Institute of Electrical and Electronics Engineers Inc.
Abstract
We present TF-MVP, an energy-efficient sparsity-aware transformer accelerator, built on novel algorithm-hardware co-optimization techniques. Starting from a conventional fine-grained pruning map, for the first time, we develop the direction strength metric to quantitatively analyze pruning patterns, indicating the dominant pruning direction and vector size of each layer. We then propose mixed-length vector pruning (MVP) to generate a hardware-friendly pruned transformer model, which is fully supported by the TF-MVP accelerator with its reconfigurable PE structure. Implemented in a 28nm CMOS technology with 4096 multiply-accumulate operators, TF-MVP achieves 377 GOPs/W when accelerating the GPT-2 small model, 2.09 times better than the state-of-the-art sparsity-aware transformer accelerator.
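The paper's exact definition of direction strength is not given in this record. As a rough illustration of the idea it names (quantifying whether pruned weights cluster into row-wise or column-wise vectors), the sketch below assumes a binary pruning mask (1 = pruned) and scores each direction by the fraction of pruned weights that fall inside contiguous runs of at least a minimum vector length; the function names and the specific score are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def run_lengths(bits):
    """Lengths of consecutive runs of 1s in a 1-D 0/1 array."""
    padded = np.concatenate(([0], bits, [0]))
    edges = np.flatnonzero(np.diff(padded))   # run start/end positions
    return edges[1::2] - edges[0::2]

def direction_strength(mask, min_len=4):
    """Hypothetical proxy for 'direction strength': for a binary pruning
    mask (1 = pruned weight), the fraction of pruned weights that sit in
    row-wise vs column-wise runs of at least `min_len` elements. A larger
    score in one direction suggests that direction (and that vector size)
    dominates the layer's pruning pattern."""
    def frac(m):
        total = m.sum()
        if total == 0:
            return 0.0
        runs = np.concatenate([run_lengths(row) for row in m])
        return runs[runs >= min_len].sum() / total
    return {"row": frac(mask), "col": frac(mask.T)}

# Toy mask: pruning clustered along one row, plus an isolated pruned weight.
mask = np.zeros((4, 8), dtype=int)
mask[1, 0:6] = 1   # a length-6 row-wise pruned vector
mask[3, 2] = 1     # an isolated pruned weight
s = direction_strength(mask)   # row score dominates the column score
```

On a real pruned layer, comparing the two scores across candidate vector lengths would indicate which pruning direction and granularity a reconfigurable PE array should be specialized for, which is the kind of per-layer decision the abstract attributes to MVP.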
URI
https://oasis.postech.ac.kr/handle/2014.oak/121304
Article Type
Conference
Citation
60th ACM/IEEE Design Automation Conference, DAC 2023, 2023-07-11
Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Lee, Youngjoo (이영주)
Dept. of Electrical Engineering
