Open Access System for Information Sharing


Thesis
Full metadata record
Files in This Item:
There are no files associated with this item.
DC Field | Value | Language
dc.contributor.author | 권혁준 | -
dc.date.accessioned | 2022-03-29T02:48:57Z | -
dc.date.available | 2022-03-29T02:48:57Z | -
dc.date.issued | 2021 | -
dc.identifier.other | OAK-2015-08247 | -
dc.identifier.uri | http://postech.dcollection.net/common/orgView/200000366536 | ko_KR
dc.identifier.uri | https://oasis.postech.ac.kr/handle/2014.oak/111052 | -
dc.description | Master | -
dc.description.abstract | To improve weight-scheduling efficiency in neural networks with low weight density, this thesis 1) builds a denser network through weight channel merging (channel-merging) to raise scheduling efficiency, and 2) proposes and evaluates a hardware accelerator that can handle the merged weight channels. | -
dc.description.abstract | In this thesis, a channel-merging offline scheduling scheme is presented to improve the efficiency of the previous offline scheduler on highly pruned convolutional neural networks (CNNs). In the channel-merging step, two channels in the same layer are merged lane-wise to increase the network's channel-level sparsity. A modified hardware architecture is also presented to handle the merged and scheduled weights. Combined with the zero-skip and outlier-aware scheduling schemes of the previous accelerator, the proposed merging and scheduling method achieves higher lane utilization and speedup. Despite the small area overhead of the proposed hardware, faster computation and reduced memory accesses make its energy consumption lower than that of the previous hardware. (An illustrative lane-wise merging sketch follows this record.) | -
dc.language | eng | -
dc.publisher | 포항공과대학교 | -
dc.title | A Channel Merging Approach to Control Sparsity in Neural Networks | -
dc.title.alternative | 채널 병합을 통한 신경망 밀집도 제어 | -
dc.type | Thesis | -
dc.contributor.college | Graduate School, Department of Electronic and Electrical Engineering | -
dc.date.degree | 2021-2 | -
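
The abstract describes merging two pruned weight channels from the same layer lane-wise, so that positions left empty by zeros in one channel carry the nonzero weights of the other. The Python sketch below is a minimal illustration of that idea under stated assumptions, not the thesis's actual scheduler or hardware model: the flattened 1-D channel shape, the element-wise conflict rule, and the name merge_lanewise are invented for this example.

    import numpy as np

    def merge_lanewise(ch_a, ch_b):
        """Merge two equally shaped pruned weight channels element-wise.

        Returns (merged, ok). 'merged' takes the nonzero weight of whichever
        channel provides one at each position; 'ok' is False when both
        channels are nonzero at the same position, i.e. the pair conflicts
        and cannot be merged without dropping a weight.
        """
        ch_a = np.asarray(ch_a, dtype=float)
        ch_b = np.asarray(ch_b, dtype=float)
        conflict = (ch_a != 0) & (ch_b != 0)
        if conflict.any():
            return None, False
        merged = np.where(ch_a != 0, ch_a, ch_b)
        return merged, True

    # Two heavily pruned channels (flattened filters) with disjoint nonzero lanes.
    a = np.array([0.5, 0.0, 0.0, 0.0, -1.2, 0.0, 0.0, 0.0])
    b = np.array([0.0, 0.0, 0.3, 0.0,  0.0, 0.0, 0.7, 0.0])
    merged, ok = merge_lanewise(a, b)
    print(ok)       # True
    print(merged)   # [ 0.5  0.   0.3  0.  -1.2  0.   0.7  0. ]

Under this reading, a successful merge leaves the second channel entirely zero, which is the kind of channel-level sparsity that the zero-skip scheduling scheme mentioned in the abstract can then exploit.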
