Open Access System for Information Sharing


Title
Stepwise Resolution Scaling for Low-Cost Convolutional Neural Network Training
Authors
이재철
Date Issued
2020
Publisher
Pohang University of Science and Technology (POSTECH)
Abstract
With the development of artificial intelligence, convolutional neural networks (CNNs) are now widely used in mobile systems such as home appliances and smartphones. However, CNNs have high computational costs for training and inference, so they are typically run on cloud servers or on specially designed deep learning accelerators with pre-trained parameters. To enable on-device learning, it is important to reduce both computational cost and memory usage. In this thesis, a stepwise resolution scaling technique is presented to address this problem. Training starts at a low input resolution, which is then increased in a stepwise manner after a predetermined number of epochs. This technique is further improved by combining stepwise resolution scaling with blockwise layer freezing: as the training resolution is increased, the number of active (trainable) layer blocks is decreased. It is experimentally shown that stepwise resolution scaling with blockwise layer freezing reduces the required computational cost and memory usage with practically no accuracy drop.
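The combined schedule described in the abstract can be sketched as a small scheduling function. The specific epoch boundaries, resolutions, and block count below are illustrative assumptions, not the thesis's actual hyperparameters:

```python
# Hypothetical sketch of a stepwise resolution-scaling schedule with
# blockwise layer freezing, as described in the abstract. All numbers
# (boundaries, resolutions, block count) are assumptions for illustration.

def stepwise_schedule(epoch, boundaries=(30, 60, 90),
                      resolutions=(56, 112, 168, 224),
                      total_blocks=4):
    """Return (input_resolution, n_trainable_blocks) for a given epoch.

    Training starts at the lowest resolution with all blocks trainable.
    Each time the resolution steps up, one more block (counted from the
    input side) is frozen, so fewer blocks remain active and both the
    backward-pass cost and activation memory shrink.
    """
    stage = sum(epoch >= b for b in boundaries)   # current stage: 0..3
    resolution = resolutions[stage]
    trainable = total_blocks - stage              # freeze from the input side
    return resolution, trainable


# Example: early training uses 56x56 inputs with all 4 blocks active;
# late training uses 224x224 inputs with only the last block active.
print(stepwise_schedule(0))    # low resolution, everything trainable
print(stepwise_schedule(95))   # full resolution, most blocks frozen
```

In a real training loop, the returned resolution would drive the input resizing transform, and the trainable-block count would set `requires_grad` on each block's parameters before the optimizer step.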
URI
http://postech.dcollection.net/common/orgView/200000288468
https://oasis.postech.ac.kr/handle/2014.oak/111721
Article Type
Thesis
Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
