Open Access System for Information Sharing

Gradient-based Meta-learning with Learned Layerwise Metric and Subspace

Authors
이윤호 (Yoonho Lee)
Date Issued
2018
Publisher
포항공과대학교 (Pohang University of Science and Technology)
Abstract
Deep learning has been tremendously successful in many difficult tasks, including image classification and game-playing. However, deep networks require copious amounts of data in order to achieve such performance. Meta-learning methods have recently gained attention in this context as a way to remedy this dependence on large datasets. Gradient-based meta-learning methods leverage gradient descent to learn the commonalities among various tasks. While previous such methods have been successful in meta-learning tasks, they resort to simple gradient descent during meta-testing. Our primary contribution is the MT-net, which enables the meta-learner to learn, in each layer's activation space, a subspace in which the task-specific learner performs gradient descent. Additionally, a task-specific learner of an MT-net performs gradient descent with respect to a meta-learned distance metric, which warps the activation space to be more sensitive to task identity. We demonstrate that the dimension of this learned subspace reflects the complexity of the task-specific learner's adaptation task, and also that our model is less sensitive to the choice of initial learning rates than previous gradient-based meta-learning methods. Our method achieves state-of-the-art or comparable performance on few-shot classification and regression tasks.
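The abstract describes each MT-net layer as pairing a meta-learned transformation (the metric) with task-specific weights, and restricting task adaptation to a meta-learned subspace. The sketch below illustrates that idea for a single linear layer under a squared-error loss; the names (T, W, mask, inner_lr) and the NumPy setup are assumptions made for illustration, not code from the thesis.

# A minimal NumPy sketch of the layerwise metric-and-subspace idea from the
# abstract. The single-linear-layer, squared-error setup is an illustrative
# assumption, not the thesis's implementation.
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out = 4, 3
W = rng.normal(size=(d_out, d_in))    # task-specific weights (adapted per task)
T = rng.normal(size=(d_out, d_out))   # meta-learned transformation ("metric"); frozen during adaptation
mask = np.array([1.0, 0.0, 1.0])      # meta-learned binary mask: rows of W defining the update subspace
inner_lr = 0.1

def forward(x, W, T):
    # Layer output: the metric T warps the activation space after the task-specific map W.
    return T @ (W @ x)

def inner_step(x, y_target, W, T, mask, lr):
    # One task-specific gradient step on W only (T and mask stay fixed),
    # with the update restricted to the meta-learned subspace via the mask.
    y = forward(x, W, T)
    err = y - y_target                 # gradient of 0.5 * ||y - y_target||^2 w.r.t. y
    grad_W = T.T @ np.outer(err, x)    # chain rule through y = T W x
    return W - lr * (mask[:, None] * grad_W)

# Example: adapt the layer to a single (input, target) pair from a new task.
x = rng.normal(size=d_in)
y_target = rng.normal(size=d_out)
W_adapted = inner_step(x, y_target, W, T, mask, inner_lr)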
URI
http://postech.dcollection.net/common/orgView/200000104867
https://oasis.postech.ac.kr/handle/2014.oak/93592
Article Type
Thesis
Files in This Item:
There are no files associated with this item.


