Open Access System for Information Sharing

Title
Improving Robustness and Accuracy in Classification Models through Retrospective Online Adversarial Distillation
Authors
김중수
Date Issued
2024
Publisher
Pohang University of Science and Technology (POSTECH)
Abstract
Adversarial distillation (AD), which transfers knowledge from a robust teacher model to a student model, has emerged as an advanced approach to improving robustness against adversarial attacks. However, AD generally suffers from the high computational cost of pre-training the robust teacher, as well as from the inherent trade-off between robustness and natural accuracy (i.e., accuracy on clean data). To address these issues, we propose retrospective online adversarial distillation (ROAD). Instead of a pre-trained robust teacher as in conventional AD, ROAD uses two teachers: the student itself from the last epoch and a natural model (i.e., a model trained on clean data). We show both theoretically and empirically that knowledge distillation from the student of the last epoch penalizes overly confident predictions on adversarial examples, leading to improved robustness and generalization. Moreover, the student and the natural model are trained together in a collaborative manner, which improves the student's natural accuracy more effectively. Extensive experiments demonstrate that ROAD achieves outstanding performance in both robustness and natural accuracy with substantially reduced training time and computation cost.
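The objective described in the abstract can be sketched as a weighted combination of three terms: a cross-entropy loss on adversarial examples, a self-distillation term from the previous-epoch student's predictions on those adversarial examples, and a distillation term from the natural model on clean data. The following NumPy sketch is illustrative only, assuming logits as inputs; the function name `road_style_loss`, the weights `alpha`/`beta`, and the temperature `tau` are hypothetical placeholders, not the thesis's exact formulation.

```python
import numpy as np

def softmax(z, tau=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    z = z / tau
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    # KL(p || q), averaged over the batch.
    return np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1))

def road_style_loss(student_adv_logits, prev_student_adv_logits,
                    student_nat_logits, natural_model_logits,
                    labels, tau=2.0, alpha=0.5, beta=0.3):
    # Cross-entropy on adversarial examples (standard robustness term).
    p_adv = softmax(student_adv_logits)
    ce = -np.mean(np.log(p_adv[np.arange(len(labels)), labels] + 1e-12))
    # Retrospective term: distill from the student's own last-epoch
    # predictions on adversarial examples, penalizing overconfidence.
    kd_self = kl_div(softmax(prev_student_adv_logits, tau),
                     softmax(student_adv_logits, tau))
    # Collaborative term: distill clean-data predictions from the
    # natural model trained alongside the student.
    kd_nat = kl_div(softmax(natural_model_logits, tau),
                    softmax(student_nat_logits, tau))
    # Hypothetical weighting; the actual schedule is defined in the thesis.
    return (1 - alpha - beta) * ce + alpha * kd_self + beta * kd_nat
```

Note that neither teacher here requires separate adversarial pre-training: the retrospective teacher is a cached copy of the student from the previous epoch, which is the source of the method's reduced training cost.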
URI
http://postech.dcollection.net/common/orgView/200000733607
https://oasis.postech.ac.kr/handle/2014.oak/123299
Article Type
Thesis
Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
