Open Access System for Information Sharing

Cross-domain Ensemble Distillation for Domain Generalization

Title
Cross-domain Ensemble Distillation for Domain Generalization
Authors
Kyungmoon Lee; Sungyeon Kim; Suha Kwak
Date Issued
2022-10-25
Publisher
Springer Science and Business Media Deutschland GmbH
Abstract
Domain generalization is the task of learning models that generalize to unseen target domains. We propose a simple yet effective method for domain generalization, named cross-domain ensemble distillation (XDED), that learns domain-invariant features while encouraging the model to converge to flat minima, which recently turned out to be a sufficient condition for domain generalization. To this end, our method generates an ensemble of the output logits from training data with the same label but from different domains and then penalizes each output for the mismatch with the ensemble. Also, we present a de-stylization technique that standardizes features to encourage the model to produce style-consistent predictions even in an arbitrary target domain. Our method greatly improves generalization capability in public benchmarks for cross-domain image classification, cross-dataset person re-ID, and cross-dataset semantic segmentation. Moreover, we show that models learned by our method are robust against adversarial attacks and unseen corruptions.
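The abstract describes the method only at a high level. The sketch below is a minimal, unofficial PyTorch illustration of the two ideas it mentions: a class-wise ensemble of softened logits gathered across source domains that serves as a distillation target for each sample, and a de-stylization step that standardizes features. The function names, temperature, detached ensemble target, and loss weight are assumptions for illustration, not the authors' released implementation.

```python
# Unofficial sketch of the cross-domain ensemble distillation (XDED) idea.
# Hyperparameters and names here are assumptions, not the paper's code.
import torch
import torch.nn.functional as F


def xded_loss(logits: torch.Tensor, labels: torch.Tensor,
              temperature: float = 4.0) -> torch.Tensor:
    """For each class in the batch, average the softened logits of all samples
    with that label (ideally drawn from different source domains) to form an
    ensemble target, then penalize each sample's softened prediction for
    deviating from it via KL divergence."""
    loss = logits.new_zeros(())
    count = 0
    for c in labels.unique():
        idx = (labels == c).nonzero(as_tuple=True)[0]
        if idx.numel() < 2:  # need at least two samples to form an ensemble
            continue
        soft = F.softmax(logits[idx] / temperature, dim=1)        # softened predictions
        ensemble = soft.mean(dim=0, keepdim=True).detach()        # class-wise ensemble target
        log_soft = F.log_softmax(logits[idx] / temperature, dim=1)
        loss = loss + F.kl_div(log_soft, ensemble.expand_as(log_soft),
                               reduction="batchmean")
        count += 1
    return loss / max(count, 1)


def destylize(features: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Standardize feature maps per sample and channel (instance-norm style,
    without affine parameters), a rough stand-in for the de-stylization step."""
    mean = features.mean(dim=(2, 3), keepdim=True)
    std = features.std(dim=(2, 3), keepdim=True)
    return (features - mean) / (std + eps)


# Hypothetical usage: cross-entropy plus a weighted XDED term.
# logits = classifier(destylize(backbone_features))
# loss = F.cross_entropy(logits, labels) + 0.5 * xded_loss(logits, labels)
```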
URI
https://oasis.postech.ac.kr/handle/2014.oak/122823
Article Type
Conference
Citation
17th European Conference on Computer Vision (ECCV 2022), pp. 1-20, 2022-10-25
Files in This Item:
There are no files associated with this item.
