
Code for our NeurIPS 2022 paper 'Attracting and Dispersing: A Simple Approach for Source-free Domain Adaptation'


wangkai930418/AaD_SFDA


(NeurIPS 2022) Attracting and Dispersing: A Simple Approach for Source-free Domain Adaptation

Shiqi Yang, Yaxing Wang, Kai Wang, Shangling Jui and Joost van de Weijer

Code for our paper 'Attracting and Dispersing: A Simple Approach for Source-free Domain Adaptation'

[project][arxiv]

Contributions

  • We provide a surprisingly simple solution for source-free domain adaptation, which is an upper bound of the proposed clustering objective:

(image: the proposed objective)

  • We also relate several methods in domain adaptation, source-free domain adaptation, and contrastive learning through the perspective of discriminability and diversity:

(image: relation between methods)
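The attract-and-disperse idea above can be sketched in a few lines of PyTorch. This is a minimal illustration, not the repository's implementation: `aad_loss`, its arguments, and the batch-local dispersion term are assumptions for illustration (the paper's dispersion set and neighbor retrieval may differ).

```python
import torch

def aad_loss(p, knn_idx, lam=1.0):
    """Hypothetical sketch: attract a sample's softmax prediction toward
    its K nearest neighbors' predictions, disperse it from all other
    predictions in the batch.

    p:       (B, C) softmax predictions
    knn_idx: (B, K) indices of each sample's K nearest neighbors
    """
    neighbors = p[knn_idx]                                 # (B, K, C)
    # attraction: maximize dot products with neighbors' predictions
    attract = -(p.unsqueeze(1) * neighbors).sum(-1).mean()
    # dispersion: minimize dot products with every other prediction
    sim = p @ p.t()                                        # (B, B)
    mask = ~torch.eye(p.size(0), dtype=torch.bool)         # drop self-similarity
    disperse = sim[mask].mean()
    return attract + lam * disperse
```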

Code on VisDA

We use PyTorch 1.3 with CUDA 10.0.

Attention: `kl_div` in PyTorch is equivalent to a (negative) dot product, up to a constant, when the input is not log-transformed.
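The remark above can be verified directly: `F.kl_div(input, target)` computes `target * (log(target) - input)` element-wise, so when `input` is raw probabilities (no `log`), the only input-dependent term is the negated dot product of the two distributions.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
p = torch.softmax(torch.randn(4, 5), dim=1)  # "input", deliberately NOT log-transformed
q = torch.softmax(torch.randn(4, 5), dim=1)  # target distribution

kl = F.kl_div(p, q, reduction="sum")
# element-wise definition: q * (log(q) - p)
manual = (q * (q.log() - p)).sum()
assert torch.allclose(kl, manual, atol=1e-6)

# split into a constant entropy term and the dot product:
entropy_term = (q * q.log()).sum()   # does not depend on the input p
dot = (p * q).sum()
assert torch.allclose(kl, entropy_term - dot, atol=1e-6)
```

Gradient-wise, minimizing this `kl_div` is therefore the same as maximizing the dot product between the two prediction vectors.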

Download the VisDA dataset and update the dataset path in the code.

  1. Train the model on the source domain: run src_pretrain.py
  2. Source-free domain adaptation: run tar_adaptation.py

For more datasets, you can insert the core part (the loss computation starting at Line 297 of tar_adaptation.py) into the code of our NRC (NeurIPS 2021).

For computing SND, you can use the file snd.py (code adapted from SND). In the paper we compute SND after training for only a few epochs (~5 on VisDA).
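For readers without snd.py at hand, the metric can be sketched as follows. This is a hedged reconstruction of the standard SND formulation (entropy of a softmax over pairwise feature similarities); the function name and the temperature value are assumptions, so consult snd.py for the exact computation used in the paper.

```python
import torch
import torch.nn.functional as F

def soft_neighborhood_density(features, temperature=0.05):
    """Sketch of SND: entropy of the row-wise softmax over pairwise
    cosine similarities. Higher values suggest denser, better-clustered
    target features."""
    f = F.normalize(features, dim=1)
    sim = f @ f.t() / temperature        # (N, N) scaled similarity matrix
    sim.fill_diagonal_(float("-inf"))    # exclude self-similarity
    p = torch.softmax(sim, dim=1)
    entropy = -(p * (p + 1e-12).log()).sum(dim=1)
    return entropy.mean()
```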
