Commit 1084cae

Replace arXiv links with CVF links
1 parent f752571 commit 1084cae

File tree

2 files changed (+2 −2)

README.md

+1 −1
@@ -255,4 +255,4 @@ since November 2021 and June 2022, respectively.
 - [:mag:](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/ilsvrc2012/ickd/) Li Liu, Qingle Huang, Sihao Lin, Hongwei Xie, Bing Wang, Xiaojun Chang, Xiaodan Liang. ["Exploring Inter-Channel Correlation for Diversity-Preserved Knowledge Distillation"](https://openaccess.thecvf.com/content/ICCV2021/html/Liu_Exploring_Inter-Channel_Correlation_for_Diversity-Preserved_Knowledge_Distillation_ICCV_2021_paper.html) (ICCV 2021)
 - [:mag:](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/ilsvrc2012/dist/) Tao Huang, Shan You, Fei Wang, Chen Qian, Chang Xu. ["Knowledge Distillation from A Stronger Teacher"](https://proceedings.neurips.cc/paper_files/paper/2022/hash/da669dfd3c36c93905a17ddba01eef06-Abstract-Conference.html) (NeurIPS 2022)
 - [:mag:](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/ilsvrc2012/srd/) Roy Miles, Krystian Mikolajczyk. ["Understanding the Role of the Projector in Knowledge Distillation"](https://ojs.aaai.org/index.php/AAAI/article/view/28219) (AAAI 2024)
-- [:mag:](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/ilsvrc2012/kd_w_ls/) Shangquan Sun, Wenqi Ren, Jingzhi Li, Rui Wang, Xiaochun Cao. ["Logit Standardization in Knowledge Distillation"](https://arxiv.org/abs/2403.01427) (CVPR 2024)
+- [:mag:](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/ilsvrc2012/kd_w_ls/) Shangquan Sun, Wenqi Ren, Jingzhi Li, Rui Wang, Xiaochun Cao. ["Logit Standardization in Knowledge Distillation"](https://openaccess.thecvf.com/content/CVPR2024/html/Sun_Logit_Standardization_in_Knowledge_Distillation_CVPR_2024_paper.html) (CVPR 2024)

docs/source/benchmarks.rst

+1 −1
@@ -43,7 +43,7 @@ Original work
 * ICKD: `"Exploring Inter-Channel Correlation for Diversity-Preserved Knowledge Distillation" <https://openaccess.thecvf.com/content/ICCV2021/html/Liu_Exploring_Inter-Channel_Correlation_for_Diversity-Preserved_Knowledge_Distillation_ICCV_2021_paper.html>`_
 * DIST: `"Knowledge Distillation from A Stronger Teacher" <https://proceedings.neurips.cc/paper_files/paper/2022/hash/da669dfd3c36c93905a17ddba01eef06-Abstract-Conference.html>`_
 * SRD: `"Understanding the Role of the Projector in Knowledge Distillation" <https://ojs.aaai.org/index.php/AAAI/article/view/28219>`_
-* KD w/ LS: `"Logit Standardization in Knowledge Distillation" <https://arxiv.org/abs/2403.01427>`_
+* KD w/ LS: `"Logit Standardization in Knowledge Distillation" <https://openaccess.thecvf.com/content/CVPR2024/html/Sun_Logit_Standardization_in_Knowledge_Distillation_CVPR_2024_paper.html>`_
 
 References
 ^^^^
