[![DOI:10.1007/978-3-030-76423-4_3](https://zenodo.org/badge/DOI/10.1007/978-3-030-76423-4_3.svg)](https://doi.org/10.1007/978-3-030-76423-4_3)
[![DOI:10.18653/v1/2023.nlposs-1.18](https://zenodo.org/badge/DOI/10.18653/v1/2023.nlposs-1.18.svg)](https://doi.org/10.18653/v1/2023.nlposs-1.18)

***torchdistill*** (formerly *kdkit*) offers various state-of-the-art knowledge distillation methods
and enables you to design (new) experiments simply by editing a declarative yaml config file instead of Python code.
Even when you need to extract intermediate representations in teacher/student models,
[...]
In addition to knowledge distillation, this framework helps you design and perform general deep learning experiments
simply by excluding teacher entries from a declarative yaml config file.
You can find such examples below and in [configs/sample/](https://github.com/yoshitomo-matsubara/torchdistill/tree/main/configs/sample/).
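To make the declarative style concrete, here is a minimal, purely illustrative sketch of what such a config might look like. The key names (`models`, `teacher_model`, `student_model`, `train`, `criterion`, `KDLoss`) are assumptions for illustration, not the library's actual schema; the real, working configurations live in configs/sample/.

```yaml
# Hypothetical sketch of a declarative experiment config.
# Key names are illustrative, NOT torchdistill's actual schema;
# see configs/sample/ for real, working configs.
models:
  teacher_model:
    name: 'resnet50'   # pretrained teacher (hypothetical entry)
  student_model:
    name: 'resnet18'   # student to be trained (hypothetical entry)
train:
  num_epochs: 20
  criterion:
    key: 'KDLoss'      # hypothetical name for a distillation loss
    kwargs:
      temperature: 4.0
      alpha: 0.9
# Removing the teacher_model entry (and the KD criterion) would turn the
# same file into a plain training config, with no Python code changes.
```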
In December 2023, ***torchdistill*** officially joined [PyTorch Ecosystem](https://pytorch.org/ecosystem/).

When you refer to ***torchdistill*** in your paper, please cite [these papers](https://github.com/yoshitomo-matsubara/torchdistill#citation)
instead of this GitHub repository.
**If you use** ***torchdistill*** **as part of your work, your citation is appreciated and motivates me to maintain and upgrade this framework!**