
Commit f752571

Update news
1 parent 549b47d · commit f752571

File tree

1 file changed: +2 -1 lines changed


README.md (+2 -1)
@@ -7,7 +7,6 @@
 [![DOI:10.1007/978-3-030-76423-4_3](https://zenodo.org/badge/DOI/10.1007/978-3-030-76423-4_3.svg)](https://doi.org/10.1007/978-3-030-76423-4_3)
 [![DOI:10.18653/v1/2023.nlposs-1.18](https://zenodo.org/badge/DOI/10.18653/v1/2023.nlposs-1.18.svg)](https://doi.org/10.18653/v1/2023.nlposs-1.18)
 
-
 ***torchdistill*** (formerly *kdkit*) offers various state-of-the-art knowledge distillation methods
 and enables you to design (new) experiments simply by editing a declarative yaml config file instead of Python code.
 Even when you need to extract intermediate representations in teacher/student models,
@@ -19,6 +18,8 @@ In addition to knowledge distillation, this framework helps you design and perfo
 simply by excluding teacher entries from a declarative yaml config file.
 You can find such examples below and in [configs/sample/](https://github.com/yoshitomo-matsubara/torchdistill/tree/main/configs/sample/).
 
+In December 2023, ***torchdistill*** officially joined [PyTorch Ecosystem](https://pytorch.org/ecosystem/).
+
 When you refer to ***torchdistill*** in your paper, please cite [these papers](https://github.com/yoshitomo-matsubara/torchdistill#citation)
 instead of this GitHub repository.
 **If you use** ***torchdistill*** **as part of your work, your citation is appreciated and motivates me to maintain and upgrade this framework!**
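
For context, the README lines above describe torchdistill's declarative, config-driven design: teacher and student models, losses, and training settings are declared in a YAML file rather than written as Python code, and dropping the teacher entries turns a distillation run into ordinary training. The sketch below only illustrates that idea; the keys and values are hypothetical simplifications, not torchdistill's exact schema, and the real, runnable configs live under [configs/sample/](https://github.com/yoshitomo-matsubara/torchdistill/tree/main/configs/sample/).

```yaml
# Hypothetical, simplified sketch of a declarative distillation config.
# Key names are illustrative and do NOT reproduce torchdistill's exact schema;
# see configs/sample/ in the repository for real examples.
models:
  teacher_model:
    key: 'resnet34'        # pretrained teacher (illustrative identifier)
    kwargs:
      num_classes: 100
  student_model:
    key: 'resnet18'        # smaller student to be trained
    kwargs:
      num_classes: 100
train:
  num_epochs: 20
  optimizer:
    key: 'SGD'
    kwargs:
      lr: 0.1
      momentum: 0.9
  criterion:
    key: 'KDLoss'          # illustrative knowledge distillation loss entry
    kwargs:
      temperature: 4.0
      alpha: 0.5
# Removing the teacher_model and distillation criterion entries would
# correspond to a plain (non-distillation) training run, as described above.
```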
