Update README.md
README.md CHANGED
@@ -1,3 +1,31 @@
----
-
-
+---
+language:
+- en
+- zh
+license: apache-2.0
+library_name: transformers
+datasets:
+- EleutherAI/pile
+- togethercomputer/RedPajama-Data-1T
+- p208p2002/wudao
+widget:
+- text: <s> 4 + 3 =
+---
+## MiniLoong-3B
+
+[arXiv](https://arxiv.org/abs/2311.07052) | [GitHub](https://github.com/GeneZC/MiniMA) | [HuggingFace-MiniMA-3B](https://huggingface.co/GeneZC/MiniMA-3B) | [HuggingFace-MiniChat-3B](https://huggingface.co/GeneZC/MiniChat-3B) | [ModelScope-MiniMA-3B](https://modelscope.cn/models/GeneZC/MiniMA-3B) | [ModelScope-MiniChat-3B](https://modelscope.cn/models/GeneZC/MiniChat-3B) | [HuggingFace-MiniChat-1.5-3B](https://huggingface.co/GeneZC/MiniChat-1.5-3B) | [HuggingFace-MiniMA-2-3B](https://huggingface.co/GeneZC/MiniMA-2-3B) | [HuggingFace-MiniChat-2-3B](https://huggingface.co/GeneZC/MiniChat-2-3B) | [HuggingFace-MiniMA-2-1B](https://huggingface.co/GeneZC/MiniMA-2-1B) | [HuggingFace-MiniLoong-3B](https://huggingface.co/GeneZC/MiniLoong-3B) | [HuggingFace-MiniMix-2/4x3B](https://huggingface.co/GeneZC/MiniMix-2_4x3B)
+
+Must comply with the LICENSE of LLaMA-2 since it is derived from LLaMA-2.
+
+<img src="./teaser_d.jpg" alt="teaser_d" width="700" />
+
+## Bibtex
+
+```bibtex
+@article{zhang2023law,
+    title={Towards the Law of Capacity Gap in Distilling Language Models},
+    author={Zhang, Chen and Song, Dawei and Ye, Zheyu and Gao, Yan},
+    year={2023},
+    url={https://arxiv.org/abs/2311.07052}
+}
+```
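
For reference, a minimal sketch of how the updated card metadata would typically be used, assuming the standard `transformers` causal-LM API and the `GeneZC/MiniLoong-3B` repo id linked in the card; the prompt mirrors the widget example, while the dtype/device handling and generation settings are illustrative assumptions, not part of the card:

```python
# Minimal usage sketch inferred from the card metadata above:
# `library_name: transformers` and the widget prompt "<s> 4 + 3 =".
# The repo id comes from the HuggingFace-MiniLoong-3B link; generation
# settings below are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "GeneZC/MiniLoong-3B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" needs the `accelerate` package; drop it to load on CPU.
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Same prompt as the card's widget example.
inputs = tokenizer("<s> 4 + 3 =", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```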