Update README.md (#1)
Update README.md (5648fedd296d033e43d2c83a813f5ef1a817d2bb)

README.md CHANGED

@@ -2,21 +2,21 @@
 tags:
 - merge
 - mergekit
-- lazymergekit
 - Nexusflow/Starling-LM-7B-beta
 - FuseAI/FuseChat-7B-VaRM
 base_model:
 - Nexusflow/Starling-LM-7B-beta
 - FuseAI/FuseChat-7B-VaRM
+license: apache-2.0
 ---
 
-# 
+# L-MChat-7b
 
-
+L-MChat-7b is a merge of the following models:
 * [Nexusflow/Starling-LM-7B-beta](https://huggingface.co/Nexusflow/Starling-LM-7B-beta)
 * [FuseAI/FuseChat-7B-VaRM](https://huggingface.co/FuseAI/FuseChat-7B-VaRM)
 
-## 
+## Configuration
 
 ```yaml
 slices:
@@ -37,7 +37,7 @@ parameters:
 dtype: bfloat16
 ```
 
-## 
+## Usage
 
 ```python
 !pip install -qU transformers accelerate
@@ -60,4 +60,12 @@ pipeline = transformers.pipeline(
 
 outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
 print(outputs[0]["generated_text"])
-```
+```
+
+## License
+
+Apache 2.0 but you cannot use this model to directly compete with OpenAI.
+
+## How?
+
+Usage of [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing).