* Faraday.dev

- **Option A** - Downloading in `text-generation-webui`:
- **Step 1**: Under Download Model, you can enter the model repo: PrunaAI/Phi-3-mini-4k-instruct-GGUF-smashed and, below it, a specific filename to download, such as: Phi-3-mini-4k-instruct.IQ3_M.gguf.
- **Step 2**: Then click Download.

- **Option B** - Downloading on the command line (including multiple files at once):
- **Step 1**: Install the `huggingface-hub` Python library:

```shell
pip3 install huggingface-hub
```

- **Step 2**: Then you can download any individual model file to the current directory, at high speed, with a command like this:

```shell
huggingface-cli download PrunaAI/Phi-3-mini-4k-instruct-GGUF-smashed Phi-3-mini-4k-instruct.IQ3_M.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>

Alternatively, you can also download multiple files at once with a pattern:

```shell
huggingface-cli download PrunaAI/Phi-3-mini-4k-instruct-GGUF-smashed --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
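The `--include` argument takes a shell-style glob, matched against each filename in the repo. As a rough illustration of which files a pattern like `*Q4_K*gguf` selects (the filenames below are hypothetical examples, not a listing of this repo), the same glob semantics can be reproduced with a shell `case` statement:

```shell
# Hypothetical repo filenames, for illustration only.
for f in Phi-3-mini-4k-instruct.IQ3_M.gguf \
         Phi-3-mini-4k-instruct.Q4_K_M.gguf \
         Phi-3-mini-4k-instruct.Q4_K_S.gguf \
         Phi-3-mini-4k-instruct.Q8_0.gguf; do
  # A case pattern uses the same glob syntax as --include
  case "$f" in
    *Q4_K*gguf) echo "selected: $f" ;;
  esac
done
```

Only the two Q4_K variants match, so a single command fetches both quantisations while skipping the rest.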
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).

To accelerate downloads, you can also install `hf_transfer`:

```shell
pip3 install hf_transfer
```

And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download PrunaAI/Phi-3-mini-4k-instruct-GGUF-smashed Phi-3-mini-4k-instruct.IQ3_M.gguf --local-dir . --local-dir-use-symlinks False
```
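Note that the `VAR=value command` form above sets the variable only for that one command; it does not persist in your shell session. A quick way to see this behaviour, assuming the variable is not already set in your environment (no download involved):

```shell
# The variable is visible inside the prefixed command...
HF_HUB_ENABLE_HF_TRANSFER=1 sh -c 'echo "inside: $HF_HUB_ENABLE_HF_TRANSFER"'

# ...but is not set afterwards in the current shell.
echo "after: $HF_HUB_ENABLE_HF_TRANSFER"
```

To enable `hf_transfer` for every subsequent command in the session, use `export HF_HUB_ENABLE_HF_TRANSFER=1` instead.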
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.