More stuff left behind
README.md (changed)
```diff
@@ -78,7 +78,7 @@ Fine-tuning was performed entirely on a consumer-grade laptop:
 - **RAM:** 16 GB
 - **Quantization:** 4-bit NF4
 - **Strategy:** Low VRAM setup using gradient accumulation, packing, and LoRA adapters
-
+
 This demonstrates that Phi-2 can be fine-tuned effectively even on low-VRAM devices.
 
 ---
```
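The README section touched by this diff describes the low-VRAM recipe: 4-bit NF4 quantization, LoRA adapters, gradient accumulation, and sequence packing. Below is a minimal sketch of how such a setup can be wired together with the Hugging Face `transformers`, `peft`, and `bitsandbytes` libraries. It is not the exact training script behind this model: the model ID, LoRA rank/alpha, target modules, and accumulation steps are illustrative assumptions, and packing (usually enabled through a trainer such as `trl`'s `SFTTrainer`) is omitted.

```python
# Illustrative sketch only: hyperparameters and target modules are assumptions,
# not the configuration actually used for this model.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig, TrainingArguments
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# 4-bit NF4 quantization keeps the frozen Phi-2 base weights small enough
# to fit alongside activations on a low-VRAM laptop GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

model = AutoModelForCausalLM.from_pretrained(
    "microsoft/phi-2",
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

# LoRA adapters: only small low-rank matrices are trained, so gradients and
# optimizer state stay tiny compared with full fine-tuning.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "dense"],  # Phi-2 attention projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# Gradient accumulation: only one micro-batch is held in memory at a time,
# but gradients are accumulated over 16 steps for an effective batch of 16.
training_args = TrainingArguments(
    output_dir="phi2-lora-out",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,
    learning_rate=2e-4,
    num_train_epochs=1,
    fp16=True,
)
```

Packing concatenates several short samples into one full-length sequence so each forward pass uses the context window fully; in practice it is enabled through the trainer rather than the configuration shown above.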