saurabhy27-outcomes committed
Commit 40e065b · verified · 1 Parent(s): 8d7081a

Model save

Files changed (1): README.md (+5 −5)

README.md CHANGED
@@ -16,7 +16,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/Phi-4-multimodal-instruct](https://huggingface.co/microsoft/Phi-4-multimodal-instruct) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.0059
+- Loss: 0.0043
 
 ## Model description
 
@@ -39,7 +39,7 @@ The following hyperparameters were used during training:
 - train_batch_size: 2
 - eval_batch_size: 8
 - seed: 42
-- optimizer: Use adamw_torch with betas=(0.9,0.99) and epsilon=1e-07 and optimizer_args=No additional optimizer arguments
+- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.99) and epsilon=1e-07 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: cosine
 - lr_scheduler_warmup_ratio: 0.1
 - num_epochs: 1
@@ -48,12 +48,12 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
-| 0.0341        | 1.0   | 1275 | 0.0059          |
+| 0.0166        | 1.0   | 1275 | 0.0043          |
 
 
 ### Framework versions
 
 - Transformers 4.51.3
-- Pytorch 2.1.0+cu118
-- Datasets 3.5.1
+- Pytorch 2.8.0.dev20250319+cu128
+- Datasets 3.6.0
 - Tokenizers 0.21.1
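
The schedule named in the hyperparameters (cosine with `lr_scheduler_warmup_ratio: 0.1` over the 1275 training steps shown in the results table) can be sketched in pure Python. This is a minimal sketch of the learning-rate *multiplier* only, assuming the usual linear-warmup-then-cosine-decay shape used by Transformers' cosine scheduler; the function name is illustrative, not part of the repository:

```python
import math

def cosine_lr_multiplier(step: int, total_steps: int = 1275, warmup_ratio: float = 0.1) -> float:
    """Learning-rate multiplier: linear warmup for the first warmup_ratio
    of steps, then cosine decay from 1 down to 0 (illustrative sketch)."""
    warmup_steps = int(total_steps * warmup_ratio)  # 127 steps for this run
    if step < warmup_steps:
        return step / max(1, warmup_steps)          # linear ramp 0 -> 1
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return 0.5 * (1.0 + math.cos(math.pi * progress))  # cosine decay 1 -> 0
```

The peak learning rate is reached at the end of warmup (step 127 here) and decays to roughly zero by the final step, which matches the single-epoch, 1275-step run in the table.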