---
base_model:
- TheDrummer/Rocinante-12B-v1
- jtatman/mistral_nemo_12b_reasoning_psychology_lora
- Epiculous/Azure_Dusk-v0.2
- nbeerbower/mistral-nemo-bophades-12B
- Epiculous/Crimson_Dawn-v0.2
- nbeerbower/mistral-nemo-wissenschaft-12B
- anthracite-org/magnum-v2-12b
- TheDrummer/Rocinante-12B-v1.1
- anthracite-org/magnum-v2.5-12b-kto
- mpasila/Mistral-freeLiPPA-LoRA-12B
- anthracite-org/magnum-v2.5-12b-kto
- jeiku/Aura-NeMo-12B
- nbeerbower/mistral-nemo-cc-12B
- UsernameJustAnother/Nemo-12B-Marlin-v8
- elinas/Chronos-Gold-12B-1.0
- SillyTilly/mistralai_Mistral-Nemo-Base-2407
- nbeerbower/Lyra4-Gutenberg-12B
- nbeerbower/mistral-nemo-gutenberg-12B-v4
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---

[![QuantFactory Banner](https://lh7-rt.googleusercontent.com/docsz/AD_4nXeiuCm7c8lEwEJuRey9kiVZsRn2W-b4pWlu3-X534V3YmVuVc2ZL-NXg2RkzSOOS2JXGHutDuyyNAUtdJI65jGTo8jT9Y99tMi4H4MqL44Uc5QKG77B0d6-JfIkZHFaUA71-RtjyYZWVIhqsNZcx8-OMaA?key=xt3VSDoCbmTY7o-cwwOFwQ)](https://hf.co/QuantFactory)

# QuantFactory/MN-Halide-12b-v1.0-GGUF
This is a quantized version of [Azazelle/MN-Halide-12b-v1.0](https://huggingface.co/Azazelle/MN-Halide-12b-v1.0), created using llama.cpp.
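
As a minimal sketch of loading one of this repo's GGUF quants with [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) — note the quant filename below is an assumed example, so check the repository's file list for the exact name:

```python
# pip install llama-cpp-python huggingface_hub
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one quant from the repo; the filename is a hypothetical example --
# consult the repo's file list for the quants actually published.
model_path = hf_hub_download(
    repo_id="QuantFactory/MN-Halide-12b-v1.0-GGUF",
    filename="MN-Halide-12b-v1.0.Q4_K_M.gguf",  # hypothetical quant name
)

llm = Llama(model_path=model_path, n_ctx=4096)  # context length is adjustable
out = llm("Write a short story opening:", max_tokens=64)
print(out["choices"][0]["text"])
```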

# Original Model Card

# MN-Halide-12b-v1.0

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [SillyTilly/mistralai_Mistral-Nemo-Base-2407](https://huggingface.co/SillyTilly/mistralai_Mistral-Nemo-Base-2407) as the base.
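
For intuition: Model Stock averages the fine-tuned checkpoints, then interpolates that average back toward the base model, with the ratio derived from the angle between the models' task vectors. A simplified per-tensor sketch of the paper's rule (mergekit's `model_stock` implementation handles the real per-layer details):

```python
import numpy as np

def model_stock(base: np.ndarray, finetuned: list[np.ndarray]) -> np.ndarray:
    """Interpolate the average of fine-tuned weights back toward the base."""
    assert len(finetuned) >= 2, "needs at least two fine-tuned checkpoints"
    deltas = [w - base for w in finetuned]  # task vectors
    # Mean pairwise cosine between task vectors.
    cosines = []
    for i in range(len(deltas)):
        for j in range(i + 1, len(deltas)):
            a, b = deltas[i].ravel(), deltas[j].ravel()
            cosines.append(float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b))))
    cos_theta = float(np.mean(cosines))
    k = len(finetuned)
    t = k * cos_theta / (1 + (k - 1) * cos_theta)  # paper's interpolation ratio
    w_avg = np.mean(finetuned, axis=0)
    return t * w_avg + (1 - t) * base
```

When the fine-tunes agree (cos θ → 1) the rule returns their plain average; when their task vectors are orthogonal (cos θ → 0) it falls back to the base weights.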

### Models Merged

The following models were included in the merge:
* [TheDrummer/Rocinante-12B-v1](https://huggingface.co/TheDrummer/Rocinante-12B-v1) + [jtatman/mistral_nemo_12b_reasoning_psychology_lora](https://huggingface.co/jtatman/mistral_nemo_12b_reasoning_psychology_lora)
* [Epiculous/Azure_Dusk-v0.2](https://huggingface.co/Epiculous/Azure_Dusk-v0.2)
* [nbeerbower/mistral-nemo-bophades-12B](https://huggingface.co/nbeerbower/mistral-nemo-bophades-12B)
* [Epiculous/Crimson_Dawn-v0.2](https://huggingface.co/Epiculous/Crimson_Dawn-v0.2)
* [nbeerbower/mistral-nemo-wissenschaft-12B](https://huggingface.co/nbeerbower/mistral-nemo-wissenschaft-12B)
* [anthracite-org/magnum-v2-12b](https://huggingface.co/anthracite-org/magnum-v2-12b)
* [TheDrummer/Rocinante-12B-v1.1](https://huggingface.co/TheDrummer/Rocinante-12B-v1.1)
* [anthracite-org/magnum-v2.5-12b-kto](https://huggingface.co/anthracite-org/magnum-v2.5-12b-kto) + [mpasila/Mistral-freeLiPPA-LoRA-12B](https://huggingface.co/mpasila/Mistral-freeLiPPA-LoRA-12B)
* [anthracite-org/magnum-v2.5-12b-kto](https://huggingface.co/anthracite-org/magnum-v2.5-12b-kto) + [jeiku/Aura-NeMo-12B](https://huggingface.co/jeiku/Aura-NeMo-12B)
* [nbeerbower/mistral-nemo-cc-12B](https://huggingface.co/nbeerbower/mistral-nemo-cc-12B)
* [UsernameJustAnother/Nemo-12B-Marlin-v8](https://huggingface.co/UsernameJustAnother/Nemo-12B-Marlin-v8)
* [elinas/Chronos-Gold-12B-1.0](https://huggingface.co/elinas/Chronos-Gold-12B-1.0)
* [nbeerbower/Lyra4-Gutenberg-12B](https://huggingface.co/nbeerbower/Lyra4-Gutenberg-12B)
* [nbeerbower/mistral-nemo-gutenberg-12B-v4](https://huggingface.co/nbeerbower/mistral-nemo-gutenberg-12B-v4)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: SillyTilly/mistralai_Mistral-Nemo-Base-2407
dtype: float32
merge_method: model_stock
slices:
- sources:
  - layer_range: [0, 40]
    model: nbeerbower/Lyra4-Gutenberg-12B
  - layer_range: [0, 40]
    model: nbeerbower/mistral-nemo-gutenberg-12B-v4
  - layer_range: [0, 40]
    model: elinas/Chronos-Gold-12B-1.0
  - layer_range: [0, 40]
    model: UsernameJustAnother/Nemo-12B-Marlin-v8
  - layer_range: [0, 40]
    model: TheDrummer/Rocinante-12B-v1.1
  - layer_range: [0, 40]
    model: Epiculous/Azure_Dusk-v0.2
  - layer_range: [0, 40]
    model: Epiculous/Crimson_Dawn-v0.2
  - layer_range: [0, 40]
    model: TheDrummer/Rocinante-12B-v1+jtatman/mistral_nemo_12b_reasoning_psychology_lora
  - layer_range: [0, 40]
    model: nbeerbower/mistral-nemo-wissenschaft-12B
  - layer_range: [0, 40]
    model: nbeerbower/mistral-nemo-bophades-12B
  - layer_range: [0, 40]
    model: anthracite-org/magnum-v2.5-12b-kto+mpasila/Mistral-freeLiPPA-LoRA-12B
  - layer_range: [0, 40]
    model: nbeerbower/mistral-nemo-cc-12B
  - layer_range: [0, 40]
    model: anthracite-org/magnum-v2-12b
  - layer_range: [0, 40]
    model: anthracite-org/magnum-v2.5-12b-kto+jeiku/Aura-NeMo-12B
  - layer_range: [0, 40]
    model: SillyTilly/mistralai_Mistral-Nemo-Base-2407
tokenizer_source: unsloth/Mistral-Nemo-Base-2407
```
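
To reproduce the merge, a config like this can be fed to mergekit's `mergekit-yaml` CLI (`mergekit-yaml config.yaml ./output-dir`). Below is a hedged sketch of the equivalent Python-API invocation; the import paths and option names follow mergekit's README and may differ between versions, so verify against your installed release:

```python
# Sketch of running the merge from Python, assuming mergekit is installed.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML config shown above into mergekit's config object.
with open("config.yaml", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./MN-Halide-12b-v1.0",                    # output directory
    options=MergeOptions(cuda=torch.cuda.is_available()),
)
```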