FallenMerick committed on
Commit 385464c · verified · 1 Parent(s): c806919

Update README.md

Files changed (1): README.md (+70 -67)
README.md CHANGED
---
license: cc-by-4.0
language:
- en
base_model:
- TeeZee/Orca-2-13b_flat
- NeverSleep/X-NoroChronos-13B
- NeverSleep/Noromaid-13b-v0.3
- KatyTheCutie/EstopianMaid-13B
- Undi95/MLewdBoros-L2-13B
- KoboldAI/LLaMA2-13B-Psyfighter2
- KoboldAI/LLaMA2-13B-Erebus-v3
library_name: transformers
tags:
- storywriting
- text adventure
- creative
- story
- writing
- fiction
- roleplaying
- rp
- mergekit
- merge

---

![pic](https://huggingface.co/FallenMerick/Bionic-Vaquita-13B/resolve/main/Bionic-Vaquita.jpg)

# Bionic-Vaquita-13B

In the same vein as the legendary [Psyonic-Cetacean-20B](https://huggingface.co/jebcarter/psyonic-cetacean-20B), I have attempted to create a 13B model that is equal parts creative and chaotic, while remaining coherent enough for roleplaying purposes.
<br>
Seven different Llama-2 13B models were hand-picked and merged via TIES to create the three components of the final stack (to be made public upon request).
<br>
Emotional intelligence and coherence were the primary focus of the late-stage manual testing that led to the selection of this model.
<br><br>
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the passthrough merge method, which stacks the selected layer ranges from the source models end to end without blending any weights.

### Models Merged

The following models were included in the merge:
* FallenMerick/XNoroChronos-Orca2-Noromaid
* FallenMerick/Psyfighter2-Orca2-Erebus3
* FallenMerick/EstopianMaid-Orca2-MlewdBoros
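
The TIES recipes behind these three components are not reproduced here (available upon request, as noted above). Purely as an illustration of the technique, a component such as FallenMerick/XNoroChronos-Orca2-Noromaid might be built with a mergekit TIES config along these lines; the constituent models are inferred from the component's name, and the density/weight values are placeholder assumptions, not the actual settings:

```yaml
# Hypothetical sketch of one TIES component merge -- not the actual recipe.
models:
  - model: TeeZee/Orca-2-13b_flat
    # base model; no parameters needed
  - model: NeverSleep/X-NoroChronos-13B
    parameters:
      density: 0.5   # assumed value
      weight: 0.5    # assumed value
  - model: NeverSleep/Noromaid-13b-v0.3
    parameters:
      density: 0.5   # assumed value
      weight: 0.5    # assumed value
merge_method: ties
base_model: TeeZee/Orca-2-13b_flat
parameters:
  normalize: true
dtype: bfloat16
```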

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
- sources:
  - model: FallenMerick/XNoroChronos-Orca2-Noromaid
    layer_range: [0, 16]
- sources:
  - model: FallenMerick/EstopianMaid-Orca2-MlewdBoros
    layer_range: [16, 24]
- sources:
  - model: FallenMerick/Psyfighter2-Orca2-Erebus3
    layer_range: [24, 40]
merge_method: passthrough
dtype: bfloat16
```
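
Note that the three slices tile the layer stack exactly: 16 + 8 + 16 = 40 layers, matching the 40 hidden layers of a standard Llama-2 13B. Unlike frankenmerge stacks that repeat layers to grow the model, this passthrough keeps the original 13B depth and simply assigns each section to a different component: layers 0-15 from XNoroChronos-Orca2-Noromaid, 16-23 from EstopianMaid-Orca2-MlewdBoros, and 24-39 from Psyfighter2-Orca2-Erebus3.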