saishf committed
Commit
12e65a6
1 Parent(s): 0b7bb08

Upload README.md

Files changed (1): README.md +62 -0
README.md ADDED
---
base_model:
- Epiculous/Fett-uccine-7B
- eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2
- OpenPipe/mistral-ft-optimized-1227
- ChaoticNeutrals/Eris_7B
library_name: transformers
license: cc-by-nc-4.0
tags:
- mergekit
- merge
language:
- en
pipeline_tag: text-generation
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
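To reproduce a merge like this one, mergekit can be driven from Python as well as from its `mergekit-yaml` CLI. This is a minimal sketch assuming the `MergeConfiguration`/`run_merge` entry points that recent mergekit versions expose (check the mergekit README for the exact API in your version); the config path and output directory are placeholders:

```python
import yaml

# Assumed mergekit Python API; verify against your installed version.
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Placeholder path: the YAML from the Configuration section, saved to disk.
with open("dare-ties-config.yml", "r", encoding="utf-8") as f:
    config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    config,
    out_path="./merged-model",  # placeholder output directory
    options=MergeOptions(copy_tokenizer=True),  # add cuda=True if a GPU is available
)
```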

## Merge Details
* This model is an attempt at making a smart RP model with the finesse of [Epiculous/Fett-uccine-7B](https://huggingface.co/Epiculous/Fett-uccine-7B).
* From limited testing, I've found it to be my favourite of my personal 7B models. It stays pretty coherent at 8k+ context.
* I like to use the "Alpaca" format with "Universal-Light" for longer messages; the template is shown after this list. Switching to ChatML makes the messages much shorter. I haven't a clue why, but sometimes it's nice.
* It doesn't seem to have many issues, but I'd be willing to try to fix any problems or bugs that come up, as the model shows some potential.
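
For reference, the "Alpaca" format mentioned above is the standard template below ("Universal-Light" is, as far as I know, a sampler/context preset in frontends such as SillyTavern). The text in braces is a placeholder:

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{your instruction or roleplay context here}

### Response:
```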

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [OpenPipe/mistral-ft-optimized-1227](https://huggingface.co/OpenPipe/mistral-ft-optimized-1227) as the base.
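
Roughly, DARE randomly drops a fraction (1 - `density`) of each model's delta from the base and rescales the survivors, while TIES elects a per-parameter sign and discards deltas that disagree before summing. A toy per-tensor sketch of the idea (illustrative only, not mergekit's actual implementation; `density` and `weight` mirror the config below):

```python
import torch

def dare_ties(base: torch.Tensor, finetuned: list[torch.Tensor],
              densities: list[float], weights: list[float]) -> torch.Tensor:
    """Toy per-tensor DARE-TIES merge (illustrative, not mergekit's code)."""
    deltas = []
    for ft, density, weight in zip(finetuned, densities, weights):
        delta = ft - base
        # DARE: keep each entry with probability `density`,
        # then rescale by 1/density so the expected delta is unchanged.
        mask = torch.bernoulli(torch.full_like(delta, density))
        deltas.append(weight * delta * mask / density)
    stacked = torch.stack(deltas)
    # TIES-style sign election: take the sign of the summed deltas per parameter.
    sign = torch.sign(stacked.sum(dim=0))
    sign[sign == 0] = 1.0
    # Keep only the deltas that agree with the elected sign, then sum.
    agree = torch.sign(stacked) == sign
    return base + (stacked * agree).sum(dim=0)
```

With `density: 0.53` as used below, roughly half of each model's delta entries survive the drop, which is why the `1/density` rescaling matters.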

### Models Merged

The following models were included in the merge:
* [Epiculous/Fett-uccine-7B](https://huggingface.co/Epiculous/Fett-uccine-7B)
* [eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2)
* [ChaoticNeutrals/Eris_7B](https://huggingface.co/ChaoticNeutrals/Eris_7B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: OpenPipe/mistral-ft-optimized-1227
    # No parameters necessary for base model
  - model: Epiculous/Fett-uccine-7B
    parameters:
      density: 0.53
      weight: 0.4
  - model: eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2
    parameters:
      density: 0.53
      weight: 0.35
  - model: ChaoticNeutrals/Eris_7B
    parameters:
      density: 0.53
      weight: 0.25
merge_method: dare_ties
base_model: OpenPipe/mistral-ft-optimized-1227
parameters:
  int8_mask: true
dtype: bfloat16
```
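
Once merged (or downloaded), the result loads like any other Mistral-based checkpoint with Transformers; the local path below is a placeholder for wherever the merged weights live:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged-model"  # placeholder: local path or Hub repo id of this merge

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.bfloat16, device_map="auto"
)

# Alpaca-style prompt, matching the format recommended above.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nIntroduce yourself in one sentence.\n\n### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```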