Natch69 committed on
Commit
a07225f
1 Parent(s): 72d8c46

Update README.md

Files changed (1):
  README.md +2 -25
README.md CHANGED
@@ -17,18 +17,7 @@ Gemma-TinyLLama-Passthrough is a merge of the following models using [mergekit](
 ## 🧩 Configuration
 
 ```yaml
-# models:
-#   - model: unsloth/gemma-7b-bnb-4bit
-#     layer_range: [0, 32]
-#     # no parameters necessary for base model
-#   - model: mistralai/Mistral-7B-v0.1
-#     layer_range: [24, 32]
-# merge_method: passthrough
-# # base_model: unsloth/gemma-7b-bnb-4bit
-# parameters:
-#   normalize: true
-#   int8_mask: true
-# dtype: float16
+
 slices:
   - sources:
     - model: unsloth/gemma-2b-bnb-4bit
@@ -38,17 +27,5 @@ slices:
       layer_range: [6, 22]
 merge_method: passthrough
 dtype: bfloat16
-# models:
-#   - model: unsloth/gemma-2b-bnb-4bit
-#     parameters:
-#       density: 0.53
-#       weight: 0.45
-#   - model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
-#     parameters:
-#       weight: 0.5
-# merge_method: ties
-# base_model: unsloth/gemma-2b-bnb-4bit
-# parameters:
-#   int8_mask: true
-# dtype: bfloat16
+
 ```
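The surviving config is a mergekit `passthrough` merge, which simply stacks the layer slices listed under `slices`. As a minimal sketch of what such a config implies, the snippet below computes the depth of the merged model from its half-open `layer_range` entries. Only the `[6, 22]` range for `unsloth/gemma-2b-bnb-4bit` is visible in this diff; the config dict here is an illustrative fragment, not the full file.

```python
# Illustrative fragment of a mergekit passthrough config; only the
# [6, 22] layer_range is visible in the diff above.
config = {
    "slices": [
        {"sources": [{"model": "unsloth/gemma-2b-bnb-4bit",
                      "layer_range": [6, 22]}]},
    ],
    "merge_method": "passthrough",
    "dtype": "bfloat16",
}

def stacked_layers(cfg):
    """Total layers a passthrough merge stacks: layer_range is
    half-open, so [6, 22] contributes layers 6..21 (16 layers)."""
    total = 0
    for sl in cfg["slices"]:
        start, end = sl["sources"][0]["layer_range"]  # one source per passthrough slice
        total += end - start
    return total

print(stacked_layers(config))  # → 16
```

If the full config adds more slices, each one's `end - start` simply adds to the stacked depth.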
 