Pretergeek committed on
Commit 0106b57
1 Parent(s): 47d8993

Upload 9 files

README.md CHANGED
@@ -1,3 +1,241 @@
- ---
- license: apache-2.0
- ---
+ ---
+ base_model:
+ - openchat/openchat-3.5-0106
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+
+ ---
+ # Untitled Model (1)
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged using the passthrough merge method.
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * [openchat/openchat-3.5-0106](https://huggingface.co/openchat/openchat-3.5-0106)
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ slices:
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [0, 2]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [1, 2]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [2, 4]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [3, 4]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [4, 6]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [5, 6]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [6, 8]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [7, 8]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [8, 10]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [9, 10]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [10, 12]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [11, 12]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [12, 14]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [13, 14]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [14, 16]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [15, 16]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [16, 18]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [17, 18]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [18, 20]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [19, 20]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [20, 22]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [21, 22]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [22, 24]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [23, 24]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [24, 26]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [25, 26]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [26, 28]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [27, 28]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [28, 30]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [29, 30]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [30, 32]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [31, 32]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ merge_method: passthrough
+ dtype: bfloat16
+
+ ```
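In this configuration, each pair of consecutive base-model layers is followed by a re-inserted copy of the pair's second layer whose `o_proj` (attention output) and `down_proj` (MLP output) weights are scaled to 0.0. Both projections feed the residual stream, so each duplicated layer initially passes its input through unchanged, and the network grows from 32 to 48 layers without altering the base model's starting behavior. A merge like this is typically produced by pointing mergekit's `mergekit-yaml` CLI at the YAML above. Below is a minimal loading sketch; the local path is a placeholder, not part of this commit:

```python
# Minimal sketch (placeholder path): load the merged checkpoint and check
# that the passthrough merge produced the expected depth-upscaled model.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "./merged-model",            # placeholder for this repository's files
    torch_dtype=torch.bfloat16,  # matches `dtype: bfloat16` in the config
)

# 16 two-layer slices plus 16 duplicated one-layer slices -> 48 layers.
assert model.config.num_hidden_layers == 48
```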
added_tokens.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "<|end_of_turn|>": 32000,
+   "<|pad_0|>": 32001
+ }
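These two entries extend the base vocabulary of 32,000 tokens with OpenChat's special tokens, which is why `config.json` below declares `vocab_size: 32002`. A quick check (placeholder path again):

```python
# Sketch: the two added special tokens occupy the last two vocabulary slots.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./merged-model")  # placeholder
print(tokenizer.convert_tokens_to_ids("<|end_of_turn|>"))  # 32000
print(tokenizer.convert_tokens_to_ids("<|pad_0|>"))        # 32001
```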
config.json ADDED
@@ -0,0 +1,26 @@
+ {
+   "_name_or_path": "openchat/openchat-3.5-0106",
+   "architectures": [
+     "MistralForCausalLM"
+   ],
+   "attention_dropout": 0.0,
+   "bos_token_id": 1,
+   "eos_token_id": 32000,
+   "hidden_act": "silu",
+   "hidden_size": 4096,
+   "initializer_range": 0.02,
+   "intermediate_size": 14336,
+   "max_position_embeddings": 8192,
+   "model_type": "mistral",
+   "num_attention_heads": 32,
+   "num_hidden_layers": 48,
+   "num_key_value_heads": 8,
+   "rms_norm_eps": 1e-05,
+   "rope_theta": 10000.0,
+   "sliding_window": 4096,
+   "tie_word_embeddings": false,
+   "torch_dtype": "bfloat16",
+   "transformers_version": "4.42.4",
+   "use_cache": true,
+   "vocab_size": 32002
+ }
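Note that `num_hidden_layers` is 48 rather than the base model's 32, because the merge duplicates one layer out of every two. A sketch recomputing that count from the slice pattern in the YAML above:

```python
# Sketch: rebuild the slice layout from the YAML pattern and count how many
# layers the passthrough merge stacks together.
slices = []
for k in range(0, 32, 2):
    slices.append((k, k + 2))      # two original layers
    slices.append((k + 1, k + 2))  # duplicated second layer of the pair
print(sum(hi - lo for lo, hi in slices))  # 48
```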
mergekit_config.yml ADDED
@@ -0,0 +1,211 @@
+ slices:
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [0, 2]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [1, 2]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [2, 4]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [3, 4]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [4, 6]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [5, 6]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [6, 8]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [7, 8]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [8, 10]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [9, 10]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [10, 12]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [11, 12]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [12, 14]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [13, 14]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [14, 16]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [15, 16]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [16, 18]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [17, 18]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [18, 20]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [19, 20]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [20, 22]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [21, 22]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [22, 24]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [23, 24]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [24, 26]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [25, 26]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [26, 28]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [27, 28]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [28, 30]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [29, 30]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [30, 32]
+ - sources:
+   - model: openchat/openchat-3.5-0106
+     layer_range: [31, 32]
+     parameters:
+       scale:
+       - filter: o_proj
+         value: 0.0
+       - filter: down_proj
+         value: 0.0
+       - value: 1.0
+ merge_method: passthrough
+ dtype: bfloat16
model.safetensors.index.json ADDED
@@ -0,0 +1 @@
+ {"metadata": {"mergekit_version": "0.0.4.4", "total_size": 21463080960}, "weight_map": {"lm_head.weight": "model-00001-of-00005.safetensors", "model.embed_tokens.weight": "model-00001-of-00005.safetensors", "model.layers.0.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.0.mlp.down_proj.weight": "model-00001-of-00005.safetensors", "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00005.safetensors", "model.layers.0.mlp.up_proj.weight": "model-00001-of-00005.safetensors", "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00005.safetensors", "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00005.safetensors", "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00005.safetensors", "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00005.safetensors", "model.layers.2.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.1.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.2.mlp.down_proj.weight": "model-00001-of-00005.safetensors", "model.layers.1.mlp.down_proj.weight": "model-00001-of-00005.safetensors", "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00005.safetensors", "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00005.safetensors", "model.layers.2.mlp.up_proj.weight": "model-00001-of-00005.safetensors", "model.layers.1.mlp.up_proj.weight": "model-00001-of-00005.safetensors", "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00005.safetensors", "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00005.safetensors", "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00005.safetensors", "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00005.safetensors", "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00005.safetensors", "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00005.safetensors", "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00005.safetensors", "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00005.safetensors", "model.layers.15.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.15.mlp.down_proj.weight": "model-00001-of-00005.safetensors", "model.layers.15.mlp.gate_proj.weight": "model-00001-of-00005.safetensors", "model.layers.15.mlp.up_proj.weight": "model-00001-of-00005.safetensors", "model.layers.15.post_attention_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.15.self_attn.k_proj.weight": "model-00001-of-00005.safetensors", "model.layers.15.self_attn.o_proj.weight": "model-00001-of-00005.safetensors", "model.layers.15.self_attn.q_proj.weight": "model-00001-of-00005.safetensors", "model.layers.15.self_attn.v_proj.weight": "model-00001-of-00005.safetensors", "model.layers.17.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.16.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.17.mlp.down_proj.weight": "model-00001-of-00005.safetensors", "model.layers.16.mlp.down_proj.weight": "model-00001-of-00005.safetensors", "model.layers.17.mlp.gate_proj.weight": "model-00001-of-00005.safetensors", "model.layers.16.mlp.gate_proj.weight": "model-00001-of-00005.safetensors", "model.layers.17.mlp.up_proj.weight": "model-00001-of-00005.safetensors", 
"model.layers.16.mlp.up_proj.weight": "model-00001-of-00005.safetensors", "model.layers.17.post_attention_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.16.post_attention_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.17.self_attn.k_proj.weight": "model-00001-of-00005.safetensors", "model.layers.16.self_attn.k_proj.weight": "model-00001-of-00005.safetensors", "model.layers.17.self_attn.o_proj.weight": "model-00001-of-00005.safetensors", "model.layers.16.self_attn.o_proj.weight": "model-00001-of-00005.safetensors", "model.layers.17.self_attn.q_proj.weight": "model-00001-of-00005.safetensors", "model.layers.16.self_attn.q_proj.weight": "model-00001-of-00005.safetensors", "model.layers.17.self_attn.v_proj.weight": "model-00001-of-00005.safetensors", "model.layers.16.self_attn.v_proj.weight": "model-00001-of-00005.safetensors", "model.layers.18.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.18.mlp.down_proj.weight": "model-00001-of-00005.safetensors", "model.layers.18.mlp.gate_proj.weight": "model-00001-of-00005.safetensors", "model.layers.18.mlp.up_proj.weight": "model-00001-of-00005.safetensors", "model.layers.18.post_attention_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.18.self_attn.k_proj.weight": "model-00001-of-00005.safetensors", "model.layers.18.self_attn.o_proj.weight": "model-00001-of-00005.safetensors", "model.layers.18.self_attn.q_proj.weight": "model-00001-of-00005.safetensors", "model.layers.18.self_attn.v_proj.weight": "model-00001-of-00005.safetensors", "model.layers.20.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.19.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.20.mlp.down_proj.weight": "model-00001-of-00005.safetensors", "model.layers.19.mlp.down_proj.weight": "model-00001-of-00005.safetensors", "model.layers.20.mlp.gate_proj.weight": "model-00001-of-00005.safetensors", "model.layers.19.mlp.gate_proj.weight": "model-00001-of-00005.safetensors", "model.layers.20.mlp.up_proj.weight": "model-00001-of-00005.safetensors", "model.layers.19.mlp.up_proj.weight": "model-00001-of-00005.safetensors", "model.layers.20.post_attention_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.19.post_attention_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.20.self_attn.k_proj.weight": "model-00001-of-00005.safetensors", "model.layers.19.self_attn.k_proj.weight": "model-00001-of-00005.safetensors", "model.layers.20.self_attn.o_proj.weight": "model-00001-of-00005.safetensors", "model.layers.19.self_attn.o_proj.weight": "model-00001-of-00005.safetensors", "model.layers.20.self_attn.q_proj.weight": "model-00001-of-00005.safetensors", "model.layers.19.self_attn.q_proj.weight": "model-00001-of-00005.safetensors", "model.layers.20.self_attn.v_proj.weight": "model-00001-of-00005.safetensors", "model.layers.19.self_attn.v_proj.weight": "model-00001-of-00005.safetensors", "model.layers.21.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.21.mlp.down_proj.weight": "model-00001-of-00005.safetensors", "model.layers.21.mlp.gate_proj.weight": "model-00001-of-00005.safetensors", "model.layers.21.mlp.up_proj.weight": "model-00001-of-00005.safetensors", "model.layers.21.post_attention_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.21.self_attn.k_proj.weight": "model-00001-of-00005.safetensors", "model.layers.21.self_attn.o_proj.weight": "model-00001-of-00005.safetensors", 
"model.layers.21.self_attn.q_proj.weight": "model-00001-of-00005.safetensors", "model.layers.21.self_attn.v_proj.weight": "model-00001-of-00005.safetensors", "model.layers.23.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.22.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.23.mlp.down_proj.weight": "model-00002-of-00005.safetensors", "model.layers.22.mlp.down_proj.weight": "model-00002-of-00005.safetensors", "model.layers.23.mlp.gate_proj.weight": "model-00002-of-00005.safetensors", "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00005.safetensors", "model.layers.23.mlp.up_proj.weight": "model-00002-of-00005.safetensors", "model.layers.22.mlp.up_proj.weight": "model-00002-of-00005.safetensors", "model.layers.23.post_attention_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.22.post_attention_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.23.self_attn.k_proj.weight": "model-00002-of-00005.safetensors", "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00005.safetensors", "model.layers.23.self_attn.o_proj.weight": "model-00002-of-00005.safetensors", "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00005.safetensors", "model.layers.23.self_attn.q_proj.weight": "model-00002-of-00005.safetensors", "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00005.safetensors", "model.layers.23.self_attn.v_proj.weight": "model-00002-of-00005.safetensors", "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00005.safetensors", "model.layers.24.input_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.24.mlp.down_proj.weight": "model-00002-of-00005.safetensors", "model.layers.24.mlp.gate_proj.weight": "model-00002-of-00005.safetensors", "model.layers.24.mlp.up_proj.weight": "model-00002-of-00005.safetensors", "model.layers.24.post_attention_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.24.self_attn.k_proj.weight": "model-00002-of-00005.safetensors", "model.layers.24.self_attn.o_proj.weight": "model-00002-of-00005.safetensors", "model.layers.24.self_attn.q_proj.weight": "model-00002-of-00005.safetensors", "model.layers.24.self_attn.v_proj.weight": "model-00002-of-00005.safetensors", "model.layers.26.input_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.25.input_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.26.mlp.down_proj.weight": "model-00002-of-00005.safetensors", "model.layers.25.mlp.down_proj.weight": "model-00002-of-00005.safetensors", "model.layers.26.mlp.gate_proj.weight": "model-00002-of-00005.safetensors", "model.layers.25.mlp.gate_proj.weight": "model-00002-of-00005.safetensors", "model.layers.26.mlp.up_proj.weight": "model-00002-of-00005.safetensors", "model.layers.25.mlp.up_proj.weight": "model-00002-of-00005.safetensors", "model.layers.26.post_attention_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.25.post_attention_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.26.self_attn.k_proj.weight": "model-00002-of-00005.safetensors", "model.layers.25.self_attn.k_proj.weight": "model-00002-of-00005.safetensors", "model.layers.26.self_attn.o_proj.weight": "model-00002-of-00005.safetensors", "model.layers.25.self_attn.o_proj.weight": "model-00002-of-00005.safetensors", "model.layers.26.self_attn.q_proj.weight": "model-00002-of-00005.safetensors", "model.layers.25.self_attn.q_proj.weight": "model-00002-of-00005.safetensors", 
"model.layers.26.self_attn.v_proj.weight": "model-00002-of-00005.safetensors", "model.layers.25.self_attn.v_proj.weight": "model-00002-of-00005.safetensors", "model.layers.27.input_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.27.mlp.down_proj.weight": "model-00002-of-00005.safetensors", "model.layers.27.mlp.gate_proj.weight": "model-00002-of-00005.safetensors", "model.layers.27.mlp.up_proj.weight": "model-00002-of-00005.safetensors", "model.layers.27.post_attention_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.27.self_attn.k_proj.weight": "model-00002-of-00005.safetensors", "model.layers.27.self_attn.o_proj.weight": "model-00002-of-00005.safetensors", "model.layers.27.self_attn.q_proj.weight": "model-00002-of-00005.safetensors", "model.layers.27.self_attn.v_proj.weight": "model-00002-of-00005.safetensors", "model.layers.29.input_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.28.input_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.29.mlp.down_proj.weight": "model-00002-of-00005.safetensors", "model.layers.28.mlp.down_proj.weight": "model-00002-of-00005.safetensors", "model.layers.29.mlp.gate_proj.weight": "model-00002-of-00005.safetensors", "model.layers.28.mlp.gate_proj.weight": "model-00002-of-00005.safetensors", "model.layers.29.mlp.up_proj.weight": "model-00002-of-00005.safetensors", "model.layers.28.mlp.up_proj.weight": "model-00002-of-00005.safetensors", "model.layers.29.post_attention_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.28.post_attention_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.29.self_attn.k_proj.weight": "model-00002-of-00005.safetensors", "model.layers.28.self_attn.k_proj.weight": "model-00002-of-00005.safetensors", "model.layers.29.self_attn.o_proj.weight": "model-00002-of-00005.safetensors", "model.layers.28.self_attn.o_proj.weight": "model-00002-of-00005.safetensors", "model.layers.29.self_attn.q_proj.weight": "model-00002-of-00005.safetensors", "model.layers.28.self_attn.q_proj.weight": "model-00002-of-00005.safetensors", "model.layers.29.self_attn.v_proj.weight": "model-00002-of-00005.safetensors", "model.layers.28.self_attn.v_proj.weight": "model-00002-of-00005.safetensors", "model.layers.3.input_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.3.mlp.down_proj.weight": "model-00002-of-00005.safetensors", "model.layers.3.mlp.gate_proj.weight": "model-00002-of-00005.safetensors", "model.layers.3.mlp.up_proj.weight": "model-00002-of-00005.safetensors", "model.layers.3.post_attention_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.3.self_attn.k_proj.weight": "model-00002-of-00005.safetensors", "model.layers.3.self_attn.o_proj.weight": "model-00002-of-00005.safetensors", "model.layers.3.self_attn.q_proj.weight": "model-00002-of-00005.safetensors", "model.layers.3.self_attn.v_proj.weight": "model-00002-of-00005.safetensors", "model.layers.30.input_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.30.mlp.down_proj.weight": "model-00002-of-00005.safetensors", "model.layers.30.mlp.gate_proj.weight": "model-00002-of-00005.safetensors", "model.layers.30.mlp.up_proj.weight": "model-00002-of-00005.safetensors", "model.layers.30.post_attention_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.30.self_attn.k_proj.weight": "model-00002-of-00005.safetensors", "model.layers.30.self_attn.o_proj.weight": "model-00002-of-00005.safetensors", 
"model.layers.30.self_attn.q_proj.weight": "model-00002-of-00005.safetensors", "model.layers.30.self_attn.v_proj.weight": "model-00002-of-00005.safetensors", "model.layers.32.input_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.31.input_layernorm.weight": "model-00002-of-00005.safetensors", "model.layers.32.mlp.down_proj.weight": "model-00002-of-00005.safetensors", "model.layers.31.mlp.down_proj.weight": "model-00002-of-00005.safetensors", "model.layers.32.mlp.gate_proj.weight": "model-00002-of-00005.safetensors", "model.layers.31.mlp.gate_proj.weight": "model-00002-of-00005.safetensors", "model.layers.32.mlp.up_proj.weight": "model-00002-of-00005.safetensors", "model.layers.31.mlp.up_proj.weight": "model-00003-of-00005.safetensors", "model.layers.32.post_attention_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.31.post_attention_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.32.self_attn.k_proj.weight": "model-00003-of-00005.safetensors", "model.layers.31.self_attn.k_proj.weight": "model-00003-of-00005.safetensors", "model.layers.32.self_attn.o_proj.weight": "model-00003-of-00005.safetensors", "model.layers.31.self_attn.o_proj.weight": "model-00003-of-00005.safetensors", "model.layers.32.self_attn.q_proj.weight": "model-00003-of-00005.safetensors", "model.layers.31.self_attn.q_proj.weight": "model-00003-of-00005.safetensors", "model.layers.32.self_attn.v_proj.weight": "model-00003-of-00005.safetensors", "model.layers.31.self_attn.v_proj.weight": "model-00003-of-00005.safetensors", "model.layers.33.input_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.33.mlp.down_proj.weight": "model-00003-of-00005.safetensors", "model.layers.33.mlp.gate_proj.weight": "model-00003-of-00005.safetensors", "model.layers.33.mlp.up_proj.weight": "model-00003-of-00005.safetensors", "model.layers.33.post_attention_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.33.self_attn.k_proj.weight": "model-00003-of-00005.safetensors", "model.layers.33.self_attn.o_proj.weight": "model-00003-of-00005.safetensors", "model.layers.33.self_attn.q_proj.weight": "model-00003-of-00005.safetensors", "model.layers.33.self_attn.v_proj.weight": "model-00003-of-00005.safetensors", "model.layers.35.input_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.34.input_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.35.mlp.down_proj.weight": "model-00003-of-00005.safetensors", "model.layers.34.mlp.down_proj.weight": "model-00003-of-00005.safetensors", "model.layers.35.mlp.gate_proj.weight": "model-00003-of-00005.safetensors", "model.layers.34.mlp.gate_proj.weight": "model-00003-of-00005.safetensors", "model.layers.35.mlp.up_proj.weight": "model-00003-of-00005.safetensors", "model.layers.34.mlp.up_proj.weight": "model-00003-of-00005.safetensors", "model.layers.35.post_attention_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.34.post_attention_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.35.self_attn.k_proj.weight": "model-00003-of-00005.safetensors", "model.layers.34.self_attn.k_proj.weight": "model-00003-of-00005.safetensors", "model.layers.35.self_attn.o_proj.weight": "model-00003-of-00005.safetensors", "model.layers.34.self_attn.o_proj.weight": "model-00003-of-00005.safetensors", "model.layers.35.self_attn.q_proj.weight": "model-00003-of-00005.safetensors", "model.layers.34.self_attn.q_proj.weight": "model-00003-of-00005.safetensors", 
"model.layers.35.self_attn.v_proj.weight": "model-00003-of-00005.safetensors", "model.layers.34.self_attn.v_proj.weight": "model-00003-of-00005.safetensors", "model.layers.36.input_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.36.mlp.down_proj.weight": "model-00003-of-00005.safetensors", "model.layers.36.mlp.gate_proj.weight": "model-00003-of-00005.safetensors", "model.layers.36.mlp.up_proj.weight": "model-00003-of-00005.safetensors", "model.layers.36.post_attention_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.36.self_attn.k_proj.weight": "model-00003-of-00005.safetensors", "model.layers.36.self_attn.o_proj.weight": "model-00003-of-00005.safetensors", "model.layers.36.self_attn.q_proj.weight": "model-00003-of-00005.safetensors", "model.layers.36.self_attn.v_proj.weight": "model-00003-of-00005.safetensors", "model.layers.38.input_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.37.input_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.38.mlp.down_proj.weight": "model-00003-of-00005.safetensors", "model.layers.37.mlp.down_proj.weight": "model-00003-of-00005.safetensors", "model.layers.38.mlp.gate_proj.weight": "model-00003-of-00005.safetensors", "model.layers.37.mlp.gate_proj.weight": "model-00003-of-00005.safetensors", "model.layers.38.mlp.up_proj.weight": "model-00003-of-00005.safetensors", "model.layers.37.mlp.up_proj.weight": "model-00003-of-00005.safetensors", "model.layers.38.post_attention_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.37.post_attention_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.38.self_attn.k_proj.weight": "model-00003-of-00005.safetensors", "model.layers.37.self_attn.k_proj.weight": "model-00003-of-00005.safetensors", "model.layers.38.self_attn.o_proj.weight": "model-00003-of-00005.safetensors", "model.layers.37.self_attn.o_proj.weight": "model-00003-of-00005.safetensors", "model.layers.38.self_attn.q_proj.weight": "model-00003-of-00005.safetensors", "model.layers.37.self_attn.q_proj.weight": "model-00003-of-00005.safetensors", "model.layers.38.self_attn.v_proj.weight": "model-00003-of-00005.safetensors", "model.layers.37.self_attn.v_proj.weight": "model-00003-of-00005.safetensors", "model.layers.39.input_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.39.mlp.down_proj.weight": "model-00003-of-00005.safetensors", "model.layers.39.mlp.gate_proj.weight": "model-00003-of-00005.safetensors", "model.layers.39.mlp.up_proj.weight": "model-00003-of-00005.safetensors", "model.layers.39.post_attention_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.39.self_attn.k_proj.weight": "model-00003-of-00005.safetensors", "model.layers.39.self_attn.o_proj.weight": "model-00003-of-00005.safetensors", "model.layers.39.self_attn.q_proj.weight": "model-00003-of-00005.safetensors", "model.layers.39.self_attn.v_proj.weight": "model-00003-of-00005.safetensors", "model.layers.41.input_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.40.input_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.41.mlp.down_proj.weight": "model-00003-of-00005.safetensors", "model.layers.40.mlp.down_proj.weight": "model-00003-of-00005.safetensors", "model.layers.41.mlp.gate_proj.weight": "model-00003-of-00005.safetensors", "model.layers.40.mlp.gate_proj.weight": "model-00003-of-00005.safetensors", "model.layers.41.mlp.up_proj.weight": "model-00003-of-00005.safetensors", 
"model.layers.40.mlp.up_proj.weight": "model-00003-of-00005.safetensors", "model.layers.41.post_attention_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.40.post_attention_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.41.self_attn.k_proj.weight": "model-00003-of-00005.safetensors", "model.layers.40.self_attn.k_proj.weight": "model-00003-of-00005.safetensors", "model.layers.41.self_attn.o_proj.weight": "model-00003-of-00005.safetensors", "model.layers.40.self_attn.o_proj.weight": "model-00003-of-00005.safetensors", "model.layers.41.self_attn.q_proj.weight": "model-00003-of-00005.safetensors", "model.layers.40.self_attn.q_proj.weight": "model-00003-of-00005.safetensors", "model.layers.41.self_attn.v_proj.weight": "model-00003-of-00005.safetensors", "model.layers.40.self_attn.v_proj.weight": "model-00003-of-00005.safetensors", "model.layers.42.input_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.42.mlp.down_proj.weight": "model-00003-of-00005.safetensors", "model.layers.42.mlp.gate_proj.weight": "model-00003-of-00005.safetensors", "model.layers.42.mlp.up_proj.weight": "model-00003-of-00005.safetensors", "model.layers.42.post_attention_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.42.self_attn.k_proj.weight": "model-00003-of-00005.safetensors", "model.layers.42.self_attn.o_proj.weight": "model-00003-of-00005.safetensors", "model.layers.42.self_attn.q_proj.weight": "model-00003-of-00005.safetensors", "model.layers.42.self_attn.v_proj.weight": "model-00003-of-00005.safetensors", "model.layers.44.input_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.43.input_layernorm.weight": "model-00003-of-00005.safetensors", "model.layers.44.mlp.down_proj.weight": "model-00003-of-00005.safetensors", "model.layers.43.mlp.down_proj.weight": "model-00003-of-00005.safetensors", "model.layers.44.mlp.gate_proj.weight": "model-00003-of-00005.safetensors", "model.layers.43.mlp.gate_proj.weight": "model-00004-of-00005.safetensors", "model.layers.44.mlp.up_proj.weight": "model-00004-of-00005.safetensors", "model.layers.43.mlp.up_proj.weight": "model-00004-of-00005.safetensors", "model.layers.44.post_attention_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.43.post_attention_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.44.self_attn.k_proj.weight": "model-00004-of-00005.safetensors", "model.layers.43.self_attn.k_proj.weight": "model-00004-of-00005.safetensors", "model.layers.44.self_attn.o_proj.weight": "model-00004-of-00005.safetensors", "model.layers.43.self_attn.o_proj.weight": "model-00004-of-00005.safetensors", "model.layers.44.self_attn.q_proj.weight": "model-00004-of-00005.safetensors", "model.layers.43.self_attn.q_proj.weight": "model-00004-of-00005.safetensors", "model.layers.44.self_attn.v_proj.weight": "model-00004-of-00005.safetensors", "model.layers.43.self_attn.v_proj.weight": "model-00004-of-00005.safetensors", "model.layers.5.input_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.4.input_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.5.mlp.down_proj.weight": "model-00004-of-00005.safetensors", "model.layers.4.mlp.down_proj.weight": "model-00004-of-00005.safetensors", "model.layers.5.mlp.gate_proj.weight": "model-00004-of-00005.safetensors", "model.layers.4.mlp.gate_proj.weight": "model-00004-of-00005.safetensors", "model.layers.5.mlp.up_proj.weight": "model-00004-of-00005.safetensors", 
"model.layers.4.mlp.up_proj.weight": "model-00004-of-00005.safetensors", "model.layers.5.post_attention_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.4.post_attention_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.5.self_attn.k_proj.weight": "model-00004-of-00005.safetensors", "model.layers.4.self_attn.k_proj.weight": "model-00004-of-00005.safetensors", "model.layers.5.self_attn.o_proj.weight": "model-00004-of-00005.safetensors", "model.layers.4.self_attn.o_proj.weight": "model-00004-of-00005.safetensors", "model.layers.5.self_attn.q_proj.weight": "model-00004-of-00005.safetensors", "model.layers.4.self_attn.q_proj.weight": "model-00004-of-00005.safetensors", "model.layers.5.self_attn.v_proj.weight": "model-00004-of-00005.safetensors", "model.layers.4.self_attn.v_proj.weight": "model-00004-of-00005.safetensors", "model.layers.45.input_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.45.mlp.down_proj.weight": "model-00004-of-00005.safetensors", "model.layers.45.mlp.gate_proj.weight": "model-00004-of-00005.safetensors", "model.layers.45.mlp.up_proj.weight": "model-00004-of-00005.safetensors", "model.layers.45.post_attention_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.45.self_attn.k_proj.weight": "model-00004-of-00005.safetensors", "model.layers.45.self_attn.o_proj.weight": "model-00004-of-00005.safetensors", "model.layers.45.self_attn.q_proj.weight": "model-00004-of-00005.safetensors", "model.layers.45.self_attn.v_proj.weight": "model-00004-of-00005.safetensors", "model.layers.47.input_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.46.input_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.47.mlp.down_proj.weight": "model-00004-of-00005.safetensors", "model.layers.46.mlp.down_proj.weight": "model-00004-of-00005.safetensors", "model.layers.47.mlp.gate_proj.weight": "model-00004-of-00005.safetensors", "model.layers.46.mlp.gate_proj.weight": "model-00004-of-00005.safetensors", "model.layers.47.mlp.up_proj.weight": "model-00004-of-00005.safetensors", "model.layers.46.mlp.up_proj.weight": "model-00004-of-00005.safetensors", "model.layers.47.post_attention_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.46.post_attention_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.47.self_attn.k_proj.weight": "model-00004-of-00005.safetensors", "model.layers.46.self_attn.k_proj.weight": "model-00004-of-00005.safetensors", "model.layers.47.self_attn.o_proj.weight": "model-00004-of-00005.safetensors", "model.layers.46.self_attn.o_proj.weight": "model-00004-of-00005.safetensors", "model.layers.47.self_attn.q_proj.weight": "model-00004-of-00005.safetensors", "model.layers.46.self_attn.q_proj.weight": "model-00004-of-00005.safetensors", "model.layers.47.self_attn.v_proj.weight": "model-00004-of-00005.safetensors", "model.layers.46.self_attn.v_proj.weight": "model-00004-of-00005.safetensors", "model.layers.6.input_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.6.mlp.down_proj.weight": "model-00004-of-00005.safetensors", "model.layers.6.mlp.gate_proj.weight": "model-00004-of-00005.safetensors", "model.layers.6.mlp.up_proj.weight": "model-00004-of-00005.safetensors", "model.layers.6.post_attention_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.6.self_attn.k_proj.weight": "model-00004-of-00005.safetensors", "model.layers.6.self_attn.o_proj.weight": "model-00004-of-00005.safetensors", 
"model.layers.6.self_attn.q_proj.weight": "model-00004-of-00005.safetensors", "model.layers.6.self_attn.v_proj.weight": "model-00004-of-00005.safetensors", "model.layers.8.input_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.7.input_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.8.mlp.down_proj.weight": "model-00004-of-00005.safetensors", "model.layers.7.mlp.down_proj.weight": "model-00004-of-00005.safetensors", "model.layers.8.mlp.gate_proj.weight": "model-00004-of-00005.safetensors", "model.layers.7.mlp.gate_proj.weight": "model-00004-of-00005.safetensors", "model.layers.8.mlp.up_proj.weight": "model-00004-of-00005.safetensors", "model.layers.7.mlp.up_proj.weight": "model-00004-of-00005.safetensors", "model.layers.8.post_attention_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.7.post_attention_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.8.self_attn.k_proj.weight": "model-00004-of-00005.safetensors", "model.layers.7.self_attn.k_proj.weight": "model-00004-of-00005.safetensors", "model.layers.8.self_attn.o_proj.weight": "model-00004-of-00005.safetensors", "model.layers.7.self_attn.o_proj.weight": "model-00004-of-00005.safetensors", "model.layers.8.self_attn.q_proj.weight": "model-00004-of-00005.safetensors", "model.layers.7.self_attn.q_proj.weight": "model-00004-of-00005.safetensors", "model.layers.8.self_attn.v_proj.weight": "model-00004-of-00005.safetensors", "model.layers.7.self_attn.v_proj.weight": "model-00004-of-00005.safetensors", "model.layers.9.input_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.9.mlp.down_proj.weight": "model-00004-of-00005.safetensors", "model.layers.9.mlp.gate_proj.weight": "model-00004-of-00005.safetensors", "model.layers.9.mlp.up_proj.weight": "model-00004-of-00005.safetensors", "model.layers.9.post_attention_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.9.self_attn.k_proj.weight": "model-00004-of-00005.safetensors", "model.layers.9.self_attn.o_proj.weight": "model-00004-of-00005.safetensors", "model.layers.9.self_attn.q_proj.weight": "model-00004-of-00005.safetensors", "model.layers.9.self_attn.v_proj.weight": "model-00004-of-00005.safetensors", "model.layers.11.input_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.10.input_layernorm.weight": "model-00004-of-00005.safetensors", "model.layers.11.mlp.down_proj.weight": "model-00004-of-00005.safetensors", "model.layers.10.mlp.down_proj.weight": "model-00004-of-00005.safetensors", "model.layers.11.mlp.gate_proj.weight": "model-00004-of-00005.safetensors", "model.layers.10.mlp.gate_proj.weight": "model-00004-of-00005.safetensors", "model.layers.11.mlp.up_proj.weight": "model-00005-of-00005.safetensors", "model.layers.10.mlp.up_proj.weight": "model-00005-of-00005.safetensors", "model.layers.11.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.10.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.11.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.10.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.11.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.10.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.11.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.10.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.11.self_attn.v_proj.weight": 
"model-00005-of-00005.safetensors", "model.layers.10.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.12.input_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.12.mlp.down_proj.weight": "model-00005-of-00005.safetensors", "model.layers.12.mlp.gate_proj.weight": "model-00005-of-00005.safetensors", "model.layers.12.mlp.up_proj.weight": "model-00005-of-00005.safetensors", "model.layers.12.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.12.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.12.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.12.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.12.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.14.input_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.13.input_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.14.mlp.down_proj.weight": "model-00005-of-00005.safetensors", "model.layers.13.mlp.down_proj.weight": "model-00005-of-00005.safetensors", "model.layers.14.mlp.gate_proj.weight": "model-00005-of-00005.safetensors", "model.layers.13.mlp.gate_proj.weight": "model-00005-of-00005.safetensors", "model.layers.14.mlp.up_proj.weight": "model-00005-of-00005.safetensors", "model.layers.13.mlp.up_proj.weight": "model-00005-of-00005.safetensors", "model.layers.14.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.13.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.14.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.13.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.14.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.13.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.14.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.13.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.14.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.13.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.norm.weight": "model-00005-of-00005.safetensors"}}
special_tokens_map.json ADDED
@@ -0,0 +1,27 @@
+ {
+   "additional_special_tokens": [
+     "<|end_of_turn|>",
+     "<|pad_0|>"
+   ],
+   "bos_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "<|end_of_turn|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "<unk>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dadfd56d766715c61d2ef780a525ab43b8e6da4de6865bda3d95fdef5e134055
+ size 493443
tokenizer_config.json ADDED
@@ -0,0 +1,63 @@
+ {
+   "add_bos_token": true,
+   "add_eos_token": false,
+   "add_prefix_space": null,
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32000": {
+       "content": "<|end_of_turn|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "32001": {
+       "content": "<|pad_0|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "additional_special_tokens": [
+     "<|end_of_turn|>",
+     "<|pad_0|>"
+   ],
+   "bos_token": "<s>",
+   "chat_template": "{{ bos_token }}{% for message in messages %}{{ 'GPT4 Correct ' + message['role'].title() + ': ' + message['content'] + '<|end_of_turn|>'}}{% endfor %}{% if add_generation_prompt %}{{ 'GPT4 Correct Assistant:' }}{% endif %}",
+   "clean_up_tokenization_spaces": false,
+   "eos_token": "<|end_of_turn|>",
+   "legacy": true,
+   "model_max_length": 1000000000000000019884624838656,
+   "pad_token": null,
+   "sp_model_kwargs": {},
+   "spaces_between_special_tokens": false,
+   "tokenizer_class": "LlamaTokenizer",
+   "unk_token": "<unk>",
+   "use_default_system_prompt": true
+ }
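The `chat_template` field encodes OpenChat's "GPT4 Correct" conversation format, so the merged model is prompted exactly like the base model. A rendering sketch (placeholder path):

```python
# Sketch: render a prompt with the chat_template above.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./merged-model")  # placeholder
messages = [{"role": "user", "content": "Hello!"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
# <s>GPT4 Correct User: Hello!<|end_of_turn|>GPT4 Correct Assistant:
```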