Update README.md

Requantization of a Q5_K_M quant of a trending 70b model without better quant/fp16 available.

Model has a theta of 1,000,000, and not 10,000, like Llama 2 models usually have.

So, no Alpha or Rope Base Frequency adjustment is needed up to its base 32k context, if it works as intended.

And if it does, no linear rope scaling is necessary either to reach the base 32k context.
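
To double-check that theta, the rope base frequency can be read straight from the GGUF metadata. A minimal sketch, assuming the `gguf` Python package that ships with llama.cpp; the model path is a placeholder:

```python
# Sketch: read the rope base frequency (theta) and context length from a GGUF file.
# Assumes the `gguf` Python package bundled with llama.cpp; the path is a placeholder.
from gguf import GGUFReader

reader = GGUFReader("miqu-1-70b.q2_K.gguf")

for key in ("llama.rope.freq_base", "llama.context_length"):
    field = reader.fields.get(key)
    if field is None:
        print(f"{key}: not present")
        continue
    # Scalar fields keep their value in the part indexed by field.data[0].
    value = field.parts[field.data[0]][0]
    print(f"{key}: {value}")

# Expected here: llama.rope.freq_base around 1000000.0 and llama.context_length = 32768,
# so no rope-freq-base / rope-freq-scale override should be needed up to 32k.
```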

Benchmarks I made with the original Q2_K quant published by Miqudev:

- miqu-1-70b.q2_K.gguf,-,Hellaswag,87.75,,400,2024-01-29 01:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.q2_K.gguf,-,Hellaswag,86.5,,1000,2024-01-29 01:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.q2_K.gguf,-,Hellaswag,86,,2000,2024-01-29 01:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.q2_K.gguf,-,Hellaswag_Bin,81.5,,400,2024-01-29 01:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.q2_K.gguf,-,Hellaswag_Bin,83.7,,1000,2024-01-29 01:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.q2_K.gguf,-,Hellaswag_Bin,84,,2000,2024-01-29 01:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.q2_K.gguf,-,Arc-Challenge,56.18729097,,299,2024-01-29 05:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.q2_K.gguf,-,Arc-Easy,75.78947368,,570,2024-01-29 05:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.q2_K.gguf,-,MMLU,46.96485623,,313,2024-01-29 05:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.q2_K.gguf,-,Thruthful-QA,41.49326805,,817,2024-01-29 05:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.q2_K.gguf,-,Winogrande,78.2163,,1267,2024-01-29 05:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.q2_K.gguf,-,wikitext,4.3063,512,512,2024-01-29 01:40:00,RBF1000000,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,655
- miqu-1-70b.q2_K.gguf,-,wikitext,4.6476,512,512,2024-01-29 01:40:00,RBF1000000,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,81
- miqu-1-70b.q2_K.gguf,-,wikitext,4.6576,512,512,2024-01-29 01:40:00,RBF500000,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,81
- miqu-1-70b.q2_K.gguf,-,wikitext,4.7762,512,512,2024-01-29 01:40:00,RBF100000,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,81
- miqu-1-70b.q2_K.gguf,-,wikitext,4.8766,512,512,2024-01-29 01:40:00,RBF50000,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,81
- miqu-1-70b.q2_K.gguf,-,wikitext,5.3367,512,512,2024-01-29 01:40:00,RBF10000,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,81
- miqu-1-70b.q2_K.gguf,-,wikitext,3.8606,4096,4096,2024-01-29 01:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,41
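
Results like the Hellaswag and wikitext rows above (including the RBF rope-base sweep) are typically produced with llama.cpp's perplexity tool. A minimal sketch of how such runs can be driven, assuming the `perplexity` binary with its `--hellaswag` and `--rope-freq-base` options; binary and file paths are placeholders:

```python
# Sketch: drive llama.cpp's perplexity binary for a Hellaswag run and a
# rope-base-frequency sweep over wikitext, as in the Q2_K rows above.
# Binary path, model path, and data files are placeholders.
import subprocess

PERPLEXITY = "./perplexity"        # llama.cpp perplexity binary
MODEL = "miqu-1-70b.q2_K.gguf"

# Hellaswag on the first 400 tasks.
subprocess.run([PERPLEXITY, "-m", MODEL,
                "--hellaswag", "--hellaswag-tasks", "400",
                "-f", "hellaswag_val_full.txt"], check=True)

# Wikitext perplexity at 512-token context for several rope base frequencies
# (the RBF1000000 ... RBF10000 rows above).
for rbf in (1_000_000, 500_000, 100_000, 50_000, 10_000):
    subprocess.run([PERPLEXITY, "-m", MODEL, "-f", "wiki.test.raw",
                    "-c", "512", "--rope-freq-base", str(rbf)],
                   check=True)
```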

Benchmarks I made with the Q3_K_M I quantized from Miqudev's Q5_K_M through an intermediary Q8_0 step, with an iMatrix of 12800 tokens from wiki.train.raw:

- miqu-1-70b.Q3_K_M.gguf,-,Hellaswag,88.75,,400,2024-01-29 01:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.Q3_K_M.gguf,-,Hellaswag,88.1,,1000,2024-01-29 01:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.Q3_K_M.gguf,-,Hellaswag,87.3,,2000,2024-01-29 01:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.Q3_K_M.gguf,-,Hellaswag_Bin,82,,400,2024-01-29 01:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
- miqu-1-70b.Q3_K_M.gguf,-,Hellaswag_Bin,85.1,,1000,2024-01-29 01:40:00,,70b,Mistral_Medium,32768,,,GGUF,miqudev,miqudev,
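
The Q3_K_M above comes from inflating the Q5_K_M to Q8_0, computing an importance matrix on wiki.train.raw, and requantizing with it. A minimal sketch of such a pipeline, assuming llama.cpp's `quantize` and `imatrix` tools; paths are placeholders, the 25-chunk budget (25 × 512 = 12800 tokens) is an assumption, and which model the iMatrix was computed on is also an assumption:

```python
# Sketch: Q5_K_M -> Q8_0, importance matrix on wiki.train.raw, then Q8_0 -> Q3_K_M.
# Binaries, file names, and the chunk count are assumptions, not the exact commands used.
import subprocess

QUANTIZE = "./quantize"            # llama.cpp quantize binary
IMATRIX = "./imatrix"              # llama.cpp imatrix binary

# 1. Inflate the source Q5_K_M to Q8_0 (requantizing an existing quant needs the flag).
subprocess.run([QUANTIZE, "--allow-requantize",
                "miqu-1-70b.q5_K_M.gguf", "miqu-1-70b.q8_0.gguf", "Q8_0"],
               check=True)

# 2. Compute an importance matrix on ~12800 tokens of wiki.train.raw
#    (25 chunks of 512 tokens; the Q8_0 as the reference model is an assumption).
subprocess.run([IMATRIX, "-m", "miqu-1-70b.q8_0.gguf",
                "-f", "wiki.train.raw", "-o", "imatrix.dat",
                "--chunks", "25"], check=True)

# 3. Requantize to Q3_K_M using the iMatrix.
subprocess.run([QUANTIZE, "--allow-requantize", "--imatrix", "imatrix.dat",
                "miqu-1-70b.q8_0.gguf", "miqu-1-70b.Q3_K_M.gguf", "Q3_K_M"],
               check=True)
```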

TbC (to be continued).