Update README.md
README.md CHANGED
@@ -9,7 +9,7 @@ Miqu 1 70b : a leak of Mistral Medium Alpha. Credit for this model goes to the M
 
 Requantizations of a Q5_K_M quant of a trending 70b model without better quant/fp16 available, this through a Q8_0 intermediary step.
 
-Miqudev provided Q5_K_M, Q4_K_M, and Q2_K
+Miqudev provided Q5_K_M, Q4_K_M, and Q2_K on this page : https://huggingface.co/miqudev/miqu-1-70b
 
 Here, you will find :
 - Q3_K_M, Q3_K_S, Q3_K_XS, Q2_K_S, IQ3_XXS SOTA and IQ2_XS SOTA available.
@@ -41,7 +41,7 @@ So, CodeLlama 70b is nerfed like the other CodeLlama in general benchmarks terms
 
 ---
 
-Benchs I made with the original Q2_K quant of Miku 70b, made from
+Benchs I made with the original Q2_K quant of Miku 70b, most probably made from an initial FP16 and published by Miqudev :
 
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6451b24dc5d273f95482bfa4/wiDlIl1FMrVQo0fAcr3YO.png)
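For context on the "Q8_0 intermediary step" the README mentions, here is a minimal sketch of how such a requantization is typically done with llama.cpp's quantize tool. The binary name, file names, and flags below are assumptions about the usual workflow, not the commands actually used for this repo.

```python
import subprocess

# Sketch of a Q8_0-intermediary requantization with llama.cpp's quantize tool.
# Binary name and file paths are assumptions; adjust for your local setup.
QUANTIZE_BIN = "./quantize"            # named llama-quantize in newer llama.cpp builds
SOURCE = "miqu-1-70b.q5_K_M.gguf"      # best quant available upstream (no fp16 released)
INTERMEDIATE = "miqu-1-70b.q8_0.gguf"  # near-lossless intermediary
TARGETS = ["Q3_K_M", "Q3_K_S", "Q3_K_XS", "Q2_K_S", "IQ3_XXS", "IQ2_XS"]

def quantize(src: str, dst: str, qtype: str) -> None:
    # --allow-requantize is required because the source file is already quantized.
    cmd = [QUANTIZE_BIN, "--allow-requantize", src, dst, qtype]
    print(" ".join(cmd))
    subprocess.run(cmd, check=True)

# Step 1: up-convert the provided Q5_K_M to a Q8_0 intermediary.
quantize(SOURCE, INTERMEDIATE, "Q8_0")

# Step 2: derive the smaller quants from that intermediary.
# Note: the IQ types generally benefit from (and may require) an importance
# matrix, passed with --imatrix <file>; it is omitted here for brevity.
for qtype in TARGETS:
    quantize(INTERMEDIATE, f"miqu-1-70b.{qtype.lower()}.gguf", qtype)
```

Since no fp16 weights were released, the Q8_0 pass simply provides a near-lossless working copy of the Q5_K_M to requantize the smaller formats from.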