BlackSamorez committed · Commit ea1c68f · 1 Parent(s): 9c3cec7
Update README.md
README.md CHANGED

@@ -6,6 +6,6 @@ Selected evaluation results for this model:
 
 | Model | AQLM scheme | WinoGrande | PiQA | HellaSwag | ArcE | ArcC | Model size, Gb | Hub link |
 |------|------|------|-------|-------|-------|------|------|------|
-| Mixtral-8x7B-Instruct-v0.1 (THIS)| 1x16 | 0.7593 |0.8043 | 0.6179 | 0.7768 | 0.4793 | 12.6 | [Link](https://huggingface.co/BlackSamorez/Mixtral-
+| Mixtral-8x7B-Instruct-v0.1 (THIS)| 1x16 | 0.7593 |0.8043 | 0.6179 | 0.7768 | 0.4793 | 12.6 | [Link](https://huggingface.co/BlackSamorez/Mixtral-8x7B-Instruct-v0_1-AQLM-2Bit-1x16-hf)|
 
 To learn more about the inference, as well as the information on how to quantize models yourself, please refer to the [official GitHub repo](https://github.com/Vahe1994/AQLM).
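For context on the Hub link added in this commit: an AQLM-quantized checkpoint like this one is typically loaded through the standard `transformers` API with the `aqlm` package installed. The snippet below is a minimal sketch under that assumption (it uses the model id from the link above and ordinary `from_pretrained`/`generate` arguments); it is not an official usage example from this repo, and quantization instructions live in the linked AQLM GitHub repository.

```python
# Minimal sketch: loading the 2-bit AQLM Mixtral checkpoint linked in the table.
# Assumes `pip install transformers accelerate aqlm[gpu]` and a CUDA-capable GPU.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BlackSamorez/Mixtral-8x7B-Instruct-v0_1-AQLM-2Bit-1x16-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the dtype stored in the checkpoint
    device_map="auto",    # place the ~12.6 GB of weights automatically
)

# Mixtral-Instruct expects the [INST] ... [/INST] chat format.
inputs = tokenizer("[INST] What is AQLM? [/INST]", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```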