bartowski committed
Commit c9e4c57
1 Parent(s): 81541e1

Add 6.5 link

Files changed (1):
README.md +2 -0
README.md CHANGED
@@ -25,6 +25,8 @@ Conversion was done using the default calibration dataset.
 Default arguments used except when the bits per weight is above 6.0, at that point the lm_head layer is quantized at 8 bits per weight instead of the default 6.
 
 Original model: https://huggingface.co/cognitivecomputations/dolphin-2.7-mixtral-8x7b
+
+<a href="https://huggingface.co/bartowski/dolphin-2.7-mixtral-8x7b-exl2/tree/6_5">6.5 bits per weight</a>
 
 ## Download instructions
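
For context, a minimal sketch of how the linked 6_5 branch could be fetched with the huggingface_hub Python client; this is illustrative only and not part of the commit, and the local directory name is an arbitrary example:

```python
# Sketch: download the 6_5 exl2 branch via huggingface_hub (not part of this commit).
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="bartowski/dolphin-2.7-mixtral-8x7b-exl2",
    revision="6_5",  # branch holding the 6.5 bits-per-weight quantization
    local_dir="dolphin-2.7-mixtral-8x7b-exl2-6_5",  # example target directory
)
```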