wolfram committed
Commit f05e370
1 Parent(s): 78e9169

Update README.md


Thank you very much for the quants!

Files changed (1)
  1. README.md +5 -1
README.md CHANGED

@@ -17,7 +17,9 @@ tags:
 
  ![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/6303ca537373aacccd85d8a7/RFEW_K0ABp3k_N3j02Ki4.jpeg)
 
- - GGUF: [wolfram/miquliz-120b-GGUF](https://huggingface.co/wolfram/miquliz-120b-GGUF)
+ - EXL2: 2.4bpw | [2.65bpw](https://huggingface.co/LoneStriker/miquliz-120b-2.65bpw-h6-exl2) | [2.9bpw](https://huggingface.co/LoneStriker/miquliz-120b-2.9bpw-h6-exl2) | [4.0bpw](https://huggingface.co/LoneStriker/miquliz-120b-4.0bpw-h6-exl2)
+ - GGUF: [IQ3_XXS](https://huggingface.co/wolfram/miquliz-120b-GGUF) | [Q4_K_S+Q4_K_M](https://huggingface.co/NanoByte/miquliz-120b-Q4-GGUF)
+ - HF: [wolfram/miquliz-120b](https://huggingface.co/wolfram/miquliz-120b)
 
  This is a 120b frankenmerge created by interleaving layers of [miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) with [lzlv_70b_fp16_hf](https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf) using [mergekit](https://github.com/cg123/mergekit).
 
@@ -25,6 +27,8 @@ Inspired by [goliath-120b](https://huggingface.co/alpindale/goliath-120b).
 
  Thanks for the support, [CopilotKit](https://github.com/CopilotKit/CopilotKit) - the open-source platform for building in-app AI Copilots into any product, with any LLM model. Check out their GitHub.
 
+ Thanks for the EXL2 and GGUF quants, [Lone Striker](https://huggingface.co/LoneStriker) and [NanoByte](https://huggingface.co/NanoByte)!
+
  ## Prompt template: Mistral
 
  ```
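
For background on the "120b frankenmerge ... using mergekit" line in the diff above: mergekit builds such interleaved models from a declarative config using its passthrough merge method. The sketch below is a minimal, hypothetical illustration, not the actual miquliz-120b recipe; the layer ranges, slice pattern, file names, and output path are assumptions, and only the two parent model IDs come from the README.

```python
# Hypothetical sketch of an interleaved "passthrough" merge config for mergekit.
# Layer ranges and slice pattern are illustrative only, NOT the miquliz-120b recipe.
# Requires: pip install pyyaml mergekit
import subprocess
import yaml

config = {
    "merge_method": "passthrough",   # stacks the selected layer ranges back to back
    "dtype": "float16",
    "slices": [
        {"sources": [{"model": "152334H/miqu-1-70b-sf", "layer_range": [0, 20]}]},
        {"sources": [{"model": "lizpreciatior/lzlv_70b_fp16_hf", "layer_range": [10, 30]}]},
        {"sources": [{"model": "152334H/miqu-1-70b-sf", "layer_range": [20, 40]}]},
        # ... further alternating slices would continue up to the parents' last layers ...
    ],
}

# Write the config and run mergekit's CLI entry point (assumes mergekit is installed);
# the merged model is written to ./interleaved-merge-sketch.
with open("merge-config.yml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)

subprocess.run(["mergekit-yaml", "merge-config.yml", "./interleaved-merge-sketch"], check=True)
```

Passthrough merging simply concatenates the chosen layer ranges, which is why these interleaved merges end up with more layers (and parameters) than either 70b parent.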