---
base_model:
- unsloth/Mistral-Small-Instruct-2409
- unsloth/Mistral-Small-Instruct-2409
- rAIfle/Acolyte-LORA
library_name: transformers
tags:
- mergekit
- merge

---
# Acolyte-22B

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6569a4ed2419be6072890cf8/3dcGMcrWK2-2vQh9QBt3o.png)

A LoRA trained on an assortment of datasets on top of Mistral-Small-Instruct-2409, then SLERP-merged onto the base model at a weight of 0.5. Decent enough for its size.
Check the [LoRA](https://huggingface.co/rAIfle/Acolyte-LORA) repo for dataset info.
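
The merge recipe itself isn't included in this card. As a rough sketch, an equivalent [mergekit](https://github.com/arcee-ai/mergekit) SLERP config could look like the following; the `+` LoRA-adapter syntax, the 56-layer range, and the dtype are assumptions, not the actual recipe used:

```yaml
# Hypothetical mergekit config: SLERP between the plain base model
# and the base model with the LoRA applied, interpolated at t=0.5.
slices:
  - sources:
      - model: unsloth/Mistral-Small-Instruct-2409
        layer_range: [0, 56]  # assumed layer count for the 22B model
      - model: unsloth/Mistral-Small-Instruct-2409+rAIfle/Acolyte-LORA
        layer_range: [0, 56]
merge_method: slerp
base_model: unsloth/Mistral-Small-Instruct-2409
parameters:
  t: 0.5  # 0.0 = pure base, 1.0 = pure base+LoRA
dtype: bfloat16
```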

Use the `Mistral V2 & V3` prompt template.
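
For reference, that template wraps turns in Mistral's `[INST]` format, roughly like this (exact whitespace differs slightly between the V2 and V3 presets, so treat this as a sketch):

```
<s>[INST] {user message}[/INST] {assistant reply}</s>[INST] {next user message}[/INST]
```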

## Quants

- [iMat GGUFs](https://huggingface.co/Quant-Cartel/Acolyte-22B-iMat-GGUF)
- [exl2 longcals](https://huggingface.co/Quant-Cartel/Acolyte-22B-exl2-longcal)