---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/Tn9MBg6.png" alt="MidnightMiqu" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
# Midnight-Miqu-70B-v1.5 - EXL2 5.0bpw
This is a 5.0bpw EXL2 quant of [sophosympatheia/Midnight-Miqu-70B-v1.5](https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.5).

Details about the model and the merge can be found on the model page linked above.

I have not extensively tested this quant beyond confirming that it loads and can hold a chat.
## Quant Details

This is the script used for quantization.
```bash
#!/bin/bash

# Activate the conda environment
source ~/miniconda3/etc/profile.d/conda.sh
conda activate exllamav2

# Define variables
MODEL_DIR="models/Midnight-Miqu-70B-v1.5"
OUTPUT_DIR="exl2_midnightv15-70b"
MEASUREMENT_FILE="measurements/midnight70b-v15.json"
BIT_PRECISION=5.0
CONVERTED_FOLDER="models/Midnight-Miqu-70B-v1.5_exl2_5.0bpw"

# Create directories (-p avoids an error if they already exist)
mkdir -p "$OUTPUT_DIR"
mkdir -p "$CONVERTED_FOLDER"

# First pass: measure per-layer quantization error and save it to the measurement file
python convert.py -i "$MODEL_DIR" -o "$OUTPUT_DIR" -nr -om "$MEASUREMENT_FILE"
# Second pass: quantize to the target bitrate using the saved measurements
python convert.py -i "$MODEL_DIR" -o "$OUTPUT_DIR" -nr -m "$MEASUREMENT_FILE" -b "$BIT_PRECISION" -cf "$CONVERTED_FOLDER"
```
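For a rough sense of what 5.0bpw means for VRAM, a back-of-envelope estimate is parameter count times bits-per-weight divided by eight. The sketch below assumes a nominal 70B parameters and ignores real-world overhead (embeddings, the EXL2 bpw being a target average across layers, and KV cache), so treat the result as a lower bound rather than a measured size.

```bash
# Back-of-envelope weight size for a 70B model at 5.0 bits per weight.
# 70e9 params * 5.0 bits / 8 bits-per-byte, converted to GiB.
awk 'BEGIN { params = 70e9; bpw = 5.0; gib = params * bpw / 8 / 2^30; printf "%.1f GiB\n", gib }'
```

Actual on-disk and in-memory footprints will be somewhat different, but this is a quick way to sanity-check whether a given bpw target fits your hardware.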