
I have no idea what I’m doing… if this causes the apocalypse, someone please let me know.

DonutHole-8x7B 8.0bpw h8 EXL2

Includes a measurement.json file for further quantization

Original Model: https://huggingface.co/ycros/DonutHole-8x7B

Original Model Card

DonutHole-8x7B

GGUF versions here

Bagel, Mixtral Instruct, Holodeck, LimaRP.

What mysteries lie in the hole of a donut?

Works well with Alpaca prompt formats; the Mistral format also works. See usage details below.


This is similar to BagelMIsteryTour, but I've swapped out Sensualize for the new Holodeck. I'm not sure yet whether it's better, or how it does at higher (8k+) contexts.

Similar sampler advice applies as for BMT: min-P (0.07-0.3 to taste) -> temperature (either dynamic temperature 0-4ish, or a temperature of 3-4 with a smoothing factor of around 2.5). And yes, temperature comes last. It does okay without repetition penalty up to a point; it doesn't seem to get into a complete jam, but it can start to repeat sentences, so you'll probably need some. Around 1.02-1.05 at a 1024 range seems okay. (Repetition penalty sucks, but better things are coming.)
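To make the "temperature last" ordering concrete, here is a minimal numpy sketch of the suggested pipeline: min-P filtering first, then temperature scaling on whatever survives. The function name and exact thresholding are my own illustration, not tied to any particular inference backend.

```python
import numpy as np

def sample_filter(logits, min_p=0.1, temperature=3.0):
    """Toy sketch of the card's sampler order: min-P first, temperature last.
    Hypothetical helper, not any specific backend's implementation."""
    # Convert logits to probabilities (stable softmax)
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # min-P: discard tokens whose probability falls below min_p * p(top token)
    keep = probs >= min_p * probs.max()
    filtered = np.where(keep, logits, -np.inf)
    # Temperature is applied last, after filtering
    scaled = filtered / temperature
    out = np.exp(scaled - scaled[np.isfinite(scaled)].max())
    out /= out.sum()
    return out
```

Applying temperature after min-P means a high temperature (3-4, as suggested above) flattens only the tokens that survived the cutoff, rather than letting junk tokens back into the pool.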

I've mainly tested with LimaRP style Alpaca prompts (instruction/input/response), and briefly with Mistral's own format.
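For reference, a LimaRP-style Alpaca prompt uses the familiar instruction/input/response sections. A small illustrative helper (the function is hypothetical; the section headers are the standard Alpaca ones):

```python
def alpaca_prompt(instruction, input_text="", response=""):
    """Hypothetical helper building an Alpaca-format prompt
    (instruction/input/response), the format mainly tested here."""
    parts = ["### Instruction:\n" + instruction]
    if input_text:
        parts.append("### Input:\n" + input_text)
    # Leave the response section open for the model to complete
    parts.append("### Response:\n" + response)
    return "\n\n".join(parts)
```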

Full credit to all the model and dataset authors, I am but a derp with compute and a yaml file.


This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the DARE TIES merge method using mistralai/Mixtral-8x7B-v0.1 as a base.
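For intuition, DARE TIES builds a "task vector" (finetuned minus base) per model, randomly drops parameters and rescales the survivors (DARE, controlled by `density`), then keeps only deltas agreeing with the elected majority sign (TIES) before adding the weighted sum back onto the base. A toy numpy sketch of the idea, not mergekit's actual implementation:

```python
import numpy as np

def dare_ties_merge(base, finetuned_list, densities, weights, seed=0):
    """Illustrative DARE TIES sketch (toy, not mergekit's code):
    drop-and-rescale each task vector, elect a majority sign per
    parameter, then sum only the agreeing deltas onto the base."""
    rng = np.random.default_rng(seed)
    deltas = []
    for ft, density, weight in zip(finetuned_list, densities, weights):
        delta = ft - base                        # task vector
        mask = rng.random(base.shape) < density  # DARE: keep `density` fraction
        delta = np.where(mask, delta / density, 0.0)  # rescale survivors
        deltas.append(weight * delta)
    stacked = np.stack(deltas)
    # TIES sign election: majority sign of the summed deltas per parameter
    sign = np.sign(stacked.sum(axis=0))
    agree = np.sign(stacked) == sign
    merged_delta = np.where(agree, stacked, 0.0).sum(axis=0)
    return base + merged_delta
```

The `density` and `weight` values in the YAML below play exactly these two roles: how much of each task vector survives the drop, and how strongly it contributes to the final sum.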

Models Merged

The following models were included in the merge:

- mistralai/Mixtral-8x7B-v0.1 + Doctor-Shotgun/limarp-zloss-mixtral-8x7b-qlora
- KoboldAI/Mixtral-8x7B-Holodeck-v1
- mistralai/Mixtral-8x7B-Instruct-v0.1
- jondurbin/bagel-dpo-8x7b-v0.2

Configuration

The following YAML configuration was used to produce this model:

base_model: mistralai/Mixtral-8x7B-v0.1
models:
  - model: mistralai/Mixtral-8x7B-v0.1+Doctor-Shotgun/limarp-zloss-mixtral-8x7b-qlora
    parameters:
      density: 0.5
      weight: 0.2
  - model: KoboldAI/Mixtral-8x7B-Holodeck-v1
    parameters:
      density: 0.5
      weight: 0.2
  - model: mistralai/Mixtral-8x7B-Instruct-v0.1
    parameters:
      density: 0.6
      weight: 1.0
  - model: jondurbin/bagel-dpo-8x7b-v0.2
    parameters:
      density: 0.6
      weight: 0.5
merge_method: dare_ties
dtype: bfloat16
