---
language:
- en
license: llama3.1
tags:
- moe
---
# Llama 3 ClaudeMaid v1.0 4x8B

Experimental RP-oriented MoE.
```yaml
base_model: NeverSleep_Lumimaid-v0.2-8B
gate_mode: random
dtype: bfloat16
experts_per_token: 2
experts:
  - source_model: aifeifei798_DarkIdol-Llama-3.1-8B-Instruct-1.0-Uncensored
  - source_model: Nitral-AI_Sekhmet_Bet-L3.1-8B-v0.2
  - source_model: NeverSleep_Lumimaid-v0.2-8B
  - source_model: Undi95_Meta-Llama-3.1-8B-Claude-bf16
```
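The YAML above looks like a standard `mergekit-moe` config (the folder names use underscores in place of the usual `org/model` separators, so they are presumably local checkouts). As a rough sketch, assuming mergekit is installed (`pip install mergekit`) and the expert folders exist locally, a merge like this one could be reproduced with a command along these lines; the config and output paths are placeholders, not taken from the card:

```bash
# Build the 4x8B MoE from the YAML config above (paths are illustrative)
mergekit-moe ./claudemaid-4x8b.yaml ./Llama-3-ClaudeMaid-v1.0-4x8B
```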
## Usage
It seems that koboldcpp 1.71 can't run GGUFs of Llama 3.1 MoE models yet, or perhaps I'm just dumb and messed something up. If you run into the same problem, run the model directly with llama.cpp; here's a simple open-source GUI (Windows) you can use if the console is your worst enemy.
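As a minimal sketch of the "run it directly with llama.cpp" route (none of this is from the original card): assuming you've downloaded a GGUF quant of the model, it can be served with llama.cpp's `llama-server` binary. The file name, context size, and GPU layer count below are placeholders.

```bash
# Serve a GGUF quant with llama.cpp directly (placeholder file name and settings)
# -c sets the context window, -ngl offloads that many layers to the GPU
./llama-server -m ./Llama-3-ClaudeMaid-v1.0-4x8B.Q4_K_M.gguf -c 8192 -ngl 99 --port 8080
```

The server exposes an OpenAI-compatible HTTP API on the chosen port, so most chat frontends can point at it.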
**UPDATE 28.07.2024:** try this koboldcpp version
## Models used
- NeverSleep/Lumimaid-v0.2-8B
- Undi95/Meta-Llama-3.1-8B-Claude
- Nitral-AI/Sekhmet_Bet-L3.1-8B-v0.2
- aifeifei798/DarkIdol-Llama-3.1-8B-Instruct-1.0-Uncensored