# ExpertRamonda-7Bx2_MoE / mergekit_moe_config.yml
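# A mergekit-moe configuration: it combines two 7B experts into a 2-expert
# MoE. Per the mergekit docs, gate_mode: hidden initializes the router from
# hidden-state representations of each expert's prompts (the alternatives
# are "cheap_embed" and "random"), and experts_per_token: 2 means top-2
# routing, so with only two experts both are active for every token.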
base_model: mlabonne/AlphaMonarch-7B
gate_mode: hidden
dtype: bfloat16
experts_per_token: 2
experts:
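  # With gate_mode: hidden, each expert's positive_prompts (and optional
  # negative_prompts) are example inputs whose hidden-state representations
  # seed that expert's routing weights.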
  - source_model: mlabonne/AlphaMonarch-7B
    positive_prompts:
      - "You excel at reasoning skills. For every prompt you think of an answer from 3 different angles"
    ## (optional)
    # negative_prompts:
    #   - "This is a prompt expert_model_1 should not be used for"
  - source_model: bardsai/jaskier-7b-dpo-v5.6
    positive_prompts:
      - "You excel at logic and reasoning skills. Reply in a straightforward and concise way"