
# NeuralMona_MoE-4x7B

NeuralMona_MoE-4x7B is a Mixture of Experts (MoE) model made with the following models using LazyMergekit:

* CultriX/MonaTrix-v4
* mlabonne/OmniTruthyBeagle-7B-v0
* CultriX/MoNeuTrix-7B-v1
* paulml/OmniBeagleSquaredMBX-v3-7B

## 🧩 Configuration

```yaml
base_model: CultriX/MonaTrix-v4
dtype: bfloat16
experts:
  - source_model: "CultriX/MonaTrix-v4"  # Historical Analysis, Geopolitics, and Economic Evaluation
    positive_prompts:
      - "Historic analysis"
      - "Geopolitical impacts"
      - "Evaluate significance"
      - "Predict impact"
      - "Assess consequences"
      - "Discuss implications"
      - "Explain geopolitical"
      - "Analyze historical"
      - "Examine economic"
      - "Evaluate role"
      - "Analyze importance"
      - "Discuss cultural impact"
      - "Discuss historical"
    negative_prompts:
      - "Compose"
      - "Translate"
      - "Debate"
      - "Solve math"
      - "Analyze data"
      - "Forecast"
      - "Predict"
      - "Process"
      - "Coding"
      - "Programming"
      - "Code"
      - "Datascience"
      - "Cryptography"

  - source_model: "mlabonne/OmniTruthyBeagle-7B-v0"  # Multilingual Communication and Cultural Insights
    positive_prompts:
      - "Describe cultural"
      - "Explain in language"
      - "Translate"
      - "Compare cultural differences"
      - "Discuss cultural impact"
      - "Narrate in language"
      - "Explain impact on culture"
      - "Discuss national identity"
      - "Describe cultural significance"
      - "Narrate cultural"
      - "Discuss folklore"
    negative_prompts:
      - "Compose"
      - "Debate"
      - "Solve math"
      - "Analyze data"
      - "Forecast"
      - "Predict"
      - "Coding"
      - "Programming"
      - "Code"
      - "Datascience"
      - "Cryptography"

  - source_model: "CultriX/MoNeuTrix-7B-v1"  # Problem Solving, Innovation, and Creative Thinking
    positive_prompts:
      - "Devise strategy"
      - "Imagine society"
      - "Invent device"
      - "Design concept"
      - "Propose theory"
      - "Reason math"
      - "Develop strategy"
      - "Invent"
    negative_prompts:
      - "Translate"
      - "Discuss"
      - "Debate"
      - "Summarize"
      - "Explain"
      - "Detail"
      - "Compose"

  - source_model: "paulml/OmniBeagleSquaredMBX-v3-7B"  # Explaining Scientific Phenomena and Principles
    positive_prompts:
      - "Explain scientific"
      - "Discuss impact"
      - "Analyze potential"
      - "Elucidate significance"
      - "Summarize findings"
      - "Detail explanation"
    negative_prompts:
      - "Cultural significance"
      - "Engage in creative writing"
      - "Perform subjective judgment tasks"
      - "Discuss cultural traditions"
      - "Write review"
      - "Design"
      - "Create"
      - "Narrate"
      - "Discuss"
```
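The positive and negative prompt lists above seed the merge tool's prompt-based routing: each expert's gate is set up so that inputs resembling its positive prompts score high and inputs resembling its negative prompts score low, and the top-scoring experts are mixed per token. The toy sketch below illustrates that idea only; `toy_embed` (a hash-seeded random vector) stands in for real base-model hidden states, and the dot-product scoring with a top-2 softmax is an illustrative assumption, not mergekit's actual implementation.

```python
import math
import random
import zlib

DIM = 16  # toy embedding width; real hidden states are much wider


def toy_embed(text):
    """Stand-in for the averaged base-model hidden states of a prompt."""
    rng = random.Random(zlib.crc32(text.encode()))  # stable per-text seed
    v = [rng.gauss(0, 1) for _ in range(DIM)]
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]


def mean_vec(vecs):
    return [sum(col) / len(vecs) for col in zip(*vecs)]


def expert_gate(positive, negative):
    """Routing vector: mean(positive embeddings) - mean(negative embeddings)."""
    pos = mean_vec([toy_embed(p) for p in positive])
    neg = mean_vec([toy_embed(n) for n in negative])
    return [p - n for p, n in zip(pos, neg)]


def route(prompt, gates, top_k=2):
    """Dot-product score per expert, softmax over the top-k experts."""
    h = toy_embed(prompt)
    scores = {name: sum(a * b for a, b in zip(g, h)) for name, g in gates.items()}
    top = sorted(scores, key=scores.get, reverse=True)[:top_k]
    exps = [math.exp(scores[n]) for n in top]
    total = sum(exps)
    return {n: e / total for n, e in zip(top, exps)}


# Gate vectors built from a few of the prompts in the config above.
gates = {
    "MonaTrix-v4": expert_gate(
        ["Historic analysis", "Geopolitical impacts"], ["Coding", "Translate"]),
    "OmniTruthyBeagle-7B-v0": expert_gate(
        ["Translate", "Describe cultural"], ["Solve math", "Coding"]),
    "MoNeuTrix-7B-v1": expert_gate(
        ["Devise strategy", "Invent device"], ["Translate", "Summarize"]),
    "OmniBeagleSquaredMBX-v3-7B": expert_gate(
        ["Explain scientific", "Summarize findings"], ["Narrate", "Design"]),
}
weights = route("Analyze historical and geopolitical impacts of trade routes", gates)
# weights holds the top-2 experts with mixing weights that sum to 1
```

The negative prompts matter because they are subtracted from the gate vector, actively steering queries like coding or translation away from experts that were not merged for those tasks.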

## 💻 Usage

```shell
!pip install -qU transformers bitsandbytes accelerate
```

```python
from transformers import AutoTokenizer
import transformers
import torch

model = "CultriX/NeuralMona_MoE-4x7B"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)

messages = [{"role": "user", "content": "Explain what a Mixture of Experts is in less than 100 words."}]
prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
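The `load_in_4bit` flag above exists because a 24.2B-parameter model is large in full precision. As rough back-of-the-envelope arithmetic (weights only; it ignores activations, the KV cache, and quantization overhead, and the `nf4` figure assumes the usual 4-bit-per-weight packing):

```python
params = 24.2e9  # total parameter count reported for this merge
bytes_per_param = {"bf16": 2.0, "int8": 1.0, "nf4": 0.5}


def gib(n_bytes):
    """Convert a byte count to GiB."""
    return n_bytes / 2**30


# Approximate weight footprint per storage format.
footprints = {fmt: gib(params * b) for fmt, b in bytes_per_param.items()}
# bf16 weights alone are roughly 45 GiB; 4-bit brings them near 11 GiB,
# which is why 4-bit loading makes the model fit on a single 16-24 GB GPU.
```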
Model size: 24.2B params (Safetensors) · Tensor type: BF16
