
# Medchator-2x7b

Medchator-2x7b is a Mixture of Experts (MoE) model made with the following models:

* [AdaptLLM/medicine-chat](https://huggingface.co/AdaptLLM/medicine-chat)
* [microsoft/Orca-2-7b](https://huggingface.co/microsoft/Orca-2-7b)
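
For intuition, the sketch below shows how a two-expert MoE layer can route each token through a learned gate. This is a simplified illustration, not mergekit's actual merge logic; all class names and dimensions are made up for the example.

```python
import torch
import torch.nn as nn

class TwoExpertMoE(nn.Module):
    """Toy two-expert MoE feed-forward layer (illustration only)."""

    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        # Two independent feed-forward "experts"
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(2)
        )
        # The gate scores each token for each expert
        self.gate = nn.Linear(d_model, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = torch.softmax(self.gate(x), dim=-1)                   # (batch, seq, 2)
        expert_out = torch.stack([e(x) for e in self.experts], dim=-1)  # (batch, seq, d_model, 2)
        return (expert_out * weights.unsqueeze(-2)).sum(dim=-1)         # gate-weighted mix

layer = TwoExpertMoE(d_model=64, d_ff=256)
print(layer(torch.randn(1, 8, 64)).shape)  # -> torch.Size([1, 8, 64])
```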

## Evaluations

### Open LLM Leaderboard


| Model Name     | ARC   | HellaSwag | MMLU  | TruthfulQA | Winogrande | GSM8K |
| -------------- | ----- | --------- | ----- | ---------- | ---------- | ----- |
| Orca-2-7b      | 78.4  | 76.1      | 53.7  | 52.4       | 74.2       | 47.2  |
| LLAMA-2-7b     | 43.2  | 77.1      | 44.4  | 38.7       | 69.5       | 16    |
| MT7Bi-sft      | 54.1  | 75.11     | -     | 43.08      | 72.14      | 15.54 |
| MT7bi-dpo      | 54.69 | 75.89     | 52.82 | 45.48      | 71.58      | 25.93 |
| Medorca-2x7b   | 54.1  | 76.04     | 54.1  | 48.04      | 74.51      | 20.64 |
| Medchator-2x7b | 57.59 | 78.14     | 56.13 | 48.77      | 75.3       | 32.83 |

### Medical Performance

Medchator-2x7b demonstrates competitive performance on medical benchmarks.

Table: Five-shot performance of Medchator-2x7b, GPT-3.5, Llama-2 7b, and Llama-2 70b on various medical datasets.

| Dataset                    | Medchator-2x7b | GPT-3.5 | Llama-2 7b | Llama-2 70b |
| -------------------------- | -------------- | ------- | ---------- | ----------- |
| MMLU Anatomy               | 56.3           | 60.7    | 48.9       | 62.9        |
| MMLU Clinical Knowledge    | 63.0           | 68.7    | 46.0       | 71.7        |
| MMLU College Biology       | 63.8           | 72.9    | 47.2       | 84.7        |
| MMLU College Medicine      | 50.9           | 63.6    | 42.8       | 64.2        |
| MMLU Medical Genetics      | 67.0           | 68.0    | 55.0       | 74.0        |
| MMLU Professional Medicine | 55.1           | 69.8    | 53.6       | 75.0        |

## 🧩 Configuration

```yaml
base_model: microsoft/Orca-2-7b
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: AdaptLLM/medicine-chat
    positive_prompts:
      - "How does sleep affect cardiovascular health?"
      - "Could a plant-based diet improve arthritis symptoms?"
      - "A patient comes in with symptoms of dizziness and nausea"
      - "When discussing diabetes management, the key factors to consider are"
      - "The differential diagnosis for a headache with visual aura could include"
    negative_prompts:
      - "Recommend a good recipe for a vegetarian lasagna."
      - "Give an overview of the French Revolution."
      - "Explain how a digital camera captures an image."
      - "What are the environmental impacts of deforestation?"
      - "The recent advancements in artificial intelligence have led to developments in"
      - "The fundamental concepts in economics include ideas like supply and demand, which explain"
  - source_model: microsoft/Orca-2-7b
    positive_prompts:
      - "Here is a funny joke for you -"
      - "When considering the ethical implications of artificial intelligence, one must take into account"
      - "In strategic planning, a company must analyze its strengths and weaknesses, which involves"
      - "Understanding consumer behavior in marketing requires considering factors like"
      - "The debate on climate change solutions hinges on arguments that"
    negative_prompts:
      - "In discussing dietary adjustments for managing hypertension, it's crucial to emphasize"
      - "For early detection of melanoma, dermatologists recommend that patients regularly check their skin for"
      - "Explaining the importance of vaccination, a healthcare professional should highlight"
```

## 💻 Usage

```python
# Install dependencies (notebook-style)
!pip install -qU transformers bitsandbytes accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "Technoculture/Medchator-2x7b"

tokenizer = AutoTokenizer.from_pretrained(model)
# Load the model in 4-bit so it fits on consumer GPUs
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)

# Build a chat-formatted prompt and generate
messages = [{"role": "user", "content": "Explain what a Mixture of Experts is in less than 100 words."}]
prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
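
On newer transformers releases, passing `load_in_4bit` through `model_kwargs` is deprecated in favor of an explicit `BitsAndBytesConfig`. A minimal sketch of the equivalent setup, assuming bitsandbytes is installed:

```python
import torch
from transformers import BitsAndBytesConfig, pipeline

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,  # match the dtype used above
)

pipe = pipeline(
    "text-generation",
    model="Technoculture/Medchator-2x7b",
    model_kwargs={"quantization_config": quant_config},
)
```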