Ai2 is releasing OLMoE!
OLMoE-1B-7B-Instruct is a Mixture-of-Experts LLM with 1B active and 7B total parameters, and OLMoE is 100% open source: model, codebase, and datasets!
Paper: https://arxiv.org/abs/2409.02060
Model: allenai/OLMoE-1B-7B-0924-Instruct
Datasets: allenai/OLMoE-mix-0924
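The "1B active / 7B total" split comes from Mixture-of-Experts routing: each token is sent to only a few of the experts, so only a fraction of the total parameters runs per forward pass. A minimal toy sketch of top-k expert routing (illustrative only; the expert functions, router weights, and dimensions here are made up and this is not OLMoE's actual implementation):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def moe_forward(x, experts, router_weights, k=2):
    """Route input x to the top-k experts and mix their outputs.

    experts: list of callables, each mapping a vector to a vector.
    router_weights: one score-weight vector per expert.
    Only the k selected experts run -- the "active" parameters.
    """
    # Router scores each expert for this token.
    scores = [sum(w * xi for w, xi in zip(ws, x)) for ws in router_weights]
    probs = softmax(scores)
    # Keep only the top-k experts; the rest stay inactive.
    topk = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in topk)
    # Weighted sum of the selected experts' outputs.
    out = [0.0] * len(x)
    for i in topk:
        y = experts[i](x)
        for j in range(len(x)):
            out[j] += (probs[i] / norm) * y[j]
    return out, topk
```

With 8 experts and k=2, only a quarter of the expert parameters is active per token, which is the same idea behind OLMoE's 1B-active-of-7B-total design.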