Commit 843e6d9 by Muennighoff
Parent(s): c11c6e1

Update README.md

Files changed (1):
README.md (+1, -1)
README.md CHANGED
@@ -21,7 +21,7 @@ pretty_name: OLMoE Mix (September 2024)
 
 The following data mix was used to train OLMoE-1B-7B, a Mixture-of-Experts LLM with 1B active and 7B total parameters released in September 2024.
 
-The base version of OLMoE-1B-7B can be found at [this page](https://huggingface.co/OLMoE/OLMoE-1B-7B-0924), the SFT of OLMoE-1B-7B is available [here](https://huggingface.co/OLMoE/OLMoE-1B-7B-0924-SFT), and a version combining SFT and DPO is available following [this link](https://huggingface.co/OLMoE/OLMoE-1B-7B-0924-Instruct).
+The base version of OLMoE-1B-7B can be found at [this page](https://huggingface.co/allenai/OLMoE-1B-7B-0924), the SFT of OLMoE-1B-7B is available [here](https://huggingface.co/allenai/OLMoE-1B-7B-0924-SFT), and a version combining SFT and DPO is available following [this link](https://huggingface.co/allenai/OLMoE-1B-7B-0924-Instruct).
 
 ## Statistics
 
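
For context on what the corrected links point to, here is a minimal sketch of loading the base model with the Hugging Face `transformers` library (assuming a release recent enough to include OLMoE support); the snippet is illustrative and not part of this commit:

```python
# Minimal sketch (not part of the commit): loading the base OLMoE model
# referenced by the corrected link. Assumes a transformers release with
# OLMoE support installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMoE-1B-7B-0924"  # corrected repository ID from the diff
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Mixture-of-Experts models route tokens to", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```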