Muennighoff committed
Commit 8419a6d
Parent: 3fd2261

Add branches

Files changed (1):
  1. README.md +3 -3
README.md CHANGED
@@ -47,9 +47,9 @@ branches = [b.name for b in out.branches]
 ```
 
 Important branches:
- - `step1200000-tokens5033B`: Pretraining checkpoint used for annealing. There are a few more checkpoints after this one, but we did not use them.
- - `main`: Checkpoint annealed from `step1200000-tokens5033B` for an additional 100B tokens (23,842 steps). We use this checkpoint for our adaptation (https://huggingface.co/OLMoE/OLMoE-1B-7B-0824-SFT & https://huggingface.co/OLMoE/OLMoE-1B-7B-0824-Instruct).
- - `fp32`: FP32 version of `main`. The model weights were stored in FP32 during training, but we did not observe any performance drop from casting them to BF16 after training, so we upload all weights in BF16. If you want the original FP32 checkpoint for `main`, use this one; it yields slightly different results but should perform about the same on benchmarks.
+ - `main`: Preference-tuned via DPO starting from the `main` branch of https://hf.co/OLMoE/OLMoE-1B-7B-0824-SFT.
+ - `no-load-balancing`: Ablation without the load-balancing loss during DPO, starting from the `no-load-balancing` branch of https://hf.co/OLMoE/OLMoE-1B-7B-0824-SFT.
+ - `non-annealed`: Ablation starting from the `non-annealed` branch of https://hf.co/OLMoE/OLMoE-1B-7B-0824-SFT, which is an SFT of the pretraining checkpoint prior to annealing (branch `step1200000-tokens5033B` of https://hf.co/OLMoE/OLMoE-1B-7B-0824).
 
 # Citation
 
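
For working with these branches, here is a minimal sketch (assuming the `OLMoE/OLMoE-1B-7B-0824-Instruct` repo id from the README above and a `transformers` version with OLMoE support): list the branches with `huggingface_hub.list_repo_refs`, which is where the `branches = [b.name for b in out.branches]` line in the hunk context comes from, then load a specific branch by passing its name as `revision` to `from_pretrained`.

```python
# Minimal sketch: enumerate the repo's branches and load one of them.
# The repo id and branch names are taken from the README diff above; adjust if needed.
import torch
from huggingface_hub import list_repo_refs
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "OLMoE/OLMoE-1B-7B-0824-Instruct"

# Completes the `branches = [b.name for b in out.branches]` snippet from the diff context.
out = list_repo_refs(repo_id)
branches = [b.name for b in out.branches]
print(branches)  # expected to include 'main', 'no-load-balancing', 'non-annealed'

# Load a specific branch by name via `revision`.
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    revision="non-annealed",     # or "main" / "no-load-balancing"
    torch_dtype=torch.bfloat16,  # the README above notes weights are uploaded in BF16
)
tokenizer = AutoTokenizer.from_pretrained(repo_id, revision="non-annealed")
```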