Update README.md
README.md
CHANGED
@@ -17,13 +17,6 @@ DCLM-Baseline-7B is a 7 billion parameter language model trained on the DCLM-Bas
17   |------|-----------------|--------|-------------|-----------------|----------------|
18   | 7B | 2.5T | 32 | 4096 | 32 | 2048 |
19
20 - | Size | Training Tokens | Layers | Hidden Size | Attention Heads | Context Length |
21 - |------|--------|---------|-------------|-----------------|----------------|
22 - | [OLMo 1B](https://huggingface.co/allenai/OLMo-1B) | 3 Trillion | 16 | 2048 | 16 | 2048 |
23 - | [OLMo 7B](https://huggingface.co/allenai/OLMo-7B) | 2.5 Trillion | 32 | 4096 | 32 | 2048 |
24 - | [OLMo 7B Twin 2T](https://huggingface.co/allenai/OLMo-7B-Twin-2T) | 2 Trillion | 32 | 4096 | 32 | 2048 |
25 -
26 -
27
28   ### Model Description
29