---
license: apache-2.0
---

This OLMo version was used to develop [Molmo-O-7B](https://huggingface.co/allenai/Molmo-7B-O-0924).
It was trained on the [OLMoE-Mix](https://huggingface.co/datasets/allenai/OLMoE-mix-0924) and uses
the [Dolma 2 tokenizer](https://huggingface.co/allenai/dolma2-tokenizer).

This model is not intended to be used as-is; it is provided as a research artifact to facilitate
reproduction of, and further research on, Molmo. Details about this model family will be presented
in an upcoming OLMo manuscript.