1,378,234,368 tokens (counted with the Llama tokenizer; roughly 1.18B GPT-4 tokens) from a deduplicated shard of raw Pile data. Documents were passed through a length filter (len < 896), scored with the Ask-LLM method from "How to Train Data-Efficient LLMs" using mistralai/Mistral-7B-Instruct-v0.2, and the top 1/4 by score were kept.
Each record holds the document text and its Ask-LLM score in the `pos` field:

```json
{
  "text": "Once upon a time...",
  "pos": -5.654354325
}
```
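Below is a minimal sketch of how a `pos` score could be computed, assuming it is the judge model's log-probability of answering "yes" to the Ask-LLM quality question. The exact prompt and scoring details used to build this dataset are not stated here; the prompt below is paraphrased from the Ask-LLM paper, and the helper names (`ask_llm_score`, `keep_top_quarter`) are illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(
    MODEL, torch_dtype=torch.bfloat16, device_map="auto"
)

# Paraphrase of the Ask-LLM quality prompt; the prompt actually used
# for this dataset may differ.
PROMPT = (
    "###\n{doc}\n###\n"
    "Does the previous paragraph demarcated within ### and ### contain "
    "informative signal for pre-training a large-language model? An "
    "informative datapoint should be well-formatted, contain some usable "
    "knowledge of the world, and strictly NOT have any harmful, racist, "
    "sexist, etc. content.\n\nOPTIONS:\n- yes\n- no"
)


@torch.no_grad()
def ask_llm_score(text: str) -> float:
    """Return log P("yes") at the answer position as the quality score."""
    messages = [{"role": "user", "content": PROMPT.format(doc=text)}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    logits = model(input_ids).logits[0, -1]  # next-token logits
    logprobs = torch.log_softmax(logits.float(), dim=-1)
    # Assumes "yes" encodes to a single sentencepiece token.
    yes_id = tokenizer.encode("yes", add_special_tokens=False)[0]
    return logprobs[yes_id].item()  # e.g. -5.654354325


def keep_top_quarter(docs: list[str]) -> list[str]:
    """Rank documents by score and keep the top 1/4, as the card describes."""
    scored = sorted(((ask_llm_score(d), d) for d in docs), reverse=True)
    return [d for _, d in scored[: len(scored) // 4]]
```

Scoring with log-probabilities rather than a hard yes/no keeps the ranking fine-grained, which is what makes a "keep top 1/4" cutoff possible.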