---
license: apache-2.0
language:
- en
task_categories:
- feature-extraction
tags:
- t5
- flan
size_categories:
- 100K<n<1M
---
The full dataset is around 41 GB. It contains the last hidden states of 131,072 samples from RefinedWeb, left-padded/truncated to 512 tokens and fed through [google/flan-t5-small](https://hf.co/google/flan-t5-small).
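
For reference, here is a minimal sketch of how such encodings could be produced. The original pipeline's batching, precision, and truncation side are assumptions, not confirmed details:

```python
import torch
from transformers import AutoTokenizer, T5EncoderModel

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-small")
tokenizer.padding_side = "left"  # pad on the left, per the description above
model = T5EncoderModel.from_pretrained("google/flan-t5-small").eval()

def encode(texts):
    # Pad/truncate every sample to exactly 512 tokens.
    batch = tokenizer(
        texts,
        padding="max_length",
        truncation=True,
        max_length=512,
        return_tensors="pt",
    )
    with torch.no_grad():
        out = model(**batch)
    # flan-t5-small has d_model=512, so each sample is shaped (512, 512).
    return out.last_hidden_state, batch["attention_mask"]
```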

Structure:

```
{
  "encoding": List, shaped (512, 512) aka (tokens, d_model),
  "text": String, the original text that was encoded,
  "attention_mask": List, binary mask to pass to your model with encoding to not attend to pad tokens
}
```
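
As a sketch of how one record might be consumed (assuming PyTorch; the field names follow the schema above, and `mean_pool` is just an illustrative helper):

```python
import torch

def mean_pool(record):
    """Mean-pool a record's encoding over real (non-pad) tokens."""
    encoding = torch.tensor(record["encoding"])    # (512, 512): (tokens, d_model)
    mask = torch.tensor(record["attention_mask"])  # (512,): 1 = real token, 0 = pad
    masked = encoding * mask.unsqueeze(-1)         # zero out pad positions
    return masked.sum(dim=0) / mask.sum()          # (512,) pooled feature vector
```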

Just a tip: you cannot load this with the RAM in the free tier of Google Colab (not even a single file), and streaming won't work there either. I have 80 GB of RAM, and even that was barely enough to work with streaming.
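If you do have enough RAM, a streaming read via the `datasets` library might look like the sketch below. The repo id is a placeholder; substitute this dataset's actual Hub path:

```python
from datasets import load_dataset

# Placeholder repo id: replace with this dataset's actual path on the Hub.
ds = load_dataset("your-username/this-dataset", split="train", streaming=True)

for record in ds:
    # Each record follows the schema above.
    print(record["text"][:80])
    break
```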