Dataset: SocialGrep/one-million-reddit-confessions
Likes: 7
Modalities: Tabular, Text
Formats: csv
Languages: English
Size: 1M - 10M rows
Libraries: Datasets, pandas, Croissant + 1
License: cc-by-4.0
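The card tags the `datasets` and pandas libraries, so the most direct way to pull the data is `load_dataset`. This is a minimal sketch: the repo id is taken from this page, while the "train" split name is an assumption (the usual default for a single-CSV dataset on the Hub).

```python
# Minimal sketch: load the dataset with the Hugging Face `datasets` library.
# The repo id comes from this page; the "train" split name is an assumption.
from datasets import load_dataset

ds = load_dataset("SocialGrep/one-million-reddit-confessions", split="train")
print(ds)     # row count and column names
print(ds[0])  # first record as a dict
```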
Branch: refs/convert/parquet
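refs/convert/parquet is the branch where the Hub's parquet-converter bot pushes the auto-generated Parquet files and DuckDB index files that appear in the commits below. As a sketch, the contents of that branch can be listed with `huggingface_hub` (no file paths are hard-coded, since the exact layout is whatever the converter produced):

```python
# Sketch: list the files on the refs/convert/parquet branch of this dataset repo.
from huggingface_hub import list_repo_files

files = list_repo_files(
    "SocialGrep/one-million-reddit-confessions",
    repo_type="dataset",
    revision="refs/convert/parquet",
)
for path in files:
    print(path)
```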
Commit History for one-million-reddit-confessions
Delete old duckdb index files · 2ff229e (verified) · parquet-converter · committed on Mar 8
Update duckdb index files · ca881ac · parquet-converter · committed on Aug 25, 2023
Update duckdb index files · b2c847e · parquet-converter · committed on Aug 21, 2023
Update parquet files · 5eb4f51 · parquet-converter · committed on Aug 21, 2023
Update duckdb index files · 057ea58 · parquet-converter · committed on Aug 20, 2023
Update parquet files · ad4695f · parquet-converter · committed on Aug 17, 2023
Update parquet files · 8063849 · parquet-converter · committed on Jul 5, 2023
Update parquet files · 5402bde · parquet-converter · committed on May 3, 2023
Update parquet files · 16e5b0d · parquet-converter · committed on May 2, 2023
Update parquet files · 7909aa1 · parquet-converter · committed on Apr 28, 2023
Update parquet files · 7a11111 · parquet-converter · committed on Jan 19, 2023
Fix `license` metadata (#1) · 55ea22d · SocialGrep · julien-c (HF staff) · committed on Jul 1, 2022
Create README.md · f2a749b · SocialGrep · committed on Oct 12, 2021
Upload one-million-reddit-confessions.csv with git-lfs · a49aec6 · SocialGrep · committed on Oct 12, 2021
initial commit · 2338b93 · system (HF staff) · committed on Oct 12, 2021
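The a49aec6 commit uploaded the raw CSV via git-lfs, and pandas is among the tagged libraries, so the file can also be read directly from the Hub. This is a sketch that assumes `huggingface_hub` is installed so pandas can resolve hf:// paths through its fsspec integration:

```python
# Sketch: read the original git-lfs CSV straight from the Hub with pandas.
# Assumes huggingface_hub is installed (provides the hf:// filesystem).
import pandas as pd

df = pd.read_csv(
    "hf://datasets/SocialGrep/one-million-reddit-confessions/one-million-reddit-confessions.csv"
)
print(df.shape)
print(df.head())
```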