The full dataset viewer is not available; only a preview of the rows is shown below.
The dataset generation failed
Error code: DatasetGenerationError
Exception: ArrowNotImplementedError
Message: Cannot write struct type 'links' with no child field to Parquet. Consider adding a dummy child field.

Traceback:

```
Traceback (most recent call last):
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2013, in _prepare_split_single
    writer.write_table(table)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 583, in write_table
    self._build_writer(inferred_schema=pa_table.schema)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 404, in _build_writer
    self.pa_writer = self._WRITER_CLASS(self.stream, schema)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow/parquet/core.py", line 1010, in __init__
    self.writer = _parquet.ParquetWriter(
  File "pyarrow/_parquet.pyx", line 2157, in pyarrow._parquet.ParquetWriter.__cinit__
  File "pyarrow/error.pxi", line 154, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 91, in pyarrow.lib.check_status
pyarrow.lib.ArrowNotImplementedError: Cannot write struct type 'links' with no child field to Parquet. Consider adding a dummy child field.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2029, in _prepare_split_single
    num_examples, num_bytes = writer.finalize()
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 602, in finalize
    self._build_writer(self.schema)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 404, in _build_writer
    self.pa_writer = self._WRITER_CLASS(self.stream, schema)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow/parquet/core.py", line 1010, in __init__
    self.writer = _parquet.ParquetWriter(
  File "pyarrow/_parquet.pyx", line 2157, in pyarrow._parquet.ParquetWriter.__cinit__
  File "pyarrow/error.pxi", line 154, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 91, in pyarrow.lib.check_status
pyarrow.lib.ArrowNotImplementedError: Cannot write struct type 'links' with no child field to Parquet. Consider adding a dummy child field.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1396, in compute_config_parquet_and_info_response
    parquet_operations = convert_to_parquet(builder)
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1045, in convert_to_parquet
    builder.download_and_prepare(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1029, in download_and_prepare
    self._download_and_prepare(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1124, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1884, in _prepare_split
    for job_id, done, content in self._prepare_split_single(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2040, in _prepare_split_single
    raise DatasetGenerationError("An error occurred while generating the dataset") from e
datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
```
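The failure originates in pyarrow rather than in the data itself: the `links` column in the preview is an empty dict (`{}`) for every row, so it is inferred as a struct type with zero child fields, which the Parquet writer cannot represent. Below is a minimal reproduction sketch with the "dummy child field" workaround the error message suggests; the column and field names are illustrative, not the viewer's actual conversion code.

```python
# Minimal reproduction sketch (not the viewer's actual code): an all-empty dict
# column is inferred as a struct type with zero child fields, which the Parquet
# writer refuses, exactly as in the traceback above.
import pyarrow as pa
import pyarrow.parquet as pq

empty_links = pa.array([{}, {}], type=pa.struct([]))  # struct<> column, no children
table = pa.table({"name": ["embedding", "text"], "links": empty_links})

try:
    pq.write_table(table, "preview.parquet")
except pa.ArrowNotImplementedError as err:
    print(err)  # Cannot write struct type 'links' with no child field to Parquet. ...

# Workaround suggested by the error message: give the struct a dummy child field
# (the field name "dummy" here is illustrative).
patched_links = pa.array(
    [{"dummy": None}, {"dummy": None}],
    type=pa.struct([pa.field("dummy", pa.bool_())]),
)
idx = table.schema.get_field_index("links")
pq.write_table(table.set_column(idx, "links", patched_links), "preview.parquet")
```

Dropping the all-empty `links` column before conversion, or loading with an explicit schema that omits it, would presumably avoid the error just as well.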
| chunk_compression (null) | dtype (string) | hidden (bool) | htype (string) | is_link (bool) | is_sequence (bool) | length (int64) | links (dict) | max_chunk_size (int64) | max_shape (sequence) | min_shape (sequence) | name (string) | sample_compression (null) | tiling_threshold (null) | typestr (null) | verify (bool) | version (string) | allow_delete (bool) | default_index (list) | groups (sequence) | hidden_tensors (sequence) | tensor_names (dict) | tensors (sequence) | vdb_indexes (sequence) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| null | int64 | true | generic | false | false | 23,579 | {} | 4,000,000 | [1] | [1] | _embedding_shape | null | null | null | true | 3.9.23 | null | null | null | null | null | null | null |
| null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | 3.9.23 | true | [{"start": null, "step": null, "stop": null}] | [] | ["_embedding_shape"] | {"_embedding_shape": "_embedding_shape", "embedding": "embedding", "id": "id", "metadata": "metadata", "text": "text"} | ["text", "metadata", "embedding", "_embedding_shape", "id"] | null |
| null | float32 | false | embedding | false | false | 23,579 | {"_embedding_shape": {"extend": "extend_shape", "flatten_sequence": true, "update": "update_shape"}} | 64,000,000 | [1536] | [1536] | embedding | null | null | null | true | 3.9.23 | null | null | null | null | null | null | [] |
| null | str | false | text | false | false | 23,579 | {} | null | [1] | [1] | id | null | null | null | true | 3.9.23 | null | null | null | null | null | null | [] |
| null | Any | false | json | false | false | 23,579 | {} | null | [1] | [1] | metadata | null | null | null | true | 3.9.23 | null | null | null | null | null | null | null |
| null | str | false | text | false | false | 23,579 | {} | null | [1] | [1] | text | null | null | null | true | 3.9.23 | null | null | null | null | null | null | [] |
| null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null | null |
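The preview rows above describe per-tensor metadata rather than the documents themselves: an `embedding` tensor of `float32` with shape `[1536]`, plus `text`, `id` (text) and `metadata` (json) tensors, each with 23,579 samples at format version 3.9.23. Column names such as `htype`, `chunk_compression`, and `vdb_indexes` resemble a Deep Lake vector-store layout, so while the Parquet conversion is broken the data could likely be read with the `deeplake` client directly. The sketch below rests on that assumption, and the dataset path is a placeholder, not the real location.

```python
# Hedged sketch, assuming this repository is a Deep Lake 3.x vector store (the
# htype / chunk_compression / vdb_indexes columns above point that way). The
# path is a placeholder, not the real dataset location.
import deeplake

ds = deeplake.load("hub://some-org/some-vector-store")  # hypothetical path

print(list(ds.tensors))    # expected: text, metadata, embedding, id, ...
print(ds.embedding.shape)  # expected roughly (23579, 1536) per the preview

sample = ds[0]
print(sample.text.data()["value"])      # one document chunk
print(sample.metadata.data()["value"])  # its JSON metadata
vector = sample.embedding.numpy()       # float32 vector of length 1536
```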