Dataset Viewer issue
The error carries over when I press the "Viewer" button.
I'll stop here and report this now.
All the best,
M
The dataset viewer is not working.
Error details:
Error code: UnexpectedError
Weird, I refreshed and got the same error:
[Errno 39] Directory not empty: '/storage/hf-datasets-cache/medium/datasets/92601609631952-config-parquet-and-info-mikehemberger-medicinal-p-abb399f3/mikehemberger___medicinal-plants/default-697ad16f67c3c36b/0.0.0/7b7ce5247a942be131d49ad4f3de5866083399a0f250901bd8dc202f8c5f7ce5.incomplete'
maybe ping @lhoestq on this one.
The actual error is:
File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py\", line 1697, in _prepare_split_single
num_examples, num_bytes = writer.finalize()
File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py\", line 587, in finalize
self.write_examples_on_file()
File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py\", line 449, in write_examples_on_file
self.write_batch(batch_examples=batch_examples)
File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py\", line 560, in write_batch
self.write_table(pa_table, writer_batch_size)
File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py\", line 575, in write_table
pa_table = embed_table_storage(pa_table)
File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 2310, in embed_table_storage
arrays = [
File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 2311, in <listcomp>
embed_array_storage(table[name], feature) if require_storage_embed(feature) else table[name]
File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 1834, in wrapper
return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 1834, in <listcomp>
return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 2180, in embed_array_storage
return feature.embed_storage(array)
File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/features/image.py\", line 276, in embed_storage
storage = pa.StructArray.from_arrays([bytes_array, path_array], [\"bytes\", \"path\"], mask=bytes_array.is_null())
File \"pyarrow/array.pxi\", line 3205, in pyarrow.lib.StructArray.from_arrays
File \"pyarrow/array.pxi\", line 3645, in pyarrow.lib.c_mask_inverted_from_obj
TypeError: Mask must be a pyarrow.Array of type boolean
It looks like a pyarrow bug.
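Here's a rough sketch of what I suspect happens, in case it helps (a guess, not a confirmed reproduction): with very large images, pa.array() can return a ChunkedArray instead of a plain Array, and a ChunkedArray's is_null() is not a valid mask for StructArray.from_arrays:

import pyarrow as pa

# Hedged guess at the trigger: pa.array() returns a ChunkedArray
# (not a plain Array) when the binary payloads overflow the 2 GiB
# capacity of a single binary array, e.g. with very large images.
blob = b"x" * (2**30)  # 1 GiB of bytes per row
bytes_array = pa.array([blob, blob, blob], type=pa.binary())
print(type(bytes_array))  # pyarrow.lib.ChunkedArray

path_array = pa.array(["a.jpg", "b.jpg", "c.jpg"])

# is_null() on a ChunkedArray is itself a ChunkedArray, which
# StructArray.from_arrays rejects as a mask:
pa.StructArray.from_arrays(
    [bytes_array, path_array],
    ["bytes", "path"],
    mask=bytes_array.is_null(),
)
# TypeError: Mask must be a pyarrow.Array of type boolean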
Hi @lhoestq,
Scary error message. I hope I’m not to blame…
Will it complicate matters if I work on this dataset further, i.e., add labels etc. and push it back to the Hub? Otherwise I'll leave it as is.
You're not to blame at all ^^'
I'm also having trouble reproducing the error, which makes this bug harder to understand and report to the Apache Arrow team.
I'll continue my investigation and hopefully find a workaround.
Cool, thank you. I’ll keep my hands off of this dataset for now.
Good luck :)
Hi there,
I'm thinking about working on this dataset some more.
It seems that the error message has changed since the data was auto-converted to Parquet. Were you able to make progress on this issue (or these issues)?
Best,
Mike
Indeed:
Rows from parquet row groups are too big to be read: 983.60 MiB (max=286.10 MiB)
It means that some rows exceed the size limit. We don't want users to load 1 GB of images when accessing the dataset page.
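If you want to try a workaround on your side in the meantime, one option is to rewrite the Parquet files with fewer rows per row group, so each group stays under the limit. A minimal sketch (file names are placeholders, not your actual shards):

import pyarrow.parquet as pq

# Workaround sketch: fewer rows per row group keeps each group well
# under the viewer's ~286 MiB limit. File names are placeholders.
table = pq.read_table("data.parquet")
pq.write_table(table, "data-small-row-groups.parquet", row_group_size=10)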