Error loading dataset

#1
by krishnagarg09 - opened

Could you please take a look? It looks like the training set loads properly, but an error occurs while loading the validation set.

Downloading readme: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 33.0/33.0 [00:00<00:00, 190kB/s]                
Downloading data: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 640M/640M [00:12<00:00, 52.5MB/s]                
Downloading data: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 24.0M/24.0M [00:00<00:00, 69.2MB/s]                
Downloading data: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 24.0M/24.0M [00:00<00:00, 57.6MB/s]                
Downloading data files: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 3/3 [00:12<00:00,  4.32s/it]                
Extracting data files: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 3/3 [00:00<00:00, 2849.39it/s]                
Generating train split: 514154 examples [00:01, 439659.72 examples/s]                                                 
Generating validation split: 0 examples [00:00, ? examples/s]                                                         
Traceback (most recent call last):                                                                                    
  File "/home/kgarg8/miniconda3/envs/llm4kp_env/lib/python3.11/site-packages/datasets/builder.py", line 1940, in _prepare_split_single
    writer.write_table(table)                                                                                         
  File "/home/kgarg8/miniconda3/envs/llm4kp_env/lib/python3.11/site-packages/datasets/arrow_writer.py", line 572, in write_table
    pa_table = table_cast(pa_table, self._schema)                                                                     
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^                                                                     
  File "/home/kgarg8/miniconda3/envs/llm4kp_env/lib/python3.11/site-packages/datasets/table.py", line 2328, in table_cast
    return cast_table_to_schema(table, schema)                                                                        
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^                                                                        
  File "/home/kgarg8/miniconda3/envs/llm4kp_env/lib/python3.11/site-packages/datasets/table.py", line 2286, in cast_table_to_schema
    raise ValueError(f"Couldn't cast\n{table.schema}\nto\n{features}\nbecause column names don't match")              
ValueError: Couldn't cast                                                                                             
abstract: string                                                                                                      
keywords: string                                                                                                      
title: string                                                                                                         
to                                                                                                                    
{'id': Value(dtype='string', id=None), 'title': Value(dtype='string', id=None), 'abstract': Value(dtype='string', id=None), 'keywords': Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)}
because column names don't match                                                                                      
                                                                                                                      
The above exception was the direct cause of the following exception:                                                                                      

Traceback (most recent call last):                                           
  File "/Data-nvme/kgarg8/LLM4KP/kp_train3_memray_dataset.py", line 82, in <module>                                                                       
    data = load_dataset("memray/kp20k", use_auth_token=True)                                                                                              
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^                                                                                              
  File "/home/kgarg8/miniconda3/envs/llm4kp_env/lib/python3.11/site-packages/datasets/load.py", line 2136, in load_dataset
    builder_instance.download_and_prepare(                                   
  File "/home/kgarg8/miniconda3/envs/llm4kp_env/lib/python3.11/site-packages/datasets/builder.py", line 954, in download_and_prepare
    self._download_and_prepare(                                              
  File "/home/kgarg8/miniconda3/envs/llm4kp_env/lib/python3.11/site-packages/datasets/builder.py", line 1049, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)                                                                                          
  File "/home/kgarg8/miniconda3/envs/llm4kp_env/lib/python3.11/site-packages/datasets/builder.py", line 1813, in _prepare_split
    for job_id, done, content in self._prepare_split_single( 
File "/home/kgarg8/miniconda3/envs/llm4kp_env/lib/python3.11/site-packages/datasets/builder.py", line 1958, in _prepare_split_single
    raise DatasetGenerationError("An error occurred while generating the dataset") from e                                                                 
datasets.builder.DatasetGenerationError: An error occurred while generating the dataset  

Hi @krishnagarg09 ,

I didn't implement a parser for this dataset.
Can you try downloading the data manually and loading it as a regular jsonl file?

Best,
Rui
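A minimal sketch of that manual approach: download the split files from the repo, then load them through the generic `json` builder so the schema is inferred from the files instead of the declared features. The local paths, filenames, and the semicolon keyword delimiter below are assumptions; adjust them to whatever the downloaded files actually contain.

```python
from datasets import load_dataset

# Hypothetical local paths to the manually downloaded split files.
data_files = {
    "train": "kp20k/train.json",
    "validation": "kp20k/validation.json",
}

# The generic "json" builder infers the schema from the files themselves,
# so the string-vs-list mismatch on `keywords` no longer triggers the cast error.
data = load_dataset("json", data_files=data_files)

# If downstream code expects `keywords` as a list of strings, split the
# delimited string per example (the ";" delimiter is an assumption).
def split_keywords(example):
    example["keywords"] = [k.strip() for k in example["keywords"].split(";")]
    return example

data = data.map(split_keywords)
print(data)
```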

memray changed discussion status to closed
