---
license: cc-by-nc-4.0
tags:
- sparsh
- DIGIT
- SSL pre-training
pretty_name: touch-slide
---

# Dataset Details

Touch-Slide is a dataset inspired by [YCB-Slide](https://github.com/rpl-cmu/YCB-Slide). Its purpose is to increase the amount of data from multiple DIGIT sensors available for self-supervised learning (SSL) pre-training of the Sparsh models. Touch-Slide consists of human sliding interactions on toy kitchen objects captured with a DIGIT sensor. We used 9 objects, shown below, and collected 5 trajectories for each, for a total of 180k frames.

![Touch-Slide](assets/touch_slide_objs.png)

Below is a visual example of how the data were collected, showing sliding interactions that produce trajectories rich in shear forces:

![Sliding interaction on the bread object](assets/bread-real.gif) ![Corresponding DIGIT tactile frames](assets/bread-digit.gif)

## Uses

This dataset does not include labels and is intended for self-supervised training only. It is specifically designed for training the Sparsh models listed in the Hugging Face [Sparsh collection](https://huggingface.co/collections/facebook/sparsh-67167ce57566196a4526c328).

Please refer to the [Sparsh repo](https://github.com/facebookresearch/sparsh) for further information about usage.
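
As a concrete illustration of this intended use (not part of the official Sparsh code), here is a minimal sketch of wrapping the trajectories in a PyTorch `Dataset` for label-free pre-training. The class name `TouchSlideFrames` and the root path are assumptions, and the decoding logic mirrors the loading snippet in the next section:

```python
import io
import pickle
from pathlib import Path

import numpy as np
from PIL import Image
from torch.utils.data import Dataset


class TouchSlideFrames(Dataset):
    """Flattens all Touch-Slide trajectories into one frame-level dataset (sketch)."""

    def __init__(self, root, transform=None):
        self.transform = transform
        self.frames = []
        # Assumed layout: each object folder holds dataset_0.pkl ... dataset_4.pkl.
        for pkl in sorted(Path(root).glob("*/dataset_*.pkl")):
            with open(pkl, "rb") as f:
                self.frames.extend(pickle.load(f))

    def __len__(self):
        return len(self.frames)

    def __getitem__(self, idx):
        # Frames are stored as encoded image byte buffers; decode lazily.
        img = np.array(Image.open(io.BytesIO(self.frames[idx])))
        if self.transform is not None:
            img = self.transform(img)
        return img  # no labels: intended for self-supervised objectives
```

Keeping the encoded byte buffers in memory and decoding only in `__getitem__` keeps RAM usage modest compared to holding all 180k decoded frames.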

## Dataset Structure

The dataset consists of 5 trajectories per object. Each trajectory is stored as a pickle file containing the tactile images as binary-encoded buffers. The structure is as follows:

```bash
Touch-Slide
├── object_0     # e.g., bread
│   ├── dataset_0.pkl
│   ├── ...
│   └── dataset_4.pkl
├── object_1     # e.g., corn
└── ...
```

Below is sample code for loading a trajectory pickle and decoding its frames:

```python
import io
import pickle

import numpy as np
from PIL import Image


def load_pickle_dataset(file_dataset):
    # Each pickle file holds one trajectory: a list of encoded image byte buffers.
    with open(file_dataset, "rb") as f:
        all_frames = pickle.load(f)
    return all_frames


def load_bin_image(io_buf):
    # Decode an encoded image byte buffer into a NumPy array.
    img = Image.open(io.BytesIO(io_buf))
    img = np.array(img)
    return img


frames = load_pickle_dataset("bread/dataset_0.pkl")
img = load_bin_image(frames[0])
```
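
Building on these helpers, the following sketch iterates over the whole directory tree; the root path `Touch-Slide` is an assumption about where the dataset was downloaded:

```python
from pathlib import Path

root = Path("Touch-Slide")  # assumed local root of the downloaded dataset
for pkl in sorted(root.glob("*/dataset_*.pkl")):
    frames = load_pickle_dataset(pkl)
    first = load_bin_image(frames[0])
    print(f"{pkl.parent.name}/{pkl.name}: {len(frames)} frames of shape {first.shape}")
```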

## BibTeX entry and citation info

```bibtex
@inproceedings{higuera2024sparsh,
    title={Sparsh: Self-supervised touch representations for vision-based tactile sensing},
    author={Carolina Higuera and Akash Sharma and Chaithanya Krishna Bodduluri and Taosha Fan and Patrick Lancaster and Mrinal Kalakrishnan and Michael Kaess and Byron Boots and Mike Lambeta and Tingfan Wu and Mustafa Mukadam},
    booktitle={8th Annual Conference on Robot Learning},
    year={2024},
    url={https://openreview.net/forum?id=xYJn2e1uu8}
}
```