---
license: mit
---

# RDT-1B

RDT-1B is a 1B-parameter imitation-learning Diffusion Transformer pre-trained on over 1M multi-robot episodes. Given a language instruction and RGB observations from three camera views, RDT predicts the next 64 robot actions. RDT is compatible with almost all modern mobile manipulators: single-arm or dual-arm, joint-space or end-effector (EEF) control, position or velocity commands, and even robots with a mobile chassis.
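
As a rough illustration of this interface (not the official API; the dictionary layout and image resolution below are illustrative assumptions, and the actual loading and inference code lives in our repository):

```python
# A minimal sketch (not the official API) of the observation format described
# above; the dict layout and the 384x384 resolution are illustrative assumptions.
import numpy as np

# One language instruction plus RGB images from three camera views
# (e.g., an exterior camera and two wrist cameras).
observation = {
    "instruction": "pick up the red block and place it into the box",
    "images": [np.zeros((384, 384, 3), dtype=np.uint8) for _ in range(3)],
}

# Conditioned on such inputs, RDT-1B predicts a chunk of 64 future actions with
# shape (64, action_dim); action_dim depends on the robot embodiment
# (single- vs. dual-arm, joint vs. EEF, position vs. velocity control).
```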

All the code and model weights are licensed under the MIT license.

Please refer to our project page, GitHub repository, and paper for more information.

## Model Details

- **Developed by:** the thu-ml team
- **License:** MIT
- **Pretrain dataset:** [More Information Needed]
- **Finetune dataset:** [More Information Needed]
- **Repository:** [More Information Needed]
- **Paper:** [More Information Needed]
- **Project Page:** https://rdt-robotics.github.io/rdt-robotics/

## Uses

RDT-1B supports fine-tuning and pre-training on custom datasets, as well as deployment and inference on real robots.

Please refer to our repository for guides on all of the above.
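
If you only need the pre-trained weights, they can be fetched from this model repository (robotics-diffusion-transformer/rdt-1b) with `huggingface_hub`; the sketch below is a minimal example (the local directory is just a placeholder), while the fine-tuning and deployment entry points themselves are documented in the repository.

```python
# Download the RDT-1B checkpoint files from the Hugging Face Hub.
# Requires `pip install huggingface_hub`; the local directory is an example path.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="robotics-diffusion-transformer/rdt-1b",
    local_dir="./checkpoints/rdt-1b",
)
print(f"RDT-1B weights downloaded to: {local_dir}")
```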

## Citation

BibTeX:

[More Information Needed]