OpenX to LeRobot

Open X-Embodiment assembles a dataset from 22 different robots, collected through a collaboration between 21 institutions, demonstrating 527 skills (160,266 tasks). (Description from the official docs.)

🚀 What's New in This Script

In this conversion script, we have made several key improvements:

  • OXE Standard Transformations 🔄: We have integrated OXE's standard transformations to ensure uniformity across data.
  • Alignment of State and Action Information 🤖: State and action information are now perfectly aligned, enhancing the clarity and coherence of the dataset.
  • Robot Type and Control Frequency 📊: Annotations have been added for robot type and control frequency to improve dataset comprehensibility.
  • Joint Information 🦾: Joint-specific details have been included to assist with fine-grained understanding.
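The alignment and unified-vector points above can be sketched as follows. This is a minimal illustration assuming end-effector states; the function names and the zero "pad" slot convention are ours, not from the script:

```python
def to_unified_state(xyz, rpy, gripper):
    """Pack position, orientation, and gripper into the unified
    8-dim state vector [x, y, z, roll, pitch, yaw, pad, gripper].
    The 'pad' slot is zero-filled so the state keeps a fixed length."""
    return [float(v) for v in xyz] + [float(v) for v in rpy] + [0.0, float(gripper)]

def to_unified_action(xyz, rpy, gripper):
    """Pack a command into the unified 7-dim action vector
    [x, y, z, roll, pitch, yaw, gripper]."""
    return [float(v) for v in xyz] + [float(v) for v in rpy] + [float(gripper)]
```

Building state and action from the same timestep keeps the two index-aligned frame by frame.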

Dataset Structure of meta/info.json:

{
  "codebase_version": "v3.0", // lastest lerobot format
  "robot_type": "franka", // specific robot type, unknown if not provided
  "fps": 3, // control frequency, 10 if not provided
  // an additional "control_frequency" key is also added
  "features": {
    "observation.images.image_key": {
      "dtype": "video",
      "shape": [128, 128, 3],
      "names": ["height", "width", "rgb"], // bgr to rgb if needed
      "info": {
        "video.fps": 3.0,
        "video.height": 128,
        "video.width": 128,
        "video.channels": 3,
        "video.codec": "av1",
        "video.pix_fmt": "yuv420p",
        "video.is_depth_map": false,
        "has_audio": false
      }
    },
    "observation.state": {
      "dtype": "float32",
      "shape": [8],
      "names": {
        "motors": ["x", "y", "z", "roll", "pitch", "yaw", "pad", "gripper"] 
        // unified 8-dim vector: [xyz, state type, gripper], motor_x if not provided
      }
    },
    "action": {
      "dtype": "float32",
      "shape": [7],
      "names": {
        "motors": ["x", "y", "z", "roll", "pitch", "yaw", "gripper"] 
        // unified 7-dim vector: [xyz, action type, gripper], motor_x if not provided
      }
    }
  }
}
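Once a dataset is converted, these fields can be read back from meta/info.json. A small sketch of how a consumer might inspect them; the helper name is ours, and the keys follow the structure shown above:

```python
import json
from pathlib import Path

def summarize_info(info):
    """Extract the fields a consumer usually needs from a LeRobot
    meta/info.json dict: robot type, control fps, and feature shapes."""
    return {
        "robot_type": info.get("robot_type", "unknown"),
        "fps": info.get("fps", 10),  # the script defaults to 10 when not provided
        "shapes": {name: feat["shape"] for name, feat in info["features"].items()},
    }

# In practice you would load the file from disk, e.g.:
# info = json.loads((Path(root) / "meta" / "info.json").read_text())
info = {
    "codebase_version": "v3.0",
    "robot_type": "franka",
    "fps": 3,
    "features": {
        "observation.state": {"dtype": "float32", "shape": [8]},
        "action": {"dtype": "float32", "shape": [7]},
    },
}
summary = summarize_info(info)
```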

Installation

  1. Install LeRobot: follow the instructions in the official repo.

  2. Install other dependencies: to read TFDS/RLDS data, install tensorflow and tensorflow-datasets:

    pip install tensorflow
    pip install tensorflow-datasets
    

Get started

  1. Download source code:

    git clone https://github.com/Tavish9/any4lerobot.git
    
  2. Modify the paths in convert.sh:

    python openx_rlds.py \
        --raw-dir /path/to/droid/1.0.0 \
        --local-dir /path/to/LEROBOT_DATASET \
        --repo-id your_hf_id \
        --use-videos \
        --push-to-hub
    
  3. Execute the script:

    bash convert.sh
    

Available OpenX_LeRobot Dataset

We have uploaded most of the OpenX datasets to Hugging Face 🤗.

You can visualize the datasets in this Hugging Face Space.