# Unitree G1

<img
  src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/lerobot/unitree_thumbnail.jpg"
  alt="Unitree G1 locomanipulation demo"
  style={{ width: "100%" }}
/>

The Unitree G1 humanoid is now supported in LeRobot! You can teleoperate, train locomanipulation policies, test in sim, and more. Both 29 and 23 DoF variants are supported.

---

## Part 1: Getting Started

### Install the Unitree SDK

Follow the [unitree_sdk2_python installation guide](https://github.com/unitreerobotics/unitree_sdk2_python#installation). Tested with `unitree_sdk2py==1.0.1` and `cyclonedds==0.10.2`:

```bash
conda create -y -n lerobot python=3.12
conda activate lerobot
git clone https://github.com/unitreerobotics/unitree_sdk2_python.git
cd unitree_sdk2_python
pip install -e .
cd ..
```

### Install LeRobot

```bash
conda install ffmpeg -c conda-forge
conda install -c conda-forge "pinocchio>=3.0.0,<4.0.0"
git clone https://github.com/huggingface/lerobot.git
cd lerobot
pip install -e '.[unitree_g1]'
```

<Tip>
For now, pinocchio must be installed from conda-forge (not pip) to include the CasADi bindings needed for arm IK.
</Tip>

### Test the Installation (Simulation)

The simulation environment has its own dependencies; see the [Unitree G1 Mujoco EnvHub](https://huggingface.co/lerobot/unitree-g1-mujoco/tree/main) for the full list:

```bash
pip install mujoco loguru msgpack msgpack-numpy
```

```bash
lerobot-teleoperate \
    --robot.type=unitree_g1 \
    --robot.is_simulation=true \
    --teleop.type=unitree_g1 \
    --teleop.id=wbc_unitree \
    --robot.cameras='{"global_view": {"type": "zmq", "server_address": "localhost", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30, "warmup_s": 5}}' \
    --display_data=true \
    --robot.controller=GrootLocomotionController
```

This will launch a [MuJoCo sim instance](https://huggingface.co/lerobot/unitree-g1-mujoco/tree/main) for the G1. You can connect a gamepad to your machine before launching in order to control the robot's locomotion in sim. We support both [HolosomaLocomotionController](https://github.com/amazon-far/holosoma) and [GrootLocomotionController](https://github.com/NVlabs/GR00T-WholeBodyControl) via `--robot.controller`.

- Press `9` to release the robot
- Press `7` / `8` to increase / decrease waist height
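
The `--robot.cameras` value is a JSON dictionary mapping camera names to their settings. If the flag is rejected, a quick way to catch quoting mistakes is to parse the string with plain `json` first (a standalone sanity check, not part of LeRobot):

```python
import json

# The exact string passed to --robot.cameras (single-quoted in the shell)
cameras_arg = (
    '{"global_view": {"type": "zmq", "server_address": "localhost", '
    '"port": 5555, "camera_name": "head_camera", "width": 640, '
    '"height": 480, "fps": 30, "warmup_s": 5}}'
)

cameras = json.loads(cameras_arg)  # raises ValueError if the JSON is malformed
cfg = cameras["global_view"]
print(cfg["type"], cfg["port"], cfg["fps"])  # zmq 5555 30
```

If the shell has mangled your quoting, `json.loads` fails immediately with the offending position, which is easier to debug than a CLI parse error.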

### Connect to the Physical Robot

The G1's Ethernet IP is fixed at `192.168.123.164`. Your machine must have a static IP on the same subnet: `192.168.123.x` where `x ≠ 164`.

```bash
# Replace 'enp131s0' with your ethernet interface name (check with `ip a`)
sudo ip addr flush dev enp131s0
sudo ip addr add 192.168.123.200/24 dev enp131s0
sudo ip link set enp131s0 up
```

### SSH into the Robot

```bash
ssh unitree@192.168.123.164
# Password: 123
```

### Share Internet via Ethernet

The G1 needs internet access to clone repos and install packages. Share your laptop's connection over Ethernet:

**On your laptop:**

```bash
sudo sysctl -w net.ipv4.ip_forward=1

# Replace wlp132s0f0 with your WiFi interface name
# and enp131s0 with your ethernet interface name
sudo iptables -t nat -A POSTROUTING -o wlp132s0f0 -s 192.168.123.0/24 -j MASQUERADE
sudo iptables -A FORWARD -i wlp132s0f0 -o enp131s0 -m state --state RELATED,ESTABLISHED -j ACCEPT
sudo iptables -A FORWARD -i enp131s0 -o wlp132s0f0 -j ACCEPT
```

**On the G1:**

```bash
sudo ip route del default 2>/dev/null || true
sudo ip route add default via 192.168.123.200 dev eth0
echo "nameserver 8.8.8.8" | sudo tee /etc/resolv.conf

# Verify
ping -c 3 8.8.8.8
```

### Install the Unitree SDK on the G1

Follow the [unitree_sdk2_python installation guide](https://github.com/unitreerobotics/unitree_sdk2_python#installation):

```bash
conda create -y -n lerobot python=3.12
conda activate lerobot
git clone https://github.com/unitreerobotics/unitree_sdk2_python.git
cd unitree_sdk2_python
python -m pip install -e .
cd ..
```

### Install LeRobot on the G1

```bash
git clone https://github.com/huggingface/lerobot.git
cd lerobot
conda install -c conda-forge "pinocchio>=3.0.0,<4.0.0"
python -m pip install -e '.[unitree_g1]'
```

<Tip>
For now, pinocchio must be installed from conda-forge (not pip) to include the CasADi bindings needed for arm IK.
</Tip>

### (Optional) Enable WiFi on the Robot

For wireless SSH access, you can enable WiFi on the G1 (it's blocked by default):

```bash
sudo rfkill unblock all
sudo ip link set wlan0 up
sudo nmcli radio wifi on
sudo nmcli device set wlan0 managed yes
sudo systemctl restart NetworkManager
```

**Connect to a WiFi network:**

```bash
nmcli device wifi list

sudo nmcli connection add type wifi ifname wlan0 con-name "YourNetwork" ssid "YourNetwork"
sudo nmcli connection modify "YourNetwork" wifi-sec.key-mgmt wpa-psk
sudo nmcli connection modify "YourNetwork" wifi-sec.psk "YourPassword"
sudo nmcli connection modify "YourNetwork" connection.autoconnect yes
sudo nmcli connection up "YourNetwork"

ip a show wlan0
```

You can then SSH over WiFi instead of Ethernet:

```bash
ssh unitree@<ROBOT_WIFI_IP>
# Password: 123
```

---

## Part 2: Teleoperation & Locomotion

### Run the Robot Server

On the robot (from `~/lerobot`):

```bash
cd ~/lerobot
python src/lerobot/robots/unitree_g1/run_g1_server.py --camera
```

### Run the Locomotion Policy

You can run the teleoperation client from your laptop over Ethernet, over WiFi (experimental), or directly on the robot itself. Be mindful of any latency your network introduces.

**From your laptop:**

```bash
lerobot-teleoperate \
    --robot.type=unitree_g1 \
    --robot.is_simulation=false \
    --robot.robot_ip=<ROBOT_IP> \
    --teleop.type=unitree_g1 \
    --teleop.id=wbc_unitree \
    --robot.cameras='{"global_view": {"type": "zmq", "server_address": "<ROBOT_IP>", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
    --display_data=true \
    --robot.controller=HolosomaLocomotionController
```

We support both [GrootLocomotionController](https://github.com/NVlabs/GR00T-WholeBodyControl) and [HolosomaLocomotionController](https://github.com/amazon-far/holosoma) via `--robot.controller`.

---

## Part 3: Loco-Manipulation with the Homunculus Exoskeleton

We provide a loco-manipulation solution via the Homunculus Exoskeleton, an open-source 7 DoF exoskeleton for whole-body control; see the [hmc_exo repository](https://github.com/nepyope/hmc_exo).

### Calibrate

```bash
lerobot-calibrate \
    --teleop.type=unitree_g1 \
    --teleop.left_arm_config.port=/dev/ttyACM1 \
    --teleop.right_arm_config.port=/dev/ttyACM0 \
    --teleop.id=exo
```

During calibration, move each joint through its entire range of motion. Once the range has been captured, move the joint to a neutral position and press `n` to advance.
### Record a Dataset
|
|
|
|
```bash
|
|
lerobot-record \
|
|
--robot.type=unitree_g1 \
|
|
--robot.is_simulation=true \
|
|
--robot.cameras='{"global_view": {"type": "zmq", "server_address": "localhost", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
|
|
--teleop.type=unitree_g1 \
|
|
--teleop.left_arm_config.port=/dev/ttyACM1 \
|
|
--teleop.right_arm_config.port=/dev/ttyACM0 \
|
|
--teleop.id=exo \
|
|
--dataset.repo_id=your-username/dataset-name \
|
|
--dataset.single_task="Test" \
|
|
--dataset.num_episodes=2 \
|
|
--dataset.episode_time_s=5 \
|
|
--dataset.reset_time_s=5 \
|
|
--dataset.push_to_hub=true \
|
|
--dataset.streaming_encoding=true \
|
|
--dataset.encoder_threads=2
|
|
```

> **Note:** Omit `--teleop.left_arm_config.port` and `--teleop.right_arm_config.port` if you're only using the joystick.

Example dataset: [nepyope/unitree_box_move_blue_full](https://huggingface.co/datasets/nepyope/unitree_box_move_blue_full)

---

## Part 4: Training & Inference

### Train

```bash
python src/lerobot/scripts/lerobot_train.py \
    --dataset.repo_id=your-username/dataset-name \
    --policy.type=pi05 \
    --output_dir=./outputs/pi05_training \
    --job_name=pi05_training \
    --policy.repo_id=your-username/your-repo-id \
    --policy.pretrained_path=lerobot/pi05_base \
    --policy.compile_model=true \
    --policy.gradient_checkpointing=true \
    --wandb.enable=true \
    --policy.dtype=bfloat16 \
    --policy.freeze_vision_encoder=false \
    --policy.train_expert_only=false \
    --steps=3000 \
    --policy.device=cuda \
    --batch_size=32
```

### Inference with RTC

Once trained, we recommend deploying policies using inference-time real-time chunking (RTC):

```bash
lerobot-rollout \
    --strategy.type=base \
    --policy.path=your-username/your-repo-id \
    --policy.device=cuda \
    --robot.type=unitree_g1 \
    --robot.is_simulation=false \
    --robot.controller=HolosomaLocomotionController \
    --robot.cameras='{"global_view": {"type": "zmq", "server_address": "<ROBOT_IP>", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
    --task="task_description" \
    --duration=1000 \
    --fps=30 \
    --inference.type=rtc
```

---

## Additional Resources

- [Unitree SDK Documentation](https://github.com/unitreerobotics/unitree_sdk2_python)
- [GR00T-WholeBodyControl](https://github.com/NVlabs/GR00T-WholeBodyControl)
- [Holosoma](https://github.com/amazon-far/holosoma)
- [LeRobot Documentation](https://github.com/huggingface/lerobot)
- [Unitree IL LeRobot](https://github.com/unitreerobotics/unitree_IL_lerobot)

---

_Last updated: March 2026_