# Unitree G1

<img
  src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/lerobot/unitree_thumbnail.jpg"
  alt="Unitree G1 locomanipulation demo"
  style={{ width: "100%" }}
/>

The Unitree G1 humanoid is now supported in LeRobot! You can teleoperate, train locomanipulation policies, test in sim, and more. Both 29 and 23 DoF variants are supported.

---

## Part 1: Getting Started

### Install LeRobot on Your Machine

```bash
conda create -y -n lerobot python=3.12
conda activate lerobot

git clone https://github.com/unitreerobotics/unitree_sdk2_python.git
cd unitree_sdk2_python && pip install -e . && cd ..

git clone https://github.com/huggingface/lerobot.git
cd lerobot
pip install -e '.[unitree_g1]'
```

### Test the Installation (Simulation)

```bash
lerobot-teleoperate \
    --robot.type=unitree_g1 \
    --robot.is_simulation=true \
    --teleop.type=unitree_g1 \
    --teleop.id=wbc_unitree \
    --robot.cameras='{"global_view": {"type": "zmq", "server_address": "localhost", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
    --display_data=true
```

This will launch a [MuJoCo sim instance](https://huggingface.co/lerobot/unitree-g1-mujoco/tree/main) for the G1.

- Press `9` to release the robot
- Press `7` / `8` to increase / decrease waist height
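
The `--robot.cameras` value is a JSON dictionary mapping camera names to their configurations. If the command rejects it, quoting is the usual culprit; you can sanity-check the string before launching (values copied from the example above):

```bash
# Validate the camera config JSON before passing it to lerobot-teleoperate
CAM_CFG='{"global_view": {"type": "zmq", "server_address": "localhost", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}'
echo "$CAM_CFG" | python3 -m json.tool
```

If the JSON is malformed, `json.tool` prints the position of the error instead of the pretty-printed dictionary.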

### Connect to the Robot

The G1's Ethernet IP is fixed at `192.168.123.164`. Your machine must have a static IP on the same subnet: `192.168.123.x` where `x ≠ 164`.

```bash
# Replace 'enp131s0' with your ethernet interface name (check with `ip a`)
sudo ip addr flush dev enp131s0
sudo ip addr add 192.168.123.200/24 dev enp131s0
sudo ip link set enp131s0 up
```
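
If you are unsure whether the address you picked satisfies the subnet requirement, a quick check (any `192.168.123.x` host address other than `.164` works; `.200` is just the example used above):

```bash
# Prints True if the chosen address falls inside the robot's /24 subnet
python3 -c "import ipaddress; print(ipaddress.ip_address('192.168.123.200') in ipaddress.ip_network('192.168.123.0/24'))"
```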

### SSH into the Robot

```bash
ssh unitree@192.168.123.164
# Password: 123
```

### Install LeRobot on the G1

From the robot:

```bash
conda create -y -n lerobot python=3.12
conda activate lerobot

git clone https://github.com/unitreerobotics/unitree_sdk2_python.git
cd unitree_sdk2_python && pip install -e . && cd ..

git clone https://github.com/huggingface/lerobot.git
cd lerobot
pip install -e '.[unitree_g1]'
```

> **Note:** The Unitree SDK requires CycloneDDS v0.10.2. See the [Unitree SDK docs](https://github.com/unitreerobotics/unitree_sdk2_python) for details.

---

## Part 2: Enable WiFi on the Robot

Wi-Fi connectivity is blocked by default on the G1. To enable it:

```bash
sudo rfkill unblock all
sudo ip link set wlan0 up
sudo nmcli radio wifi on
sudo nmcli device set wlan0 managed yes
sudo systemctl restart NetworkManager
```

**On your laptop** (share internet via Ethernet):

```bash
sudo sysctl -w net.ipv4.ip_forward=1

# Replace wlp132s0f0 with your WiFi interface name
sudo iptables -t nat -A POSTROUTING -o wlp132s0f0 -s 192.168.123.0/24 -j MASQUERADE
sudo iptables -A FORWARD -i wlp132s0f0 -o enp131s0 -m state --state RELATED,ESTABLISHED -j ACCEPT
sudo iptables -A FORWARD -i enp131s0 -o wlp132s0f0 -j ACCEPT
```
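
Note that `sysctl -w` and `iptables` changes do not survive a reboot. If you want forwarding to persist, one option is a sysctl drop-in file (the file name here is arbitrary; persisting the iptables rules is distro-specific, e.g. via `iptables-persistent` on Debian/Ubuntu):

```bash
# Persist IP forwarding across reboots (file name is arbitrary)
echo 'net.ipv4.ip_forward=1' | sudo tee /etc/sysctl.d/99-g1-forwarding.conf
sudo sysctl --system
```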

**On the G1** (set default route through your laptop):

```bash
sudo ip route del default 2>/dev/null || true
sudo ip route add default via 192.168.123.200 dev eth0
echo "nameserver 8.8.8.8" | sudo tee /etc/resolv.conf

# Verify
ping -c 3 8.8.8.8
```

**Connect to a WiFi network:**

```bash
nmcli device wifi list

sudo nmcli connection add type wifi ifname wlan0 con-name "YourNetwork" ssid "YourNetwork"
sudo nmcli connection modify "YourNetwork" wifi-sec.key-mgmt wpa-psk
sudo nmcli connection modify "YourNetwork" wifi-sec.psk "YourPassword"
sudo nmcli connection modify "YourNetwork" connection.autoconnect yes
sudo nmcli connection up "YourNetwork"

ip a show wlan0
```

You can now SSH over WiFi:

```bash
ssh unitree@<ROBOT_WIFI_IP>
# Password: 123
```

---

## Part 3: Teleoperation & Locomotion

### Run the Robot Server

On the robot:

```bash
python src/lerobot/robots/unitree_g1/run_g1_server.py --camera
```

### Run the Locomotion Policy

```bash
lerobot-teleoperate \
    --robot.type=unitree_g1 \
    --robot.is_simulation=false \
    --robot.robot_ip=<ROBOT_IP> \
    --teleop.type=unitree_g1 \
    --teleop.id=wbc_unitree \
    --robot.cameras='{"global_view": {"type": "zmq", "server_address": "<ROBOT_IP>", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
    --display_data=true \
    --robot.controller=HolosomaLocomotionController
```

We support both [HolosomaLocomotionController](https://github.com/amazon-far/holosoma) and [GrootLocomotionController](https://github.com/NVlabs/GR00T-WholeBodyControl).
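
To switch controllers, only the controller flag changes (controller name as listed above); the rest of the command stays the same:

```bash
--robot.controller=GrootLocomotionController
```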

---

## Part 4: Loco-Manipulation with the Homunculus Exoskeleton

We provide a loco-manipulation solution via the Homunculus Exoskeleton, an open-source 7 DoF exoskeleton for whole-body control. Assembly instructions are available [here](https://github.com/nepyope/hmc_exo).

### Calibrate

```bash
lerobot-calibrate \
    --teleop.type=unitree_g1 \
    --teleop.left_arm_config.port=/dev/ttyACM1 \
    --teleop.right_arm_config.port=/dev/ttyACM0 \
    --teleop.id=exo
```

During calibration, move each joint through its entire range. After fitting, move the joint to a neutral position and press `n` to advance.

### Record a Dataset

```bash
lerobot-record \
    --robot.type=unitree_g1 \
    --robot.is_simulation=true \
    --robot.cameras='{"global_view": {"type": "zmq", "server_address": "localhost", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
    --teleop.type=unitree_g1 \
    --teleop.left_arm_config.port=/dev/ttyACM1 \
    --teleop.right_arm_config.port=/dev/ttyACM0 \
    --teleop.id=exo \
    --dataset.repo_id=your-username/dataset-name \
    --dataset.single_task="Test" \
    --dataset.num_episodes=2 \
    --dataset.episode_time_s=5 \
    --dataset.reset_time_s=5 \
    --dataset.push_to_hub=true \
    --dataset.streaming_encoding=true \
    --dataset.encoder_threads=2
```
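
As a quick sanity check on session length: the robot time for a run is roughly `num_episodes x (episode_time_s + reset_time_s)`, so the toy parameters above give about 20 seconds:

```bash
# 2 episodes x (5 s episode + 5 s reset) = 20 s of robot time
echo $(( 2 * (5 + 5) ))
```

Scale `num_episodes` and `episode_time_s` up accordingly when recording a real dataset.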

> **Note:** Omit `--teleop.left_arm_config.port` and `--teleop.right_arm_config.port` if you're only using the joystick.

Example dataset: [nepyope/unitree_box_move_blue_full](https://huggingface.co/datasets/nepyope/unitree_box_move_blue_full)

---

## Part 5: Training & Inference

### Train

```bash
python src/lerobot/scripts/lerobot_train.py \
    --dataset.repo_id=your-username/dataset-name \
    --policy.type=pi05 \
    --output_dir=./outputs/pi05_training \
    --job_name=pi05_training \
    --policy.repo_id=your-username/your-repo-id \
    --policy.pretrained_path=lerobot/pi05_base \
    --policy.compile_model=true \
    --policy.gradient_checkpointing=true \
    --wandb.enable=true \
    --policy.dtype=bfloat16 \
    --policy.freeze_vision_encoder=false \
    --policy.train_expert_only=false \
    --steps=3000 \
    --policy.device=cuda \
    --batch_size=32
```

### Inference with RTC

Once trained, we recommend deploying policies using inference-time RTC (real-time chunking):

```bash
python examples/rtc/eval_with_real_robot.py \
    --policy.path=your-username/your-repo-id \
    --policy.device=cuda \
    --robot.type=unitree_g1 \
    --robot.is_simulation=false \
    --robot.controller=HolosomaLocomotionController \
    --robot.cameras='{"global_view": {"type": "zmq", "server_address": "<ROBOT_IP>", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
    --task="task_description" \
    --duration=1000 \
    --fps=30 \
    --rtc.enabled=true
```

---

## Additional Resources

- [Unitree SDK Documentation](https://github.com/unitreerobotics/unitree_sdk2_python)
- [GR00T-WholeBodyControl](https://github.com/NVlabs/GR00T-WholeBodyControl)
- [Holosoma](https://github.com/amazon-far/holosoma)
- [LeRobot Documentation](https://github.com/huggingface/lerobot)
- [Unitree IL LeRobot](https://github.com/unitreerobotics/unitree_IL_lerobot)

---

_Last updated: March 2026_