# Unitree G1 locomanipulation demo

The Unitree G1 humanoid is now supported in LeRobot! You can teleoperate, train locomanipulation policies, test in sim, and more. Both the 29 DoF and 23 DoF variants are supported.

---

## Part 1: Getting Started

### Install LeRobot on Your Machine

```bash
conda create -y -n lerobot python=3.12
conda activate lerobot

git clone https://github.com/unitreerobotics/unitree_sdk2_python.git
cd unitree_sdk2_python && pip install -e .

git clone https://github.com/huggingface/lerobot.git
cd lerobot
pip install -e '.[unitree_g1]'
```

### Test the Installation (Simulation)

```bash
lerobot-teleoperate \
  --robot.type=unitree_g1 \
  --robot.is_simulation=true \
  --teleop.type=unitree_g1 \
  --teleop.id=wbc_unitree \
  --robot.cameras='{"global_view": {"type": "zmq", "server_address": "localhost", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
  --display_data=true
```

This will launch a [MuJoCo sim instance](https://huggingface.co/lerobot/unitree-g1-mujoco/tree/main) for the G1.

- Press `9` to release the robot
- Press `7` / `8` to increase / decrease waist height

### Connect to the Robot

The G1's Ethernet IP is fixed at `192.168.123.164`. Your machine must have a static IP on the same subnet: `192.168.123.x`, where `x ≠ 164`.

```bash
# Replace 'enp131s0' with your ethernet interface name (check with `ip a`)
sudo ip addr flush dev enp131s0
sudo ip addr add 192.168.123.200/24 dev enp131s0
sudo ip link set enp131s0 up
```

### SSH into the Robot

```bash
ssh unitree@192.168.123.164
# Password: 123
```

### Install LeRobot on the G1

From the robot:

```bash
conda create -y -n lerobot python=3.12
conda activate lerobot

git clone https://github.com/unitreerobotics/unitree_sdk2_python.git
cd unitree_sdk2_python && pip install -e .

git clone https://github.com/huggingface/lerobot.git
cd lerobot
pip install -e '.[unitree_g1]'
```

> **Note:** The Unitree SDK requires CycloneDDS v0.10.2.
> See the [Unitree SDK docs](https://github.com/unitreerobotics/unitree_sdk2_python) for details.

---

## Part 2: Enable WiFi on the Robot

WiFi connectivity is blocked by default on the G1. To enable it, run the following on the robot:

```bash
sudo rfkill unblock all
sudo ip link set wlan0 up
sudo nmcli radio wifi on
sudo nmcli device set wlan0 managed yes
sudo systemctl restart NetworkManager
```

**On your laptop** (share internet via Ethernet):

```bash
sudo sysctl -w net.ipv4.ip_forward=1

# Replace wlp132s0f0 with your WiFi interface name
sudo iptables -t nat -A POSTROUTING -o wlp132s0f0 -s 192.168.123.0/24 -j MASQUERADE
sudo iptables -A FORWARD -i wlp132s0f0 -o enp131s0 -m state --state RELATED,ESTABLISHED -j ACCEPT
sudo iptables -A FORWARD -i enp131s0 -o wlp132s0f0 -j ACCEPT
```

**On the G1** (set default route through your laptop):

```bash
sudo ip route del default 2>/dev/null || true
sudo ip route add default via 192.168.123.200 dev eth0
echo "nameserver 8.8.8.8" | sudo tee /etc/resolv.conf

# Verify
ping -c 3 8.8.8.8
```

**Connect to a WiFi network:**

```bash
nmcli device wifi list
sudo nmcli connection add type wifi ifname wlan0 con-name "YourNetwork" ssid "YourNetwork"
sudo nmcli connection modify "YourNetwork" wifi-sec.key-mgmt wpa-psk
sudo nmcli connection modify "YourNetwork" wifi-sec.psk "YourPassword"
sudo nmcli connection modify "YourNetwork" connection.autoconnect yes
sudo nmcli connection up "YourNetwork"
ip a show wlan0
```

You can now SSH over WiFi:

```bash
ssh unitree@
# Password: 123
```

---

## Part 3: Teleoperation & Locomotion

### Run the Robot Server

On the robot:

```bash
python src/lerobot/robots/unitree_g1/run_g1_server.py --camera
```

### Run the Locomotion Policy

```bash
lerobot-teleoperate \
  --robot.type=unitree_g1 \
  --robot.is_simulation=false \
  --robot.robot_ip= \
  --teleop.type=unitree_g1 \
  --teleop.id=wbc_unitree \
  --robot.cameras='{"global_view": {"type": "zmq", "server_address": "", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
  --display_data=true \
  --robot.controller=HolosomaLocomotionController
```

We support both [HolosomaLocomotionController](https://github.com/amazon-far/holosoma) and [GrootLocomotionController](https://github.com/NVlabs/GR00T-WholeBodyControl).

---

## Part 4: Loco-Manipulation with the Homunculus Exoskeleton

We provide a loco-manipulation solution via the Homunculus Exoskeleton, an open-source 7 DoF exoskeleton for whole-body control. Assembly instructions are [here](https://github.com/nepyope/hmc_exo).

### Calibrate

```bash
lerobot-calibrate \
  --teleop.type=unitree_g1 \
  --teleop.left_arm_config.port=/dev/ttyACM1 \
  --teleop.right_arm_config.port=/dev/ttyACM0 \
  --teleop.id=exo
```

During calibration, move each joint through its entire range of motion. Once a joint's range has been fitted, move it to a neutral position and press `n` to advance.

### Record a Dataset

```bash
lerobot-record \
  --robot.type=unitree_g1 \
  --robot.is_simulation=true \
  --robot.cameras='{"global_view": {"type": "zmq", "server_address": "localhost", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
  --teleop.type=unitree_g1 \
  --teleop.left_arm_config.port=/dev/ttyACM1 \
  --teleop.right_arm_config.port=/dev/ttyACM0 \
  --teleop.id=exo \
  --dataset.repo_id=your-username/dataset-name \
  --dataset.single_task="Test" \
  --dataset.num_episodes=2 \
  --dataset.episode_time_s=5 \
  --dataset.reset_time_s=5 \
  --dataset.push_to_hub=true \
  --dataset.streaming_encoding=true \
  --dataset.encoder_threads=2
```

> **Note:** Omit `--teleop.left_arm_config.port` and `--teleop.right_arm_config.port` if you're only using the joystick.
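Before a recording session, it can help to sanity-check that a ZMQ camera stream like the one configured in `--robot.cameras` is actually reachable. The sketch below is illustrative only: the message format and the fake publisher are assumptions for demonstration (the real `run_g1_server.py` wire format may differ, e.g. raw JPEG bytes instead of JSON), but the subscriber side follows the standard pyzmq PUB/SUB pattern.

```python
# Hedged sketch: smoke-test a ZMQ PUB/SUB camera stream on port 5555.
# The JSON frame-metadata format is an assumption, not the real G1 protocol.
import threading
import time

import zmq

PORT = 5555

def fake_camera_server() -> None:
    """Publish a few fake frame-metadata messages, standing in for the robot server."""
    ctx = zmq.Context.instance()
    pub = ctx.socket(zmq.PUB)
    pub.bind(f"tcp://*:{PORT}")
    time.sleep(0.5)  # let the subscriber connect (PUB/SUB "slow joiner" issue)
    for i in range(20):
        pub.send_json({"camera_name": "head_camera", "frame_id": i,
                       "width": 640, "height": 480})
        time.sleep(0.05)

threading.Thread(target=fake_camera_server, daemon=True).start()

# Subscriber side: what a client of the "zmq" camera type would roughly do.
ctx = zmq.Context.instance()
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://localhost:{PORT}")
sub.setsockopt_string(zmq.SUBSCRIBE, "")  # subscribe to everything
frame = sub.recv_json()
print("received:", frame)
```

If `recv_json` blocks forever against the real server, it is likely sending raw bytes rather than JSON; try `sub.recv()` and inspect the payload instead.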
Example dataset: [nepyope/unitree_box_move_blue_full](https://huggingface.co/datasets/nepyope/unitree_box_move_blue_full)

---

## Part 5: Training & Inference

### Train

```bash
python src/lerobot/scripts/lerobot_train.py \
  --dataset.repo_id=your-username/dataset-name \
  --policy.type=pi05 \
  --output_dir=./outputs/pi05_training \
  --job_name=pi05_training \
  --policy.repo_id=your-username/your-repo-id \
  --policy.pretrained_path=lerobot/pi05_base \
  --policy.compile_model=true \
  --policy.gradient_checkpointing=true \
  --wandb.enable=true \
  --policy.dtype=bfloat16 \
  --policy.freeze_vision_encoder=false \
  --policy.train_expert_only=false \
  --steps=3000 \
  --policy.device=cuda \
  --batch_size=32
```

### Inference with RTC

Once trained, we recommend deploying policies using inference-time RTC:

```bash
python examples/rtc/eval_with_real_robot.py \
  --policy.path=your-username/your-repo-id \
  --policy.device=cuda \
  --robot.type=unitree_g1 \
  --robot.is_simulation=false \
  --robot.controller=HolosomaLocomotionController \
  --robot.cameras='{"global_view": {"type": "zmq", "server_address": "", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
  --task="task_description" \
  --duration=1000 \
  --fps=30 \
  --rtc.enabled=true
```

---

## Additional Resources

- [Unitree SDK Documentation](https://github.com/unitreerobotics/unitree_sdk2_python)
- [GR00T-WholeBodyControl](https://github.com/NVlabs/GR00T-WholeBodyControl)
- [Holosoma](https://github.com/amazon-far/holosoma)
- [LeRobot Documentation](https://github.com/huggingface/lerobot)
- [Unitree IL LeRobot](https://github.com/unitreerobotics/unitree_IL_lerobot)

---

_Last updated: March 2026_