
Unitree H1
unitree-h1

The Unitree H1 is a 180 cm tall, lightweight 47 kg humanoid research robot with 19 degrees of freedom (DOF), 3D LiDAR, depth cameras, 360 N·m knee torque, and a world‑record top speed of 3.3 m/s.
The Unitree H1 is a full‑size humanoid robot engineered for high‑agility locomotion, dynamic balance, and advanced research applications. Equipped with powerful 360 N·m joint actuators, a world‑record walking speed of 3.3 m/s, and 360‑degree depth perception via 3D LiDAR and depth cameras, the H1 delivers elite performance for robotics laboratories and industrial R&D.
The Unitree H1 is a full‑size, high‑performance humanoid robot designed for advanced locomotion research, robotics development, and dynamic real‑world applications. Standing approximately 180 cm tall and weighing around 47 kg, the H1 delivers exceptional agility through its in‑house engineered powertrain and lightweight structure. Its custom M107 joint motors provide up to 360 N·m of torque, enabling powerful, controlled, and fluid movement across complex terrain. This compact yet robust architecture allows the H1 to maintain high stability even while running, jumping, or executing rapid directional changes. [humanoidindex.org], [1x.tech]
A defining characteristic of the Unitree H1 is its world‑record locomotion capability, achieving speeds of over 3.3 m/s, with potential mobility exceeding 5 m/s under optimized conditions. This makes it the fastest full‑size electric humanoid robot currently available, outperforming most commercial research platforms in dynamic gait generation, balance control, and overall mobility. The H1’s leg joints offer 5 degrees of freedom, while its arms deliver 4 degrees of freedom per limb, enabling a wide range of human‑like movement patterns and interaction tasks with expandable functionality in the H1‑2 model. [humanoid-robots.io], [blog.robozaps.com]
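The speed figures above are quoted in m/s; a quick conversion (an illustrative snippet, not part of any Unitree tooling) shows how they compare to the km/h values often used on spec sheets:

```python
def ms_to_kmh(speed_ms: float) -> float:
    """Convert metres per second to kilometres per hour."""
    return speed_ms * 3.6

# World-record walking speed quoted for the H1
record_kmh = ms_to_kmh(3.3)     # 11.88 km/h
# Potential mobility under optimized conditions
potential_kmh = ms_to_kmh(5.0)  # 18.0 km/h
```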
For perception, the H1 integrates a complete 360° sensing suite consisting of 3D LiDAR (Livox MID360) and Intel RealSense depth cameras, allowing real‑time spatial awareness, mapping, and obstacle detection. These sensors enable autonomous navigation in structured and unstructured environments. Additional onboard sensing includes IMU, joint encoders, and force/torque feedback, essential for maintaining dynamic balance and enabling adaptive control strategies. This rich perception stack supports tasks such as SLAM, terrain analysis, autonomous walking, and human‑robot interaction. [humanoidindex.org], [humanoid.guide]
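As a rough illustration of how range data from a sensor like the MID360 feeds mapping, the sketch below converts a single polar LiDAR return into a Cartesian occupancy-grid cell. The 25 cm grid resolution and robot-frame convention are assumptions for the example, not Unitree's actual SLAM pipeline, which operates on full point clouds:

```python
import math

def lidar_return_to_cell(range_m: float, bearing_rad: float,
                         resolution_m: float = 0.25) -> tuple[int, int]:
    """Map one polar LiDAR return (robot frame) to an occupancy-grid cell.

    resolution_m is an assumed 25 cm grid; real mapping stacks fuse many
    returns per scan rather than marking single cells.
    """
    x = range_m * math.cos(bearing_rad)
    y = range_m * math.sin(bearing_rad)
    return (int(math.floor(x / resolution_m)), int(math.floor(y / resolution_m)))

# An obstacle 2.0 m straight ahead of the robot lands in cell (8, 0)
cell = lidar_return_to_cell(2.0, 0.0)
```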
The H1’s computing platform varies by configuration. The base model features dual‑processor setups suitable for motion control and onboard perception, while the advanced H1‑2 and research packages incorporate NVIDIA Jetson Orin (up to 100 TOPS) for AI‑accelerated workloads. These frameworks allow the robot to support ROS/ROS2, Python APIs, C++ SDKs, cloud connectivity, and advanced algorithm development. This makes the H1 a powerful tool for university research, robotics laboratories, reinforcement‑learning experimentation, motion‑policy development, and industrial AI prototyping. [humanoid-robots.io], [blog.robozaps.com]
The H1’s 864 Wh battery provides approximately 1.5–2 hours of runtime depending on load, with quick‑swap capability to minimize downtime during continuous testing. Its structural design, including hollow electrical routing, industrial‑grade crossed‑roller bearings, and a fully enclosed leg actuation system, ensures durability, clean cable management, and safety in demanding research environments. With a price range between $90,000 and $150,000, depending on configuration and region, the Unitree H1 is positioned as one of the most accessible full‑size, high‑agility humanoid robots available to research institutions and engineering teams worldwide.
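The runtime figure implies an average power draw; a small back-of-envelope check (illustrative only, using the spec-sheet numbers above):

```python
def avg_power_draw_w(capacity_wh: float, runtime_h: float) -> float:
    """Average power draw implied by battery capacity and runtime."""
    return capacity_wh / runtime_h

# 864 Wh over 1.5-2 h of runtime implies roughly 430-580 W average draw
high_load = avg_power_draw_w(864, 1.5)   # 576.0 W
light_load = avg_power_draw_w(864, 2.0)  # 432.0 W
```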
manufacturer
Unitree
warranty_years
2
battery_life_h
2
imu
The Unitree H1 integrates an onboard IMU that supplies orientation and acceleration feedback for dynamic balance, gait stabilization, and adaptive control, working alongside joint encoders and force/torque sensing to keep the robot stable during rapid gait changes and on uneven terrain.
storage_gb
0
feature_bullets
- World‑record walking speed of 3.3 m/s, with potential mobility exceeding 5 m/s under optimized conditions
- Custom M107 joint motors delivering up to 360 N·m of torque
- 360° perception via Livox MID360 3D LiDAR and Intel RealSense depth cameras
- ROS/ROS2 support with Python and C++ SDKs; optional NVIDIA Jetson Orin (up to 100 TOPS) in research configurations
- 864 Wh quick‑swap battery providing approximately 1.5–2 hours of runtime
manufacturer_country
China
height_cm
180
charging_time_h
3
microphones
The Unitree H1 includes an integrated microphone system used for voice‑related functions and basic audio perception. This onboard microphone supports interaction, sound awareness, and recognition tasks, complementing the robot’s 3D LiDAR and depth‑camera sensing suite.
programming
The Unitree H1 supports a flexible programming environment centered on ROS and ROS2, making it suitable for advanced robotics research and custom development workflows. Developers can interact with the robot using Python and C++ SDKs, allowing full access to locomotion control, perception modules, and navigation functions. [humanoid.guide]
Research‑grade configurations of the H1 also integrate NVIDIA Jetson Orin as an optional high‑performance compute module, enabling AI‑accelerated workloads such as machine‑learning pipelines, reinforcement‑learning locomotion models, and advanced perception algorithms. The robot’s dual‑computer architecture separates low‑level motion control from user‑side development, providing a stable environment for building complex autonomous behaviors. [humanoid-robots.io]
The H1 additionally supports remote operation through Unitree’s app and PC‑based tools, with APIs available for navigation, mapping, motion control, and human‑robot interaction. This makes the H1 a powerful research platform for universities, labs, and engineering teams working on next‑generation humanoid robotics.
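Unitree's actual SDK message types are not reproduced here; as a hedged sketch, a high-level velocity command of the kind such an API typically exposes might look like the following. The field names and the use of the 3.3 m/s spec figure as a clamp are illustrative assumptions, not Unitree's real interface:

```python
from dataclasses import dataclass

MAX_SPEED_MS = 3.3  # spec-sheet walking-speed figure, used here as a clamp

@dataclass
class VelocityCommand:
    """Hypothetical high-level locomotion command (not the real Unitree SDK type)."""
    vx: float = 0.0        # forward velocity, m/s
    vy: float = 0.0        # lateral velocity, m/s
    yaw_rate: float = 0.0  # turn rate, rad/s

    def clamped(self) -> "VelocityCommand":
        """Return a copy with linear velocities limited to the platform maximum."""
        def clip(v: float) -> float:
            return max(-MAX_SPEED_MS, min(MAX_SPEED_MS, v))
        return VelocityCommand(clip(self.vx), clip(self.vy), self.yaw_rate)

cmd = VelocityCommand(vx=5.0).clamped()  # vx limited to 3.3 m/s
```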
use_cases
- University and robotics‑laboratory research in locomotion, dynamic balance, and perception
- Reinforcement‑learning experimentation and motion‑policy development
- SLAM, terrain analysis, and autonomous‑navigation studies
- Human‑robot interaction research
- Industrial AI prototyping
robot_type
Humanoid
width_cm
57
num_joints_total
27
speakers
The Unitree H1 does not list any built‑in speaker system in its official specifications. Available documentation highlights sensors such as 3D LiDAR, depth cameras, IMU, joint encoders, force sensors, and an integrated microphone, but no audio‑output hardware has been confirmed.
os
The Unitree H1 runs a ROS‑compatible software environment that supports both ROS and ROS2, along with Python and C++ SDKs for development. The robot’s control stack is built on Unitree’s proprietary real‑time motion‑control framework, while higher‑level functions can be programmed through open interfaces. Research configurations use dual onboard processors—and in some versions an optional NVIDIA Jetson Orin compute module—to handle perception, navigation, and AI workloads.
category
Research / Industrial
depth_cm
22
num_joints_arms
4
cpu
Intel Core i7‑1265U
certifications
0
price_eur
weight_kg
47
num_joints_legs
5
gpu
The base Unitree H1 does not list a discrete GPU in its standard configuration; research‑grade packages offer an optional NVIDIA Jetson Orin compute module (up to 100 TOPS) for AI‑accelerated perception, navigation, and machine‑learning workloads.
safety_features
360° obstacle detection via 3D LiDAR and depth cameras
price_usd
89,741
max_speed_ms
5
camera_system
The Unitree H1 features a full 360° perception camera system powered by a 3D LiDAR sensor and an Intel RealSense depth camera, enabling high‑precision spatial mapping, obstacle detection, and autonomous navigation in complex environments. This depth‑sensing suite provides real‑time panoramic awareness and supports advanced tasks such as SLAM and terrain analysis.
ram_gb
datasheet_pdf
9
delivery_time
12
payload_kg
15
lidar
Livox MID360 3D LiDAR providing 360° coverage, integrated with Intel RealSense depth cameras for spatial mapping and obstacle detection
ai_capabilities
The Unitree H1 delivers advanced AI‑driven locomotion, balance, and perception capabilities designed for high‑performance research and real‑world navigation. Its control system enables dynamic gait planning, real‑time balance correction, and high‑speed motion, allowing the robot to reach running speeds of over 3.3 m/s. These abilities are powered by Unitree’s high‑torque actuators and an optimized software stack for agile movement and fast decision‑making.
The robot integrates a full 360° perception system that combines 3D LiDAR, depth cameras, IMU data, and joint‑level feedback sensors. This multimodal sensing architecture supports environment mapping, obstacle detection, SLAM, and autonomous terrain navigation, enabling the H1 to adapt its movements in real time and operate in complex indoor and outdoor environments.
For AI development, the H1 supports ROS and ROS2, as well as Python and C++ SDKs. Research configurations also offer NVIDIA Jetson Orin for accelerated computing. These platforms enable machine‑learning workflows, reinforcement‑learning locomotion policies, perception pipelines, and advanced human‑robot interaction experiments. Its dual‑layer compute architecture separates motion control from user development, giving researchers a stable and powerful base for experimentation.
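Reinforcement-learning locomotion policies of the kind mentioned above commonly use a velocity-tracking reward; the exponential form below is a standard shaping choice in the literature, shown as a generic illustration rather than Unitree's actual training setup (sigma is an assumed scale parameter):

```python
import math

def velocity_tracking_reward(v_cmd: float, v_actual: float,
                             sigma: float = 0.25) -> float:
    """Reward in (0, 1]: 1.0 when the actual forward velocity matches the
    command, decaying exponentially with squared tracking error."""
    return math.exp(-((v_cmd - v_actual) ** 2) / sigma)

perfect = velocity_tracking_reward(1.0, 1.0)  # 1.0
partial = velocity_tracking_reward(1.0, 0.5)  # < 1.0, penalized tracking error
```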
review_rating
