Unitree Go2 Robot Process Architecture

Overview of the Unitree Go2 Robot Architecture

The Unitree Go2 is a high-performance quadruped robot built for agility and autonomy. It consists of a mechanical structure (four legs, body, and head) and an electronics stack with a distributed control system. The Go2’s architecture is modular, with separate hardware and software components working in concert; the sections below walk through the main components and how they interconnect.


Core hardware components include: joint motors with built-in controllers, an embedded PC for AI and perception, and various sensors (IMU, cameras, LiDAR, microphones). The software architecture is designed to run on multiple threads or processes that handle different aspects of the robot’s operation (motor control, motion control, perception, AI, etc.). These processes communicate via message passing or shared memory, ensuring the robot can react quickly to sensor inputs and control commands.

Processes Running on the Unitree Go2 Hardware

The Unitree Go2’s software is executed on a combination of on-board processors and edge computing units. Below we detail the main processes or software modules that run on the robot and their responsibilities:

Motor Controller Processes

At the lowest level, each leg’s joint motor has a dedicated motor controller process (typically implemented on a microcontroller or FPGA within the motor unit). These controllers are responsible for real-time control of motor position, velocity, and torque. They execute low-level control algorithms (such as PID or Field-Oriented Control) to achieve precise motion of each joint. The motor controllers continuously read encoder feedback and apply control commands (position setpoints or torque commands) to the motor coils. They operate at high frequency (on the order of 1 kHz or more) to ensure smooth and responsive joint movements. This distributed control ensures that even if one motor fails, the others can continue to operate (with degraded functionality).

On the Go2, Unitree’s motor controllers are based on custom hardware and firmware. Each joint is driven by a brushless DC (BLDC) motor with an integrated driver; the joint motors deliver a peak torque of 45 N·m, and the controllers use Field-Oriented Control (FOC) to achieve high-performance torque control. These controllers are programmed to run autonomously with minimal external intervention, focusing solely on their joint’s dynamics: each one closes a fast feedback loop that continuously adjusts the motor’s behavior based on encoder and current feedback.
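To make that control flow concrete, here is a minimal sketch of the kind of fixed-rate PD loop such a controller runs. The encoder/torque functions and gain values are placeholders for illustration, not Unitree’s firmware API, and a real FOC controller additionally regulates the motor phase currents inside this loop.

```python
import time

# Hypothetical hardware interface -- placeholders, not Unitree's firmware API.
def read_encoder():        # joint angle in radians
    return 0.0

def read_velocity():       # joint angular velocity in rad/s
    return 0.0

def send_torque(tau):      # torque command to the motor driver, N*m
    pass

KP, KD = 25.0, 0.5         # illustrative PD gains
TORQUE_LIMIT = 45.0        # Go2 joint peak torque, N*m
DT = 0.001                 # 1 kHz control loop

def pd_joint_loop(q_target):
    """Fixed-rate PD loop tracking a joint angle setpoint."""
    while True:
        q, dq = read_encoder(), read_velocity()
        tau = KP * (q_target - q) - KD * dq               # PD control law
        tau = max(-TORQUE_LIMIT, min(TORQUE_LIMIT, tau))  # saturate to the joint limit
        send_torque(tau)
        time.sleep(DT)  # real firmware would use a hardware timer, not sleep()
```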


Motion Control Processes

Above the motor controllers, a motion control process (often running on a main CPU) handles the robot’s gait generation and overall locomotion. This process is responsible for higher-level commands such as walking, running, and standing. It typically runs at a lower frequency (around 10–100 Hz) compared to the motor controllers. The motion control module takes high-level commands (e.g. move forward at X speed, or execute a specific gait pattern) and translates them into joint position or torque trajectories for the motors. It uses algorithms like inverse kinematics to compute the desired joint angles for the legs to achieve a given body pose, and gait generation algorithms (like ZMP-based balance control or Central Pattern Generators) to ensure stable walking. Additionally, the motion control process manages the robot’s posture and balance – adjusting leg movements in real time to compensate for disturbances or terrain changes. It interfaces with the perception system to incorporate obstacle avoidance and terrain adaptation into its motion plans.
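As an illustration of the inverse-kinematics step, the sketch below solves the standard planar two-link problem (hip pitch and knee) for one leg. The link lengths are assumed values, not the Go2’s published geometry, and the real controller solves the full three-joint leg in 3D.

```python
import math

# Illustrative planar 2-link leg IK; link lengths are assumed, not official Go2 values.
L1, L2 = 0.213, 0.213   # thigh and calf lengths in metres

def leg_ik(x, y):
    """Return (theta1, theta2) placing the foot at (x, y) in the hip frame,
    using the standard two-link cosine-rule solution."""
    d2 = x * x + y * y
    if d2 > (L1 + L2) ** 2 or d2 < (L1 - L2) ** 2:
        raise ValueError("foot target out of reach")
    cos_t2 = (d2 - L1**2 - L2**2) / (2 * L1 * L2)
    theta2 = math.acos(max(-1.0, min(1.0, cos_t2)))   # knee ("elbow") angle
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return theta1, theta2
```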

Unitree’s motion control is built on a hierarchical control framework. For example, the robot can be in different gait modes (e.g. walking, running, dance, etc.), each handled by its own sub-process or state machine. The high-level motion controller might coordinate these modes, switching between them based on the robot’s task or the user’s commands. This modular approach allows adding new gaits or behaviors by implementing new processes or modules that plug into the motion control system. The motion control process also receives feedback from the IMU and foot force sensors to adjust the robot’s balance. In summary, the motion control processes ensure the robot can move dynamically (walking, running, jumping) while maintaining stability, using algorithms that are tuned for the Go2’s specific capabilities.
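A minimal sketch of how such gait modes could be organized as a state machine is shown below; the mode names and allowed transitions are illustrative, not Unitree’s internal implementation.

```python
from enum import Enum, auto

class GaitMode(Enum):
    IDLE = auto()
    STAND = auto()
    WALK = auto()
    RUN = auto()

# Illustrative transition table: which modes may be entered from each mode.
ALLOWED = {
    GaitMode.IDLE:  {GaitMode.STAND},
    GaitMode.STAND: {GaitMode.IDLE, GaitMode.WALK},
    GaitMode.WALK:  {GaitMode.STAND, GaitMode.RUN},
    GaitMode.RUN:   {GaitMode.WALK},
}

class GaitStateMachine:
    def __init__(self):
        self.mode = GaitMode.IDLE

    def request(self, target: GaitMode) -> bool:
        """Switch modes only along allowed edges (e.g. RUN must pass through WALK)."""
        if target in ALLOWED[self.mode]:
            self.mode = target
            return True
        return False
```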

Perception Processes

The perception layer on the Go2 runs processes dedicated to sensing and understanding the environment. Key perception modules include: LiDAR processing, camera image processing, and IMU/state estimation. These processes run on the embedded PC or edge computer and use the robot’s sensors to gather data. For instance, the Go2 is equipped with a 4D LiDAR (Unitree’s L1) that provides a 360° × 90° field of view. A dedicated LiDAR process reads the raw point cloud and converts it into a useful representation (such as an obstacle map or depth image), and may perform real-time filtering, segmentation, and mapping. Similarly, a vision process handles camera inputs (e.g. stereo cameras or a single RGB camera) to detect objects, recognize faces, or interpret the surroundings, and can apply computer vision algorithms (object detection, image segmentation) to identify obstacles or people in the robot’s path.
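The sketch below shows one common way a LiDAR process can flatten a point cloud into a 2D occupancy grid for obstacle avoidance. The grid size, resolution, and height thresholds are assumed values, not Go2 firmware parameters.

```python
import numpy as np

# Minimal sketch: flatten a LiDAR point cloud (N x 3, robot frame, metres)
# into a 2D occupancy grid. Grid size/resolution are illustrative choices.
RESOLUTION = 0.05      # 5 cm cells
GRID_SIZE = 200        # 10 m x 10 m grid centred on the robot

def points_to_occupancy(points: np.ndarray,
                        z_min: float = 0.05, z_max: float = 0.5) -> np.ndarray:
    """Mark cells occupied where points fall between z_min and z_max
    (above the floor but low enough to be an obstacle for the robot)."""
    grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=np.uint8)
    mask = (points[:, 2] > z_min) & (points[:, 2] < z_max)
    xy = points[mask, :2]
    cells = np.floor(xy / RESOLUTION).astype(int) + GRID_SIZE // 2
    valid = np.all((cells >= 0) & (cells < GRID_SIZE), axis=1)
    grid[cells[valid, 1], cells[valid, 0]] = 1
    return grid
```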

Another important perception process is the state estimation filter. This process continuously estimates the robot’s pose and velocity using sensor fusion. It typically uses an Extended Kalman Filter (EKF) or a complementary filter that fuses data from the IMU, the joint encoders and foot contact sensors, and sometimes vision or LiDAR measurements. The goal is to provide accurate state information (position, orientation, velocity) even when individual sensors are noisy or unreliable. For example, the EKF can combine the IMU’s acceleration and gyro data with leg odometry (kinematics computed from the joint encoders and foot contacts) to estimate the robot’s movement over time. This estimated state is used by the higher-level control processes to make informed decisions. The perception processes run in parallel, each updating at its own rate: the LiDAR and camera processing might run at 10–20 Hz, while the state estimation filter runs at a few hundred Hz. All these perception processes feed information into the control system, enabling the robot to perceive its environment and adjust its actions accordingly.
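As a simple illustration of sensor fusion, the snippet below implements one update step of a complementary filter for pitch, blending gyro integration with the accelerometer’s gravity direction. The blend weight and axis conventions are assumptions; the robot’s actual estimator is more elaborate (e.g. an EKF that also fuses leg odometry).

```python
import math

ALPHA = 0.98   # weight on the gyro integration path (illustrative)

def complementary_filter(pitch, gyro_y, accel_x, accel_z, dt):
    """One update step of a complementary filter for pitch:
    integrate the gyro for short-term accuracy and blend in the
    accelerometer's gravity direction to cancel long-term drift."""
    pitch_gyro = pitch + gyro_y * dt                 # high-frequency path (gyro)
    pitch_accel = math.atan2(-accel_x, accel_z)      # low-frequency path (gravity)
    return ALPHA * pitch_gyro + (1.0 - ALPHA) * pitch_accel
```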

AI and Computation Processes

The Unitree Go2 is equipped with an embedded AI computing unit (a Jetson-class module on the higher-end variants) that runs higher-level artificial intelligence processes. These include AI inference engines, machine learning models, and task-level decision processes. One key process is the AI module that executes models for tasks like object recognition, gesture recognition, or policies learned through reinforcement learning. For example, the robot can be programmed to recognize voice commands or follow a person – these functions rely on AI models running on the robot. The AI process might interface with cloud services via the robot’s network (if connected) to offload complex computations or receive updates. However, many AI tasks are performed on-board to ensure real-time operation.
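The sketch below shows the general shape of such an on-board inference process using ONNX Runtime. The model file, input layout, and post-processing are hypothetical assumptions; the actual models and runtime used on the Go2 are not documented here.

```python
import numpy as np
import onnxruntime as ort   # assumption: an ONNX model is deployed on the edge computer

# "detector.onnx" is a hypothetical model file; input layout is assumed NCHW float32.
session = ort.InferenceSession("detector.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

def detect(frame_rgb: np.ndarray):
    """Run one inference pass on an HxWx3 uint8 camera frame and return the
    raw network outputs (post-processing depends on the model)."""
    blob = frame_rgb.astype(np.float32).transpose(2, 0, 1)[None] / 255.0
    return session.run(None, {input_name: blob})
```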

Another important aspect is the robot’s OS and runtime. The Go2 likely runs a Linux-based system with real-time extensions on its main computer, with real-time firmware (or an RTOS) on the motor microcontrollers. The scheduler manages all of the lower-level tasks (motor control, communication, etc.) so that they meet their timing constraints. In a Linux context, this typically means POSIX threads with real-time priorities or a PREEMPT_RT-patched kernel, which ensures the motor control loops get the CPU time they need. Critical processes (like the motor controllers) are scheduled first and cannot be starved by other tasks, which is crucial for maintaining the robot’s stability.
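On a PREEMPT_RT Linux system, a control process can request a real-time scheduling class roughly as follows. This is a generic Linux sketch (requiring root or the CAP_SYS_NICE capability), not Go2-specific code.

```python
import os

def make_realtime(priority: int = 80) -> None:
    """Ask the Linux scheduler to run this process under SCHED_FIFO so the
    control loop preempts ordinary tasks (requires CAP_SYS_NICE / root)."""
    param = os.sched_param(priority)
    os.sched_setscheduler(0, os.SCHED_FIFO, param)   # 0 = the calling process

if __name__ == "__main__":
    make_realtime()
    # ... run the high-rate control loop here ...
```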

The AI and computation processes also include communication and control logic that allows external users or other robots to interact with the Go2. For instance, the robot might run a web server process that provides an API for remote control or status monitoring. It could also run a communication service (built on middleware such as DDS, which also underpins ROS 2) to receive high-level commands or send sensor data. These processes ensure that the Go2 can be controlled via a smartphone app, a remote controller, or even other robots in a swarm.
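A toy version of such a status/control API might look like the following Flask sketch; the endpoint names and fields are hypothetical and not the Go2’s actual remote API.

```python
from flask import Flask, jsonify   # assumption: a lightweight HTTP framework

app = Flask(__name__)

# Hypothetical endpoints; the Go2's real remote API is not documented here.
@app.get("/status")
def status():
    return jsonify(battery=0.87, mode="WALK", velocity=[0.4, 0.0, 0.0])

@app.post("/stop")
def stop():
    # Here a real service would forward an emergency-stop request
    # to the motion control process.
    return jsonify(ok=True)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```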

Communication and Control Processes

Communication is a critical part of the Unitree Go2’s architecture. The robot has multiple communication processes to facilitate control and data exchange. On-board, there is likely a local communication bus (such as CAN or Ethernet) connecting the motor controllers to the main computer. A bus-interface process on the main computer reads data from the motor controllers (e.g. joint positions, currents) and sends control commands back, ensuring low-latency communication between the main CPU and the motors. Additionally, the Go2 supports wireless communication for user control and network connectivity. For example, it can connect to a smartphone app via Wi-Fi or Bluetooth. A wireless communication process handles the app’s control signals and status updates, allowing users to send commands (like walking or performing tricks) and receive sensor data in real time. This process can also handle voice commands via the robot’s microphones – interpreting spoken commands and triggering actions.
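For illustration, the pattern of such a bus-reading process looks roughly like the python-can sketch below. The SocketCAN channel name and the meaning of the frame payloads are assumptions; the Go2’s real low-level protocol is not documented here.

```python
import can   # python-can; channel name and frame layout below are assumptions

def read_joint_feedback(channel: str = "can0"):
    """Listen on a SocketCAN interface and print raw joint-feedback frames.
    Only the bus-reading pattern is shown; decoding the payload would require
    the (undocumented) frame format."""
    with can.Bus(channel=channel, interface="socketcan") as bus:
        for msg in bus:                       # blocking iterator over frames
            print(f"id=0x{msg.arbitration_id:X} data={msg.data.hex()}")

if __name__ == "__main__":
    read_joint_feedback()
```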

For developers, Unitree provides SDKs and APIs to communicate with the robot. The Unitree SDK (software development kit) allows external programs to send high-level commands to the robot (e.g. walk at a certain speed) or to receive data (like sensor readings). This SDK is a set of libraries and example code that runs on the user’s computer or another robot, typically communicating over the network via TCP/IP or UDP. The SDK process on the Go2 receives these commands and translates them into actions. Similarly, if using ROS (Robot Operating System), the Go2 can run ROS nodes that publish sensor topics and subscribe to command topics. The ROS node process is responsible for mapping ROS messages to the robot’s internal control and data structures, which allows integration with ROS-based software stacks (for navigation, SLAM, etc.).
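A minimal ROS 2 (rclpy) node that streams velocity commands might look like the sketch below. The /cmd_vel topic is a common convention, not a guaranteed Go2 interface; the exact topic names and message types depend on the driver in use (for example, the unofficial go2_ros2_sdk).

```python
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

class Go2Teleop(Node):
    """Publish velocity commands on /cmd_vel (assumed topic name)."""

    def __init__(self):
        super().__init__("go2_teleop")
        self.pub = self.create_publisher(Twist, "cmd_vel", 10)
        self.timer = self.create_timer(0.1, self.tick)   # 10 Hz command stream

    def tick(self):
        cmd = Twist()
        cmd.linear.x = 0.3    # walk forward at 0.3 m/s
        self.pub.publish(cmd)

def main():
    rclpy.init()
    node = Go2Teleop()
    rclpy.spin(node)
    rclpy.shutdown()

if __name__ == "__main__":
    main()
```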

Finally, the Go2 has processes for autonomous control that handle decision-making. For example, if the robot is in an autonomous navigation mode, there might be a navigation process that takes sensor inputs (like the LiDAR map and estimated position) and computes a path. This process might use algorithms like A* for path planning and send velocity commands to the motion control process to follow the path. If obstacle avoidance is needed, another process can react to the LiDAR or camera data, quickly adjusting the robot’s movement to avoid collisions. All these processes work together to form a coordinated system: sensors feed data into the perception and state estimation processes, which in turn feed into the control processes that send commands to the motors.
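As an illustration of the planning step, the sketch below runs textbook A* over a 2D occupancy grid with a Manhattan-distance heuristic; a real navigation stack would add costmaps, path smoothing, and continuous replanning.

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 2D occupancy grid (0 = free, 1 = obstacle) with 4-connectivity.
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, None)]      # (f, g, cell, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, current, parent = heapq.heappop(open_set)
        if current in came_from:
            continue                              # already expanded with a better cost
        came_from[current] = parent
        if current == goal:
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), current))
    return None
```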

Bounded Contexts and Inter-Process Communication

The Unitree Go2’s architecture can be thought of as a collection of bounded contexts – separate, specialized software modules each handling a distinct responsibility. Each bounded context (e.g. motor control, motion control, perception, etc.) operates with its own data and logic, and communicates with other contexts through well-defined interfaces. This modular design makes the system scalable and maintainable, as changes in one context (like improving the gait algorithm) can be made without affecting others, as long as the communication interface remains consistent.

The communication between these processes is achieved through inter-process communication (IPC) mechanisms. In the Go2’s case, IPC is handled via a combination of low-level hardware buses and high-level software protocols. At the hardware level, each motor controller is a separate bounded context that communicates with the main processor over a dedicated bus (CAN bus or similar). This direct communication ensures that motor commands are sent with minimal latency. At the software level, the main CPU uses message passing to coordinate tasks. For example, when the motion control process needs to update the leg positions, it sends a message to the motor control process specifying the target angles or forces. The motor control process then updates the motor controllers accordingly. This message passing can be implemented using a publish-subscribe or client-server pattern. Unitree’s SDK and ROS interfaces abstract some of this IPC, but internally, the robot’s processes use efficient communication to avoid bottlenecks.
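To illustrate the publish-subscribe pattern in miniature, the sketch below implements a tiny in-process message bus with per-subscriber queues. It is illustrative only; the robot’s actual middleware (DDS/ROS-style) is far more capable, and the joint names used in the example are purely for demonstration.

```python
import queue
import threading

class MessageBus:
    """Tiny publish-subscribe bus: each subscriber gets its own queue, so a
    slow consumer never blocks the publisher."""

    def __init__(self):
        self._subs = {}          # topic -> list of subscriber queues
        self._lock = threading.Lock()

    def subscribe(self, topic: str) -> "queue.Queue":
        q = queue.Queue(maxsize=100)
        with self._lock:
            self._subs.setdefault(topic, []).append(q)
        return q

    def publish(self, topic: str, msg) -> None:
        with self._lock:
            queues = list(self._subs.get(topic, []))
        for q in queues:
            try:
                q.put_nowait(msg)    # drop the message rather than stall the sender
            except queue.Full:
                pass

# Example: motion control publishes joint targets, the motor process consumes them.
bus = MessageBus()
joint_q = bus.subscribe("joint_targets")
bus.publish("joint_targets", {"hip": 0.1, "thigh": 0.8, "calf": -1.5})
print(joint_q.get_nowait())
```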

Another important aspect of bounded contexts is the concept of data ownership and coordination. Each context has its own data structures (for example, the perception context owns the LiDAR point cloud data, the motion context owns the current target gait parameters). When one context needs data from another, it requests it via the interface. For instance, the navigation context might request the current estimated position from the state estimation context to plan a path. The communication between contexts is usually asynchronous, allowing the sender to continue without waiting for a response (unless a synchronous call is needed, like an immediate stop command).

In summary, the Unitree Go2’s architecture is divided into clear bounded contexts with well-defined communication. This structure ensures that each part of the robot’s operation is isolated and can be optimized independently. It also allows the robot to handle complex tasks by breaking them into manageable sub-tasks, each handled by a specialized process. The IPC mechanisms (both hardware and software) ensure that these contexts can exchange information in a timely and reliable manner, enabling the robot to react to its environment and execute commands with precision.

Key Features of the Unitree Go2 Robot

The Unitree Go2 is packed with advanced features that set it apart: high-torque joint motors (45 N·m peak) driven by field-oriented control, a 4D LiDAR with a 360° × 90° field of view, on-board cameras and microphones, an embedded AI computing unit, and developer access through the Unitree SDK and ROS interfaces.

These features make the Unitree Go2 a versatile and capable quadruped robot. Whether used for research, education, or entertainment, the Go2’s combination of mobility, intelligence, and programmability allows it to perform a wide range of tasks autonomously.


Core Scientific and Academic Concepts

Developing and using the Unitree Go2 involves understanding several core scientific and academic concepts from fields like robotics, control theory, and artificial intelligence: feedback control (PID and field-oriented control), rigid-body and inverse kinematics, legged-locomotion theory (gait generation, ZMP-based balance, central pattern generators), state estimation and sensor fusion (Kalman and complementary filtering), and machine learning for perception and decision-making.

In summary, the Unitree Go2 is a practical implementation of many academic concepts in robotics and AI. Developers and researchers using the Go2 will benefit from a strong foundation in control theory, sensor fusion, legged locomotion, and AI. These concepts not only explain how the robot works but also provide a framework for improving and extending its capabilities. The Go2 serves as a testbed for exploring these academic ideas in a real-world, dynamic environment, making it a valuable tool for both learning and cutting-edge research in robotics.
