AlphaBot 2

AlphaBot 2 by AI² Robotics is a 34-DOF humanoid with embodied AI for autonomous cooking, inspection, and multi-step task execution across biotech, manufacturing, and service industries.
Software Type
Closed Source
Software Package
Runs the AlphaBrain embodied AI platform with unified vision-language-action modeling, real-time motion planning, deep task decomposition, facial expression analysis, and autonomous task execution.
Actuators
Equipped with high-precision actuators enabling 34 degrees of freedom, supporting humanlike flexibility, extended reach, and fine force feedback for compliant manipulation.
Compute
Powered by onboard AI processors optimized for real-time perception, reasoning, and control, supporting complex multi-modal sensor fusion and large-scale language models.
Sensors
Includes multi-camera 360° vision, force sensors for feedback, microphone arrays, and environmental sensors for spatial awareness and interaction.
Max Op. Time
- mins

Robot Brief

AlphaBot 2, developed by Shenzhen-based AI² Robotics and unveiled in April 2025, is a next-generation general-purpose humanoid robot powered by the company’s proprietary embodied AI platform, AlphaBrain. AlphaBrain integrates unified vision, language, and action capabilities (the GOVLA model) to enable real-time perception, reasoning, and full-body control. AlphaBot 2 uses a dual “Fast + Slow” system architecture: a fast real-time motion planner handles immediate responses, while a slower, deeper reasoning engine manages complex task decomposition and conversational reasoning.

The robot’s design includes a lifting and tilting waist-leg structure that supports an operational height range from 0 to 2.4 meters, allowing it to work across a wide vertical workspace. Its fine force feedback system balances strength and compliance, enabling AlphaBot 2 to autonomously perform complex physical tasks such as preparing traditional items (the “Four Treasures of the Study”), cooking, sterile filling, unpacking, disinfection, and visual inspection. The AI can interpret nuanced human intentions through facial expression analysis and natural language, supporting smooth human-robot collaboration.

With 34 degrees of freedom, humanlike flexibility, a 700 mm arm span, and a reach of up to 2.4 m, AlphaBot 2 adapts to diverse environments ranging from desktop setups to factory floors without manual reprogramming. Early industrial deployments include partnerships with biotechnology and semiconductor companies, with rollout planned for airports and residential communities in China by late 2025.
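
The “Fast + Slow” split can be pictured as two loops running at different rates: a deliberative planner that decomposes the current goal into subtasks at a low rate, and a reactive controller that tracks the active subtask and sensor feedback at a high rate. The Python sketch below illustrates that general pattern only; the class names (SlowPlanner, FastController), loop rates, and queue-based hand-off are assumptions for illustration, not the AlphaBrain API.

```python
# Illustrative dual-rate "Fast + Slow" control pattern.
# All names, rates, and structures here are hypothetical, not AI² Robotics' software.
import time
import threading
from queue import Queue, Empty


class SlowPlanner:
    """Deliberative loop: decomposes a high-level goal into subtasks at a low rate."""

    def __init__(self, task_queue: Queue):
        self.task_queue = task_queue

    def plan(self, goal: str) -> list[str]:
        # Placeholder for VLA-style reasoning; here just a canned decomposition.
        return [f"{goal}: step {i}" for i in range(1, 4)]

    def run(self, goal: str):
        for step in self.plan(goal):
            self.task_queue.put(step)   # hand the next subtask to the fast loop
            time.sleep(1.0)             # slow, deliberate cadence (~1 Hz)


class FastController:
    """Reactive loop: tracks the active subtask and sensor feedback at a high rate."""

    def __init__(self, task_queue: Queue):
        self.task_queue = task_queue
        self.current = None

    def step(self):
        try:
            self.current = self.task_queue.get_nowait()  # pick up a new subtask if one arrived
        except Empty:
            pass
        if self.current is not None:
            # Real firmware would send motor commands and react to force/vision feedback here.
            pass

    def run(self, duration_s: float = 4.0):
        t_end = time.time() + duration_s
        while time.time() < t_end:
            self.step()
            time.sleep(0.01)            # fast control period (~100 Hz)


if __name__ == "__main__":
    queue = Queue()
    threading.Thread(target=SlowPlanner(queue).run, args=("prepare tea",), daemon=True).start()
    FastController(queue).run()
```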

Use Cases

AlphaBot 2 autonomously performs a wide range of tasks including cooking, serving meals, sterile filling, unpacking, disinfection, and visual inspection on production lines. It perceives its environment with 360° vision, plans and executes complex multi-step tasks without specialized training, and collaborates naturally with humans by interpreting facial expressions and intentions. The robot flexibly adapts to new tasks and environments, from laboratory automation to household assistance.

Industries

  • Biotechnology & Pharmaceuticals: Automates sterile filling, unpacking, and disinfection to reduce contamination risks.
  • Manufacturing & Semiconductor: Supports visual inspection and complex assembly tasks.
  • Hospitality & Food Service: Prepares and serves meals autonomously.
  • Logistics & Warehousing: Handles unpacking and sorting operations.
  • Residential & Public Spaces: Planned deployment for assistance and service in airports and communities.

Specifications

  • Length: - mm
  • Width: - mm
  • Height (Rest): - mm
  • Height (Stand): 2400 mm
  • Height (Min): - mm
  • Height (Max): - mm
  • Weight (With Batt.): - kg
  • Weight (No Batt.): - kg
  • Max Step Height: - mm
  • Max Slope: ± - °
  • Op. Temp (Min): - °C
  • Op. Temp (Max): - °C
  • Ingress Rating: -

Intro

AlphaBot 2 features a humanoid form with 34 degrees of freedom, including a 700 mm arm span and an extended reach of up to 2.4 m. Its waist and legs incorporate a lifting and tilting mechanism that allows a 0–2.4 m operational height range. The robot is equipped with a sophisticated force feedback system enabling “rigid yet compliant” manipulation. AlphaBot 2’s AI platform, AlphaBrain, integrates vision, language, and action for real-time environment perception and task planning. It interprets human facial expressions to infer intentions and applies deep semantic understanding to translate abstract commands into precise physical actions. The robot’s hardware and software support rapid adaptation to new environments without manual reprogramming, making it suitable for both desktop and industrial-scale applications.
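
The “rigid yet compliant” manipulation attributed to the force feedback system is commonly realized with admittance-style control, in which measured contact force reshapes the position target so the arm yields to contact instead of fighting it. The one-dimensional sketch below shows that general technique; the gains, signal names, and virtual-dynamics parameters are assumptions for illustration and do not describe AI² Robotics’ actual controller.

```python
# Minimal 1-D admittance control sketch: contact force reshapes the position target.
# Gains and signals are hypothetical; this is not AI² Robotics' controller.
class AdmittanceController:
    """Virtual mass-spring-damper: m*a + d*v + k*(x - x_des) = f_ext."""

    def __init__(self, mass=2.0, damping=40.0, stiffness=300.0, dt=0.01):
        # Assumed virtual-dynamics parameters, chosen only for a stable demo.
        self.m, self.d, self.k, self.dt = mass, damping, stiffness, dt
        self.x = 0.0  # compliant position target (m)
        self.v = 0.0  # its velocity (m/s)

    def step(self, x_des: float, f_ext: float) -> float:
        """Update the compliant target from the planner goal and measured contact force."""
        a = (f_ext - self.d * self.v - self.k * (self.x - x_des)) / self.m
        self.v += a * self.dt
        self.x += self.v * self.dt
        return self.x  # feed this target to the low-level position servo


if __name__ == "__main__":
    ctrl = AdmittanceController()
    for t in range(200):
        contact = 15.0 if 50 <= t < 120 else 0.0  # simulated push against the end effector
        target = ctrl.step(x_des=0.10, f_ext=contact)
    print(f"settled compliant target: {target:.3f} m")
```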

Connectivity

  • 360° vision system (multi-camera setup)
  • WiFi and Ethernet for cloud connectivity and updates
  • Microphone array for voice interaction and ambient sound detection
  • Integration with industrial automation systems
  • Support for OTA software updates

Capabilities

AlphaBot 2 can:

  • Perform multi-step autonomous tasks like cooking, sterile filling, unpacking, and inspection
  • Navigate and operate across a 0–2.4 m vertical workspace
  • Use fine force feedback for delicate and stable object handling
  • Interpret human facial expressions and natural language for smooth collaboration
  • Adapt to new environments and tasks without manual reprogramming
  • Coordinate multi-body actions for complex workflows
  • Provide real-time 360° environmental perception and spatial awareness