
Ultra Magnus

Ultra Magnus by APLUX is an AI-powered humanoid robot with advanced voice interaction and autonomous task execution, designed for urban management and commercial applications.
Software Type
Closed Source
Software Package
Fusion architecture operating system for integrated control. On-device large language model for voice interaction. Visual perception and motion planning software. Multi-OS environment supporting Android and Linux. AI modules enabling real-time decision-making and task execution.
Actuators
Ultra Magnus features agile limb actuators enabling smooth, precise bipedal movement and dexterous manipulation, suitable for complex tasks in dynamic environments.
Compute
Powered by Qualcomm’s QCS8550 processor combined with two MeiG Smart SNM970 AI modules, delivering nearly 100 TOPS of AI computing power for perception, control, and interaction.
Sensors
Equipped with advanced visual sensors for object and visitor recognition, feeding high-performance AI-driven perception modules that support autonomous navigation and task execution.
Max Op. time
- mins

Robot Brief

Ultra Magnus is a cutting-edge bipedal humanoid robot developed by Chengdu APLUX Intelligence Technology Co., Ltd. and unveiled at CES 2025 in collaboration with Qualcomm and Ti5 Robot. Equipped with agile limbs and a sleek metallic finish, Ultra Magnus is designed for versatile applications including urban management and commercial environments. Powered by Qualcomm’s QCS8550 processor and MeiG Smart’s high-performance AI modules, it delivers powerful computing for perception, decision-making, motion control, and voice interaction.

The robot supports on-device large language models (LLMs), enabling natural voice interaction and user intent understanding. It can autonomously recognize and greet visitors, understand complex commands, and perform tasks such as object recognition, grasping, and delivery.

Ultra Magnus is part of Chengdu’s strategic push to integrate smart robotics into urban management, demonstrated by its role directing traffic as an automated “officer” in a business district. The robot embodies APLUX’s vision of embodied intelligence, combining edge AI, sensor fusion, and advanced motion control to bring practical robotics to complex real-world scenarios.

Use Cases

Ultra Magnus autonomously interacts with people through natural voice communication, recognizes objects, and performs complex tasks such as grasping and delivering items. It can serve in urban management roles like directing traffic and providing service in commercial settings. Its AI-driven perception and decision-making enable it to operate efficiently and safely in dynamic environments.

Industries

  • Urban Management: Traffic direction and public interaction in city environments.
  • Commercial Services: Customer greeting, assistance, and delivery tasks.
  • AI & Robotics Development: Demonstrates advanced edge AI and humanoid robotics capabilities.
  • Smart City Initiatives: Supports Chengdu’s AI industry growth and smart urban infrastructure.

Specifications

Length: - mm
Width: - mm
Height (Rest): - mm
Height (Stand): - mm
Height (Min): - mm
Height (Max): - mm
Weight (With Batt.): - kg
Weight (No Batt.): - kg
Max Step Height: - mm
Max Slope: +/- - °
Op. Temp (Min): - °C
Op. Temp (Max): - °C
Ingress Rating: -

Intro

Ultra Magnus is a bipedal humanoid robot featuring agile limbs and a metallic exterior, developed through a collaboration between APLUX, Qualcomm, and Ti5 Robot. It incorporates Qualcomm’s QCS8550 processor and MeiG Smart’s AI modules to provide robust edge AI computing for perception, motion control, and voice interaction. The robot runs a fusion architecture operating system that supports natural language understanding via on-device large language models. Ultra Magnus can recognize visitors, understand their needs, and execute complex tasks like object recognition, grasping, and delivery smoothly and efficiently. It was showcased at CES 2025 and has been deployed in real-world urban management scenarios such as automated traffic direction.

Connectivity

  • Qualcomm QCS8550 processor with integrated AI modules (MeiG SNM970).
  • Supports multiple operating systems simultaneously (Android, Linux).
  • Edge AI large model deployment for voice and vision processing.
  • Wireless connectivity (not explicitly specified, but implied by the robot’s urban deployment).
  • Fusion architecture OS integrating perception, control, and interaction.

Capabilities

  • Natural voice interaction with user intent understanding via on-device LLM (see the sketch after this list).
  • Autonomous visitor recognition and greeting.
  • Visual perception for object recognition and manipulation.
  • Agile bipedal locomotion and motion control.
  • Complex task execution including grasping and delivery.
  • Real-time decision-making powered by Qualcomm QCS8550 and MeiG AI modules.
  • Multi-OS support (Android and Linux) for flexible software ecosystem.
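
APLUX has not published an SDK or API for Ultra Magnus, so the sketch below is only an illustrative Python outline of the kind of voice-to-task pipeline the capabilities above describe: audio is transcribed, an on-device LLM maps the utterance to an intent, and the intent is dispatched to a task routine. Every name in it (transcribe_audio, parse_intent_with_llm, TASK_HANDLERS, Intent) is a hypothetical placeholder, not part of any APLUX or Qualcomm software.

```python
"""Hypothetical sketch of a voice-driven task loop (not an APLUX API)."""
from dataclasses import dataclass
from typing import Callable, Dict, Optional


@dataclass
class Intent:
    """Structured result of the intent-parsing stage."""
    action: str            # e.g. "greet" or "fetch_and_deliver"
    target: Optional[str]  # object or person the action refers to


def transcribe_audio(audio_chunk: bytes) -> str:
    """Placeholder ASR stage; a real robot would run an on-device speech model."""
    return "please bring me the bottle of water"


def parse_intent_with_llm(utterance: str) -> Intent:
    """Placeholder for on-device LLM intent extraction."""
    if "bring" in utterance or "deliver" in utterance:
        return Intent(action="fetch_and_deliver", target="bottle of water")
    return Intent(action="greet", target=None)


# Map parsed intents to motion/task routines (stubs that only print here).
TASK_HANDLERS: Dict[str, Callable[[Intent], None]] = {
    "greet": lambda intent: print("Waving and greeting the visitor."),
    "fetch_and_deliver": lambda intent: print(
        f"Locating, grasping, and delivering: {intent.target}"
    ),
}


def run_once(audio_chunk: bytes) -> None:
    """One pass of the perception -> decision -> action pipeline."""
    utterance = transcribe_audio(audio_chunk)
    intent = parse_intent_with_llm(utterance)
    handler = TASK_HANDLERS.get(intent.action)
    if handler is None:
        print(f"No handler for intent '{intent.action}'; asking the user to rephrase.")
        return
    handler(intent)


if __name__ == "__main__":
    run_once(b"\x00\x01")  # dummy audio payload
```

Keeping speech recognition, intent parsing, and task dispatch as separate stages loosely mirrors the fusion architecture described above, in which perception, decision-making, and motion control are integrated yet handled by distinct modules.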