
Menteebot

MenteeBot is a 175 cm humanoid robot with 25 kg payload, AI-driven Sim2Real learning, 3D environment mapping, and LLM-based task planning for industrial and household automation.
Software Type
Closed Source
Software Package
Runs advanced AI software integrating Sim2Real-trained locomotion, NeRF-based 3D mapping, LLM-powered natural language processing, autonomous navigation, and manipulation control.
Actuators
Custom-designed actuators deliver up to three times more power than typical counterparts, enabling strong, precise, and efficient movements, including heavy lifting and fine manipulation.
Compute
Onboard computing supports real-time AI inference, large language model processing, sensor fusion, and control algorithms for locomotion and manipulation.
Sensors
Equipped with multiple cameras, fisheye lenses for 360° vision, depth sensors, and tactile sensors for environment mapping, obstacle detection, and object recognition.
Max Op. Time
300 mins

Robot Brief

MenteeBot is an advanced humanoid robot developed by Israeli company Mentee Robotics, founded in 2022 by experts in AI, computer vision, and machine learning. Standing 175 cm tall and weighing 70 kg, it is designed for both household and industrial applications and is capable of performing labor-intensive and dexterous tasks.

MenteeBot V3.0 features custom actuators delivering up to three times more power than typical actuators, enabling it to lift up to 25 kg and perform precise handling tasks such as assembly, packaging, loading, and unloading. It incorporates a hot-swappable battery for continuous 24/7 operation.

The robot employs a Simulator-to-Reality (Sim2Real) machine learning approach for locomotion and task learning, allowing extensive training in virtual environments before adapting to real-world scenarios with minimal additional data. MenteeBot uses NeRF-based algorithms for real-time 3D environment mapping and localization, creating cognitive semantic maps for dynamic navigation and obstacle avoidance. Its AI integrates Large Language Models (LLMs) for natural language understanding and task planning, enabling it to interpret verbal commands, engage in conversations, and autonomously plan complex tasks. The robot's dexterity is enhanced by coordinated locomotion and hand movements, allowing it to balance dynamically while carrying weights or reaching out.

Mentee Robotics plans to launch a production-ready version by early 2025, featuring camera-only sensing, proprietary electric motors for enhanced dexterity, and fully integrated AI for complex reasoning and on-the-fly learning.
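
The Sim2Real idea described above is commonly implemented with domain randomization: simulator parameters are perturbed every training episode so the learned policy tolerates real-world variation. The sketch below is illustrative only (the names `randomized_physics` and `train` and the parameter ranges are assumptions, not Mentee Robotics' actual pipeline); it shows the randomization schedule, not the policy update itself.

```python
import random

def randomized_physics(rng: random.Random) -> dict:
    """Sample simulator parameters from wide ranges for one episode."""
    return {
        "ground_friction": rng.uniform(0.4, 1.2),
        "payload_kg": rng.uniform(0.0, 25.0),      # up to the stated 25 kg payload
        "motor_strength_scale": rng.uniform(0.8, 1.1),
        "sensor_noise_std": rng.uniform(0.0, 0.05),
    }

def train(num_episodes: int, seed: int = 0) -> list[dict]:
    """Return the randomized configs a policy would be trained under.

    A real pipeline would roll out the locomotion policy in each config
    and update it; here we only generate the schedule.
    """
    rng = random.Random(seed)
    return [randomized_physics(rng) for _ in range(num_episodes)]

configs = train(3)
```

Training across such perturbations is what lets a simulator-trained policy adapt to the real robot with minimal additional data.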

Use Cases

MenteeBot autonomously walks, runs, and navigates complex environments at speeds up to 1.5 m/s, lifting and carrying objects up to 25 kg. It performs industrial tasks such as loading, unloading, assembly, and packaging with high precision. The robot understands and responds to natural language commands, plans task execution using AI reasoning, and adapts movements dynamically for balance and dexterity. It can perform household chores like cleaning and laundry handling and learns new tasks through verbal instruction and visual imitation.
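
The "understands natural language commands, plans task execution" behavior above amounts to decomposing a verbal instruction into executable steps. A minimal sketch, assuming a keyword-matching stub in place of a real LLM call (the step names and `plan_task` function are hypothetical, not MenteeBot's API):

```python
def plan_task(command: str) -> list[str]:
    """Decompose a verbal command into ordered robot actions.

    Keyword matching stands in for the LLM's reasoning here; a real
    planner would also ground object references against the 3D map.
    """
    cmd = command.lower()
    steps: list[str] = []
    if "pick up" in cmd or "lift" in cmd:
        steps += ["locate_object", "navigate_to_object", "grasp", "lift"]
    if "carry" in cmd or " to " in cmd:
        steps += ["plan_path", "walk_to_target"]
    if "put" in cmd or "place" in cmd or "unload" in cmd:
        steps += ["lower", "release"]
    return steps or ["ask_for_clarification"]

plan = plan_task("Pick up the 20 kg crate and carry it to the loading dock")
# → ['locate_object', 'navigate_to_object', 'grasp', 'lift',
#    'plan_path', 'walk_to_target']
```

An unrecognized command falls through to a clarification request rather than an empty plan, which is the usual safe default for autonomous task planners.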

Industries

  • Manufacturing & Industrial Automation: Heavy lifting, assembly, packaging, and logistics.
  • Warehousing & Logistics: Loading, unloading, and material handling.
  • Household & Service Robotics: Cleaning, laundry, and domestic assistance.
  • Research & Development: Platform for AI and robotics innovation.
  • Healthcare & Assistance: Potential for caregiving and support tasks.

Specifications

  • Length: N/A
  • Width: N/A
  • Height (Rest): N/A
  • Height (Stand): 1750 mm
  • Height (Min): N/A
  • Height (Max): N/A
  • Weight (With Batt.): N/A
  • Weight (No Batt.): 70 kg
  • Max Step Height: N/A
  • Max Slope: N/A
  • Op. Temp (Min): N/A
  • Op. Temp (Max): N/A
  • Ingress Rating: N/A

Intro

MenteeBot stands 175 cm tall, weighs 70 kg, and features 40 degrees of freedom. It uses custom high-power actuators for enhanced strength and precision, with hands capable of a pinch force of 30 N per finger. The robot’s mobility includes walking at 1.5 m/s and running sideways with dynamic balance. It employs NeRF-based 3D mapping and localization, enabling semantic understanding of its environment. Large Language Models facilitate natural language interaction and autonomous task planning. The robot’s design emphasizes integration of locomotion and manipulation for human-like dexterity, with the ability to carry loads while maintaining balance.
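
The "cognitive semantic map" mentioned above pairs geometry with labels so the planner can reason about what occupies a cell, not just whether it is occupied. A minimal sketch under assumed names (`SemanticMap`, `is_traversable` are illustrative, not MenteeBot's actual representation, which is built from NeRF reconstructions rather than a 2D grid):

```python
from dataclasses import dataclass, field

@dataclass
class SemanticMap:
    """Toy 2D semantic grid: each cell carries a label used for navigation."""
    width: int
    height: int
    cells: dict = field(default_factory=dict)  # (x, y) -> semantic label

    def label(self, x: int, y: int, name: str) -> None:
        self.cells[(x, y)] = name

    def is_traversable(self, x: int, y: int) -> bool:
        # Unlabeled cells default to free space.
        return self.cells.get((x, y), "free") in {"free", "floor"}

m = SemanticMap(10, 10)
m.label(3, 4, "pallet")   # an obstacle the planner must route around
m.label(5, 5, "floor")
```

Because cells carry names rather than bare occupancy bits, the same map can answer both "is this cell blocked?" and "where is the pallet?", which is what enables verbal commands to be grounded in the environment.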

Connectivity

  • WiFi and standard wireless communication (inferred)
  • Multiple cameras including fisheye for 360° vision
  • Depth sensors for environment perception
  • AI processing units supporting LLMs and real-time control

Capabilities

  • Lifts and carries up to 25 kg
  • Walking speed of 1.5 m/s and running capability
  • 40 degrees of freedom for flexible, human-like motion
  • Pinch force of 30 N per finger for precise grasping
  • Real-time 3D environment mapping using NeRF algorithms
  • Large Language Model-based natural language understanding and task planning
  • Simulator-to-Reality machine learning for locomotion and task adaptation
  • Autonomous navigation with obstacle avoidance
  • Dynamic balance while manipulating objects
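
The last capability, keeping balance while manipulating objects, reduces in the static case to keeping the combined center of mass of robot plus payload over the foot support region. A hedged sketch using the stated 70 kg body and 25 kg payload; the 0.4 m reach and support-region geometry are illustrative assumptions:

```python
def combined_com_x(robot_com_x: float, robot_mass: float,
                   load_com_x: float, load_mass: float) -> float:
    """Mass-weighted x-position of the combined center of mass."""
    total = robot_mass + load_mass
    return (robot_com_x * robot_mass + load_com_x * load_mass) / total

def is_balanced(com_x: float, support_min: float, support_max: float) -> bool:
    """Statically balanced if the combined CoM lies over the support region."""
    return support_min <= com_x <= support_max

# A 25 kg load held 0.4 m in front of a 70 kg robot shifts the combined
# CoM forward by about 0.105 m; the controller must keep that point over
# the feet, e.g. by leaning the torso back or stepping.
com = combined_com_x(0.0, 70.0, 0.4, 25.0)
```

Dynamic balance during walking uses richer criteria (e.g. zero-moment-point control), but this static check shows why heavy payloads force whole-body posture adjustments.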