InMoov

InMoov is the first open-source, 3D printable life-sized humanoid robot, controllable via Arduino and MyRobotLab, designed for education, research, and maker projects.
Made by
InMoov
Software Type
Open Source
Software Package
MyRobotLab: open-source robotics framework written primarily in Java, with Python (Jython) scripting. Web-based UI for remote control. Supports speech recognition, gesture control, and vision processing. Includes a virtual InMoov simulation for development without physical hardware.
Actuators
Uses multiple servo motors as actuators to enable smooth, precise movements of limbs, fingers, and head components. The servos provide the necessary degrees of freedom for humanoid articulation.
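As a concrete illustration of how such servos are driven, hobby servos take a repeating pulse whose width encodes the target angle. A minimal sketch of that mapping (the 544-2400 µs range is the Arduino Servo library's default; actual InMoov servos are usually calibrated to narrower, joint-specific limits):

```python
def angle_to_pulse_us(angle, min_us=544, max_us=2400):
    """Map a servo angle (0-180 degrees) to a pulse width in microseconds.

    544-2400 us is the Arduino Servo library's default range; real InMoov
    joints are typically tuned to narrower limits to avoid mechanical stops.
    """
    if not 0 <= angle <= 180:
        raise ValueError("angle must be between 0 and 180 degrees")
    return min_us + (max_us - min_us) * angle / 180.0
```

With the defaults, 0° maps to 544 µs, 90° to 1472 µs, and 180° to 2400 µs.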
Compute
Controlled by Arduino microcontrollers managing servo motors and sensors. The higher-level processing and AI functions run on connected computers using MyRobotLab software.
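This division of labor (Arduino handling servo timing, the PC handling high-level logic) implies a simple serial command protocol between the two. The sketch below is purely illustrative framing, not MyRobotLab's actual MRLComm protocol; the start byte, opcode, and checksum scheme are all hypothetical:

```python
import struct

# Hypothetical framing constants; MyRobotLab's real MRLComm protocol differs.
MAGIC = 0xAA          # start-of-message marker
OP_SERVO_WRITE = 0x01 # "move servo" opcode

def pack_servo_command(pin, angle):
    """Pack a 'move servo on pin to angle' message:
    start byte, payload length, opcode, pin, angle, checksum (payload sum mod 256).
    """
    payload = struct.pack("BBB", OP_SERVO_WRITE, pin, angle)
    checksum = sum(payload) % 256
    return struct.pack("BB", MAGIC, len(payload)) + payload + struct.pack("B", checksum)
```

A PC-side controller would write such bytes over USB serial; the Arduino firmware would validate the checksum before moving the servo.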
Sensors
Equipped with micro-cameras for vision, Kinect sensors for 3D depth perception, touch sensors, and PIR sensors for motion detection and environmental awareness.
Max Op. time
-
mins

Robot Brief

InMoov is the first life-sized open-source humanoid robot fully constructed from 3D printable plastic components, created by French sculptor Gaël Langevin in 2012. Designed so that every part can be printed on a small-format 3D printer with a 12x12x12 cm build area, InMoov serves as a versatile platform for development, learning, and experimentation in robotics. Controlled by Arduino microcontrollers and running the open-source MyRobotLab software, it can perceive sound, see using micro-cameras and Kinect sensors, speak, and move independently. The robot recognizes voice commands, analyzes its 3D environment, and incorporates various sensors including touch and PIR sensors. Its open-source nature and modular design have fostered a global community of makers, researchers, and hobbyists who continuously extend its capabilities. InMoov has been used in artistic projects, prosthetics development, and educational settings, making it a pioneering model for accessible humanoid robotics.

Use Cases

InMoov can independently perceive its environment through cameras and sensors, recognize voice commands, speak, and perform movements such as grasping objects and gesturing. It serves as a development platform for robotics learning, capable of gesture control and interaction, making it suitable for artistic, educational, and prosthetic research purposes.

Industries

  • Education: Provides a hands-on platform for learning robotics and programming.
  • Research: Basis for AI, prosthetics, and robotics experimentation.
  • Art & Design: Used in artistic installations and creative projects.
  • Hobbyist/Maker Community: Enables affordable DIY humanoid robotics development.

Specifications

Length
-
mm
Width
-
mm
Height (Rest)
-
mm
Height (Stand)
-
mm
Height (Min)
-
mm
Height (Max)
-
mm
Weight (With Batt.)
-
kg
Weight (NO Batt.)
-
kg
Max Step Height
-
mm
Max Slope
+/-
-
°
Op. Temp (min)
-
°C
Op. Temp (Max)
-
°C
Ingress Rating
-

Intro

InMoov is a humanoid robot constructed from 3D printed plastic parts designed to fit on small consumer 3D printers. It is controlled by Arduino microcontrollers and uses MyRobotLab, an open-source robotics framework, for operation. The robot features independent eye movements, gesture capturing, and 3D depth recognition. It can perceive sound, see with micro-cameras and Kinect sensors, and respond to voice commands. Its modular design allows users to build and customize parts easily, making it ideal for educational and development purposes. The project emphasizes community collaboration and open-source sharing, enabling continuous improvement and diverse applications.

Connectivity

  • Arduino microcontroller interface.
  • USB and serial communication for control.
  • MyRobotLab software with Web UI (AngularJS) for remote operation.
  • Camera and sensor integration (micro-cameras, Kinect).
  • Open source software and hardware connectivity.

Capabilities

  • Voice command recognition and speech output.
  • Independent eye and head movement.
  • Gesture capturing for interactive control.
  • 3D depth perception using Kinect and cameras.
  • Touch and PIR sensors for environmental awareness.
  • Remote control via web UI.
  • Compatible with Java and Python through MyRobotLab.
  • Modular and customizable design.
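Voice-command handling of the kind listed above often reduces to a phrase-to-gesture lookup: the speech recognizer emits a text phrase, and the controller maps it to a named gesture routine. A minimal sketch (the phrases and gesture names here are hypothetical, not InMoov's actual command grammar):

```python
# Hypothetical phrase-to-gesture bindings for illustration only.
GESTURES = {
    "hello": "wave_right_arm",
    "rest": "relax_all_servos",
    "look at me": "track_face",
}

def dispatch(utterance):
    """Return the gesture name bound to a recognized phrase, or None.

    Normalizes whitespace and case so 'Hello ' and 'hello' match.
    """
    return GESTURES.get(utterance.strip().lower())
```

In MyRobotLab, the equivalent wiring is done by binding recognized phrases to gesture scripts through its Python scripting layer.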