InMoov & My Robot Lab for Dummies (part 1)

Say hello to Qwerty 🙂

InMoov is a robot created in 2012 by the French sculptor Gaël Langevin. The cool thing about this project is that it is completely open source, printable on almost any 3D printer (like my Anet A8), and customizable with an Arduino and many sensors.

Thanks to MyRobotLab, an open-source software project developed by many volunteers, working with Arduino, servos, cameras, and other hardware is easier than writing all the code from scratch.

This is my first serious project with my Anet A8. I decided to start by printing the head and some parts of the torso, just to test the printer and my skills with Arduino.

While building the model I made some mistakes, learned a lot of stuff, and burned some components 😂. I decided to share my findings and the parts I used, to help other people build their own robot.

A lot of information can be found on the official site (e.g.: here and here and here). This article contains my considerations and suggestions.

Printer and configuration

I tried different slicers for my Anet A8, but my preferred one is Simplify3D. After some tuning, this is my configuration file.

The pieces are very well designed and easy to print, so even a standard configuration of a printer is good enough.

3D Model

All the parts can be downloaded from the gallery section of the project site. In particular you need to download and print:

  • Eyes-mechanism
  • Face-and-Jaw
  • Neck
  • Skull-and-Ears (I printed the EarSpeakers one because I mounted the speakers in the ears)
  • Torso (print them all, except the Kinect parts)

Hardware used


A PC with Java installed (more on that in the second part) or a Windows tablet is needed to communicate with the Arduino (which acts as a proxy between My Robot Lab and the peripherals). For development purposes a PC is fine, but for a “demo unit” a tablet is more convenient. I borrowed a Trekstor Wintron 10.1 from my office: not very fast, but it does the job fairly well.


For the eyes, two little servos are needed (one for horizontal movement and one for vertical). Gael suggests using the Tower Pro SG90, but I prefer the MG90S: it has the same torque and speed, but it has metal gears and should be shock-proof. The price is a bit higher (5€ each vs. 4€), but nothing to worry about.


If you want to use the recognition features of OpenCV, you’ll obviously need a webcam. I’ve used the Hercules HD Twist camera: it’s cheap, has good image quality, and it is officially supported by InMoov. I’ve used this nice tutorial to disassemble the camera.

Head rotation (horizontal and vertical)

You need two HS-805BB servos: one for head rotation (x axis) and one for the neck (y axis). They are probably overkill given their high torque (20 kg/cm).

It draws a lot of juice, about 800 mA, so be sure to use a decent battery (see below).

Jaw mechanism

Gael uses the Hobby King HK 15298B: it’s a bit expensive but works well, until you destroy it with high-voltage batteries 😅. It has 90 degrees of rotation and is perfect for the jaw.

I’m instead using an MG996R: it is cheaper, but it has 180 degrees of rotation, so be careful when moving the jaw (you can break the mechanism if the servo rotates too far).

Alternatively, you can use the JX PDI-6221MG.
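Since the MG996R sweeps a full 180 degrees, it’s worth clamping the commanded angle in software before it ever reaches the servo. This is only a sketch of the idea: `safeJawAngle`, `JAW_MIN`, and `JAW_MAX` are hypothetical names and values that you’d tune on your own build.

```cpp
#include <algorithm>

// Hypothetical safe limits (in degrees) for the jaw servo; the exact values
// depend on your printed mechanism and must be tuned on the real robot.
const int JAW_MIN = 0;   // jaw closed
const int JAW_MAX = 30;  // jaw fully open without straining the linkage

// Clamp any requested angle into the safe range before writing it to the
// servo (e.g. jawServo.write(safeJawAngle(angle));).
int safeJawAngle(int requested) {
    return std::min(std::max(requested, JAW_MIN), JAW_MAX);
}
```

With a guard like this, even a buggy gesture script that commands a full 180° sweep cannot drive the servo past the printed jaw’s mechanical limit.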



Battery

Servos need juice. Lots of juice. I tried many different configurations, but in the end the best, suggested, and safest approach is to use a 6V SLA battery or a high-output (amps) power supply.

I’ve used this 6V 12Ah battery.


An Arduino Mega (or compatible) is suggested. For the head and torso an Arduino Uno is fine too, but I used the Elegoo Mega 2560 (a cheaper Arduino Mega clone).

For testing purposes I’ve used a breadboard.


  • PC speakers. I used spare ones lying in my drawer. The speaker diameter is 5 cm, which fits perfectly in the ear model. Probably these are good.
  • A controller, if you want to move the head; I’ve used the one from my beloved PS3.

Important note

Before installing each servo, make sure it is in the correct “rest” position. For example, for the eye servos you need to rotate the gear to 90 degrees, so that when assembling the eyes they will be able to rotate left and right.

You can use this Arduino snippet to rotate the gear:

#include <Servo.h>

Servo myservo;           // create servo object to control a servo

int pos = 90;            // variable to store the servo position

void setup() {
  myservo.attach(9);     // attaches the servo on pin 9 to the servo object
}

void loop() {
  myservo.write(pos);    // tell servo to go to the position in variable 'pos'
}

I’ve used this setup:

  • Eye servos (x and y): set position to 90°
  • Neck and head: set position to 90°
  • Jaw: set position to 0°

You can (and will) change the mapping and rest position of each servo, but starting from a correct position will save a lot of time and require less fine tuning.


Once all the parts are printed, you need to assemble everything. This video briefly explains the necessary steps:

Some pictures of my long journey

My beloved Anet A8. It’s heavily customized. I should write an entire post about it.

First parts printed and assembled

First part of head, neck and neck support

I printed an additional head stand from Thingiverse, to more easily assemble the neck and other parts.

First servo and jaw support

Mounted the first servo and first part of the jaw support

First video

I couldn’t resist making a first video 😊

You can see the size of the model: almost 1:1

Love the click sound effect ❤️

Look at me

In this picture you can see the eyes mechanism assembled. At that time the eyeballs were moved by SG90 servos, replaced afterward by MG90s.

Iris is painted. No camera yet


Eyes with mask

Eyes test. I wrote a little program to move the eyes using an Arduino joystick.
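That little joystick test can be sketched roughly like this. These are assumptions, not my original code: the joystick’s X/Y outputs are read from analog pins A0/A1, the eye servos sit on digital pins 9/10, and the 60–120° range is a conservative guess; adjust everything to your own wiring and mechanism.

```cpp
#include <Servo.h>

Servo eyeX;  // horizontal eye servo
Servo eyeY;  // vertical eye servo

const int JOY_X_PIN = A0;  // joystick horizontal axis (assumed wiring)
const int JOY_Y_PIN = A1;  // joystick vertical axis (assumed wiring)

void setup() {
  eyeX.attach(9);   // hypothetical pins; match your wiring
  eyeY.attach(10);
}

void loop() {
  // analogRead returns 0-1023; map it onto a conservative 60-120 degree
  // range so the eye mechanism never hits its mechanical limits.
  int x = map(analogRead(JOY_X_PIN), 0, 1023, 60, 120);
  int y = map(analogRead(JOY_Y_PIN), 0, 1023, 60, 120);
  eyeX.write(x);
  eyeY.write(y);
  delay(15);  // give the servos time to move
}
```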

Assembly continues

The old H-King servo for the jaw died suddenly.

His memory is alive in our hearts 😢

A mess of wires and a breadboard. Still life


Soldering is not one of my best skills…


…and neither is organizing cables

Camera tests! 📷

Hercules camera mounted on the left eye. I had to completely disassemble it.

Scognito’s status: satisfied

Video quality could be better, but I have only two arms (and a toilet roll 🧻).

The eyes are moved using the joystick: what they see is displayed on the PC (yes, it’s a normal webcam connected to the PC; and yes, it’s a Mac, not a PC…)

Voice commands

This is a funny interlude, not relevant for this post. Part 3 of this article will talk about software.

The video is in Italian. It recognizes various commands (looking in 4 directions). Qwerty is a bit touchy, I wonder why…


Speakers

This is the least detailed part: there is no footage of how my colleague disassembled the speakers.

Essentially, you have to strip all the plastic parts from the speakers.

Screw each speaker into the ear support. Some soldering / wire cutting may be necessary.

As you can see from the video, the volume knob is now screwed inside the head.

The audio jack plugs into the host device (tablet or PC).




Neck servo

For the up/down head movement another servo is necessary.

I had to print all the remaining neck parts and some of the torso.

Head stand is not needed anymore.

Wiring all the stuff

There are 7 wires coming out of the head (4 for the servos + 1 for the speaker jack + 1 for the speaker power + 1 for the webcam).

The best approach is to pass them through the neck, since it has enough room.

The backbone

Then it’s time to connect the wires to the Arduino. I’m using a breadboard for testing.

I’ve used paper tape to avoid confusion.

Each servo has 3 wires (usually black, red, and white or yellow):

  • Red is for the 5V supply
  • Black is for ground
  • White or yellow is the “command” (signal)

In the picture above you can see that each red wire is connected to the + rail of the breadboard, the black wires to the – rail, and the command wires directly to the Arduino.

More specifically (from left to right on the breadboard):

  • Ground (blue)
  • Pin 13 (green): head rotation
  • Pin 12 (orange): neck
  • Pin 26 (blue): jaw
  • Pin 24 (brown): eye Y
  • Pin 22 (yellow): eye X

Do not forget to connect the Arduino ground to the – rail!
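Putting the pin mapping together with the rest positions from the “Important note” section, a minimal Arduino sketch to attach every servo and hold it at rest could look like this (a sketch of the wiring described above, not the exact code I ran):

```cpp
#include <Servo.h>

Servo headRotation, neck, jaw, eyeY, eyeX;

void setup() {
  headRotation.attach(13);  // head rotation (green wire)
  neck.attach(12);          // neck (orange wire)
  jaw.attach(26);           // jaw (blue wire)
  eyeY.attach(24);          // eye Y (brown wire)
  eyeX.attach(22);          // eye X (yellow wire)

  // Rest positions listed earlier: 90 degrees everywhere, jaw closed at 0.
  headRotation.write(90);
  neck.write(90);
  eyeX.write(90);
  eyeY.write(90);
  jaw.write(0);
}

void loop() {
  // Nothing to do: the servos simply hold their rest positions.
}
```

Note that pins 22, 24, and 26 only exist on a Mega-class board, which is another reason to prefer the Mega 2560 over an Uno for the full wiring.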

One more thing

Object tracking 😎

In the second part we’ll see how to configure My Robot Lab.

8 thoughts on “InMoov & My Robot Lab for Dummies (part 1)”

  1. I started printing the neck and chin parts, but they are not the same pieces used in the tutorial.
    Where can I find a tutorial, or even just some simple pictures, to understand how to assemble the pieces?

  2. I started assembling the eyes and testing the movements. The eye y-axis servo has its min/max values set to 60–120, but the up/down eye movement is minimal.
    If I increase the min/max to 0–150 the movement is much wider, but unusable because the servo hits the piece holding the mouth speaker (below) and the skull support (above).
    Did you use a particular setting to get such a wide movement with such low min/max values?

  3. In the video example, when you say look left, right, etc., the eyes move, while for me the head turns.
    Did you create the gesture yourself?
