Abel is a new-generation, hyper-realistic humanoid robot, conceived as a research platform for social interaction, emotion modeling, and studies on embodied intelligence. Its appearance resembles that of an 11–12-year-old boy. It is a unique piece, resulting from the collaboration between the Enrico Piaggio Research Center of the University of Pisa and Gustav Hoegen of Biomimic Studio, based in London.


Abel physically consists of a head and the upper part of a torso with arms and hands, all robotic parts driven by latest-generation Futaba, MKS, and Dynamixel servo motors. Twenty-one servo motors are housed in Abel’s head and are dedicated to facial expression, gaze, and simulated speech: four move the brow, eight move the eyes, one moves the jaw, and eight move the mouth, lips, and cheeks. Five motors are dedicated to neck and head movement. In addition, five servo motors are mounted in each arm (three for the shoulder, one for the elbow, one to twist the arm) and three servo motors are in each hand, for a total of 42 degrees of freedom.
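For reference, the actuator layout described above can be summarized as a simple configuration table. This is only an illustrative sketch; the group names are hypothetical labels and do not correspond to Abel’s actual control software.

    # Illustrative summary of Abel's actuator layout as described in the text.
    # Group names are hypothetical labels, not Abel's real control interface.
    ABEL_SERVO_LAYOUT = {
        "head": {"brow": 4, "eyes": 8, "jaw": 1, "mouth_lips_cheeks": 8},
        "neck": 5,
        "left_arm": {"shoulder": 3, "elbow": 1, "arm_twist": 1},
        "right_arm": {"shoulder": 3, "elbow": 1, "arm_twist": 1},
        "left_hand": 3,
        "right_hand": 3,
    }

    def count_dof(layout):
        """Recursively sum the servo counts in the layout dictionary."""
        if isinstance(layout, int):
            return layout
        return sum(count_dof(v) for v in layout.values())

    assert count_dof(ABEL_SERVO_LAYOUT) == 42  # matches the total stated above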


The mechatronics of the head gives Abel the ability to express a wide spectrum of emotions through facial expressions, complemented by a body that is also designed to produce emotionally meaningful gestures.


The humanoid is equipped with a camera integrated into the torso and integrated binaural microphones, which are specifically designed to emulate the acoustic perception of a human listener. The robot also has an internal speaker to reproduce its voice.


Abel’s cognitive system is a bio-inspired hybrid cognitive architecture that we devised and implemented, drawing on neuroscience and with particular attention to Damasio’s theory of mind and consciousness. The architecture is modular, conceptually divided into Sense, Plan, and Act functional blocks. Modules are grouped into services, which are designed as standalone applications. A service can be an application for image analysis, an animator of a robot part, or a component of the cognitive system dedicated to processing a specific domain of information. These services can be distributed across different computers, and communication among them is implemented with both the YARP and ROS middlewares, through which we create a local network dedicated to robot control. The framework allows both low-level reactive control, by means of direct connections between perception and actuation control services, and high-level deliberative control, which includes the possibility to plan context-dependent actions and to perform abstract reasoning on the acquired sensory data by means of symbolic manipulation.
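As a rough illustration of how one such service could be wired over YARP, the sketch below (using YARP’s Python bindings) reads perception messages from an input port and republishes a simple gaze command on an output port. The port names and message format are assumptions made for the example; they are not Abel’s actual interfaces.

    import yarp

    # Minimal sketch of a standalone service bridging perception to actuation
    # over YARP. Port names and message contents are illustrative only.
    yarp.Network.init()

    perception_port = yarp.BufferedPortBottle()
    perception_port.open("/abel/perception/face:i")   # hypothetical input port

    actuation_port = yarp.BufferedPortBottle()
    actuation_port.open("/abel/act/head:o")           # hypothetical output port

    try:
        while True:
            msg = perception_port.read(True)  # block until the next message
            if msg is None:
                continue
            # Example of a low-level reactive rule: forward a 2D gaze target.
            out = actuation_port.prepare()
            out.clear()
            out.addString("gaze")
            out.addFloat64(msg.get(0).asFloat64())
            out.addFloat64(msg.get(1).asFloat64())
            actuation_port.write()
    finally:
        perception_port.close()
        actuation_port.close()
        yarp.Network.fini()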


Moreover, the robot’s perception can be extended with the acquisition of physiological data (e.g., EEG signals, EDA, thermal images, HRV) from unobtrusive sensors that can be contactless, worn by the interacting human subjects, or distributed in the experimental setup.
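As one concrete example of how such a signal could be turned into a feature for the cognitive system, the sketch below computes RMSSD, a standard time-domain HRV index, from a series of inter-beat intervals. It is a self-contained illustration, not Abel’s actual signal-processing pipeline, and the example data are made up.

    import math

    def rmssd(ibi_ms):
        """Root mean square of successive differences (RMSSD), a common
        time-domain HRV index, computed from inter-beat intervals in ms."""
        if len(ibi_ms) < 2:
            raise ValueError("need at least two inter-beat intervals")
        diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
        return math.sqrt(sum(d * d for d in diffs) / len(diffs))

    # Example: a short, made-up IBI series (ms) from a wearable HR sensor.
    print(rmssd([812, 790, 805, 798, 820]))  # ~17.6 ms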


Abel therefore represents an exceptional robotic platform for implementing and testing theories from neuroscience, psychology, and sociology, with very promising applications in the therapy and diagnosis of mental illness, learning disabilities, autism spectrum disorders, and dementia.


More generally, it gives us the opportunity to embed virtually any capability offered by Artificial Intelligence in a human-like body able to express its own emotions and estimate those of its human interlocutors.


Abel represents the point of confluence of many human-centric research activities carried out at the Research Center “E. Piaggio”.
