Robots nowadays have the potential to interact with people in daily life. It is believed that this ability will enable them to play an essential role in human society in the not-so-distant future.
In this not-so-futuristic scenario, it will be of fundamental importance to benefit from the bidirectional active (responding to a stimulus) and passive (observation) interactions that will take place between robots and humans. These exchanges will make available an unprecedented amount of behavioral data: long sequences of stimulus-response pairs and observations from which computational models must extract sense and knowledge in order to improve their ability to predict user states and intentions. Such interactions must happen in a context where robots are able to learn and increase their skills. Otherwise, a robot's function is condemned to remain restricted to its initial pre-programmed tasks, in very narrow, controllable situations with the same initial conditions.
To prevent this, we have to ensure that computational models embodied in agents and robots can take advantage of, and learn over long periods of time from, this bidirectional human ↔ robot interaction.
EASEL will explore and develop a theoretical understanding of human-robot symbiotic interaction (HRSI), where symbiosis is defined as the capacity of the robot and the person to mutually influence each other and alter each other's behaviour over different time-scales (for instance, within encounters and across encounters). Symbiosis requires that the robot can read, and be responsive to, the behaviour and emotional state of the person, and adapt its own behaviour in ways that have predictable effects on the person. We will develop a theory of the dynamics of human-robot symbiosis, identifying the main parameters that influence them.
The central challenge of the project is to extract usable knowledge from symbiotic robot ↔ human interactions, going beyond instantaneous recognition of user states and intentions or immediate stimulus-response associations.
A fundamental objective of the project is to go beyond the view of human-robot communication as the transfer of task-based information via basic communication channels. We believe that creating a new generation of symbiotic robots will require designers and scientists to address the emotional and inter-personal dimensions of social interaction.
It is in this context that a new class of Robotic Based Tutoring Solutions (RBTS) must be introduced. EASEL will deliver and validate a unique, beyond-state-of-the-art social-robot-based tutoring system that comprises:

  • A model of the user aimed at assessing interaction strategies during play and during learning; a model able to capture and synthesize long-term interaction histories;
  • A system agent able to establish a symbiotic interaction on the basis of personal attitudes and behavior (non-rational decision making) and the acquisition of social affordances;
  • A robot able to act as a facilitator, establishing symbiotic interaction by involving the learner in teaching games;
  • Multi-parametric analysis of subject behavior through gesture, facial-expression, and social-signal extraction and analysis, on the basis of the developed computational framework of social affordances;
  • Analysis of psycho-physiological signals to assess the user's affective state, used both for the development of the user model and for the evaluation of the platform.

Despite advances in technology-enhanced teaching systems, current ICT applications for teaching and tutoring lack adaptation and personalization abilities and deal only with limited problem spaces. To directly address this challenge, we will develop and deploy an RBTS that incorporates key features of human tutors, but also includes proven tutoring features that are not easily or consistently applied by human tutors.