Abstract

The fast technological evolution and dissemination of multimodal sensors and compliant actuators bring a new human-centric perspective to robotics. The variety of human-robot interactions that stems from these new capabilities unveils compelling challenges for machine learning. An attractive approach to the problem of transferring skills to robots is to take inspiration from the way humans learn by imitation, adaptation and self-refinement. Such learning strategies require various types of interaction with the end-users and with the robot's environment. The overall skill acquisition process can hardly be segmented or sequenced in a specific way in advance. This indicates the importance of finding a representation of skills that can be shared by different learning strategies and that can accommodate multimodal continuous data streams for both analysis and synthesis purposes. The aim is to provide robots with a representation of rich motor skills able to handle recognition, prediction, synthesis and refinement in a continuous and synergistic way. The representation must also be robust to various sources of perturbation, persistently arising from the environment, from the user, and from the robot. I will present an approach exploiting the variability of multiple demonstrations and the co-variability of sensorimotor signals to extract the important characteristics of a task or skill. This information is used within an optimal control strategy to provide the robot with a minimal intervention controller regulating the stiffness and damping characteristics of the robot's actions according to the estimated precision and coordination requirements. Examples of applications with a compliant humanoid, a continuum flexible surgical robot and a set of gravity-compensated manipulators will be showcased.
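The minimal intervention idea described above can be illustrated with a small sketch: where multiple demonstrations agree (low variability), the estimated precision is high and the controller tracks stiffly; where they vary, the robot stays compliant. The synthetic demonstrations, the gain scale `alpha` and the stiffness cap `k_max` below are illustrative assumptions, not the actual formulation used in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D demonstrations of a point-to-point motion: consistent at
# the start and goal (precision matters there), variable in the middle.
T, n_demos = 50, 8
t = np.linspace(0.0, 1.0, T)
mean_traj = np.sin(np.pi * t)                  # nominal path
noise_scale = 0.02 + 0.3 * np.sin(np.pi * t)   # variability profile
demos = mean_traj + noise_scale * rng.standard_normal((n_demos, T))

# Extract task characteristics from the demonstrations:
# per-timestep mean and (regularized) variance across the set.
mu = demos.mean(axis=0)        # (T,) reference trajectory
var = demos.var(axis=0) + 1e-6 # (T,) estimated variability

# Minimal intervention sketch: stiffness proportional to the estimated
# precision (inverse variance), saturated at a maximum gain.
alpha, k_max = 1.0, 400.0
K = np.minimum(k_max, alpha / var)

# High gains at the precise start/end, low (compliant) gains mid-motion.
print(K[0], K[T // 2], K[-1])
```

In a full implementation the scalar variance would be replaced by covariance matrices over the sensorimotor signals, so that the resulting gain matrices also encode the coordination (co-variability) requirements, not just per-dimension precision.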

Bibtex reference

@inproceedings{Calinon14URAI,
author="Calinon, S.",
title="Skills Learning in Robots by Interaction with Users and Environment",
booktitle="Proc. Intl Conf. on Ubiquitous Robots and Ambient Intelligence ({URAI})",
year="2014",
month="November",
}