- Published in: IEEE Transactions on Haptics (ISSN: 1939-1412), vol. 7, no. 3, pp. 367-380
- Los Alamitos: IEEE Computer Society, 2014
Robot Learning from Demonstration (RLfD) has been identified as a key element for making robots useful in daily life. A wide range of techniques has been proposed for deriving a task model from a set of demonstrations of the task. Most previous works use learning to model the kinematics of the task, and for autonomous execution the robot then relies on a stiff position controller. While many tasks can and have been learned this way, there are tasks in which controlling position alone is insufficient to achieve the goals of the task. These are typically tasks that involve contact or require a specific response to physical perturbations. The question of how to adjust the compliance to suit the needs of the task has not yet been fully treated in Robot Learning from Demonstration. In this paper, we address this issue and present interfaces that allow a human teacher to indicate compliance variations by physically interacting with the robot during task execution. We validate our approach in two different experiments on the 7-DoF Barrett WAM and KUKA LWR robot manipulators. Furthermore, we conduct a user study to evaluate the usability of our approach from a non-roboticist's perspective.
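The record does not include the paper's controller details. As an illustrative sketch only, the kind of variable-compliance execution the abstract refers to can be pictured as a joint-space impedance law of the generic form tau = K(t)(q_d - q) - D(t)*dq, where the time-varying stiffness K(t) is a hypothetical placeholder for the compliance profile a teacher would indicate through physical interaction; the function and gain values below are assumptions for illustration, not the authors' method.

```python
import numpy as np

def impedance_torque(q, dq, q_des, stiffness, damping_ratio=1.0):
    """Sketch of a per-joint variable-stiffness impedance command.

    q, dq, q_des : joint positions, velocities, and desired (demonstrated) positions.
    stiffness    : per-joint stiffness gains; lower values yield a more compliant joint.
    """
    K = np.asarray(stiffness, dtype=float)
    # Damping chosen relative to stiffness (a common critically-damped heuristic).
    D = 2.0 * damping_ratio * np.sqrt(K)
    return K * (np.asarray(q_des) - np.asarray(q)) - D * np.asarray(dq)

# Example: a 7-DoF arm tracking a demonstrated posture while kept stiff in the
# proximal joints and compliant in the wrist (gain values are illustrative).
q, dq = np.zeros(7), np.zeros(7)
q_des = np.full(7, 0.1)
tau = impedance_torque(q, dq, q_des,
                       stiffness=[200, 200, 150, 150, 20, 20, 10])
```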
Reference
- Detailed record: https://infoscience.epfl.ch/record/205639?ln=en
- EPFL-ARTICLE-205639
- doi:10.1109/TOH.2013.54