The teleoperation of nonhumanoid robots is often a demanding task, as most current control interfaces rely on mappings between the operator’s and the robot’s actions, which are determined by the design and characteristics of the interface, and may therefore be challenging to master. Here, we describe a structured methodology to identify common patterns in spontaneous interaction behaviors, to implement embodied user interfaces, and to select the appropriate sensor type and positioning. Using this method, we developed an intuitive, gesture-based control interface for real and simulated drones, which outperformed a standard joystick in terms of learning time and steering abilities. Implementing this procedure to identify body-machine patterns for specific applications could support the development of more intuitive and effective interfaces.
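As a rough illustration of the pattern-identification step described above, the sketch below applies PCA to recorded body-motion samples to extract the dominant movement components and maps their amplitudes to two drone commands. This is a minimal sketch under assumed conditions: the simulated data, feature dimensionality, gains, and two-command mapping are hypothetical and not taken from the paper.

```python
# Hypothetical sketch: find dominant movement patterns in body-motion data
# with PCA, then map the leading components to drone roll/pitch commands.
# Data and mapping are illustrative, not the authors' implementation.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Simulated motion-capture data: n_samples x n_features (e.g., torso/arm
# marker coordinates recorded while participants mimic flying a drone).
n_samples, n_features = 2000, 30
motion = rng.normal(size=(n_samples, n_features))

# Identify the low-dimensional movement patterns that explain most variance.
pca = PCA(n_components=2)
scores = pca.fit_transform(motion)  # per-sample amplitudes of each pattern
print("explained variance ratio:", pca.explained_variance_ratio_)

def body_to_drone_command(sample, pca, gain=(1.0, 1.0)):
    """Map one posture sample to (roll, pitch) commands via the learned patterns."""
    amp = pca.transform(sample.reshape(1, -1))[0]
    roll_cmd = float(np.clip(gain[0] * amp[0], -1.0, 1.0))
    pitch_cmd = float(np.clip(gain[1] * amp[1], -1.0, 1.0))
    return roll_cmd, pitch_cmd

print(body_to_drone_command(motion[0], pca))
```

In practice, the principal components would be computed from spontaneous imitation behaviors of many participants, and sensor type and placement would be chosen to capture the body segments that dominate those components.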
Reference
- Published in: Proceedings of the National Academy of Sciences of the United States of America, 2018
- DOI: 10.1073/pnas.1718648115