Human Machine Interfaces
Between 2014 and 2018, NCCR Robotics researchers worked on perfecting brain-computer interfaces (BCIs) that can be used to control soft exoskeletons, such as the one developed for restoring hand functions.
In particular, José del R. Millán’s group at EPFL studied the right mix of information and electrical stimulation to help patients learn to control an external robotic device.
Two end-user participants for the Cybathlon BCI race demonstrated that mutual learning (where the subject and the pattern recognition system learn from each other) is a critical factor for successful BCI translational applications.
Another goal was to create brain-computer interfaces that combine conventional feedback with wearable vibrotactile stimulation delivered by novel soft pneumatic actuators (SPAs). Most BCIs rely only on visual or acoustic information, but multi-sensory feedback is vital for more efficient BCIs.
NCCR Robotics researchers developed a wearable control interface for controlling drones with the body.
Drones were chosen because they have recently experienced the fastest user growth and because of their remarkable capability to extend human perception and range of action. Piloting drones with current interfaces, such as joysticks and remote controllers, requires extensive training and constant cognitive effort during flight. Here our researchers aimed to achieve a bidirectional link between the physical bodies and control systems of the robot and the human, with the hypothesis that this would not only allow more intuitive control of drones, even for novices, but also provide users with immersive sensations of flight that current interfaces do not offer.
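To make the idea of body-based control concrete, the following is a minimal sketch of how torso lean angles (as might be measured by inertial sensors in a wearable garment) could be mapped to normalized drone roll and pitch commands. All names, thresholds, and the linear mapping are illustrative assumptions, not the actual FlyJacket control law.

```python
# Hypothetical sketch of posture-to-drone-command mapping.
# The dead zone and scaling values below are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class BodyPose:
    torso_roll_deg: float   # lean left (-) / right (+)
    torso_pitch_deg: float  # lean back (-) / forward (+)


def clamp(x: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, x))


def pose_to_command(pose: BodyPose,
                    dead_zone_deg: float = 5.0,
                    max_angle_deg: float = 30.0) -> tuple[float, float]:
    """Map torso lean to normalized (roll, pitch) commands in [-1, 1].

    A small dead zone around the upright posture ignores involuntary
    body sway; beyond it, the command grows linearly until saturating
    at max_angle_deg.
    """
    def axis(angle_deg: float) -> float:
        if abs(angle_deg) < dead_zone_deg:
            return 0.0
        sign = 1.0 if angle_deg > 0 else -1.0
        span = max_angle_deg - dead_zone_deg
        return sign * clamp((abs(angle_deg) - dead_zone_deg) / span, 0.0, 1.0)

    return axis(pose.torso_roll_deg), axis(pose.torso_pitch_deg)
```

A dead zone plus saturation is a common pattern in gesture interfaces: it keeps the drone steady while the user is nominally upright and bounds the command magnitude regardless of how far the user leans.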
The result is the FlyJacket, a sensorized exoskeleton with artificial intelligence software for immersive flight of simulated and real drones. This patent-pending technology can be used for immersive video games or intuitive control of personal and professional drones.