8 Oct – 9 Oct 2018
Aerial Futures: The Drone Frontier @ HUBweek
Boston District Hall, Boston
Swissnex Boston is gathering some of the most exciting drone exhibitors from Switzerland and the United States to bring to HUBweek. Expect an eclectic selection of UAVs...
Research over the past decade has relied on increasingly complex methods and heavy platforms to achieve autonomous flight in cluttered environments. However, efficient behaviors can be found in nature where limited sensing suffices, such as in insects progressing toward a light at night. Interestingly, their success rests on their ability to recover from the numerous collisions that occur along their imperfect flight path. The goal of the AirBurr project is to take inspiration from these insects and develop a new class of flying robots that can recover from collisions and even exploit them. Such robots are designed to be robust to crashes and to take off again without human intervention. They navigate reactively and, unlike conventional approaches, require no heavy modelling to fly autonomously. We believe that this new paradigm will bring flying robots out of the laboratory and allow them to tackle unstructured, cluttered environments. This paper presents the vision of the AirBurr project, as well as the latest results in the design of a platform capable of sustaining collisions and self-recovering after crashes.
Robotic technologies, whether they are remotely operated vehicles, autonomous agents, assistive devices, or novel control interfaces, offer many promising capabilities for deployment in real-world environments. Post-disaster scenarios are a particularly relevant target for applying such technologies, due to the challenging conditions faced by rescue workers and the possibility to increase their efficacy while decreasing the …
Vision Tape is a novel class of flexible, compound-eye-like linear vision sensors dedicated to motion extraction and proximity estimation. The sensor's intrinsic mechanical flexibility allows its shape to adapt over a wide range, providing an adjustable field of view as well as integration with numerous substrates and curvatures. Vision Tape extracts the optic flow of the visual scene to compute a motion vector, from which proximity is estimated using the motion-parallax principle.
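The motion-parallax relationship behind this proximity estimate can be sketched as follows. This is a minimal illustrative model, not the published Vision Tape implementation: it assumes pure translation and uses the standard parallax relation omega = v·sin(theta)/D, where omega is the measured optic flow, v the sensor's speed, theta the viewing angle relative to the motion direction, and D the distance to the surface.

```python
import math

def proximity_from_optic_flow(flow_rad_s, speed_m_s, eccentricity_rad):
    """Estimate distance D to a surface from measured optic flow.

    Illustrative motion-parallax model (an assumption, not the paper's
    method): a point seen at angle `eccentricity_rad` from the direction
    of translation produces optic flow
        omega = v * sin(theta) / D,
    so the distance is D = v * sin(theta) / omega.
    """
    if flow_rad_s <= 0:
        # No measurable flow carries no parallax information.
        return float("inf")
    return speed_m_s * math.sin(eccentricity_rad) / flow_rad_s

# Example: moving at 0.5 m/s, a point at 90 degrees to the side
# producing 1 rad/s of optic flow lies 0.5 m away.
distance = proximity_from_optic_flow(1.0, 0.5, math.pi / 2)
```

Note that flow measured directly ahead (theta near zero) carries almost no parallax, which is why lateral viewing directions are the most informative for this kind of proximity sensing.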
Presented at: 18th International Conference on Control, Automation, and Systems, PyeongChang, GangWon Province, Korea, October 17-20, 2018
This paper addresses the trajectory tracking problem of a 2D caged flying robot in contact with a wall. To simplify the contact problem, the models are constructed on a vertical two-dimensional plane, and our objective is to …