- Presented at: the ICRA Workshop on Perception, Inference, and Learning for Joint Semantic, Geometric, and Physical Understanding, Brisbane, Australia
Lightweight, autonomous drones are expected to soon be used in a wide variety of tasks, such as aerial surveillance, delivery, and monitoring of existing infrastructure. Although a large body of literature on robotic perception and control exists, and the resulting methods are mature, they are not robust, which hinders drones' deployment in natural environments such as a city or a forest. Indeed, in unstructured and dynamic scenarios, drones face numerous challenges to navigating autonomously in a feasible and safe way. A recent line of research exploits the "perception-awareness" of deep learning techniques to unlock autonomous flight in uncontrolled environments. This direction is motivated by the insight that traditional methods, which rely on global state estimates in the form of robot poses, are doomed to fail because of the inherent difficulty of pose estimation at high speed and their inability to adequately cope with dynamic environments. In this paper, we survey existing learning-based methods for drone navigation and identify open areas of research for future work.
Reference
- Detailed record: infoscience.epfl.ch/record/256656
- Date: 2018