
Contact-based navigation for an autonomous flying robot

  • Authors: Briod, Adrien; Kornatowski, Przemyslaw Mariusz; Klaptocz, Adam; Garnier, Arnaud; Pagnamenta, Marco; Zufferey, Jean-Christophe; Floreano, Dario

Autonomous navigation in obstacle-dense indoor environments is very challenging for flying robots due to the high risk of collisions, which may lead to mechanical damage of the platform and eventual failure of the mission. While conventional approaches in autonomous navigation favor obstacle avoidance strategies, recent work showed that collision-robust flying robots could hit obstacles without breaking and even self-recover after a crash to the ground. This approach is particularly interesting for autonomous navigation in complex environments where collisions are unavoidable, or for reducing the sensing and control complexity involved in obstacle avoidance. This paper aims to show that collision-robust platforms can go a step further and exploit contacts with the environment to achieve useful navigation tasks based on the sense of touch. This approach is typically useful when weight restrictions prevent the use of heavier sensors, or as a low-level detection mechanism supplementing other sensing modalities. In this paper, we present a solution based on force and inertial sensors that detects obstacles all around the robot. Eight miniature force sensors, weighing 0.9 g each, are integrated into the structure of a collision-robust flying platform without affecting its robustness. A proof-of-concept experiment demonstrates the use of contact sensing for autonomously exploring a room in 3D, showing significant advantages compared to a previous strategy. To our knowledge, this is the first fully autonomous flying robot to use touch sensors as its only exteroceptive sensors.
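
The abstract does not spell out the control logic, but the idea of navigating from contact cues alone can be illustrated with a minimal bump-and-reorient loop. The sensor layout, threshold, and reorientation rule below are assumptions made for the sketch, not the controller described in the paper.

```python
# Illustrative sketch only: a minimal contact-driven exploration step for a
# collision-robust flying robot. Sensor layout, threshold and the reorientation
# rule are assumptions, not the method from the paper.
import math
import random

NUM_SENSORS = 8          # force sensors spread around the protective structure
CONTACT_THRESHOLD = 0.2  # normalized force above which a contact is declared


def contact_direction(forces):
    """Return the approximate direction (rad) of the strongest contact, or None."""
    idx = max(range(NUM_SENSORS), key=lambda i: forces[i])
    if forces[idx] < CONTACT_THRESHOLD:
        return None
    return 2.0 * math.pi * idx / NUM_SENSORS  # sensor i points at angle 2*pi*i/N


def explore_step(forces, heading):
    """One control step: keep the heading until a contact, then steer away from it."""
    contact = contact_direction(forces)
    if contact is None:
        return heading  # nothing felt, keep going straight
    # Turn roughly away from the contact, with some randomness to cover the room.
    return contact + math.pi + random.uniform(-0.5, 0.5)


if __name__ == "__main__":
    fake_forces = [0.0] * NUM_SENSORS
    fake_forces[3] = 0.6  # pretend the sensor at 135 degrees is pressed
    print("new heading (rad):", explore_step(fake_forces, heading=0.0))
```

The point of the sketch is that every decision is triggered by a measured contact rather than by a range sensor, which is what makes the approach attractive when weight budgets rule out heavier perception hardware.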

Posted on: July 1, 2013

Fuzzy Control System for Autonomous Navigation and Parking of Thymio II Mobile Robots

  • Authors: Boufera, Fatma; Debbat, Fatima; Mondada, Francesco; Khelfi, M. Fayçal

This paper proposes a fuzzy controller for the autonomous navigation of robotic systems in a dynamic and uncertain environment. In particular, we are interested in determining the robot motion needed to reach a target while ensuring the robot's own safety and that of the agents that surround it. To achieve these goals, we adopt a fuzzy controller for navigation and obstacle avoidance, taking into account the changing nature of the environment. The approach has been tested and validated on a set of Thymio II robots. As an application field, we have chosen a parking problem.
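
As a rough illustration of the kind of rule-based reasoning a fuzzy navigation controller performs, the toy snippet below evaluates two speed rules from an obstacle-distance input and defuzzifies the result. The membership functions, rule base, and output speeds are invented for the example and are not taken from the paper.

```python
# Toy Mamdani-style fuzzy speed rule, invented for illustration: membership
# shapes, rules and output speeds are assumptions, not the paper's controller.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)


def fuzzy_speed(obstacle_dist_m):
    """Two rules: IF distance is NEAR THEN speed SLOW; IF distance is FAR THEN speed FAST."""
    near = tri(obstacle_dist_m, -0.5, 0.0, 1.0)
    far = tri(obstacle_dist_m, 0.5, 2.0, 10.0)
    slow, fast = 0.05, 0.20  # representative output speeds in m/s (assumed)
    weight = near + far
    if weight == 0.0:
        return fast  # nothing sensed nearby, cruise
    return (near * slow + far * fast) / weight  # weighted-average defuzzification


if __name__ == "__main__":
    for d in (0.2, 1.0, 3.0):
        print(f"distance {d:.1f} m -> speed {fuzzy_speed(d):.3f} m/s")
```

The appeal of this formulation for a cluttered, changing environment is that the rules degrade gracefully: intermediate distances activate both rules partially, so the commanded speed varies smoothly instead of switching abruptly.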

Posted on: July 26, 2014

How fast is too fast? The role of perception latency in high-speed sense and avoid.

Authors: Davide Falanga, Suseong Kim, & Davide Scaramuzza


Abstract

In this letter, we study the effects that perception latency has on the maximum speed a robot can reach to safely navigate through an unknown cluttered environment. We provide a general analysis that can serve as a baseline for future quantitative reasoning for design trade-offs in autonomous robot navigation. We consider the case where the robot is modeled as a linear second-order system with bounded input and navigates through static obstacles. Also, we focus on a scenario where the robot wants to reach a target destination in as little time as possible, and therefore cannot change its longitudinal velocity to avoid obstacles. We show how the maximum latency that the robot can tolerate to guarantee safety is related to the desired speed, the range of its sensing pipeline, and the actuation limitations of the platform (i.e., the maximum acceleration it can produce). As a particular case study, we compare monocular and stereo frame-based cameras against novel, low-latency sensors, such as event cameras, in the case of quadrotor flight. To validate our analysis, we conduct experiments on a quadrotor platform equipped with an event camera to detect and avoid obstacles thrown towards the robot. To the best of our knowledge, this is the first theoretical work in which perception and actuation limitations are jointly considered to study the performance of a robotic platform in high-speed navigation.
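
The letter couples latency, sensing range, and actuation limits. A much simplified sketch of that trade-off, for the easier "brake to a stop" case rather than the paper's constant-speed lateral-avoidance model, could look like the following; all numbers are hypothetical.

```python
# Back-of-the-envelope sketch of the latency/speed/sensing-range trade-off for
# the simple "brake to a stop" case. The paper analyzes the harder case of
# lateral avoidance at constant forward speed; the numbers below are assumptions.

def max_tolerable_latency(speed, sensing_range, max_accel):
    """Largest perception latency (s) such that the robot can still stop in time.

    During the latency the robot covers speed * tau, then needs
    speed**2 / (2 * max_accel) of braking distance, all within sensing_range.
    """
    braking = speed ** 2 / (2.0 * max_accel)
    if braking >= sensing_range:
        return 0.0  # already too fast: cannot stop even with zero latency
    return (sensing_range - braking) / speed


if __name__ == "__main__":
    # Hypothetical quadrotor: 10 m sensing range, 10 m/s^2 of usable deceleration.
    for v in (2.0, 5.0, 10.0, 13.0):
        tau = max_tolerable_latency(v, sensing_range=10.0, max_accel=10.0)
        print(f"speed {v:>5.1f} m/s -> max latency {tau * 1000:7.1f} ms")
```

Even this crude model shows the qualitative message of the paper: the tolerable latency shrinks quickly as speed grows, which is why low-latency sensors such as event cameras become attractive for high-speed flight.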

Reference

  • Published in: IEEE Robotics and Automation Letters
  • DOI: 10.1109/LRA.2019.2898117
  • Read paper
  • Date: 2019
Posted on: December 17, 2020

Voxgraph: Globally Consistent, Volumetric Mapping using Signed Distance Function Submaps

Authors: Victor Reijgwart*, Alexander Millane*, Helen Oleynikova, Roland Siegwart, Cesar Cadena, Juan Nieto


Abstract

Globally consistent dense maps are a key requirement for long-term robot navigation in complex environments. While previous works have addressed the challenges of dense mapping and global consistency, most require more computational resources than may be available on-board small robots. We propose a framework that creates globally consistent volumetric maps on a CPU and is lightweight enough to run on computationally constrained platforms. Our approach represents the environment as a collection of overlapping Signed Distance Function (SDF) submaps, and maintains global consistency by computing an optimal alignment of the submap collection. By exploiting the underlying SDF representation, we generate correspondence-free constraints between submap pairs that are computationally efficient enough to optimize the global problem each time a new submap is added. We deploy the proposed system on a hexacopter MAV with an Intel i7-8650U CPU in two realistic scenarios: mapping a large-scale area using a 3D LiDAR, and mapping an industrial space using an RGB-D camera. In the large-scale outdoor experiments, the system optimizes a 120x80m map in less than 4s and produces absolute trajectory RMSE of less than 1m over 400m trajectories. Our complete system, called voxgraph, is available as open source (https://github.com/ethz-asl/voxgraph).
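
To give a feel for what a correspondence-free constraint between two SDF submaps looks like, here is a toy 2D sketch: two analytic SDFs of the same obstacle, expressed in offset frames, are compared at random sample points, and a coarse grid search recovers the offset. The analytic SDFs, the sampling, and the translation-only search are simplifications for illustration; voxgraph itself optimizes full submap poses within a pose graph.

```python
# Toy 2D illustration of correspondence-free SDF-to-SDF alignment, in the spirit
# of voxgraph's submap constraints. The analytic SDFs, sample points and the
# translation-only grid search are assumptions made for this sketch.
import numpy as np


def sdf_circle(points, center, radius):
    """Signed distance of 2D points to a circular obstacle (negative inside)."""
    return np.linalg.norm(points - center, axis=1) - radius


def alignment_error(translation, points, sdf_a, sdf_b):
    """Sum of squared SDF differences; no point-to-point correspondences needed."""
    residuals = sdf_a(points) - sdf_b(points + translation)
    return float(np.sum(residuals ** 2))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = rng.uniform(-5.0, 5.0, size=(500, 2))  # points in submap A's frame

    # The same obstacle sits at the origin in submap A but at (0.7, -0.3) in submap B.
    true_offset = np.array([0.7, -0.3])
    sdf_a = lambda p: sdf_circle(p, center=np.zeros(2), radius=1.5)
    sdf_b = lambda p: sdf_circle(p, center=true_offset, radius=1.5)

    # Coarse grid search over candidate translations (a stand-in for a real solver).
    grid = np.linspace(-1.0, 1.0, 41)
    best = min(((alignment_error(np.array([tx, ty]), samples, sdf_a, sdf_b), tx, ty)
                for tx in grid for ty in grid))
    print(f"estimated offset: ({best[1]:.2f}, {best[2]:.2f}), "
          f"true offset: ({true_offset[0]:.2f}, {true_offset[1]:.2f})")
```

Because the residual is evaluated directly on the distance fields, no feature matching between submaps is required, which is what keeps the per-constraint cost low enough to re-optimize the whole collection whenever a new submap is added.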

Reference

  • Published in: IEEE Robotics and Automation Letters (Volume: 5, Issue: 1, Jan. 2020)
  • DOI: 10.1109/LRA.2019.2953859
  • Read paper
  • Date: 2019
Posted on: November 27, 2020