8 Oct – 9 Oct 2018
Aerial Futures: The Drone Frontier @ HUBweek
Boston District Hall, Boston
Swissnex Boston is gathering a selection of the most exciting drone exhibitors from Switzerland and the United States to bring to HUBweek. Expect an eclectic selection of UAVs...
This paper addresses the problem of adequately protecting flying robots from damage resulting from collisions that may occur when exploring constrained and cluttered environments. A method for designing protective structures to meet the specific constraints of flying systems is presented and applied to the protection of a small coaxial hovering platform. Protective structures in the form of Euler springs in a tetrahedral configuration are designed and optimized to elastically absorb the energy of an impact while simultaneously minimizing the forces acting on the robot's stiff inner frame. These protective structures are integrated into a 282 g hovering platform and shown to consistently withstand dozens of collisions undamaged.
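The trade-off in this abstract can be made concrete with a back-of-envelope sizing calculation: the impact energy must be absorbed over the available elastic deflection, and the absorber's force profile sets the peak load transmitted to the inner frame. The sketch below uses the 282 g mass from the abstract, but the impact speed and deflection stroke are assumed illustrative values, not the paper's parameters.

```python
# Back-of-envelope sizing for an elastic impact absorber.
# Mass (282 g) is from the abstract; impact speed and stroke are assumptions.

def impact_energy(mass_kg, speed_m_s):
    """Kinetic energy to be absorbed elastically on impact (J)."""
    return 0.5 * mass_kg * speed_m_s ** 2

def peak_force_linear_spring(energy_j, stroke_m):
    """Peak force if the energy is absorbed by an ideal linear spring
    (force ramps from zero, so the peak is twice the mean: F = 2E/d)."""
    return 2.0 * energy_j / stroke_m

def peak_force_constant_force(energy_j, stroke_m):
    """Peak force for an ideal constant-force absorber (F = E/d). A buckling
    'Euler spring' approximates this, staying near its critical load while
    deflecting, which halves the peak load for the same stroke."""
    return energy_j / stroke_m

mass = 0.282     # kg, platform mass from the abstract
speed = 2.0      # m/s, assumed impact speed
stroke = 0.02    # m, assumed available elastic deflection

e = impact_energy(mass, speed)
print(f"impact energy: {e:.3f} J")
print(f"linear-spring peak force:   {peak_force_linear_spring(e, stroke):.1f} N")
print(f"constant-force peak force:  {peak_force_constant_force(e, stroke):.1f} N")
```

This illustrates why a near-constant-force element is attractive for protecting a stiff inner frame: for the same absorbed energy and stroke, it roughly halves the peak force of an equivalent linear spring.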
Current drones are developed with a fixed morphology that can limit their versatility and mission capabilities. There is biological evidence that adaptive morphological changes can not only extend dynamic performance, but also provide new functionalities. In this paper, we present different drones from our recent developments where folding is used as a means of morphological adaptation. First, we show how foldable wings can enable the transition between aerial and ground locomotion, or flight in different aerodynamic conditions, advancing the development of multi-modal drones with an extended mission envelope. Secondly, we show how foldable structures allow drones to be transported easily without sacrificing payload or flight endurance. Thirdly, we present a foldable frame that enables drones to withstand collisions. However, the real potential of foldable drones is often limited by the use of conventional design strategies and rigid materials, which motivates the use of smart, functional materials. Lastly, we describe a dielectric-elastomer-based foldable actuator and a variable-stiffness fiber using a low-melting-point alloy for drones. The foldable actuator acts as an active compliant joint with folding functionality and mechanical robustness in drones, thanks to the compliance of dielectric elastomers, a class of smart materials. We also show re-configuration of a drone enabled by the variable-stiffness fiber, which can transition between rigid and soft states.
The ease of use and versatility of drones have contributed to their deployment in several fields, from entertainment to search and rescue. However, drones remain vulnerable to collisions due to pilot mistakes or various system failures. This paper presents a bioinspired strategy for the design of quadcopters resilient to collisions. Abstracting the biomechanical strategy of collision-resilient insect wings, the quadcopter has a dual-stiffness frame that rigidly withstands aerodynamic loads within the flight envelope, but can soften and fold during a collision to avoid damage. The dual-stiffness frame works in synergy with specific energy-absorbing materials that protect the sensitive components of the drone hosted in the central case. The proposed approach is compared to other state-of-the-art collision-tolerance strategies and is validated in a 50 g quadcopter that can withstand high-speed collisions.
For micro aerial vehicles (MAVs) involved in search and rescue missions, the ability to locate the source of a distress sound signal is critically important, allowing fast localization of victims and rescuers during nighttime, through foliage, and in dust, fog, and smoke. Most emergency sound sources, such as safety whistles and personal alarms, generate a narrowband signal that is difficult to localize by human listeners or with the common localization methods suitable for broadband sounds. In this paper, we present three methods for an MAV-based emergency sound localization system. The first method involves designing a new emergency source for immediate localization by the MAV using a common localization method. The other two novel methods allow localizing the currently available emergency sources, or other narrowband sounds in general, that are difficult to localize due to the periodicity in the sequence of sound samples. The second method exploits the Doppler shift in the sound frequency caused by the motion of the MAV, together with the dynamics of the MAV, to assist with the localization. The third method involves active control of the robot's attitude and fusing acoustic and attitude measurements to achieve accurate and robust estimates. We evaluate our methods in real-world experiments with real flying robots.
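The Doppler-based idea in the second method can be sketched with the basic moving-observer relation: a stationary narrowband source of frequency f_src is heard by the moving MAV at f_obs = f_src (c + v_r)/c, where v_r is the MAV's velocity component toward the source. Measuring f_obs therefore constrains the source direction relative to the MAV's known velocity. This is only an illustration of the underlying physics, not the paper's algorithm; the whistle frequency below is an assumed value.

```python
# Doppler relation for a moving observer and a stationary narrowband source.
# f_src is assumed known (e.g. a safety whistle's nominal tone).

C_SOUND = 343.0  # speed of sound in air, m/s (at roughly 20 degrees C)

def observed_frequency(f_src, v_radial, c=C_SOUND):
    """Frequency heard by an observer moving with radial speed v_radial
    toward a stationary source (positive v_radial = approaching)."""
    return f_src * (c + v_radial) / c

def radial_speed_from_shift(f_src, f_obs, c=C_SOUND):
    """Invert the Doppler relation to recover the radial speed component."""
    return c * (f_obs / f_src - 1.0)

f_src = 3150.0                            # Hz, assumed whistle frequency
f_obs = observed_frequency(f_src, 5.0)    # MAV approaching at 5 m/s
print(f"observed frequency: {f_obs:.1f} Hz")
print(f"recovered radial speed: {radial_speed_from_shift(f_src, f_obs):.2f} m/s")
```

Because the recovered quantity is only the radial component of the MAV's own (known) velocity, combining measurements along different flight directions is what makes the source bearing observable.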
A new method for the estimation of ego-motion (the direction and amplitude of the velocity) of a mobile device comprising optic-flow and inertial sensors (hereinafter the apparatus). The velocity is expressed in the apparatus's reference frame, which moves with the apparatus. The method relies on short-term inertial navigation and on the direction of the translational optic-flow in order to estimate ego-motion, defined as the velocity estimate (describing the speed amplitude and the direction of motion). A key characteristic of the invention is the use of optic-flow without the need for any kind of feature tracking. Moreover, the algorithm uses only the direction of the optic-flow and does not need its amplitude, thanks to the fact that the scale of the velocity is resolved by the use of inertial navigation and changes in direction of the apparatus.
In most animal species, vision is mediated by compound eyes, which offer lower resolution than vertebrate single-lens eyes, but significantly larger fields of view with negligible distortion and spherical aberration, and high temporal resolution in a tiny package. Compound eyes are ideally suited for fast panoramic motion perception. Engineering a miniature artificial compound eye is challenging, because it requires accurate alignment of the photoreceptive and optical components on a curved surface. Here we describe a novel design method for biomimetic compound eyes featuring a panoramic, undistorted field of view in a very thin package. The design consists of three planar layers of separately produced arrays, namely, a microlens array, a neuromorphic photodetector array and a flexible printed circuit board, that are stacked, cut and curved to produce a mechanically flexible imager. Following this method, we have prototyped and characterized an artificial compound eye bearing a hemispherical field of view with embedded and programmable low-power signal processing, high temporal resolution, and local adaptation to illumination. The prototyped artificial compound eye possesses several characteristics similar to the eye of the fruit fly Drosophila and other arthropod species. This design method opens up new vistas for a broad range of applications where wide field motion detection is at a premium, such as collision-free navigation of terrestrial and aerospace vehicles, and for the experimental testing of insect vision theories.
We aim at developing autonomous miniature hovering flying robots capable of navigating in unstructured GPS-denied environments. A major challenge is the miniaturization of the embedded sensors and processors that allow such platforms to fly autonomously. In this paper, we propose a novel ego-motion estimation algorithm for hovering robots equipped with inertial and optic-flow sensors that runs in real time on a microcontroller. Unlike many vision-based methods, this algorithm does not rely on feature tracking, structure estimation, additional distance sensors, or assumptions about the environment. Key to this method is the introduction of the translational optic-flow direction constraint (TOFDC), which does not use the optic-flow scale, but only its direction, to correct for inertial sensor drift during changes of direction. This solution requires much simpler electronics and sensors and works in environments of any geometry. We demonstrate the implementation of this algorithm on a miniature 46 g quadrotor for closed-loop position control.
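The geometric insight behind the direction constraint can be sketched in a few lines: for a purely translating camera with velocity v, the translational optic flow seen by a sensor looking along unit direction d lies along the projection of -v onto the plane perpendicular to d. Scene distance scales only the flow's magnitude, never its direction, which is why the direction alone is a distance-independent measurement. The snippet below illustrates this constraint; it is not the paper's estimation filter.

```python
import numpy as np

def translational_flow_direction(v, d):
    """Unit direction of the translational optic flow seen along viewing
    direction d (unit vector) for a camera translating with velocity v.
    The flow is -v projected onto the plane perpendicular to d."""
    d = d / np.linalg.norm(d)
    flow = -(v - np.dot(v, d) * d)   # remove the component of -v along d
    n = np.linalg.norm(flow)
    return flow / n if n > 0 else flow

v = np.array([1.0, 0.0, 0.0])        # camera moving along +x
d = np.array([0.0, 0.0, 1.0])        # sensor looking along +z
print(translational_flow_direction(v, d))        # flow direction: along -x
print(translational_flow_direction(5.0 * v, d))  # same direction at 5x speed
```

The second print shows the scale invariance the abstract relies on: scaling the velocity (or equivalently the scene distance) leaves the flow direction unchanged, so the direction can correct inertial drift without knowing distances.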
Recent work suggests that jumping locomotion in combination with a gliding phase can be used as an effective mobility principle in robotics. Compared to pure jumping without a gliding phase, the potential benefits of hybrid jump-gliding locomotion include the ability to extend the distance travelled and to reduce the potentially damaging impact forces upon landing. This publication evaluates the performance of jump-gliding locomotion and provides models for the analysis of the relevant flight dynamics. It also defines a jump-gliding envelope that encompasses the range that can be achieved with jump-gliding robots and that can be used to evaluate the performance and improvement potential of jump-gliding robots. We present first a planar dynamic model and then a simplified closed-form model, which allow for quantification of the distance travelled and the impact energy on landing. We validate the predictions of these models with experiments using a novel jump-gliding robot, named the 'EPFL jump-glider'. It has a mass of 16.5 g and is able to perform jumps from elevated positions, perform steered gliding flight, land safely, and traverse on the ground by repetitive jumping. The experiments indicate that the developed jump-gliding model agrees very well with the flight data measured with the EPFL jump-glider, confirming the benefits of jump-gliding locomotion for mobile robotics. The jump-gliding envelope considerations indicate that the EPFL jump-glider, when traversing from a 2 m height, reaches 74.3% of the optimal jump-gliding distance, compared to pure jumping without a gliding phase, which only reaches 33.4% of the optimal jump-gliding distance. Methods of further improving flight performance based on the models and on inspiration from biological systems are presented, providing mechanical design pathways for future jump-gliding robots.
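The two benefits the abstract names, longer range and gentler landings, fall out of even the simplest closed-form comparison: a steady glide from height h with glide ratio L/D covers d = (L/D) h, whereas a purely ballistic launch covers only the launch speed times the free-fall time and arrives with all the potential energy converted to impact speed. The sketch below uses the 2 m start height mentioned in the abstract, but the glide ratio and launch speed are assumed illustrative values, not the EPFL jump-glider's measured parameters.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def glide_distance(height_m, glide_ratio):
    """Horizontal distance for a steady glide from height h: d = (L/D) * h."""
    return glide_ratio * height_m

def ballistic_distance(height_m, launch_speed_m_s):
    """Horizontal distance for a horizontal ballistic launch from height h."""
    t_fall = math.sqrt(2.0 * height_m / G)
    return launch_speed_m_s * t_fall

def landing_speed_ballistic(height_m, launch_speed_m_s):
    """Impact speed for the ballistic case, by energy conservation."""
    return math.sqrt(launch_speed_m_s ** 2 + 2.0 * G * height_m)

h = 2.0    # m, elevated start height (as in the reported experiments)
v0 = 2.0   # m/s, assumed horizontal launch speed
print(f"ballistic: {ballistic_distance(h, v0):.2f} m, "
      f"landing at {landing_speed_ballistic(h, v0):.2f} m/s")
print(f"glide (L/D = 3): {glide_distance(h, 3.0):.2f} m, "
      f"landing near the glide speed")
```

Even a modest glide ratio multiplies the range from a given height and caps the landing speed near the equilibrium glide speed, rather than letting it grow with the fall height.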
We are witnessing the advent of a new era of robots — drones — that can autonomously fly in natural and man-made environments. These robots, often associated with defence applications, could have a major impact on civilian tasks, including transportation, communication, agriculture, disaster mitigation and environment preservation. Autonomous flight in confined spaces presents great scientific and technical challenges owing to the energetic cost of staying airborne and to the perceptual intelligence required to negotiate complex environments. We identify scientific and technological advances that are expected to translate, within appropriate regulatory frameworks, into pervasive use of autonomous drones for civilian applications.
This paper presents an affordable, fully automated, and accurate mapping solution based on ultra-light UAV imagery. Several datasets are analysed and their accuracy is estimated. We show that the accuracy depends strongly on the ground resolution (flying height) of the input imagery. When the flying height is chosen appropriately, this mapping solution can compete with traditional mapping solutions that capture fewer, higher-resolution images from airplanes and that rely on highly accurate orientation and positioning sensors on board. Due to careful integration with recent computer vision techniques, the post-processing is robust and fully automatic and can deal with inaccurate position and orientation information, which is typically problematic for traditional techniques.
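The link between flying height and ground resolution that the abstract points to is the standard ground sampling distance (GSD) relation for a nadir-pointing camera: one pixel covers (pixel pitch x height / focal length) on the ground. The camera parameters below are assumed values typical of a small-UAV camera, not taken from the paper's datasets.

```python
# Ground sampling distance (GSD) versus flying height for a nadir camera.
# focal_mm and pixel_um are assumed illustrative camera parameters.

def ground_sampling_distance(height_m, focal_mm, pixel_um):
    """GSD in cm/pixel: ground footprint of one pixel at nadir.
    GSD = pixel pitch * height / focal length (thin-lens approximation)."""
    gsd_m = (pixel_um * 1e-6) * height_m / (focal_mm * 1e-3)
    return gsd_m * 100.0  # convert m to cm

for h in (50, 100, 200):
    gsd = ground_sampling_distance(h, focal_mm=5.0, pixel_um=2.0)
    print(f"{h:>4} m flying height -> {gsd:.1f} cm/pixel")
```

GSD grows linearly with flying height, which is why, as the abstract notes, the achievable mapping accuracy is governed largely by how high the UAV flies over the scene.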