Open Access · Journal ArticleDOI

A contribution to vision-based autonomous helicopter flight in urban environments

TL;DR: A navigation strategy that exploits the optic flow and inertial information to continuously avoid collisions with both lateral and frontal obstacles has been used to control a simulated helicopter flying autonomously in a textured urban environment.
About
This article was published in Robotics and Autonomous Systems on 2005-03-31 and is currently open access. It has received 125 citations to date. The article focuses on the topics: Obstacle avoidance & Obstacle.


Citations
Journal ArticleDOI

The Perception of the Visual World. By James J. Gibson. U.S.A.: Houghton Mifflin Company, 1950 (George Allen & Unwin, Ltd., London). Price 35s.

TL;DR: A review of James J. Gibson's book The Perception of the Visual World (Houghton Mifflin, 1950).
Journal ArticleDOI

An Introduction to Inertial and Visual Sensing

TL;DR: In this article, the authors present a tutorial introduction to two important senses for biological and robotic systems - inertial and visual perception - discuss the complementarity of these sensors, describe some fundamental approaches to fusing their outputs, and survey the field.
Journal ArticleDOI

Optic flow-based vision system for autonomous 3D localization and control of small aerial vehicles

TL;DR: Experimental results show that the proposed vision-based autopilot enabled a small rotorcraft to achieve fully-autonomous flight using information extracted from optic flow.
Journal ArticleDOI

Fly-inspired visual steering of an ultralight indoor aircraft

TL;DR: Models and algorithms which allow for efficient course stabilization and collision avoidance using optic flow and inertial information are described.
Proceedings ArticleDOI

MAV navigation through indoor corridors using optical flow

TL;DR: An approach for wall collision avoidance is presented that builds a depth map from the optical flow of on-board camera images, using an omnidirectional fisheye camera as the primary sensor and IMU data to compensate for the rotational effects of the optical flow.
References
Book

The Ecological Approach to Visual Perception

TL;DR: The relationship between Stimulation and Stimulus Information for visual perception is discussed in detail in this book, where the author also presents experimental evidence for direct perception of motion in the world and movement of the self.
Proceedings Article

An iterative image registration technique with an application to stereo vision

TL;DR: In this paper, the spatial intensity gradient of the images is used to find a good match using a type of Newton-Raphson iteration, which can be generalized to handle rotation, scaling and shearing.
Journal ArticleDOI

Determining optical flow

TL;DR: In this paper, a method for finding the optical flow pattern is presented which assumes that the apparent velocity of the brightness pattern varies smoothly almost everywhere in the image, and an iterative implementation is shown which successfully computes the Optical Flow for a number of synthetic image sequences.
Book

A robust layered control system for a mobile robot

TL;DR: A new architecture for controlling mobile robots is described, building a robust and flexible robot control system that has been used to control a mobile robot wandering around unconstrained laboratory areas and computer machine rooms.
Frequently Asked Questions (16)
Q1. What have the authors contributed in "A contribution to vision-based autonomous helicopter flight in urban environments" ?

The paper provides comparisons with related work, and discusses the applicability of the approach to real platforms. 

This will be the objective of their future work. 

Pooling pixels by blocks of p × p multiplies the amplitude of detectable motions by p and reduces the number of pixels considered by a factor of p × p.
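A minimal sketch of such a pooling step, assuming the image is a greyscale NumPy array (the function name and the use of average pooling are illustrative assumptions, not details taken from the paper):

```python
import numpy as np

def pool_blocks(image: np.ndarray, p: int) -> np.ndarray:
    """Average-pool a greyscale image over non-overlapping p x p blocks.

    Running the motion search on the pooled image lets displacements p
    times larger be detected for the same search window, while the number
    of pixels to process drops by a factor of p * p.
    """
    h, w = image.shape
    h, w = h - h % p, w - w % p                      # crop so p divides the shape
    blocks = image[:h, :w].reshape(h // p, p, w // p, p)
    return blocks.mean(axis=(1, 3))
```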

While the housefly seems to shorten its turning periods as much as possible, it nevertheless does not give up any control ability and uses the inertial information provided by its halteres to compensate for the rotational component of the optic flow.

The standard deviation of velocity is closely linked to the standard deviation of the mean obstacle distance: in cluttered environments, this value remains relatively constant and low, whereas, in less cluttered environments, it ranges from high values - when the robot is in an open space - to low values - when the robot reaches a corridor or approaches a wall. 

The rotational component of the optic flow does not depend on distance and, therefore, corrupts the measurement of the translational component.
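A sketch of this compensation under a standard pinhole model with normalized image coordinates is given below; the sign conventions and function names are assumptions for illustration, not the paper's exact formulation:

```python
def rotational_flow(x, y, wx, wy, wz):
    """Rotational optic-flow component at normalized image point (x, y) for
    body rates (wx, wy, wz) under a standard pinhole model; it depends only
    on the rates, never on the distance to the scene."""
    u_rot = wx * x * y - wy * (1.0 + x ** 2) + wz * y
    v_rot = wx * (1.0 + y ** 2) - wy * x * y - wz * x
    return u_rot, v_rot

def derotate(u_meas, v_meas, x, y, wx, wy, wz):
    """Subtract the gyro-predicted rotational component so that only the
    distance-dependent translational flow remains."""
    u_rot, v_rot = rotational_flow(x, y, wx, wy, wz)
    return u_meas - u_rot, v_meas - v_rot
```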

Bees also exhibit a clutter response that allows their speed to be adapted to the width of the tunnel by maintaining a constant average motion [38]. 

Vx·Px + ωy(1 + rj²) − ωz(ri·rj) − ωx·ri (Eq. 15). Because the low-level controller prevents the helicopter from skidding aside, the lateral velocity Vy can be set to zero.
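As a hedged illustration of why setting Vy to zero simplifies matters: once the flow has been derotated, only the distance-dependent translational component remains, and for purely translational motion its magnitude follows the classic relation OF ≈ V·sin(θ)/D, where θ is the viewing angle from the motion axis. The sketch below uses that generic relation, not the paper's exact Eq. 15, and all names are assumptions:

```python
import math

def distance_from_flow(of_trans: float, speed: float, theta: float) -> float:
    """Estimate the obstacle distance D from the derotated (translational)
    optic flow at a point seen at angle theta from the motion axis, using
    of_trans ~= speed * sin(theta) / D for purely translational motion."""
    eps = 1e-9                                   # avoid division by zero
    return speed * math.sin(theta) / max(abs(of_trans), eps)
```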

Beyond their flying manoeuvrability, they benefit from a reactive navigation system that enables them to wander in cluttered environments thanks to on-line monitoring of the optic flow. 

To estimate the robustness of their controller with respect to sensor errors, 5 series of 20 experiments using different noise levels were performed in environment 3, the most cluttered one.

Introducing a low-pass filter on η, M_h^right and M_h^left (see Fig. 2) makes the resulting strategy more efficient, even if it adds a delay in the control loop.
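A first-order discrete low-pass filter of the kind mentioned could look like the following sketch; the smoothing coefficient and the way the signals are filtered here are illustrative assumptions:

```python
class LowPass:
    """Discrete first-order low-pass filter: y[k] = y[k-1] + alpha * (x[k] - y[k-1]).

    Smoothing the steering signals reduces jitter in the commands, at the
    cost of the extra control-loop delay noted in the text.
    """
    def __init__(self, alpha: float):
        self.alpha = alpha          # 0 < alpha <= 1; smaller means stronger smoothing
        self.state = None

    def update(self, x: float) -> float:
        self.state = x if self.state is None else self.state + self.alpha * (x - self.state)
        return self.state
```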

The experimental results described below demonstrate that this low-level controller provides a high closed-loop bandwidth and good resistance to perturbations.

Determining this motion is computationally expensive and hardly applicable to real-time applications because of the quadratic dependency of the search on n, the maximum motion detected.
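To make the quadratic dependency concrete, an exhaustive block-matching search over displacements up to ±n scores (2n + 1)² candidate shifts per block, as in this illustrative sketch (array and function names are assumptions):

```python
import numpy as np

def block_match(prev: np.ndarray, curr: np.ndarray, i: int, j: int, b: int, n: int):
    """Exhaustively search for the displacement of the b x b block at (i, j)
    between two frames; (2n + 1)**2 candidate shifts are scored, hence the
    quadratic dependency of the cost on the maximum detectable motion n."""
    ref = prev[i:i + b, j:j + b].astype(np.float64)
    best_ssd, best_d = np.inf, (0, 0)
    for di in range(-n, n + 1):
        for dj in range(-n, n + 1):
            ii, jj = i + di, j + dj
            if ii < 0 or jj < 0 or ii + b > curr.shape[0] or jj + b > curr.shape[1]:
                continue                          # candidate block leaves the frame
            ssd = np.sum((ref - curr[ii:ii + b, jj:jj + b]) ** 2)
            if ssd < best_ssd:
                best_ssd, best_d = ssd, (di, dj)
    return best_d                                 # estimated (row, column) displacement
```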

The simulated robot is a rotary-wing UAV inspired by the Concept 60 SR II (Fig. 1), a remote-controlled helicopter produced by Kyosho.

It is only beyond unrealistic noise levels, attaining or exceeding three times the nominal values, that crashes cannot be avoided. 

From Fig. 5, the following relationship may be derived: h/Y = f/X (Eq. 2). As the low-level controller prevents the helicopter from skidding aside, the authors make the assumption that the lateral velocity Vy is null.
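As a worked illustration of Eq. 2, reading f as the focal length, h as the corresponding image-plane coordinate, and X, Y as the forward and lateral coordinates of the observed point (this interpretation and the numbers are assumptions for illustration only), h/Y = f/X rearranges to Y = h·X/f:

```python
# Purely illustrative numbers, not taken from the paper:
f = 0.006          # assumed focal length, in metres (6 mm)
h = 0.0012         # assumed image-plane coordinate of the point, in metres
X = 10.0           # assumed forward distance to the point, in metres

# From h / Y = f / X (Eq. 2), the lateral coordinate follows directly:
Y = h * X / f
print(Y)           # -> 2.0 metres
```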