Open Access Journal Article

Real-time implementation of airborne inertial-SLAM

TL;DR
This paper addresses some challenges to the real-time implementation of Simultaneous Localisation and Mapping (SLAM) on a UAV platform using an Extended Kalman Filter (EKF), which fuses data from an Inertial Measurement Unit (IMU) with data from a passive vision system.
About
This article was published in Robotics and Autonomous Systems on 2007-01-01 and is currently open access. It has received 182 citations to date. The article focuses on the topics: Extended Kalman filter and Inertial measurement unit.


Citations
Journal Article

Keyframe-based visual-inertial odometry using nonlinear optimization

TL;DR: This work formulates a rigorously probabilistic cost function that combines reprojection errors of landmarks with inertial terms, and compares its performance to an implementation of a state-of-the-art stochastic-cloning sliding-window filter.
Journal Article

High-precision, consistent EKF-based visual-inertial odometry

TL;DR: A novel, real-time EKF-based VIO algorithm is proposed, which achieves consistent estimation by ensuring the correct observability properties of its linearized system model, and performing online estimation of the camera-to-inertial measurement unit (IMU) calibration parameters.
Journal Article

Survey of advances in guidance, navigation, and control of unmanned rotorcraft systems

TL;DR: The main objective of this paper is to present a comprehensive survey of rotorcraft UAS (RUAS) research that captures all seminal works and milestones in each guidance, navigation, and control (GNC) area, with a particular focus on practical methods and technologies that have been demonstrated in flight tests.
Journal Article

An Introduction to Inertial and Visual Sensing

TL;DR: In this article, the authors present a tutorial introduction to two important senses for biological and robotic systems: inertial and visual perception. They discuss the complementarity of these sensors, describe some fundamental approaches to fusing their outputs, and survey the field.
Journal Article

Vision and IMU Data Fusion: Closed-Form Solutions for Attitude, Speed, Absolute Scale, and Bias Determination

TL;DR: This paper investigates the problem of vision and inertial data fusion with the introduction of a very simple and powerful new method that is able to simultaneously estimate all the observable modes with no need for any initialization or a priori knowledge.
References
Book

Strapdown inertial navigation technology

TL;DR: This book discusses the physical principles of inertial navigation, the associated growth of errors and their compensation, and their application across a broad range of systems, drawing on current technological developments and providing an indication of potential future trends.
Book Chapter

Estimating uncertain spatial relationships in robotics

TL;DR: This chapter presents a representation for spatial information, called the stochastic map, together with procedures for building it, reading information from it, and revising it incrementally as new information is obtained, providing a general solution to the problem of estimating uncertain relative spatial relationships.
Journal Article

Optimization of the simultaneous localization and map-building algorithm for real-time implementation

TL;DR: This paper addresses real-time implementation of the simultaneous localization and map-building (SLAM) algorithm and presents optimal algorithms that exploit the special form of the matrices, together with a new compressed filter that can significantly reduce the computation requirements when working in local areas or with high-frequency external sensors.
Journal Article

Simultaneous Localization and Mapping with Sparse Extended Information Filters

TL;DR: It is shown that, when represented in the information form, map posteriors are dominated by a small number of links that tie together nearby features in the map; this insight is developed into a sparse variant of the extended information filter, called the sparse extended information filter (SEIF).
Journal Article

Hierarchical SLAM: real-time accurate mapping of large environments

TL;DR: A close-to-optimal loop-closing method is proposed that, while maintaining independence at the local level, imposes consistency at the global level at a computational cost that is linear in the size of the loop.
Frequently Asked Questions (15)
Q1. What have the authors contributed in "Real-time implementation of airborne inertial-SLAM"?

This paper addresses some challenges to the real-time implementation of Simultaneous Localisation and Mapping (SLAM) on a UAV platform.

Although airborne SLAM is still in its infancy, with many exciting areas for future research, the results presented here have clearly illustrated its capability as a reliable and accurate airborne navigation and mapping system. SLAM consistency and robustness need to be further investigated.

Advances in cost effective inertial sensors and accurate navigation aids, such as the Global Navigation Satellite System (GNSS), have been key determinants of the feasibility of UAV systems. 

The map uncertainty decreases monotonically towards the lower limit, and becomes less sensitive to the addition of further information (information gain).

After take-off, the vehicle underwent autonomous flight in an oval trajectory, and then SLAM was activated from the ground station. 

The relative position vector of the map feature from the sensor, $\mathbf{r}^s_{sm} = [x\; y\; z]^T$, is computed from the range, bearing and elevation measurements using a polar-to-Cartesian transformation. 
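As an illustrative sketch only (the paper's exact axis and sign conventions are not reproduced in this excerpt), one common form of this polar-to-Cartesian transformation, with range $\rho$, bearing $\theta$ and elevation $\varphi$ measured in the sensor frame, is

$$
\mathbf{r}^s_{sm} = \begin{bmatrix} x \\ y \\ z \end{bmatrix}
= \begin{bmatrix} \rho\cos\varphi\cos\theta \\ \rho\cos\varphi\sin\theta \\ \rho\sin\varphi \end{bmatrix}.
$$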

There are still a number of theoretical, technical, and practical issues that need to be resolved, including SLAM consistency, data synchronisation between vision and INS, real-time implementation of the indirect filter, natural feature detection and representation, and the incorporation of sub-map techniques for large-scale deployment. 

The Brumby airframes shown in Fig. 2(a) are capable of flying at approximately 50 m/s and have an endurance of the order of 45 min with a 20 kg payload. 

By using Windows™ development tools, the algorithm can be easily debugged and verified, reducing the overall development time needed. 

The i-th feature position in the navigation frame, $\mathbf{m}^n_i$, is a function of the vehicle position $\mathbf{p}^n$, the sensor lever-arm offset from the body centre $\mathbf{r}^b_{bs}$, and the relative position of the feature as measured from the sensor location, $\mathbf{r}^s_{sm}$, in the sensor frame [11]:

$$\mathbf{m}^n_i = \mathbf{p}^n + C^n_b\left(\mathbf{r}^b_{bs} + C^b_s\,\mathbf{r}^s_{sm}\right) \qquad (2)$$

where $C^n_b$ is the DCM transforming body-frame vectors into the navigation frame. 

In parallel to these efforts, there have been attempts to develop SLAM for 3D environments, for example: the use of rotating laser range finders in mining applications [8], and the use of stereo vision systems for low-dynamic aerial vehicles [9]. 

Unmanned Aerial Vehicles (UAVs) have attracted much attention from robotics researchers in both civilian and defense industries over the past few years. 

Here $C^b_s$ is a DCM which transforms a vector in the sensor frame to the body frame, and is defined for each payload sensor installation. 
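As a minimal numerical sketch of this feature-initialisation step (assuming NumPy, the range/bearing/elevation convention illustrated above, and an illustrative function name, init_feature_nav_frame, that is not from the paper), the computation could look like:

```python
import numpy as np

def init_feature_nav_frame(p_n, C_nb, r_b_bs, C_bs, rho, bearing, elevation):
    """Illustrative sketch: feature position m^n_i in the navigation frame.

    p_n     : (3,) vehicle position in the navigation frame
    C_nb    : (3, 3) DCM rotating body-frame vectors into the navigation frame
    r_b_bs  : (3,) sensor lever-arm offset from the body centre, body frame
    C_bs    : (3, 3) DCM rotating sensor-frame vectors into the body frame
    rho, bearing, elevation : range/bearing/elevation observation of the feature
    """
    # Polar-to-Cartesian conversion of the observation (one common convention;
    # the paper's exact axis and sign conventions are not given in this excerpt).
    r_s_sm = np.array([
        rho * np.cos(elevation) * np.cos(bearing),
        rho * np.cos(elevation) * np.sin(bearing),
        rho * np.sin(elevation),
    ])
    # m^n_i = p^n + C^n_b (r^b_bs + C^b_s r^s_sm), cf. Eq. (2)
    return p_n + C_nb @ (r_b_bs + C_bs @ r_s_sm)
```

With identity DCMs and a zero lever arm, the returned position reduces to the vehicle position plus the Cartesian observation, which gives a quick sanity check on the frame conventions.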

This can be improved by performing a more precise error analysis, using the errors arising from the inertial sensors and during initialisation. 

For airborne applications, to the best of the authors' knowledge there have been only three attempts to date: SLAM on a blimp-type (thus low-dynamic) platform using a stereo vision system [9]; inertial SLAM in a laboratory environment [10]; and SLAM on a fixed-wing UAV with inertial sensors and a single vision system by the present authors [11].