Proceedings ArticleDOI

Tightly-Coupled Aided Inertial Navigation with Point and Plane Features

TL;DR: This paper presents a tightly-coupled aided inertial navigation system (INS) with point and plane features, a general sensor fusion framework applicable to any visual and depth sensor (e.g., RGBD, LiDAR) configuration, in which the camera is used for point feature tracking and the depth sensor for plane extraction.
Abstract
This paper presents a tightly-coupled aided inertial navigation system (INS) with point and plane features, a general sensor fusion framework applicable to any visual and depth sensor (e.g., RGBD, LiDAR) configuration, in which the camera is used for point feature tracking and depth sensor for plane extraction. The proposed system exploits geometrical structures (planes) of the environments and adopts the closest point (CP) for plane parameterization. Moreover, we distinguish planar point features from non-planar point features in order to enforce point-on-plane constraints which are used in our state estimator, thus further exploiting structural information from the environment. We also introduce a simple but effective plane feature initialization algorithm for feature-based simultaneous localization and mapping (SLAM). In addition, we perform online spatial calibration between the IMU and the depth sensor as it is difficult to obtain this critical calibration parameter in high precision. Both Monte-Carlo simulations and real-world experiments are performed to validate the proposed approach.
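The closest-point (CP) parameterization mentioned in the abstract encodes a plane by the single 3-vector from the origin to its closest point on the plane. A minimal sketch of this idea and the resulting point-on-plane residual, with illustrative function names that are not taken from the paper's code:

```python
import numpy as np

def cp_from_normal_distance(n, d):
    """CP vector Pi = d * n for a plane {x : n.x = d} with unit normal n."""
    n = np.asarray(n, dtype=float)
    return d * (n / np.linalg.norm(n))

def normal_distance_from_cp(pi):
    """Recover the unit normal n and origin distance d from a CP vector."""
    pi = np.asarray(pi, dtype=float)
    d = np.linalg.norm(pi)
    return pi / d, d

def point_on_plane_residual(p, pi):
    """Scalar residual n.p - d, zero when point p lies on the plane."""
    n, d = normal_distance_from_cp(pi)
    return float(n @ p - d)

# Example: the plane z = 2 has n = (0, 0, 1), d = 2, so Pi = (0, 0, 2).
pi = cp_from_normal_distance([0.0, 0.0, 1.0], 2.0)
r_on = point_on_plane_residual(np.array([1.0, -3.0, 2.0]), pi)   # on the plane
r_off = point_on_plane_residual(np.array([0.0, 0.0, 3.0]), pi)   # 1 m above it
```

In an estimator, residuals like `point_on_plane_residual` would constrain planar point features to their extracted plane, which is how the point-on-plane constraints described above exploit structure; the update and Jacobian machinery of the paper's filter is omitted here.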


Citations
Proceedings ArticleDOI

OpenVINS: A Research Platform for Visual-Inertial Estimation

TL;DR: This paper performs comprehensive validation of the proposed OpenVINS against state-of-the-art open-source algorithms, demonstrating competitive estimation performance.
Proceedings ArticleDOI

Visual-Inertial Navigation: A Concise Review

TL;DR: Visual-inertial navigation systems (VINS) have become ubiquitous in a wide range of applications, from mobile augmented reality to aerial navigation to autonomous driving, in part because of the sensors' complementary sensing capabilities and their decreasing cost and size.
Posted Content

Visual-Inertial Navigation: A Concise Review

TL;DR: This paper surveys thoroughly the research efforts taken in visual-inertial navigation research and strives to provide a concise but complete review of the related work in the hope to accelerate the VINS research and beyond in the authors' society as a whole.
Journal ArticleDOI

Unified Multi-Modal Landmark Tracking for Tightly Coupled Lidar-Visual-Inertial Odometry

TL;DR: An efficient multi-sensor odometry system for mobile platforms that jointly optimizes visual, lidar, and inertial information within a single integrated factor graph that runs in real-time at full framerate using fixed lag smoothing is presented.
Journal ArticleDOI

Observability Analysis of Aided INS With Heterogeneous Features of Points, Lines, and Planes

TL;DR: A thorough observability analysis for linearized inertial navigation systems (INS) aided by exteroceptive range and/or bearing sensors (such as cameras, LiDAR, and sonars) with different geometric features is performed, and it is proved that there are at least five (or seven) unobservable directions for the linearized aided INS with a single line (plane) feature.
References
Proceedings Article

An iterative image registration technique with an application to stereo vision

TL;DR: In this paper, the spatial intensity gradient of the images is used to find a good match using a type of Newton-Raphson iteration, which can be generalized to handle rotation, scaling and shearing.
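The Newton-Raphson-style registration summarized above can be sketched in one dimension: iteratively shift one signal toward the other using the spatial intensity gradient. This is an illustrative toy version of the idea, not the paper's algorithm (which also handles rotation, scaling, and shearing):

```python
import numpy as np

def estimate_shift(F, G, iters=10):
    """Estimate h such that G(x) ~= F(x + h) by gradient-based iteration."""
    x = np.arange(len(F), dtype=float)
    h = 0.0
    for _ in range(iters):
        Fs = np.interp(x + h, x, F)      # F resampled at the current estimate
        grad = np.gradient(Fs)           # spatial intensity gradient
        den = np.sum(grad * grad)
        if den == 0.0:
            break
        h += np.sum(grad * (G - Fs)) / den   # Newton-Raphson-style update
    return h

# Smooth 1-D "image" and a copy shifted by 1.5 samples.
x = np.arange(100, dtype=float)
F = np.exp(-((x - 50.0) ** 2) / 200.0)
G = np.interp(x + 1.5, x, F)             # G(x) = F(x + 1.5)
h = estimate_shift(F, G)
```

As in the paper, convergence relies on the signal being smooth relative to the displacement, which is why practical trackers built on this idea use image pyramids for larger motions.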
Journal ArticleDOI

VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator

TL;DR: In this article, a robust and versatile monocular visual-inertial state estimator is presented; a monocular camera paired with an IMU is the minimum sensor suite (in size, weight, and power) for metric six degrees-of-freedom (DOF) state estimation.