scispace - formally typeset

Visual odometry

About: Visual odometry is a research topic. Over its lifetime, 2,925 publications have been published within this topic, receiving 85,605 citations.


Papers

Proceedings ArticleDOI: 10.1109/CVPR.2012.6248074
Andreas Geiger1, Philip Lenz1, Raquel Urtasun2
16 Jun 2012
Abstract: Today, visual recognition systems are still rarely employed in robotics applications. Perhaps one of the main reasons for this is the lack of demanding benchmarks that mimic such scenarios. In this paper, we take advantage of our autonomous driving platform to develop novel challenging benchmarks for the tasks of stereo, optical flow, visual odometry/SLAM and 3D object detection. Our recording platform is equipped with four high resolution video cameras, a Velodyne laser scanner and a state-of-the-art localization system. Our benchmarks comprise 389 stereo and optical flow image pairs, stereo visual odometry sequences of 39.2 km length, and more than 200k 3D object annotations captured in cluttered scenarios (up to 15 cars and 30 pedestrians are visible per image). Results from state-of-the-art algorithms reveal that methods ranking high on established datasets such as Middlebury perform below average when being moved outside the laboratory to the real world. Our goal is to reduce this bias by providing challenging benchmarks with novel difficulties to the computer vision community. Our benchmarks are available online at: www.cvlibs.net/datasets/kitti


Topics: Visual odometry (61%), Stereo cameras (58%), Optical flow (53%)

7,520 Citations
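The KITTI odometry benchmark above scores methods by their average relative translational error over subsequences of fixed path length. A minimal sketch of that style of metric (a simplified reading, not the official evaluation code; poses are assumed to be 4×4 camera-to-world matrices, and the function names are invented for this sketch):

```python
import numpy as np

def trajectory_distances(poses):
    """Cumulative path length at each pose (poses: list of 4x4 arrays)."""
    dists = [0.0]
    for a, b in zip(poses[:-1], poses[1:]):
        dists.append(dists[-1] + np.linalg.norm(b[:3, 3] - a[:3, 3]))
    return dists

def segment_translation_error(gt, est, seg_len):
    """Average relative translational error (%) over all subsequences whose
    ground-truth path length reaches seg_len, in the spirit of the KITTI
    odometry metric. Returns None if no subsequence is long enough."""
    dists = trajectory_distances(gt)
    errors = []
    for i in range(len(gt)):
        # first frame j whose ground-truth path length from i reaches seg_len
        j = next((k for k in range(i + 1, len(gt))
                  if dists[k] - dists[i] >= seg_len), None)
        if j is None:
            break
        rel_gt = np.linalg.inv(gt[i]) @ gt[j]
        rel_est = np.linalg.inv(est[i]) @ est[j]
        err = np.linalg.inv(rel_est) @ rel_gt   # residual relative motion
        errors.append(np.linalg.norm(err[:3, 3]) / seg_len)
    return 100.0 * float(np.mean(errors)) if errors else None
```

For example, an estimate with a uniform 1% scale drift along a straight ground-truth path scores a 1% error under this metric.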


Proceedings ArticleDOI: 10.1109/ISMAR.2007.4538852
Georg Klein1, David W. Murray1
13 Nov 2007
Abstract: This paper presents a method of estimating camera pose in an unknown scene. While this has previously been attempted by adapting SLAM algorithms developed for robotic exploration, we propose a system specifically designed to track a hand-held camera in a small AR workspace. We propose to split tracking and mapping into two separate tasks, processed in parallel threads on a dual-core computer: one thread deals with the task of robustly tracking erratic hand-held motion, while the other produces a 3D map of point features from previously observed video frames. This allows the use of computationally expensive batch optimisation techniques not usually associated with real-time operation: The result is a system that produces detailed maps with thousands of landmarks which can be tracked at frame-rate, with an accuracy and robustness rivalling that of state-of-the-art model-based systems.


3,776 Citations
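The tracking/mapping split described above can be sketched as two threads sharing a map. This is a toy illustration of the architecture only, not PTAM's actual implementation: the `SharedMap` class, the keyframe queue, and the `None` sentinel are all invented for this sketch (the real mapping thread runs expensive batch bundle adjustment).

```python
import threading
import queue

class SharedMap:
    """Minimal stand-in for the point map, guarded by a lock so the
    tracking and mapping threads can access it concurrently."""
    def __init__(self):
        self.lock = threading.Lock()
        self.keyframes = []

    def add_keyframe(self, kf):
        with self.lock:
            self.keyframes.append(kf)

def tracker(kf_queue, frames):
    # Tracking thread: decides which frames become keyframes (here: every
    # frame, for brevity) and hands them to the mapping thread.
    for f in frames:
        kf_queue.put(f)
    kf_queue.put(None)          # sentinel: no more keyframes

def mapper(kf_queue, shared_map):
    # Mapping thread: integrates keyframes into the map; this is where the
    # computationally expensive batch optimisation would run.
    while True:
        kf = kf_queue.get()
        if kf is None:
            break
        shared_map.add_keyframe(kf)

m = SharedMap()
q = queue.Queue()
t1 = threading.Thread(target=tracker, args=(q, list(range(5))))
t2 = threading.Thread(target=mapper, args=(q, m))
t1.start(); t2.start()
t1.join(); t2.join()
```

The key design point the paper makes is exactly this decoupling: the tracker must keep up with the camera at frame rate, while the mapper is free to spend many frames' worth of time refining the map.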


Open access Journal ArticleDOI: 10.1109/TPAMI.2007.1049
Abstract: We present a real-time algorithm which can recover the 3D trajectory of a monocular camera, moving rapidly through a previously unknown scene. Our system, which we dub MonoSLAM, is the first successful application of the SLAM methodology from mobile robotics to the "pure vision" domain of a single uncontrolled camera, achieving real time but drift-free performance inaccessible to structure from motion approaches. The core of the approach is the online creation of a sparse but persistent map of natural landmarks within a probabilistic framework. Our key novel contributions include an active approach to mapping and measurement, the use of a general motion model for smooth camera movement, and solutions for monocular feature initialization and feature orientation estimation. Together, these add up to an extremely efficient and robust algorithm which runs at 30 Hz with standard PC and camera hardware. This work extends the range of robotic systems in which SLAM can be usefully applied, but also opens up new areas. We present applications of MonoSLAM to real-time 3D localization and mapping for a high-performance full-size humanoid robot and live augmented reality with a hand-held camera


Topics: Camera auto-calibration (65%), Smart camera (62%), Visual odometry (54%)

3,319 Citations
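The "general motion model for smooth camera movement" mentioned above is a constant-velocity model inside an extended Kalman filter. A heavily simplified prediction step is sketched below, assuming a state of 3D position and velocity only; the actual MonoSLAM state also carries orientation (as a quaternion), angular velocity, and the map landmarks, and the function name here is invented:

```python
import numpy as np

def ekf_predict(x, P, dt, accel_noise=1.0):
    """One EKF prediction step under a constant-velocity motion model.
    x: state [px, py, pz, vx, vy, vz]; P: 6x6 covariance; dt: time step.
    accel_noise: variance of an unknown acceleration impulse per step."""
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)            # position integrates velocity
    # Process noise injected through an unmodelled acceleration impulse
    G = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])
    Q = accel_noise * (G @ G.T)
    return F @ x, F @ P @ F.T + Q         # predicted mean and covariance
```

The covariance grows at every prediction and is shrunk again by landmark measurements; that measure-and-correct loop is what keeps the estimate drift-free relative to the mapped landmarks.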


Open access Journal ArticleDOI: 10.1109/TRO.2017.2705103
Raul Mur-Artal1, Juan D. Tardós1
Abstract: We present ORB-SLAM2, a complete simultaneous localization and mapping (SLAM) system for monocular, stereo and RGB-D cameras, including map reuse, loop closing, and relocalization capabilities. The system works in real time on standard central processing units in a wide variety of environments from small hand-held indoors sequences, to drones flying in industrial environments and cars driving around a city. Our back-end, based on bundle adjustment with monocular and stereo observations, allows for accurate trajectory estimation with metric scale. Our system includes a lightweight localization mode that leverages visual odometry tracks for unmapped regions and matches with map points that allow for zero-drift localization. The evaluation on 29 popular public sequences shows that our method achieves state-of-the-art accuracy, being in most cases the most accurate SLAM solution. We publish the source code, not only for the benefit of the SLAM community, but with the aim of being an out-of-the-box SLAM solution for researchers in other fields.


2,408 Citations


Open access Proceedings ArticleDOI: 10.1109/IROS.2012.6385773
Jürgen Sturm1, Nikolas Engelhard2, Felix Endres2, Wolfram Burgard2 +1 more
24 Dec 2012
Abstract: In this paper, we present a novel benchmark for the evaluation of RGB-D SLAM systems. We recorded a large set of image sequences from a Microsoft Kinect with highly accurate and time-synchronized ground truth camera poses from a motion capture system. The sequences contain both the color and depth images in full sensor resolution (640 × 480) at video frame rate (30 Hz). The ground-truth trajectory was obtained from a motion-capture system with eight high-speed tracking cameras (100 Hz). The dataset consists of 39 sequences that were recorded in an office environment and an industrial hall. The dataset covers a large variety of scenes and camera motions. We provide sequences for debugging with slow motions as well as longer trajectories with and without loop closures. Most sequences were recorded from a handheld Kinect with unconstrained 6-DOF motions but we also provide sequences from a Kinect mounted on a Pioneer 3 robot that was manually navigated through a cluttered indoor environment. To stimulate the comparison of different approaches, we provide automatic evaluation tools both for the evaluation of drift of visual odometry systems and the global pose error of SLAM systems. The benchmark website [1] contains all data, detailed descriptions of the scenes, specifications of the data formats, sample code, and evaluation tools.


  • Fig. 1. We present a large dataset for the evaluation of RGB-D SLAM systems in (a) a typical office environment and (b) an industrial hall. We obtained the ground truth camera position from a motion capture system using reflective markers on (c) a hand-held and (d) a robot-mounted Kinect sensor.
  • Fig. 2. Four examples of sequences contained in our dataset. Whereas the top row shows an example image, the bottom row shows the ground truth trajectory. The fr1/xyz sequence contains isolated motions along the coordinate axes, fr1/room and fr2/desk are sequences with several loop closures in two different office scenes, and fr2/slam was recorded from a Kinect mounted on a Pioneer 3 robot in a search-and-rescue scenario.
  • Fig. 5. (a) Checkerboard used for the calibration and the time synchronization. (b) Analysis of the time delay between the motion capture system and the color camera of the Kinect sensor.
  • Fig. 6. Evaluating the drift by means of the relative pose error (RPE) of two visual odometry approaches on the fr1/desk sequence. As can be seen from this plot, RBM has lower drift and fewer outliers than GICP. For more details, see [44].
  • Fig. 7. (a) Visualization of the absolute trajectory error (ATE) on the “fr1/desk2” sequence. (b) Comparison of ATE and RPE measures. Both plots were generated from trajectories estimated by the RGB-D SLAM system [47].
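The RPE and ATE measures referenced in the figures can be sketched as follows. This is a simplified reading of the benchmark's definitions (function names invented here): the official tools additionally handle timestamp association, and ATE is normally computed after a least-squares rigid-body alignment of the two trajectories, which is omitted below.

```python
import numpy as np

def rpe_rmse(gt, est, delta=1):
    """Root-mean-square translational relative pose error, i.e. the drift
    accumulated over `delta` frames. gt, est: equal-length lists of 4x4
    camera-to-world poses."""
    errs = []
    for i in range(len(gt) - delta):
        rel_gt = np.linalg.inv(gt[i]) @ gt[i + delta]
        rel_est = np.linalg.inv(est[i]) @ est[i + delta]
        e = np.linalg.inv(rel_gt) @ rel_est       # residual relative motion
        errs.append(np.linalg.norm(e[:3, 3]) ** 2)
    return float(np.sqrt(np.mean(errs)))

def ate_rmse(gt, est):
    """Root-mean-square absolute trajectory error between the position
    sequences (trajectory alignment step omitted for brevity)."""
    d = np.array([g[:3, 3] - e[:3, 3] for g, e in zip(gt, est)])
    return float(np.sqrt(np.mean(np.sum(d * d, axis=1))))
```

The two measures are complementary, as Fig. 7(b) suggests: a trajectory shifted by a constant offset has zero RPE but nonzero ATE, while RPE exposes local drift that a globally consistent SLAM estimate can hide.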

Topics: Visual odometry (53%), Video tracking (52%), Pose (51%)

2,269 Citations


Performance Metrics

Number of papers in this topic in previous years:

Year    Papers
2022         2
2021       206
2020       280
2019       350
2018       342
2017       316

Top Attributes


Topic's top 5 most impactful authors

Davide Scaramuzza: 27 papers, 4.5K citations
Daniel Cremers: 26 papers, 4.2K citations
Roland Siegwart: 22 papers, 2.6K citations
Timothy D. Barfoot: 21 papers, 439 citations
Larry Matthies: 14 papers, 860 citations

Network Information
Related Topics (5)
Mobile robot: 66.7K papers, 1.1M citations (93% related)
Pose: 15.5K papers, 431.6K citations (92% related)
Motion planning: 32.8K papers, 553.5K citations (91% related)
Robot control: 35.2K papers, 578.8K citations (91% related)
Robot kinematics: 18.1K papers, 308K citations (90% related)