
Inertial measurement unit

About: Inertial measurement unit is a research topic. Over its lifetime, 13,326 publications have been published on this topic, receiving 189,083 citations. The topic is also known as: IMU.


Papers
Journal ArticleDOI
TL;DR: A novel closed-form measurement model based on image data and IMU output signals is introduced for a vision-aided inertial navigation system; unlike existing methods, it is independent of the underlying vision algorithm used for image motion estimation, such as optical flow.
Abstract: In this paper, a motion estimation approach is introduced for a vision-aided inertial navigation system. The system consists of a ground-facing monocular camera mounted on an inertial measurement unit (IMU) to form an IMU-camera sensor fusion system. The motion estimation procedure fuses inertial data from the IMU and planar features on the ground captured by the camera. The main contribution of this paper is a novel closed-form measurement model based on the image data and IMU output signals. In contrast to existing methods, our algorithm is independent of the underlying vision algorithm used for image motion estimation, such as optical flow. The algorithm has been implemented using an unscented Kalman filter, which propagates the current state of the system together with the state updated at the previous measurement instant. The validity of the proposed navigation method is evaluated both by simulation studies and by real experiments.

70 citations
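To make the fusion pipeline above concrete, here is a minimal unscented-Kalman-filter step in the same spirit: the IMU accelerometer drives the process model and a camera-derived planar measurement corrects the state. The state layout, measurement function h, and noise values are illustrative assumptions, not the paper's closed-form measurement model.

```python
import numpy as np

# Minimal UKF sketch for IMU + camera fusion.
# State x = [px, py, vx, vy]; the IMU accelerometer drives the process
# model, and the camera supplies a planar position-like measurement.
# All models and noise values here are illustrative stand-ins.

def sigma_points(x, P, kappa=1.0):
    n = len(x)
    S = np.linalg.cholesky((n + kappa) * P)   # columns are sigma offsets
    pts = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def f(x, acc, dt):
    # constant-acceleration propagation from IMU readings
    px, py, vx, vy = x
    return np.array([px + vx * dt + 0.5 * acc[0] * dt**2,
                     py + vy * dt + 0.5 * acc[1] * dt**2,
                     vx + acc[0] * dt,
                     vy + acc[1] * dt])

def h(x):
    # camera observes planar position of ground features (assumed model)
    return x[:2]

def ukf_step(x, P, acc, z, dt, Q, R):
    # predict: push sigma points through the IMU-driven process model
    X, w = sigma_points(x, P)
    Xp = np.array([f(xi, acc, dt) for xi in X])
    x_pred = w @ Xp
    P_pred = Q + sum(wi * np.outer(d, d) for wi, d in zip(w, Xp - x_pred))
    # update: compare predicted camera measurement with the actual one
    Z = np.array([h(xi) for xi in Xp])
    z_pred = w @ Z
    Pzz = R + sum(wi * np.outer(d, d) for wi, d in zip(w, Z - z_pred))
    Pxz = sum(wi * np.outer(dx, dz)
              for wi, dx, dz in zip(w, Xp - x_pred, Z - z_pred))
    K = Pxz @ np.linalg.inv(Pzz)
    return x_pred + K @ (z - z_pred), P_pred - K @ Pzz @ K.T

# usage with assumed noise levels
x, P = np.zeros(4), np.eye(4) * 0.1
Q, R = np.eye(4) * 1e-4, np.eye(2) * 1e-2
x, P = ukf_step(x, P, acc=np.array([0.1, 0.0]),
                z=np.array([0.0005, 0.0]), dt=0.1, Q=Q, R=R)
```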

Patent
26 Nov 2002
TL;DR: In this article, a cross-coupling algorithm cross-couples different sensed inertial motions, and a variety of mapping schemes may be used to map the sensed inertial motion to corresponding motion within the evocative scene shown on the display.
Abstract: A motion-coupled visual environment prevents, reduces and/or treats motion sickness by sensing inertial motion and providing a corresponding evocative image for a subject to view. Inertial sensors may include accelerometers, gyroscopes or a variety of other different sensor types. A cross-coupling algorithm may be used to cross couple different sensed inertial motions. A variety of mapping schemes may be used to map sensed inertial motion to corresponding motion within the evocative scene displayed on the display. Applications include reducing motion sickness on passenger vehicles such as airplanes, trains and cars; on military vehicles such as ships, airplanes, helicopters and the like; and reducing “cybersickness” in the context of simulations on moving platforms.

69 citations
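The patent's key idea, mapping sensed inertial motion into scene motion with cross-coupled axes, can be sketched in a few lines. The gain matrix and axis conventions below are hypothetical placeholders, not values from the patent.

```python
import numpy as np

# Illustrative mapping from sensed inertial motion to evocative-scene motion.
# Off-diagonal gains cross-couple axes (e.g., sensed surge also drives scene
# pitch); all gains and axis choices here are assumptions for illustration.

CROSS_COUPLING = np.array([
    # roll_rate  pitch_rate  surge_accel   -> scene degrees of freedom
    [1.0,        0.0,        0.0],   # scene roll
    [0.0,        1.0,        0.1],   # scene pitch, slightly driven by surge
    [0.0,        0.2,        1.0],   # scene fore-aft translation
])

def map_motion(gyro_roll, gyro_pitch, accel_surge):
    """Return (scene_roll, scene_pitch, scene_translation) commands."""
    sensed = np.array([gyro_roll, gyro_pitch, accel_surge])
    return CROSS_COUPLING @ sensed

print(map_motion(0.02, -0.01, 0.5))
```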

Journal ArticleDOI
TL;DR: The position and velocity accuracy of the integrated system during complete and partial GPS data outages is investigated, and the benefit of using inertial data to improve the ambiguity resolution process after such outages is addressed.
Abstract: Integration of GPS with inertial sensors can provide many benefits for navigation, from improved accuracy to increased reliability. The extent of such benefits, however, is typically a function of the quality of the inertial system used. Traditionally, high-cost, navigation-grade inertial measurement units (IMUs) have been used to obtain the highest position and velocity accuracies. However, the work documented in this paper uses a Honeywell HG-1700 IMU (1 deg/h) to assess the benefits of a tactical-grade IMU in aiding GPS for high-accuracy (centimeter-level) applications. To this end, the position and velocity accuracy of the integrated system during complete and partial GPS data outages is investigated. The benefit of using inertial data to improve the ambiguity resolution process after such data outages is also addressed in detail. Centralized and decentralized filtering strategies are compared in terms of system performance.

69 citations
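The outage-bridging behavior described above boils down to a filter that always propagates on inertial data and applies GPS corrections only when fixes arrive. A one-dimensional loosely-coupled toy sketch follows; the noise levels, rates, and simulated outage are assumptions, not the paper's HG-1700 configuration.

```python
import numpy as np

# Loosely-coupled GPS/INS sketch: the state is propagated with IMU
# accelerations at high rate, and GPS position fixes correct it only when
# available, so the filter "coasts" on inertial data through outages.

dt = 0.01                          # 100 Hz IMU, assumed
F = np.array([[1, dt], [0, 1]])    # [position, velocity] propagation
B = np.array([0.5 * dt**2, dt])    # accelerometer input mapping
H = np.array([[1.0, 0.0]])         # GPS measures position only
Q = np.eye(2) * 1e-6               # process (IMU) noise, assumed
R = np.array([[4.0]])              # GPS noise (2 m sigma), assumed

x, P = np.zeros(2), np.eye(2)
for k in range(1000):
    accel = 0.1                    # placeholder IMU reading
    x = F @ x + B * accel          # inertial prediction (always runs)
    P = F @ P @ F.T + Q
    gps_available = not (300 <= k < 600)     # simulated 3 s outage
    if gps_available:
        z = np.array([x[0] + np.random.randn() * 2.0])  # synthetic fix
        y = z - H @ x              # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
```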

Journal ArticleDOI
TL;DR: An efficient multi-sensor odometry system for mobile platforms is presented that jointly optimizes visual, lidar, and inertial information within a single integrated factor graph and runs in real time at full framerate using fixed-lag smoothing.
Abstract: We present an efficient multi-sensor odometry system for mobile platforms that jointly optimizes visual, lidar, and inertial information within a single integrated factor graph. This runs in real-time at full framerate using fixed lag smoothing. To perform such tight integration, a new method to extract 3D line and planar primitives from lidar point clouds is presented. This approach overcomes the suboptimality of typical frame-to-frame tracking methods by treating the primitives as landmarks and tracking them over multiple scans. True integration of lidar features with standard visual features and IMU is made possible using a subtle passive synchronization of lidar and camera frames. The lightweight formulation of the 3D features allows for real-time execution on a single CPU. Our proposed system has been tested on a variety of platforms and scenarios, including underground exploration with a legged robot and outdoor scanning with a dynamically moving handheld device, for a total duration of 96 min and 2.4 km traveled distance. In these test sequences, using only one exteroceptive sensor leads to failure due to either underconstrained geometry (affecting lidar) or textureless areas caused by aggressive lighting changes (affecting vision). In these conditions, our factor graph naturally uses the best information available from each sensor modality without any hard switches.

69 citations
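The fixed-lag smoothing that keeps this system at framerate can be illustrated structurally: only the most recent states stay in the optimization window, and factors touching evicted states are folded away. The sketch below is a conceptual skeleton (a real smoother would marginalize old states into a prior rather than simply dropping their factors), not the paper's implementation.

```python
from collections import deque

# Conceptual fixed-lag window: keep only the last LAG states in the
# optimization so solver cost stays bounded and the system runs at framerate.

LAG = 10  # window length in keyframes, assumed

class FixedLagWindow:
    def __init__(self, lag):
        self.lag = lag
        self.states = deque()        # (state_id, estimate) in the window
        self.factors = []            # (state_ids, residual_fn) constraints

    def add_state(self, state_id, initial_guess):
        self.states.append((state_id, initial_guess))
        if len(self.states) > self.lag:
            old_id, _ = self.states.popleft()
            # keep only factors fully inside the window; a real smoother
            # would marginalize, folding old information into a prior
            self.factors = [f for f in self.factors if old_id not in f[0]]

    def add_factor(self, state_ids, residual_fn):
        # residual_fn could be an IMU preintegration term, a visual
        # reprojection error, or a lidar line/plane primitive constraint
        self.factors.append((state_ids, residual_fn))
```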

Journal ArticleDOI
TL;DR: This paper presents a new lightweight real-time onboard object tracking approach with multi-inertial sensing data, wherein a highly energy-efficient drone is built based on the Snapdragon flight board of Qualcomm.
Abstract: Real-time object tracking on a drone under a dynamic environment has been a challenging issue for many years, with existing approaches using off-line calculation or powerful computation units on board. This paper presents a new lightweight real-time onboard object tracking approach with multi-inertial sensing data, wherein a highly energy-efficient drone is built based on the Snapdragon flight board of Qualcomm. The flight board uses a digital signal processor core of the Snapdragon 801 processor to realize PX4 autopilot, an open-source autopilot system oriented toward inexpensive autonomous aircraft. It also uses an ARM core to realize Linux, robot operating systems, open-source computer vision library, and related algorithms. A lightweight moving object detection algorithm is proposed that extracts feature points in the video frame using the oriented FAST and rotated binary robust independent elementary features algorithm and adapts a local difference binary algorithm to construct the image binary descriptors. The K-nearest neighbor method is then used to match the image descriptors. Finally, an object tracking method is proposed that fuses inertial measurement unit data, global positioning system data, and the moving object detection results to calculate the relative position between coordinate systems of the object and the drone. All the algorithms are run on the Qualcomm platform in real time. Experimental results demonstrate the superior performance of our method over the state-of-the-art visual tracking method.

69 citations
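The front end described above, ORB feature extraction followed by K-nearest-neighbor matching, maps directly onto standard OpenCV calls. The sketch below covers only that stage, with Lowe's ratio test added as a common disambiguation step; the paper's LDB descriptors and IMU/GPS fusion are not shown.

```python
import cv2

# ORB keypoints and descriptors, matched across consecutive grayscale
# frames with k-nearest-neighbor search plus a ratio test.

orb = cv2.ORB_create(nfeatures=500)
bf = cv2.BFMatcher(cv2.NORM_HAMMING)

def match_frames(prev_gray, curr_gray, ratio=0.75):
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return []
    # two nearest neighbors per descriptor, keep unambiguous matches only
    pairs = bf.knnMatch(des1, des2, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in good]
```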


Network Information
Related Topics (5)
Feature extraction: 111.8K papers, 2.1M citations (81% related)
Wireless sensor network: 142K papers, 2.4M citations (81% related)
Control theory: 299.6K papers, 3.1M citations (80% related)
Convolutional neural network: 74.7K papers, 2M citations (79% related)
Wireless: 133.4K papers, 1.9M citations (79% related)
Performance Metrics
No. of papers in the topic in previous years:

Year | Papers
2023 | 1,067
2022 | 2,256
2021 | 852
2020 | 1,150
2019 | 1,181
2018 | 1,162