
Showing papers on "Inertial measurement unit published in 2013"


Proceedings ArticleDOI
01 Nov 2013
TL;DR: A novel framework for jointly estimating the temporal offset between measurements of different sensors and their spatial displacements with respect to each other is presented, enabled by continuous-time batch estimation and extends previous work by seamlessly incorporating time offsets within the rigorous theoretical framework of maximum likelihood estimation.
Abstract: In order to increase accuracy and robustness in state estimation for robotics, a growing number of applications rely on data from multiple complementary sensors. For the best performance in sensor fusion, these different sensors must be spatially and temporally registered with respect to each other. To this end, a number of approaches have been developed to estimate these system parameters in a two-stage process, first estimating the time offset and subsequently solving for the spatial transformation between sensors. In this work, we present a novel framework for jointly estimating the temporal offset between measurements of different sensors and their spatial displacements with respect to each other. The approach is enabled by continuous-time batch estimation and extends previous work by seamlessly incorporating time offsets within the rigorous theoretical framework of maximum likelihood estimation. Experimental results for a camera to inertial measurement unit (IMU) calibration prove the ability of this framework to accurately estimate time offsets up to a fraction of the smallest measurement period.

626 citations


Journal ArticleDOI
Derek K. Shaeffer
TL;DR: This tutorial provides an overview of MEMS technology and describes the essential features of the mechanical systems underlying the most common sensors: accelerometers and gyroscopes. It also reviews multisensor silicon MEMS/CMOS monolithic integration, which is driving the cost and form-factor reduction behind the current proliferation of these devices.
Abstract: Inertial sensors based on MEMS technology are fast becoming ubiquitous with their adoption into many types of consumer electronics products, including smart phones, tablets, gaming systems, TV remotes, toys, and even (more recently) power tools and wearable sensors. Now a standard feature of most smart phones, MEMS-based motion tracking enhances the user interface by allowing response to user motions, complements the GPS receiver by providing dead-reckoning indoor navigation and supporting location-based services, and holds the promise of enabling optical image stabilization in next-generation handsets by virtue of its lower cost and small form factor. This tutorial provides an overview of MEMS technology and describes the essential features of the mechanical systems underlying the most common sensors: accelerometers and gyroscopes. It also highlights some fundamental trade-offs related to mechanical system dynamics, force and charge transduction methods, and their implications for the mixed-signal systems that process the sensor outputs. The presentation of an energy-based metric allows a comparison of the performance of competing sensor solutions. For each type of sensor, descriptions of the underlying mechanical theory, canonical sensor architectures, and key design challenges are also presented. Finally, the tutorial reviews multisensor silicon MEMS/CMOS monolithic integration, which is driving the cost and form-factor reduction behind the current proliferation of these devices.

286 citations


Proceedings ArticleDOI
23 Jun 2013
TL;DR: Combining turning and braking events helps better differentiate between two similar drivers when using supervised learning techniques compared to separate events alone, albeit with anemic performance.
Abstract: Much current research focuses on using smartphones as data collection devices, and many studies have shown that their sensors can replace a lab test bed. These inertial sensors can be used to segment and classify driving events fairly accurately. In this research we explore the possibility of using the vehicle's inertial sensors, read from the CAN bus, to build a profile of the driver and ultimately provide feedback that reduces the number of dangerous car maneuvers. Braking and turning events are better at characterizing an individual than acceleration events. Histogramming the time-series values of the sensor data does not help performance. Furthermore, combining turning and braking events helps better differentiate between two similar drivers when using supervised learning techniques compared to separate events alone, albeit with anemic performance.

272 citations
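The event-level feature extraction this abstract alludes to can be sketched as follows; the statistics chosen (mean, standard deviation, extrema, duration) and the function names are illustrative assumptions, not the paper's exact feature set:

```python
import numpy as np

def event_features(signal):
    """Summary statistics for one driving event, e.g. the lateral
    acceleration trace of a turn or the longitudinal trace of a
    braking manoeuvre (hypothetical feature choice)."""
    s = np.asarray(signal, dtype=float)
    return np.array([s.mean(), s.std(), s.min(), s.max(), len(s)])

def combined_features(braking, turning):
    """Concatenate braking and turning event features into one vector,
    mirroring the finding that the combination separates similar
    drivers better than either event type alone."""
    return np.concatenate([event_features(braking), event_features(turning)])
```

The combined vector can then be fed to any supervised classifier; the paper reports only modest accuracy even with the combination.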


Journal ArticleDOI
TL;DR: An equivalent IMU factor based on a recently developed technique for IMU pre-integration is introduced, drastically reducing the number of states that must be added to the system.

230 citations
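The idea behind IMU pre-integration is to compress many raw gyro and accelerometer samples between two keyframes into a single relative-motion increment that does not depend on the absolute start state, so only one factor per keyframe pair enters the optimization. A minimal planar (2-D) sketch, omitting the bias and noise terms that the full technique models:

```python
import numpy as np

def preintegrate(omegas, accels, dt):
    """Pre-integrate planar gyro rates (rad/s) and body-frame
    accelerations (m/s^2) into relative heading, velocity, and
    position increments between two keyframes.
    Simplified 2-D sketch: no bias, noise, or gravity terms."""
    dR = 0.0               # accumulated relative heading (rad)
    dv = np.zeros(2)       # relative velocity increment
    dp = np.zeros(2)       # relative position increment
    for w, a in zip(omegas, accels):
        c, s = np.cos(dR), np.sin(dR)
        R = np.array([[c, -s], [s, c]])        # rotation accumulated so far
        dp = dp + dv * dt + 0.5 * (R @ a) * dt ** 2
        dv = dv + (R @ a) * dt
        dR = dR + w * dt
    return dR, dv, dp
```

Because the increments are expressed relative to the first keyframe's frame, re-linearizing the keyframe state does not require re-integrating the raw measurements.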


Journal ArticleDOI
01 Jan 2013
TL;DR: A literature review of several current IMU categories and applications is presented, along with current methods used to improve the accuracy of IMU output and avoid the errors that the latest IMUs face.
Abstract: Inertial Measurement Unit (IMU) sensors are used widely in many different movable applications. Over many years, the improvements and applications of IMUs have spread through various areas such as manufacturing, navigation, and robotics. This paper presents a literature review of several current IMU categories and applications. A few considerations on choosing an IMU for different applications are summarized, and current methods used to improve the accuracy of IMU output are also presented, addressing the errors that the latest IMUs face. Improvement methods include control algorithms and the type of filter used for the sensor. Pros and cons of the types and algorithms used are also discussed in relation to different applications.

229 citations


Proceedings ArticleDOI
24 Jun 2013
TL;DR: A novel approach to tightly integrate visual measurements with readings from an Inertial Measurement Unit (IMU) in SLAM using the powerful concept of ‘keyframes’ to maintain a bounded-sized optimization window, ensuring real-time operation.
Abstract: The fusion of visual and inertial cues has become popular in robotics due to the complementary nature of the two sensing modalities. While most fusion strategies to date rely on filtering schemes, the visual robotics community has recently turned to non-linear optimization approaches for tasks such as visual Simultaneous Localization And Mapping (SLAM), following the discovery that this comes with significant advantages in quality of performance and computational complexity. Following this trend, we present a novel approach to tightly integrate visual measurements with readings from an Inertial Measurement Unit (IMU) in SLAM. An IMU error term is integrated with the landmark reprojection error in a fully probabilistic manner, resulting in a joint non-linear cost function to be optimized. Employing the powerful concept of ‘keyframes’, we partially marginalize old states to maintain a bounded-sized optimization window, ensuring real-time operation. Comparing against both vision-only and loosely-coupled visual-inertial algorithms, our experiments confirm the benefits of tight fusion in terms of accuracy and robustness.

225 citations


Journal ArticleDOI
TL;DR: In this article, an adaptive Kalman filter (AKF) with linear models is proposed to improve the computational efficiency and dynamic performance of low cost Inertial Measurement Unit (IMU)/magnetometer integrated Attitude and Heading Reference Systems (AHRS).
Abstract: To improve the computational efficiency and dynamic performance of low-cost Inertial Measurement Unit (IMU)/magnetometer integrated Attitude and Heading Reference Systems (AHRS), this paper proposes an effective Adaptive Kalman Filter (AKF) with linear models; the filter gain is adaptively tuned according to the dynamic scale sensed by the accelerometers. This approach does not need to model the system's angular motions, avoids the non-linear problem inherent in existing methods, and accounts for the impact of dynamic acceleration on the filter. The experimental results with real data have demonstrated that the proposed algorithm can maintain an accurate estimation of orientation, even under various dynamic operating conditions.

198 citations
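The gain-adaptation idea above — trust the accelerometer less as the sensed acceleration departs from 1 g — can be illustrated with a simple complementary-filter sketch. The Gaussian weighting rule and all constants here are hypothetical, not the paper's adaptation law:

```python
import numpy as np

G = 9.81  # gravity magnitude, m/s^2

def adaptive_gain(accel, k_max=0.02, sigma=0.5):
    """Scale the accelerometer correction gain down as the sensed
    acceleration magnitude departs from 1 g (illustrative rule)."""
    dyn = abs(np.linalg.norm(accel) - G)     # dynamic-acceleration scale
    return k_max * np.exp(-(dyn / sigma) ** 2)

def update_pitch(pitch, gyro_rate, accel, dt):
    """One update step: gyro propagation plus an adaptively weighted
    accelerometer tilt correction (complementary-filter form)."""
    pitch_gyro = pitch + gyro_rate * dt            # propagate with gyro
    pitch_acc = np.arctan2(-accel[0], accel[2])    # tilt from gravity
    k = adaptive_gain(accel)
    return (1 - k) * pitch_gyro + k * pitch_acc
```

Under high dynamics the gain collapses toward zero and the filter coasts on the gyro, which is the qualitative behaviour the adaptive Kalman gain provides.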


Proceedings ArticleDOI
02 Mar 2013
TL;DR: In this article, a small-scale UAV capable of performing inspection tasks in enclosed industrial environments is presented, which relies solely on measurements from an on-board MEMS inertial measurement unit and a pair of cameras arranged in a classical stereo configuration.
Abstract: This work presents a small-scale Unmanned Aerial System (UAS) capable of performing inspection tasks in enclosed industrial environments. Vehicles with such capabilities have the potential to reduce human involvement in hazardous tasks and can minimize facility outage periods. The results presented generalize to UAS exploration tasks in almost any GPS-denied indoor environment. The contribution of this work is twofold. First, results from autonomous flights inside an industrial boiler of a power plant are presented. A lightweight, vision-aided inertial navigation system provides reliable state estimates under difficult environmental conditions typical for such sites. It relies solely on measurements from an on-board MEMS inertial measurement unit and a pair of cameras arranged in a classical stereo configuration. A model-predictive controller allows for efficient trajectory following and enables flight in close proximity to the boiler surface. As a second contribution, we highlight ongoing developments by displaying state estimation and structure recovery results acquired with an integrated visual/inertial sensor that will be employed on future aerial service robotic platforms. A tight integration in hardware facilitates spatial and temporal calibration of the different sensors and thus enables more accurate and robust ego-motion estimates. Comparison with ground truth obtained from a laser tracker shows that such a sensor can provide motion estimates with drift rates of only a few centimeters over the period of a typical flight.

186 citations


Proceedings ArticleDOI
01 Nov 2013
TL;DR: To the best of our knowledge, this is the first autonomously flying system with complete on-board processing that performs waypoint navigation with obstacle avoidance in geometrically unconstrained, complex indoor/outdoor environments.
Abstract: We introduce our new quadrotor platform for realizing autonomous navigation in unknown indoor/outdoor environments. Autonomous waypoint navigation, obstacle avoidance and flight control are implemented on-board. The system does not require a special environment, artificial markers or an external reference system. We developed a monolithic, mechanically damped perception unit which is equipped with a stereo camera pair, an Inertial Measurement Unit (IMU), two processor boards and an FPGA board. Stereo images are processed on the FPGA by the Semi-Global Matching algorithm. Keyframe-based stereo odometry is fused with IMU data, compensating for time delays that are induced by the vision pipeline. The system state estimate is used for control and on-board 3D mapping. An operator can set waypoints in the map, while the quadrotor autonomously plans its path avoiding obstacles. We show experiments with the quadrotor flying from inside a building to the outside and vice versa, traversing a window and a door respectively. A video of the experiments is part of this work. To the best of our knowledge, this is the first autonomously flying system with complete on-board processing that performs waypoint navigation with obstacle avoidance in geometrically unconstrained, complex indoor/outdoor environments.

176 citations


Proceedings ArticleDOI
06 May 2013
TL;DR: This paper presents an extended Kalman filter (EKF)-based method for visual-inertial odometry, which fuses the IMU measurements with observations of visual feature tracks provided by the camera, and is able to track the position of a mobile phone moving in an unknown environment with an error accumulation of approximately 0.8% of the distance travelled.
Abstract: All existing methods for vision-aided inertial navigation assume a camera with a global shutter, in which all the pixels in an image are captured simultaneously. However, the vast majority of consumer-grade cameras use rolling-shutter sensors, which capture each row of pixels at a slightly different time instant. The effects of the rolling shutter distortion when a camera is in motion can be very significant, and are not modelled by existing visual-inertial motion-tracking methods. In this paper we describe the first, to the best of our knowledge, method for vision-aided inertial navigation using rolling-shutter cameras. Specifically, we present an extended Kalman filter (EKF)-based method for visual-inertial odometry, which fuses the IMU measurements with observations of visual feature tracks provided by the camera. The key contribution of this work is a computationally tractable approach for taking into account the rolling-shutter effect, incurring only minimal approximations. The experimental results from the application of the method show that it is able to track, in real time, the position of a mobile phone moving in an unknown environment with an error accumulation of approximately 0.8% of the distance travelled, over hundreds of meters.

171 citations
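The rolling-shutter model described above assigns each image row its own capture time, so correcting a feature observation requires the camera state at that per-row instant. A sketch under a linear-readout assumption; both function names are hypothetical:

```python
def row_timestamp(t_frame, row, num_rows, readout_time):
    """Per-row capture time of a rolling-shutter image: each row is
    exposed slightly later than the previous one. t_frame is the time
    of the first row; linear readout is assumed."""
    return t_frame + (row / (num_rows - 1)) * readout_time

def interpolate_position(t, t0, p0, t1, p1):
    """Linearly interpolate an IMU-propagated position to a row's
    capture time, a first-order approximation of the pose used when
    predicting that row's feature observations."""
    a = (t - t0) / (t1 - t0)
    return [(1 - a) * x0 + a * x1 for x0, x1 in zip(p0, p1)]
```

The paper's contribution is keeping this per-row correction computationally tractable inside the EKF; the linear interpolation here is only the simplest illustration of the idea.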


Journal ArticleDOI
TL;DR: The efficiency of the presented monocular vision system is demonstrated comprehensively by comparing it to ground truth data provided by a tracking system and by using its pose estimates as control inputs to autonomous flights of a quadrotor.
Abstract: In this paper, we present an onboard monocular vision system for autonomous takeoff, hovering and landing of a Micro Aerial Vehicle (MAV). Since pose information with metric scale is critical for autonomous flight of a MAV, we present a novel solution to six degrees of freedom (DOF) pose estimation. It is based on a single image of a typical landing pad which consists of the letter "H" surrounded by a circle. A vision algorithm for robust and real-time landing pad recognition is implemented. Then the 5 DOF pose is estimated from the elliptic projection of the circle by using projective geometry. The remaining geometric ambiguity is resolved by incorporating the gravity vector estimated by the inertial measurement unit (IMU). The last degree of freedom, the yaw angle of the MAV, is estimated from the ellipse fitted to the letter "H". The efficiency of the presented vision system is demonstrated comprehensively by comparing it to ground truth data provided by a tracking system and by using its pose estimates as control inputs to autonomous flights of a quadrotor.

Journal ArticleDOI
TL;DR: An affirmative answer to the question of whether V-INSs can be used to sustain prolonged real-world GPS-denied flight is provided by presenting a V-INS that is validated through autonomous flight tests over prolonged closed-loop dynamic operation in both indoor and outdoor GPS-denied environments with two rotorcraft unmanned aircraft systems (UASs).
Abstract: GPS-denied closed-loop autonomous control of unstable Unmanned Aerial Vehicles (UAVs) such as rotorcraft using information from a monocular camera has been an open problem. Most proposed Vision-aided Inertial Navigation Systems (V-INSs) have been too computationally intensive or do not have sufficient integrity for closed-loop flight. We provide an affirmative answer to the question of whether V-INSs can be used to sustain prolonged real-world GPS-denied flight by presenting a V-INS that is validated through autonomous flight tests over prolonged closed-loop dynamic operation in both indoor and outdoor GPS-denied environments with two rotorcraft unmanned aircraft systems (UASs). The architecture efficiently combines visual feature information from a monocular camera with measurements from inertial sensors. Inertial measurements are used to predict frame-to-frame transition of online selected feature locations, and the difference between predicted and observed feature locations is used to bound the inertial measurement unit drift in real time, estimate its bias, and account for initial misalignment errors. A novel algorithm to manage a library of features online is presented that can add or remove features based on a measure of relative confidence in each feature location. The resulting V-INS is sufficiently efficient and reliable to enable real-time implementation on resource-constrained aerial vehicles. The presented algorithms are validated on multiple platforms in real-world conditions: through a 16-min flight test, including an autonomous landing, of a 66 kg rotorcraft UAV operating in an uncontrolled outdoor environment without using GPS and through a Micro-UAV operating in a cluttered, unmapped, and gusty indoor environment. © 2013 Wiley Periodicals, Inc.
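The confidence-based online feature-library management described above can be sketched as follows; the update, decay, and pruning constants are hypothetical placeholders, not the paper's algorithm:

```python
def update_feature_library(library, observed, max_size=50, min_conf=0.2):
    """Sketch of online feature-library management: raise confidence
    for re-observed features, decay the rest, drop low-confidence
    entries, and cap the library size. `library` maps feature id to
    a confidence in [0, 1]; all thresholds are illustrative."""
    for fid in list(library):
        if fid in observed:
            library[fid] = min(1.0, library[fid] + 0.1)   # reinforce
        else:
            library[fid] = library[fid] * 0.9             # decay
        if library[fid] < min_conf:
            del library[fid]                              # prune
    for fid in observed:
        library.setdefault(fid, 0.5)                      # admit new
    while len(library) > max_size:
        del library[min(library, key=library.get)]        # cap size
    return library
```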

Journal ArticleDOI
24 Jul 2013-Sensors
TL;DR: A method is introduced that compensates for the error terms of low-cost INS (MEMS grade) sensors through a complete analysis of Allan variance, wavelet de-noising and the selection of the level of decomposition for a suitable combination of these techniques.
Abstract: Advances in the development of micro-electromechanical systems (MEMS) have made possible the fabrication of cheap and small dimension accelerometers and gyroscopes, which are being used in many applications where the global positioning system (GPS) and the inertial navigation system (INS) integration is carried out, i.e., identifying track defects, terrestrial and pedestrian navigation, unmanned aerial vehicles (UAVs), stabilization of many platforms, etc. Although these MEMS sensors are low-cost, they present different errors, which degrade the accuracy of the navigation systems in a short period of time. Therefore, a suitable modeling of these errors is necessary in order to minimize them and, consequently, improve the system performance. In this work, the techniques currently most used to analyze the stochastic errors that affect these sensors are shown and compared: we examine in detail the autocorrelation, the Allan variance (AV) and the power spectral density (PSD) techniques. Subsequently, an analysis and modeling of the inertial sensors, which combines autoregressive (AR) filters and wavelet de-noising, is also achieved. Since a low-cost INS (MEMS grade) presents error sources with short-term (high-frequency) and long-term (low-frequency) components, we introduce a method that compensates for these error terms through a complete analysis of Allan variance, wavelet de-noising and the selection of the level of decomposition for a suitable combination of these techniques. Finally, in order to assess the stochastic models obtained with these techniques, the Extended Kalman Filter (EKF) of a loosely-coupled GPS/INS integration strategy is augmented with different states. Results show a comparison between the proposed method and the traditional sensor error models under GPS signal blockages using real data collected in urban roadways.
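The Allan variance (AV) mentioned above characterizes a sensor's stochastic errors as a function of averaging time tau. A minimal overlapping-estimator sketch for a single cluster size, using the standard definition AVAR(tau) = 0.5·E[(ȳ_{k+1} − ȳ_k)²]:

```python
import numpy as np

def allan_variance(rate, dt, m):
    """Overlapping Allan variance of a rate signal at cluster size m
    (averaging time tau = m*dt), computed from the integrated signal:
    AVAR(tau) = <(theta[k+2m] - 2*theta[k+m] + theta[k])^2> / (2*tau^2)."""
    theta = np.cumsum(rate) * dt                    # integrated signal
    n = len(theta)
    d = theta[2 * m:n] - 2 * theta[m:n - m] + theta[0:n - 2 * m]
    tau = m * dt
    return np.mean(d ** 2) / (2 * tau ** 2)
```

For white noise of variance sigma² per sample, AVAR falls as sigma²/m; different error processes (bias instability, random walk) produce different slopes on a log-log AV plot, which is how the paper identifies them.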

Book
02 Jan 2013
TL;DR: The importance and role of avionics and the avionic environment are highlighted, as are unmanned air vehicles and the use of commercial off-the-shelf (COTS) components.
Abstract: Foreword. Preface. Acknowledgements. 1: Introduction. 1.1. Importance and role of avionics. 1.2. The avionic environment. 1.3. Choice of units. 2: Displays and man-machine interaction. 2.1. Introduction. 2.2. Head up displays. 2.3. Helmet mounted displays. 2.4. Computer aided optical design. 2.5. Discussion of HUDs vs HMDs. 2.6. Head down displays. 2.7. Data fusion. 2.8. Intelligent displays management. 2.9. Displays technology. 2.10. Control and data entry. Further reading. 3: Aerodynamics and aircraft control. 3.1. Introduction. 3.2. Basic aerodynamics. 3.3. Aircraft stability. 3.4. Aircraft dynamics. 3.5. Longitudinal control and response. 3.6. Lateral control. 3.7. Powered flying controls. 3.8. Auto-stabilisation systems. Further reading. 4: Fly-by-wire flight control. 4.1. Introduction. 4.2. Fly-by-wire flight control features and advantages. 4.3. Control laws. 4.4. Redundancy and failure survival. 4.5. Digital implementation. 4.6. Fly-by-light flight control. Further reading. 5: Inertial sensors and attitude derivation. 5.1. Introduction. 5.2. Gyros and accelerometers. 5.3. Attitude derivation. Further reading. 6: Navigation systems. 6.1. Introduction and basic principles. 6.2. Inertial navigation. 6.3. Aided IN systems and Kalman filters. 6.4. Attitude and heading reference systems. 6.5. GPS - global positioning systems. 6.6. Terrain reference navigation. Further reading. 7: Air data and air data systems. 7.1. Introduction. 7.2. Air data information and its use. 7.3. Derivation of air data laws and relationships. 7.4. Air data sensors and computing. Further reading. 8: Autopilots and flight management systems. 8.1. Introduction. 8.2. Autopilots. 8.3. Flight management systems. Further reading. 9: Avionic systems integration. 9.1. Introduction and background. 9.2. Data bus systems. 9.3. Integrated modular avionics. 9.4. Commercial off-the-shelf (COTS). Further reading. 10: Unmanned air vehicles. 10.1. Importance of unmanned air vehicles. 10.2. UAV avionics. Further reading. Glossary of terms. List of symbols. List of abbreviations. Index.

Journal ArticleDOI
TL;DR: A method to measure stride-to-stride foot placement in unconstrained environments is developed, and whether it can accurately quantify gait parameters over long walking distances is tested.

Journal ArticleDOI
TL;DR: The final goal of this work is to realize an upgraded application-specific integrated circuit that controls the microelectromechanical systems (MEMS) sensor and integrates the ASIP, which will allow the MEMS gyro-plus-accelerometer sensor and the angular estimation system to be contained in a single package.
Abstract: This paper presents an application-specific integrated processor for an angular estimation system that works with 9-D inertial measurement units. The application-specific instruction-set processor (ASIP) was implemented on a field-programmable gate array and interfaced with a gyro-plus-accelerometer 6-D sensor and with a magnetic compass. Output data were recorded on a personal computer and also used to perform a live demo. During system modeling and design, it was chosen to represent angular position data with a quaternion and to use an extended Kalman filter as the sensor fusion algorithm. For this purpose, a novel two-stage filter was designed: The first stage uses accelerometer data, and the second one uses magnetic compass data for angular position correction. This allows flexibility, lower computational requirements, and robustness to magnetic field anomalies. The final goal of this work is to realize an upgraded application-specific integrated circuit that controls the microelectromechanical systems (MEMS) sensor and integrates the ASIP. This will allow the MEMS gyro-plus-accelerometer sensor and the angular estimation system to be contained in a single package; this system might optionally work with an external magnetic compass.
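Representing angular position with a quaternion, as this paper does, means the filter's prediction step integrates gyro rates on the unit-quaternion manifold. A minimal sketch of that propagation (Hamilton convention, constant rate over a step; not the paper's full two-stage EKF):

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def propagate(q, omega, dt):
    """Propagate an attitude quaternion with a body-rate measurement
    omega (rad/s), assuming the rate is constant over dt; this is the
    gyro-driven prediction step of a quaternion EKF."""
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return q
    axis = omega / np.linalg.norm(omega)
    dq = np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])
    out = quat_mult(q, dq)
    return out / np.linalg.norm(out)   # renormalize against drift
```

The two correction stages the paper describes would then nudge this predicted quaternion using accelerometer (gravity) and magnetometer (heading) observations.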

Journal ArticleDOI
TL;DR: This paper introduces an approach for the indoor localization of a mini UAV based on Ultra-WideBand technology, low cost IMU and vision based sensors, and an Extended Kalman Filter (EKF) is introduced as a possible technique to improve the localization.
Abstract: Indoor localization of mobile agents using wireless technologies is becoming very important in military and civil applications. This paper introduces an approach for the indoor localization of a mini UAV based on Ultra-WideBand technology, a low-cost IMU and vision-based sensors. In this work an Extended Kalman Filter (EKF) is introduced as a technique to improve the localization. The proposed approach uses a low-cost Inertial Measurement Unit (IMU) in the prediction step and integrates visual odometry for the detection of markers near the touchdown area. The ranging measurements reduce the errors of the inertial sensors due to the limited performance of the accelerometers and gyros. The obtained results show that an accuracy of 10 cm can be achieved.

Journal ArticleDOI
TL;DR: In this paper, an experimental platform was built to evaluate the accuracy of 50-Hz PPP displacement waveforms; high-rate PPP was found to produce absolute horizontal displacement waveforms at an accuracy of 2–4 mm and absolute vertical displacement waveforms at the sub-centimeter level within a short period of time.
Abstract: High-rate GPS has been widely used to construct displacement waveforms and to invert for source parameters of earthquakes. Almost all works on internal and external evaluation of high-rate GPS accuracy are based on GPS relative positioning. We build an experimental platform to externally evaluate the accuracy of 50-Hz PPP displacement waveforms. Since the shake table allows motion in any of six degrees of freedom, we install an inertial measurement unit (IMU) to measure the attitude of the platform and transform the IMU displacements into the GPS coordinate system. The experimental results have shown that high-rate PPP can produce absolute horizontal displacement waveforms at the accuracy of 2–4 mm and absolute vertical displacement waveforms at the sub-centimeter level of accuracy within a short period of time. The significance of the experiments indicates that high-rate PPP is capable of detecting absolute seismic displacement waveforms at the same high accuracy as GPS relative positioning techniques, but requires no fixed datum station. We have also found a small scaling error of IMU and a small time offset of misalignment between high-rate PPP and IMU displacement waveforms by comparing the amplitudes of and cross-correlating both the displacement waveforms.
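The time offset between the high-rate PPP and IMU displacement waveforms is found by cross-correlating them, as the abstract notes. A minimal sketch of that step (function name and sign convention are illustrative):

```python
import numpy as np

def estimate_offset(sig_a, sig_b, dt):
    """Estimate the time offset between two synchronously sampled
    displacement waveforms by locating the peak of their
    cross-correlation. A positive result means sig_b lags sig_a."""
    a = sig_a - np.mean(sig_a)                 # remove DC before correlating
    b = sig_b - np.mean(sig_b)
    xc = np.correlate(b, a, mode="full")
    lag = np.argmax(xc) - (len(a) - 1)         # convert index to signed lag
    return lag * dt
```

Comparing the waveform amplitudes at the aligned lag likewise exposes any scale error between the two sensors, which is how the paper detected the small IMU scaling error.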

Journal ArticleDOI
TL;DR: This tutorial outlines a simple yet effective approach for implementing a reasonably accurate pedestrian inertial tracking system using an error-state Kalman filter for zero-velocity updates (ZUPTs) and orientation estimation.
Abstract: Shoe-mounted inertial sensors offer a convenient way to track pedestrians in situations where other localization systems fail. This tutorial outlines a simple yet effective approach for implementing a reasonably accurate tracker. This Web extra presents the Matlab implementation and a few sample recordings for implementing the pedestrian inertial tracking system using an error-state Kalman filter for zero-velocity updates (ZUPTs) and orientation estimation.
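The core of the zero-velocity-update (ZUPT) approach is detecting stance phases and resetting the integrated velocity there, which confines drift to growth within a single stride. A heavily simplified 1-D sketch; the tutorial's actual detector and error-state Kalman filter are more elaborate, and the threshold here is a hypothetical constant:

```python
import numpy as np

def detect_stance(accel_norms, g=9.81, threshold=0.3):
    """Flag samples where the accelerometer magnitude stays close to
    1 g, indicating the foot is flat on the ground. Simple
    magnitude-based detector; practical detectors also use the gyro."""
    return np.abs(np.asarray(accel_norms) - g) < threshold

def integrate_with_zupt(accel, stance, dt):
    """Integrate forward acceleration into velocity, resetting the
    velocity to zero during detected stance phases - the ZUPT step
    that keeps integration drift bounded."""
    v = np.zeros(len(accel))
    for k in range(1, len(accel)):
        v[k] = 0.0 if stance[k] else v[k - 1] + accel[k] * dt
    return v
```

In the full error-state filter the zero-velocity pseudo-measurement also corrects position and attitude errors through the filter's cross-covariances, rather than only zeroing velocity.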

Journal ArticleDOI
TL;DR: In this article, an autonomous holonomic mobile robot is used as a platform to carry various NDE sensing systems for simultaneous and fast data collection, including ground penetrating radar arrays, acoustic/seismic arrays, electrical resistivity sensors, and video cameras.
Abstract: The condition of bridges is critical for the safety of the traveling public. Bridges deteriorate with time as a result of material aging, excessive loading, environmental effects, and inadequate maintenance. The current practice of nondestructive evaluation (NDE) of bridge decks cannot meet the increasing demands for highly efficient, cost-effective, and safety-guaranteed inspection and evaluation. In this paper, a mechatronic systems design for an autonomous robotic system for highly efficient bridge deck inspection and evaluation is presented. An autonomous holonomic mobile robot is used as a platform to carry various NDE sensing systems for simultaneous and fast data collection. The robot's NDE sensor suite includes ground penetrating radar arrays, acoustic/seismic arrays, electrical resistivity sensors, and video cameras. Besides the NDE sensors, the robot is also equipped with various onboard navigation sensors such as global positioning system (GPS), inertial measurement units (IMU), laser scanner, etc. An integration scheme is presented to fuse the measurements from the GPS, the IMU and the wheel encoders for high-accuracy robot localization. The performance of the robotic NDE system development is demonstrated through extensive testing experiments and field deployments.

Journal ArticleDOI
TL;DR: The achieved relative figures of merit using the collected data validate the reliability of the proposed methods for the desired applications and permit the potential application of the current study to camera-aided inertial navigation for positioning and personal assistance in future research.
Abstract: This paper presents a method for pedestrian activity classification and gait analysis based on the microelectromechanical-systems inertial measurement unit (IMU). The work targets two groups of applications, including the following: 1) human activity classification and 2) joint human activity and gait-phase classification. In the latter case, the gait phase is defined as a substate of a specific gait cycle, i.e., the states of the body between the stance and swing phases. We model the pedestrian motion with a continuous hidden Markov model (HMM) in which the output density functions are assumed to be Gaussian mixture models. For the joint activity and gait-phase classification, motivated by the cyclical nature of the IMU measurements, each individual activity is modeled by a “circular HMM.” For both the proposed classification methods, proper feature vectors are extracted from the IMU measurements. In this paper, we report the results of conducted experiments where the IMU was mounted on the humans' chests. This permits the potential application of the current study in camera-aided inertial navigation for positioning and personal assistance in future research works. Five classes of activity, including walking, running, going upstairs, going downstairs, and standing, are considered in the experiments. The performance of the proposed methods is illustrated in various ways, and as an objective measure, the confusion matrix is computed and reported. The achieved relative figures of merit using the collected data validate the reliability of the proposed methods for the desired applications.

Proceedings ArticleDOI
06 May 2013
TL;DR: This paper proposes a vision-based state estimation approach that does not drift when the vehicle remains stationary and shows indoor experimental results with performance benchmarking and illustrates the autonomous operation of the system in challenging indoor and outdoor environments.
Abstract: In this paper, we consider the development of a rotorcraft micro aerial vehicle (MAV) system capable of vision-based state estimation in complex environments. We pursue a systems solution for the hardware and software to enable autonomous flight with a small rotorcraft in complex indoor and outdoor environments using only onboard vision and inertial sensors. As rotorcraft frequently operate in hover or near-hover conditions, we propose a vision-based state estimation approach that does not drift when the vehicle remains stationary. The vision-based estimation approach combines the advantages of monocular vision (range, faster processing) with that of stereo vision (availability of scale and depth information), while overcoming several disadvantages of both. Specifically, our system relies on fisheye camera images at 25 Hz and imagery from a second camera at a much lower frequency for metric scale initialization and failure recovery. This estimate is fused with IMU information to yield state estimates at 100 Hz for feedback control. We show indoor experimental results with performance benchmarking and illustrate the autonomous operation of the system in challenging indoor and outdoor environments.

Journal ArticleDOI
TL;DR: The experimental results show that the novel indoor localization and monitoring system is able to track a person indoors in both walking and running cases, and to monitor body movement throughout the experiment.
Abstract: This paper presents a novel indoor localization and monitoring system based on inertial sensors for emergency responders. The system utilizes acceleration, angular rate, and magnetic field sensors and consists of three parts. The first part is a modified Kalman filter that fuses the sensor data while detecting and minimizing magnetic field disturbances, so as to provide a long-term stable orientation solution. The second part is zero-velocity updating, which resets the velocity within the still phase to deliver accurate position information. The last part of the system is body movement monitoring, which is achieved by calculating the relative position of each body segment based on the transformation of each segment's coordinate frame. The experimental results show that the system is able to track a person indoors in both walking and running cases, and to monitor body movement throughout the experiment.
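The zero-velocity-updating idea in the second part can be sketched on synthetic 1-D data: naive integration of biased acceleration drifts, while resetting the velocity whenever a window of samples looks still keeps the estimate bounded. The signal shapes and thresholds here are illustrative assumptions, not the paper's actual detector:

```python
import numpy as np

fs = 100.0                      # assumed IMU sample rate (Hz)
dt = 1.0 / fs
t = np.arange(0.0, 4.0, dt)

# Synthetic 1-D forward acceleration: bursts during the swing phase, near zero
# during stance, plus a constant 0.05 m/s^2 bias that makes naive integration drift.
swing = np.sin(2 * np.pi * 1.0 * t) > 0.3
acc = np.where(swing, 3.0 * np.sin(2 * np.pi * 2.0 * t), 0.0) + 0.05

def zupt_integrate(acc, dt, still_thresh=0.2, win=10):
    """Integrate acceleration, resetting velocity to zero whenever a sliding
    window of samples stays below a stillness threshold (detected stance)."""
    v = np.zeros_like(acc)
    for i in range(1, len(acc)):
        v[i] = v[i - 1] + acc[i] * dt
        if np.all(np.abs(acc[max(0, i - win):i + 1]) < still_thresh):
            v[i] = 0.0  # zero-velocity update
    return v

v_zupt = zupt_integrate(acc, dt)
v_naive = np.cumsum(acc) * dt   # plain integration accumulates the bias
print(abs(v_naive[-1]), abs(v_zupt[-1]))
```

The naive integral ends up with a velocity error of roughly bias × time, whereas the ZUPT-corrected velocity returns to zero at every detected stance phase.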

Proceedings ArticleDOI
06 May 2013
TL;DR: This work proposes an online approach for estimating the time offset between the data obtained from different sensors in extended Kalman filter (EKF)-based methods, and demonstrates that the proposed approach yields high-precision, consistent estimates in scenarios involving both constant and time-varying offsets.
Abstract: When measurements from multiple sensors are combined for real-time motion estimation, the time instant at which each measurement was recorded must be precisely known. In practice, however, the timestamps of each sensor's measurements are typically affected by a delay, which is different for each sensor. This gives rise to a temporal misalignment (i.e., a time offset) between the sensors' data streams. In this work, we propose an online approach for estimating the time offset between the data obtained from different sensors. Specifically, we focus on the problem of motion estimation using visual and inertial sensors in extended Kalman filter (EKF)-based methods. The key idea proposed here is to explicitly include the time offset between the camera and IMU in the EKF state vector, and estimate it online along with all other variables of interest (the IMU pose, the camera-to-IMU calibration, etc.). Our proposed approach is general, and can be employed in several classes of estimation problems, such as motion estimation based on mapped features, EKF-based SLAM, or visual-inertial odometry. Our simulation and experimental results demonstrate that the proposed approach yields high-precision, consistent estimates in scenarios involving both constant and time-varying offsets.
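The core idea, keeping the time offset as an EKF state and updating it from camera residuals, can be sketched in one dimension. The trajectory, noise levels, and 50 ms offset below are invented for illustration; a real implementation would also propagate the full IMU state:

```python
import numpy as np

# One-dimensional sketch: the camera observes a delayed copy of the position,
# and the time offset td is an EKF state updated from the camera residual.
def x_true(t):  # stand-in for the IMU-propagated position
    return np.sin(t)

def v_true(t):  # its time derivative, used for the measurement Jacobian
    return np.cos(t)

true_td = 0.05                    # camera timestamps lag the IMU clock by 50 ms
rng = np.random.default_rng(0)

td_hat, P = 0.0, 0.1 ** 2         # time-offset state and its variance
R = 0.01 ** 2                     # camera measurement noise variance
for t in np.arange(0.1, 20.0, 0.1):
    z = x_true(t - true_td) + rng.normal(0.0, 0.01)  # delayed camera measurement
    h = x_true(t - td_hat)        # predicted measurement at the current estimate
    H = -v_true(t - td_hat)       # dh/d(td): shifting td shifts the sample point
    S = H * P * H + R
    K = P * H / S
    td_hat += K * (z - h)         # standard scalar EKF update
    P *= 1.0 - K * H

print(f"estimated offset: {td_hat:.3f} s")
```

Note that the measurement Jacobian with respect to the offset is (minus) the velocity, so the offset is observable only while the platform is moving, which matches the intuition that a stationary camera cannot reveal a timing error.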

Proceedings ArticleDOI
Jiuchao Qian, Jiabin Ma, Rendong Ying, Peilin Liu, Ling Pei
01 Oct 2013
TL;DR: An improved indoor localization method based on smartphone inertial sensors is presented, which can achieve significant performance improvements in terms of efficiency, accuracy and reliability.
Abstract: In this paper, an improved indoor localization method based on smartphone inertial sensors is presented. Pedestrian dead reckoning (PDR), which determines the relative location change of a pedestrian without additional infrastructure support, is combined with a floor plan for pedestrian positioning in our work. To address the challenges of low sampling frequency and limited processing power in smartphones, reliable and efficient PDR algorithms have been proposed. A robust step detection technique dispenses with preprocessing of the raw signal and reduces the computational load. Given that the precision of stride length estimation is influenced by different pedestrians and motion modes, an adaptive stride length estimation algorithm based on motion mode classification is developed. Heading estimation is carried out by applying principal component analysis (PCA) to acceleration measurements projected onto the global horizontal plane, which is independent of the orientation of the smartphone. In addition, to eliminate the sensor drift caused by inaccurate distance and direction estimates, a particle filter is introduced to correct the drift and guarantee localization accuracy. Extensive field tests have been conducted in a laboratory building to verify the performance of the proposed algorithm. In the tests, a pedestrian held a smartphone at an arbitrary orientation. Test results show that the proposed algorithm achieves significant performance improvements in terms of efficiency, accuracy, and reliability.
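Two of the PDR building blocks above, step detection without preprocessing and stride-length estimation, can be sketched on synthetic data. The detector below is simple peak picking with a refractory period, and the stride model is one common choice (the Weinberg model); both are illustrative assumptions, since the paper's adaptive, motion-mode-dependent estimator is more elaborate:

```python
import numpy as np

fs = 50.0                       # assumed smartphone sampling rate (Hz)
t = np.arange(0.0, 10.0, 1.0 / fs)
# Synthetic acceleration magnitude for a 2 steps/s gait, plus sensor noise.
rng = np.random.default_rng(1)
acc = 9.81 + 2.0 * np.sin(2 * np.pi * 2.0 * t) + rng.normal(0.0, 0.2, t.size)

def detect_steps(acc, fs, thresh=10.5, min_gap=0.3):
    """Peak picking on the raw magnitude with a refractory period, i.e. no
    pre-filtering of the signal before detection."""
    steps, last = [], -np.inf
    for i in range(1, len(acc) - 1):
        if acc[i] > thresh and acc[i] >= acc[i - 1] and acc[i] >= acc[i + 1]:
            if i / fs - last >= min_gap:
                steps.append(i)
                last = i / fs
    return steps

steps = detect_steps(acc, fs)

# Weinberg stride-length model: L = K * (a_max - a_min) ** 0.25,
# with the gain K calibrated per user (0.45 is an assumed value).
K = 0.45
stride = K * (acc.max() - acc.min()) ** 0.25
print(len(steps), round(stride, 2))
```

The refractory period (here 0.3 s) is what keeps noisy local maxima within one step from being double-counted, which is why no low-pass filtering of the raw signal is needed.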

01 Apr 2013
TL;DR: The significance of the experiments indicates that high-rate PPP is capable of detecting absolute seismic displacement waveforms at the same high accuracy as GPS relative positioning techniques, but requires no fixed datum station.
Abstract: High-rate GPS has been widely used to construct displacement waveforms and to invert for source parameters of earthquakes. Almost all work on the internal and external evaluation of high-rate GPS accuracy is based on GPS relative positioning. We build an experimental platform to externally evaluate the accuracy of 50-Hz PPP displacement waveforms. Since the shake table allows motion in any of six degrees of freedom, we install an inertial measurement unit (IMU) to measure the attitude of the platform and transform the IMU displacements into the GPS coordinate system. The experimental results have shown that high-rate PPP can produce absolute horizontal displacement waveforms at an accuracy of 2–4 mm and absolute vertical displacement waveforms at the sub-centimeter level of accuracy within a short period of time. The significance of the experiments indicates that high-rate PPP is capable of detecting absolute seismic displacement waveforms at the same high accuracy as GPS relative positioning techniques, but requires no fixed datum station. We have also found a small scaling error in the IMU and a small time offset (misalignment) between the high-rate PPP and IMU displacement waveforms by comparing the amplitudes of the two displacement waveforms and cross-correlating them.
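The cross-correlation step used to find the time offset between the two displacement waveforms can be sketched on synthetic data; the decaying-sinusoid motion, noise levels, and 120 ms offset below are invented for illustration:

```python
import numpy as np

fs = 50.0                                   # 50 Hz, matching the high-rate PPP
t = np.arange(0.0, 20.0, 1.0 / fs)
disp = np.sin(2 * np.pi * 0.5 * t) * np.exp(-0.05 * t)  # synthetic table motion

true_lag = 0.12                             # assumed: IMU stream lags PPP by 120 ms
rng = np.random.default_rng(2)
ppp = disp + rng.normal(0.0, 0.002, t.size)
imu = np.interp(t - true_lag, t, disp) + rng.normal(0.0, 0.002, t.size)

# Cross-correlate the demeaned waveforms and read the offset off the peak lag.
xc = np.correlate(ppp - ppp.mean(), imu - imu.mean(), mode="full")
lag = ((len(t) - 1) - np.argmax(xc)) / fs   # positive: imu lags ppp
print(f"recovered offset: {lag:.3f} s")
```

The resolution of this estimate is one sample period (20 ms at 50 Hz); sub-sample offsets would require interpolating around the correlation peak.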

Journal ArticleDOI
TL;DR: A wireless micro inertial measurement unit (IMU) that meets the design prerequisites of a space-saving design and eliminates the need for hard-wired data communication, while still being competitive with state-of-the-art commercially available MEMS IMUs.
Abstract: In this paper, we present a wireless micro inertial measurement unit (IMU) with the smallest volume and weight currently available. With a size of 22 mm × 14 mm × 4 mm (1.2 cm3), this IMU provides full control over the data of a three-axis accelerometer, a three-axis gyroscope, and a three-axis magnetometer. It meets the design prerequisites of a space-saving design and eliminates the need for hard-wired data communication, while still being competitive with state-of-the-art commercially available MEMS IMUs. A CC430 microcontroller sends the collected raw data to a base station wirelessly with a maximum sensor sample rate of 640 samples/s. IMU performance is optimized by moving data post-processing to the base station. This development offers important features for portable applications with stringent size and weight requirements. Due to its small size, the IMU can be integrated into clothes or shoes for accurate position estimation in mobile applications and location-based services. We demonstrate the performance of the wireless micro IMU in a localization experiment where it is placed on a shoe for pedestrian tracking. With sensor data fusion based on a Kalman filter combined with zero-velocity updates, we can precisely track a person in an indoor area.

Journal ArticleDOI
04 Feb 2013 - Sensors
TL;DR: Measurements from a monocular vision system are fused with inertial/magnetic measurements from an Inertial Measurement Unit (IMU) rigidly connected to the camera to estimate the pose of the IMU/camera sensor moving relative to a rigid scene (ego-motion).
Abstract: In this paper, measurements from a monocular vision system are fused with inertial/magnetic measurements from an Inertial Measurement Unit (IMU) rigidly connected to the camera. Two Extended Kalman filters (EKFs) were developed to estimate the pose of the IMU/camera sensor moving relative to a rigid scene (ego-motion), based on a set of fiducials. The two filters were identical in their state equation and in the measurement equations of the inertial/magnetic sensors. The DLT-based EKF exploited visual estimates of the ego-motion using a variant of the Direct Linear Transformation (DLT) method; the error-driven EKF exploited pseudo-measurements based on the projection errors from measured two-dimensional point features to the corresponding three-dimensional fiducials. The two filters were analyzed off-line under different experimental conditions and compared to a purely IMU-based EKF used for estimating the orientation of the IMU/camera sensor. The DLT-based EKF was more accurate than the error-driven EKF, although less robust against the loss of visual features, and equivalent in computational complexity. Orientation root mean square errors (RMSEs) of 1° (1.5°), and position RMSEs of 3.5 mm (10 mm) were achieved in our experiments by the DLT-based EKF (error-driven EKF); by contrast, orientation RMSEs of 1.6° were achieved by the purely IMU-based EKF.
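The DLT step underlying the DLT-based EKF can be sketched: with six or more 3-D fiducials and their image projections, each correspondence yields two linear equations in the twelve entries of the projection matrix, which is then recovered as an SVD null-space solution. The camera intrinsics and fiducial layout below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
# Six (generically non-coplanar) 3-D fiducials in front of an invented camera.
X = rng.uniform(-1.0, 1.0, (6, 3)) + np.array([0.0, 0.0, 5.0])
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
P_true = K @ np.hstack([np.eye(3), np.array([[0.1], [0.2], [0.0]])])

Xh = np.hstack([X, np.ones((6, 1))])        # homogeneous fiducial coordinates
x = (P_true @ Xh.T).T
x = x[:, :2] / x[:, 2:]                     # observed 2-D image points

# DLT: each 2-D/3-D correspondence yields two linear equations in the twelve
# entries of the projection matrix; the solution is the null space of A.
A = []
for (u, v), Xi in zip(x, Xh):
    A.append(np.hstack([Xi, np.zeros(4), -u * Xi]))
    A.append(np.hstack([np.zeros(4), Xi, -v * Xi]))
P = np.linalg.svd(np.asarray(A))[2][-1].reshape(3, 4)  # last right singular vector

# Reproject the fiducials with the recovered matrix to check the solution.
xp = (P @ Xh.T).T
xp = xp[:, :2] / xp[:, 2:]
err = np.abs(xp - x).max()
print(err)
```

The recovered matrix is defined only up to scale and sign, which the perspective division makes irrelevant; with noisy image points the same least-squares solve would minimize the algebraic error rather than fit exactly.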

Book ChapterDOI
01 Jan 2013
TL;DR: This paper proposes an Observability-Constrained VINS (OC-VINS) methodology that explicitly adheres to the observability properties of the true system, and applies this approach to the Multi-State Constraint Kalman Filter (MSC-KF).
Abstract: In this paper, we study estimator inconsistency in Vision-aided Inertial Navigation Systems (VINS). We show that standard (linearized) estimation approaches, such as the Extended Kalman Filter (EKF), can fundamentally alter the system observability properties, in terms of the number and structure of the unobservable directions. This in turn allows the influx of spurious information, leading to inconsistency. To address this issue, we propose an Observability-Constrained VINS (OC-VINS) methodology that explicitly adheres to the observability properties of the true system. We apply our approach to the Multi-State Constraint Kalman Filter (MSC-KF), and provide both simulation and experimental validation of the effectiveness of our method for improving estimator consistency.

Proceedings ArticleDOI
01 Dec 2013
TL;DR: A novel sensor fusion approach for real-time full body tracking that succeeds in such difficult situations, and takes inspiration from previous tracking solutions, and combines a generative tracker and a discriminative tracker retrieving closest poses in a database.
Abstract: In recent years, the availability of inexpensive depth cameras, such as the Microsoft Kinect, has boosted research in monocular full body skeletal pose tracking. Unfortunately, existing trackers often fail to capture poses where a single camera provides insufficient data, such as non-frontal poses, and all other poses with body part occlusions. In this paper, we present a novel sensor fusion approach for real-time full body tracking that succeeds in such difficult situations. It takes inspiration from previous tracking solutions, and combines a generative tracker and a discriminative tracker that retrieves the closest poses from a database. In contrast to previous work, both trackers employ data from a small number of inexpensive body-worn inertial sensors. These sensors provide reliable and complementary information when the monocular depth information alone is not sufficient. We also contribute new algorithmic solutions for optimally fusing depth and inertial data in both trackers. One is a new visibility model to determine global body pose, occlusions, and usable depth correspondences, and to decide which data modality to use for discriminative tracking. We further contribute a new inertial-based pose retrieval and an adapted late-fusion step to calculate the final body pose.