Topic
Inertial measurement unit
About: Inertial measurement unit (IMU) is a research topic. Over its lifetime, 13,326 publications have been published within this topic, receiving 189,083 citations.
Papers
TL;DR: The aim of the present study was to test the accuracy of various instruments utilizing global navigation satellite systems (GNSS) in motion under forest canopies of varying densities, in order to assess the current state of the art in GNSS-based positioning under forest canopies.
Abstract: A harvester enables detailed roundwood data to be collected during harvesting operations by means of the measurement apparatus integrated into its felling head. These data can be used to improve the efficiency of wood procurement and to replace some field measurements, providing both less costly and more detailed ground truth for remote-sensing-based forest inventories. However, the positional accuracy of harvester-collected tree data is not currently sufficient to match the per-tree accuracy achieved with remote sensing data. The aim of the present study was to test the accuracy of various instruments utilizing global navigation satellite systems (GNSS) in motion under forest canopies of varying densities, in order to assess the current state of the art in GNSS-based positioning under forest canopies. Tests were conducted using several different combinations of GNSS and inertial measurement unit (IMU) mounted
98 citations
27 Mar 2008
TL;DR: In this article, a low-cost, low-power, small-form-factor solution to drift-free, high-resolution vertical positioning is demonstrated by fusing MEMS accelerometers with a MEMS barometric altimeter.
Abstract: We demonstrate a low-cost, low-power, and small-form-factor solution to drift-free, high-resolution vertical positioning by fusing MEMS accelerometers with a MEMS barometric altimeter. In this system, the highly responsive but drift-prone output of the MEMS accelerometers is stabilized by the barometric altimeter, and high-fidelity height tracking is achieved. Typical vertical human movements, such as walking up or down a staircase, can be tracked in real time with this system. The height-tracking performance is benchmarked against a reference system using a tactical-grade IMU, and an error analysis is performed.
97 citations
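The accelerometer/barometer fusion described above can be sketched as a simple complementary filter: double-integrated vertical acceleration supplies the fast dynamics, while the drift-free barometric altitude continuously pulls the height estimate back. The function name, filter gain, and sample rate below are illustrative assumptions, not the paper's actual design.

```python
def fuse_height(accel_z, baro_alt, dt=0.01, alpha=0.98):
    """Complementary filter: integrate vertical acceleration (m/s^2) for fast
    response, and blend in the barometric altitude (m) to cancel integration
    drift. Returns the fused height estimate per sample."""
    height = baro_alt[0]
    velocity = 0.0
    heights = []
    for a, b in zip(accel_z, baro_alt):
        velocity += a * dt                         # acceleration -> velocity
        height += velocity * dt                    # velocity -> height
        height = alpha * height + (1 - alpha) * b  # barometer corrects drift
        heights.append(height)
    return heights
```

Note that this minimal sketch corrects only the height, not the velocity; a full implementation (e.g. a Kalman filter) would also feed the barometric correction back into the velocity state.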
TL;DR: In this paper, the interleaved operation of a cold-atom gyroscope is described, where three atomic clouds are interrogated simultaneously in an atom interferometer featuring a sampling rate of 3.75 Hz and an interrogation time of 801 ms.
Abstract: Cold-atom inertial sensors target several applications in navigation, geoscience, and tests of fundamental physics. Achieving high sampling rates and high inertial sensitivities, obtained with long interrogation times, represents a challenge for these applications. We report on the interleaved operation of a cold-atom gyroscope, where three atomic clouds are interrogated simultaneously in an atom interferometer featuring a sampling rate of 3.75 Hz and an interrogation time of 801 ms. Interleaving improves the inertial sensitivity by efficiently averaging vibration noise and allows us to perform dynamic rotation measurements in a so far unexplored range. We demonstrate a stability of 3 × 10⁻¹⁰ rad s⁻¹, which competes with the best stability levels obtained with fiber-optic gyroscopes. Our work validates interleaving as a key concept for future atom-interferometry sensors probing time-varying signals, as in on-board navigation and gravity gradiometry, searches for dark matter, or gravitational wave detection.
97 citations
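The interleaving figures quoted above can be checked with a quick back-of-envelope calculation: launching a cloud every 1/3.75 s ≈ 267 ms while each interferometer sequence lasts 801 ms implies roughly three clouds in flight at once, consistent with the three simultaneously interrogated clouds.

```python
# Consistency check of the quoted interleaving parameters.
sampling_rate_hz = 3.75
interrogation_time_s = 0.801

cycle_time_s = 1 / sampling_rate_hz                    # time between launches
clouds_in_flight = interrogation_time_s / cycle_time_s # concurrent clouds
print(round(clouds_in_flight))  # -> 3
```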
TL;DR: The results showed that a single trunk-mounted IMU is suitable for estimating stance and stride duration during sprint running, providing the opportunity to collect information in the field without constraining or limiting athletes' and coaches' activities.
97 citations
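As a rough illustration of how stride duration might be derived from a single trunk-mounted IMU, one can detect foot-strike peaks in the vertical acceleration and take the time between successive peaks as the stride duration. The threshold, sample rate, and peak logic below are hypothetical assumptions; the study's actual algorithm is not reproduced here.

```python
def stride_durations(accel, fs=100.0, threshold=2.0):
    """Detect local maxima above `threshold` in a vertical-acceleration trace
    sampled at `fs` Hz, and return the durations (s) between successive peaks."""
    peaks = []
    for i in range(1, len(accel) - 1):
        if accel[i] > threshold and accel[i] >= accel[i - 1] and accel[i] > accel[i + 1]:
            peaks.append(i)
    return [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
```

On real sprint data, a minimum inter-peak distance and band-pass filtering would be needed to suppress spurious peaks within a single ground contact.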
TL;DR: The experimental results indicate that the proposed scheme achieves better recognition results as compared to the state of the art, and the feature-level fusion of RGB and inertial sensors provides the overall best performance for the proposed system.
Abstract: Automated recognition of human activities or actions has great significance as it incorporates wide-ranging applications, including surveillance, robotics, and personal health monitoring. Over the past few years, many computer vision-based methods have been developed for recognizing human actions from RGB and depth camera videos. These methods include space-time trajectory, motion encoding, key poses extraction, space-time occupancy patterns, depth motion maps, and skeleton joints. However, these camera-based approaches are affected by background clutter and illumination changes and applicable to a limited field of view only. Wearable inertial sensors provide a viable solution to these challenges but are subject to several limitations such as location and orientation sensitivity. Due to the complementary trait of the data obtained from the camera and inertial sensors, the utilization of multiple sensing modalities for accurate recognition of human actions is gradually increasing. This paper presents a viable multimodal feature-level fusion approach for robust human action recognition, which utilizes data from multiple sensors, including RGB camera, depth sensor, and wearable inertial sensors. We extracted the computationally efficient features from the data obtained from RGB-D video camera and inertial body sensors. These features include densely extracted histogram of oriented gradient (HOG) features from RGB/depth videos and statistical signal attributes from wearable sensors data. The proposed human action recognition (HAR) framework is tested on a publicly available multimodal human action dataset UTD-MHAD consisting of 27 different human actions. K-nearest neighbor and support vector machine classifiers are used for training and testing the proposed fusion model for HAR. The experimental results indicate that the proposed scheme achieves better recognition results as compared to the state of the art. The feature-level fusion of RGB and inertial sensors provides the overall best performance for the proposed system, with an accuracy rate of 97.6%.
97 citations
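Feature-level fusion of the kind described above amounts to concatenating each sample's per-modality feature vectors before classification. The toy sketch below uses made-up feature vectors and a minimal 1-nearest-neighbor classifier; the actual HOG features, UTD-MHAD data, and classifier settings are not reproduced here.

```python
def fuse(video_feats, inertial_feats):
    """Feature-level fusion: concatenate the two modalities' feature
    vectors sample by sample."""
    return [v + s for v, s in zip(video_feats, inertial_feats)]

def knn_predict(train_x, train_y, x):
    """1-nearest-neighbor classification by squared Euclidean distance."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(zip(train_x, train_y), key=lambda t: dist(t[0], x))[1]

# Toy usage: two training samples, one per (hypothetical) action class.
train_x = fuse([[0.0, 0.1], [1.0, 0.9]],  # e.g. HOG-like video features
               [[0.0], [1.0]])            # e.g. inertial signal statistics
train_y = ["wave", "jump"]
print(knn_predict(train_x, train_y, [0.9, 1.0, 0.8]))  # -> jump
```

Fusing before classification (rather than voting over per-modality classifiers) lets the distance metric weigh evidence from both modalities jointly, which is the trait the paper attributes its best accuracy to.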