Topic
Inertial measurement unit
About: Inertial measurement unit is a research topic. Over its lifetime, 13,326 publications have been published within this topic, receiving 189,083 citations. The topic is also known as: IMU.
Papers published on a yearly basis
Papers
10 Mar 2008
TL;DR: In this article, a method and arrangement for determining inattentiveness of an operator of a vehicle includes determining the position of the vehicle, analyzing the determined position of the vehicle relative to a map, monitoring the motion of the vehicle to determine whether it is deviating from the map, and providing a warning when the monitored motion of the vehicle deviates from normal motion or operation.
Abstract: Method and arrangement for determining inattentiveness of an operator of a vehicle includes determining the position of the vehicle, analyzing the determined position of the vehicle relative to a map, monitoring motion of the vehicle relative to the map to determine whether the vehicle is deviating from the map, and providing a warning when the monitored motion of the vehicle deviates from normal motion or operation of the vehicle. The position of the vehicle may be determined using a satellite-based positioning system and an inertial measurement unit (IMU). The IMU may be a MEMS-packaged IMU integrated with the satellite-based positioning system.
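The map-deviation check described above can be sketched as a cross-track distance test against a mapped route polyline. This is only an illustrative toy (the route representation, the 5 m threshold, and the flat-earth metric are assumptions, not the patent's actual method):

```python
import math

def cross_track_distance(p, a, b):
    """Perpendicular distance (metres) from point p to segment a-b, flat-earth."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:
        return math.hypot(px - ax, py - ay)
    # Projection parameter clamped to the segment endpoints
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def deviation_warning(position, route, threshold_m=5.0):
    """Warn when the fused GPS/IMU position strays beyond threshold_m from the route."""
    d = min(cross_track_distance(position, route[i], route[i + 1])
            for i in range(len(route) - 1))
    return d > threshold_m

route = [(0.0, 0.0), (100.0, 0.0)]            # straight mapped road segment
print(deviation_warning((50.0, 2.0), route))  # on the road -> no warning
print(deviation_warning((50.0, 8.0), route))  # drifting off -> warning
```

In a real system the position fed into such a check would come from the GPS/IMU fusion filter rather than raw fixes, so momentary satellite dropouts do not trigger false warnings.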
50 citations
TL;DR: Evaluates and compares linear acceleration trajectories obtained from two different 3D accelerometers and derived from Vicon position data for an upper-limb "reach & grasp" task, finding good correspondence between the three measurement systems.
50 citations
01 Dec 2009
TL;DR: A new method based on partial redundancies is introduced to formalize the determination of the optimal geometry of multi-IMU systems, and shows that, when dealing with IMU triads, the optimality of such systems is independent of the geometry between them.
Abstract: Although experimental results have demonstrated that redundant MEMS-IMUs integrated with GPS are an efficient way to improve navigation performance, the precise relationship between the number of sensors employed and the accuracy enhancement remains unclear. This article aims at demonstrating, with the help of simulations, that multiple MEMS-IMU systems can be designed according to specifications. This makes it possible to better define the relationship between the number of sensors employed and the accuracy improvement, as well as to ascertain the precise number of sensors needed to fulfill the system's requirements, which proves highly helpful in designing navigation systems for applications that require a specific precision. This article also aims at demonstrating the impact of sensor orientation on system performance. To achieve this, a new method based on partial redundancies is introduced to formalize the determination of the optimal geometry of multi-IMU systems. It shows that, when dealing with IMU triads, the optimality of such systems is independent of the geometry between them. This result has important practical implications, since it demonstrates that the complicated geometries traditionally employed in such systems can be avoided. It also proves that navigation performance obtained by simulations with a certain number of sensors is valid independently of the relative orientation of the sensors.
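The geometry-independence claim for IMU triads can be illustrated with a small least-squares argument: in the stacked measurement model each orthogonal triad contributes RᵀR = I to the information matrix HᵀH, so the relative rotation between triads cancels out. A hedged sketch (the rotation helper and the chosen angles are illustrative, not taken from the paper):

```python
import numpy as np

def rotation(axis, angle):
    """Rotation matrix about a unit axis by angle (Rodrigues' formula)."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * K @ K

def information_matrix(triad_rotations):
    """H^T H for k stacked orthogonal triads; each triad contributes R^T R = I."""
    H = np.vstack(triad_rotations)  # 3k x 3 design matrix of sensitive axes
    return H.T @ H

# Two triads: one aligned with the body frame, one arbitrarily rotated.
R1 = np.eye(3)
R2 = rotation([1.0, 1.0, 0.0], 0.7)
info = information_matrix([R1, R2])
print(np.allclose(info, 2 * np.eye(3)))  # True: inter-triad geometry drops out
```

Since the least-squares covariance is proportional to (HᵀH)⁻¹ = I/k for k triads, the estimation accuracy depends only on the number of triads, matching the abstract's conclusion that complicated mounting geometries can be avoided.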
50 citations
TL;DR: The test in the real tunnel case shows that in weak-environmental-feature areas where LiDAR-SLAM can barely work, the assistance of the odometer in the pre-integration is critical and can effectively reduce the positioning drift along the forward direction and maintain SLAM in the short term.
Abstract: In this paper, we propose a multi-sensor integrated navigation system composed of GNSS (global navigation satellite system), IMU (inertial measurement unit), odometer (ODO), and LiDAR (light detection and ranging)-SLAM (simultaneous localization and mapping). The dead-reckoning results are obtained using IMU/ODO in the front-end. Graph optimization is used in the back-end to fuse the GNSS position, the IMU/ODO pre-integration results, and the relative position and relative attitude from LiDAR-SLAM to obtain the final navigation results. The odometer information is introduced into the pre-integration algorithm to mitigate the large drift rate of the IMU. The sliding-window method is also adopted to avoid a growing number of parameters in the graph optimization. Land vehicle tests were conducted in both open-sky areas and tunnel cases. The tests showed that the proposed navigation system can effectively improve the accuracy and robustness of navigation. In the navigation drift evaluation with simulated two-minute GNSS outages, compared to the conventional GNSS/INS (inertial navigation system)/ODO integration, the root mean square (RMS) of the maximum position drift errors during outages in the proposed navigation system was reduced by 62.8%, 72.3%, and 52.1% along the north, east, and height directions, respectively. Moreover, the yaw error was reduced by 62.1%. Furthermore, compared to the GNSS/IMU/LiDAR-SLAM integrated navigation system, the assistance of the odometer and the non-holonomic constraint reduced the vertical error by 72.3%. The test in the real tunnel case shows that in weak-environmental-feature areas where LiDAR-SLAM can barely work, the assistance of the odometer in the pre-integration is critical and can effectively reduce the positioning drift along the forward direction and maintain SLAM in the short term.
Therefore, the proposed GNSS/IMU/ODO/LiDAR-SLAM integrated navigation system can effectively fuse the information from multiple sources to maintain the SLAM process and significantly mitigate navigation error, especially in harsh areas where the GNSS signal is severely degraded and environmental features are insufficient for LiDAR-SLAM.
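The odometer's role in the pre-integration can be illustrated with a toy 2D dead-reckoning loop: the gyro yaw rate is integrated once, while the odometer supplies forward speed directly, so the accelerometer's double integration (and its quadratic drift) is bypassed. This is a minimal sketch under those assumptions, not the paper's actual graph-optimization back-end:

```python
import math

def dead_reckon(gyro_z, odo_speed, dt, x=0.0, y=0.0, yaw=0.0):
    """2D IMU/odometer dead reckoning.

    gyro_z: yaw-rate samples (rad/s); odo_speed: forward-speed samples (m/s).
    The gyro is integrated once for heading, and the odometer constrains
    forward motion, avoiding accelerometer double integration.
    """
    for w, v in zip(gyro_z, odo_speed):
        yaw += w * dt                   # single integration of the yaw rate
        x += v * math.cos(yaw) * dt     # odometer speed projected onto heading
        y += v * math.sin(yaw) * dt
    return x, y, yaw

# Drive straight at 10 m/s for 10 s (100 samples at 10 Hz, zero yaw rate).
n = 100
x, y, yaw = dead_reckon([0.0] * n, [10.0] * n, 0.1)
print(round(x, 6), round(y, 6))  # -> 100.0 0.0
```

With a biased gyro, the heading error grows linearly and the position error roughly quadratically, which is why the full system also anchors the trajectory with GNSS and LiDAR-SLAM constraints in the graph.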
50 citations
03 Dec 2002
TL;DR: An algorithm is presented that computes optimal vehicle motion estimates by considering all of the measurements from a camera, rate gyro, and accelerometer simultaneously, and shows that using image and inertial data together can produce highly accurate estimates even when the results produced by each modality alone are very poor.
Abstract: Cameras and inertial sensors are good candidates to be deployed together for autonomous vehicle motion estimation, since each can be used to resolve the ambiguities in the estimated motion that result from using the other modality alone. We present an algorithm that computes optimal vehicle motion estimates by considering all of the measurements from a camera, rate gyro, and accelerometer simultaneously. Such optimal estimates are useful in their own right, and as a gold standard for the comparison of online algorithms. By comparing the motions estimated using visual and inertial measurements, visual measurements only, and inertial measurements only against ground truth, we show that using image and inertial data together can produce highly accurate estimates even when the results produced by each modality alone are very poor. Our test datasets include both conventional and omnidirectional image sequences, and an image sequence with a high percentage of missing data.
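One building block of such camera/inertial fusion is propagating an attitude estimate from the rate-gyro measurements between image frames. Below is a first-order quaternion integrator, offered only as an illustrative sketch of that step (the paper's batch estimator jointly optimizes over all camera and inertial measurements and is considerably more involved):

```python
import numpy as np

def integrate_gyro(quat, omega, dt):
    """One first-order quaternion update from a body-frame gyro sample.

    quat = [w, x, y, z] (unit quaternion); omega = angular rate (rad/s).
    Uses q_dot = 0.5 * Omega(omega) @ q, then renormalizes.
    """
    wx, wy, wz = omega
    Omega = 0.5 * np.array([[0.0, -wx, -wy, -wz],
                            [wx,  0.0,  wz, -wy],
                            [wy, -wz,  0.0,  wx],
                            [wz,  wy, -wx,  0.0]])
    q = quat + Omega @ quat * dt
    return q / np.linalg.norm(q)   # keep the quaternion on the unit sphere

# Rotate about the body z-axis at 90 deg/s for one second in small steps.
q = np.array([1.0, 0.0, 0.0, 0.0])
rate = np.array([0.0, 0.0, np.pi / 2])
for _ in range(1000):
    q = integrate_gyro(q, rate, 0.001)
print(np.round(q, 3))  # ~ [0.707, 0, 0, 0.707]: a 90-degree yaw rotation
```

Between frames the gyro carries the orientation forward at high rate; the camera measurements then correct the accumulated drift, which is exactly the complementarity the abstract exploits.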
50 citations