Proceedings ArticleDOI

Weather Influence and Classification with Automotive Lidar Sensors

TL;DR: An in-depth analysis of automotive lidar performance under harsh weather conditions, i.e., heavy rain and dense fog, together with a novel approach to detect and classify rain or fog with lidar sensors only, achieving a mean intersection over union of 97.14%.
Abstract: Lidar sensors are often used in mobile robots and autonomous vehicles to complement camera, radar and ultrasonic sensors for environment perception. Typically, perception algorithms are trained only to detect moving and static objects and to estimate the ground, but intentionally ignore weather effects to reduce false detections. In this work, we present an in-depth analysis of automotive lidar performance under harsh weather conditions, i.e. heavy rain and dense fog. An extensive data set has been recorded for various fog and rain conditions, which is the basis for the conducted in-depth analysis of the point cloud under changing environmental conditions. In addition, we introduce a novel approach to detect and classify rain or fog with lidar sensors only and achieve a mean intersection over union of 97.14% on a data set recorded in controlled environments. The analysis of weather influences on the performance of lidar sensors and the weather detection are important steps towards improving safety levels for autonomous driving in adverse weather conditions by providing reliable information to adapt vehicle behavior.
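The reported 97.14% refers to the mean intersection over union (IoU) across the weather classes. A minimal sketch of how such a metric is computed, assuming an illustrative class set (the exact labels are not given here):

```python
import numpy as np

def mean_iou(y_true, y_pred, classes=("clear", "rain", "fog")):
    """Mean intersection over union across weather classes.

    y_true, y_pred: 1-D arrays of per-sample class labels.
    The class set is an illustrative assumption, not taken from the paper.
    """
    ious = []
    for c in classes:
        intersection = np.sum((y_true == c) & (y_pred == c))
        union = np.sum((y_true == c) | (y_pred == c))
        if union > 0:                      # skip classes absent from both
            ious.append(intersection / union)
    return float(np.mean(ious))

# Agreement on two of three samples -> per-class IoUs of 0.5 and 0.5
print(mean_iou(np.array(["rain", "fog", "fog"]),
               np.array(["rain", "fog", "rain"])))  # 0.5
```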
Citations
Proceedings ArticleDOI
14 Jun 2020
TL;DR: In this paper, a multimodal dataset acquired in over 10,000 km of driving in northern Europe is presented, with 100k labels for lidar, camera, radar, and gated NIR sensors.
Abstract: The fusion of multimodal sensor streams, such as camera, lidar, and radar measurements, plays a critical role in object detection for autonomous vehicles, which base their decision making on these inputs. While existing methods exploit redundant information in good environmental conditions, they fail in adverse weather where the sensory streams can be asymmetrically distorted. These rare "edge-case" scenarios are not represented in available datasets, and existing fusion architectures are not designed to handle them. To address this challenge we present a novel multimodal dataset acquired in over 10,000 km of driving in northern Europe. Although this dataset is the first large multimodal dataset in adverse weather, with 100k labels for lidar, camera, radar, and gated NIR sensors, it does not facilitate training as extreme weather is rare. To this end, we present a deep fusion network for robust fusion without a large corpus of labeled training data covering all asymmetric distortions. Departing from proposal-level fusion, we propose a single-shot model that adaptively fuses features, driven by measurement entropy. We validate the proposed method, trained on clean data, on our extensive validation dataset. Code and data are available at https://github.com/princeton-computational-imaging/SeeingThroughFog.
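Entropy-driven fusion can be pictured as weighting each sensor's feature map by how informative its measurements are, so that a weather-distorted stream contributes less. The following is a simplified sketch of that idea, not the network published at the repository above; tensor shapes and the entropy estimate are assumptions:

```python
import torch

def entropy_weighted_fusion(features: dict) -> torch.Tensor:
    """Fuse per-sensor feature maps of shape (B, C, H, W), weighting each
    stream by the Shannon entropy of its activations (illustrative sketch:
    a fog-distorted stream tends to carry less information and should
    contribute less to the fused representation).
    """
    entropies = {}
    for name, f in features.items():
        p = torch.softmax(f.flatten(1), dim=1)                    # pseudo-distribution per sample
        entropies[name] = -(p * torch.log(p + 1e-8)).sum(dim=1)   # per-sample entropy, shape (B,)
    total = torch.stack(list(entropies.values())).sum(dim=0) + 1e-8
    return sum(f * (entropies[n] / total).view(-1, 1, 1, 1)
               for n, f in features.items())

# Usage with two hypothetical streams
fused = entropy_weighted_fusion({"lidar": torch.randn(2, 64, 32, 32),
                                 "camera": torch.randn(2, 64, 32, 32)})
```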

213 citations

Journal ArticleDOI
TL;DR: Discusses challenges in identifying adverse weather and other situations that make driving difficult and thus complicate the introduction of automated vehicles to the market.
Abstract: During automated driving in urban areas, decisions must be made while recognizing the surrounding environment using sensors such as camera, Light Detection and Ranging (LiDAR), millimeter-wave radar (MWR), and the global navigation satellite system (GNSS). The ability to drive under various environmental conditions is an important issue for automated driving on any road. In order to introduce automated vehicles into the market, the ability to evaluate various traffic conditions and navigate safely presents serious challenges. Another important challenge is the development of a robust recognition system that can account for adverse weather conditions. Sun glare, rain, fog, and snow are adverse weather conditions that can occur in the driving environment. This paper summarizes research focused on automated driving technologies and discusses challenges to identifying adverse weather and other situations that make driving difficult, thus complicating the introduction of automated vehicles to the market.

83 citations

Proceedings ArticleDOI
20 Jun 2021
TL;DR: In this article, a two-stage deep fusion detector is proposed, which first generates proposals from two sensors and then fuses region-wise features between multimodal sensor streams to improve final detection results.
Abstract: Vehicle detection with visual sensors like lidar and camera is one of the critical functions enabling autonomous driving. While they generate fine-grained point clouds or high-resolution images with rich information in good weather conditions, they fail in adverse weather (e.g., fog) where opaque particles distort lights and significantly reduce visibility. Thus, existing methods relying on lidar or camera experience significant performance degradation in rare but critical adverse weather conditions. To remedy this, we resort to exploiting complementary radar, which is less impacted by adverse weather and becomes prevalent on vehicles. In this paper, we present Multimodal Vehicle Detection Network (MVDNet), a two-stage deep fusion detector, which first generates proposals from two sensors and then fuses region-wise features between multimodal sensor streams to improve final detection results. To evaluate MVDNet, we create a procedurally generated training dataset based on the collected raw lidar and radar signals from the open-source Oxford Radar RobotCar dataset. We show that the proposed MVDNet surpasses other state-of-the-art methods in terms of Average Precision (AP), especially in adverse weather conditions. The code and data are available at https://github.com/qiank10/MVDNet.
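The second (fusion) stage can be pictured as pooling region-wise features from both sensors' feature maps for the shared proposals and combining them. A simplified illustration of that step, not the released MVDNet code, with shapes and names assumed:

```python
import torch
from torchvision.ops import roi_align

def fuse_region_features(lidar_map, radar_map, proposals, out_size=7):
    """Pool region-wise features from two sensor feature maps for the
    same proposals and concatenate them channel-wise (sketch).

    lidar_map, radar_map: (B, C, H, W) bird's-eye-view feature maps.
    proposals: list of per-image (N_i, 4) boxes given in feature-map
    coordinates (hence the default spatial_scale=1.0 of roi_align).
    """
    lidar_rois = roi_align(lidar_map, proposals, output_size=out_size)
    radar_rois = roi_align(radar_map, proposals, output_size=out_size)
    return torch.cat([lidar_rois, radar_rois], dim=1)  # (sum N_i, 2C, 7, 7)
```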

65 citations

Journal ArticleDOI
10 Feb 2020
TL;DR: This letter presents the first CNN-based approach to understand and filter out adverse weather effects in point cloud data using a large data set obtained in controlled weather environments and demonstrates a significant performance improvement of the method over state-of-the-art geometric filtering.
Abstract: Lidar sensors are frequently used in environment perception for autonomous vehicles and mobile robotics to complement camera, radar, and ultrasonic sensors. Adverse weather conditions significantly impact the performance of lidar-based scene understanding by causing undesired measurement points that in turn lead to missed detections and false positives. In heavy rain or dense fog, water drops can be misinterpreted as objects in front of the vehicle, bringing a mobile robot to a full stop. In this letter, we present the first CNN-based approach to understand and filter out such adverse weather effects in point cloud data. Using a large data set obtained in controlled weather environments, we demonstrate a significant performance improvement of our method over state-of-the-art geometric filtering. Data is available at https://github.com/rheinzler/PointCloudDeNoising.
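The geometric-filtering baselines referred to above typically exploit the fact that rain and fog returns are sparse and spatially isolated. A minimal sketch of one such classical filter (radius outlier removal; the parameter values are illustrative assumptions, not the paper's):

```python
import numpy as np
from scipy.spatial import cKDTree

def radius_outlier_removal(points, radius=0.5, min_neighbors=3):
    """Drop points with fewer than min_neighbors other points within
    a fixed radius. points: (N, 3) array of x, y, z coordinates.
    Isolated returns from rain drops or fog tend to be removed,
    while dense returns from solid objects are kept.
    """
    tree = cKDTree(points)
    # Neighbor counts include the query point itself, hence the +1.
    counts = tree.query_ball_point(points, r=radius, return_length=True)
    return points[counts >= min_neighbors + 1]
```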

61 citations

Journal ArticleDOI
15 Nov 2020 - Sensors
TL;DR: A fusion perspective is proposed that can fill gaps and increase the robustness of the perception system; the difference between the current and expected states of performance is illustrated using spider charts.
Abstract: Perception is a vital part of driving. Every year, the loss in visibility due to snow, fog, and rain causes serious accidents worldwide. Therefore, it is important to be aware of the impact of weather conditions on perception performance while driving on highways and in urban traffic in all weather conditions. The goal of this paper is to provide a survey of sensing technologies used to detect the surrounding environment and obstacles during driving maneuvers in different weather conditions. Firstly, some important historical milestones are presented. Secondly, the state-of-the-art automated driving applications (adaptive cruise control, pedestrian collision avoidance, etc.) are introduced with a focus on all-weather activity. Thirdly, the main sensor technologies (radar, lidar, ultrasonic, camera, and far-infrared) employed by automated driving applications are studied. Furthermore, the difference between the current and expected states of performance is determined by the use of spider charts. As a result, a fusion perspective is proposed that can fill gaps and increase the robustness of the perception system.

56 citations


Cites background from "Weather Influence and Classification with Automotive Lidar Sensors"

  • ...In [114], lidar performance in fog is studied in depth; the light is scattered by fog particles, which not only reduces the detection range dramatically, but also leads to false detections....


References
Journal ArticleDOI
TL;DR: A novel dataset captured from a VW station wagon for use in mobile robotics and autonomous driving research, using a variety of sensor modalities such as high-resolution color and grayscale stereo cameras and a high-precision GPS/IMU inertial navigation system.
Abstract: We present a novel dataset captured from a VW station wagon for use in mobile robotics and autonomous driving research. In total, we recorded 6 hours of traffic scenarios at 10-100 Hz using a variety of sensor modalities such as high-resolution color and grayscale stereo cameras, a Velodyne 3D laser scanner and a high-precision GPS/IMU inertial navigation system. The scenarios are diverse, capturing real-world traffic situations, and range from freeways through rural areas to inner-city scenes with many static and dynamic objects. Our data is calibrated, synchronized and timestamped, and we provide the rectified and raw image sequences. Our dataset also contains object labels in the form of 3D tracklets, and we provide online benchmarks for stereo, optical flow, object detection and other tasks. This paper describes our recording platform, the data format and the utilities that we provide.

7,153 citations


"Weather Influence and Classificatio..." refers background in this paper

  • ...[2], [3]), there exists a sizable amount of literature about the impact of harsh weather conditions such as fog, rain, dust or snow for lidar sensors [4]–[16]:...


Journal ArticleDOI
TL;DR: A novel nearest neighbor-based feature weighting algorithm, which learns a feature weighting vector by maximizing the expected leave-one-out classification accuracy with a regularization term, is proposed.
Abstract: Feature selection is of considerable importance in data mining and machine learning, especially for high dimensional data. In this paper, we propose a novel nearest neighbor-based feature weighting algorithm, which learns a feature weighting vector by maximizing the expected leave-one-out classification accuracy with a regularization term. The algorithm makes no parametric assumptions about the distribution of the data and scales naturally to multiclass problems. Experiments conducted on artificial and real data sets demonstrate that the proposed algorithm is largely insensitive to the increase in the number of irrelevant features and performs better than the state-of-the-art methods in most cases.
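The objective described above (expected leave-one-out accuracy of a soft nearest-neighbor rule, plus a regularizer) can be sketched as follows; the distance form, regularizer weight, and optimizer choice are assumptions, not the authors' exact formulation:

```python
import numpy as np
from scipy.optimize import minimize

def learn_feature_weights(X, y, lam=1.0):
    """Learn a non-negative feature weighting vector by maximizing the
    expected leave-one-out classification accuracy of a soft nearest
    neighbor classifier with an L2 regularization term (sketch).
    """
    n, d = X.shape
    same = (y[:, None] == y[None, :]).astype(float)
    np.fill_diagonal(same, 0.0)                     # leave-one-out
    diff = np.abs(X[:, None, :] - X[None, :, :])    # (n, n, d) per-feature gaps

    def neg_objective(w):
        w2 = w ** 2                                 # enforce non-negative weights
        dist = diff @ w2                            # weighted pairwise distances
        np.fill_diagonal(dist, np.inf)              # a point cannot pick itself
        k = np.exp(-dist)
        p = k / (k.sum(axis=1, keepdims=True) + 1e-12)  # soft-neighbor probabilities
        correct = (p * same).sum(axis=1)            # P(sample i classified correctly)
        return -(correct.sum() - lam * w2.sum())

    res = minimize(neg_objective, x0=np.ones(d), method="L-BFGS-B")
    return res.x ** 2                               # learned feature weights
```

Features whose learned weights shrink toward zero can then be dropped, which matches the down-selection use described in the citing paper.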

401 citations


"Weather Influence and Classificatio..." refers methods in this paper

  • ...The features are down-selected by neighborhood component analysis to find the parameters with the highest effect [25]....


Journal ArticleDOI
TL;DR: The vehicle path trajectory in these data sets contains several large- and small-scale loop closures, which should be useful for testing various state-of-the-art computer vision and simultaneous localization and mapping algorithms.
Abstract: In this paper we describe a data set collected by an autonomous ground vehicle testbed, based upon a modified Ford F-250 pickup truck. The vehicle is outfitted with a professional (Applanix POS-LV) and consumer (Xsens MTi-G) inertial measurement unit, a Velodyne three-dimensional lidar scanner, two push-broom forward-looking Riegl lidars, and a Point Grey Ladybug3 omnidirectional camera system. Here we present the time-registered data from these sensors mounted on the vehicle, collected while driving the vehicle around the Ford Research Campus and downtown Dearborn, MI, during November-December 2009. The vehicle path trajectory in these data sets contains several large- and small-scale loop closures, which should be useful for testing various state-of-the-art computer vision and simultaneous localization and mapping algorithms.

343 citations


"Weather Influence and Classificatio..." refers background in this paper

  • ...[2], [3]), there exists a sizable amount of literature about the impact of harsh weather conditions such as fog, rain, dust or snow for lidar sensors [4]–[16]:...


Journal ArticleDOI
TL;DR: In this paper, the authors proposed a fast transmission relationship based on an exact Mie theory calculation valid in the 0.69- to 1.55-µm spectral bands.
Abstract: The principal disadvantage of using free space optics (FSO) telecommunication systems is the disturbing role played by the atmosphere on light propagation and thus on the channel capacity, availability, and link reliability. The wavelength choice is currently a subject of disagreement among designers and users of FSO equipment. Generally this equipment operates in the visible and the near IR at 690, 780, 850, and 1550 nm. Several authors affirm that equipment working at 1550 nm presents less atmospheric attenuation in the presence of fog and thus better link availability. Others consider that for dense fog (visibility < 500 m), all wavelengths are attenuated in the same way (wavelength independence). Fog attenuation in the visible and IR regions is reviewed from an empirical and theoretical point of view. Laser system performance in the presence of fog (advection and convection) in the 0.4- to 15-µm spectral zone is investigated using FASCOD computation. A transmission gain of 42% for a lasercom system working at 780 nm is observed compared to the same system working at 1550 nm. This gain reaches 48% if the same system works at 690 nm. Finally, we propose a fast transmission relationship based on an exact Mie theory calculation valid in the 0.69- to 1.55-µm spectral bands. It enables us to predict fog attenuation according to visibility without using heavy computer codes.
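For context, the classical empirical visibility-to-attenuation relationship from this literature is the Kruse/Kim model sketched below. It illustrates the wavelength-independence point for dense fog, but it is not the exact fast transmission relationship fitted in the paper:

```python
def fog_extinction(visibility_km, wavelength_nm):
    """Kruse/Kim-style extinction coefficient (1/km) estimated from
    visibility. Multiply by 10 / ln(10) ~ 4.34 to convert to dB/km.
    """
    v = visibility_km
    # Kim model: wavelength-dependence exponent q as a function of visibility
    if v > 50:
        q = 1.6
    elif v > 6:
        q = 1.3
    elif v > 1:
        q = 0.16 * v + 0.34
    elif v > 0.5:
        q = v - 0.5
    else:
        q = 0.0   # dense fog: attenuation becomes wavelength independent
    return (3.91 / v) * (wavelength_nm / 550.0) ** (-q)

# Dense fog (300 m visibility): 780 nm and 1550 nm are attenuated equally
print(fog_extinction(0.3, 780), fog_extinction(0.3, 1550))
```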

323 citations


"Weather Influence and Classificatio..." refers background in this paper

  • ...Lidar sensor performance significantly depends on the environmental conditions as demonstrated in [4], [6], [10]– [14]....


  • ...[2], [3]), there exists a sizable amount of literature about the impact of harsh weather conditions such as fog, rain, dust or snow for lidar sensors [4]–[16]:...


Journal ArticleDOI
TL;DR: In this paper, the authors provide an overview of the different physical principles responsible for laser radar signal disturbance and theoretical investigations for estimating their influence, which are applied for signal generation in a newly developed laser radar target simulator providing the worldwide first HIL test capability for automotive laser radar systems.
Abstract: Laser radar (lidar) sensors provide outstanding angular resolution along with highly accurate range measurements, and thus they have been proposed as part of a high-performance perception system for advanced driver assistance functions. Based on optical signal transmission and reception, laser radar systems are influenced by weather phenomena. This work provides an overview of the different physical principles responsible for laser radar signal disturbance and theoretical investigations for estimating their influence. Finally, the transmission models are applied for signal generation in a newly developed laser radar target simulator, providing, to our knowledge, the worldwide first HIL (hardware-in-the-loop) test capability for automotive laser radar systems.
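The kind of transmission model such a simulator builds on can be sketched with the textbook single-return lidar power equation for a Lambertian target. This is a generic illustration, not the authors' simulator, and all parameter values are assumed:

```python
import math

def received_power(p_tx, reflectivity, aperture_m2, range_m, alpha_per_m):
    """Received power for a Lambertian target: 1/R^2 geometric falloff
    times two-way atmospheric extinction exp(-2 * alpha * R) (sketch).
    """
    geometric = (reflectivity / math.pi) * aperture_m2 / range_m ** 2
    return p_tx * geometric * math.exp(-2.0 * alpha_per_m * range_m)

# Clear air vs. dense fog at 50 m (illustrative extinction coefficients):
print(received_power(1.0, 0.1, 1e-3, 50.0, 0.0001),   # ~99% transmission
      received_power(1.0, 0.1, 1e-3, 50.0, 0.015))    # ~22% transmission
```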

237 citations
