High Temperature Ultrasonic Transducers: A Review

Affiliation

  • Ultrasound Research Institute, Kaunas University of Technology, Barsausko st. 59, LT-51368 Kaunas, Lithuania.
  • PMID: 34062979
  • PMCID: PMC8125082
  • DOI: 10.3390/s21093200

There are many fields, such as online monitoring of manufacturing processes, non-destructive testing in nuclear plants, or corrosion rate monitoring of steel pipes, in which measurements must be performed at elevated temperatures. For that, high temperature ultrasonic transducers are necessary. This paper presents a literature review of the main types of such transducers, the piezoelectric materials, the backings, and the bonding techniques for joining transducer elements that are suitable for high temperatures. The main focus is on ultrasonic transducers with piezoelectric elements suitable for operation at temperatures higher than that of most commercially available transducers, i.e., 150 °C. The main types of ultrasonic transducers discussed are transducers with thin protectors, which may serve as matching layers, transducers with high temperature delay lines, wedges, and waveguide type transducers. The piezoelectric materials suitable for high temperature applications, such as aluminum nitride, lithium niobate, gallium orthophosphate, bismuth titanate, oxyborate crystals, lead metaniobate, and other piezoceramics, are analyzed. Bonding techniques used for joining transducer elements, such as gluing, soldering, brazing, dry contact, and diffusion bonding, are discussed. Special attention is paid to efficient diffusion and thermo-sonic diffusion bonding techniques. Various types of backings necessary for improving the bandwidth and obtaining a short pulse response are described.

Keywords: aluminum nitride; backings; bismuth titanate; bonding technique; gallium orthophosphate; high temperature piezoelectric materials; high temperature ultrasonic transducers; lead metaniobate; lithium niobate; oxyborate crystals; waveguide transducers.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

  • Ultrasonic transducers for extreme conditions: (a) with a thin protective layer; …
  • Temperature distribution along the buffer rod made of glass ceramic ZERODUR.
  • Buffer rods with circular cross-section for reducing trailing waves; (a) tapered …
  • Ultrasonic transducer with a tapered buffer rod for process control.
  • Ultrasonic transducers with threaded buffer rods: (a) with longitudinal waves; …
  • Excitation of the shear-horizontal wave in a rectangular waveguide.
  • High temperature ultrasonic transducer with an integral concaved backing.
  • Ultrasonic transducer with a graphite bronze backing on the top.
  • Waveform of the ultrasonic pulse obtained in a pulse echo mode in a …



Review of Obstacle Detection by Ultrasonic and Laser Sensor for Automated Guided Vehicles

  • Conference paper
  • First Online: 23 September 2023


  • Mahesh G. Sonawane
  • Nishigandha S. Patel

Included in the following conference series:

  • Techno-Societal 2016, International Conference on Advanced Technologies for Societal Applications

In this twenty-first century, industries are using automated guided vehicles (AGVs) for material handling. Their wide usage is because AGVs can easily be customized per application requirements. One area of concern is obstacle detection, as AGVs operate in highly congested industrial environments. While working on the production floor, automated guided vehicles come across human operators and manufacturing equipment. In such events, collisions of automated guided vehicles with human operators or manufacturing equipment can occur. These accidents on the production floor can cause loss of human life or damage to manufacturing equipment. To avoid such circumstances, AGVs must be able to detect obstacles and avoid them. This paper reviews obstacle detection techniques using ultrasonic and laser sensors. This study will help researchers understand the advantages, limitations, applications and future scope of ultrasonic and laser sensors for obstacle detection.




Author information

Authors and Affiliations

MIT Art, Design and Technology University, Pune, India

Mahesh G. Sonawane & Nishigandha S. Patel


Corresponding author

Correspondence to Mahesh G. Sonawane.

Editor information

Editors and Affiliations

SVERI’s College of Engineering, Pandharpur, Pandharpur, Maharashtra, India

Prashant M. Pawar

Babruvahan P. Ronge

Ranjitsinha R. Gidde

Meenakshi M. Pawar

SVERI’s College of Engineering (Polytechnic), Pandharpur, Pandharpur, Maharashtra, India

Nitin D. Misal

Anupama S. Budhewar

SVERI’s College of Pharmacy, Pandharpur, Pandharpur, Maharashtra, India

Vrunal V. More

Amity University, Dubai, United Arab Emirates

P. Venkata Reddy


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Sonawane, M.G., Patel, N.S. (2024). Review of Obstacle Detection by Ultrasonic and Laser Sensor for Automated Guided Vehicles. In: Pawar, P.M., et al. Techno-societal 2022. ICATSA 2022. Springer, Cham. https://doi.org/10.1007/978-3-031-34644-6_101


DOI: https://doi.org/10.1007/978-3-031-34644-6_101

Published: 23 September 2023

Publisher Name: Springer, Cham

Print ISBN: 978-3-031-34643-9

Online ISBN: 978-3-031-34644-6

eBook Packages: Engineering, Engineering (R0)



An Overview of Autonomous Vehicles Sensors and Their Vulnerability to Weather Conditions

Jorge Vargas, Suleiman Alsweiss, Rahul Razdan, Joshua Santos

1 Department of Engineering Technology, Middle Tennessee State University, Murfreesboro, TN 37132, USA
2 Global Science & Technology (GST) Inc., Greenbelt, MD 20770, USA
3 Advanced Mobility Institute (AMI), Florida Polytechnic University, 4700 Research Way, Lakeland, FL 33805, USA

Associated Data

Not applicable.

Autonomous vehicles (AVs) rely on various types of sensor technologies to perceive the environment and to make logical decisions based on the gathered information, similarly to humans. Under ideal operating conditions, the perception systems (sensors onboard AVs) provide enough information to enable autonomous transportation and mobility. In practice, several challenges can still impede the operability of AV sensors and, in turn, degrade their performance under the more realistic conditions that actually occur in the physical world. This paper specifically addresses the effects of different weather conditions (precipitation, fog, lightning, etc.) on the perception systems of AVs. In this work, the most common types of AV sensors and communication modules are included, namely: RADAR, LiDAR, ultrasonic, camera, and global navigation satellite system (GNSS). A comprehensive overview of their physical fundamentals, electromagnetic spectrum, and principles of operation is used to quantify the effects of various weather conditions on the performance of the selected AV sensors. This quantification offers several advantages in the simulation world, by enabling more realistic scenarios and by properly fusing responses from AV sensors in any object identification model used in AVs in the physical world. Moreover, it will assist in selecting the appropriate fading or attenuation models to be used in any X-in-the-loop (XIL, e.g., hardware-in-the-loop, software-in-the-loop, etc.) type of experiment to test and validate how AVs perceive the surrounding environment under certain conditions.

1. Introduction

According to the Society of Automotive Engineers (SAE), the Advanced Driver-Assistance System (ADAS) taxonomy is a six-tiered system that categorizes the different levels of autonomy. It ranges from vehicles that are solely human-driven to those that are completely autonomous or self-driving, as shown in Figure 1 [ 1 ]. In order to achieve higher autonomy levels, autonomous vehicles (AVs) must rely on a combination of sensors and software to perceive the surrounding environment and navigate without any human intervention. Currently, cutting-edge sensor technologies are rapidly advancing to improve autonomous transportation and mobility so that it is safe for pedestrians and riders [ 2 ]. This has motivated more researchers and engineers, from a variety of fields and backgrounds, to engage in this process and to address all the interconnected challenges that accompany it. Figure 2 provides a brief summary of the influential events in the history of AVs [ 3 , 4 ].

Figure 1. Society of Automotive Engineers automation levels.

Figure 2. A brief summary of influential events in the history of autonomous driving.

As a result of the growing interest in AVs, several testing and validation procedures have been developed to optimize their performance for maximum safety before being deployed onto public roadways and infrastructures [ 5 ]. AVs can be tested in simulations or in the real world. Engineers can employ actual vehicles instead of models in physical tests, which provide realistic testing settings. However, due to the high risk associated with such tests, regulations restrict their use in densely populated areas, such as cities. Furthermore, the probability of encountering edge cases is low, and even when such cases occur, the repeatability of these cases represents a challenge. According to recent reports [ 6 , 7 ], conducting empirical field experiments to validate the safety of AVs in an acceptable timescale is unfeasible. As a result, verification and testing in a virtual environment have the ability to bridge the gap and enable AV systems to be evaluated in a rigorous, controlled, and rapid manner. Nevertheless, the research community recognizes the challenge of generating realistic scenarios that mimic the physical world while depending solely on software [ 8 ].

For instance, according to the National Highway Traffic Safety Administration (NHTSA), over 5,891,000 vehicle crashes occur each year on average, with approximately 1,235,000 being attributed to adverse weather conditions, such as snow, rain, fog, and strong winds. Approximately half (~46%) of weather-related accidents are caused by rain, and ~17% are caused by snow. Therefore, considering the effects of different weather conditions on the response of AV sensors can significantly improve the validity of simulated tests and generate more realistic scenarios. This paper presents a comprehensive literature review of the effects of different weather phenomena on an AV's capability to perceive its surroundings. The details of the technological aspects involved in the development of AVs are discussed, specifying the different principles of operation of the AV sensor ecosystem. In addition, the electromagnetic properties of these sensors are mapped to various weather conditions in a quantitative fashion. The remainder of this paper is organized as follows: Section 2 details the technologies normally used in AV sensors to capture information about the surrounding environment; Section 3 discusses the weather effects on AV sensor functionality; Section 4 describes the synopsis of automotive sensor strengths; and Section 5 includes our conclusions and future work.

2. Autonomous Vehicles Sensors Ecosystem

This section presents the most representative sensors that make up the AV sensor ecosystem: RADAR, LiDAR, ultrasonic, global navigation satellite system (GNSS), and cameras. These sensors measure wave sources and detect various physical phenomena. They have distinct properties that enable them to perform different tasks under specified conditions. In this work, we focus on the part of the electromagnetic spectrum that they use in their operation. This will shed light on their vulnerabilities to degraded environments, such as adverse weather conditions. Figure 3 depicts the electromagnetic spectrum and the spectral ranges used by the AV sensors investigated in this work. Furthermore, Figure 4 shows a top-level view of the AV sensor suite with a brief description of each sensor.

Figure 3. Electromagnetic spectra used by sensors on autonomous vehicles.

Figure 4. Autonomous vehicles sensor ecosystem (Image source: The Economist).

2.1. RADAR

RADAR is an acronym for Radio Detection and Ranging, a technology that uses radio waves for object detection within a certain range. When the transmitted waves intercept an object along their propagation path, they are reflected by its surface, and the RADAR antenna collects the backscattered signal (echo) within its field of view (FOV). The round-trip delay time, together with the known velocity of radio waves, provides a precise determination of the object's distance and velocity relative to the RADAR system. The RADAR range equation that relates the received echo power ( P_r ) to the distance of an object R meters away is shown in Equation (1) [ 9 ]:

P_r = (P_t · G² · λ² · σ) / ((4π)³ · R⁴ · L)    (1)

where P_t is the transmitted power, G is the gain, λ is the wavelength, σ is the cross section of the target, and L represents all the losses lumped together, including multipath, atmospheric, and environmental losses.

The RADAR systems for AVs operate at 24, 74, 77, and 79 GHz (millimeter wave (MMW)), which are separated out to serve short-range, medium-range, and long-range RADAR (SRR, MRR, and LRR, respectively) [ 9 ]. An LRR can be implemented to detect far targets or objects in front of the ego car, whereas MRR and SRR are used for parking assistance or side view detection [ 10 ]. Among the plethora of RADAR technologies available nowadays, linear frequency-modulated continuous-wave (L-FMCW) RADARs are commonly used in AVs due to their simplicity. Figure 5 shows a top-level block diagram of an FMCW RADAR, where the voltage-controlled oscillator (VCO) module generates an L-FMCW chirp signal that is amplified by the power amplifier (PA) and transmitted by the antenna. The receiving antenna captures the echo signal, and the low noise amplifier (LNA) amplifies it before mixing it with the VCO signal in order to generate the intermediate frequency (IF) or beat signal. The analog to digital converter (ADC) then digitizes the signal and passes it to the digital signal processing (DSP) module.

Figure 5. High-level block diagram of an FMCW RADAR system.
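As a rough illustration of how an L-FMCW radar recovers range from the beat signal (this sketch is not taken from the paper), the standard relation R = c · f_b · T_c / (2B) can be assumed for an ideal, stationary target, where f_b is the beat frequency, T_c the chirp duration, and B the swept bandwidth; the values below are purely illustrative.

```python
# Minimal sketch (assumptions: ideal stationary target; standard L-FMCW relation
# R = c * f_b * T_c / (2 * B)). Parameter values are illustrative, not from the paper.
C = 3.0e8  # speed of light, m/s


def fmcw_range(beat_freq_hz: float, chirp_duration_s: float, bandwidth_hz: float) -> float:
    """Target range implied by a measured beat (IF) frequency."""
    return C * beat_freq_hz * chirp_duration_s / (2.0 * bandwidth_hz)


# Hypothetical 77 GHz automotive chirp sweeping 300 MHz in 50 us:
# a 2 MHz beat frequency corresponds to a target at ~50 m.
print(fmcw_range(2e6, 50e-6, 300e6))  # -> 50.0
```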

Currently, most AV RADAR systems use an array of micro antennas capable of generating a set of antenna lobes. For instance, a 77 GHz radar with printed circuit board (PCB) antennas and a 24 GHz radar with horn antennas are shown in Figure 6 [ 11 ]. This has become more common with system-on-chip (SOC) architectures, which allow digital beam forming, among other techniques, to restrict the receiver's spatial FOV and reduce the electromagnetic interference from other active sensors operating at similar center frequencies. The advancement of smart vehicles has triggered the use of this type of device in the automotive sector in order to improve safety. Adaptive cruise control (ACC), collision avoidance systems (CAS), blind spot detection (BSD), and lane change assist (LCA) are some examples. In [ 12 ], Matlab was used to create the initial step in establishing a complete testbed for AV sensor testing and verification, with an emphasis on radar systems under all environmental conditions of relevance.

Figure 6. (a) A 77 GHz radar with PCB antennas and (b) a 24 GHz radar with horn antennas.

2.2. LiDAR

LiDAR is an acronym for Light Detection and Ranging, a technology developed in the 1970s for deployment on space and airborne platforms. In a similar fashion to RADARs, LiDAR systems base their operation on measuring the time it takes a pulse of light, in the infrared or near-infrared range, emitted from a laser diode to return to the system's receiver, also known as the time-of-flight (ToF) principle. In ToF technology, the LiDAR generates a pulse of light with a specified duration ( τ ) that, at the time of emission, triggers the internal clock in a timing circuit. The reflected light pulse from the target is detected by a photodetector, which produces an electrical output that disables the clock. The distance to the reflection point may be calculated from this electronically measured round-trip ToF (Δt) [ 13 ], as indicated in Equations (2) and (3):

R = c · Δt / (2n)    (2)

P_r(R) = P_0 · ρ · A_0 · μ_0 · exp(−2γR) / (π · R²)    (3)

where P_0 is the optical peak power of the emitted laser pulse, ρ is the reflectivity of the target, A_0 is the receiver's aperture area, μ_0 is the detection optics spectral transmission, γ is the atmospheric extinction coefficient, c is the speed of light in a vacuum, and n is the index of refraction of the propagation medium (~1 for air). Figure 7 demonstrates a high-level block diagram for ToF LiDAR.

Figure 7. High-level block diagram for a time-of-flight LiDAR system (Image source: LaserFocusWorld).
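For concreteness, the two relations in Equations (2) and (3) can be evaluated numerically. The sketch below uses purely illustrative parameter values; the peak power, target reflectivity, aperture area, optics transmission, and extinction coefficient are assumptions for this example, not values taken from the paper.

```python
# Minimal numerical sketch of Equations (2) and (3). All parameter values are
# illustrative assumptions, not from the paper.
import math

C = 3.0e8    # speed of light in vacuum, m/s
N_AIR = 1.0  # index of refraction of air (~1)


def tof_range(delta_t_s: float) -> float:
    """Equation (2): range from the measured round-trip time of flight."""
    return C * delta_t_s / (2.0 * N_AIR)


def received_power(p0_w, rho, a0_m2, mu0, gamma_per_m, r_m):
    """Equation (3): received peak power vs. range, with two-way extinction."""
    return p0_w * rho * a0_m2 * mu0 * math.exp(-2.0 * gamma_per_m * r_m) / (math.pi * r_m ** 2)


print(tof_range(667e-9))                                # ~100 m for a 667 ns echo
print(received_power(75.0, 0.1, 1e-3, 0.9, 1e-4, 100))  # ~2e-7 W (sub-microwatt return)
```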

The main wavelengths used in LiDARs are 905 nm and 1550 nm, which are determined by atmospheric transmission windows and the availability of high-power pulsed sources [ 13 ]. At the early development stages of AVs, 905 nm pulsed LiDAR systems were chosen for the task due to their availability. However, these systems have some serious limitations, such as high cost, inefficient mechanical scanning, interference from other light sources, and eye-safety power restrictions that limit their detection range to ~100 m. This prompted the shift to the retinal-safe 1550 nm band, because the water in the atmosphere begins to absorb energy above 1400 nm, which allows pulse powers high enough to achieve ranges of 200 to 300 m [ 13 ]. LiDARs used for vehicles belong to Class 1 [ 14 ] and are safe under all conditions of normal use.

Creating a three-dimensional (3D) profile (typically 360° in azimuth × 20° in elevation) of the environment surrounding the AV requires either raster-scanned laser beams (scanning LiDAR) or flooding the scene with light and collecting the returns (flash LiDAR). For scanning LiDARs, the platform emits pulses from a set of diodes mounted on a rotating pod or by using a rotating multi-faceted mirror. The moving parts in these designs (rotating at 300–900 rpm) represent points of high failure rate in rough driving environments. Other approaches that reduce the need for mechanical steering include the use of a microelectromechanical systems (MEMS) mirror to steer the beam electrically or the use of optical phased array (OPA) technology. Flash LiDARs, on the other hand, flood the scene within the FOV of the detector with light. The detector is an array of avalanche photodiodes (APDs), each of which independently measures the ToF to the target feature imaged on that APD.

2.3. Ultrasonic Sensors

Ultrasonic sensors are suitable for many detection tasks in industrial applications. They can detect objects that are solid, liquid, granular, or in powder form. Ultrasonic sensors rely on sonic transducers to transmit sonic waves in the range of 40 kHz to 70 kHz for automotive applications. This frequency range is beyond the audible range for humans, which makes it safe for human ears. This is an important factor given that a car's parking system can generate more than 100 dB of sound pressure to assure clear reception, which is equivalent to the audible sound pressure from a jet engine.

Most ultrasonic sensors are based on the principle of measuring the ToF of sonic waves between transmission and reception. This measured ToF is then used to calculate the distance ( d ) to an object or a reflector within the measuring range, as shown in Equation (4):

d = c_s · ToF / 2    (4)

where c_s is the speed of sound in air. Sonic waves travel in air at ~340 m/s, a value that is a function of air temperature, pressure, and humidity (for each degree Celsius, the speed of sound increases by 0.6 m/s). The time it takes a sonic wave to travel 1 m is approximately 3 × 10⁻³ s, as opposed to 3.3 × 10⁻⁹ s for light and radio waves. This difference of several orders of magnitude allows the use of low-speed signal processing in ultrasonic systems. However, pressure-based atmospheric conditions can debilitate the overall performance of ultrasonic sensors [ 15 , 16 ], which promotes the use of SRRs and other technologies instead. Figure 8 demonstrates the application of ultrasonic sensors in vehicles.

Figure 8. Applications of ultrasonic sensors in vehicles (Image source: newelectronics).
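As a small illustration of Equation (4) (not from the paper), the echo time of flight can be converted to distance with a temperature-corrected speed of sound. The 0.6 m/s-per-°C slope comes from the text above; the ~331 m/s reference value at 0 °C is an assumption of this sketch.

```python
# Minimal sketch: ultrasonic echo time-of-flight to distance, Equation (4).
# Assumption: speed of sound ~331 m/s at 0 degC, increasing by 0.6 m/s per degC.
def speed_of_sound(temp_c: float) -> float:
    return 331.0 + 0.6 * temp_c  # m/s


def echo_distance(tof_s: float, temp_c: float = 15.0) -> float:
    """Distance to the reflector for a measured round-trip time of flight."""
    return speed_of_sound(temp_c) * tof_s / 2.0


# A 29 ms round-trip echo at 15 degC corresponds to a reflector ~4.9 m away,
# near the upper end of a typical automotive ultrasonic sensor's range.
print(echo_distance(29e-3))  # -> ~4.93
```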

2.4. GNSS

GNSS is the most widely used technology for providing accurate position information on the surface of the earth. The best-known GNSS is the Global Positioning System (GPS), a U.S.-owned utility that provides users with positioning, navigation, and timing (PNT) services. The free, open, and dependable nature of GPS has made it an essential element of the global information infrastructure that affects every aspect of modern life.

The GPS was developed by the U.S. Department of Defense (DoD) in the early 1970s and is divided into three segments: the space segment, the control segment, and the user segment. The GPS system's space and control segments are developed, maintained, and operated by the U.S. Air Force [ 17 ]. The space segment consists of 31 operational satellites, of which at least 24 are available 95% of the time. These satellites fly in medium earth orbit (MEO) at an altitude of 20,200 km, and each satellite orbits the earth twice daily. This configuration allows any receiver located on the earth's surface to receive signals, in the L-band and some of the S-band frequency range, from 6–12 satellites. The control segment corresponds to a global network of ground facilities that track the GPS satellites, analyze their broadcasts, and provide orders and data to the constellation. The user segment consists of the GPS receiver equipment, which receives the satellite signals and uses the transmitted information to compute the user's position and time.

The operating principle of GNSS is based on the ability of the receiver to locate at least four satellites, calculate the distance to each one of them, and then use this information to identify its own location through a process called trilateration. It is worth mentioning that GNSS signals suffer from several errors that degrade the accuracy of the system, such as the following: (1) timing errors due to differences between the satellite atomic clock and the receiver quartz clock, (2) signal delays due to propagation through the ionosphere and troposphere, (3) multipath effects, and (4) satellite orbit uncertainties. In order to improve the accuracy of current positioning systems on vehicles, data from satellites are merged with data from other vehicle sensors (e.g., inertial measurement unit (IMU), LiDARs, RADARs, and cameras) to achieve reliable position information.
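A minimal sketch of the trilateration step described above (illustrative only, not the paper's implementation): given four or more satellite positions and the corresponding pseudoranges, the receiver position and clock bias can be estimated with Gauss-Newton least squares. The satellite coordinates and measurements below are hypothetical.

```python
# Minimal sketch (assumed model: pseudorange rho_i = ||x - s_i|| + c*dt).
# Not the paper's method; satellite positions and measurements are hypothetical.
import numpy as np

C = 299_792_458.0  # speed of light, m/s


def solve_position(sat_pos, pseudoranges, iters=10):
    """Estimate receiver position (m) and clock bias (s) from >= 4 pseudoranges."""
    x = np.zeros(4)  # state: [x, y, z, c*dt]
    for _ in range(iters):
        diffs = x[:3] - sat_pos                      # (N, 3)
        ranges = np.linalg.norm(diffs, axis=1)       # geometric ranges
        residuals = pseudoranges - (ranges + x[3])   # measured minus predicted
        # Jacobian of the predicted pseudorange w.r.t. [x, y, z, c*dt]
        J = np.hstack([diffs / ranges[:, None], np.ones((len(ranges), 1))])
        dx, *_ = np.linalg.lstsq(J, residuals, rcond=None)
        x += dx
    return x[:3], x[3] / C


# Synthetic check with hypothetical satellites ~20,200 km up and a 1 us clock bias:
sats = np.array([[20.2e6, 0, 0], [0, 20.2e6, 0], [0, 0, 20.2e6], [14e6, 14e6, 5e6]])
true_pos, true_bias = np.array([6.37e6, 1e5, 2e5]), 1e-6
rho = np.linalg.norm(true_pos - sats, axis=1) + C * true_bias
pos, bias = solve_position(sats, rho)  # pos ~ true_pos, bias ~ 1e-6 s
```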

2.5. Camera

Self-driving vehicles may rely heavily on cameras to perceive the surrounding environment. According to the electromagnetic spectrum, most cameras can be classified as visible (VIS) or infrared (IR). VIS cameras (e.g., monocular vision [ 18 , 19 , 20 , 21 ] and stereo vision [ 22 , 23 ]) capture wavelengths ranging from 400 to 780 nm, similarly to human eyes. They are widely used due to their low cost, high resolution, and capability to differentiate between colors. Combining two VIS cameras with a predetermined focal distance allows stereo vision to be performed; hence, a 3D representation of the scene around the vehicle is possible. However, even in a stereoscopic vision camera system, the estimated depth accuracies are lower than those obtained from active range finders such as RADARs and LiDARs.

IR cameras work with infrared wavelengths ranging between 780 nm and 1 mm. They can be extended to the near-infrared (NIR: 780 nm–3 µm) and the mid-infrared (MIR: 3–50 µm; known as thermal cameras) [ 24 , 25 ] for certain applications. IR cameras are less susceptible to weather or lighting conditions, and they can overcome some of the VIS cameras' shortcomings in situations where there are peaks of illumination (e.g., at the exit of a tunnel). In addition, they can be used for warm body detection, such as pedestrians and animals [ 26 , 27 , 28 , 29 ].

Moreover, NIR cameras can be used for range detection by using the ToF principle and the phase difference between the transmitted and the received light pulses. Depending on the number of light emitting diodes (LEDs) utilized in the LED array, the distance ranges from 10 m for interior scenes to roughly 4 m for outdoor scenes.

2.6. AV Sensors Performance Comparison

As briefly described in this section, different sensors have different advantages and disadvantages that play a role in their deployment on AVs and in fusing their responses in the object identification model needed in self-driving cars. Table 1 summarizes some of the main characteristics of different AV sensors. Their vulnerability to weather effects remains the focus of this paper and will be discussed in the following section.

Table 1. Summary of autonomous vehicles' sensors.

Feature | LiDAR | RADAR | Camera | Ultrasonic
Primary technology | Laser beam | Radio wave | Light | Sound wave
Range | ~200 m | ~250 m | ~200 m | ~5 m
Resolution | Good | Average | Very good | Poor
Affected by weather conditions | Yes | Yes | Yes | Yes
Affected by lighting conditions | No | No | Yes | No
Detects speed | Good | Very good | Poor | Poor
Detects distance | Good | Very good | Poor | Good
Interference susceptibility | Good | Poor | Very good | Good
Size | Bulky | Small | Small | Small

3. Weather Effects

Under ideal operating conditions, AV sensor technology should perform as expected when it comes to perceiving the surrounding environment and executing necessary actions. However, adverse weather conditions (e.g., rain, snow, fog, unfavorable lighting conditions, etc.) can impose serious challenges to AV sensors and the algorithms distilling information from them. This section will summarize the effects of several common adverse weather phenomena on AV sensors in a quantitative manner.

3.1. Precipitation (Rain, Snow, Hail, Sleet)

Precipitation is water (liquid or frozen) that falls back to the ground after condensing in the colder atmosphere. Droplet size and distribution define the intensity of precipitation (measured in millimeters per hour (mm/h)), which in turn affects the mechanisms by which electromagnetic signals propagate through the precipitation medium. According to [ 30 ], the maximum diameter a rain droplet can have is 6 mm. If the rain droplet diameter exceeds this value, the air resistance coupled with the terminal velocity of the droplet will exceed its cohesive force and tear it into smaller pieces.

According to Mie's solution to Maxwell's equations, any transmission wavelength ( λ ) that is similar to or smaller than the droplet diameter of 6 mm will be subject to Mie scattering [ 31 ]. Mie scattering can affect the propagation of the EM signal in two ways: first, the absorption of EM energy by water drops and vapor causes attenuation; second, the rain volume backscattering, or rain clutter, can generate false alarms or mask actual targets in front of the sensor.
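As a rough rule of thumb (an illustration, not taken from the paper), the dimensionless size parameter x = πD/λ indicates which scattering regime applies: x much smaller than 1 corresponds to Rayleigh scattering, while x on the order of 1 or larger corresponds to the Mie/geometric regime referred to above.

```python
# Illustrative helper (assumed rule of thumb): size parameter x = pi * D / lambda.
# x << 1 -> Rayleigh regime (weak scattering); x ~ 1 or larger -> Mie/geometric
# regime, where scattering by the particle is significant.
import math


def size_parameter(particle_diameter_m: float, wavelength_m: float) -> float:
    return math.pi * particle_diameter_m / wavelength_m


# A 6 mm rain drop vs. a 905 nm LiDAR beam and a ~3.9 mm (77 GHz) radar wave:
print(size_parameter(6e-3, 905e-9))  # ~2.1e4 -> scattering matters for LiDAR
print(size_parameter(6e-3, 3.9e-3))  # ~4.8   -> Mie regime for 77 GHz radar
```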

LiDARs transmitting in the 905 nm and 1550 nm wavebands will be heavily affected by Mie scattering from rain. Figure 9 shows the visibility deterioration of LiDAR during different rain intensities [ 13 ]. As can be observed from Figure 9a, the LiDAR visibility in clear conditions, which should be ~2 km at these wavebands, deteriorates to approximately 1.2 km for the 905 nm band and 0.9 km for the 1550 nm band in 2 mm/h rain. When the rain rate increases to 25 mm/h, the 2 km visibility drops to 0.7 km and 0.45 km for the 905 nm and 1550 nm wavelengths, respectively, as illustrated in Figure 9b. The wetness of the target decreases the visibility by an additional ~0.1 km. However, within the range of 250 m usually required for rangefinders on AVs, LiDAR susceptibility to rain is not as noticeable until more severe rain rates occur, as shown in Figure 10 [ 30 ].

Figure 9. The visibility effects of differing rain intensities and target wetness on 905 nm and 1550 nm waves vs. normal conditions for (a) 2 mm/h and (b) 25 mm/h rain rates.

Figure 10. Variation of the signal-to-noise ratio as a function of the rain rate and the distance between LiDAR and target for a rain droplet radius equal to 3 mm.

For the 77 GHz RADAR systems used in AVs ( λ ≈ 3.9 mm), the effect of attenuation is not significant at short distances [ 32 ]. It ranges from 0.0016 dB/m at 1 mm/h to 0.032 dB/m at 100 mm/h [ 33 ]. However, rain backscattering, or rain clutter, can decrease the maximum range of detectability, as discussed in [ 34 , 35 ]. Moreover, since the received backscatter from the rain falls off as R² instead of R⁴ for the target echo, where R is the range, rain clutter can exceed the detection threshold and result in false alarms. Figure 11 summarizes the results of an experiment conducted by [ 33 ] to evaluate the effects of rain volume backscattering on the 77 GHz RADARs used in AVs. It demonstrates the rain RADAR cross section (RCS) and its received power for different rain rates and ranges and for narrow- and wide-beam RADAR systems.

Figure 11. Rain RCS (top) and received power (bottom) for narrow beam (left) and wide beam (right) for different rain rates.
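To put the specific-attenuation figures quoted above in perspective, a back-of-the-envelope calculation (not an analysis from the paper) simply doubles the one-way attenuation to account for the two-way path of the radar echo; the 200 m range used below is illustrative.

```python
# Two-way rain attenuation from the one-way specific attenuation quoted above
# (the echo crosses the rain-filled path twice). Range value is illustrative.
def two_way_rain_loss_db(specific_attenuation_db_per_m: float, range_m: float) -> float:
    return 2.0 * specific_attenuation_db_per_m * range_m


print(two_way_rain_loss_db(0.0016, 200))  # ~0.64 dB at 1 mm/h over 200 m
print(two_way_rain_loss_db(0.032, 200))   # ~12.8 dB at 100 mm/h over 200 m
```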

GNSS is generally not affected by local weather conditions. These satellite systems operate at a frequency (approximately 1.575 GHz) that is mostly unaffected by weather [ 36 ]. However, the windshield wipers will occasionally block reception, making it impossible for a GNSS device to identify a complete navigation data string from satellites. As a result, the GNSS receiver may not properly decode the incoming string and may provide erroneous data [ 37 ].

Camera systems onboard AVs rely on scene brightness to determine the intensity of image pixels. Different weather conditions can introduce sharp intensity fluctuations that can result in quality degradation of images and videos. For instance, snow and heavy rain can obscure the edges of an object, rendering it unrecognizable. Fortunately, some digital image processing techniques can mitigate the effect of precipitation and improve image quality under dynamic weather conditions [ 38 ].

3.2. Fog

Fog is a visible aerosol consisting of small water droplets suspended in the air at or near Earth's surface [ 39 ]. It is considered a low-lying cloud. This weather phenomenon typically begins around midnight and mostly dissipates after sunrise, once atmospheric temperatures rise and relative humidity decreases.

Dust or air pollution must be present in the air for fog to occur. Water vapor condenses around these small solid particles, generating droplets ranging in size from 1 to 20 microns [ 40 ]. This being the case, LiDAR systems, whose operating wavelengths are smaller than the fog particles, will be subject to Mie scattering. In addition to the adverse effects of scattering, water absorption has a huge impact on the NIR spectral band. Water's contributions to the extinction coefficient at 905 nm and 1550 nm are 0.075 cm⁻¹ and 10.8 cm⁻¹, respectively. According to M. Hadj-Bachir et al. and J. Wojtanowski et al., under normal conditions the overall extinction coefficients at these two wavelengths differ by roughly a factor of two, in contrast to the two orders of magnitude difference observed in the presence of fog.

Ultrasonic sensors will not be directly affected by scattering, in contrast to most other sensors. However, being systems that work with sound waves, air composition and temperature influence their performance. Nonetheless, given that ultrasonic sensors are used for applications that require very short-range detection, the effect of precipitation is minimal.

The range degradation curves presented by J. Wojtanowski et al. in Figure 12 show the severity of the signal attenuation seen in different densities of fog. What should be 0.5 km of visibility is diminished to about 0.20 km and 0.12 km for the 905 nm and 1550 nm wavelengths, respectively, although they have the same maximum range performance under normal conditions. Moderate continental fog and heavy maritime fog can attenuate NIR signals by up to 130 dB/km and 480 dB/km, respectively [ 36 ]. Fog exhibits only slight Rayleigh scattering on millimeter waves due to the extreme difference between particle and wavelength size. Fog might indirectly affect RADAR, should temperature requirements be met, by condensing on the radome or on the target in question, thus emulating what was explained in the section above.

The camera, on the other hand, is heavily affected by Mie scattering, since its operating wavelengths (400–750 nm) are much smaller than the fog particle size. Reference [ 37 ] examines a camera's ability within a fog chamber containing 2 micron and 6 micron fog particles; it was noted that visibility was reduced to 30 m compared to the ideal 60 m. The presence of air-light should also be taken into consideration, since fog facilitates it. Air-light can be defined as the scattering of light from the interference of particles. With air-light interference, objects within the immediate vicinity of the light source are impossible to perceive. As the proximity increases, the perception of other objects is left to how the air-light interferes with the camera. Object distinction becomes increasingly difficult within fog and becomes more reliant on the reflectivity of the objects in question; the higher the reflectivity of an object, the darker it appears in fog [ 37 ]. Ultrasonic sensors, relying on air composition, will also be affected by fog. Depending on the type of fog, the air's liquid water density can range from 0.01 to 0.2 g/m³. This interaction will be discussed further in the humidity section.

Figure 12. Range degradation curves observed when 905 nm and 1550 nm signals are subjected to 0.5 km and 0.2 km visibility fog.

3.3. Humidity

Humidity is broken down into two types: absolute and relative. Absolute humidity is the amount of water vapor in a cubic meter of air. Depending on the temperature of the environment, the amount of water vapor and miscellaneous particles that can be contained within a volume can vary. When absolute humidity is considered together with the current temperature, relative humidity is obtained. Relative humidity is 100 percent, or saturated, when the dew point temperature and the actual temperature match. With this in mind, the ratio of the actual vapor pressure to the saturated vapor pressure also gives the relative humidity [ 41 ]. If saturation is achieved along with certain weather conditions, fog formation can be facilitated. Since pressure is directly affected by humidity, humidity should affect ultrasonic sensors. The level of humidity has a variable effect on the attenuation of ultrasonic waves, and this effect is not necessarily the same for all combinations of frequency and humidity. For example, at 200 kHz, maximum attenuation is observed at saturation, but at 60 kHz, maximum attenuation is observed at 60% humidity; the latter frequency is relatively close to modern AV ultrasonic sensor frequencies. In conditions of no humidity, attenuation is approximately 0.25 dB/ft. Another phenomenon observed in [ 39 ] was that, at 20 °C, attenuation seemed to rise sharply.

Based on water's high absorption coefficient, high humidity should make a noticeable contribution to the attenuation of LiDAR performance. Under normal conditions, the extinction coefficient at 905 nm is typically about double that at 1550 nm. Table 2 shows the atmospheric extinction coefficients subjected to varying levels of humidity, as reported by J. Wojtanowski et al. As can be observed, the extinction coefficients at 905 nm and 1550 nm keep approximately the same ratio. That being said, different levels of humidity have negligible effects on the performance of LiDAR. Since LiDAR is not affected by the water vapor content of humidity, RADAR, being a more robust system, will also not be affected by humidity levels [ 33 ].

Table 2. A list of extinction coefficients for the 905 nm and 1550 nm wavelengths subjected to varying levels of humidity.

Relative humidity (%): extinction coefficients, 905 nm | 1550 nm
50: 2.333 | 1.146, 0.463 | 0.227, 0.229 | 0.112
70: 2.327 | 1.137, 0.462 | 0.225, 0.228 | 0.111
100: 2.615 | 1.363, 0.519 | 0.270, 0.256 | 0.133

3.4. Lightning

AVs need to operate with electromagnetic stability of their own even under ideal conditions, since they make use of technology that relies on electricity and electromagnetism [ 40 ]. Other unpredictable conditions beyond ordinary electrical interference also need to be considered; one of the biggest concerns of this type is lightning.

Lightning is the weather phenomenon in which a large electrical discharge occurs and balances itself again, producing a momentary flash of up to 30,000 K in under 10 microseconds [ 42 ]. Since clouds are not specifically monitored to observe when lightning might occur and where it may travel, all AV sensors are in danger, especially those located on the exterior of the vehicle. Depending on the car's material and structural composition, the surface of the car will hold the electrical charge upon a strike. Contrary to popular belief, the passengers may still be negatively affected if in contact with any part of the vehicle's shell, as well as with components connected to the shell such as the steering wheel, gear shift, doors, or windows; passengers might even be affected by interacting with technology that has electrical capability. Table 3 compares the AV equipment under lightning influence.

Table 3. Effects of direct and indirect lightning on AV sensors.

AV Sensor | Role(s) | Direct Lightning Effect Level | Indirect Lightning Effect Level
RADAR | Electromagnetic | High | Medium
LiDAR | Electrical | High | High
Ultrasonic sensors | Electromagnetic | High | Medium
GNSS | Electrical, electromagnetic | High | Medium
Camera | Electrical | High | High

AV sensors can be affected by lightning even when not directly struck; this includes the technologies of the car, since the sensors are usually mounted to the vehicle. The electromagnetic stability can be affected by the new electromagnetic field created by the electrical discharge from the clouds. An example of an unmanned aerial vehicle (UAV) exposed to such a charged field is shown in Figure 13; the field strength over distance is demonstrated in Figure 14, and it decreases as distance increases [ 43 ]. Figure 15 and Figure 16 show electromagnetic disturbances in the UAV [ 43 ]. This allows us to observe how radio disturbances appear to be prevalent. Among the AV sensors, the camera is also affected visually by highly unpredictable lighting changes, as the bright light can affect perception in cameras.

Figure 13. Example of a UAV model.

Figure 14. Electric field strength due to lightning.

Figure 15. Electromagnetic disturbance example in the coupling path.

Figure 16. Electromagnetic disturbance example in the antenna-feeder path.

The effects of a lightning discharge can be mitigated by choosing materials with low electrical conductivity and controllers with improved noise immunity. Guaranteeing noise immunity can be achieved through structural methods, circuitry methods, electromagnetic shielding, and filtering. In the end, it comes down to the electromagnetic compatibility (EMC) of the system with a natural external disturbance [ 40 ]. For the UAV example, the conclusion was that effects on the noise immunity of radio links are very probable, while the effects on electronic equipment are not as significant [ 40 ]. This topic could have more material available in the field and is a good place to begin simulations per device (RADAR, LiDAR, etc.).

3.5. Thunder

The sound that follows lightning is thunder; the two always occur together, even if the thunder is inaudible [ 42 ]. As the air is heated along the path of the electrical discharge, the excited particles produce sound in an explosion that can be heard up to 30 km away [ 42 ]. Since the speed of light is greater than the speed of sound, the noise is heard after the flash.

Even though sound can travel a long distance, its audibility can be affected by numerous factors. Humidity, wind velocity, temperature inversions, terrain features, and clouds are all noise distorters; this is important primarily for ultrasonic sensors [ 42 ]. Echoing effects can also be produced by buildings or landscapes, causing multiple occurrences of the sound.

Thunder can operate at both the sonic and infrasonic ranges. The dominant frequency is 100 Hz, but infrasonic frequencies, which are below 20 Hz, are inaudible to humans [ 44 ]. Table 4 shows a list of general sound levels for a comparison to thunder. The pressure waves associated with thunder are, therefore, able to damage interior or exterior structures [ 44 ].

Table 4. General sound levels.

Sound | Level (dB)
Stream flow, rustling leaves | 15
Watch ticking, soft whisper | 20–30
Quiet street noises | 40
Normal conversation | 45–60
Normal city or freeway traffic | 70
Vacuum cleaner | 75
Hair dryer | 80
Motorcycle, electric shaver | 85
Lawn mower, heavy equipment | 90
Garbage truck | 100
Screaming baby | 115
Racing car, loud thunder, rock band | 120–130
Jet airplane takeoff from 120 feet | 120
Pain threshold | 130
Rocket launch from 150 feet | 180

Depending on the initial conditions, such as temperature and relative humidity, the effect of thunder on the various sensors could be negligible with regard to sound, as observed in Figure 17 [ 45 ]. However, in the case of thunder the temperature is much higher (up to 30,000 K due to lightning); RADAR, LiDAR, ultrasonic sensors, and possibly GNSS will be affected due to disturbances of their wavelengths.

Figure 17. Sound attenuation dependent on relative humidity (RH) and frequency at 20 degrees Celsius.

3.6. Sun Glint

Sun glint is the specular reflection of sunlight off liquid surfaces [ 46 ]. Figure 18 shows pictures of water in (a) and (c) with respective filtered images in (b) and (d) displaying where the glint occurs; the colors represent glint in red, water in sky blue, and shadow in black [ 46 ]. This poses a problem, as vision or object detection could be faulty when a reflected light signal of a certain size is received. The glint-producing objects that interact with AVs will mainly come from precipitation. The sensors are sharply divided on the attenuation due to sun glint: there are no effects on RADAR, ultrasonic sensors, or GNSS, while LiDAR and cameras are the most affected by this occurrence. As observed in the example shown in Figure 18, the larger the amount of glint, the more it affects the readings.

Figure 18. Surface sun glint patterns.

3.7. Dust Storm

Dust storms are formations of sand or dust walls typically caused by strong wind currents, which are usually offshoots of a large thunderstorm and/or, on rarer occasions, the movement of pressure fronts [ 47 ]. These dust particles, whose size can range from 2.5 to 10 microns, are pushed by the current, causing visibility concerns, and may also facilitate cloud growth [ 48 ]. Since visibility concerns are present, camera performance is severely degraded, whether due to poor visibility conditions in the air or to the camera lens becoming dirty. Regarding the former, over the span of a year, 549 incidents involving casualties were reported to be caused by dust-related storms [ 47 ]. According to Sky Harbor International Airport, dust storms reduce visibility to less than half a mile (0.8 km). Smoke, which shares some physical characteristics with dust storms, also causes performance issues. Energy absorption by soot particles, similarly to the dust particles, has a greater effect on sensors with smaller wavelengths, which are around 400–750 nm in the camera's case [ 49 ].

LiDAR is also negatively affected by smoke, though not as severely as the camera [ 48 ]. Similarly to fog or rain, dust-storm particles cause Mie scattering at the 905 and 1550 nm wavelengths, since their sizes are around 2.5–10 microns. This scattering is weaker than that of water droplets, however, because water has a high absorption coefficient [ 13 ]. RADAR is only minorly affected, through Rayleigh scattering, since the particle size and the radar wavelength differ by about three orders of magnitude.
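The scattering regime invoked above can be estimated from the dimensionless size parameter x = πD/λ: Rayleigh scattering applies when x ≪ 1, whereas Mie scattering dominates when x is of order one or larger. The snippet below is an illustration rather than part of the cited studies; the particle diameters and wavelengths are the nominal values quoted in the text, and the regime thresholds are common rules of thumb.

```python
import math

def size_parameter(diameter_m: float, wavelength_m: float) -> float:
    """Dimensionless size parameter x = pi * D / lambda."""
    return math.pi * diameter_m / wavelength_m

def regime(x: float) -> str:
    # Rule-of-thumb thresholds: x << 1 -> Rayleigh, x of order 1-10s -> Mie, x >> 1 -> geometric optics.
    if x < 0.1:
        return "Rayleigh"
    if x < 50.0:
        return "Mie"
    return "geometric"

# Nominal values from the text: 2.5-10 micron dust, 905/1550 nm LiDAR, ~3.9 mm (77 GHz) RADAR.
dust_diameters_m = (2.5e-6, 10e-6)
wavelengths_m = {"LiDAR 905 nm": 905e-9, "LiDAR 1550 nm": 1550e-9, "RADAR 77 GHz": 3.9e-3}

for name, lam in wavelengths_m.items():
    for d in dust_diameters_m:
        x = size_parameter(d, lam)
        print(f"{name}: D = {d * 1e6:4.1f} um -> x = {x:8.3g} ({regime(x)})")
```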

Ultrasonic sensors are extremely dirt-tolerant, so they perform well in harsh working environments where dust, smoke, mist, or other impurities would otherwise jeopardize process reliability. However, this tolerance applies only to the particles carried by a dust storm; wind speeds of about 6 m/s already begin to disturb ultrasonic wave propagation [ 39 ], and more than half of the dust storms observed in Phoenix, Arizona, had maximum wind speeds between 36 and 57 mph (16–25 m/s) [ 47 ].

3.8. Space Weather

The Earth's atmosphere, shown in Figure 19, is composed of several layers, including the troposphere, stratosphere, mesosphere, thermosphere, ionosphere, and exosphere, with most of the planet's weather occurring in the troposphere [ 50 ]. The weather phenomena discussed previously mainly affect RADAR, LiDAR, ultrasonic sensors, and cameras. GNSS is the exception: it is mainly affected by environmental conditions in our solar system, i.e., space weather [ 36 ].

Figure 19. Earth's atmospheric layers.

The ionosphere is abundant in electrons and ionized particles and is what makes long-range radio communication possible [ 50 ]. This layer can also shift in size, depending mainly on solar activity [ 50 ]. GNSS signals traverse this region and can be disturbed by interfering wavelengths and by objects entering from the exosphere. The troposphere, unlike the ionosphere, benefits from the protection of multiple overlying layers: meteors disintegrate in the mesosphere, and ultraviolet radiation is absorbed and dispersed by the stratosphere's ozone layer [ 50 ]. Technology whose signals pass through the ionosphere is therefore at a greater disadvantage if something penetrates the planet's atmosphere, as summarized in Table 5.

Table 5. Earth's atmosphere layers.

| Atmospheric Layer | Altitude (km) | Space Weather Effect Level | Applicable AV Sensors |
| --- | --- | --- | --- |
| Troposphere | 0–14.5 | Low–Medium | RADAR, LiDAR, ultrasonic sensors, GNSS, and camera |
| Stratosphere | 14.5–50 | Medium | GNSS |
| Mesosphere | 50–85 | Medium–High | GNSS |
| Thermosphere | 85–600 | High | GNSS |
| Ionosphere | 48–965 | High | GNSS |
| Exosphere | 600–10,000 | High | GNSS |

The L-band frequencies used by GNSS are severely degraded by space weather. Space weather, which broadly describes the sun's interactions with the Earth's atmosphere, affects both range delay and loss of lock [ 51 ]. The sun acts on the ionosphere directly, whether through ionospheric scintillations, through changes in the density and distribution of the total electron content (TEC), or simply through the time of day and year. The ionosphere can be broken down into three main layers, denoted the D, E, and F regions. These regions are classified by altitude, and each affects signals differently. The D region attenuates signals in inverse proportion to the square of the frequency; consequently, attenuation within the L band is negligible. The E and F regions cause little to no attenuation due to their low air density; instead, they tend to reflect the signal, depending on the frequency and incidence angle of the transmitted wave. The ionosphere's index of refraction is a function of the transmitted wave frequency and the TEC of the region. At night, the TEC within the E region is generally lower; therefore, weaker reflections and smaller range delays occur [ 51 ]. Ionospheric scintillations act directly on the E and F regions, which increases the probability of loss of lock for GPS. Solar flares cause variations in the TEC, which in turn cause loss-of-lock variations [ 51 ]. Additionally, solar radio bursts negatively impact tracking performance.
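The frequency dependence described above is also what allows dual-frequency receivers to remove most of the ionospheric error. As a first-order illustration (not taken from [ 51 ]; the TEC values are arbitrary examples), the ionospheric group delay in metres is approximately 40.3·TEC/f², so one TEC unit (10^16 electrons/m²) adds roughly 0.16 m of range error at the GPS L1 frequency.

```python
# First-order ionospheric group delay (a standard approximation, not specific to [51]):
# delay [m] ~ 40.3 * TEC / f**2, with TEC in electrons/m**2 and f in Hz.
# The TEC values below are arbitrary illustrative examples.

L1_HZ = 1575.42e6   # GPS L1 carrier frequency
L2_HZ = 1227.60e6   # GPS L2 carrier frequency
TECU = 1.0e16       # one TEC unit = 1e16 electrons per square metre

def iono_delay_m(tec_el_per_m2: float, freq_hz: float) -> float:
    return 40.3 * tec_el_per_m2 / freq_hz ** 2

for tec_units in (1, 20, 100):   # quiet, moderate, and strongly disturbed ionosphere
    d1 = iono_delay_m(tec_units * TECU, L1_HZ)
    d2 = iono_delay_m(tec_units * TECU, L2_HZ)
    print(f"{tec_units:3d} TECU: L1 delay = {d1:6.2f} m, L2 delay = {d2:6.2f} m")
```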

4. Synopsis of Automotive Sensor Strengths

Several strengths of the different types of automotive sensors are shown in Figure 20. A variety of sensors is needed for reliability: sensors must be placed so as to cover different ranges and speeds for high-speed driving, weather conditions, collision avoidance, and stationary safety.

Figure 20. Spider diagrams showing strengths of various sensors found in automobiles (image source: www.cleantechnica.com ; accessed on 12 March 2021).

LiDAR sensors are vulnerable to precipitation such as snow, hail, sleet, and rain, as shown in Figure 20. However, as a short-wavelength technology, LiDAR can identify small objects and provide a precise 3D monochromatic picture of an item, which RADAR may lack. Object perception is expected to be severely affected by inclement weather, with a decrease in detection range; the diminished contrast in intensity leads to an increase in misclassifications and even missed detections [ 52 ]. If a raindrop is very close to the laser emitter, there is a considerable possibility of erroneous detection: the laser beam hits the droplet and produces a flash of light similar to a small surface, so the return peak resembles that of an object on the road [ 37 ].

RADAR sensors can operate in varied weather conditions owing to the wavelength band they use and the absence of mechanical moving parts, in contrast to passive visual sensors, whose usable range of climate conditions is narrower. Nevertheless, a radio receiver and transmitter operating at the same wavelength can be used to generate radio noise, causing the device to report the speed of a moving object as zero; in this respect, RADAR and LiDAR in autonomous automobile systems can both be deceived and offer a similar level of security [ 53 ]. Rain is the most important source of radar signal attenuation [ 54 ]. Since droplet sizes are comparable to radar wavelengths, the rain backscatter effect has been found to have a considerable impact on performance: the attenuation effect weakens the signal, whereas the backscatter effect adds interference at the receiver [ 55 ]. These considerations apply equally in the presence of snow or mist. In hail, millimeter-wave radar signals likewise lose part of their power as they propagate through the weather [ 56 , 57 ]. The mathematical models for snow and mist attenuation and backscatter are the same as those for rain [ 58 ].
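Rain attenuation of millimeter-wave radar is commonly approximated by a power law in the rain rate, γ = k·R^α dB/km, with frequency- and polarization-dependent coefficients such as those tabulated in ITU-R P.838. The sketch below only illustrates the shape of this relationship; the coefficient values, rain rates, and 150 m target range are placeholders chosen for the example, not figures taken from [ 54 , 55 , 58 ].

```python
# Power-law rain attenuation, gamma = k * R**alpha (dB/km). The coefficients k and
# alpha depend on frequency and polarization (see ITU-R P.838 tables); the values
# below are rough placeholders for a ~77 GHz automotive radar, used only to
# illustrate the shape of the relationship.

K_PLACEHOLDER = 1.0
ALPHA_PLACEHOLDER = 0.75

def specific_attenuation_db_per_km(rain_rate_mm_h: float,
                                   k: float = K_PLACEHOLDER,
                                   alpha: float = ALPHA_PLACEHOLDER) -> float:
    return k * rain_rate_mm_h ** alpha

def two_way_rain_loss_db(target_range_m: float, rain_rate_mm_h: float) -> float:
    """Extra two-way loss over the radar-target-radar path caused by rain."""
    gamma = specific_attenuation_db_per_km(rain_rate_mm_h)
    return 2.0 * gamma * (target_range_m / 1000.0)

for rate in (5, 25, 100):   # light, heavy, and extreme rain (mm/h)
    print(f"{rate:3d} mm/h: {two_way_rain_loss_db(150.0, rate):5.2f} dB extra loss at 150 m")
```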

Cameras offer rich information about the environment and can operate in a variety of weather conditions [ 59 ]. During self-parking, a visual navigation system can identify nearby obstructions and localize the vehicle, provided a complete and accurate view of the surroundings is available in the limited space. To accomplish this, a combination of stereo vision techniques can be used [ 60 ]: with real-time depth map extraction, a precise dense map can be built up through the sequence of depth map extraction, obstacle extraction, and fusion across multiple camera frames to achieve an accurate estimate [ 60 ].
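For a calibrated, rectified stereo pair, depth follows from triangulation as Z = f·B/d, where f is the focal length in pixels, B the baseline, and d the disparity. The snippet below is a generic illustration rather than the pipeline of [ 60 ]; the focal length, baseline, and disparity values are assumed, and the error estimate shows why stereo depth accuracy degrades roughly quadratically with distance for a fixed disparity-matching error.

```python
# Depth from disparity for a rectified stereo pair: Z = f * B / d.
# The focal length (pixels), baseline (metres), and disparities are assumed
# example numbers, not parameters of any camera discussed in the text.

FOCAL_PX = 800.0     # focal length in pixels
BASELINE_M = 0.12    # distance between the two cameras (m)

def depth_m(disparity_px: float) -> float:
    return FOCAL_PX * BASELINE_M / disparity_px

def depth_error_m(disparity_px: float, disparity_error_px: float = 0.5) -> float:
    """Approximate depth uncertainty caused by a disparity-matching error."""
    z = depth_m(disparity_px)
    return (z ** 2 / (FOCAL_PX * BASELINE_M)) * disparity_error_px

for d in (48.0, 12.0, 4.0):   # nearby, mid-range, and distant points
    print(f"disparity {d:5.1f} px -> depth {depth_m(d):5.1f} m "
          f"(+/- {depth_error_m(d):5.2f} m for a 0.5 px matching error)")
```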

Ultrasonic (sonar) sensors are used for near-obstacle detection and as parking aids. Due to their limited coverage range (less than 2 m) and poor angular resolution, they do not provide consistent information about the location and velocity of vehicles on the road [ 61 ]. Moreover, ultrasonic sensors suffer disruptions in acoustically noisy environments such as roads, streets, and highways. Enlarging the coverage range requires louder pings/pulses from the emitter, which is harmful to people and the environment. Consequently, sonar systems should remain within their intended working range, such as parking lots, to accurately detect obstacles [ 62 ]. The sonar sensors on the front and rear bumpers should be kept clear of snow, ice, and dirt for maximum accuracy.

5. Conclusions and Future Work

Despite ongoing advancements in automobile technology, one major challenge that remains is driving safely in severe weather. Driving under less-than-ideal circumstances reduces on-road visibility and impairs the functioning of AV sensors, leaving the vehicle vulnerable to accidents.

Based on the findings, RADAR sensors align well with today's technology since they can operate through adverse weather conditions. Their main drawback is that they still need the assistance of a perception system to improve decision robustness. LiDAR can be compensated to perform in bad weather and thus remains viable, but, like RADAR, it requires a perception companion. Cameras can lose the ability to decipher what they observe when lighting changes, weather is severe, or visibility is low. Lightning is a major problem, and obstructions such as dirt-caked signs and lenses mean that cameras will not always be reliable.

On the other hand, LiDAR sensors appear to be more efficient for spacecraft missions, including localization and docking, than for autonomous vehicles. If active photon generators are acquired for automotive purposes, visible-wavelength sensors will be of little use, since the photon generator covers the optical wavelengths, whereas low-cost radars can easily penetrate adverse situations such as rain, fog, or snow. Numerical studies of a LiDAR sensor simulation tool show that new sensor models can simulate the raw data provided by a laser scanner by modeling pulsed laser propagation and energy losses under given weather conditions. Such capabilities will help reduce the cost and time needed to develop LiDAR applications and increase the performance of such systems. Self-driving vehicles can rely heavily on cameras to perceive the surrounding environment; cameras are widely used because of their low cost, high resolution, and ability to differentiate between colors. Connecting two visible (VIS) cameras at a predetermined focal distance allows computer stereo vision to be performed, producing a 3D representation of the scene around the vehicle. However, even in a stereoscopic camera system, the estimated depth accuracy is lower than that obtained from active range finders such as radars and lidars.

Adverse weather conditions impose challenges and can create failures in AV sensors. However, satellite monitoring, existing infrastructure (e.g., traffic lights), and control centers reached via mobile devices will be essential aids for collision avoidance. As a result, prominent semiconductor firms are developing chips that will transform autonomous vehicles into mobile data centers, allowing driverless cars to make crucial decisions in real time.

This paper's scope is an effort to summarize weather effects on AV sensors, specifically RADAR, LiDAR, and camera systems, but the discussion could be expanded. In addition to strategies for minimizing attenuation, future work could account for the environmental impacts on AV technology, geography, sensor network topology, natural catastrophes, and interference. Even more intriguing would be expanding the topic to the effects of AV technology on the environment; studies of these broader questions would contribute to the overall understanding of sensing in AVs. Although outdoor test environments are the most relevant settings, with high levels of sensor attenuation due to their unpredictable nature, there are additional applications: scenarios with topology variation, natural disasters, emergencies, physical or internal interference, or the absence of human factors can also contribute to sensor attenuation.

Author Contributions

Conceptualization, J.V. and S.A.; methodology, J.V. and S.A.; investigation, J.V., S.A., O.T., and J.S.; writing—original draft preparation, J.V., J.S., and S.A.; writing—review and editing, J.V., S.A., O.T., and R.R.; supervision, J.V. and S.A. All authors have read and agreed to the published version of the manuscript.

Funding

This material is based upon work supported by Middle Tennessee State University (College of Basic and Applied Sciences-Engineering Technology), the Advanced Mobility Institute (AMI) at Florida Polytechnic University, and the National Science Foundation under Grant No. CNS-1919855. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Institutional Review Board Statement

Informed Consent Statement

Data Availability Statement

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

  • Baek, J.; Hong, S.; Kim, J.; Kim, E. Efficient Pedestrian Detection at Nighttime Using a Thermal Camera. Sensors 2017 , 17 , 1850. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Jeon, E.S.; Kim, J.H.; Hong, H.G.; Batchuluun, G.; Park, K.R. Human Detection Based on the Generation of a Background Image and Fuzzy System by Using a Thermal Camera. Sensors 2016 , 16 , 453. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Choi, Y.; Kim, N.; Hwang, S.; Kweon, I.S. Thermal Image Enhancement using Convolutional Neural Network. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 223–230. [ Google Scholar ] [ CrossRef ]
  • Hwang, S.; Park, J.; Kim, N.; Choi, Y.; Kweon, I.S. Multispectral pedestrian detection: Benchmark dataset and baseline. In Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; pp. 1037–1045. [ Google Scholar ] [ CrossRef ]
  • Iwasaki, Y. A Method of Robust Moving Vehicle Detection For Bad Weather Using An Infrared Thermography Camera. In Proceedings of the International Conference on Wavelet Analysis and Pattern Recognition, Hong Kong, China, 30–31 August 2008. [ Google Scholar ]
  • Iwasaki, Y.; Misumi, M.; Nakamiya, T. Robust Vehicle Detection under Various Environmental Conditions Using an Infrared Thermal Camera and Its Application to Road Traffic Flow Monitoring. Sensors 2013 , 13 , 7756. [ Google Scholar ] [ CrossRef ]
  • Pinchon, N.; Cassignol, O.; Nicolas, A.; Bernardin, F.; Leduc, P.; Tarel, J.P.; Brémond, R.; Bercier, E.; Brunet, J. All-Weather Vision for Automotive Safety: Which Spectral Band? In Advanced Microsystems for Automotive Applications 2018 ; Dubbert, J., Müller, B., Meyer, G., Eds.; Springer International Publishing: Berlin/Heidelberg, Germany, 2018; pp. 3–15. [ Google Scholar ]
  • Sabry, M.; Al-Kaff, A.; Hussein, A.; Abdennadher, S. Ground Vehicle Monocular Visual Odometry. In Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019; pp. 3587–3592. [ Google Scholar ] [ CrossRef ]
  • Howard, B. MIT Spinoff WaveSense’s Ground-Penetrating Radar Looks Down for Perfect Self-Driving. Available online: https://www.extremetech.com/extreme/306205-mit-wavesense-ground-penetrating-radar-self-driving (accessed on 1 October 2020).
  • Wang, Z.; Wu, Y.; Niu, Q. Multi-Sensor Fusion in Automated Driving: A Survey. IEEE Access 2020 . [ Google Scholar ] [ CrossRef ]
  • Göhring, D.; Wang, M.; Schnürmacher, M.; Ganjineh, T. Radar/Lidar sensor fusion for car-following on highways. In Proceedings of the The 5th International Conference on Automation, Robotics and Applications, Wellington, New Zealand, 6–8 December 2011; pp. 407–412. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Savasturk, D.; Froehlich, B.; Schneider, N.; Enzweiler, M.; Franke, U. A Comparison Study on Vehicle Detection in Far Infrared and Regular Images. In Proceedings of the 2015 IEEE 18th International Conference on Intelligent Transportation Systems, Las Palmas, Spain, 15–18 September 2015. [ Google Scholar ]
  • Mockel, S.; Scherer, F.; Schuster, P.F. Multi-sensor obstacle detection on railway tracks. In Proceedings of the IEEE IV2003 Intelligent Vehicles Symposium. Proceedings (Cat. No.03TH8683), Columbus, OH, USA, 9–11 June 2003; pp. 42–46. [ Google Scholar ] [ CrossRef ]
  • Nabati, R.; Qi, H. Radar-Camera Sensor Fusion for Joint Object Detection and distance estimation. arXiv 2020 , arXiv:2009.08428v1. [ Google Scholar ]
  • Radecki, P.; Campbell, M.; Matzen, K. All Weather Perception_ Joint Data Association, Tracking and classification. arXiv 2016 , arXiv:1605.02196v1. [ Google Scholar ]
  • Diaz-Cabrera, M.; Cerri, P.; Medici, P. Robust real-time traffic light detection and distance estimation using a single camera. Expert Syst. Appl. 2015 , 42 , 3911–3923. [ Google Scholar ] [ CrossRef ]
  • Zhou, L.; Deng, Z. LIDAR and vision-based real-time traffic sign detection and recognition algorithm for intelligent vehicle. In Proceedings of the 2014 IEEE 17th International Conference on Intelligent Transportation Systems (ITSC), Qingdao, China, 8–11 October 2014. [ Google Scholar ]
  • Kim, T.; Song, B. Detection and Tracking of Road Barrier Based on Radar and Vision Sensor Fusion. J. Sens. 2016 . [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Im, G.; Kim, M.; Park, J. Parking Line Based SLAM Approach Using AVM/LiDAR Sensor Fusion for Rapid and Accurate Loop Closing and Parking Space Detection. Sensors 2019 , 19 , 4811. [ Google Scholar ] [ CrossRef ] [ Green Version ]
  • Choi, J.; Chang, E.; Yoon, D.; Ryu, S.; Jung, H.; Suhr, J. Sensor Fusion-Based Parking Assist System ; SAE Technical Paper; SAE: Warrendale, PN, USA, 2014. [ Google Scholar ]

Click here to enlarge figure

Criterion / Indexes (i):
  • Range, Resolution, Contrast, Weather: 0 = None; 1 = Very low performance; 2 = Low; 3 = High; 4 = Very high performance
  • Cost: 0 = None; 1 = Very high cost; 2 = High cost; 3 = Low cost; 4 = Very low cost
Sensor / Actual Price / Ideal Price:
  • Ultrasonic sensors: actual USD 16 to USD 40; ideal USD 16 to USD 40
  • Automotive radar: actual USD 50 to USD 220; ideal USD 50 to USD 220
  • Automotive lidar: actual USD 500 to USD 75,000; ideal USD 100 to USD 10,000
  • Automotive mono-camera: actual USD 100 to USD 1000; ideal USD 100 to USD 700
  • Automotive thermal camera: actual USD 700 to USD 3000; ideal USD 100 to USD 1500
[Table: advantages and disadvantages of radar, lidar, ultrasonic, camera, and far-infrared sensors.]
[Table: sensor coverage (radar at short, medium, and long range; ultrasonic; lidar at short, medium, and long range; monocular and stereo cameras; far-infrared) for each application: Adaptive Cruise Control, Forward Collision Avoidance, Road/Traffic Sign Recognition, Traffic Jam Assist, Lane Departure and Lane Keeping Assistance, Blind Spot Monitoring, and Parking Assistance.]

The tables above are from: Mohammed, A.S.; Amamou, A.; Ayevide, F.K.; Kelouwani, S.; Agbossou, K.; Zioui, N. The Perception System of Intelligent Ground Vehicles in All Weather Conditions: A Systematic Literature Review. Sensors 2020, 20, 6532. https://doi.org/10.3390/s20226532





COMMENTS

  1. A review of ultrasonic sensing and machine learning methods to monitor

    The application of ultrasonic sensors for process monitoring is reviewed. ... We include literature from other sectors such as structural health monitoring. This review covers feature extraction, feature selection, algorithm choice, hyperparameter selection, data augmentation, domain adaptation, semi-supervised learning and machine learning ...

  2. Ultrasonic Sensor

    Ultrasonic sensors are generally made up of piezoelectric material, where the ultrasonic transmitter transmits the ultrasonic wave which travels through the air until it reaches an object or person, then the wave is reflected back and received by the ultrasonic receiver. ... The literature review shows that active sensors such as LiDAR are more ...

  3. (PDF) Ultrasonic Sensors

    Introduction: Ultrasonic sensors work by transmitting a pulse of sound, much like sonar detectors, outside the range of human hearing. This pulse travels away from the range finder in a ...

  4. (PDF) A Smart Ultrasonic Radar: Real-Time Object Detection and

    Abstract. Ultrasonic radar, also known as ultrasonic sensors or sonar systems, is a technology that utilizes ultrasonic waves to detect and locate objects. It operates at frequencies beyond the ...

  5. Sensors

    Dear Colleagues, Ultrasonic sensors are widely used in a multitude of applications all around us, starting from proximity sensors in automobiles, to medical and a wide range of industries. This Special Issue aims to highlight advances in the modeling and development of novel ultrasonic sensors with applications in diverse fields.

  6. Sensors for daily life: A review

    Ultrasonic sensors are also employed as the nearness sensor, which sets an impediment to the threshold distance. ... Physical activity monitoring by use of accelerometer-based body-worn sensors in older adults: a systematic literature review of current knowledge and applications. Maturitas, 71 (1) (2012), pp. 13-19.

  7. Application of light detection and ranging and ultrasonic sensors to

    Whilst other reviews 1-7 covered aspects of these sensors in different contexts or with different focus, this literature review describes how ultrasonic and LiDAR sensors applied to high-throughput phenotyping and precision horticulture evolved since the earliest studies and identifies which subjects have gained more attention from ...

  8. High Temperature Ultrasonic Transducers: A Review

    There are many fields such as online monitoring of manufacturing processes, non-destructive testing in nuclear plants, or corrosion rate monitoring techniques of steel pipes in which measurements must be performed at elevated temperatures. For that high temperature ultrasonic transducers are necessary. In the presented paper, a literature review on the main types of such transducers ...

  9. High Temperature Ultrasonic Transducers: A Review

    For that high temperature ultrasonic transducers are necessary. In the presented paper, a literature review on the main types of such transducers, piezoelectric materials, backings, and the bonding techniques of transducers elements suitable for high temperatures, is presented. In this review, the main focus is on ultrasonic transducers with ...

  10. PDF Range Sensors: Ultrasonic Sensors, Kinect, and LiDAR

    A depth camera such as the Kinect sensor provides a depth map as well as the corresponding RGB image in real time. Similarly, a LiDAR sensor provides a collection of distance measurements and the associated gray-scale intensity values. We review four types of range sensors: ultrasonic, structured light (Kinect), time-of-flight (Swiss ranger ...

  11. A Review of Fingerprint Sensors: Mechanism, Characteristics, and

    The earliest ultrasonic fingerprint sensor is a type of ultrasonic probe detection (US5224174A) jointly proposed by Ultrasonic Scan and Niagara and is a probe ultrasonic detection system. The probe emits ultrasonic energy and scans twice from two directions at right angles; after reflection from the finger, the pulse receiver absorbs the reflected ...

  12. Review of Obstacle Detection by Ultrasonic and Laser Sensor for

    Ultrasonic sensors have been used in many projects due to their cost effectiveness, but their obstacle-detection range is much shorter than that of laser sensors, so they can be used only over small and medium distances. ... This paper presents a literature review of ultrasonic and laser sensors for obstacle detection. This study concluded ...

  13. (PDF) Moving Object Detection Using Ultrasonic Radar with Proper

    The components in the figure are ultrasonic sensors [19], [20], Arduino Nano [21], [22], Bluetooth module [23], MPU-6050 3-axis gyroscope and accelerometer [24], [25] as a position sensor ...

  14. An Overview of Autonomous Vehicles Sensors and Their Vulnerability to

    This paper presents a comprehensive literature review of the effects of different weather phenomena on an AV's capability to perceive its surroundings. ... They have the capability to detect objects that are solid, liquid, granular, or in powder form. Ultrasonic sensors rely on sonic transducers to transmit sonic waves in the range of 40 ...

  15. Study on automatic water level detection process using ultrasonic sensor

    In this article, we consider the process of automating the measurement and control of water level, that is, an automated system that indicates when the water in the tank is full or empty. To automate this system, an ultrasonic sensor that meets current requirements was used. The working principle of ultrasonic sensors ...

  16. Sensors

    The ultrasonic sensor is popularly used to measure proximity to obstacles in a very short range and is widely used in areas where distance- and occupancy-related detections are needed. For vehicle applications, the popular uses of ultrasonic sensors include (1) low-speed car parking and (2) high-speed blind spot detection.

  17. Ultrasonic sensor based traffic information acquisition system; a

    Literature review. One of the most critical components of ITS is its accuracy in the acquisition of traffic information such as traffic flow, speed, and density. ... Ultrasonic sensors in Fig. 3 were mounted horizontally, and the measured distance was used to determine whether a vehicle is in lane n or n + 1. This was done by using predefined distances.

  18. (PDF) Ultrasonic Distance Measuring Device Study

    Chapter 2: Review of Related Literature and Studies ... So basically, an ultrasonic sensor sends ultrasonic waves, which travel through air and are reflected after striking any object. By studying the ...

  19. PDF An Ultrasonic Line Follower Robot to Detect Obstacles and Edges for

    LITERATURE REVIEW: This line-following robot, which is a type of AGV (Automated Guided Vehicle), in simple terms an autonomous robot, can detect black or white lines and follow them. Students have been developing this kind of robot for a while. ... when the ultrasonic sensor detects the edge, the robot stops as well.

  20. PDF A Review on Ultrasonic Radar Sensor for Security system

    The ultrasonic sensor cannot detect objects more than three meters away. (Fig. 3: hardware design of the project; Ultrasonic HC-SR04 pin description.) The ultrasonic sensor is placed on a DC motor that rotates over a 360° span. When an object is detected, the DC motor stops moving, and the ultrasonic sensor measures the distance to the object.

  21. CHAPTER TWO LITERATURE REVIEW 2.1 SENSORS/MOTION SENSORS

    Motion sensors are a type of electronic security device that senses movement and usually triggers an alarm. Many types of motion sensors can sense motion in total darkness, without an intruder becoming aware that an alarm has been triggered. ... Ultrasonic sensors generate high frequency ...

  22. (PDF) DISTANCE MEASUREMENT USING ULTRASONIC SENSOR & ARDUINO

    The project is designed to measure distance using ultrasonic waves and is interfaced with an Arduino. We know that the human audible range is 20 Hz to 20 kHz. We can utilize waves beyond this frequency range ...

  23. RADAR SYSTEM USING ARDUINO AND ULTRASONIC SENSOR

    The hardware system consists of three basic components: an Arduino, a servo motor, and an ultrasonic sensor. The ultrasonic sensor is mounted on a servo motor, which helps it to move and provides it a ... (A minimal sweep-and-measure sketch of this idea is given below.)
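
Several of the excerpts above describe the same time-of-flight principle: the sensor emits an ultrasonic pulse, waits for the echo, and computes the distance as the speed of sound multiplied by the round-trip time and divided by two, sometimes sweeping the sensor on a servo to produce a radar-like scan. The Arduino-style C++ sketch below is a minimal, illustrative rendering of that principle only; the pin assignments, the HC-SR04 wiring, and the helper function measureDistanceCm() are assumptions made for this example and are not taken from any of the works listed above.

```cpp
// Minimal sweep-and-measure sketch (illustrative assumptions, not from any cited work).
// Assumes an HC-SR04 ultrasonic module (TRIG on pin 9, ECHO on pin 10) and a
// hobby servo on pin 6; the pin numbers are arbitrary choices for this example.
#include <Servo.h>

const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
const int SERVO_PIN = 6;

Servo sweepServo;

// Returns the distance in centimetres, or -1 if no echo arrives within the timeout.
long measureDistanceCm() {
  // A 10 microsecond trigger pulse makes the module emit its ultrasonic burst.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // pulseIn() measures the echo pulse width in microseconds (the round-trip time).
  unsigned long echoUs = pulseIn(ECHO_PIN, HIGH, 30000UL);  // 30 ms timeout
  if (echoUs == 0) return -1;

  // Distance = speed of sound (~343 m/s, i.e. ~0.0343 cm/us) * time / 2,
  // halved because the pulse travels to the object and back.
  return (long)(echoUs * 0.0343 / 2.0);
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  sweepServo.attach(SERVO_PIN);
  Serial.begin(9600);
}

void loop() {
  // Sweep the sensor across 0-180 degrees and report angle/distance pairs,
  // mimicking the radar-style systems described in the excerpts above.
  for (int angle = 0; angle <= 180; angle += 5) {
    sweepServo.write(angle);
    delay(50);  // give the servo time to settle
    long d = measureDistanceCm();
    Serial.print(angle);
    Serial.print(",");
    Serial.println(d);
  }
}
```

The 343 m/s figure assumes air at roughly 20 °C; because the speed of sound shifts with temperature, the computed distance drifts unless the conversion factor is adjusted.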