ABSTRACT
Over the past two decades, advanced driver assistance systems (ADAS) have contributed to reducing road accidents and fatalities. These onboard safety functions rely primarily on exteroceptive vision sensors. To improve the performance and robustness of ADAS, new sensor technologies such as neuromorphic, polarized, and HDR cameras have been developed. Whether used individually or in combination, these sensors enable real-time perception of road scenes, which is essential for the development of new automated mobility solutions. This article provides an overview of these embedded vision sensor technologies, their applications, limitations, and future prospects.
AUTHORS
-
Dominique GRUYER: Research Director - Former Director of LIVIC (Laboratory on Vehicle-Infrastructure-Driver Interactions) - Director of the ICCAM International Associated Laboratory - Assistant to the COSYS Department Director for Automated and Connected Vehicles - COSYS-PICS-L, Gustave Eiffel University, Versailles, France
-
Sio-Song IENG: Researcher - COSYS-PICS-L, Gustave Eiffel University, Champs-sur-Marne, France
INTRODUCTION
Over the past few decades, efforts to improve road safety and reduce the number of road deaths have borne fruit. Thanks to a combination of factors, including technological advances, the number of annual deaths has fallen significantly, reaching a plateau of around 3,000 victims in France. A significant part of this improvement can be attributed to advanced driver assistance systems (ADAS). These systems, which include active features such as emergency braking and informative features such as lane departure warning, rely primarily on the integration of sophisticated sensors. They monitor the conditions outside the vehicle (exteroceptive sensors) as well as the condition of the vehicle itself (proprioceptive sensors), improving overall perception of the road environment and the vehicle's responsiveness in critical situations.
Among the most commonly used exteroceptive sensors in ADAS are ultrasonic sensors, cameras, radars, and, more recently, technologies such as LiDAR (Light Detection and Ranging) and infrared (IR) cameras. LiDAR enables extremely accurate three-dimensional modeling of the environment in real time, while IR cameras are particularly useful in low-visibility conditions, such as night driving or foggy weather. At the same time, new technologies are emerging to enhance these perception systems. These include neuromorphic cameras, inspired by the biology of the human retina, which capture changes in the scene as asynchronous events; polarized cameras, which improve visibility in certain complex lighting conditions; and HDR (High Dynamic Range) cameras, capable of capturing images with an extended dynamic range, enabling them to better handle high contrasts between light and shadow.
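For context, a sensor's dynamic range is commonly expressed in decibels from the ratio between the brightest and darkest usable signal levels; conventional CMOS sensors reach roughly 60 to 70 dB, while automotive HDR sensors target 120 dB or more. The short sketch below illustrates the standard formula (the numeric values are typical orders of magnitude, not figures from this article):

```python
import math

def dynamic_range_db(i_max: float, i_min: float) -> float:
    """Dynamic range in decibels: 20 * log10 of the intensity ratio."""
    return 20.0 * math.log10(i_max / i_min)

# An HDR sensor resolving a 10^6 : 1 intensity ratio reaches 120 dB,
# versus about 60 dB for a 10^3 : 1 ratio on a conventional sensor.
print(round(dynamic_range_db(1e6, 1.0)))  # 120
print(round(dynamic_range_db(1e3, 1.0)))  # 60
```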
These sensors, whether used alone or in combination in data fusion systems, have revolutionized automotive safety and paved the way for a new era of driver assistance and automated driving. They enable a more detailed and comprehensive understanding of the road environment, resulting in an increased ability to prevent accidents and respond to unexpected events. On a practical level, perception technologies, particularly those involving onboard vision, now offer the ability to assess in real time the essential attributes of the five main actors in a road scene: obstacles, the road, the vehicle itself (ego-vehicle), the environment, and the driver. This makes it possible to generate local dynamic perception maps (CPDL, after the French acronym) containing critical information for performing various functions, such as lane tracking, lane change assistance, automatic emergency braking, adaptive cruise control (ACC), Stop&Go, automatic parking maneuvers, and monitoring the driver's condition, particularly in the event of drowsiness or distraction.
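As a rough illustration, such a local dynamic perception map can be thought of as a time-stamped container aggregating estimates for the actors of the road scene, which downstream functions (emergency braking, ACC, and so on) then query. The sketch below is a hypothetical minimal structure; all names, fields, and the simple time-to-collision rule are illustrative assumptions, not the architecture described in this article:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Obstacle:
    # Position and velocity expressed in the ego-vehicle frame (m, m/s)
    x: float
    y: float
    vx: float
    vy: float
    kind: str = "unknown"  # e.g. "car", "pedestrian"

@dataclass
class PerceptionMap:
    """Minimal local dynamic perception map (illustrative sketch only)."""
    timestamp: float                           # seconds
    obstacles: List[Obstacle] = field(default_factory=list)
    lane_offset_m: float = 0.0                 # lateral offset to lane centre (road)
    ego_speed_mps: float = 0.0                 # ego-vehicle state
    visibility_m: float = 250.0                # environment estimate
    driver_alert: bool = True                  # driver-monitoring flag

    def time_to_collision(self) -> float:
        """Smallest time-to-collision over obstacles ahead that are closing in."""
        ttc = float("inf")
        for ob in self.obstacles:
            if ob.x > 0 and ob.vx < 0:         # ahead of ego and approaching
                ttc = min(ttc, ob.x / -ob.vx)
        return ttc

# Example: one vehicle 40 m ahead, closing at 8 m/s -> TTC of 5 s,
# the kind of quantity an emergency-braking function would monitor.
pm = PerceptionMap(timestamp=0.0, ego_speed_mps=25.0,
                   obstacles=[Obstacle(x=40.0, y=0.0, vx=-8.0, vy=0.0, kind="car")])
print(pm.time_to_collision())  # 5.0
```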
The continuous evolution of optical...
KEYWORDS
environment perception | vision sensors | automated mobility | Advanced Driving Assistance Systems (ADAS)
In-vehicle vision