A New Vision for Driver Assistance Systems

Sensors Insights by Peter Riendeau

Although the numbers have steadily declined in recent years, European countries still report 75 traffic fatalities every day. The use of optoelectronics to provide video for advanced driver assistance systems (ADAS), however, can greatly reduce the likelihood of accidents.

These visibility systems alert drivers in advance to sudden bends in the road or obstacles so that they can quickly take appropriate action. Providing an accurate visual assessment of traffic and road conditions in bright, dark, and rainy conditions, however, is a major challenge for imaging devices because of the limited range of the vehicle's headlamps and the limited dynamic range of conventional imagers.

Video cameras used as the front end of these systems capture the location, shape, size, brightness, and color of objects in the line of sight, depending on the lighting conditions. The systems can either display the captured images on the vehicle's instrument panel, generate warnings, or intervene in the vehicle's operation.

Issues arise, however, when the dynamic range of the scene (i.e., the ratio between the brightest and darkest points in the scene) exceeds the capabilities of conventional CCD or CMOS image sensors, which typically exhibit linear sensitivity. The headlights of oncoming vehicles and low-angle bright sunlight can seriously impair the ability of the camera to detect hazards.
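To make the mismatch concrete, dynamic range can be expressed in decibels as the log ratio of the brightest to the darkest detectable level. The figures below are illustrative assumptions, not measured values: a sunlit road scene with deep shadows is often described as spanning on the order of 1,000,000:1 (120 dB), while a typical linear sensor covers far less.

```python
import math

def dynamic_range_db(brightest, darkest):
    """Dynamic range in decibels: 20*log10 of the intensity ratio."""
    return 20 * math.log10(brightest / darkest)

# Illustrative figures (assumptions): a harsh outdoor scene vs. the
# coverage of a linear sensor with a 1000:1 usable signal range.
scene_dr = dynamic_range_db(1_000_000, 1)   # ~120 dB
sensor_dr = dynamic_range_db(1_000, 1)      # ~60 dB
print(scene_dr - sensor_dr)                  # the shortfall the scene imposes
```

When the scene's range exceeds the sensor's, the difference must be given up either in crushed shadows or in clipped highlights.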

Increasing Dynamic Range
If detailed imaging data cannot be acquired due to limitations of the camera's dynamic range, the safety of the driver, passengers, and pedestrians can be compromised. Increasing the dynamic range of a camera/image sensor markedly improves its ability to identify potential dangers. Applying a nonlinear approach expands the dynamic range of an image sensor and improves its ability to accurately represent scene data under extreme lighting conditions. Such an approach reduces the background noise and ensures that pixel saturation occurs only at much higher signal levels.
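The effect of a nonlinear response can be sketched by comparing a linear pixel, which clips at its full-well capacity, against a logarithmic one that compresses bright input. The constants and the log curve here are illustrative assumptions, not a specific sensor's characteristic:

```python
import math

FULL_WELL = 1000.0  # assumed full-well capacity, arbitrary charge units

def linear_response(light):
    """Linear pixel: output proportional to light until saturation."""
    return min(light, FULL_WELL)

def log_response(light):
    """Illustrative logarithmic pixel: bright input is compressed, so
    saturation is pushed to far higher signal levels."""
    return FULL_WELL * math.log1p(light) / math.log1p(1_000_000)

# Well past the linear pixel's limit, the linear output is clipped flat,
# while the log pixel still distinguishes the two intensities.
print(linear_response(10_000), linear_response(20_000))  # both clipped
print(log_response(10_000) < log_response(20_000))       # True
```

The compressed curve trades fine tonal resolution in the highlights for the ability to register detail at all, which is the relevant trade for hazard detection.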

The emergence of high dynamic range (HDR) imaging devices extends sensitivity in automotive applications. Such devices generally feature high sensitivity across both the visible and near-infrared spectra.

With the addition of multiple-slope CMOS pixel functionality, the image sensors now incorporated into modern automobiles can discharge photon-generated charge at rates that increase in proportion to light intensity. This means that bright pixels begin recording images sooner than with conventional technology and repeat the process several times, with ever-lower recharge voltages and shorter rest periods. The result is a piecewise-linear approximation of a logarithmic curve. Image sensors using this technology offer enhanced performance while still relying on conventional pixel architectures, which keeps them attractive for cost-sensitive automotive applications.
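The multiple-slope idea can be sketched as a piecewise-linear transfer function: each successive segment applies a smaller gain to the light above its knee point, approximating a log curve. The knee positions, slopes, and saturation level below are illustrative assumptions, not parameters of any particular sensor:

```python
# Assumed knee points: (input light level, slope applied above that level).
KNEES = [(0, 1.0), (800, 0.1), (5000, 0.01)]
SATURATION = 2000.0  # assumed output ceiling, arbitrary units

def multislope_response(light):
    """Piecewise-linear pixel response: sum each segment's contribution,
    with successively smaller slopes approximating a logarithmic curve."""
    out = 0.0
    for (start, slope), nxt in zip(KNEES, KNEES[1:] + [(float("inf"), 0.0)]):
        segment = max(0.0, min(light, nxt[0]) - start)
        out += slope * segment
    return min(out, SATURATION)

# Dim input passes through at full gain; bright input is compressed
# but still distinguishable, instead of clipping at the first knee.
print(multislope_response(800), multislope_response(5000))
```

Because each segment is still linear, the scheme needs no exotic pixel design, which is consistent with its appeal for cost-sensitive automotive parts.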

By using this approach, in combination with a proprietary adaptive control algorithm known as Autobrite, Melexis, a manufacturer of microelectronic integrated systems, has been able to meet the latest ADAS requirements, including automatic emergency braking functions. Through the Autobrite algorithm, the sensor provider can adjust the shutter speed and expand the image sensor's dynamic range, ensuring that the incremental signal-to-noise ratio always remains above a minimum threshold.
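Autobrite itself is proprietary and its internals are not published; as a loose illustration only, an adaptive shutter controller of this general kind can be sketched as a proportional feedback loop that nudges exposure toward a target brightness within the sensor's usable range. Every name and constant here is a hypothetical stand-in, not the Melexis algorithm:

```python
def adjust_exposure(exposure_ms, frame_mean, target=0.5, gain=0.5,
                    min_ms=0.05, max_ms=33.0):
    """Generic proportional exposure-control step (NOT Autobrite, whose
    details are proprietary): nudge shutter time toward a target mean
    brightness, clamped to an assumed usable shutter range."""
    error = target - frame_mean  # frame_mean is normalized to [0, 1]
    new_exposure = exposure_ms * (1.0 + gain * error)
    return max(min_ms, min(max_ms, new_exposure))

# Over several frames, a too-bright scene (oncoming headlights) drives
# the exposure down; a dark scene would drive it back up.
e = 10.0
for mean in (0.9, 0.8, 0.6):
    e = adjust_exposure(e, mean)
print(e)  # shorter than the starting 10 ms
```

The clamping matters in practice: without it, a pathological frame could push the shutter outside the range where the sensor's SNR is acceptable.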

Reduced visibility in poor driving conditions increases the risk of accidents significantly because the state of the road can be misjudged or obstacles not detected. Active driver assistance implementations based on HDR image sensors and employing advanced algorithms can enhance visibility in such circumstances. This increases safety in the driving environment.

Peter Riendeau is a Marketing Communications Manager at Melexis. He can be reached at [email protected] or 603-204-2907.