The advent of Advanced Driver Assistance Systems (ADAS) and Autonomous Vehicles (AV) will revolutionize how we travel and transport goods, providing society with greater freedom of movement. Simultaneously, this technology will improve safety and reduce accident rates by removing human error and the ever-increasing threat of distracted driving. With more than 1 million casualties a year from car accidents, and human error the predominant cause, it's easy to see why the industry is racing to create this revolutionary technology.
At its core, AV technology means humans will be replaced as the decision makers behind the wheel, but many technical challenges remain to ensure safety in all conditions and driving scenarios. Meanwhile, automotive engineers are hard at work developing a suite of ADAS sensors so that functions such as automatic emergency braking (AEB) are effective anytime, anywhere. To achieve this, engineers and designers recognize that no one-size-fits-all sensor can provide safe driving. Rather, a suite of complementary and orthogonal sensors can optimize driving performance by providing critical information and redundancy to ensure safety at all times. With multiple sensor technologies and reliable computer intelligence evolving at an unprecedented pace for automotive, we should all strive to ensure a self-driving car is safer than a human driver.
The Need for Thermal Sensors
The Society of Automotive Engineers (SAE) defines automation level 3 as vehicles able to detect the environment around them and make informed decisions for themselves. Above this, with automation levels 4 and 5, vehicles will require little to no human attention. Mass adoption of automation level 3 and above is dependent on affordable sensor technologies, the compute power required to process the incoming sensor data, and the artificial intelligence needed to execute driving commands that deliver safe and reliable transportation in real-world conditions. Today, ADAS and AV sensor technologies include cameras, ultrasonic sensors, radar, and light detection and ranging (LIDAR).
OEMs and numerous automotive-related companies are now looking for higher levels of safety and expanded operational domains than these existing sensors can provide. Thermal imaging is now in the spotlight because it adds an imaging system redundant to the visible camera, improving capability across a variety of driving scenarios.
Challenging lighting conditions, nighttime driving, blinding sun glare, cluttered urban environments, and inclement weather (e.g., rain, fog, and snow) will remain difficult for self-driving cars unless thermal cameras are used. Thermal imaging takes advantage of the fact that all objects emit thermal energy, eliminating the reliance on a visible illumination source (to a thermal camera, daytime and nighttime driving look relatively the same). Thermal, or long-wave infrared (LWIR), energy is emitted, reflected, and transmitted by everything on or near a roadway. Utilizing this additional band of the electromagnetic spectrum will push safety beyond what a human with just two eyes and ears can achieve.
Thermal sensors can detect and classify objects in darkness, through sun glare, and through most fog at distances more than four times farther than typical headlights illuminate, depending on the lens employed and the corresponding field of view. Thermal cameras are especially adept at distinguishing living things, such as a human body, from inanimate objects and background clutter, making them an essential technology for detecting pedestrians, pets, and wild animals. FLIR thermal imaging cameras are extremely sensitive to temperature differences as small as 0.05 degrees Celsius. With this precise sensitivity, VGA thermal cameras (640 x 512 pixels) can clearly show nearly everything in a scene, particularly living beings, the objects drivers absolutely do not want to hit. This makes thermal imaging a critical technology for reducing pedestrian deaths; of the 5,987 pedestrian deaths in the U.S. in 2016, 75 percent occurred at night.
Visible and thermal cameras combined will help ensure that autonomous cars are safer than cars driven by humans. This complementary sensor technology, paired with the existing ADAS and AV sensor suite, will help these systems make better, safer decisions through improved situational awareness. If a robotaxi without thermal imaging shows up at your door at night, would you step into the driverless car? I wouldn't.
Lowering Cost and Leading a Developing Market
A common misconception is that thermal sensors, with their background in military use, are too expensive for automotive integration. Until recently, thermal cameras with VGA or higher resolution cost thousands of dollars each, which prevented the automotive market from considering them for mass adoption. Thanks to advances in thermal imaging technology (improved manufacturing techniques and significantly increased manufacturing volume), it is now possible to mass-produce affordable thermal sensors for SAE automation Level 2 and higher at just a few hundred dollars each.
The significant components of a thermal camera include the sensor (a microbolometer), lens, electronics, and enclosure. Manufacturing inputs include the silicon wafer, foundry costs, and yield. Fundamentally, the fabrication of thermal imaging sensors is similar to that of other silicon computing hardware. Per-sensor cost is calculated by dividing total costs by the number of sellable chips.
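The per-sensor cost arithmetic can be sketched as follows; the wafer cost, die count, and yield figures below are hypothetical placeholders for illustration, not actual FLIR numbers:

```python
def per_sensor_cost(wafer_cost_usd, dies_per_wafer, yield_fraction):
    """Per-sensor cost: total wafer cost divided by the number of good dies."""
    good_dies = dies_per_wafer * yield_fraction
    return wafer_cost_usd / good_dies

# Hypothetical inputs for illustration only:
cost = per_sensor_cost(wafer_cost_usd=3000.0, dies_per_wafer=150, yield_fraction=0.8)
print(f"${cost:.2f} per sensor")  # prints "$25.00 per sensor"
```

The formula makes the cost levers obvious: shrink the die (more dies per wafer), improve yield, or amortize fixed costs over greater volume.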
Unlike visible cameras, infrared image sensors cannot follow Moore's Law. There is a limit to how small an infrared sensor's pixels can be made, because performance degrades as the pixel shrinks toward the wavelength of interest (LWIR sensors detect radiation at wavelengths of 8 to 14 microns).
Pixel sizes in visible cameras are often 1 to 3 microns, whereas thermal camera pixels cannot be much smaller than the wavelength of light they sense (8 to 14 microns). Therefore, thermal cameras will be more expensive than visible cameras of the same resolution. However, thermal cameras do not need multiple megapixels of resolution to provide excellent classification performance, because the contrast they provide is exceptional.
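A rough sketch of why pixel pitch dominates cost: compare the active die area of a VGA (640 x 512) sensor at an LWIR pitch versus a visible pitch. The 1.4-micron visible pitch below is an assumed typical value, not a figure from the article:

```python
def die_area_mm2(width_px, height_px, pitch_um):
    """Active sensor area in mm^2 for a given resolution and pixel pitch."""
    return (width_px * pitch_um / 1000.0) * (height_px * pitch_um / 1000.0)

lwir = die_area_mm2(640, 512, 12.0)    # 12 um LWIR pixel pitch
visible = die_area_mm2(640, 512, 1.4)  # assumed 1.4 um visible pixel pitch
print(f"LWIR: {lwir:.1f} mm^2, visible: {visible:.2f} mm^2, "
      f"ratio: {lwir / visible:.0f}x")
```

At the same resolution, the LWIR die is roughly 70 times larger, so far fewer dies fit on a wafer, which is why LWIR sensors cannot simply ride visible-sensor cost curves.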
During the past ten years, FLIR has reduced the pixel geometries of LWIR thermal cameras from 50 x 50 microns to 12 x 12 microns, a roughly 94 percent reduction in area (144 square microns versus 2,500). This, combined with wafer-level packaging, process optimization, and increased scale, has enabled FLIR to achieve the lowest costs in the thermal-image-sensor market (see Figure 3 below).
Ongoing improvements in pixel design and yield at FLIR, coupled with the anticipated significant increase in production volumes, promise to lower costs even further. Based on current development plans, a further twofold cost reduction is forecast over the next several years. This compares favorably with the tenfold cost reduction LIDAR systems require to meet OEM cost targets.
Adoption of thermal cameras in significant volumes for SAE automation Levels 2 and 3 will likely start in 2022 or 2023, with annual growth rates of 200 to 300 percent through 2030. With the planned improvements and automotive manufacturing scale, thermal cameras will be an affordable component of ADAS and AV sensor suites, even for large-volume OEM use.
Leveraging Data and Machine Learning to Promote Industry-Wide Integration
In the rapidly evolving field of autonomous driving technologies, data is key to training and deploying functional hardware that will enable vehicles to navigate in a variety of conditions. Experienced OEMs and dozens of technology companies are racing to outfit fleets of vehicles with sensors to collect the necessary data to train for various object classifiers and to test their respective systems.
To facilitate more streamlined integration of thermal imaging, FLIR launched an automotive development kit in early 2018. In addition to the FLIR ADK hardware, FLIR provides developers with a free starter dataset of more than 14,000 annotated thermal images. Developers can become familiar with thermal imagery and begin training their ADAS and AV computer systems to perform classification and analytics on thermal data.
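A minimal sketch of exploring such a dataset, assuming COCO-style JSON annotations; the file path and field names below are assumptions for illustration, not a documented FLIR ADK format:

```python
import json
from collections import Counter

def class_counts(annotation_path):
    """Count annotated objects per category in a COCO-style annotation file."""
    with open(annotation_path) as f:
        coco = json.load(f)
    id_to_name = {c["id"]: c["name"] for c in coco["categories"]}
    return Counter(id_to_name[a["category_id"]] for a in coco["annotations"])

# Hypothetical usage (path is a placeholder):
# print(class_counts("thermal_train/annotations.json"))
```

A quick per-class census like this is a common first step before training, since class imbalance (e.g., many cars, few bicycles) shapes sampling and augmentation choices.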
The automotive market will transform drastically in the coming years and decades, and the imminent technological shift will be greater than the adoption of the cell phone. The ADAS and AV market is still in an early development phase, and it is difficult to predict exactly when all this effort will significantly change how we travel. It is clear, however, that thermal sensors will be needed for higher-performing ADAS and AV platforms as the automotive industry moves to SAE automation Levels 3 (conditional), 4 (high), and 5 (full). With the addition of thermal cameras for ADAS and AV, the world will be a safer place, with fewer pedestrian and animal deaths, fewer car crashes, and expanded capability for Level 3-5 automobiles.
To learn more about thermal technology for ADAS and AV platforms, visit FLIR ADAS.
About the author
Chris Posch has 15+ years of experience at FLIR Systems creating thermal imaging cameras for numerous markets. He is currently the Director of Engineering for Automotive Products at FLIR where he is focused on working with automotive engineers and developers to deploy and test thermal sensors in ADAS and autonomous vehicle platforms.
Prior to directing engineering, Chris was the Director of Application Engineering, where he worked across the FLIR OEM thermal camera product line, running a team that interfaced directly with integrators to develop new and innovative OEM products with FLIR cameras. He holds a B.S. from UC Irvine, an M.S. from Oregon Science and Engineering, and a PMP certificate from UC Santa Cruz.