Emerging Tech For Autonomous Vehicles: Thermal Stereo Sensing

Thursday, October 7, 2021

#Management Consulting    #Automotive    #innovation

Over the past five years, there has been a series of high-profile cases of vehicles misinterpreting the driving environment while using Advanced Driver-Assistance Systems (ADAS). Most recently, in July, algorithms inside Tesla’s Full Self-Driving (FSD) system initiated slowdowns for the yellowish moon because sensors were mistakenly interpreting the distant orb as a traffic signal. Another blockbuster-yet-slightly-humorous story came when a second Tesla vehicle crashed within a week, the latter accident involving a parked police vehicle (March 2021). Or sometimes, as in the cases of Tesla vehicles in Delray Beach (2019) and Williston (2016), the vision system’s failure to detect a semi-trailer resulted in two separate, unfunny crashes that killed both drivers. In fact, in response to multiple such cases, the National Highway Traffic Safety Administration (NHTSA) reported in March that it was launching a Special Crash Investigation. And, yes, all of these incidents are rare, and Tesla gets more press than others with similar incidents (e.g., Uber’s 2018 fatal crash in Tempe), but these news stories quietly highlight that the standard fidelity of vision-based systems (e.g., cameras) might warrant additional improvements to meet the Safety of the Intended Functionality (SOTIF) and support the predicted self-driving market of $173B by 2023.

Roadways become easier to detect when heat sensing is added to existing vision systems. The image above, captured by a thermal imaging system, shows a barrier-free parking lot adjacent to a high-volume road.

 

With that as the backdrop: long ago (circa the late twentieth century), infrared (IR) sensors were considered, and occasionally included, to augment ADAS systems or autonomous research vehicles because they helped in scenarios where vision systems struggled: nighttime darkness, extreme weather (e.g., fog, snow), and/or poor reflectivity. Adding heat sensing to the long-range, forward-looking vision would allow greater perception. However, a recurring design-slash-operational issue made such sensors less desirable: significant calibration set-up. Believe it or not, for a long while such systems required a separate manufactured board decorated with a checkerboard pattern and either light bulbs or heat-emitting wires so that both the color cameras and the infrared sensors could adjust their base calibrations. So not only was the original design expensive operationally, but subsequent recalibration was occasionally needed regardless of how rigidly the platform was designed. As a result, many manufacturers moved away from IR augmentations. In fact, earlier this year Musk announced that Tesla would move to a “pure vision” approach and “… believes a vision-only system is ultimately all that is needed for full autonomy.”

 

However, accidents continue to occur. “We have seen quite a few accidents where a car slammed into what we call ‘non-classified objects,’” states Doron Cohadier, Vice President of Business Development for Foresight Automotive, “which includes animals, unusual vehicles, etc., and harsh weather conditions such as glare, snow, fog and severe rain.” And so poor visibility for the system becomes unwanted visibility in the media.

Hence, companies have reexamined the possibility of adding IR back into the array of forward-facing sensors. “Using two synchronized or ‘stereo’ thermal cameras provides a pixelized depth map of the environment with the ability to detect any object – both classified objects, using Deep Neural Networks (DNN), and non-classified objects using stereo technology under harsh weather conditions,” explains Cohadier.
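The depth map Cohadier describes rests on standard stereo geometry: for a rectified camera pair, an object's distance follows from how far its image shifts between the two sensors. Below is a minimal illustrative sketch of that relationship, not Foresight's actual pipeline; the rig parameters (0.5 m baseline, 1000 px focal length) are assumptions chosen for the example.

```python
# Stereo depth recovery sketch: for a rectified stereo pair (thermal or
# visible-light), depth follows Z = f * B / d, where f is the focal length
# in pixels, B the baseline between cameras in meters, and d the disparity
# (pixel shift of the same object between the two images).

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Return depth in meters for one pixel's disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: a hypothetical thermal stereo rig with a 0.5 m baseline and a
# 1000 px focal length observes an object at 10 px disparity.
print(depth_from_disparity(10, 1000, 0.5))  # 50.0 meters
```

Because the disparity calculation works on whatever pixels the sensors produce, it applies equally to heat signatures, which is why unclassified obstacles can still be ranged even when no neural network recognizes them.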

“Such technology is passive as opposed to competing technologies which are active and emit energy to ‘hit’ the object and get a reflection. With a whole bunch of active sensors at a given intersection, there’s a lot of noise generated affecting the ability to read the sensor signals. The thermal stereo sensor isn’t sending out energy into the environment; rather it’s absorbing what’s out there and is, therefore, less prone to mutual interferences between sensors.”

To overcome the earlier operational issues, the addition of automatic calibration eliminates the expensive and difficult set-up and maintenance. As explained by Foresight’s Director of Product Development, Izac Assia, “The two cameras are pointing towards the distance and recognizing if the relative pose estimation has changed. This is happening dynamically in order to maintain the self-calibration and create an accurate depth map.”
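The idea Assia outlines can be reduced to a simple control loop: continuously estimate the relative pose between the two cameras and refresh the stored calibration whenever the estimate drifts beyond a tolerance. The sketch below is a hypothetical illustration of that loop under assumed names and units (pose as rotation plus translation parameters, tolerance `tol`); it is not Foresight's implementation.

```python
# Illustrative self-calibration loop: compare a freshly estimated relative
# pose between the two cameras against the stored calibration and adopt the
# new estimate only when drift exceeds a tolerance. Pose is modeled here as
# six parameters (rx, ry, rz in radians; tx, ty, tz in meters).

def pose_drift(estimated_pose, calibrated_pose):
    """Sum of absolute per-parameter differences between two poses."""
    return sum(abs(e - c) for e, c in zip(estimated_pose, calibrated_pose))

def maybe_recalibrate(current_calib, new_estimate, tol=1e-3):
    """Refresh the calibration only if the estimated pose has drifted."""
    if pose_drift(new_estimate, current_calib) > tol:
        return new_estimate   # drift detected: update the depth-map calibration
    return current_calib      # within tolerance: keep the existing calibration

calib = (0.0, 0.0, 0.0, 0.5, 0.0, 0.0)    # stored pose: 0.5 m baseline along x
drift = (0.0, 0.002, 0.0, 0.5, 0.0, 0.0)  # small rotation drift observed
print(maybe_recalibrate(calib, drift))    # adopts the drifted estimate
```

Running this check continuously is what replaces the old checkerboard-and-heat-wire boards: the scene itself, viewed at distance, becomes the calibration target.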

In the end, such self-calibrating thermal cameras work in tandem to convert even difficult visual environments into high-resolution, dense, 3D thermal point clouds, allowing safety systems to accurately understand what’s out there.

Maybe we all need to recalibrate on the technological solution.

 

This article was originally published by Steve Tengler (STEVE.TENGLER@KUGLERMAAG.COM) on Forbes.com on September 30, 2021

LET'S TALK

Do you need to improve your automotive product development, increase efficiency, or comply with ASPICE and Functional Safety? You are in the right place.
