In the Midwest, where I grew up and learned to drive, rain can come down so hard you have to pull over because you can’t see the car in front of you. Bugs can swarm thick enough to plaster your windshield when you drive through them. Snow storms can hit so fast that you need to be confident putting on tire chains while slush freezes your fingers.
Startups in the autonomous vehicle space, like Michigan-based May Mobility, are finding success running heavily geofenced self-driving shuttle fleets on predictable routes, and huge kudos to them; this is no small feat. But as an industry, we’re still a long way from driving a fully autonomous vehicle on a highway in a typical Midwest downpour, snow storm or bug hatch. No autonomous vehicle today can drive in these conditions as well as an experienced human driver. At best, an AV would just pull over. At worst, it might crash. Why? Because the digital eyes of the vehicle, its cameras, LiDAR and other sensors, can’t see when their surfaces are significantly blocked, and we haven’t been able to break past that limitation to teach automated driving systems how to drive better in these conditions.
Do you have a backup camera on your car? Does it malfunction when it gets dirty? This problem only compounds with today’s early autonomous vehicle technology. Highway-capable vehicles with fully automated driving systems carry dozens of pieces of perception hardware, including backup cameras, forward-facing cameras, range-detecting LiDAR and other sensors, that serve as the vehicle’s eyes. Just like your backup camera today, if these perception systems get plastered by bugs, coated by freezing rain, or otherwise occluded by Mother Nature, they stop working reliably.
While I believe AVs have the potential to save lives and improve our quality of life, I also know that self-driving cars, still in their nascent stage, can’t yet operate safely in what many in the industry currently dismiss as “out of scope” edge cases: bugs, mud, freezing rain, snow and similarly complex environmental conditions (and don’t even get me started on tree sap or bird droppings). These messy, unpredictable conditions may be edge cases in California and Arizona, but, as many people living outside these regions know, they’re standard in much of the world. And they pose a safety concern that, to date, no company working on autonomous vehicle technologies has solved, and few have even addressed.
More than eighty cities across the globe are currently piloting autonomous vehicles. When you cross-reference each pilot’s city with the time of year it has taken, or will take, place, one major red flag emerges: we’re only piloting AVs in near-perfect weather.
This is problematic because it means the industry’s current answer to everyday environmental conditions is essentially to avoid them altogether. This practice of limiting autonomous vehicle deployment to sidestep messy, hard-to-predict environmental challenges is what I call “envirofencing.”
Companies working on autonomy must work together to develop vehicles that can operate safely without geographic and environmental fencing constraints. Early use cases like what Voyage has achieved in The Villages, Florida are impressive indicators of the positive impact an entire community can experience when autonomous mobility becomes available. But we need to get beyond sequestering AVs to predictable routes and highly mapped city blocks in near-ideal driving environments. We must get test vehicles on roads with perception systems that keep working when environmental factors strike. Only then can we gather more realistic data to better train our AVs, and test and implement real solutions that will keep self-driving vehicles functioning in all conditions at scale.
Building consumer and community trust in self-driving cars is imperative for scaling to mass adoption. That trust can only be earned if we design and adopt solutions that end envirofencing and enable AVs to see the road ahead in any conditions, so we can sit back and truly enjoy, and trust, the ride.