Survey: 40-50% Of Us Trust Our ‘Self-Driving’ Vehicles From Tesla And Cadillac Far Too Much
When Tesla sells a Full Self-Driving package for $15,000 US, people seem to trust the name and use the technology accordingly, according to a new survey, even if they only have the less capable Autopilot system. More surprisingly, Cadillac owners are even more likely to trust their cars’ Super Cruise feature.
According to a report released today by the Insurance Institute for Highway Safety, 53% of Cadillac Super Cruise drivers, 42% of Tesla Autopilot users, and 12% of Nissan ProPilot users are just fine treating their cars’ driver assistance technologies as actually fully capable self-driving systems. But many of these same drivers have been locked out of the advanced self-driving systems due to lack of attention: 40% of Autopilot and Super Cruise drivers reported their systems had been switched off at some point automatically, after their cars decided they were not paying enough attention.
That’s even though they have to frequently take control themselves when the software fails.
“Many of these drivers said they had experiences where they had to suddenly take over the driving because the automation did something unexpected, sometimes while they were doing something they were not supposed to,” IIHS Research Scientist Alexandra Mueller, the report’s lead author, said in a statement.
I’ve never been locked out of Autopilot by my Tesla Model Y, but I have had situations where I’ve needed to disable Autopilot and take control. Construction zones with unclear signage and lane markers are a common culprit.
One of the core problems, according to the study, is people doing non-driving activities that require attention.
Especially Tesla and Cadillac drivers.
“Super Cruise and Autopilot users are more likely than ProPilot users to do things that involve taking their hands off the wheel or their eyes off the road,” the IIHS said today. “They’re also more likely than ProPilot users to say they can do nondriving activities better and more often while using their partial automation systems.”
One example: eating while driving.
While one-handedly snacking on an apple while driving is one thing, chowing down on a Double Cheese from Wendy’s is another. That sometimes requires two hands, meaning zero hands are on the wheel. And while long-time car eaters might control their steering wheel with a knee, that’s obviously not as safe. With Autopilot, Super Cruise, or ProPilot, however, your car automatically stays in its lane and maintains a following distance from the car ahead. The technology might be safer than the knee, but it’s not infallible.
“Track tests and real-world crashes have provided ample evidence that today’s partial automation systems struggle to recognize and react to many common driving situations and road features,” the IIHS says.
In my personal experience, a bigger danger than doing things that take some of your attention off the road is the complacency that sets in when you start to trust a partially self-driving vehicle to do the right thing.
In a smaller earlier study on Volvo’s driver assistance technology, Pilot Assist, the IIHS found that complacency effect to be significant.
“Drivers were more than twice as likely to show signs of disengagement after a month of using Pilot Assist compared with the beginning of the study,” IIHS Senior Research Scientist Ian Reagan said at the time. “Compared with driving manually, they were more than 12 times as likely to take both hands off the wheel after they’d gotten used to how the lane centering worked.”
The upshot: we need better ways of ensuring that drivers are actually paying attention. Or, of course, actual full self-driving capability.
That, however, seems to be a ways off yet.