No, Your Self-driving Car Can't Really Drive Itself
Despite advances in semi-autonomous technology, drivers still need to keep their hands on the wheel—at least for now.
With the growing reach and marketing of semi-autonomous driving technologies such as Tesla’s Autopilot, General Motors’ Super Cruise, Ford’s new BlueCruise, and Nissan’s ProPilot Assist, drivers might think we’ve entered the 1950s sci-fi world of self-driving automobiles.
It is quite remarkable what these systems can do, using cameras, sensors, and GPS to offer truly hands-free driving on designated roads in the right conditions. But a considerable number of crashes involving semi-automated cars, some of them fatal, suggests that the technology, and our ability to depend on it fully for safe motoring, is not quite there, at least not yet.
Auto safety experts have established a five-stage model (the Society of Automotive Engineers, or SAE, uses a more rigorous six-level model) outlining the technological steps from entirely human-guided driving to 100% autonomous driving. We are currently at Level 2, partial automation, meaning cars can handle some hands-free driving duties but drivers are still required to keep their eyes on the road at all times. Fully automated driving, experts say, could still be decades away.
Humans Are Still the Biggest Operators of Self-Driving Cars
Alexandra Mueller, a research scientist with the Insurance Institute for Highway Safety (IIHS), recently conducted a study that confirms the heightened sense of overconfidence—and perhaps confusion—many owners have about the abilities of their cars’ semi-autonomous systems.
Around 600 owners of vehicles equipped with Tesla’s Autopilot, GM’s Super Cruise, or Nissan’s ProPilot Assist were surveyed about their driving habits. Despite media reports about accidents involving Tesla’s system, the IIHS research found that 53% of Super Cruise users felt entirely comfortable treating their cars as self-driving, compared with 42% of Autopilot users and 12% of ProPilot Assist drivers. These drivers reported that they often ate, texted, or engaged in other dangerous activities, even picking up dropped food or phones, rather than focusing on their vehicle’s path and speed.
“These systems give consumers the illusion that the cars are more in control than they really are,” Mueller said. “They are incredibly sophisticated, but they’re not self-driving, whatever a word like ‘automated’ might imply. Drivers still need to be active supervisors of the technology, and be ready and able to intervene. Many users did not really understand their roles and responsibilities…Understanding capabilities is not the same as understanding the limits.”
Mueller cited a lack of education about both the advantages and the limits of the various systems. She encourages clearer advertising and a stronger emphasis on driver training, as well as visual, auditory, or other behind-the-wheel feedback to make sure drivers remain engaged.
The IIHS is working on a safeguards ratings system to formalize the basic necessities of early self-driving cars. The institute hopes to work with carmakers on better standards, as well as better communication and education for drivers about their cars’ capabilities, or lack thereof.
“Our goal is to target design and the philosophies that automakers consider when creating the systems to minimize risk,” Mueller said. “We currently don’t know if these technologies have a real safety benefit as no data exists yet.”
Technology Is Advancing Rapidly But Is Not Quite There Yet
For the truly automated driving experience we’ve seen in television shows and movies (think KITT from TV’s “Knight Rider”), today’s technology will need to improve exponentially. The biggest challenge is the artificial intelligence (AI) required to replicate the many calculations human drivers make on the road, whether they are aware of them or not.
Swedish researchers have suggested the era of fully automated Level 5 self-driving automobiles is still decades away, requiring advanced AI and infrastructure improvements, not to mention the ethical and legal issues created by truly autonomous automobiles.
Retailers, working with tech and automotive companies, continue to make impressive strides with increasingly self-driving trucks. But seamlessly integrating those sorts of pilot programs with the more than 280 million cars currently on the road in the United States will still require vast changes in technology.
Smaller countries such as Sweden, with less than 10% of the U.S. population and only 13% of its more than 4 million miles of roads and highways, have legally conducted tests of semi-automated vehicles since 2018.
The leap to a world of fully self-driving cars also intersects with broad challenges such as adapting thousands of miles of highways to include “smart road” sensors, and upgrading both the power grid and existing wireless technology to support a nation full of autonomous automobiles.
Legalities and the Rise of Self-Driving Technology
As with many aspects of emerging technology, it only takes a few high-profile accidents to get the attention of agencies or legislators, some of whom might not be well-versed in self-driving tech. That poses a significant roadblock for the advent of fully automated driving.
In January 2023, the National Highway Traffic Safety Administration (NHTSA) asked Tesla to recall certain 2016 through 2023 model-year vehicles fitted with the company’s Full Self-Driving (FSD) Beta. Tesla said in early February 2023 that it would comply with the recall request.
The recall followed a July 2022 investigation into a crash in which a 2021 Tesla Model Y struck and killed a motorcyclist. NHTSA has conducted nearly 40 investigations into crashes, including more than a dozen deaths, related to Tesla vehicles and their use of semi-autonomous driving systems.
In one case, the driver of a Tesla using the Autopilot system was charged in January 2022 in the deaths of two motorists killed when his car ran a red light in 2019. He was one of the first people in the U.S. to face a felony charge related to the use of semi-autonomous driving systems. After its investigation, NHTSA concluded that the FSD feature “led to an unreasonable risk to motor vehicle safety based on insufficient adherence to traffic safety laws,” but also stated that drivers, too, are responsible for the operation of their automobiles at all times.
The recall will include an over-the-air software update at no cost to Tesla owners. Tesla said later in February 2023 that it would temporarily halt the rollout of its FSD Beta feature.
Current semi-autonomous systems are a convenient way to assist a driver during a daily commute or while parking, but they’re no substitute for human involvement in driving, nor are they likely to be for many years to come.