Six Things You Don't Know About Self-Driving Cars
If you look past the hype surrounding autonomous vehicles, the future of self-driving cars looks a lot more pragmatic.
Every few weeks, an automaker or automotive startup announces that it's gotten a few steps closer to self-driving cars. While initial deployment timelines keep slipping as companies grapple with the genuine difficulty of building a system that works on our increasingly complex roadways, some have begun to offer robo-taxi rides in certain cities.
The excitement around self-driving vehicles is understandable, but there are issues that many companies don't mention. Some will require more advanced technology before they can be addressed, while others can be fixed with something as simple as a really powerful stream of water.
There Are No Self-Driving Cars for Sale Today
Hopefully you already know this one, but it’s such a common and frightening misunderstanding that we feel obligated to mention it: You cannot buy a self-driving car today. Tesla sells cars with “Full Self-Driving” capability, but it warns drivers that they must keep their hands on the steering wheel when using the system. General Motors’ Super Cruise and Ford’s BlueCruise are capable of hands-free driving, but they require a driver who is paying attention to the road and ready to take over at any time. A true self-driving car is capable of safely navigating roads while the passengers sleep or watch movies. If a technology requires an alert human in the driver’s seat, it’s merely a driver-assistance system.
The automotive and autonomous-vehicle industries generally adhere to the guidelines established by SAE International (formerly the Society of Automotive Engineers), which define six levels of driver assistance and autonomy. At Level 0, the driver controls all aspects of the vehicle. Level 1 covers vehicles with cruise control or adaptive cruise control without lane-keeping assistance. A Level 2 vehicle assists with braking, acceleration, and steering in certain driving conditions. As at Levels 0 and 1, the driver is still responsible for all aspects of driving in a Level 2 car and needs to pay attention to the road. Most passenger vehicles on sale today are at Level 2 or below.
Level 3 systems must be able to take on all tasks of driving in certain circumstances and environments, but a driver must be prepared to take over. A vehicle can only be considered self-driving or autonomous beyond this point—when the human can be entirely removed from the task of driving. Level 4 vehicles will negotiate roads and traffic on their own, but only in certain circumstances or conditions, such as exclusively on the highway or within a pre-mapped urban area. A Level 5 system removes the boundaries and conditions that limit Level 4 vehicles. These self-driving cars would be able to drive anywhere a human could without human assistance, but some experts believe this kind of autonomy may never be achieved.
The First True Self-Driving Cars Will Be Taxis and Big Rigs
It’s fun to think about a self-driving car in your driveway ready to whisk you away to work every morning while you catch up on emails or binge-watch the latest popular TV series. Sadly, that’s further away than you might think, and it’s more likely that you’ll need to hire a taxi to enjoy that experience.
While automakers like Tesla and Mercedes are working hard to bring autonomous vehicles to the public, companies like Waymo and Cruise are focusing on robo-taxi services, and others are targeting long-haul trucking.
It’s much easier for a computer to navigate an interstate than a town or urban area. This is why the driver assistance systems in current vehicles work best on the highway. Trucking companies are eager to use self-driving technology to remove humans from their trucks, which would allow them to keep their cargo moving around the clock.
Self-Driving Systems Will Have Remote Driver Backup Systems
Anything out of the ordinary encountered on the road is known as an “edge case.” For example, maybe an autonomous vehicle will come upon a delivery van that has spilled its cargo of teddy bears and is blocking the road. The driver of the delivery van could be waving traffic into the opposite lane without concern for his now-destroyed cargo. For a human, the act of driving over stuffed animals into the wrong lane seems simple. For an autonomous vehicle, it’s chaos. The self-driving car may see the smiling teddy bears as living animals and refuse to move regardless of the permission offered by the delivery driver. That’s where a remote driver comes in.
A confused robo-taxi would ping a sort of call center, and a human would assess the situation and either plot a route or, in some cases, even remotely take control of the vehicle. The former is more likely and has been implemented by General Motors’ Cruise ride-hailing service in San Francisco and Alphabet’s Waymo One program in Phoenix.
Bugs (the Real Kind) Will Be a Huge Problem
Bugs, mud, slush, dust, and ice will impede the visibility of the sensors on vehicles just like they do on a car’s windshield. This is especially true of the cameras that will be critical to an autonomous vehicle understanding its environment. A sensor is only as good as what it can see, and the problem gets even stickier if gooey crickets have become caked onto the eyes of the vehicle.
To combat the issue, automakers have devised various cleaning systems. Subaru places high-pressure sprayers in the wiper zone of the windshield, while Tesla has filed a patent for a laser system that blasts the windshield and cameras clean. Ford has even built a system that creates an air curtain around the sensor, directing bugs around it as they approach.
Another important takeaway from the inevitable deluge of goo on sensors is that autonomous vehicles will need many different types of sensors, such as radar and lidar, in addition to cameras. They’ll all need to be cleaned in one way or another, but sensor redundancy will reduce the chances of a vehicle being disabled by pesky road elements that clog cameras.
Machine Learning Can't Learn Everything
With machine learning, a computer system is fed a lot of data, and as it parses all that information, it “learns” to react to its surroundings or stimuli. In the world of autonomous driving, machine learning is tasked with figuring out everything it might encounter on the road. Throw thousands of photos and videos of dogs at a system, and it’ll figure out that the corgi it sees on the side of the road is a canine.
That’s a simplified explanation, but at its core, that’s how an autonomous driving system learns to interpret the world.
The issue is that while we can increase the power of these computers to recognize the world around them, there will always be gaps. There will always be something that the car’s computer doesn’t quite understand. A human with decades of knowledge about the world outside of a vehicle can easily and quickly make decisions about a scenario they’ve never experienced before. Self-driving vehicles often can’t adapt when they encounter something new, which means crashes will still happen and humans will remain involved in driving even after autonomous cars arrive.
Collisions Are Going to Be a Legal Quagmire
If a self-driving vehicle makes a mistake (and there will be mistakes), who is responsible for a collision? This is where the lawyers come in—because, of course, there will be lawyers. Some automakers, like Volvo, Mercedes, and Audi, have stated that they will take responsibility if something goes wrong with a truly autonomous vehicle.
The situation becomes unclear when an automaker like Tesla offers settings in its driver-assistance system that make the vehicle more assertive on the road. A lawyer may ask: is it Tesla’s fault if the vehicle is set to drive more aggressively, or is it the driver’s for selecting the setting? When automakers do take responsibility, it creates different issues. Will these companies need to secure insurance for the vehicles they sell? If they do, will the cost be passed down to the consumer?
The autonomous vehicle industry has long promised fewer road mishaps and deaths. Exactly how safe these vehicles might be remains unproven, and even computers are imperfect. Right now, there is likely a gaggle of corporate lawyers out there trying to figure out what to make of this whole situation.