We still have some kinks to sort out regarding incidents involving self-driving cars, but there’s no doubt they’re the wave of the future. Who is responsible in the event of an accident involving an autonomous vehicle, however, remains an important open question.
Because autonomous vehicles are such a recent development, there is no simple answer to that question yet. That’s not exactly comforting if you were involved in an accident with a self-driving car and weren’t the one behind the wheel. The complexity of assigning culpability needs to be understood before more of these vehicles are allowed on the roads.
Companies making autonomous vehicles often give their features misleading labels
Despite many states’ best efforts, federal regulations for autonomous vehicles are still lacking. The jargon used to describe automated driving systems is one area that is unregulated and could contribute to confusion in the event of an accident.
Manufacturers often give these functions misleading names that can lead drivers to make unsafe decisions. Some companies, including Tesla, use the term “autopilot” to describe systems that still require significant human input. A German court has found that Tesla’s use of the Autopilot name for its driver-assistance system misleads consumers.
A bogus sense of safety
Accidents aren’t the only consequence of unclear names for driver-assist features; they also muddy the question of liability. Is the driver responsible for misinterpreting the feature, or is Tesla at fault for choosing a name that could be construed as misleading?
The German judges who ruled against Tesla’s use of “autopilot” for its feature held the corporation responsible. Numerous incidents have occurred as a result of careless use of Tesla’s Autopilot technology, such as:
- A British driver activated his Tesla’s Autopilot mode, stepped into the passenger seat, and promptly received an 18-month driving ban.
- A Tesla driver was killed when the vehicle, with Autopilot engaged, crashed into a barrier while the driver was distracted by a video game.
- A Tesla operating on Autopilot struck a stopped police vehicle.
It’s important to remember that automated driving systems are not 100% safe.
Related to the trust issues with autonomous driving systems is the problem of a false sense of security. Drivers’ ignorance of what these systems can and cannot do is a major cause of the errors that still occur. According to one study, drivers of a Tesla Model S spent more time looking away from the road when partial automation was activated.