UNITED STATES—The U.S. is leading the global race toward making driverless cars the standard. The most established automakers are developing cars of varying levels of autonomy, so it seems driverless cars are here to stay. But how safe are they?
The safety of driverless cars is a concern, as many have been involved in collisions during beta testing and limited release. San Francisco appears to lead California in autonomous car accidents.
Below are incidents in which driverless cars were involved in accidents in San Francisco and nationwide.
Cruise Driverless Autonomous Car Recalled for Software Update
A regulatory filing with the NHTSA indicates that an autonomous vehicle was recently involved in a road crash in San Francisco. The crash was attributed to a software issue, and as a result, 80 driverless Cruise vehicles were recalled for a software update.
In the June 3 incident, the driverless Cruise vehicle made an unprotected left turn as it approached Laurel Heights and collided with a Toyota Prius.
Cruise investigated the accident, prompting the recall of its 80 vehicles for a software update. However, the company has been progressively reintroducing unprotected left turns to its autonomous vehicles.
The company said it filed the submission voluntarily in the interest of public transparency. Cruise insisted the issue with the prior software version has been addressed and does not affect present on-road operations.
Cruise is committed to improving the predictive performance of its autonomous vehicles. With the software update, its vehicles should be able to avoid a collision in circumstances similar to those at Laurel Heights. The company affirmed that, after the update, the driverless software would select a different path in a comparable situation.
Out of 111 autonomous car crashes in California, 94 took place in San Francisco. The numbers are not surprising, since the city is a leading proving ground for driverless cars and driverless taxi services like Waymo. The June 3 collision is part of this count, and the Cruise vehicle was operating in autonomous mode at the time.
Luckily, the occupants of both vehicles sustained only minor injuries. They were treated at the scene upon the arrival of police and emergency services, and a single individual was taken to the hospital.
Further inquiry into Cruise’s self-driving software showed that hard braking would have forestalled the collision with the oncoming vehicle. The June 3 accident was the first in more than 123,000 unprotected left turns. Nevertheless, self-driving vehicles have suffered other setbacks while in operation.
Vehicles equipped with driving assistance systems are bound to encounter issues, so drivers need to stay alert and be ready to take control whenever needed. Active driving assistance systems can misbehave, particularly in vehicles that combine functions like acceleration, braking, and steering, and such systems can disengage with little or no notice.
Consequently, even a slight distraction from the driver in such a situation could lead to disaster. Self-driving vehicles are not truly driverless, at least for now, so you should never rely too heavily on an autonomous car’s capabilities.
For instance, in 2016, a Tesla driving at full speed down a Florida highway smashed into an 18-wheeler truck crossing its path. The driver of the Tesla later succumbed to his injuries. Investigators found that the car’s Autopilot function failed to brake because the system could not distinguish the truck’s white side against the brightly lit sky.
According to the NHTSA, had the occupant of the Tesla remained alert, they could have applied the brakes and avoided the collision. The occupant, distracted by reliance on Autopilot, was found to be at fault.
Another victim died following a navigation error by a Tesla’s Autopilot. The victim had previously sought repairs for Autopilot, the self-driving feature in Tesla cars. It indicates that safety remains an issue even as cars become more autonomous.
System malfunction is not the only cause of self-driving car accidents. Hackers have also tried to compromise the integrity of autonomous vehicles. In a 2015 incident, hackers remotely took control of a Jeep traveling at 70 mph on a St. Louis highway and eventually forced it to a stop, after using the onboard entertainment system to access the Jeep’s braking and steering.
Fortunately, it was a deliberate exercise conducted as part of autonomous system safety testing. The point was driven home because the driver didn’t know precisely when the planned takeover would occur. The lesson: self-driving features are not entirely proof against cyber-attacks, and real hackers would apply their skills mercilessly, potentially with harmful or deadly results.
Danger of Fire
Cars christened and marketed as driverless run on lithium-ion batteries, which are highly flammable. Lithium fires can produce temperatures of up to 2,000 degrees Celsius, and trying to drench such an inferno with water can cause a hydrogen gas explosion.
A case in point is a self-driving Tesla in Texas that met a fiery fate. The fire killed both occupants and continued to burn for hours. The NTSB is still investigating the accident. Notably, no one was in the driver’s seat, underscoring the importance of human intervention in driverless cars.
Essentially, an autonomous vehicle crash that damages the battery can trigger an unrestrained temperature increase, which can lead to an explosion that releases toxic fumes and projectiles. This is a danger not only to passengers but also to emergency responders.
Incidents in which lithium batteries damaged in a collision led to fires include:
- A 2012 Tesla spontaneously caught fire in West Hollywood, CA, without a collision, though there were no injuries.
- In 2018, a Tesla Model S burned for at least an hour in Fort Lauderdale, FL. The accident claimed two lives, and gallons of water were needed to cool the lithium battery’s hot embers.
While driverless cars are marketed as safe, a great deal of testing is still required before they are ready for the masses. On the other hand, most injuries resulting from driverless car accidents are less severe than those caused by human-operated vehicles.