The National Transportation Safety Board (NTSB) has determined that a March 1 fatal crash involving a Tesla Model 3 and a semi-truck was caused by the Tesla driver’s improper use of the semi-autonomous “Autopilot” feature.
The accident was strikingly similar to a 2016 crash involving another Tesla on Autopilot. In both cases, the Autopilot failed to detect the large truck, and the car went under the trailer, shearing off its roof and killing the driver.
The NTSB and the National Highway Traffic Safety Administration (NHTSA) have raised questions about the safety of the Autopilot system. Although Tesla says that the system is only semi-autonomous and requires constant supervision by an alert driver, drivers have repeatedly failed to grasp that Tesla's "Autopilot" does not mean what the word "autopilot" has traditionally meant. These drivers apparently assumed that the vehicle could be driven completely hands-free.
In the March 1 crash, according to the NTSB, the driver did not engage the Autopilot feature at all until approximately 10 seconds before the crash, and his hands were not detected on the steering wheel for the final 8 seconds.
During that time, neither the Autopilot nor the driver braked or attempted to avoid hitting the semi-truck. The Tesla was traveling at 68 mph in a 55-mph zone when the crash occurred and the driver, 50, was killed.
Is the Tesla Autopilot system a defective product?
A former head of NHTSA commented that he was surprised the agency hadn't declared the Tesla Autopilot defective and demanded a recall after the 2016 crash.
“Their system cannot literally see the broad side of an 18-wheeler on the highway,” he told the Associated Press.
In its report on the 2016 crash, the NTSB cited design limitations of the Autopilot as playing a major role. It determined that the Model S Autopilot is only safe to use on limited-access highways like interstates. In both the March 1 and 2016 crashes, the Autopilot was used on a divided highway with turn lanes in the median.
The Autopilot system either needs to be more autonomous or needs to warn drivers who aren’t paying close enough attention to react before a crash. Some other semi-autonomous driving systems like GM’s Super Cruise system warn drivers more quickly, but they are also intended for use only on limited-access highways.
The repeat crashes cast some doubt on whether Tesla can — or should — introduce fully autonomous vehicles sometime next year, as currently planned. The company announced last month that it had developed an artificial intelligence system that can safely navigate roadways — but it uses the same cameras and sensors that existing Tesla vehicles use.
“Tesla has for too long been using human drivers as guinea pigs. This is tragically what happens,” said the former head of NHTSA.
Tesla said it was saddened by the March 1 crash but asserted that drivers have safely traveled over a billion miles using the Autopilot feature.
That may be true, but each and every crash that causes injuries or death can be a life-changing catastrophe for those involved. After a crash, no one is comforted to know that they are a statistical outlier. They want to know that the issues leading to their crash will never happen again.
If you have been injured by a Tesla vehicle on Autopilot, discuss your situation with an experienced personal injury attorney.