Self-driving truck startup TuSimple is reportedly under investigation after it tried to blame a self-driving accident on “human error.” Autonomous vehicle researchers at Carnegie Mellon University say attributing blame to a human is misleading and that common safeguards would have prevented the incident.
In April, an autonomous semi-trailer truck equipped with TuSimple technology was traveling on a freeway in Tucson, Arizona, when it suddenly turned left and hit a concrete barrier, according to dashcam footage leaked to YouTube.
TuSimple blamed the accident on “human error,” but an internal report reviewed by The Wall Street Journal suggests that pinning the crash on one person is an oversimplification.
According to the internal report, the crash occurred because “a person in the cab had not properly restarted the autonomous driving system before activating it, causing it to execute an outdated command,” the Journal reported.
Essentially, the left-turn command was 2.5 minutes old and should have been deleted, but was not.
Autonomous vehicle researchers at Carnegie Mellon University, however, say that blaming a single human is misleading, and that common safeguards would have prevented the incident.
The researchers told the Journal that the truck should not respond to commands that are even a few hundredths of a second old, and that the system should never allow an autonomous truck to turn so sharply while traveling at 65 miles per hour.
“This information shows that the testing they’re doing on public roads is extremely unsafe,” Phil Koopman, an associate professor at Carnegie Mellon University, told the newspaper.
On Tuesday, TuSimple said in a statement, “We take our responsibility to find and resolve all safety issues very seriously,” adding that it responded to the April accident by immediately grounding its entire autonomous fleet and launching an independent review to determine the cause of the incident.
“With the lessons learned from this review in hand, we have upgraded all of our systems with new automated system checks to prevent this type of human error from happening again and have reported the incident to NHTSA and the Arizona Department of Transportation,” the company added.
Nevertheless, the National Highway Traffic Safety Administration (NHTSA) is joining the Federal Motor Carrier Safety Administration (FMCSA) in investigating the San Diego-based company.
The FMCSA said in a letter that it has launched a “safety compliance investigation” into TuSimple, citing the April accident.
TuSimple is not the only self-driving vehicle company being investigated by NHTSA.
The federal agency has launched an investigation into yet another fatal crash involving Tesla’s “fully self-driving” Autopilot system. The latest Tesla crash under federal investigation resulted in three deaths.
In June, the federal investigation into Tesla’s Autopilot function escalated, with NHTSA now examining whether the feature is potentially defective. The agency is studying data on roughly 200 Tesla crashes, stating that “on average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact.”