If you didn't know, a Tesla driver was recently killed in a crash while using Tesla's Autopilot feature. The disaster sparked an investigation into what the driver was doing right before the crash, as well as into the driving feature itself.
The driver killed in Silicon Valley on March 23, 2018 was 38-year-old Apple engineer and game developer Walter Huang. Huang was driving his Tesla Model X in Autopilot mode when it collided with a safety barrier and was then struck by two more cars. Before the crash, Huang had reported issues with his Tesla's Autopilot feature to his family and a close friend, but it was later discovered that he never reported them to Tesla itself.
The National Transportation Safety Board (NTSB) conducted an investigation into the crash, as Huang's family is attempting to sue Tesla over what it says was a malfunctioning vehicle that ultimately killed Huang. The NTSB investigation was extensive and took quite some time to produce results. The NTSB found that Huang's vehicle was travelling at 70mph when it crashed into the safety barrier, and that Huang had experienced the same "glitch" in the same spot on the road multiple times before his death.
On the previous occasions that Huang experienced this "glitch" in Autopilot mode, he corrected the vehicle himself. This evidence was acquired by the NTSB through the SD card that is inside every Model X. The NTSB also found out through Apple that Huang had been issued two smartphone test devices. These devices carried extensive logging/tracking software so that Apple could record things such as excessive memory/power leaks in certain apps.
The NTSB acquired the logs from Huang's smartphones and found that Huang was an avid player of the mobile game Three Kingdoms. The logs detailed that Huang was playing the game at precisely 9:06AM, just 21 minutes before the crash. Another log was found just 17 minutes before the crash, and this one showed that the game Three Kingdoms was "extremely active".
The NTSB said, "The game is a world-building, strategy game with multi-player capability. When playing the game on a mobile device such as an iPhone 8 Plus, most players have both hands on the phone to support the device and manipulate game actions." The board also stated, "the log data does not provide enough information to ascertain whether the Tesla driver was holding the phone or how interactive he was with the game at the time of the crash."
U.S. Senator Edward Markey criticised Tesla's Autopilot feature, calling the name of the feature "misleading". Tesla responded to the senator, saying the following: "First, throughout the purchase, user and ownership experience, Tesla clearly explains that Autopilot is not an autonomous system and does not make our vehicles autonomous. Autopilot is an advanced driver-assistance system ("ADAS") that is representative of SAE International Level 2 automation ("SAE L2"). As such, Autopilot is only designed to assist the driver in performing the driving tasks of steering, acceleration, deceleration, and lane changes."
Tesla's statement continued, "The driver must continually monitor the driving environment and be prepared to immediately take over the vehicle controls as necessary. The driver is forced to participate through steering wheel detection. When used properly, Autopilot can greatly enhance occupant safety, but, as an SAE L2 ADAS system, the driver is ultimately responsible for the safe operation of his vehicle."
NTSB chairman Robert Sumwalt said, "If you own a car with partial automation, you do not own a self-driving car. So don't pretend that you do. This means that when driving in the supposed self-driving mode you can't sleep. You can't read a book. You can't watch a movie or TV show. You can't text. And you can't play video games. Yet that's precisely what we found that this driver was doing."
Ultimately, the NTSB said that the Huang crash was extremely similar to other Tesla crashes, but that the main issue is not with Tesla itself but with how humans relate to automation. Robert Molloy, director of the NTSB's office of highway safety, discussed "automation complacency", saying, "This is not a unique problem to Tesla; this is a problem related to automation and how people work with automation."
Tesla has also said that, judging by its internal data, the Autopilot feature has made drivers crash less frequently than when driving manually. It seems that since Tesla is pioneering the electric vehicle (EV) industry with features such as Autopilot, it is the one taking the blame for the complacency of drivers who are not yet used to semi-automation. As the EV industry develops and matures, humans will no doubt become better adapted to these technologies.
In other news about Tesla, a woman was caught keying the doors of a Tesla by the vehicle's Sentry Mode. The owners were then able to report the woman to the police.
Last updated: Apr 6, 2020 at 04:35 pm CDT