“Irresponsibility — by carmaker Tesla and by a Tesla driver — contributed to a deadly crash in California in 2018, federal investigators say,” writes Camila Domonoske in her NPR article titled “Tesla Driver Was Playing Game Before Deadly Crash. But Tesla Software Failed, Too.”
According to the article, “The driver appears to have been playing a game on a smartphone immediately before his semi-autonomous 2017 Model X accelerated into a concrete barrier. Distracted by his phone, he did not intervene to steer his car back toward safety and was killed in the fiery wreck.”
“But Tesla should have anticipated that drivers would misuse its ‘autopilot’ feature like this and should build in more safeguards to prevent deadly crashes,” says Domonoske.
“That's according to the National Transportation Safety Board, which spent nearly two years investigating the crash,” Domonoske continues.
The article explains, “Tesla's advanced driver assistance software is called ‘Autopilot.’ That suggests the car can steer autonomously, but the system is limited and drivers are supposed to pay attention so they can take control from the car if necessary.”
“When driving in the supposed self-driving mode, you can't sleep. You can't read a book. You can't watch a movie or TV show. You can't text. And you can't play video games,” Robert L. Sumwalt, chairman of the NTSB, said Tuesday, the article reports.
That’s a tragic accident.
What are your thoughts?
Should Tesla have more safeguards in place for a driver-assistance feature called ‘Autopilot’?
Whatever your opinion, we’ll be the first to tell you: you need a methodology to anticipate the actions of your customers.
The quick answer: you want to keep them safe.
And your goal is to protect your property from liability.
So, how do you do that?
You employ Proactive Operations to make your property’s operation, well, proactive.