On June 30, 2016, Tesla reported the first fatal accident involving its self-driving feature. Joshua Brown, a 40-year-old former Navy SEAL, was killed when his Tesla Model S collided with a tractor-trailer that crossed his path. Brown was using the self-driving feature at the moment of the collision, so the National Highway Traffic Safety Administration (NHTSA) will examine all cars equipped with the same software as the one involved in the accident. The objective of the probe is to determine whether the program works as expected.
“This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations,” reads a post on Tesla’s blog.
Josh Brown was a known technology enthusiast who openly supported Tesla Motors. In fact, he posted a video showing the self-driving feature reacting properly on the highway, possibly preventing an accident. In its touching note, titled "A Tragic Loss," the Tesla team describes Brown as a friend of the company and sends its condolences to his family.
There is a lot of unnecessary speculation about the future of smart cars
This may well be the first fatal traffic accident involving a smart car, and many people feel it will affect not only the future of Elon Musk's company but that of self-driving cars in general. However, there are plenty of reasons to think otherwise.
Yes, federal regulators will carefully probe all cars carrying the self-driving software, but it is crucial to understand that this is not an extraordinary action; it is routine. After an accident, the authorities open an investigation to understand what happened. In this case, however, the technology involved is in a public beta test, which means they have to be particularly meticulous.
Nonetheless, bad publicity is a major threat that Tesla Motors, and smart-car makers in general, face here. Sometimes, especially when a case involves a person's death, people quickly jump to conclusions, forgetting, or not wanting to see, the facts. Tesla's Autopilot logged more than 130 million miles before its first fatal accident, while, according to the figures Tesla cites, there is a fatality every 94 million miles among all vehicles in the US and roughly every 60 million miles worldwide.
Moreover, there is a widespread misconception about how the software works, which can be attributed to the terms used to describe it, such as "self-driving feature" or "Autopilot." In fact, it works more like an assistant, and the company makes sure every customer understands this before they can activate it.
Tesla disables Autopilot by default, and when users activate it, a series of notifications informs them that it is a new technology still being tested and, more importantly, that they must be ready to take control of the car at any moment. To enforce this, sensors check that the driver's hands are on the steering wheel; if they are not, the software decelerates until the car stops, unless the hands return to the wheel. This means that not only did the software fail to see the tractor-trailer, but so did the driver.
In the post, the company explains that the white side of the trailer may have blended with a brightly lit sky, making it impossible for both the machine and the man to see the obstacle until it was too late. The Tesla team adds that if the car had struck the front or rear of the trailer instead, things would have been very different.
Source: Tesla Motors