
Elon Musk claims fatal crash was not on ‘Full Self-Driving Beta’ after Tesla said the logs were lost


Elon Musk has claimed that a fatal crash reported by The Washington Post yesterday was not on ‘Full Self-Driving (FSD) Beta’ after Tesla told the police that they couldn’t confirm it because the logs were lost.

Now, he goes as far as claiming that FSD Beta would have saved the Tesla employee.

Yesterday, we reported on a horrible accident involving a Tesla employee, Hans von Ohain, and his friend Erik Rossiter.

The duo was returning from a day of golfing outside Denver when von Ohain’s Tesla Model 3 crashed into a tree. Rossiter was able to exit the vehicle as it was catching on fire, but unfortunately, von Ohain was stuck in it as a tree was blocking the driver’s door. He died inside the vehicle.

The crash happened almost two years ago, but it was only reported now after The Washington Post obtained the police investigation that came following the crash. The publication talked to the officer leading the investigation, Rossiter, the only witness, and Von Ohain’s loved ones.

The cause of the crash was clear: von Ohain was intoxicated. An autopsy found that he died with a blood alcohol level of 0.26 — more than three times the legal limit. But the police also wanted to investigate whether an advanced driver-assist feature was a potential factor, as Rossiter told the first responder right away that the driver was using an “auto drive feature on the Tesla.”

Rossiter said that Von Ohain was using “Full Self-Driving” on the way to golf and back. His family also said that he was an avid user of the feature.

In fact, Nora Bass, his wife, said that he used it almost every time, and she herself didn’t use it because she was uncomfortable with the feature:

Von Ohain used Full Self-Driving nearly every time he got behind the wheel, Bass said, placing him among legions of Tesla boosters heeding Musk’s call to generate data and build the technology’s mastery. While Bass refused to use the feature herself — she said its unpredictability stressed her out — her husband was so confident in all it promised that he even used it with their baby in the car.

Everything points to Von Ohain having and using FSD Beta, but Tesla said it couldn’t confirm it through the logs.

The police didn’t have access to the logs because the car was completely destroyed by fire, and Tesla says that the car didn’t transmit them over-the-air at the time of the crash:

Colorado police were unable to access data from the car because of the intensity of the fire, according to the investigation report, and Tesla said it could not confirm that a driver-assistance system had been in use because it “did not receive data over-the-air for this incident.” Madden said the remote location may have hindered communications.

Again, that came out of the police investigation conducted over the nearly two years since the crash.

Now that it has become public, Tesla CEO Elon Musk has claimed that the car didn’t even have FSD Beta:

Interestingly, Tesla even reported the crash to NHTSA, telling the agency that “a driver-assistance feature had been in use at least 30 seconds before impact”:

However, Tesla did report the crash to the National Highway Traffic Safety Administration. According to NHTSA, Tesla received notification of the crash through an unspecified “complaint” and alerted federal authorities that a driver-assistance feature had been in use at least 30 seconds before impact.

Tesla didn’t specify which feature, and NHTSA couldn’t confirm it either.

Electrek’s Take

Again, as I wrote yesterday, there’s no doubt that the driver is responsible for this crash, and it was a bad decision to get behind the wheel after drinking.

However, I think it’s still important to point out the issue of people becoming overconfident with Tesla’s Autopilot and FSD Beta features. We have seen plenty of accidents, intoxication or not, that appear to stem from drivers not paying enough attention because they put too much faith in what is only a package of level 2 driver-assist features, despite the package’s name.

Now, as for this particular case and whether or not FSD Beta or Autopilot was involved, I see a lot of red flags. The family has been clear. The only witness has been clear, and Tesla has had almost two years of collaborating with the police. It couldn’t tell the police whether FSD Beta was involved or not, and yet, Elon can now tell the public it wasn’t. Red flag.

Also, Tesla told the police that the logs were lost, but it could tell NHTSA that “a driver-assistance feature had been in use at least 30 seconds before impact.” Red flag.

Tesla is required to report crashes involving ADAS features.

Again, don’t drink and drive. That’s obvious. But also, don’t believe that Tesla’s FSD package is going to help you drive while paying less attention to the road or being impaired. It won’t. I might be able to concede that driving with FSD Beta is safer than without as long as you are being as or more vigilant than you would be not using the feature.

If you are not as vigilant because you see this as some kind of crutch, it is more dangerous than driving without it.

I know that when I talk about FSD Beta to people who don’t know much about it, one of the first things that often comes up is, “Oh cool, now the car can drive you home when you are drunk”. That’s a thought that Tesla needs to squash and it doesn’t help when Elon goes out there saying that FSD Beta would have “probably” avoided that accident.
