Commentary

Man Who Triggered Devastating Eight-Car Pileup Says His Tesla Did Something Strange Right Before Impact


A man who was apparently at the front of an eight-car pileup in San Francisco said that the automatic braking in his Tesla’s “full self-driving” mode caused the accident, which sent two people to the hospital.

The driver involved in the Nov. 24 accident, which occurred around 12:40 p.m. on San Francisco’s Bay Bridge, claimed that his Tesla Model S was in “full self-driving” mode and that the software malfunctioned, causing the crash, according to the U.K. Daily Mail.

According to CNN Business, the California Highway Patrol report on the accident indicated that the man blamed his electric vehicle, saying the crash happened as he was changing lanes and slowing to a stop.

Despite the man’s claim in the police report, the CHP said in its Dec. 7 report that it had not confirmed whether the Tesla was in “full self-driving” mode at the time of the crash.

Four ambulances were called to the scene, and two people, including a child, were taken to a hospital. Another 16 people were treated and released at the scene, the Daily Mail added.


The mess shut down two lanes of traffic on Interstate 80 for about 90 minutes, authorities said.

The accident occurred only hours after Elon Musk told Tesla drivers that any of them could download and use the “full self-driving” software. Previously, the software was available only to drivers with high marks on Tesla’s safety rating system.

CNN noted that Tesla’s “full self-driving” is “designed to keep up with traffic, steer in the lane and abide by traffic signals. It requires an attentive human driver prepared to take full control of the car at any moment. It’s delighted some drivers but also alarmed others with its limitations. Drivers are warned by Tesla when they install ‘full self-driving’ that it ‘may do the wrong thing at the worst time.’”

According to the police report, the Tesla was traveling at about 55 mph while moving into the far-left lane when it suddenly braked, slowing to 20 mph. The abrupt maneuver set off a chain-reaction crash.


The National Highway Traffic Safety Administration is set to investigate the accident. The NHTSA has also said it is investigating a long list of self-driving tech failures.

Dozens of complaints allege that Tesla’s self-driving technology brakes “randomly” and “without warning.” Drivers have claimed the tech has caused near-collisions.

Indeed, the agency has upgraded its investigation to an engineering analysis, a step indicating that the problem with the technology is serious and might necessitate a recall.

The California Department of Motor Vehicles has also soured on Tesla’s self-driving software. The department accused “Tesla of deceptive practices in marketing [its] autopilot and full self-driving” software, CNBC reported in August.

The department alleged that the carmaker leads owners to think that the software will allow the car to operate as an “autonomous vehicle, but vehicles equipped with those ADAS features could not at the time of those advertisements, and cannot now, operate as autonomous vehicles.”


Teslas have been involved in a number of accidents, including one in Southern California in 2021 that reportedly involved the company’s Autopilot software. A 35-year-old man was killed when his Model 3 struck an overturned semi on the freeway.

In January 2020, The Associated Press reported on three deaths from three separate Autopilot-related accidents, all of which cast doubt on the software.

And this year, it was reported that 11 people died in just a four-month period in crashes involving electric vehicles using automated driving systems. Also this year, a Florida jury found Tesla negligent in a 2018 accident that left two teens dead and awarded the families $10.5 million.

We don’t know yet exactly what happened during the Thanksgiving accident. But we do know that Autopilot and self-driving modes have been implicated in multiple crashes, and it appears the technology is simply not ready for prime time.


Warner Todd Huston has been writing editorials and news since 2001 but started his writing career penning articles about U.S. history back in the early 1990s. Huston has appeared on Fox News, Fox Business Network, CNN and several local Chicago news programs to discuss the issues of the day. Additionally, he is a regular guest on radio programs from coast to coast. Huston has also been a Breitbart News contributor since 2009. Warner works out of the Chicago area, a place he calls a "target-rich environment" for political news. Follow him on Truth Social at @WarnerToddHuston.



