World Finance and Economic News

Tesla FSD makes terrifying mistake in viral video

Tesla Full Self-Driving (Supervised) is back in the spotlight this week after yet another viral video shows a vehicle using the assisted driving tech doing something a human driver would probably never do under normal circumstances.

On Monday, March 9, a video circulated of a Tesla with FSD engaged driving through a railroad crossing that had deployed its guard arms well before the Tesla arrived. The vehicle drove through the first guard arm at about 20 mph, slowed down on the tracks, and then drove through the second arm as the train approached.

Tesla global deliveries by year:

- 2025: 1.22 million
- 2024: 1.79 million
- 2023: 1.81 million
- 2022: 1.37 million
- 2021: 936K
- 2020: 499K
- 2019: 367K

Source: Statista

It’s not the first time Tesla FSD has gone viral for the wrong reason.

Last September, X user @HinrichsZane shared a video of his Model S traveling in the left lane at about 50 mph with FSD engaged. As an ambulance approached from behind, he filmed the driver's side mirror to see how the Model S would react.

It didn’t react well, seemingly not recognizing the ambulance had the right of way. “I let FSD go as long as I could before I took over and pulled over. Not a hater. FSD is amazing but nowhere near foolproof,” he said.

SAE International (formerly the Society of Automotive Engineers) considers advanced driver assistance systems, such as GM Super Cruise and Tesla Full Self-Driving, to be Level 2 automation, which requires the driver to remain engaged.

Anything Level 3 and above is considered automated driving: the system itself performs the entire driving task under defined conditions. A Level 3 driver must still take over when the system requests it; only Levels 4 and 5 operate without a human fallback.
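The SAE distinction can be summarized as a simple lookup table. The descriptions below paraphrase the J3016 standard; consult the standard itself for the authoritative definitions.

```python
# Summary of SAE J3016 driving-automation levels as a lookup table.
# Descriptions are paraphrased, not the standard's exact wording.

SAE_LEVELS = {
    0: "No automation: the human performs all driving.",
    1: "Driver assistance: steering OR speed support (e.g. adaptive cruise).",
    2: "Partial automation: steering AND speed support; the driver must "
       "supervise at all times (Tesla FSD, GM Super Cruise).",
    3: "Conditional automation: the system drives, but a human must take "
       "over when the system requests it.",
    4: "High automation: no human fallback needed within a limited domain.",
    5: "Full automation: drives everywhere a human could, unsupervised.",
}

def requires_constant_supervision(level: int) -> bool:
    """Levels 0-2 keep the human responsible for monitoring the road."""
    return level <= 2

print(SAE_LEVELS[2])
print(requires_constant_supervision(2))  # True: FSD drivers must stay engaged
```

The dividing line the article draws falls between Levels 2 and 3: below it, the human supervises; above it, the system drives.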

But that’s not what Tesla FSD is designed to do, and Monday’s viral video shows exactly why.

A recent incident involving Tesla Full Self-Driving at a railroad crossing raises serious concerns.

Leong/Washington Post via Getty Images

Tesla FSD ignores the railroad crossing, drives through the barrier

On Monday, Laushi Liu’s dashcam footage of the railroad incident spread across social media from his Threads account.

Liu, with FSD enabled, makes a left turn and cruises towards the railroad crossing in West Covina, Calif., a few hundred feet ahead.

Related: Viral Tesla FSD video shows why human drivers are a big problem

But instead of slowing down as the railroad crossing barriers deploy, the car continues at 25 mph and crashes through the first barrier. It slows almost to a stop on the tracks before driving through the second barrier to safety.

This isn’t the first time a Tesla has failed to stop at a railroad crossing.

Last year, an NBC News investigation found that Tesla’s FSD often fails to stop at train crossings. The news service found 40 instances online since 2023 in which FSD mishandled railroad crossings, including failing to stop.

NBC interviewed six Tesla drivers who had the same complaint, four of whom provided videos.

The National Highway Traffic Safety Administration told NBC that it has been “in communication” with Tesla about the issue.

Tesla did not immediately respond to a request for comment.

Autonomous vehicles are better than human drivers at some things, an insurance analyst says

Waymo, the most active robotaxi operator in the U.S., says its autonomous vehicles have been involved in 90% fewer crashes resulting in serious injuries than comparable human-driven vehicles.

Auto insurance companies have a lot at stake with this new technology. Autonomous vehicles could reshape how insurers price risk, down to the individual vehicle. The question is: Will that raise rates or lower them?

Related: Tesla Robotaxi prices just jumped. Here is what riders pay now

Right now, the industry is in a wait-and-see pattern.

“I don’t think they have the data yet to make that kind of assessment,” David Kidd, vice president for vehicle research at the Insurance Institute for Highway Safety, told Bloomberg when asked which drivers are more likely to crash: autonomous or human.

“Most insurers are extremely conservative, and they rely on historical data to assess risk accurately. There just isn’t enough information available yet.”

Trent Victor, Waymo’s director of safety research and best practices, recently said much the same in an interview. “There is not yet sufficient mileage to make statistical conclusions about fatal crashes alone,” he said.

“As we accumulate more mileage, it will become possible to make statistically significant conclusions on other subsets of data, including fatal crashes as its own category.”

Waymo vehicles have driven approximately 127 million miles across the fleet and have been involved in at least two fatal crashes, MSN reported. However, the autonomous vehicle was not directly found responsible for either of them. Human drivers average about 123 million car miles traveled for every fatality, according to the IIHS.
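For rough context, the figures above can be put on a common per-mile footing. This is a back-of-envelope sketch using only the numbers quoted in this article; as Victor and Kidd caution, two fatal crashes are far too few events to support any statistical conclusion.

```python
# Back-of-envelope comparison of the per-mile fatality figures quoted
# above. Inputs are the article's numbers, not official statistics, and
# a sample of two events is far too small to draw conclusions from.

def miles_per_fatality(total_miles: float, fatalities: int) -> float:
    """Return miles driven per fatal crash."""
    return total_miles / fatalities

WAYMO_MILES = 127_000_000               # fleet miles, per MSN
WAYMO_FATAL_CRASHES = 2                 # crashes Waymo was involved in (not found responsible)
HUMAN_MILES_PER_FATALITY = 123_000_000  # IIHS figure cited above

waymo_rate = miles_per_fatality(WAYMO_MILES, WAYMO_FATAL_CRASHES)
print(f"Waymo: one fatal crash per {waymo_rate:,.0f} miles")
print(f"Human drivers: one fatality per {HUMAN_MILES_PER_FATALITY:,.0f} miles")
```

The raw division says nothing about fault or statistical significance, which is exactly why insurers are waiting for more mileage before repricing risk.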

So how can an AV company prove to IIHS that its vehicles are safer than human drivers?

“It would depend upon the use case,” according to Kidd.

“If a trucking company operates AVs on interstates between two hubs, and they’re able to do that with very infrequent crashes compared to truck drivers, then I would say they provide a substantial safety improvement in that environment. But I wouldn’t generalize to say that means automation is safer across the board. Those assessments need to be done on a case-by-case basis.”

Related: Tesla gets some more good news from a key region

