Tesla self-driving car crashes into fire engine in US

In The Washington Post: the resounding failure of far-right billionaire Elon Musk’s self-driving car.

Some excerpts:

The school bus was flashing its stop warning lights when Tillman Mitchell, 17, got off one afternoon in March 2023. Then a Tesla Model Y approached on Highway 561, a North Carolina state road. The car – allegedly in Autopilot mode – never slowed down.

Struck at about 75 km/h, Mitchell was thrown into the windshield, flew through the air and landed face down on the road, according to his great-aunt, Dorothy Lynch. Mitchell’s father heard the accident and ran from his porch to find his son lying in the middle of the road. “If it had been a smaller child,” said Lynch, “he would be dead.”

That Halifax County crash was one of 736 crashes in the U.S. since 2019 involving Teslas in Autopilot mode — far more than previously reported, according to a Washington Post analysis of data from the National Highway Traffic Safety Administration (NHTSA). The number has risen over the past four years, the data shows, reflecting the dangers associated with the increased use of self-driving technology by Tesla, billionaire Elon Musk’s automaker.

When authorities first released a partial account of crashes involving Autopilot in June 2022, they counted just three deaths linked to the technology. The latest figures include at least 17 fatal incidents, 11 of them since May 2022, and five serious injuries. (…)

Mitchell’s family considers the accident an error born of overreliance on technology, what experts call “automation complacency”. But when asked about Musk, the young man’s relatives were tougher: “I think they need to ban autonomous driving. It should be banned.”

According to Musk, Tesla’s self-driving cars are safer than those piloted by humans. He has pushed the automaker to develop features to maneuver on city streets and handle stopped school buses, fire engines, traffic lights and crosswalks – arguing that the technology will usher in an accident-free future. While it is impossible to say how many accidents automation has prevented, the data shows clear flaws in the technology.

The 17 fatal Tesla crashes reveal distinct patterns, the Washington Post found: four involved a motorcycle, and another involved an emergency vehicle. Meanwhile, some of Musk’s decisions — such as expanding the availability of features and removing radar sensors from vehicles — appear to have contributed to the reported increase in incidents, according to experts who spoke with the Post. Tesla and Elon Musk did not respond to the newspaper’s requests for comment.

The National Highway Traffic Safety Administration (NHTSA) does not comment on ongoing investigations. “NHTSA reminds the public that all advanced driver assistance systems require the human driver to be in control and fully involved in the task of driving at all times. Consequently, all state laws hold the human driver responsible for the operation of their vehicles.” (…)

Former NHTSA senior safety adviser Missy Cummings, a professor at the College of Engineering at George Mason University, says the increase in Tesla crashes is alarming. “Tesla is having more serious – and fatal – accidents than average,” she said of the Post’s survey. One likely cause, she said, is the launch of Full Self-Driving, which brings driver assistance to urban streets. “Is it reasonable to expect that this is increasing accidents? Sure, sure.”

Elon Musk at a Tesla event in Shanghai in January 2020

It is unclear whether the data captures all crashes involving Tesla’s driver assistance systems. The NHTSA data includes some incidents where it is “unknown” whether Autopilot or Full Self-Driving was in use. That includes three deaths, including one in 2022.

The data was collected after the federal government required automakers to disclose crashes involving driver assistance technology. The number of crashes involving the technology is low compared to all road incidents; NHTSA estimates that more than 40,000 people died in car crashes in the US in 2022. Since the reporting requirement took effect, the vast majority of automation-related crashes have involved Teslas.

Introduced in 2014, Tesla’s Autopilot is a set of features that allows the car to maneuver from a highway’s entrance ramp to its exit ramp, maintaining speed and distance behind other vehicles and following the lane lines. Tesla includes it as a standard feature, and more than 800,000 of its vehicles are equipped with Autopilot.

Full Self-Driving is an experimental feature that customers must purchase, allowing Teslas to maneuver from point A to point B by following turn-by-turn instructions along a route, stopping at traffic lights, turning, changing lanes and responding to hazards along the way. With either system, Tesla says drivers must monitor the road and intervene when necessary.

The spike in collisions coincides with the rollout of Tesla’s Full Self-Driving, which expanded from around 12,000 users to nearly 400,000 in just over a year. Nearly two-thirds of all driver assistance crashes that Tesla reported to the NHTSA occurred in 2022. In February 2023, Tesla recalled more than 360,000 vehicles equipped with Full Self-Driving – due to concerns that the software could lead its vehicles to disobey traffic lights, stop signs and speed limits. (…)

NHTSA has opened several investigations into Tesla’s crashes and other issues with its driver assistance software. One focused on “phantom braking,” a phenomenon in which vehicles abruptly slow down in response to imagined hazards. In 2022, a Tesla Model S suddenly braked on the San Francisco Bay Bridge, resulting in an eight-vehicle pileup that left nine people injured.

Also last year, the NHTSA opened two investigations into fatal crashes involving Teslas and motorcyclists. One occurred in Utah, when a motorcyclist on a Harley-Davidson was traveling on a highway outside Salt Lake City shortly after 1 am, according to authorities. A Tesla hit the motorcycle from behind, killing the rider.

Source: https://www.diariodocentrodomundo.com.br/17-mortes-e-736-acidentes-o-fracasso-retumbante-do-carro-autonomo-de-elon-musk/
