Tesla drivers are paying a lot to test flawed self-driving software

Owners of newer Tesla models are driving vehicles running imperfect code, and risking an accident in the process.

On a recent Saturday evening in South Carolina, Ward Mundy gripped the wheel of his Tesla Model S firmly as his wife yelled. The car had swerved on the road.

“It’s the most dangerous ride you can imagine when you turn on Autopilot,” said Mundy, a lawyer and auto enthusiast. He managed to get the Model S back under control, but that wasn’t the only time he has had to rely on quick reflexes to avoid an accident. “You can be driving at 50 mph and the radar detects [an approaching] object and brakes immediately.”

Sometimes the car doesn’t respond when it should. “The other extreme is that you approach a light with a stopped car ahead and the Tesla absolutely does not apply the brakes,” Mundy says.

Autopilot is the umbrella term for Tesla’s self-driving technologies. The hardware and software suite is intended to adjust the car’s speed to traffic conditions, keep the car in its lane or change lanes automatically (known as Autosteer), park the car in a nearby spot, or let the vehicle be summoned from the garage. Elon Musk has said that later this year, an Autopilot-powered Tesla will drive from Los Angeles to New York without a human touching the wheel.

To Mundy, that day seems a long way off. “It was quite a frightening experience,” he said. “If you were in the car with my wife, you would know how many times she yelled to turn it off.”

Mundy’s recent experience with autonomous driving is echoed by other Tesla owners on online forums and in YouTube videos showing swerving cars and near misses. Since November, Tesla has been installing new Autopilot hardware and software in every vehicle coming off the production line, rolling out self-driving capabilities quickly, before they are fully tested. While many Tesla drivers seem willing to play along, the company’s strategy worries some of them.

The stakes are high for Tesla as it bets on this aggressive approach to testing. Owners who want to enable their car’s Autopilot features pay thousands of dollars extra; if drivers choose not to participate, the company loses a way to recoup its costs. But with a growing record of unexpected swerves, fishtailing, and other miscalculations, Tesla risks harming not only its reputation, but also the lives and safety of some of its biggest fans.

Everything looked far rosier last March, when Elon Musk took the stage at Tesla’s design center in Hawthorne, California, and promised a car with full self-driving technology for just $35,000 (though activating it would require an extra fee). Tesla’s Model 3 would be the company’s big leap forward: a stylish, speedy electric car smart enough to let you read a book on your commute, or hire out its robot driver for a few bucks.

The company’s semi-autonomous driving system, which relies on radar, cameras, and ultrasonic sensors, had performed well. And Musk’s decision to skip lidar, the powerful but expensive laser-emitting sensor found in most competing self-driving systems, seemed to be paying off.

First introduced in October 2015, Autopilot immediately caught the attention of owners who loved the ability to drive on the highway without taking the wheel. In fact, they were so enthusiastic that Musk soon promised to limit Autopilot’s hands-free operation to “minimize people doing crazy things with it.”

Sadly, the restrictions didn’t work for Joshua Brown. In May 2016, the 40-year-old Model S owner activated Autopilot while driving on a highway in rural Florida. The car failed to detect a truck turning left in front of it and plowed into its underside, killing Brown. A reconstruction by the National Highway Traffic Safety Administration (NHTSA) indicated that Brown should have seen the truck at least seven seconds before the crash, suggesting he was not paying attention to the road ahead.
