U.S. National Highway Traffic Safety Administration Investigates Tesla Autopilot Problems in 765,000 Vehicles


The U.S. government has launched a formal investigation into Tesla’s partially automated Autopilot driving system after a series of collisions with parked emergency vehicles.

The investigation covers 765,000 vehicles – almost everything Tesla has sold in the United States since the start of the 2014 model year.

In the crashes identified by the National Highway Traffic Safety Administration as part of the investigation, 17 people were injured and one was killed.

NHTSA says it has identified 11 crashes since 2018 in which Teslas on Autopilot or Traffic Aware Cruise Control struck vehicles at scenes where first responders had used flashing lights, flares, an illuminated arrow board or cones to warn of hazards. The agency announced the action on Monday.

The investigation is another sign that NHTSA under President Joe Biden is taking a stronger stance on automated vehicle safety than it did under previous administrations. Previously, the agency was reluctant to regulate the new technology for fear of hampering adoption of potentially life-saving systems.

The investigation covers Tesla’s entire current model lineup – the Models Y, X, S and 3 from the 2014 through 2021 model years.

The National Transportation Safety Board, which has also investigated some Tesla crashes dating back to 2016, recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can operate safely. The NTSB also recommended that NHTSA require Tesla to adopt a better system for ensuring that drivers pay attention. NHTSA has not acted on either recommendation. The NTSB has no enforcement power and can only make recommendations to other federal agencies.

Last year, the NTSB blamed Tesla, drivers and lax regulation by NHTSA for two crashes in which Teslas drove under crossing semi-trailers. The NTSB took the unusual step of accusing NHTSA of contributing to the crashes by failing to ensure that automakers put safeguards in place to limit the use of electronic driving systems.

The agency made the move after investigating a 2019 crash in Delray Beach, Florida, in which the 50-year-old driver of a Tesla Model 3 was killed. The car was traveling on Autopilot when neither the driver nor the Autopilot system braked or attempted to avoid a semi-trailer crossing its path.

Autopilot has frequently been misused by Tesla drivers, who have been caught driving drunk or even riding in the back seat while a car rolled down a California freeway.

Tesla, which has disbanded its media relations office, did not immediately respond to a message seeking comment.

The NHTSA has dispatched teams to investigate 31 crashes involving partially automated driver-assist systems since June 2016. Such systems can keep a vehicle centered in its lane and at a safe distance from vehicles in front of it. Of those crashes, 25 involved Tesla Autopilot, and 10 deaths were reported, according to data released by the agency.

Tesla and other manufacturers warn that drivers using the systems must be ready to intervene at all times. In addition to crashing under crossing semi-trailers, Teslas using Autopilot have crashed into stopped emergency vehicles and a roadway barrier.

The NHTSA investigation is long overdue, said Raj Rajkumar, professor of electrical and computer engineering at Carnegie Mellon University who studies automated vehicles.

Tesla’s failure to effectively monitor drivers to ensure they are paying attention should be the top priority of the investigation, Rajkumar said. Teslas sense pressure on the steering wheel to make sure drivers are engaged, but drivers often trick the system.

“It is very easy to get around the steering-wheel pressure check,” Rajkumar said. “This has been going on since 2014. We’ve been discussing it for a long time now.”
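To illustrate the weakness Rajkumar describes, here is a minimal, hypothetical sketch of a torque-threshold engagement check. The names, thresholds and timing are illustrative assumptions, not Tesla’s actual implementation; the point is that any constant source of torque, such as a weight clipped to the wheel, can satisfy this kind of check without any real attention from the driver.

```python
# Hypothetical sketch of a torque-threshold driver-engagement monitor.
# Thresholds, timing and names are assumptions for illustration only.

HANDS_ON_TORQUE_NM = 0.3   # assumed minimum steering torque treated as "hands on wheel"
WARNING_AFTER_S = 30.0     # assumed time without detected torque before a warning


def driver_engaged(torque_samples_nm, sample_period_s=0.1):
    """Return True if torque above the threshold was seen recently enough."""
    seconds_since_torque = 0.0
    for torque in torque_samples_nm:
        if abs(torque) >= HANDS_ON_TORQUE_NM:
            seconds_since_torque = 0.0
        else:
            seconds_since_torque += sample_period_s
        if seconds_since_torque > WARNING_AFTER_S:
            return False
    return True


# The flaw: a weight hung on the wheel supplies a small constant torque,
# so the check passes even though nobody is watching the road.
spoofed = [0.35] * 600          # 60 seconds of constant torque from a weight
print(driver_engaged(spoofed))  # True, despite zero actual attention
```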

The emergency vehicle collisions cited by NHTSA began on January 22, 2018, in Culver City, California, near Los Angeles, when a Tesla using Autopilot struck a parked fire truck that was partially in the traffic lanes with its lights flashing. Crews were handling another crash at the time.

Since then, the agency said there had been accidents in Laguna Beach, California; Norwalk, Connecticut; Cloverdale, Indiana; West Bridgewater, Massachusetts; Cochise County, Arizona; Charlotte, North Carolina; Montgomery County, Texas; Lansing, Michigan; and Miami, Florida.

“The investigation will assess the technologies and methods used to monitor, assist and enforce driver engagement in the dynamic driving task during autopilot operation,” NHTSA said in its investigation documents.

In addition, the probe will cover the detection of objects and events by the system, as well as where it is allowed to operate. NHTSA has said it will look at “contributing circumstances” to crashes, as well as similar crashes.

An investigation could lead to a recall or other enforcement action by NHTSA.

“NHTSA reminds the public that no motor vehicle commercially available today is capable of driving itself,” the agency said. “Every available vehicle requires a human driver in control at all times, and all state laws hold human drivers accountable for the operation of their vehicles.”

The agency said it has “robust enforcement tools” to protect the public and investigate potential safety issues, and will act when it finds evidence of “noncompliance or an unreasonable risk to safety.”

In June, the NHTSA ordered all automakers to report any accidents involving fully autonomous vehicles or partially automated driver assistance systems.

Tesla, based in Palo Alto, California, uses a camera-based system, considerable computing power and, at times, radar to spot obstacles, determine what they are, and decide what the vehicles should do.

But Carnegie Mellon’s Rajkumar said the company’s radar was plagued by “false positives” and would stop cars after determining that overpasses were obstacles.

Tesla has now done away with radar in favor of cameras and a neural network that uses thousands of images to determine whether there are objects in the car’s path. The system, he said, does a very good job on most objects that would be seen in the real world, but it has had trouble with parked emergency vehicles and trucks crossing perpendicular to its path.

“It can only find patterns that it has, quote, seen before,” Rajkumar said. “Clearly the inputs that the neural network was trained on just don’t have enough images. They are only as good as the inputs and the training. Almost by definition, the training will never be good enough.”
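As a rough way to see Rajkumar’s point, the toy sketch below uses made-up feature vectors and class names (nothing here reflects Tesla’s system) to show how a classifier with no “unknown” category is forced to map every input, familiar or not, onto one of the classes it was trained on.

```python
# Toy illustration: a classifier can only assign inputs to the classes it was
# trained on, so anything unlike its training data is forced into the nearest
# known class. All vectors and labels below are invented for illustration.

import math

# Hypothetical 2-D "feature vectors" standing in for learned image embeddings.
TRAINING_CLASSES = {
    "moving_car": (1.0, 0.2),
    "pedestrian": (0.1, 1.0),
    "overpass":   (0.9, 0.9),
}


def classify(features):
    """Assign the input to the closest known class; there is no 'unknown' option."""
    return min(TRAINING_CLASSES,
               key=lambda name: math.dist(features, TRAINING_CLASSES[name]))


# A scene unlike anything in training still comes back with a trained label.
unfamiliar_scene = (0.6, 0.15)
print(classify(unfamiliar_scene))  # prints "moving_car", not "I don't know"
```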

Tesla is also letting some owners test what it calls a “Full Self-Driving” system. Rajkumar said that should be investigated as well.

