Self-driving and driver-assist technology involved in hundreds of car crashes

The federal government’s top auto-safety regulator revealed on Wednesday that nearly 400 crashes in the United States over a 10-month period involved cars using advanced driver-assist technologies.

The findings are part of a broad effort by the National Highway Traffic Safety Administration to assess the safety of advanced driving systems.

Of the 392 incidents cataloged by the agency from July 1 of last year through May 15, six were fatal and five resulted in serious injuries. Teslas operating with Autopilot, the more ambitious Full Self-Driving system, or any of their associated components were involved in 273 crashes. Five of those Tesla crashes were fatal.

The data were collected under an NHTSA order issued last year requiring vehicle manufacturers to report crashes involving cars equipped with advanced driver-assist systems. Many manufacturers have introduced such systems in recent years, including features that let drivers take their hands off the steering wheel under certain conditions and features that help with parallel parking.

The order was an unusually bold step for the regulator, which has come under fire in recent years for not being assertive enough with automakers.

“Until last year, NHTSA’s response to autonomous vehicles and driver assistance was, frankly, passive,” said Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies. “This is the first time the federal government has directly collected crash data on these technologies.”

Speaking with reporters ahead of Wednesday’s release, Steven Cliff, the NHTSA administrator, said the data, which the agency is continuing to collect, “will help our investigators quickly identify potential defect trends.”

Dr. Cliff said NHTSA would use the data to guide any rules or requirements it may create for the design and use of such systems. “These technologies hold great promise to improve safety, but we need to understand how these vehicles are performing in real-world situations,” he said.

An advanced driver-assist system can steer, brake, and accelerate a vehicle on its own, but the driver must stay alert and be ready to take control of the vehicle at any time.

Safety experts are concerned because these systems allow drivers to relinquish active control of the car and may lull them into thinking their cars are driving themselves. If the technology malfunctions or cannot handle a particular situation, drivers may not be ready to take control quickly.

About 830,000 Tesla cars in the United States are equipped with Autopilot or the company’s other driver-assist technologies, which helps explain why Tesla vehicles accounted for nearly 70 percent of the crashes reported on Wednesday.

Ford Motor, General Motors, BMW and others offer similar advanced systems that allow hands-free driving on highways under certain conditions, but far fewer of those models have been sold. These companies have, however, sold millions of cars over the past two decades equipped with individual components of driver-assist systems. These include so-called lane keeping, which helps drivers stay in their lanes, and adaptive cruise control, which adjusts a car’s speed and brakes automatically when traffic ahead slows.

In Wednesday’s release, NHTSA disclosed that Honda vehicles were involved in 90 incidents and Subarus in 10. Ford, GM, BMW, Volkswagen, Toyota, Hyundai and Porsche each reported five or fewer.

The data include cars with systems designed to operate with little or no intervention from the driver, and separate data on systems that can simultaneously steer and control the car’s speed but require the driver’s constant attention.

Automated vehicles, most of which are still in development but are being tested on public roads, were involved in 130 incidents, NHTSA found. One resulted in a serious injury, 15 in minor or moderate injuries, and 108 in no injuries. Many of the crashes involving automated vehicles were fender-benders or bumper taps because they are operated mainly at low speeds and in city driving.


In about a third of the 130 crashes involving automated systems, the car was stopped and was struck by another vehicle. In 11 crashes, a car using such technology was proceeding straight and collided with another vehicle that was changing lanes, the data showed.

Most of the incidents involving automated systems occurred in San Francisco or the Bay Area, where companies such as Waymo, Argo AI and Cruise are testing and refining the technology.

Waymo, which is owned by Google’s parent company and has been running a fleet of driverless taxis in Arizona, was part of 62 incidents. Cruise, a division of GM, was involved in 23. Cruise recently began offering driverless taxi rides in San Francisco, and this month it received permission from California authorities to charge passengers for rides.

None of the crashes involving automated systems were fatal, and only one resulted in a serious injury. In March, a cyclist collided with a Cruise vehicle from behind as the two were traveling down a street in San Francisco.

NHTSA’s order requiring automakers to submit the data was prompted in part by crashes and fatalities over the past six years involving Teslas operating with Autopilot. Last week, NHTSA expanded an investigation into whether Autopilot has technical and design flaws that pose safety risks.

The agency is investigating 35 crashes that occurred while Autopilot was activated, including nine that have resulted in the deaths of 14 people since 2014. It has also opened a preliminary investigation into 16 incidents in which Teslas under Autopilot’s control crashed into parked emergency vehicles whose lights were flashing.

In November, Tesla recalled nearly 12,000 vehicles that were part of the Full Self-Driving beta test after deploying a software update that the company said could cause crashes because of unexpected activation of the cars’ emergency braking system.

Under the NHTSA order, companies must report crashes in which advanced driver-assist systems and automated technologies were in use within 30 seconds of impact. Although these data give a fuller picture of the behavior of these systems than ever before, it is still difficult to determine whether they reduce crashes or otherwise improve safety.


The agency has not collected data that would allow researchers to easily determine whether using these systems is safer than not using them in the same situations. Automakers were allowed to redact descriptions of what happened during the crashes, an option that Tesla and Ford, among others, used routinely, making the data harder to interpret.

Some independent studies have explored these technologies, but they have not yet shown whether the systems reduce crashes or otherwise improve safety.

J. Christian Gerdes, a professor of mechanical engineering and a director of Stanford University’s Center for Automotive Research, said the data released on Wednesday were useful, to a degree. “Can we learn more from this data? Yes,” he said. “Is it a complete gold mine for researchers? I don’t see it.”

Because of the redactions, he said, it is difficult to gauge how useful the findings will ultimately be. “NHTSA has a much better understanding of this data than the general public can glean from what was released,” he said.

Dr. Cliff, the NHTSA administrator, was guarded about acting on the results. “The data may raise more questions than they answer,” he said.

But some experts said the newly available information should prompt regulators to be more assertive.

“NHTSA can use its various powers to do more: rule-making, star ratings, inquiries, further investigations and soft influence,” said Bryant Walker Smith, an associate professor of law and engineering at the University of South Carolina who specializes in emerging transportation technologies.

“These data may also prompt more voluntary and involuntary disclosures,” he added. “Some companies may want to volunteer more context, especially about miles traveled, crashes that were ‘prevented’ and other indicators of good performance. Lawyers will also look for patterns, and for cases, in these data.”

Overall, “it’s a good start,” he said.

Jason Cow, Asma Elcordi and Vivian Lee contributed research and reporting.
