Automated Vehicle Systems Perform Poorly in AAA Tests
Reliability needs to be improved to gain public trust
It sounds like a question on a junior high school math test: if you took a cross-country trip on Route 66 from Chicago to Los Angeles (about 2,120 miles) in a car equipped with Level 2 automated vehicle systems, how many disengagements or disruptions in the technology would you encounter along the way?
The answer is 265, which averages out to once every eight miles. That’s according to a new AAA study.
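The per-mile average is easy to verify from the two figures in the article (a minimal sketch; the trip distance and disengagement count are taken from the study as reported):

```python
# Sanity check on the article's average: one disengagement roughly every eight miles.
trip_miles = 2120       # approximate Chicago-to-Los Angeles distance on Route 66
disengagements = 265    # events reported in the AAA study

miles_per_event = trip_miles / disengagements
print(f"One event every {miles_per_event:.1f} miles")  # One event every 8.0 miles
```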
But the results have nothing to do with which road you take, or with your starting point or destination.
AAA attributes the incidents to the limitations and unreliability of current driver-assist systems.
The Tests
Researchers tested five vehicles at a closed-course test track and on public roads. The vehicles, which were equipped with proprietary systems that automate steering, braking and throttle functions under certain conditions, included:
- 2019 BMW X7 with Active Driving Assistant Professional
- 2019 Cadillac CT6 with Super Cruise
- 2019 Ford Edge with Co-Pilot360
- 2020 Kia Telluride with Highway Driving Assist
- 2020 Subaru Outback with EyeSight
The Results
During closed-course testing, the automated vehicle systems performed mostly as intended.
But there was one big problem area. When approaching a simulated disabled vehicle, two-thirds of the tests ended with a collision. The average impact speed was 25 mph.
On the plus side, none of the vehicles made contact with a lead car during stop-and-go testing.
Overall incident rates were much higher during real-world driving on public roads, AAA says. Nearly three-fourths of the problems involved lane departure or erratic lane position. Other concerns included vehicles driving too close to other cars or to guardrails.
AAA also found that active driving assistance systems—those that combine throttle control with braking and steering—often disengage with little notice and abruptly hand control back to the driver. Cadillac’s Super Cruise was particularly prone to this, though researchers say the system fared much better than its competitors’ at lane-keeping.
Conclusions
It doesn’t take an “A” student to decipher the results. They’re not good.
“With the number of issues we experienced in testing, it is unclear how these systems enhance the driving experience in their current form,” AAA summarizes. It notes that the ADAS-equipped vehicles didn’t perform consistently, particularly in real-world scenarios.
This is especially problematic considering that in many cases these systems are giving consumers a bad first impression of automated vehicle technologies.
As a result, the motor club recommends that manufacturers increase the scope of testing for active driving assistance systems and limit their rollout until functionality is improved to provide a more consistent and safer driving experience.
The full results of the study are available on AAA’s website.