Quick Takes: Sensor Suites and Higher-Level Autonomy
LeddarTech, Renesas and Coast Autonomous execs share their views
(Images: LeddarTech)
Lidar developer LeddarTech is hosting a series of webinars on autonomous driving and new mobility services. The latest session focused on “Multiple Sensing Modalities: The Key to Level 3-5 Autonomy.”
Who?
The panel featured a pair of similarly named CTOs: Pierre Olivier of LeddarTech and Pierre Lefèvre of Coast Autonomous, a mobility-as-a-service and self-driving vehicle developer. They were joined by Daniel Sisco, senior automotive director for chip giant Renesas.
Here are a few highlights and insights:
Putting the Plus in L2
Olivier: “Level 2 is the first stage where the car starts to drive itself. But it’s not very good. … Level 2+ is Level 2 that works. It’s what most people would like today.”
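For readers keeping score, the SAE J3016 levels the panelists reference can be sketched as a quick lookup. This is a minimal illustration, not from the webinar; note that “Level 2+” is an informal industry term that does not appear in the SAE standard.

```python
# SAE J3016 driving-automation levels, as commonly summarized.
# "Level 2+" is marketing shorthand for a more capable Level 2 system;
# it is not defined in the standard itself.
SAE_LEVELS = {
    0: "No Driving Automation",
    1: "Driver Assistance",
    2: "Partial Driving Automation",      # driver must supervise at all times
    3: "Conditional Driving Automation",  # driver must take over on request
    4: "High Driving Automation",
    5: "Full Driving Automation",
}

def driver_must_supervise(level: int) -> bool:
    """At Levels 0-2 the human driver monitors the environment continuously;
    from Level 3 up, the system handles the driving task within its domain."""
    if level not in SAE_LEVELS:
        raise ValueError(f"unknown SAE level: {level}")
    return level <= 2
```

The dividing line the quote gestures at is between Level 2, where the human remains responsible every second, and Level 3 and above, where responsibility shifts to the system under defined conditions.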
Coast’s Mission
Lefèvre: “We don’t build autonomous vehicles. We build road missions with a focus on safety.
“This requires both long- and very close-range perception… (and) a lot of redundancy in field of view, range and technology.”
On Lidar
Olivier: “We’re still trying to catch up and deliver lidar that is at a similar maturity level as cameras and radar. Getting sufficient processing at an affordable cost and power budget remains a challenge.”
Catching Up to Humans
Sisco: “The human brain does incredible parallel processing that we are struggling to emulate from the compute perspective. You can have the largest sensor suite in the world. But at the end of the day you have to more or less serially compress all of that down to a scene and make decisions based on the environment.”
“Is there a sensor that replicates the instinct of a human, who may see a flash of light, hear the tires on the road, or feel the vehicle drifting a little and know, from experience, that there is probably a flat tire? We have to artificially replicate these instincts in autonomous systems.”
Improving ADAS
Olivier: “ADAS is very much about safety and convenience. But all the evidence, from instrumented tests to consumer feedback, shows that ADAS today is not very good. Many people turn off these features because they aren’t delivering what they claim. There is room for much-improved ADAS.”
Compute Power and the Cloud
Sisco: “Compute power per watt is key. Cooling systems, power dissipation, even generating the power to run these ECUs has a big effect on the vehicle system overall.
“We have to think about how we interact with that cloud during run time, development and testing. How do we get data back and forth between the cloud and the edge, and what role does it play?”
Innovation Needed
Olivier: “We can all use better sensors and faster processing.
“The key is in the processing and perception. It’s about how the automated vehicle can better emulate the behavior of the human driver.
“How do you better process all the sensor input and leverage the cues and infrastructure to deliver solutions that match or come close to human drivers?”