“Collaborative” Driving: Sharing is caring
- Supplier develops next level of driver assists
- Car drives itself, but can hand back human control
- Can act as ultimate driver aid
- Exists with current hardware
While it is widely predicted that many new vehicles sold by 2030 will include advanced technologies that provide collaborative interaction with their drivers, “only a fraction, maybe one percent globally, will be Level 5 fully autonomous,” says Ola Bostrom, Research, Innovation & IPR vice president for Swedish AV developer Veoneer AB. So Veoneer is concentrating on “Collaborative Driving,” somewhere between Levels 2 and 3, which lets the car drive itself under ideal conditions but warns and returns control to the human driver on demand, or when it senses it should. In Collaborative Driving, he says, “the car is handling parts of the driving, but the driver is still engaged.”
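The back-and-forth Bostrom describes boils down to a simple state machine: the car offers assistance under ideal conditions and hands control back when the driver asks for it or when the system loses confidence. A minimal sketch, with all class names and the confidence threshold purely illustrative assumptions (not Veoneer's actual logic):

```python
from enum import Enum, auto

class Mode(Enum):
    DRIVER = auto()         # human has full control
    COLLABORATIVE = auto()  # car is handling parts of the driving

class CollaborativeDriver:
    """Toy state machine for the handover behavior described above."""

    def __init__(self):
        self.mode = Mode.DRIVER

    def update(self, conditions_ideal: bool, driver_requests_control: bool,
               system_confidence: float) -> Mode:
        if self.mode == Mode.COLLABORATIVE:
            # Return control on demand, or when the car senses it should.
            if driver_requests_control or not conditions_ideal or system_confidence < 0.8:
                self.warn_driver()
                self.mode = Mode.DRIVER
        else:
            # Offer assistance only under ideal conditions.
            if conditions_ideal and system_confidence >= 0.8:
                self.mode = Mode.COLLABORATIVE
        return self.mode

    def warn_driver(self):
        print("Take over: returning control to driver")
```

The point of the sketch is that the driver-facing warning always precedes the handover, so the human is never surprised to find themselves back in control.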
The core technologies Veoneer displayed and demonstrated include:
Pilot Assist Level 2+ – Gives a vehicle the ability to brake, accelerate and steer itself, but with the driver still in the loop “in an increasing number of situational conditions.” Veoneer (together with software partner Zenuity, Inc.) is working with automakers to deliver its first Pilot Assist Level 2+ functions by 2020-21, with more coming down the road.
Forward-looking cameras using “Deep Learning” technologies for more advanced and reliable driver-aid systems, including next-generation highway pilot assist, potentially available in production vehicles by late 2019.
5G Connectivity – In partnership with mobile broadband internet communications provider Ericsson, cutting-edge 5G technology will enable drivers and vehicles to interact with their surroundings, including real-time roadside assistance and direct engagement with emergency responders, traffic controllers, even people working along the road.
Zeus supercomputer – Zeus (a collaborative effort among Veoneer, Zenuity, and chip-supplier NVIDIA) is claimed to be the world’s first publicly announced technology that meets the requirements for Level 4 Autonomous Driving. It integrates data from cameras, radars, lidars and other sensors and sources to interpret situations and take required action.
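One way to picture what “integrating data from cameras, radars and lidars” means is cross-sensor confirmation: an object is trusted only when independent sensors agree on it. The sketch below is a deliberately crude stand-in for that idea, with all names, fields and thresholds hypothetical rather than drawn from the Zeus design:

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Detection:
    """One object report from a single sensor (camera, radar, or lidar)."""
    sensor: str
    obj_id: str
    distance_m: float
    confidence: float

def fuse(detections: List[Detection], min_sensors: int = 2) -> Dict[str, float]:
    """Keep only objects that at least `min_sensors` independent sensors
    agree on, and return a confidence-weighted distance estimate for each."""
    by_obj: Dict[str, List[Detection]] = {}
    for d in detections:
        by_obj.setdefault(d.obj_id, []).append(d)
    fused = {}
    for obj_id, reports in by_obj.items():
        sensors = {r.sensor for r in reports}
        if len(sensors) >= min_sensors:
            total = sum(r.confidence for r in reports)
            fused[obj_id] = sum(r.distance_m * r.confidence for r in reports) / total
    return fused
```

Requiring multi-sensor agreement is what lets a stack like this keep working when any single sensor is blinded, for example a camera in heavy fog inside a dark tunnel.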
An on-track demonstration in a Veoneer LIV (Learning Intelligent Vehicle) 3.0 third-generation research vehicle showed intriguing collaborative driving capabilities. It took control and pulled over for an emergency vehicle approaching from behind that the driver had not yet seen in his mirror. It steered to miss a pedestrian unseen in heavy fog inside a dark tunnel. It warned of accidents and construction workers ahead before they could be seen. And it took control when the driver was distracted. How does it know? An interior camera sees and interprets not only the driver’s attention level but also his/her emotions. A little creepy? Maybe.
“We have a camera toward the driver that can see cognitive load, thinking, paying attention, sleeping or reading,” Bostrom explains. “You should not be under-stimulated, or overstimulated. Our purpose is to create trust, but not too much trust, and we are measuring the trust. You could design it in a way that when you are sleeping or not paying attention, the car will pull over and stop in a safe way. [To be able to do that] you have to be connected to the infrastructure. You need very good knowledge of what is happening on the road.”
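Bostrom's description implies a small decision table: classify the driver's state from the interior camera, then escalate from warnings to a safe pull-over, with the pull-over gated on an infrastructure connection that supplies “very good knowledge of what is happening on the road.” A minimal sketch of that decision logic, with all state names and responses invented for illustration:

```python
from enum import Enum

class DriverState(Enum):
    ATTENTIVE = "attentive"
    DISTRACTED = "distracted"               # e.g. reading
    UNDER_STIMULATED = "under-stimulated"   # e.g. drowsy or sleeping

def respond_to_driver(state: DriverState, connected_to_infrastructure: bool) -> str:
    """Decide the car's response to the interior camera's driver-state estimate."""
    if state == DriverState.ATTENTIVE:
        return "continue"
    if state == DriverState.DISTRACTED:
        return "warn driver"
    # Sleeping / under-stimulated: pull over only with good road knowledge,
    # which (per Bostrom) requires a connection to the infrastructure.
    if connected_to_infrastructure:
        return "pull over and stop safely"
    return "warn driver and slow down"
```

The gating on connectivity captures the quote's key point: the car should only attempt an autonomous pull-over when it actually knows what is happening on the road around it.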
So, while widespread use of fully autonomous vehicles is still likely years away, we all should benefit from increasing levels of collaborative driving, available from Veoneer and other technology developers, in more and more new vehicles in the coming years.