
Mercedes: We’ll Be Liable for Self-Driving Cars

Mercedes wants to take legal responsibility for the decisions its automated driving software makes.

Background: Self-Driving Doesn’t Exist Yet

Many companies are working on developing self-driving systems. But no automaker yet sells a truly autonomous car.

An increasing number of them sell cars with limited self-driving modes that allow drivers to briefly take their hands from the wheel but not their attention from the road.

SAE International, a global association of engineers and related technical experts in the aerospace, automotive, and commercial-vehicle industries, sets the industry standard for evaluating self-driving systems. Its J3016 standard sorts driving automation into six levels, from Level 0 (no automation) to Level 5 (full automation).

Every system currently for sale in the U.S. is an SAE Level 2 system.
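To make the Level 2 versus Level 3 distinction concrete, here is a minimal sketch of the SAE scale as code. The level names and the supervision split reflect SAE J3016; the enum itself and the driver_must_supervise helper are purely illustrative, not anything SAE or an automaker publishes.

from enum import IntEnum

class SAELevel(IntEnum):
    """The six SAE J3016 driving-automation levels."""
    NO_AUTOMATION = 0           # the human does all the driving
    DRIVER_ASSISTANCE = 1       # steering or speed support, not both
    PARTIAL_AUTOMATION = 2      # steering and speed support; the driver must watch the road
    CONDITIONAL_AUTOMATION = 3  # the system drives in limited conditions; the driver takes over on request
    HIGH_AUTOMATION = 4         # no driver needed inside the system's operating domain
    FULL_AUTOMATION = 5         # no driver needed anywhere

def driver_must_supervise(level: SAELevel) -> bool:
    """At Level 2 and below, the human is still the driver at all times."""
    return level <= SAELevel.PARTIAL_AUTOMATION

# Super Cruise, BlueCruise, and Full Self-Driving are all Level 2, so supervision --
# and responsibility -- stays with the person behind the wheel.
assert driver_must_supervise(SAELevel.PARTIAL_AUTOMATION)
assert not driver_must_supervise(SAELevel.CONDITIONAL_AUTOMATION)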

Related: Self-Driving Cars – Everything You Need to Know

These systems include GM’s Super Cruise, Ford’s BlueCruise, and Tesla’s misleadingly named Full Self-Driving (even Tesla admits the system is not self-driving). They relieve some of the workload of driving under specific circumstances. But they require the driver’s full attention, and the companies take no legal responsibility from the driver for the car’s actions.

Mercedes would like to.

Confidence in Redundancy

Mercedes’ Drive Pilot system, the automaker claims, is capable of Level 3 autonomy. That means the car can navigate on its own without driver input under limited conditions.

In Drive Pilot’s case, the system can take over driving below 40 mph on pre-mapped highways — in other words, in traffic jams. The system is approved for use in Germany. Drive Pilot Senior Development Manager Gregor Kugelmann tells Road & Track, “we’re aiming to get that for California and Nevada by the end of this year.”
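For a sense of how narrow that operating envelope is, here is a rough, hypothetical sketch of the kind of gate a traffic-jam Level 3 mode has to pass before it can offer to drive. The 40-mph ceiling, the pre-mapped-highway requirement, and the regional approvals come from Mercedes’ description; the function name and structure are assumptions made for the example.

# Hypothetical gate on when a traffic-jam Level 3 mode could offer to drive.
# Thresholds come from the article; the code structure is assumed.

MAX_SPEED_MPH = 40  # Drive Pilot only operates below this speed

def can_offer_drive_pilot(speed_mph: float, on_premapped_highway: bool,
                          region_approved: bool) -> bool:
    """Return True only when every condition of the operating envelope holds."""
    return (
        speed_mph < MAX_SPEED_MPH    # traffic-jam speeds only
        and on_premapped_highway     # highways Mercedes has already mapped
        and region_approved          # Germany today; California and Nevada pending
    )

# Stop-and-go traffic on a mapped, approved highway: the system can offer to drive.
print(can_offer_drive_pilot(12.0, on_premapped_highway=True, region_approved=True))   # True
# The same road flowing at 55 mph: the driver stays in charge.
print(can_offer_drive_pilot(55.0, on_premapped_highway=True, region_approved=True))   # False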

There are few federal laws addressing automated driving. So, Mercedes Vice President of Automated Driving George Massing tells R&T, “we will probably have to deal with each individual state.” But the company plans to accept legal liability for what the car does while Drive Pilot is engaged.

Mercedes can do that, the company says, because Drive Pilot can do things other automation systems can’t. It combines camera images, lidar scans, and GPS guidance so decisions rest on redundant sources. And it tackles a notoriously hard problem for artificial intelligence, using microphones and cameras to detect emergency vehicles. Mercedes says Drive Pilot can identify them early enough to give drivers a 10-second warning to take over.

That apparently leaves Mercedes confident enough in the system to accept responsibility for any accidents it might cause.
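Here is one hypothetical way that 10-second handover could be sequenced in software. Only the 10-second figure and the idea of cross-checking microphones against cameras come from Mercedes’ description; the names and the flow below are invented purely to illustrate the concept.

import time

TAKEOVER_WARNING_SECONDS = 10  # lead time Mercedes says Drive Pilot gives the driver

def emergency_vehicle_confirmed(microphone_hit: bool, camera_hit: bool) -> bool:
    # Redundancy stand-in: require both independent channels to agree before acting.
    return microphone_hit and camera_hit

def request_takeover() -> float:
    """Alert the driver and return the deadline by which they must take over."""
    print(f"Emergency vehicle ahead -- take over within {TAKEOVER_WARNING_SECONDS} seconds.")
    return time.monotonic() + TAKEOVER_WARNING_SECONDS

if emergency_vehicle_confirmed(microphone_hit=True, camera_hit=True):
    deadline = request_takeover()
    driver_took_over = False  # stand-in for detecting steering or pedal input
    if not driver_took_over and time.monotonic() > deadline:
        print("Driver did not respond; a real Level 3 system would begin its fallback maneuver.")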

Not Clear That Mercedes Can Legally Accept Liability Everywhere

It isn’t clear whether the company can legally take over liability from drivers. Federal law controls some liability questions for software makers, and a patchwork of state laws governs liability for traffic accidents. But if Mercedes can convince regulators and drivers that the company is responsible for the decisions its software makes, that could go a long way toward making Americans more willing to accept self-driving technology.