Germany’s government has answered the car ethics question once and for all: driverless cars should prioritize the protection of human life over the destruction of animals or property.
On Wednesday, the nation’s Federal Ministry of Transport and Digital Infrastructure – a curious combination that suggests they took “information superhighway” too literally – announced it will “implement” guidelines devised by a panel of experts scrutinizing self-driving technology.
Back in June, the ministry’s ethics commission produced a report on how computer-controlled vehicles should be programmed and designed in future. The panel of 14 scientists and legal eggheads suggested some 20 rules autonomous rides should follow. Now, Germany’s transport regulator has pledged to enforce them in one way or another.
- The protection of human life always has top priority. If a situation on the road goes south, and it looks as though an accident is going to happen, the vehicle must save humans from death or injury even if it means wrecking property or mowing down other creatures.
- If an accident is unavoidable, the self-driving ride must not make any choices over who to save – it can’t wipe out an elderly person to save a kid, for instance. No decisions should be made on age, sex, race, disabilities, and so on; all human lives matter.
- A surveillance system should be in place – such as a black box – that records the steps leading up to an accident, so that it's obvious who was driving at the time and who is therefore at fault: the human behind the wheel, or the computer. The identity of the driver should also be documented. Essentially, it should be entirely possible to apportion blame accurately.
- Drivers should have full control over what personal information is collected from their vehicles. This is basically to stop tech giants quietly harvesting location data to tailor advertising, for example.
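The first two rules above amount to a strict priority ordering: minimize human harm above all else, never weigh one human life against another by personal characteristics, and only then worry about animals and property. A minimal sketch of what such a policy could look like in code – the `Outcome` fields and the tie-breaking order are illustrative assumptions, not any manufacturer's actual implementation:

```python
# Hypothetical sketch of the commission's priority rules as a
# collision-choice policy. Fields and ordering are assumptions.
from dataclasses import dataclass

@dataclass
class Outcome:
    humans_harmed: int      # a count only -- no weighting by age, sex,
                            # race, or disability is permitted
    animals_harmed: int
    property_damage: float  # estimated cost, arbitrary units

def choose_outcome(options: list[Outcome]) -> Outcome:
    """Pick the option harming the fewest humans; tie-break on
    animals harmed, then on property damage."""
    return min(options, key=lambda o: (o.humans_harmed,
                                       o.animals_harmed,
                                       o.property_damage))

# Swerving into a fence (no humans hurt) beats hitting a pedestrian,
# regardless of the repair bill.
best = choose_outcome([Outcome(1, 0, 0.0), Outcome(0, 2, 50_000.0)])
```

Note that because `humans_harmed` is a bare count, the policy cannot express "save the child over the pensioner" – exactly the kind of choice the guidelines forbid.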
Ultimately, drivers will still bear responsibility if their autonomous charabanc crashes, unless it was caused by a system failure, in which case the manufacturer is on the hook.
“The interaction between Man and machine raises new ethical questions during this time of digitization and self-learning systems,” said transport minister Alexander Dobrindt. “The ethics commission has done pioneering work and has developed the world’s first guidelines for automated driving. We are now implementing these guidelines.”
This comes after a law was passed earlier this year in Germany requiring a human to be sat behind the wheel of all “driverless” cars so that they can take over at the first sign of trouble. This allows people to test their autonomous vehicle software and hardware, and prat about on their phone as needed while the computer does the rest.
Although Germany is, famously, home to a large number of car manufacturers – including BMW, Volkswagen, and Daimler's Mercedes-Benz, to name but a few – the most famous purveyor of autopilots in the Anglosphere is US-headquartered Tesla. The Musk-mobile maker has suffered a number of very high-profile crashes, including one where a driver who initially blamed the autopilot later told the media – seemingly with Tesla's approval – that he himself was at fault.
In addition, 40-year-old Joshua Brown was killed when neither he nor his Tesla’s autopilot spotted a large light-colored lorry driving slowly across the motorway he was traveling on. America’s National Transportation Safety Board carried out a very detailed investigation into the crash, obtaining telematics data from the car’s systems.
It seems likely that Tesla's autopilot software, off the shelf, will comply with Germany's upcoming rules, which will be reviewed two years after they are implemented. Chief among the "what does this mean?" questions for autonomous cars are:
- Assigning blame after an accident.
- Data protection (driverless cars generate and store relatively large amounts of data, including on the human driver’s behavior at the wheel).
- Insurance issues.
This move by Germany will eventually put one solution to the question of “what should the car do in the event of an inevitable crash” on a legally binding footing. ®