The ethics of autonomous driving has long been a bone of contention, with constructs such as the trolley problem commonly used to explore how driverless vehicles should behave in emergency situations.
One school of thought holds that we should mimic how human drivers behave. People have been shown to display a high willingness to sacrifice themselves for the safety of others, whilst also making snap decisions based on factors such as the victim's age.
These findings emerged from a recent paper from the University of Osnabrück, Germany, which explored the ethics of driving. The results diverge from the common ethical guidelines for such situations, which broadly hold that no life is more valuable than another. The authors believe the work offers valuable insights into how ethics should be programmed into autonomous vehicles.
“The technological advancement and adoption of autonomous vehicles is moving quickly but the social and ethical discussions about their behavior is lagging behind,” the authors say. “The behavior that will be considered as right in such situations depends on which factors are considered to be both morally relevant and socially acceptable.”
Whilst it’s widely believed that autonomous vehicles will be considerably safer than their human-driven counterparts, it is also inevitable that they will find themselves in circumstances where morally challenging decisions will need to be made.
The researchers used a virtual reality simulator to explore human intuition in a range of driving scenarios, including the trolley problem. The results revealed that the volunteers often acted at odds with the ethical guidelines established by policy makers.
“The German ethics commission proposes that a passenger in the vehicle may not be sacrificed to save more people; an intuition not generally shared by subjects in our experiment. We also find that people chose to save more lives, even if this involves swerving onto the sidewalk, endangering people uninvolved in the traffic incident. Furthermore, subjects considered the factor of age, for example, choosing to save children over the elderly,” the authors explain.
The study suggests that autonomous vehicles that abide by such ethical guidelines will frustrate passengers who would behave differently if they themselves were behind the wheel. The authors believe a more thorough societal discussion is needed to define the goals and constraints we wish to apply to autonomous vehicles.
“While ‘dilemma’ situations deserve more study, other questions should also be discussed. Driving requires an intricate weighing of risks versus rewards, for example speed versus the danger of a critical situation unfolding. Decision-making processes that precede or avoid a critical situation should also be investigated,” they conclude.