Saturday 14 December 2024
Turns out, driverless cars can be programmed to make moral decisions

5 months ago

Self-driving cars are being touted as the future of transport, but one rather important question remains: are they capable of making ethical decisions in life-and-death situations?

Scientists say yes.

Researchers from the University of Osnabrück in Germany say they have demonstrated that driverless vehicles are capable of making moral choices – just like we do every day.

Their study has found human moral behaviour can be modelled by algorithms that could be fed into machines as well.

Until now, it has been assumed that moral decisions were dependent on context and circumstances and, therefore, could not be described algorithmically.

But as study author Leon Sütfeld explains: “We found quite the opposite. Human behaviour in dilemma situations can be modelled by a rather simple value-of-life-based model that is attributed by the participant to every human, animal, or inanimate object.”

Participants in the study took part in virtual reality (VR) experiments in which they were asked to drive a car through a typical suburban neighbourhood on a foggy day.

The scenarios they experienced involved unexpected unavoidable dilemma situations with inanimate objects, animals and humans.


The drivers had to decide which was to be spared.

The results were then used to create conceptual statistical models for moral rules along with explanations as to how and why the decisions were made.

The scientists were able to put together a formula that placed a variety of living things and objects in order, based on their ‘value of life’.
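To give a rough sense of how such a ranking could drive a decision, here is a minimal sketch in Python. It is not the researchers' actual model; the entity names and values are hypothetical, chosen purely for illustration. Each entity is assigned a “value of life”, and in an unavoidable dilemma the car swerves into the lane whose occupants have the lower total value, sparing the other.

```python
# Hypothetical, illustrative values only – not figures from the study.
VALUE_OF_LIFE = {
    "adult": 1.0,
    "child": 1.2,
    "dog": 0.4,
    "traffic cone": 0.01,
}

def choose_lane(left, right):
    """Return the lane ('left' or 'right') the car should swerve INTO.

    The car steers into the lane with the lower total value of life,
    thereby sparing whatever occupies the other lane.
    """
    left_value = sum(VALUE_OF_LIFE[e] for e in left)
    right_value = sum(VALUE_OF_LIFE[e] for e in right)
    return "left" if left_value <= right_value else "right"

# Example dilemma: a child in the left lane, a dog in the right.
print(choose_lane(["child"], ["dog"]))  # → right (the child is spared)
```

The point of the study is that a simple additive rule like this reproduced participants' choices surprisingly well, even though moral decisions had been assumed too context-dependent to capture algorithmically.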

They say the question now is whether guidelines for machine behaviour should include moral values as well.


An initiative from the German Federal Ministry of Transport and Digital Infrastructure has already defined 20 ethical principles related to self-driving vehicles.

Study co-author Peter König explains: “Now that we know how to implement human ethical decisions into machines we, as a society, are still left with a double dilemma.

“Firstly, we have to decide whether moral values should be included in guidelines for machine behaviour and secondly, if they are, should machines act just like humans.”

The research is published in the journal Frontiers in Behavioral Neuroscience.
