
Is a Human Life Worth as Much as a Robotic Life?

You might think that it would be impossible for people to value a piece of hardware over human life, yet new research suggests that it might actually be true.

By Adi Gaskell · Apr. 09, 19 · News


You might think that it would be impossible for people to value a piece of hardware over human life, yet new research from Radboud University suggests that such circumstances may exist. Bizarrely, one of these circumstances might involve a perception that robots feel pain.

"It is known that military personnel may mourn a robot that is used to clear mines in the army. Funerals are organized for them. We wanted to investigate how far this empathy for robots extends, and what moral principles influence this behavior towards robots. Little research has been done in this area as of yet, " the authors explain.

Sacrificing the Machine

To what extent would we be willing to sacrifice machines if it meant saving a human life? Volunteers in the study were presented with a number of moral dilemma scenarios in which they had to decide whether to sacrifice an individual in order to save a group of wounded people. In some scenarios that individual was a human; in some, it was a humanoid robot; while in others, it was a more ordinary piece of machinery.

The results showed that when the robot was humanoid in style, it presented a much sterner dilemma for the volunteers. When the machine was designed in a humanoid style and presented as having its own thoughts and emotions, the participants were less likely to sacrifice it for anonymous humans. It's a finding that the researchers believe highlights how people can bestow certain moral values on robots in the right circumstances.

"A human-looking robot can cause feelings and behaviors that contrast with the function for which they were developed to help us. And the question is whether this is desirable for us," they explain.

Empathizing With the Machine

This should perhaps not come as that big a surprise. Research published a few years ago in Nature highlighted the ability of people to form emotional bonds with machines.

The researchers, a Japanese team, believe they have found the first neurophysiological evidence of our ability to empathize with a robot in apparent pain, albeit at a slightly different level from that shown towards other humans.

The study saw EEG tests performed on a relatively small sample of adults who were shown pictures of either a human or robotic hand that was in a painful situation.

The results were fascinating. Participants did show empathy towards the robot, but at a lower level than towards the humans in the pictures.

"The ascending phase of P3 (350-500 ms after the stimulus presentation) showed a positive shift in the observer for a human in pain in comparison with the no-pain condition, but not for a robot in perceived pain. Then, the difference between empathy toward humans and robots disappeared in the descending phase of P3 (500-650 ms)", the authors say, "The positive shift of P3 is considered as reflecting the top-down process of empathy. Its beginning phase seems related to the process of perspective taking, as was shown in a previous study."

So it's perhaps not that surprising that people are increasingly willing and able to exhibit an emotional connection with technology, which in turn distorts what would appear to be rational moral responses towards them.


Published at DZone with permission of Adi Gaskell, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
