Boyoung Kim and Ruchen Wen and Ewart J. de Visser and Qin Zhu and Tom Williams and Elizabeth Phillips
RO-MAN 2021 Workshop on Robot Behavior Adaptation to Human Social Norms
We examined whether a robot that proactively offers moral advice promoting the norm of honesty can discourage people from cheating. Participants were presented with an opportunity to cheat in a die-rolling game. Prior to playing the game, participants received a piece of moral advice from either a NAO robot or a human, grounded in deontological ethics, virtue ethics, or Confucian role ethics, or received no advice. We found that moral advice grounded in Confucian role ethics reduced cheating, but only when the advice was delivered by a human. No form of advice was effective when delivered by the robot. These findings highlight challenges in building robots that can guide people to follow moral norms.