Authors

Ruchen Wen and Boyoung Kim and Elizabeth Phillips and Qin Zhu and Tom Williams

Venue

ACM Transactions on Human-Robot Interaction

Publication Year

2022

Abstract

Because robots are perceived as moral agents, they must behave in accordance with human systems of morality. This responsibility is especially acute for language-capable robots because moral communication is a method for building moral ecosystems. Language-capable robots must not only ensure that what they say adheres to moral norms; they must also actively engage in moral communication to regulate and encourage human compliance with those norms. In this work, we describe four experiments (total N = 316) across which we systematically evaluate two different moral communication strategies that robots could use to influence human behavior: a norm-based strategy grounded in deontological ethics, and a role-based strategy grounded in role ethics. Specifically, we assess the effectiveness of robots that use these two strategies to encourage human compliance with norms grounded in expectations of behavior associated with certain social roles. Our results suggest two major findings, demonstrating the importance of moral reflection and moral practice for effective moral communication: first, opportunities for reflection on ethical principles may increase the efficacy of robots' role-based moral language; and second, following robots' moral language with opportunities for moral practice may facilitate role-based moral cultivation.