Authors

Yifei Zhu, Colin Brush, and Tom Williams

Venue

IEEE International Conference on Robot and Human Interactive Communication

Publication Year

2024

Abstract

Robots deployed into real-world, task-based environments may need to provide assistance, troubleshooting, and on-the-fly instruction for human users. While previous work has considered how robots can provide this assistance while co-located with human teammates, it is unclear how robots might best support users once they are no longer co-located. We propose the use of Augmented Reality as a medium for conveying long-distance task guidance from humans' existing robot teammates, through Augmented Reality facilitated Robotic Guidance (ARRoG). Moreover, because there are multiple ways that a robot might project its identity through an Augmented Reality Head-Mounted Display, we identify two candidate designs inspired by existing interaction patterns in the human-robot interaction (HRI) literature (re-embodiment-based and telepresence-based identity projection designs), present the results of a design workshop exploring how these designs might be most effectively implemented, and report the results of a human-subject study intended to validate these designs.