Authors
Adam Stogsdill, Thao Phung, and Tom Williams
Venue
2nd Workshop on Natural Language Generation for Human-Robot Interaction
Publication Year
2020
Abstract
Situated human-human communication typically combines natural language with gesture, especially deictic gestures intended to draw the listener's attention to target referents. To engage in natural communication, robots must thus be similarly enabled not only to generate natural language but also to generate the appropriate gestures to accompany that language. In this work, we examine the gestures humans use to accompany spatial language, specifically the way these gestures continuously degrade in specificity and then discretely transition into non-deictic gestural forms as confidence in referent location decreases. We then outline a research plan in which we propose to use data collected through our study of this transition to design more human-like gestures for language-capable robots.
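The abstract describes a qualitative pattern rather than an algorithm, but the pattern it names (continuous degradation of gesture specificity followed by a discrete switch to a non-deictic form) can be caricatured in a few lines of code. The sketch below is purely illustrative and hypothetical, not the authors' model: the threshold value, the precision mapping, and the gesture names (`point`, `sweep`) are all assumptions introduced here for illustration.

```python
from dataclasses import dataclass

# Hypothetical illustration of the pattern described in the abstract:
# gesture specificity degrades continuously with confidence in the
# referent's location, then the gesture form switches discretely to a
# non-deictic form once confidence falls below a threshold. All names
# and numbers are assumptions, not the authors' model.

NON_DEICTIC_THRESHOLD = 0.3  # assumed switch point (hypothetical)

@dataclass
class Gesture:
    form: str         # e.g. "point" (deictic) or "sweep" (non-deictic)
    precision: float  # 1.0 = tightly targeted, 0.0 = maximally vague

def select_gesture(confidence: float) -> Gesture:
    """Map confidence in the referent's location (0..1) to a gesture."""
    if confidence >= NON_DEICTIC_THRESHOLD:
        # Deictic regime: still pointing, but precision degrades
        # continuously as confidence drops.
        return Gesture(form="point", precision=confidence)
    # Discrete transition: below the threshold, deixis is abandoned
    # entirely in favor of a non-deictic form (e.g. an open-hand sweep).
    return Gesture(form="sweep", precision=0.0)

if __name__ == "__main__":
    for c in (0.9, 0.5, 0.31, 0.29, 0.1):
        print(f"confidence={c:.2f} -> {select_gesture(c)}")
```

Note the two regimes: above the assumed threshold, only the continuous `precision` parameter changes; crossing it changes the categorical `form`, mirroring the continuous-then-discrete transition the paper sets out to study empirically.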