Publisher: The Japanese Society for Artificial Intelligence
Abstract: Appropriate control of head motion in robots, synchronized with their utterances, is important for smooth human-robot interaction. Based on rules inferred from analyses of the relationship between head motion and dialog acts, this paper proposes a model for generating head tilting and evaluates it on different types of humanoid robots. Analysis of subjective scores showed that the proposed model generates head motion perceived as more natural than nodding alone or than directly mapping people's original motions without gaze information. We also evaluated the proposed model in real human-robot interaction through an experiment in which participants acted as visitors to an information desk attended by robots. The effects of gaze control were also taken into account when mapping the original motions to the robot. Evaluation results indicated that, in terms of perceived naturalness, the proposed model performs as well as directly mapping people's original motions with gaze information.