WHEN EYES MEET LAUGHTER: Exploring Non-Verbal Cues in Human-Robot Interaction with Furhat

Date

2024-10-25

Authors

Giannitzi, Eleni

Abstract

Human-robot interaction is becoming increasingly popular, with social robots such as Furhat playing key roles in enhancing communication through both verbal and non-verbal cues. This thesis investigates the impact of gaze and laughter coordination in human-robot interaction, focusing on how these non-verbal behaviours, when aligned with each other, enhance metrics such as the robot's perceived naturalness, empathy, and human-likeness. The study builds on existing research on non-verbal communication and explores how aligning laughter and gaze can improve conversational flow and emotional engagement between humans and robots. Using the social robot Furhat, experiments were conducted involving a simulated cooking activity in which participants interacted with the robot through dialogue that integrated gaze and gaze-aligned laughter functions. For the study, participants were evenly divided into two experimental groups. Each interaction was recorded, and participants were afterwards asked to complete a questionnaire capturing their perceptions and emotional state. The insights gathered from the experiments highlight interesting trends in both quantitative and qualitative aspects of the user experience. Participants who saw Furhat produce gaze and laughter behaviour in line with human behaviour rated Empathy, Naturalness and Authenticity, Naturalness of Laughter, and Compassion higher than those who witnessed the same behaviours in inappropriate contexts. These results show promising potential for designing more human-like social robots capable of meaningful non-verbal communication. The thesis also discusses limitations that may guide future studies.

Keywords

laughter, gaze, alignment, social robots, Furhat, human-computer interaction
