WHEN EYES MEET LAUGHTER: Exploring Non-Verbal Cues in Human-Robot Interaction with Furhat

dc.contributor.author: Giannitzi, Eleni
dc.contributor.department: University of Gothenburg / Department of Philosophy, Linguistics and Theory of Science [eng]
dc.contributor.department: Göteborgs universitet / Institutionen för filosofi, lingvistik och vetenskapsteori [swe]
dc.date.accessioned: 2024-10-25T10:05:12Z
dc.date.available: 2024-10-25T10:05:12Z
dc.date.issued: 2024-10-25
dc.description.abstract: Human-robot interaction is becoming increasingly popular, with social robots like Furhat playing key roles in enhancing communication through both verbal and non-verbal cues. This thesis investigates the impact of gaze and laughter coordination in human-robot interaction, focusing on how these non-verbal behaviours, when aligned with each other, enhance metrics such as the robot's perceived naturalness, empathy, and human-likeness. The study builds on existing research on non-verbal communication and explores how laughter and gaze alignment can improve conversational flow and emotional engagement between humans and robots. Experiments were conducted with Furhat, a social robot, in a simulated cooking activity in which participants interacted with the robot through dialogue that integrated gaze and gaze-aligned laughter functions. Participants were evenly divided into two experimental groups; each interaction was recorded, and participants afterwards completed a questionnaire capturing their perceptions and emotional state. The findings highlight notable trends in both quantitative and qualitative aspects of user experience: participants who saw Furhat produce gaze and laughter behaviour consistent with human behaviour rated Empathy, Naturalness and Authenticity, Naturalness of Laughter, and Compassion higher than those who witnessed the same behaviours in inappropriate contexts. These results show promising potential for designing more human-like social robots capable of meaningful non-verbal communication. The thesis also addresses limitations that may guide future studies.
dc.identifier.uri: https://hdl.handle.net/2077/83857
dc.language.iso: eng
dc.setspec.uppsok: HumanitiesTheology
dc.subject: laughter, gaze, alignment, social robots, Furhat, human-computer interaction
dc.title: WHEN EYES MEET LAUGHTER: Exploring Non-Verbal Cues in Human-Robot Interaction with Furhat
dc.title.alternative: WHEN EYES MEET LAUGHTER: Exploring Non-Verbal Cues in Human-Robot Interaction with Furhat
dc.type: Text
dc.type.degree: Student essay
dc.type.uppsok: H2

Files

Original bundle
Name: Thesis_Eleni_Giannitzi LT2402.pdf
Size: 5.19 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 4.68 KB
Description: Item-specific license agreed to upon submission