WHEN EYES MEET LAUGHTER: Exploring Non-Verbal Cues in Human-Robot Interaction with Furhat
| dc.contributor.author | Giannitzi, Eleni | |
| dc.contributor.department | University of Gothenburg / Department of Philosophy, Linguistics and Theory of Science | eng |
| dc.contributor.department | Göteborgs universitet / Institutionen för filosofi, lingvistik och vetenskapsteori | swe |
| dc.date.accessioned | 2024-10-25T10:05:12Z | |
| dc.date.available | 2024-10-25T10:05:12Z | |
| dc.date.issued | 2024-10-25 | |
| dc.description.abstract | Human-robot interaction is becoming increasingly popular, with social robots like Furhat playing key roles in enhancing communication through both verbal and non-verbal cues. This thesis investigates the impact of gaze and laughter coordination in human-robot interaction, focusing on how these non-verbal behaviours, when aligned with each other, enhance metrics such as the robot's perceived naturalness, empathy, and human-likeness. The study builds on existing research on non-verbal communication and explores how laughter and gaze alignment can improve conversational flow and emotional engagement between humans and robots. Using the social robot Furhat, experiments were conducted involving a simulated cooking activity in which participants interacted with the robot through dialogue that integrated gaze and gaze-aligned laughter functions. Participants were evenly divided into two experimental groups. Throughout the interaction, participants were recorded and later asked to complete a questionnaire capturing their perceptions and emotional state. The insights gathered from the experiments highlight interesting trends in both quantitative and qualitative aspects of user experience. Participants who saw Furhat produce gaze and laughter behaviour in line with human behaviour rated Empathy, Naturalness and Authenticity, Naturalness of Laughter, and Compassion higher than those who witnessed the same behaviours in inappropriate contexts. These results show promising potential for designing more human-like social robots capable of meaningful non-verbal communication. The thesis also addresses limitations that may guide future studies. | sv |
| dc.identifier.uri | https://hdl.handle.net/2077/83857 | |
| dc.language.iso | eng | sv |
| dc.setspec.uppsok | HumanitiesTheology | |
| dc.subject | laughter, gaze, alignment, social robots, Furhat, human-computer interaction | sv |
| dc.title | WHEN EYES MEET LAUGHTER: Exploring Non-Verbal Cues in Human-Robot Interaction with Furhat | sv |
| dc.title.alternative | WHEN EYES MEET LAUGHTER: Exploring Non-Verbal Cues in Human-Robot Interaction with Furhat | sv |
| dc.type | Text | |
| dc.type.degree | Student essay | |
| dc.type.uppsok | H2 |
Files
Original bundle
- Name: Thesis_Eleni_Giannitzi LT2402.pdf
- Size: 5.19 MB
- Format: Adobe Portable Document Format
License bundle
- Name: license.txt
- Size: 4.68 KB
- Description: Item-specific license agreed upon submission