Show simple item record

dc.contributor.author: Giannitzi, Eleni
dc.date.accessioned: 2024-10-25T10:05:12Z
dc.date.available: 2024-10-25T10:05:12Z
dc.date.issued: 2024-10-25
dc.identifier.uri: https://hdl.handle.net/2077/83857
dc.description.abstract: Human-robot interaction is becoming increasingly popular, with social robots like Furhat playing key roles in enhancing communication through both verbal and non-verbal cues. This thesis investigates the impact of gaze and laughter coordination in human-robot interactions, focusing on how these non-verbal behaviours, when aligned with each other, improve metrics such as the perceived naturalness, empathy, and human-likeness of the robot. The study builds upon existing research on non-verbal communication and further explores how laughter and gaze alignment can improve conversational flow and emotional engagement between humans and robots. Experiments were conducted with Furhat, a social robot, around a simulated cooking activity in which participants interacted with the robot through dialogue that integrated gaze and gaze-aligned laughter behaviours. For the study, participants were evenly divided into two experimental groups. Throughout the interaction, participants were recorded and later asked to complete a questionnaire capturing their perceptions and emotional state. The insights gathered from the experiments highlight notable trends in both quantitative and qualitative aspects of user experience. Participants who saw Furhat produce gaze and laughter behaviour in line with human behaviour rated Empathy, Naturalness and Authenticity, Naturalness of Laughter, and Compassion higher than those who witnessed the same behaviours in inappropriate contexts. These results show promising potential for designing more human-like social robots capable of meaningful non-verbal communication. The thesis also addresses limitations that may guide future studies. (sv)
dc.language.iso: eng (sv)
dc.subject: laughter, gaze, alignment, social robots, Furhat, human-computer interaction (sv)
dc.title: WHEN EYES MEET LAUGHTER: Exploring Non-Verbal Cues in Human-Robot Interaction with Furhat (sv)
dc.title.alternative: WHEN EYES MEET LAUGHTER: Exploring Non-Verbal Cues in Human-Robot Interaction with Furhat (sv)
dc.type: Text
dc.setspec.uppsok: HumanitiesTheology
dc.type.uppsok: H2
dc.contributor.department: University of Gothenburg / Department of Philosophy, Linguistics and Theory of Science (eng)
dc.contributor.department: Göteborgs universitet / Institutionen för filosofi, lingvistik och vetenskapsteori (swe)
dc.type.degree: Student essay
