Research

My interests lie in the automatic analysis and understanding of verbal and nonverbal (vocal) behaviour in human-human and human-machine interaction, and in the design of socially interactive technology to support human needs. Taking an interdisciplinary approach within affective computing and social signal processing, I aim to develop socially and affectively intelligent interfaces (e.g. virtual conversational agents, social robots) that can recognise and display social and affective signals, and to study how humans interact with this new kind of technology. Coming from a background in (computational) paralinguistics and speech analysis, I focus primarily on the vocal modality of expression, alongside the visual (e.g. facial expressions, eye gaze) and physiological (e.g. heart rate, galvanic skin response) modalities in social interaction.

social signal processing / affective computing / multimodal interaction / paralinguistics / speech prosody / non-verbal vocalisations / laughter / dialogue / embodied conversational agents / human-human interaction / human-robot interaction / computer-assisted language learning / pronunciation error detection

Projects

Other academic activities