Projects

 

Netherlands eScience Center project on "Advancing technology for multimodal analysis of emotion expression in everyday life" 2017 – current
Emotional expression plays a crucial role in everyday functioning. It is a continuous process involving many features across the behavioral, facial, vocal, and verbal modalities. Given this complexity, few psychological studies have addressed emotion recognition in an everyday context. Recent technological innovations in affective computing could lead to a scientific breakthrough, as they open up new possibilities for the ecological assessment of emotions. However, existing technologies still pose major challenges in the field of Big Data Analytics.

 

4TU NIRICT project on Empathic Lighting 2016 – current
In this project, we collaborate on research at the intersection of humans and technology, combining insights from computational intelligence, user modeling, personalization, and human-computer interaction for lighting installations that can adapt to and influence people's affective states. We aim to develop affect-adaptive lighting interfaces to be deployed in independent living for seniors. Seniors often experience negative affective states, such as gloominess due to being far from their families or anxiety when disoriented (e.g., due to dementia). For updates, check out our website!

 

4TU.H&T (4TU Humans & Technology) Smart Social Systems and Spaces for Living Well (S4) 2015 – current
The 4TU research centre Humans & Technology (H&T) brings together the social sciences, humanities, and technical sciences. Its goal is excellent research on innovative forms of human-technology interaction for smart social systems and spaces. The research program “Smart Social Systems and Spaces for Living Well” (S4) aims to combine knowledge available from different disciplines, such as computer science, psychology, and industrial design.

 

EU-FP7 SQUIRREL (Clearing Clutter Bit by Bit) 2014 – current
Development of a robot for children (4-10 years old) that can not only perform complex navigation, detection, and manipulation tasks in a cluttered environment, but is also affectively and socially intelligent, engaging, and fun in a collaborative task. Detection of children's affective and social states (e.g., engagement, dominant behavior) in a multiparty robot-children scenario (1 robot and more than 1 child) through (non-verbal) speech analysis.

 

EU-FP7 TERESA (Telepresence Reinforcement-learning Social Agent) 2014 – 2017
Development of a socially intelligent telepresence robot, e.g., one that semi-autonomously navigates among groups and adapts to the quality of the mediated human-human interaction (with elderly people). Detecting and monitoring the quality of the mediated human-human interaction (e.g., how well the conversation is going, whether the interactants are in sync, whether they are disagreeing, whether they like each other) through (non-verbal) speech analysis.

 

COMMIT P3 (SENSEI) 2013 – 2017
Integration of technology to sense, analyse, interpret, and motivate people who take part in sports and exercise (running), towards better wellbeing. Detection of the runner's physical and mental state through speech analysis.

 

CroMe (Croatian Memories) 2012 – 2013
Gathering and documenting testimonies on war-related experiences in Croatia's past, and making these audiovisual testimonies publicly available and searchable through technology. Analysis of the verbal and non-verbal behavior of interviewees, for example by relating word usage to prosodic speech parameters, and analysis of sighs in emotionally colored dialogs.

 

EU-FP7 SEMAINE (The Sensitive Agent project) 2009 – 2013
Development of a Sensitive Artificial Listener, a multimodal dialogue system that can sustain an interaction with a user for some time and react appropriately to the user's non-verbal behavior. Analysis of interruptive agents, and of the generation, detection, and timing of backchannels (listener responses).

 

EU-FP7 SSPNet (Social Signal Processing Network) 2009 – 2013
Automatic (multimodal) analysis and detection of social signals, manifested through non-verbal cues, in interaction. Analysis of non-verbal vocalisations such as laughter and sighs, as well as interruptions, synchrony/mimicry, and listener responses in interaction.

 

BSIK MultimediaN N2 (Multimodal Interaction) 2005 – 2009
Realizing an excellent user experience during human-machine interaction by attuning the interaction to the user's intentions and emotions. Automatic emotion recognition in speech, automatic detection of laughter, and multimodal sentiment analysis.

 

Supervision

  • Deniece Nazareth, PhD student, eScience project (May 2017 – …)
  • Jaebok Kim, PhD student, SQUIRREL/TERESA (July 2014 – …)
  • Roelof de Vries, PhD student, COMMIT P3 (May 2013 – …)
  • Cristina Zaga, PhD student, SQUIRREL (Oct 2014 – …)
  • Dr. Meiru Mu, postdoc, COMMIT P3 (March 2015 – Dec 2016)