Communicative behaviour

In interpersonal communication, nonverbal signals—including facial and body cues—convey far more emotional content than spoken words. To capture and analyse emotion expressed in often-unconscious nonverbal behaviour, this project builds a generative pipeline that reconstructs 3D meshes from 2D images and video using 3D morphable face models and human parametric body models (e.g., the SMPL family). This produces a unified, embodied 3D representation of head pose, gaze, and facial micro-expressions, which we then use to infer latent affective states.
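The mesh-reconstruction step rests on linear statistical shape models, the core idea shared by 3D morphable face models and SMPL-style body models. The sketch below is purely illustrative: all array sizes and values are toy placeholders, not the actual SMPL template or learned blendshapes, and the function name `reconstruct` is hypothetical.

```python
import numpy as np

# Toy linear morphable model: a mesh is the mean (template) shape plus a
# weighted sum of learned shape blendshapes. Real models (e.g. SMPL) add
# pose-dependent deformation and skinning on top of this.
rng = np.random.default_rng(0)

n_vertices = 100        # toy value; SMPL uses 6890 vertices
n_shape_coeffs = 10     # low-dimensional shape space

template = rng.normal(size=(n_vertices, 3))                     # mean mesh
shape_basis = rng.normal(size=(n_shape_coeffs, n_vertices, 3))  # blendshapes

def reconstruct(betas: np.ndarray) -> np.ndarray:
    """Deform the template mesh by a weighted sum of blendshapes.

    In a fitting pipeline, `betas` (and pose parameters) would be
    optimised, or regressed by a network, so that the projected mesh
    matches landmarks detected in the 2D image or video frame.
    """
    return template + np.einsum("k,kvc->vc", betas, shape_basis)

mesh = reconstruct(np.zeros(n_shape_coeffs))
# With all coefficients at zero, we recover the mean mesh.
assert np.allclose(mesh, template)
```

Fitting such a model to images amounts to searching this low-dimensional coefficient space, which is what makes per-frame recovery of head pose, body shape, and expression tractable.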

As part of WARA Media and Language, and in collaboration with the Perceptual Neuroscience group at Karolinska Institutet (KI) and Volvo, the model is applied in three contexts: (1) estimating user affect for a digital avatar; (2) analysing nonverbal responses to sensory stimuli (e.g., quantifying approach–avoidance tendencies to olfactory stimuli); and (3) estimating frustration in driving scenarios.

PhD student: Chen Ling, KTH
