

European Grant for Mariska Kret's Virtual Reality emotion training tool

Teaching people to recognize subtle, real-world expressions will help them understand and trust others better. Mariska Kret aims to develop an interactive virtual-reality training tool (E-VIRT) for a broad group of users, including patients. Here, Kret briefly describes the idea behind her Proof of Concept Grant.

The problem

Humans’ capacity to express, recognize and share emotions enables them to navigate their social worlds and forms a core component of what it means to be socially competent and healthy. Emotional expressions have received much attention in the literature and in training applications. However, the focus has been on explicit, acted, prototypical facial expressions of the type rarely seen in daily life. In reality, subtle cues such as dilated pupils, tears and blushing make emotion recognition challenging and pose a problem for many patients with social anxiety disorder and autism spectrum disorders.

Kret's research shows that most patients with these disorders are not severely impaired when it comes to recognizing explicit expressions, but that problems start to emerge when cues are subtle. In addition, such patients tend to avoid eye contact and become hyperaroused. Imagine an estate agent telling you, with a big smile, that the problem of damp has been solved and the wall will soon dry out. His blushing and avoidance of eye contact set your alarm bells ringing. Many patients, however, would not pick up on these cues of untrustworthiness and, going by the smile alone, would make the wrong decision. Difficulties with real-world emotions lead them to trust the wrong people and to doubt that sharing emotions can bring help, assistance and support. These individuals risk social isolation, and their low confidence may prevent them from reaching out to a therapist.

The solution

Teaching people to recognize subtle, real-world expressions will help them understand and trust others better. The aim of this project is to develop an interactive virtual-reality training tool (E-VIRT) for a broad group of users, including patients. They could benefit from online help as a first step to practicing social skills in a safe home environment, without the presence of others, and at their own pace. The virtual environment is statistically programmed, and the machine-learning and similarity algorithms behind the virtual characters’ expressions and the recognition of the user’s emotions safeguard the program’s ecological validity, making real-life impact feasible.

In E-VIRT, the user is immersed in a virtual environment in which they are gradually exposed to social situations differing in complexity (e.g., job interview, dinner speech). The user interacts with characters who express emotions which are adapted, in real time, to the user’s expressions. The user will be trained on emotion processing in three steps: emotion regulation, emotion recognition, and interoception.
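
As a rough illustration of this interaction loop, the sketch below shows, in Python, one way such real-time adaptation could be structured: the user's expression is read at each step and the virtual character's emotion and intensity are adjusted in response, across scenarios of increasing complexity. Everything in it is a hypothetical simplification for this article (the scenario list, the cue list, the read_user_expression stub and the adapt_character rule); the actual tool would read the user's face from camera or VR sensors and drive animated characters.

    from dataclasses import dataclass
    import random

    # Illustrative scenario levels, ordered by social complexity (names are examples only).
    SCENARIOS = ["coffee with a friend", "job interview", "dinner speech"]

    # Subtle cues the training focuses on, as described in the text.
    SUBTLE_CUES = ["dilated pupils", "tears", "blushing", "averted gaze"]


    @dataclass
    class CharacterState:
        """Expression state of a virtual character (fields are illustrative)."""
        emotion: str = "neutral"
        intensity: float = 0.3  # 0 = barely visible cue, 1 = explicit expression


    def read_user_expression() -> str:
        """Stand-in for a real-time classifier of the user's facial expression.
        In the actual tool this input would come from camera or VR sensors."""
        return random.choice(["smile", "frown", "neutral", "averted gaze"])


    def adapt_character(state: CharacterState, user_expression: str) -> CharacterState:
        """Adapt the character's expression to the user's, mirroring the idea that
        the characters respond in real time to what the user shows."""
        if user_expression == "smile":
            # A smiling user gets a slightly more explicit positive expression back.
            state.emotion, state.intensity = "joy", min(1.0, state.intensity + 0.1)
        elif user_expression == "averted gaze":
            # A user who avoids eye contact keeps the character's cues subtle.
            state.intensity = max(0.1, state.intensity - 0.1)
        else:
            state.emotion = "concern"
        return state


    def run_session(steps_per_scenario: int = 5) -> None:
        """Walk the user through scenarios of increasing social complexity."""
        for scenario in SCENARIOS:
            state = CharacterState()
            print(f"--- Scenario: {scenario} ---")
            for _ in range(steps_per_scenario):
                user_expr = read_user_expression()
                state = adapt_character(state, user_expr)
                cue = random.choice(SUBTLE_CUES)
                print(f"user shows {user_expr!r} -> character shows {state.emotion} "
                      f"(intensity {state.intensity:.1f}, cue: {cue})")


    if __name__ == "__main__":
        run_session()

In the real tool, the three training steps named above (emotion regulation, emotion recognition, interoception) would each shape how this loop responds, and the placeholder classifier would be replaced by the project's own expression-recognition models.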

The origin

This project builds on Kret's ERC Starting Grant. One major achievement of that project is that we have developed a database consisting of thousands of static and dynamic, genuine, non-acted facial expressions taken from real-life situations across cultures (from reality TV, TikTok and from filming participants in the lab after emotion induction). There is huge potential to use this material in the training application, both as input for the virtual characters’ expressions and for the automatic recognition of users’ expressions.
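
To give a sense of how such a database could serve both purposes, here is a minimal sketch in Python. The record layout and field names are invented for illustration and are not the project's actual database schema; the split simply reflects the two uses named above.

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class ExpressionClip:
        """One entry in a hypothetical expression database (fields are illustrative)."""
        clip_id: str
        emotion_label: str       # e.g. "joy", "embarrassment"
        subtle_cues: List[str]   # e.g. ["blushing", "averted gaze"]
        source: str              # e.g. "reality TV", "TikTok", "lab induction"
        dynamic: bool            # True for video clips, False for still images


    def split_uses(db: List[ExpressionClip]) -> Tuple[List[ExpressionClip], List[ExpressionClip]]:
        """Reflects the two uses named in the text: dynamic clips as input for the
        virtual characters' expressions, and all labelled material as training data
        for automatically recognizing users' expressions."""
        character_input = [clip for clip in db if clip.dynamic]
        recognition_training = list(db)
        return character_input, recognition_training


    if __name__ == "__main__":
        db = [
            ExpressionClip("c001", "embarrassment", ["blushing"], "lab induction", True),
            ExpressionClip("c002", "joy", [], "reality TV", False),
        ]
        for_characters, for_recognition = split_uses(db)
        print(len(for_characters), "clips for character animation,",
              len(for_recognition), "clips for recognition training")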
