Decoding Facial Expressions For Useful Applications

As the end of the academic year approaches, stress levels start to rise. For many third year students, this means completing their final year projects (FYPs) – the result of their blood, sweat, and tears.

As a final year Psychology student, I had the opportunity to explain my FYP experiment on IGNITE. Continue reading for further information.

What is my experiment about?

My experiment is on the topic of ‘gaze behaviour during facial emotion recognition’. Don’t worry, it’s not as complicated as it sounds!

I investigated which emotions are identified more or less accurately. In particular, I focused on five emotions: anger, fear, happiness, sadness, and surprise.

Essentially, our ability to identify emotions depends largely on how we process the face. Processing the face as a whole entity facilitates emotion recognition, whereas processing facial features independently impairs it. The aim of my experiment was to determine whether emotion recognition depends on the way we process the face.


How did I execute the experiment?

The participants completed an emotion identification task: they viewed 300 photos of different facial expressions and judged the emotion shown in each face. Eye tracking software tracked and measured their eye movements throughout.
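For readers curious about the scoring step, the task above boils down to comparing each participant's response with the emotion actually shown. The sketch below is purely illustrative (the trial data and function name are made up, not taken from my experiment) and shows how per-emotion accuracy could be computed:

```python
from collections import defaultdict

# Hypothetical trial data: (shown_emotion, participant_response) pairs.
# In the real experiment each participant judged 300 photos; this toy
# sample only illustrates the scoring logic.
trials = [
    ("happiness", "happiness"),
    ("fear", "surprise"),
    ("anger", "anger"),
    ("surprise", "surprise"),
    ("sadness", "sadness"),
    ("fear", "fear"),
]

def accuracy_by_emotion(trials):
    """Return the proportion of correct identifications per shown emotion."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for shown, response in trials:
        total[shown] += 1
        if response == shown:
            correct[shown] += 1
    return {emotion: correct[emotion] / total[emotion] for emotion in total}

print(accuracy_by_emotion(trials))
```

Aggregating these proportions across participants is what lets us say that one emotion, such as surprise, was recognised more accurately than another.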

Why did I choose this topic?

Eye tracking technology piqued my interest and I wanted to learn more about it. The concept of facial emotion recognition also fascinated me, and this project combined both of these aspects.

How does the research apply to real life?

This study is highly relevant to our daily lives, since we gain social information by looking at faces. The findings suggest that processing faces as a whole helps us identify emotions more accurately. We could apply this in everyday life when reading the emotions of our classmates, friends, and others.

Darwin’s Theory of Evolution by Natural Selection may explain why certain emotions are recognised better than others. In order to survive, we need to respond rapidly to threatening events, and one way we detect danger is by reading the emotions of others as an indication of threat. Consistent with this, my participants recognised surprise more accurately than other emotions. However, this explanation also predicts that fear and anger should be recognised more accurately, which was not the case in my experiment.

The development of artificial intelligence

Another way this research is useful relates to artificial intelligence (AI). Companies such as Unilever have introduced emotion recognition as an HR tool in their recruitment process. During interviews, subjective measures may be unreliable in determining whether a candidate is a good fit for the job. Emotion recognition offers a more objective measure of honest responses, mood (indicated by facial expression), and perhaps even personality.

AI technology can also find its niche in video games. Facial emotion detection can improve our understanding of how gamers are feeling and whether they are responding the way the game designers intended. Relying on verbal or written feedback from users may be inefficient, as they may not remember how they felt at a certain point in the game or may not be able to express their feelings effectively. Emotion detection is thus a practical method of observing emotional reactions as they happen.

Similarly, market research employs emotion detection to improve products and services. Traditionally, companies use surveys to learn about consumer needs and preferences, but survey answers do not always correspond with consumers' actions, and people may hold subconscious feelings towards certain brands or products. Their emotional reactions reveal feelings more accurately and could be useful in predicting consumer behaviour.

Written by Sophie Byfield
