Our Mind Controls How We Interpret Emotions on Others' Faces

Health In Focus

Highlights:
  • We perceive facial expressions based not just on what is reflected on the face, but also on our own conceptual understanding of emotions
  • For example, if a subject views anger and disgust as conceptually more similar, their visualized images of an angry and a disgusted face bear a greater physical resemblance
  • The facial cues used to understand others' emotions differ from person to person, depending on how each of us conceptually understands these emotions
People perceive emotions on another person's face according to their preconceived understanding of those emotions, reveals a new study by a team of researchers at New York University.

The study appeared in the journal Nature Human Behaviour.

"Perceiving other people's facial emotion expressions often feels as if we are directly reading them out from a face, but these visual perceptions may differ across people depending on the unique conceptual beliefs we bring to the table," explains Jonathan Freeman, the paper's senior author and an associate professor in NYU's Department of Psychology and Center for Neural Science.

"Our findings suggest that people vary in the specific facial cues they utilize for perceiving facial emotion expressions," says Freeman.

How People Form their Ideas about Different Emotions

Jeffrey Brooks, an NYU doctoral student, conducted the study, in which he asked the subjects how they formed their concept of different emotions. The subjects were then asked to estimate how closely related two different emotions were in their minds.

For instance, some people might think that anger and sadness are somewhat similar if they associate both emotions with actions like crying and slamming a fist on the table.

Others might consider anger and sadness completely different, on the grounds that the two emotions feel entirely different and lead to different actions.

Specifically, the authors assessed how similar the subjects held each pair of the following six emotions to be in their minds:
  • Happiness
  • Sadness
  • Anger
  • Disgust
  • Fear
  • Surprise
These six emotions have traditionally been regarded as universal across cultures and biologically hard-wired in humans.
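The pairwise similarity judgments described above can be sketched in code. This is an illustrative reconstruction, not the authors' materials: the emotion labels come from the article, but the 0-100 rating scale and the `rate_pair` callback are assumptions.

```python
from itertools import combinations

EMOTIONS = ["happiness", "sadness", "anger", "disgust", "fear", "surprise"]

def collect_similarity_matrix(rate_pair):
    """Build a symmetric similarity matrix from per-pair ratings.

    rate_pair(a, b) should return a similarity rating (here assumed
    0-100) for one pair of emotion labels; six emotions give 15
    unique pairs.
    """
    sim = {e: {e2: None for e2 in EMOTIONS} for e in EMOTIONS}
    for e in EMOTIONS:
        sim[e][e] = 100.0  # an emotion is maximally similar to itself
    for a, b in combinations(EMOTIONS, 2):
        r = float(rate_pair(a, b))
        sim[a][b] = r  # store both directions: ratings are symmetric
        sim[b][a] = r
    return sim

# Example: a hypothetical subject who holds anger and disgust to be
# conceptually close, and all other pairs distant.
ratings = {frozenset(p): 20.0 for p in combinations(EMOTIONS, 2)}
ratings[frozenset(("anger", "disgust"))] = 85.0
subject = collect_similarity_matrix(lambda a, b: ratings[frozenset((a, b))])
```

A matrix like this, one per subject, is what the later analyses compare against the visual-perception measures.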

Freeman and Brooks then tested whether the different ways in which the subjects conceptually held the six emotions in their mind may influence how subjects visually perceive these emotions on others' faces.

The subjects viewed a series of pictures of human facial expressions depicting the emotions and made judgments about the emotion these faces were expressing.

New Technology Developed to Track Emotions

To obtain unbiased results, Freeman developed an innovative mouse-tracking technology that uses the subject's hand movements to reveal which emotion category came to mind on seeing a facial expression. With this technique, subjects must make split-second decisions and therefore cannot consciously alter their responses.
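A standard way to quantify this kind of hand-movement attraction in mouse-tracking research is the trajectory's maximum deviation from the straight line joining its start and end points: the more the cursor bows toward the competing response, the larger the deviation. The sketch below illustrates that general measure under assumed 2-D screen coordinates; it is not the authors' actual analysis code.

```python
def max_deviation(trajectory):
    """Maximum perpendicular deviation of a mouse trajectory from the
    straight line between its first and last points: an index of how
    strongly the competing response category attracted the cursor.

    trajectory: list of (x, y) points from movement start to response.
    """
    (x0, y0), (xn, yn) = trajectory[0], trajectory[-1]
    dx, dy = xn - x0, yn - y0
    length = (dx * dx + dy * dy) ** 0.5
    if length == 0:
        return 0.0
    # perpendicular distance of each sample point from the ideal line
    return max(abs(dy * (x - x0) - dx * (y - y0)) / length
               for x, y in trajectory)

# A trajectory that bows toward the opposite response before settling,
# versus one that heads straight for the chosen category:
curved = [(0, 0), (1, 3), (2, 5), (5, 5), (8, 3), (10, 0)]
straight = [(0, 0), (5, 0), (10, 0)]
```

On this toy data the curved trajectory deviates by 5 screen units while the straight one deviates by 0, which is the kind of contrast the split-second responses make measurable.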

The experiments revealed that when the subjects thought that any two emotions were conceptually similar, they saw a corresponding similarity in the faces that showed those emotions.

So, if a subject held the preconceived idea that two emotions such as anger and disgust were conceptually similar, their hand movements drifted toward both emotion categories when viewing a single facial expression.

Finally, the authors used a technique known as 'reverse correlation' to see how the subjects visualized the six different emotions in their mind's eye.

They started with a single neutral face and created hundreds of different versions of it by overlaying varying patterns of random noise. The noise introduced random variations in the face's cues; for example, one version might appear to be slightly frowning even though the underlying face was neutral.

Subjects were then asked to look at two different versions of this face and decide which looked more like a specific emotion (e.g., happiness). In reality, both versions showed the same underlying face; only the noise patterns made them appear different. By averaging the noise patterns each subject chose, the researchers could compute an average facial "prototype" for each of the six emotions, a kind of window into that subject's mind's eye.
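The averaging step at the heart of reverse correlation can be sketched as follows. The toy noise vectors and trial indices here are hypothetical; real studies overlay full-resolution noise images on the face, but the principle of averaging the noise from the chosen trials is the same.

```python
def classification_image(noise_patterns, chosen):
    """Average the noise patterns a subject selected for one emotion.

    noise_patterns: list of equal-length noise vectors, one per trial.
    chosen: indices of the versions the subject picked as looking more
    like the target emotion. The average of the chosen noise
    approximates the facial cues that emotion evokes in the subject's
    mind's eye.
    """
    picked = [noise_patterns[i] for i in chosen]
    n = len(picked)
    return [sum(vals) / n for vals in zip(*picked)]

# Toy trials: each noise vector perturbs two hypothetical facial cues
# (say, brow position and mouth curvature) on the same neutral base.
noise = [[1.0, -1.0], [3.0, 1.0], [0.0, -2.0], [-2.0, 4.0]]
chosen_for_happiness = [0, 1]  # trials whose version looked "happier"
prototype = classification_image(noise, chosen_for_happiness)
```

Repeating this per emotion yields the six per-subject prototypes the article describes.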

Facial Expressions are Perceived Differently

The results converged with the mouse-tracking results: when any two emotions were conceptually more similar in a subject's mind, the visualized facial prototypes of those two emotions physically resembled one another to a greater extent.
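Quantifying that convergence amounts to correlating, across emotion pairs, each subject's conceptual similarity ratings with the physical resemblance of the corresponding visual prototypes. A minimal Pearson-correlation sketch, with invented numbers purely for illustration:

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length number sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# For each emotion pair: a hypothetical conceptual similarity rating
# and a hypothetical resemblance score between the two reconstructed
# prototypes (e.g. a pixel-wise similarity). Both columns are made up.
conceptual = [85.0, 20.0, 30.0, 25.0, 40.0, 15.0]
prototype_resemblance = [0.9, 0.2, 0.35, 0.3, 0.5, 0.1]
r = pearson(conceptual, prototype_resemblance)
```

A strongly positive `r` for a subject would correspond to the pattern the study reports: conceptually closer emotions look more alike in that subject's mind's eye.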

"The findings suggest that how we perceive facial expressions may not just reflect what's in the face itself, but also our own conceptual understanding of what the emotion means," explains Freeman, who notes interest in facial expressions has intrigued scientists dating back to Charles Darwin in the 19th century.

"For any given pair of emotions, such as fear and anger, the more a subject believes these emotions are more similar, the more these two emotions visually resemble one another on a person's face. The results suggest that we may all slightly differ in the facial cues we use to understand others' emotions, because they depend on how we conceptually understand these emotions."

Classic scientific theories of emotion assume that each emotion has its own specific facial expression that every human universally recognizes. On this view, the same exact facial expression, say a scowling face for anger, should always be perceived as anger, and what we personally think constitutes "anger" should not affect the process.

Freeman observes that the findings may have implications for artificial intelligence and machine learning. Automated algorithms are increasingly used for facial emotion recognition and for other computer-vision and security applications; such systems could potentially be enhanced by incorporating conceptual processes.

Reference:
  1. Jeffrey A. Brooks & Jonathan B. Freeman. Conceptual knowledge predicts the representational structure of facial emotion perception. Nature Human Behaviour (2018).


Source: Medindia
