Man poses for 6 headshots with varying facial expressions

You can’t determine emotion from someone’s facial movements, and neither can AI.

If you saw a person with their brow furrowed, mouth turned down, and eyes squinted, would you guess they’re angry? What if you found out they’d forgotten their reading glasses and were deciphering a restaurant menu?

Interpreting a person’s facial movements can’t be done in a vacuum; it depends on the context—something that Northeastern neuroscientist Lisa Feldman Barrett shows in a groundbreaking new study published Thursday in the scientific journal Nature Communications.

Barrett, a university distinguished professor of psychology at Northeastern, and colleagues from several other institutions around the world used photographs of professional actors portraying richly constructed scenarios. They show that people not only use different facial movements to communicate different instances of the same emotion category (someone might scowl, frown, or even laugh when portraying anger), but also employ similar facial configurations to communicate instances of different emotion categories (a scowl might sometimes express concentration, for example). The findings have serious implications for emotion recognition technology that purports to “read” emotions in the face.

Read more at News@Northeastern.

Photos by Matthew Modoono/Northeastern University.
