When people smile politely, flash a grin of recognition, or tighten their lips in disapproval, the movement is tiny – but the message can be huge.

Imagine you are in a courtroom observing a juror who is closely watching a witness’s testimony. Could you predict the juror’s opinion just by observing his facial expressions?

Or suppose you’re talking to someone who suddenly stops smiling. His eyebrows pull together, eyes narrow slightly, and lips press into a thin line. Even if he says, “It’s fine,” you can reasonably infer that he feels annoyed, worried, or uncomfortable – because that combination of facial gestures is commonly associated with negative emotions such as sadness, concern, anger, frustration, surprise, or shock.

So, when you know what a person feels from their facial gestures, you’re reading these subtle movements and matching them to emotional patterns your brain has learned over time. Facial gestures are among the most powerful forms of communication in primate societies, delivering emotion, intention, and social meaning in fractions of a second.

A study in the prestigious journal Science has uncovered how the brain prepares and produces these gestures through a temporally organized hierarchy of neural “codes,” including signals that appear well before movement begins.

Smiling young woman using seashells as earrings (credit: INGIMAGE)

Facial expressions and brain waves

The study, titled “Facial gestures are enacted through a cortical hierarchy of dynamic and stable codes,” was headed by Prof. Yifat Prut of the Medical Neurobiology Department of the Hebrew University of Jerusalem (HUJI) and Prof. Winrich Freiwald, Dr. Geena Ianni, and Dr. Yuriria Vázquez of Rockefeller University.

Every time we make a facial gesture, it feels effortless, but the brain is quietly coordinating an intricate performance. The new study shows that facial gestures aren’t controlled by two separate “systems” – one for deliberate expressions and the other for emotional ones – as scientists long assumed. Instead, multiple face-control regions in the brain work together, using different kinds of signals.

Some facial-related brain signals are fast and shifting, like real-time choreography, while others are steadier and vary on a longer time scale. Remarkably, these brain patterns appear before the face even moves; the brain starts preparing a gesture in advance, shaping it not just as a movement, but as a socially meaningful message.

That’s important because facial expressions are one of our most powerful tools for communication, and understanding how the brain builds them helps to explain what can go wrong after brain injury or in conditions that affect social signaling. This may eventually guide new ways to restore or interpret facial communication when it’s lost, Prut suggested in an interview with The Jerusalem Post. She spent a year-long sabbatical working with the Rockefeller scientists.

For decades, neuroscience has leaned on a neat division: lateral cortical areas in the frontal lobe control deliberate, voluntary facial movements, while medial areas govern emotional expressions. This view was shaped in part by clinical evidence from people with brain tumors. Previous research has largely tested the association between facial expressions and preferences in non-social contexts, but the researchers said it is important to attend to the social facets of this process, as preference formation is often social.

Facial expressions are a communicative signal that can help us interpret, continuously and non-verbally, what others like.

But by directly measuring activity from neurons across both cortical regions, the researchers found something striking – both regions encode both voluntary and emotional gestures, and do so in ways that are distinguishable well before any visible facial movement occurs. Previous literature suggests that facial expressions both indicate one’s internal emotional state and convey emotional messages intended for others.

Prut said that in the future, the team hopes to study interactions among various parts of the brain and their contribution to producing appropriate, context-related facial movements. A better understanding of the brain mechanisms that underlie facial gestures could help in diagnosing physical and psychological problems, including Parkinson’s disease and depression.

“Understanding how cortical codes generate naturalistic communication may inform brain-computer interfaces designed to restore these functions in patients.

“Movement is an essential component of our daily lives. All kinds of movements – including locomotion, reaching for objects, communicating verbally, or even making subtle facial gestures – require coordination across large numbers of muscles. The nervous system is able to perform these tasks in a rapid, precise, and adaptable manner,” the neurobiology professor said.

Prut, who received her doctorate from HUJI and completed a postdoctoral fellowship at the University of Washington in Seattle, noted that these distinct codes can be detected by measuring the activity of individual neurons across extended cortical regions, well before any visible facial movement occurs.

Facial gestures are not just physical movements; they are social actions, and the brain treats them as such. This discovery offers a new framework for understanding how such gestures are coordinated in real time, how communication-related motor control is structured in the brain, and what can go wrong in disorders where facial signaling is disrupted – whether through neurological injury or conditions affecting social communication.

By showing that multiple brain regions work in parallel, each contributing different timing-based codes, the study opens new pathways for exploring how the brain produces socially meaningful behavior.