Gestures fulfill a role in language

Research finds that actual actions on objects have less impact on the brain’s understanding of speech than simply gesturing.

By ACOUSTICAL SOCIETY OF AMERICA (ASA)
May 13, 2012 13:30
Stirring a spoon in a cup (photo credit: Thinkstock/Imagebank)


People of all ages and cultures gesture while speaking, some much more noticeably than others. But is gesturing uniquely tied to speech, or is it, rather, processed by the brain like any other manual action?

A US-Netherlands research collaboration delving into this tie discovered that actual actions on objects, such as physically stirring a spoon in a cup, have less of an impact on the brain’s understanding of speech than simply gesturing as if stirring a spoon in a cup. This is surprising because there is less visual information contained in gestures than in actual actions on objects. In short: Less may actually be more when it comes to gestures and actions in terms of understanding language.

Spencer Kelly, associate professor of Psychology, director of the Neuroscience program, and co-director of the Center for Language and Brain at Colgate University, and colleagues from the National Institutes of Health and Max Planck Institute for Psycholinguistics will present their research at the Acoustics 2012 meeting in Hong Kong, May 13-18, a joint meeting of the Acoustical Society of America (ASA), Acoustical Society of China, Western Pacific Acoustics Conference, and the Hong Kong Institute of Acoustics.

Among their key findings is that gestures – more than actions – appear to make people pay attention to the acoustics of speech. When we see a gesture, our auditory system expects to also hear speech. But this is not what the researchers found in the case of manual actions on objects.

Just think of all the actions you’ve seen today that occurred in the absence of speech. “This special relationship is interesting because many scientists have argued that spoken language evolved from a gestural communication system – using the entire body – in our evolutionary past,” points out Kelly. “Our results provide a glimpse into this past relationship by showing that gestures still have a tight and perhaps special coupling with speech in present-day communication. In this way, gestures are not merely add-ons to language – they may actually be a fundamental part of it.”

A better understanding of the role hand gestures play in how people understand language could lead to new audio and visual instruction techniques to help people overcome major challenges with language delays and disorders or learning a second language.

What’s next for the researchers? “We’re interested in how other types of visual inputs, such as eye gaze, mouth movements, and facial expressions, combine with hand gestures to impact speech processing. This will allow us to develop even more natural and effective ways to help people understand and learn language,” says Kelly.



This article was first published at www.newswise.com
