People of all ages and cultures gesture while speaking, some much more noticeably than others. But is gesturing uniquely tied to speech, or is it, rather, processed by the brain like any other manual action?
A US-Netherlands research collaboration delving into this tie discovered that actual actions on objects, such as physically stirring a spoon in a cup, have less of an impact on the brain’s understanding of speech than simply gesturing as if stirring a spoon in a cup. This is surprising because there is less visual information contained in gestures than in actual actions on objects. In short: Less may actually be more when it comes to gestures and actions in terms of understanding language.
Spencer Kelly, associate professor of Psychology, director of the Neuroscience program, and co-director of the Center for Language and Brain at Colgate University, and colleagues from the National Institutes of Health and Max Planck Institute for Psycholinguistics will present their research at the Acoustics 2012 meeting in Hong Kong, May 13-18, a joint meeting of the Acoustical Society of America (ASA), Acoustical Society of China, Western Pacific Acoustics Conference, and the Hong Kong Institute of Acoustics.
Among their key findings is that gestures – more than actions – appear to make people pay attention to the acoustics of speech. When we see a gesture, our auditory system expects to also hear speech. But this is not what the researchers found in the case of manual actions on objects.
Just think of all the actions you’ve seen today that occurred in the absence of speech. “This special relationship is interesting because many scientists have argued that spoken language evolved from a gestural communication system – using the entire body – in our evolutionary past,” points out Kelly. “Our results provide a glimpse into this past relationship by showing that gestures still have a tight and perhaps special coupling with speech in present-day communication. In this way, gestures are not merely add-ons to language – they may actually be a fundamental part of it.”
A better understanding of the role hand gestures play in how people understand language could lead to new audio and visual instruction techniques to help people overcome major challenges such as language delays and disorders, or the difficulty of learning a second language.
What’s next for the researchers? “We’re interested in how other types of visual inputs, such as eye gaze, mouth movements, and facial expressions, combine with hand gestures to impact speech processing. This will allow us to develop even more natural and effective ways to help people understand and learn language,” says Kelly.
This article was first published at www.newswise.com.