Health Scan: 3D simulator reveals the inner workings of autistic brain

“Theories of global and multisensory integration deficits in ASD are deeply rooted in the scientific conversation about autism,” says expert.

An image of the human brain (photo credit: REUTERS)
What is it like to have an autistic brain? Sufferers can’t easily describe the sensations, but a simulator developed at Baylor College of Medicine in Houston, with help from Bar-Ilan University (BIU) in Ramat Gan, now offers a peek at how the brains of people with autism spectrum disorder (ASD) process sensory stimuli. Strapped into a motion-enabled simulator and wearing 3D glasses, 36 teenage volunteers recently experienced what it was like to “travel” through a field of virtual stars, and the experiments yielded new data on how sensory stimuli are processed in ASD.
The study, published recently online in the Proceedings of the National Academy of Sciences, was written by BIU’s Dr. Adam Zaidel together with Baylor’s Dr. Robin Goin-Kochel and Dr. Dora Angelaki. The experiments were performed in the motion simulator at Baylor; a similar simulator is to be built at BIU.
One of the hallmarks of ASD is superior performance on low-level tasks alongside reduced performance on tasks that involve processing complex sensory data. This has led to the assumption that autism is characterized by difficulty integrating individual units of perceptual data into global concepts, and, by extension, that people with autism have difficulty integrating multi-sensory input. The new study challenges this conventional wisdom and instead identifies heightened sensitivity to “noisy” sensory signals.
“Theories of global and multisensory integration deficits in ASD are deeply rooted in the scientific conversation about autism,” said Zaidel, a member of BIU’s Gonda (Goldschmied) Multidisciplinary Brain Research Center and the article’s lead author. “Recently, this notion has come under scrutiny, as more and more investigators have observed discrepancies with experimental results.”
Visual motion processing in ASD has usually been tested with a computer-based task in which participants are asked to report the overall direction of motion of a field of dots while a certain proportion of the dots – the “noise” in an otherwise coherent picture – move in random directions. In these experiments, the level of noise at which participants can no longer determine the overall direction is taken as a measure of their innate ability to integrate isolated visual stimuli into a global picture.
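To make that paradigm concrete, here is a minimal Python sketch of the idea (a toy observer model based only on the description above, with made-up names such as trial_correct; it is not the researchers’ code). A fraction of the dots move coherently left or right, the rest move randomly, and a simulated observer reports the sign of the pooled horizontal motion; the noise level at which accuracy collapses toward chance is what such studies treat as the integration threshold.

```python
# Toy model of the classic random-dot task: a fraction of dots move coherently
# left (-1) or right (+1), the rest move in random directions, and a simulated
# observer reports the sign of the average horizontal motion across all dots.
import numpy as np

rng = np.random.default_rng(0)

def trial_correct(noise_fraction, n_dots=100):
    true_dir = rng.choice([-1.0, 1.0])                   # direction of the coherent dots
    n_noise = int(round(noise_fraction * n_dots))
    coherent = np.full(n_dots - n_noise, true_dir)
    noise = np.cos(rng.uniform(0, 2 * np.pi, n_noise))   # horizontal component of random motion
    report = np.sign(np.concatenate([coherent, noise]).mean()) or 1.0
    return report == true_dir

for noise in (0.0, 0.9, 0.95, 0.99):
    acc = np.mean([trial_correct(noise) for _ in range(2000)])
    print(f"noise fraction {noise:.2f}: accuracy {acc:.2f}")
```

Running the sketch shows accuracy staying near perfect until the noise fraction is very high, which is why these designs must rely on noise itself to make the task hard.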
Zaidel’s new approach indicates that such traditional methods – which rely on noise to modulate task difficulty – have led to widespread misinterpretation of how individuals with ASD integrate visual stimuli.
“Our study is carried out in a 3D environment in which a field of moving dots generates the feeling of traveling through space, with different trials ‘steering’ to the right or left of straight ahead,” Zaidel said. “By asking participants to indicate their perceived direction of movement, we test their ability to create a global picture out of individual details. Significantly – and this is where our method differs from previous tests – we can achieve measurable results both when randomized dots are included in the overall picture, and in a completely coherent, noise-free environment.”
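In the same spirit, here is a hedged sketch of the heading-discrimination logic Zaidel describes (again a toy model with hypothetical names such as heading_trial_correct, not the study’s code). The simulated observer judges whether self-motion is to the left or right of straight ahead; difficulty is set by how small the heading angle is, so thresholds can be measured even with zero noise, and randomly moving dots can be added as a separate manipulation.

```python
# Toy model of the heading task: signal dots indicate the true heading angle,
# optional noise dots indicate random directions, and the observer reports
# whether the pooled estimate is left or right of straight ahead.
import numpy as np

rng = np.random.default_rng(1)

def heading_trial_correct(heading_deg, noise_fraction=0.0, n_dots=200,
                          dot_noise_deg=2.0, internal_noise_deg=1.0):
    n_noise = int(round(noise_fraction * n_dots))
    signal = heading_deg + rng.normal(0, dot_noise_deg, n_dots - n_noise)
    noise = rng.uniform(-90, 90, n_noise)                # "noise" dots: unrelated directions
    pooled = np.concatenate([signal, noise]).mean()
    # Internal (decision/vestibular) noise that does not average out across dots.
    estimate = pooled + rng.normal(0, internal_noise_deg)
    return np.sign(estimate) == np.sign(heading_deg)

for noise in (0.0, 0.5):
    for angle_deg in (8.0, 2.0, 0.5):
        acc = np.mean([heading_trial_correct(rng.choice([-1.0, 1.0]) * angle_deg, noise)
                       for _ in range(2000)])
        print(f"noise {noise:.1f}, heading ±{angle_deg}°: accuracy {acc:.2f}")
```

Even with the noise fraction at zero, accuracy varies smoothly with heading angle, which is the property that lets this design measure performance in a completely coherent environment.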
Zaidel noted that without randomly moving dots, autistic participants performed well, determining the direction of movement at a level similar to that of the non-ASD control group. When noisy signals were introduced, however, the ASD group was significantly more affected than the controls. This suggests that it is the presence of noise – rather than any innate integration deficit – that makes the task more difficult for people with autism.
The simulator also physically moved the person sitting in the chair, requiring participants to respond to and interpret visual and vestibular (balance-related) stimuli at the same time.
“By adding movement to the experiment, we created a situation in which participants didn’t just see the direction of the movement, but felt it as well,” he explained. “In this scenario, people with autism displayed intact multi-sensory integration, completing tasks in a normative manner, both in a coherent, noiseless environment, and even when noise was present. These findings raise questions about prevalent theories related to multi-sensory integration deficits in ASD.”
“Our results suggest that people with autism may experience a deficiency in ‘Bayesian priors’ – the ability to draw on existing knowledge to understand what we see and to predict what we will see in the near future,” Zaidel said. “If you’re more heavily weighted toward perceiving the world bottom up – from stimulus to perception – and relying less on rules of thumb from prior knowledge, perception will be both more taxing, and more sensitive to sensory noise.”
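To illustrate what down-weighting a Bayesian prior does, the sketch below uses the standard textbook rule for combining two Gaussian estimates (a toy model with hypothetical names such as percept_variability, not the authors’ analysis): each percept is a precision-weighted average of a prior expectation and noisy sensory evidence. As the prior weakens, the percept tracks the noisy evidence more closely and becomes more variable from trial to trial, which is the “more sensitive to sensory noise” pattern the quote describes.

```python
# Toy model of Bayesian cue weighting: the percept is a precision-weighted
# average of a prior expectation and noisy sensory evidence. A weaker prior
# (larger prior sd) shifts weight onto the noisy evidence, so the percept
# varies more from trial to trial.
import numpy as np

rng = np.random.default_rng(2)

def percept_variability(prior_precision, n_trials=5000,
                        true_value=0.0, sensory_sd=5.0):
    evidence = true_value + rng.normal(0, sensory_sd, n_trials)  # noisy sensory samples
    sensory_precision = 1.0 / sensory_sd**2
    prior_mean = 0.0                                             # prior expectation ("rule of thumb")
    w_prior = prior_precision / (prior_precision + sensory_precision)
    percepts = w_prior * prior_mean + (1 - w_prior) * evidence
    return percepts.std()                                        # trial-to-trial variability

for prior_sd in (2.0, 5.0, 50.0):                                # larger sd = weaker prior
    var = percept_variability(prior_precision=1.0 / prior_sd**2)
    print(f"prior sd {prior_sd:5.1f}: percept variability {var:.2f}")
```

With a strong prior the percept barely fluctuates with the noise; with an almost flat prior it inherits nearly all of the sensory variability.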
It may someday be possible to study this phenomenon directly in the brain, or to develop treatments that help people with autism become more adept at reconnecting to and using prior knowledge, Zaidel concluded.
“The research showed that heightened sensitivity to sensory noise – the random signals inserted into the visual tasks traditionally used to test sensory integration in autism – may provide an alternative explanation for impaired performance. When this noise is removed from the equation, the integration of visual motion stimuli in ASD is equal to, or perhaps even superior to, that of the control group, and the multi-sensory integration seen in autistic participants is comparable to that of the non-autistic control group.”