Hebrew U. scientists map brains of the blind

Researchers studying the brain activity of blind people have tackled the long-inscrutable question of how tasks such as reading and identifying numerical symbols came to have their own brain regions.

Eye exam [Illustrative] (photo credit: REUTERS)
Researchers at the Hebrew University studying the brain activity of blind people have tackled the long-inscrutable question of how tasks such as reading and identifying numerical symbols have their own brain regions, given that these tasks were developed only a few thousand years ago.
They wondered what the job of these brain regions was before the invention of such symbols.
In a new paper published over the weekend in the prestigious journal Nature Communications, Sami Abboud and colleagues in the lab for brain and multisensory research of Prof. Amir Amedi show that these same “visual” brain regions are used by blind subjects, who are actually “seeing” through sound.
“These regions are preserved and functional even among the congenitally blind who have never experienced vision,” said Abboud. The finding challenges the conventional view of how the human brain specializes to perform different kinds of tasks: that these regions were originally adapted for other visual tasks, such as recognizing the angles of lines and their intersections.
The accepted view in previous decades was that the brain is divided into distinct regions mainly by the sensory input that activates them, such as the visual cortex for sight and the auditory cortex for sound. Within these large regions, sub-regions have been defined that are specialized for specific tasks such as the “visual word form area,” a functional brain region believed to identify words and letters from shape images even before they are linked with sounds or meanings.
Similarly, there is another area that specializes in number symbols.
Their new findings suggest that unexpected brain connectivity can lead to rapid brain specialization, allowing humans to adapt to the rapid technological and cultural innovation of our generation.
Vision, the researchers declared, is not a prerequisite for “visual” cortical regions to develop these preferences.
The required condition, the HU researchers stated, is not sensory-based (vision) but rather connectivity- and processing-based. For example, blind people reading Braille using their fingers will still use the “visual” areas of the brain.
The research shows unique connectivity patterns between the visual number form area (VNFA) and quantity-processing areas in the brain’s right hemisphere, and between the visual word form area (VWFA) and language-processing areas in the left hemisphere. This type of mechanism can help explain how our brain adapts quickly to the constant cultural and technological innovations of our era.
The very existence of the VWFA is surprising, they said, since symbols such as “O” can be used either as the letter O or as the number zero, despite being visually identical.
Shedding new light on how our brains can adapt to the rapid cultural and technological changes of the 21st century, the Abboud team used unique tools known as “sensory substitution devices.”
SSDs take information from one sense and present it in another, for example enabling blind people to “see” by using other senses such as touch or hearing. Amedi’s lab is located in the medical neurobiology department at the Institute for Medical Research Israel-Canada at HU’s Medical Faculty.
By using a smartphone or webcam to translate a visual image into a distinct soundscape, SSDs enable blind users to create a mental image of objects, such as their physical dimensions and color.
With intense training (available online at www.amedilab.com), blind users can even “read” letters by identifying their distinct soundscape.
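To make the principle concrete, the following is a minimal Python sketch of the kind of image-to-soundscape mapping such devices use: the image is scanned column by column, vertical position is mapped to pitch and brightness to loudness. The function name, frequency range and scan time here are illustrative assumptions, not the Amedi lab’s actual algorithm.

    import numpy as np

    def image_to_soundscape(image, sample_rate=22050, scan_seconds=1.0,
                            f_low=200.0, f_high=4000.0):
        # image: 2D numpy array (rows x cols) of brightness values in [0, 1].
        rows, cols = image.shape
        samples_per_col = int(sample_rate * scan_seconds / cols)
        # Higher rows in the image are given higher pitches (log-spaced).
        freqs = np.geomspace(f_high, f_low, rows)
        t = np.arange(samples_per_col) / sample_rate
        pieces = []
        for c in range(cols):                      # scan the image left to right
            column = np.zeros_like(t)
            for r in range(rows):                  # brightness controls loudness
                column += image[r, c] * np.sin(2 * np.pi * freqs[r] * t)
            pieces.append(column / rows)
        return np.concatenate(pieces)

    # A bright diagonal line, for example, becomes a sweeping tone.
    demo_image = np.eye(16)
    soundscape = image_to_soundscape(demo_image)

With training, a listener can learn to decode such sweeps back into shapes, which is the skill the blind users in the study acquired.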
“These devices can help the blind in their everyday life,” explained Amedi, “but they also open unique research opportunities by letting us see what happens in brain regions normally associated with one sense, when the relevant information comes from another.”
Abboud and colleagues were interested in whether blind people using sensory substitution would, like sighted people, use the VWFA sub-region of the brain to identify shape images, or whether this area is specialized exclusively for visual reading with the eyes.
The researchers used functional MRI (fMRI) to study the brains of blind subjects in real time while they used an SSD to identify objects by their sound. They found that when it comes to recognizing letters, body postures and more, specialized brain areas are activated by the task at hand, rather than by the sense (vision or hearing) being used.
Previous attempts to explain why both the word and number areas exist suggested that, in the far distant past, these areas were specialized for other visual tasks such as recognizing small lines, their angles and intersections, and thus were best suited to take on these newer tasks. However, the new work shows that congenitally blind users of sensory substitution devices still have these exact same areas, suggesting that vision is not the key to their development.
“Beyond the implications for neuroscience theory, these results also offer us hope for visual rehabilitation,” said Amedi. “They suggest that by using the right technology, even non-invasively, we can reawaken the visually deprived brain to process tasks considered visual, even after many years of blindness.”
Amedi suggested that the main criterion for a reading area to develop is not the letters’ visual symbols, but rather the area’s connectivity to the brain’s language-processing centers. Similarly, a number area will develop in a region that already has connections to quantity-processing regions. “If we take this one step further,” added Amedi, “this connectivity-based mechanism might explain how brain areas could have developed so quickly on an evolutionary timescale. We’ve been reading and writing for only several thousand years, but the connectivity between relevant areas allowed us to create unique new centers for these specialized tasks.”
The Amedi lab has developed several patented devices to help blind people identify objects and navigate using a technique called “sensory substitution” (mainly ‘seeing’ by translating an image taken from a simple smartphone or webcam into sound with no need for special hardware).
Another device it developed is the EyeCane, which uses an algorithm to translate distance into sound and vibrations. The EyeCane aims to boost mobility and navigation for the visually disabled.
Within five minutes of training, users can successfully navigate, detect, and avoid obstacles and estimate distance.
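As an illustration of the general idea, the hypothetical Python sketch below maps a measured distance to audio and vibration cues, with nearer obstacles producing higher-pitched, faster and stronger feedback. The ranges and mapping are assumptions for illustration only, not the EyeCane’s actual algorithm.

    def distance_to_feedback(distance_m, max_range_m=5.0):
        # Returns (tone_hz, beeps_per_second, vibration_strength in 0..1).
        d = min(max(distance_m, 0.0), max_range_m)
        closeness = 1.0 - d / max_range_m     # 1.0 = touching, 0.0 = out of range
        tone_hz = 200 + 1800 * closeness      # nearer obstacle -> higher pitch
        beeps_per_second = 1 + 9 * closeness  # nearer obstacle -> faster beeping
        vibration_strength = closeness        # nearer obstacle -> stronger vibration
        return tone_hz, beeps_per_second, vibration_strength

    # An obstacle half a meter away yields rapid, high-pitched, strong cues.
    print(distance_to_feedback(0.5))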
Recently published EyeCane research demonstrated that using the device significantly improves users’ mobility patterns. “Our users no longer cling to the walls,” noted Shachar Maidenbaum, one of the researchers working on this project. “Usually the blind avoid large open spaces since they don’t have ‘anchors’ in them, but the expanded sensory information from the EyeCane lets them easily walk down the center of a corridor or cut through the center of large rooms.”