Tuesday, October 16, 2012

New Research Reveals Complexity of Brain’s Auditory Processing

A new study links motor skills to perception, building on earlier brain imaging findings that only hinted at this connection.

The Gist

The sound of one hand clapping may be an ancient riddle meant to inspire Zen Buddhists, but it turns out that exactly what we hear may depend on what our hands are doing. That's right: whether your right or left hand is actively engaged while you listen could ultimately change how your brain processes the sounds you hear.

New research published this week and presented at Neuroscience 2012, the annual meeting of the Society for Neuroscience, is among the first to match human behavior with the specific auditory tasks handled by the brain's left and right hemispheres. Earlier brain imaging studies had observed specialization of auditory processing between the left and right hemispheres based on the tone and speed of speech as it was heard.

The new findings, reported by senior researcher Peter E. Turkeltaub, M.D., Ph.D., a neurologist in the Center for Brain Plasticity and Recovery at Georgetown University in Washington, D.C., may suggest new methods to help speech pathologists assist patients who have difficulty speaking after a stroke. Aphasia, the medical term for the disorder, affects more than one million Americans, according to the Aphasia Hope Foundation, a nonprofit that raises awareness about the condition.

The Expert Take

According to Turkeltaub, the first challenge is to more fully understand the basic organization of the brain’s auditory systems.

“Understanding better the specific roles of the two hemispheres in auditory processing will be a big step in that direction,” he says in the study. “If we find that people with aphasia, who typically have injuries to the left hemisphere, have difficulty recognizing speech because of problems with low-level auditory perception of rapidly changing sounds, maybe training the specific auditory processing deficits will improve their ability to recognize speech.”

In an exclusive interview with Healthline, Turkeltaub elaborates, “The first step will be finding out whether speech comprehension problems in people with stroke occur in part due to problems in processing rapidly changing sounds, like the ones we used in this study. If that’s true, then this study suggests that asking people with stroke to attempt movement of the right hand during speech comprehension therapy might improve outcomes.”  

Source and Method

Turkeltaub and his team embedded rapidly and slowly changing sounds in background noise and asked 24 volunteers to press a button whenever they heard one of the sounds. The volunteers were also asked to switch the hand they used to press the button, alternating between right and left every 20 seconds. When test subjects used their right hands, they detected the rapidly changing sounds more often; when they used their left hands, they detected the slowly changing sounds more often.
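To make the design concrete, here is a minimal sketch of how detection rates in this kind of task might be tallied by hand and sound type. The trial records, field names, and numbers below are invented for illustration and are not the study's actual data or analysis code.

```python
# Illustrative only: hypothetical trial records for a detection task in which
# sounds embedded in noise are reported by pressing a button with either hand.
from collections import defaultdict

# Invented example trials: which hand pressed the button, whether the target
# sound was rapidly or slowly changing, and whether the listener detected it.
trials = [
    {"hand": "right", "sound": "rapid", "detected": True},
    {"hand": "right", "sound": "slow",  "detected": False},
    {"hand": "left",  "sound": "slow",  "detected": True},
    {"hand": "left",  "sound": "rapid", "detected": False},
    # ...a real experiment would include many trials per volunteer
]

# Tally detection rates for each (hand, sound type) pairing.
hits = defaultdict(int)
counts = defaultdict(int)
for trial in trials:
    key = (trial["hand"], trial["sound"])
    counts[key] += 1
    hits[key] += trial["detected"]

for hand, sound in sorted(counts):
    rate = hits[(hand, sound)] / counts[(hand, sound)]
    print(f"hand={hand:<5} sound={sound:<5} detection rate={rate:.2f}")
```

In the study's reported pattern, the right-hand/rapid and left-hand/slow pairings would show the higher detection rates.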

“Since the left hemisphere controls the right hand and vice versa, the results demonstrate that the two hemispheres specialize in [hearing] different kinds of sounds—the left hemisphere likes rapidly changing sounds, such as consonants, and the right hemisphere likes slowly changing sounds, such as syllables or intonation,” explains Turkeltaub. “These results also demonstrate the interaction between motor skills and perception. Imagine that you’re waving an American flag while listening to one of the presidential candidates. The speech will actually sound slightly different to you depending on whether that flag is in your left or right hand.”

As of now, only right-handed people have taken part in the research.    

Other Research

While the human brain remains among the last medical frontiers, researchers and clinicians are growing ever closer to understanding how its complex systems work.
For instance, brain mapping, which identifies the areas of the brain that become activated during various tasks, has been ongoing since the 1990s.

In 2001, a University of Alabama study used magnetic resonance imaging (MRI) to map the areas of the brain that became engaged when participants listened to real words versus pseudo-words, or nonsense. In that study, listening to pseudo-words produced significantly more brain activity.

More recently, a 2012 study observed brain function in monkeys undergoing MRI scans. That research revealed distinct differences in right and left brain dominance depending on whether the animals heard species-specific vocalizations, human speech, scrambled versions of both, or complete silence.

The Takeaway

Although Turkeltaub notes that the effects observed in the 2012 Georgetown University study were small, they could very well lead to significant improvements for patients struggling to understand language after a stroke. Other patients challenged by speech problems, such as children with dyslexia, may also eventually benefit from the findings.

“The effects we observed on perception here were relatively small,” explains Dr. Turkeltaub, “but even a small beneficial effect on each day of [speech] therapy might accumulate over time, amplifying the long-term benefit of that therapy in a clinically meaningful way.”
