University of Memphis researchers found that the degree of attention individuals pay to incoming sounds can influence even the earliest stages of brain processing — including those that scientists once thought were completely automatic.
Their findings, published March 29 in NeuroImage, could one day help improve the listening accuracy of people who have difficulty distinguishing speech in noisy environments. The study adopted a unique approach by examining brain-stem and auditory-cortex activity simultaneously, in order to better understand how both regions change when someone is instructed to pay attention to sounds in the environment. Previous studies that focused solely on the brain stem ran the risk of inadvertently picking up activity from the auditory cortex and mistaking it for signals that originated in the brain stem itself.
Audiologist Caitlin Price, a doctoral fellow at the University of Memphis and corresponding author for the study, was inspired to explore the connections between attention and early brain processing while working with clinical patients who'd had trouble distinguishing people's voices in busy, loud settings.
"A lot of these individuals were withdrawing from social situations and activities that they really enjoyed, just because they didn't feel like they could contribute to the conversation," Price told The Academic Times. "You can fit someone with hearing aids for amplification if they have hearing loss to make it sound audible, but that doesn't always help with the background noise."
The 20 volunteers who participated in the study — young adults with no psychiatric conditions — were asked to listen to three tones under varying conditions. An active-listening task required the participants to listen carefully to two frequent tones and press a button upon hearing a third, infrequent tone. In some trials, the tones were accompanied by background babble that sounded similar to the din of a busy restaurant.
Meanwhile, a passive-listening task required participants to ignore the incoming tones while watching a silent, captioned movie. The scientists then compared how the brain stem responded to each task. They found that people's brain stems displayed more robust activity when they'd been asked to concentrate on the incoming tones, even when those tones were accompanied by background noise.
The 20 participants were required to have fewer than three years of musical experience, since formal musical training can have a profound effect on the way the brain filters sounds in the environment.
It's logical that the brain's higher-level cognitive structures would be more active when listeners are asked to concentrate on a particular sound, Price noted. But some previous research had suggested that the brain stem — one of the first points of contact between incoming stimuli and the brain — would respond in the same way regardless of whether a person was concentrating on the tones or preoccupied with another activity. It was thought, in other words, that the early stages of neural processing were entirely automatic. After all, our brain stems can show activity even while we're sleeping or sedated.
The brain can provide feedback to the auditory system to help filter out unnecessary information and instead focus on relevant stimuli. Scientists often refer to this phenomenon as the "cocktail party" effect, named for the way that many people are able to focus on one conversation, even in a hectic, noisy environment, such as a party. But people with neurodegenerative disorders, such as Alzheimer's disease, may have difficulty separating incoming sounds.
"We know that attention is really important for speech and noise perception because that's what helps us distinguish and prioritize what we're interested in hearing while suppressing or ignoring that competing background noise," Price said.
Language processing requires two neural systems to work hand in hand: the perceptual parts of the brain that take in and process incoming signals, and the cognitive parts that interpret and make sense of that information. The study revealed that concentrating on particular tones increased the connectivity between the brain stem and the primary auditory cortex in both directions.
In the future, the researchers think that people with auditory impairments may be able to overcome some hearing challenges by engaging in listening activities that train areas of the brain to be more sensitive to incoming stimuli. Price has observed that some older patients' brains automatically compensate for certain deficits by recruiting regions that are not usually involved in auditory processing.
"We're seeing neural differences and changes in how these regions are talking with each other. And so, that might give us some clues," Price said. "You know, the brain's already doing this, so let's capitalize on that and provide some more training to kind of boost and strengthen [those regions] if that's what they need to overcome these challenges due to hearing loss."
The study, "Attention reinforces human corticofugal system to aid speech perception in noise," published March 29 in NeuroImage, was authored by Caitlin Price and Gavin Bidelman, both of the University of Memphis.