A new UCLA psychology study provides insights into how the brain combines sound and vision. The research suggests that no single mechanism in the brain governs how much our senses work together to process information.
Among the implications of the study: It might not be as easy as many people had assumed to categorize the way in which we perceive and learn.
“We should be cautious not to make blanket statements about how we process information, like ‘I’m a visual learner,’” said Ladan Shams, an associate professor of psychology in the UCLA College and senior author of the research. “That’s not necessarily true across the board. For example, your brain may combine sights and sounds a lot in one task — watching TV, for example — but only a little in another task — such as playing the piano.”
The researchers found that people’s vision frequently influenced their hearing when they tried to identify the specific location of sounds and flashes of light, and that their hearing influenced vision when they counted the sounds and flashes.
In one part of the study, 59 participants, mostly UCLA undergraduates, were seated in front of a computer monitor with speakers on either side and asked to count the number of flashes of light on the screen and beeps played through the speakers. Sometimes they saw only flashes, sometimes they heard only beeps, and sometimes they both saw flashes and heard beeps — in which case the numbers could differ, with up to four of each. The researchers presented 360 combinations of beeps and flashes in a one-hour period.
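The trial design described above can be sketched in code. This is a hypothetical reconstruction, not the study's actual software: the article specifies only "up to four of each" and 360 presentations per hour, so the condition set and function names below are assumptions.

```python
# Hypothetical sketch of the flash/beep counting task's trial conditions.
# Assumed: unimodal trials (flashes only or beeps only) plus every
# pairing of 1-4 flashes with 1-4 beeps. Not the study's actual code.
import itertools
import random

def build_trials(max_count=4):
    """Build flash-only, beep-only, and combined trial conditions."""
    trials = []
    for n in range(1, max_count + 1):
        trials.append({"flashes": n, "beeps": 0})   # visual-only trial
        trials.append({"flashes": 0, "beeps": n})   # auditory-only trial
    # Combined trials: every pairing of counts, so flashes and beeps
    # can agree (2 and 2) or conflict (1 flash with 2 beeps).
    for f, b in itertools.product(range(1, max_count + 1), repeat=2):
        trials.append({"flashes": f, "beeps": b})
    return trials

trials = build_trials()
random.shuffle(trials)       # present conditions in random order
print(len(trials))           # 8 unimodal + 16 combined = 24 conditions
```

Repeating each of these 24 conditions 15 times would yield the 360 presentations the article reports, though the actual repetition scheme is not stated.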
“When people have to process different numbers of beeps and flashes, it’s really hard — the senses blend together,” said Brian Odegaard, a UCLA postdoctoral scholar and the study’s lead author. “Most people, when presented with two beeps and one flash, mistakenly said they saw two flashes, while a few participants could accurately tease apart the lights and sounds.”
Shams explained that m