A study has provided insight into how we perceive gender based on sounds.
Neuroscientists from Northwestern University have examined the brain's processing of such sensory information about another's gender.
"Researchers have long thought that one part of the brain does vision and another does auditory processing and that the two really don't communicate with each other," said Marcia Grabowecky, research assistant professor of psychology.
"But emerging research suggests that rich information from different senses comes together quickly, with each sense influencing the others, so that we don't experience the world one sense at a time," she added.
The study adds to the scarce scientific evidence supporting the idea that sounds can alter how masculine or feminine a person looks.
"Our vision can bias our experience of other senses, such as hearing," said lead author Eric Smith.
"We hear, for example, the ventriloquist's voice coming from the dummy. In this study we wanted to see if hearing could change our visual experience," Smith said.
To test whether a sound can influence perception of a face's gender, the researchers digitally morphed male and female faces to create androgynous faces not easily categorized as male or female. Study participants were asked to look at the faces while listening to brief auditory tones, which fell within the fundamental speaking frequency range of either male or female voices.
In the initial stage of auditory processing, sounds are decomposed into their basic frequency components: the lowest is called the fundamental frequency, and the higher ones are called harmonics.
In higher auditory brain areas, these frequencies are put back together to be coded as a human voice. The researchers took advantage of the fact that pure tones can be used to deliver individual frequency components that are registered in early auditory brain areas.
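The decomposition described above can be illustrated computationally. The sketch below (not code from the study) builds a voice-like signal as a fundamental plus harmonics, then recovers the fundamental with an FFT. The male and female fundamental ranges of roughly 85–180 Hz and 165–255 Hz are common textbook values, not parameters reported in the paper.

```python
import numpy as np

SAMPLE_RATE = 44100  # samples per second

def make_complex_tone(fundamental_hz, n_harmonics=4, duration_s=0.5):
    """Sum a fundamental and its harmonics, mimicking a voiced sound."""
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    signal = np.zeros_like(t)
    for k in range(1, n_harmonics + 1):
        # Each harmonic is an integer multiple of the fundamental,
        # with amplitude falling off as 1/k.
        signal += (1.0 / k) * np.sin(2 * np.pi * k * fundamental_hz * t)
    return signal

def estimate_fundamental(signal):
    """Return the lowest prominent frequency peak in the spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / SAMPLE_RATE)
    # Keep bins that rise above 10% of the strongest peak, then take
    # the lowest such frequency: that is the fundamental.
    prominent = freqs[spectrum > 0.1 * spectrum.max()]
    return prominent.min()

tone = make_complex_tone(120.0)           # 120 Hz: in the male range
print(round(estimate_fundamental(tone)))  # prints 120
```

A pure tone, by contrast, is a single sine component with no harmonics, which is why it can probe one frequency channel of early auditory processing in isolation.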
The findings showed that when an androgynous face was paired with a pure tone that fell within the female fundamental-frequency range, people were more likely to report that the ambiguous face was that of a female. But when the same face was paired with a pure tone in the male fundamental-frequency range, people were more likely to see a male face. (The bias did not occur when a face was paired with a pure tone that was too low or too high to be in the typical speaking range.)
"The strength of the study is that pure tones sound like beeps, and they primarily activate early stages of auditory processing," Grabowecky said.
"We think that the effect demonstrates a direct input from early auditory processing to visual perception," she said.
When people were forced to guess whether the tones were in the male range, the female range or outside of the typical speaking frequency range, their guesses were inaccurate, reflecting relative rather than absolute judgments of frequency.
"Such relativity is not surprising, because our auditory experience depends on relative rather than absolute frequencies; most useful and entertaining auditory information, such as speech and music, is carried by how sound frequencies change over time," Grabowecky said.
Absolute frequencies do not matter much, as we readily understand speech spoken by people with low and high voices and enjoy songs regardless of the keys in which they are played. In contrast, it is the "neglected" absolute-frequency information that influences visual perception of gender.
"A conscious impression of your voice is not what enhances your look of masculinity or femininity. Sounds seem to influence visual gender in a much more fundamental way, on the basis of their absolute frequencies processed in early auditory brain areas," said study co-author Satoru Suzuki.
The report, "Auditory-Visual Cross-Modal Integration in Perception of Face Gender," appears in a recent issue of Current Biology.