According to a new Canadian study, a similar neural code in the human brain governs how people perceive both sounds and images.
Researchers from the Universite de Montreal and the Montreal Neurological Institute at McGill University have explained how the same neural code in the brain allows people to distinguish between different types of sounds, such as speech and music, or different images.
For the study, the researchers recruited participants to undergo functional magnetic resonance imaging (fMRI), a non-invasive form of brain mapping, to determine how the brain recognizes different characteristics in musical instruments, words from conversations or environmental sounds.
Subjects underwent an exhaustive three hours of fMRI scans to provide precise information about how the brain reacts when different sounds are played.
"It turns out that the brain uses the same strategy to encode sounds as it uses to encode different images. This may make it easier for people to combine sounds and images that belong to the same object, such as the dribbling of a basketball," explained lead author Marc Schonwiesner, a Universite de Montreal psychology professor.
The next step for the researchers is to determine exactly how the brain distinguishes between the drumbeats of rock and the strings of a symphony, or between a French conversation and an English one.
"Our goal is to disentangle exactly how the brain extracts these different types of sounds. This is a step that may eventually let us reconstruct a song a person has heard according to the activity pattern in their brain," explained Schonwiesner.
He also said that as scientists advance in decoding brain activation patterns, mind-boggling applications could be envisaged.
"If researchers can reconstruct a song a person has heard according to an fMRI reading, we're not far from being able to record brain patterns during sleep and reconstruct dreams. That would be really cool, although this possibility is decades of research away," he predicted.
The research has been published in the online edition of the Proceedings of the National Academy of Sciences (PNAS).