A new study has found that even though babies are too young to talk, they can understand words and process them in an adult-like way.
Scientists at the University of California, San Diego, showed that babies process words they hear with the same brain structures as adults, and in the same amount of time.
And they don't just process the words as sounds, they grasp their meanings too.
To conduct the study, the team used MEG - an imaging technique that measures the tiny magnetic fields emitted by neurons in the brain - and MRI to noninvasively estimate brain activity in 12- to 18-month-old infants.
In the first experiment, the infants listened to words accompanied by sounds with similar acoustic properties, but no meaning, in order to determine if they were capable of distinguishing between the two. In the second phase, the researchers tested whether the babies were capable of understanding the meaning of these words.
For this experiment, babies saw pictures of familiar objects and then heard words that were either matched or mismatched to the name of the object: a picture of a ball followed by the spoken word "ball," versus a picture of a ball followed by the spoken word "dog."
The scans indicated that the infants detected the mismatch between a word and a picture, as reflected in the amplitude of their brain activity.
The "mismatched," or incongruous, words evoked a characteristic brain response located in the same left frontotemporal areas known to process word meaning in the adult brain.
The same tests, repeated in adults, produced the same results.
"Our study shows that the neural machinery used by adults to understand words is already functional when words are first being learned," said Eric Halgren.
"This basic process seems to embody the process whereby words are understood, as well as the context for learning new words."
The researchers say their results have implications for future work - for example, the development of brain-imaging-based diagnostic tests that could indicate whether a baby has healthy word understanding even before speaking, enabling early screening for language disabilities or autism.
The work is published this week in the Oxford University Press journal Cerebral Cortex.