Researchers are uncovering the neurological basis of speech motor control: the complex, coordinated activity of the brain regions that control our lips, jaw, tongue and larynx as we speak.
The discovery has potential implications for developing computer-brain interfaces for artificial speech communication and for the treatment of speech disorders.
It also sheds light on an ability that is unique to humans among living creatures, yet remains poorly understood.
Speech is complex because spoken words require the coordinated effort of numerous "articulators" in the vocal tract - the lips, tongue, jaw and larynx - and scientists have not understood how the movements of these distinct articulators are precisely coordinated in the brain.
To understand how speech articulation works, Edward Chang, MD, a neurosurgeon at UCSF, and his colleagues recorded electrical activity directly from the brains of three people undergoing brain surgery at UCSF, and used this information to determine the spatial organization of the "speech sensorimotor cortex," which controls the lips, tongue, jaw and larynx as a person speaks. This gave them a map of which parts of the brain control which parts of the vocal tract.
They then applied a powerful new method called "state-space" analysis to observe the complex spatial and temporal patterns of neural activity in the speech sensorimotor cortex that play out as someone speaks. This revealed a surprising sophistication in how the brain's speech sensorimotor cortex works.
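The core idea of a state-space analysis can be sketched briefly: treat the activity of many electrodes at each moment as a single point in a high-dimensional space, then project those points onto a few latent dimensions and watch the population "state" trace a trajectory over time. The sketch below is a minimal illustration using PCA on simulated data; the study's actual method is more elaborate, and every name and number here is a hypothetical stand-in, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

n_electrodes = 60   # hypothetical number of recording channels
n_timepoints = 200  # samples across one spoken syllable (illustrative)

# Simulate electrode activity driven by two latent oscillatory patterns,
# loosely echoing the cyclical structure the study reports.
t = np.linspace(0, 2 * np.pi, n_timepoints)
latents = np.stack([np.sin(t), np.cos(2 * t)])           # (2, T) hidden states
mixing = rng.normal(size=(n_electrodes, 2))              # electrode weights
activity = mixing @ latents + 0.1 * rng.normal(size=(n_electrodes, n_timepoints))

# PCA via SVD of the mean-centered data: the top rows of Vt give the
# low-dimensional trajectory of the population state through time.
centered = activity - activity.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
trajectory = Vt[:2]                                      # (2, T) state trajectory

explained = (S**2 / np.sum(S**2))[:2]
print(f"variance explained by first two components: {explained.sum():.2f}")
```

Because the simulated activity is dominated by two latent patterns, nearly all of its variance falls along the first two components, and the recovered trajectory forms the kind of cyclical loop that state-space methods are designed to expose.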
They found that this cortical area has a hierarchical and cyclical structure that exerts a split-second, symphony-like control over the tongue, jaw, larynx and lips.
"These properties may reflect cortical strategies to greatly simplify the complex coordination of articulators in fluent speech," said Kristofer Bouchard, PhD, a postdoctoral fellow in the Chang lab who was the first author on the paper.
The research was described this week in the journal Nature.