Complex speech is perhaps the main trait that sets us apart from apes. Chimps, for example, have roughly the same mental addition ability as humans, but speech is what allows humans to count and perform advanced calculations.
Human speech has been linked to two language-controlling brain areas (handling articulation, data storage, and the integration of grammar rules), located in the left hemisphere of the cortex. What we want to say is initiated in a region of the left cortex called Wernicke's area. This communicates with Broca's area, which is involved in grammatical rules. Impulses travel from these areas to the muscles involved in speech. These zones are connected with the visual system (so we can read) and the auditory system (so we can hear what others say, understand, and answer), and they also maintain a memory bank for recalling useful phrases.
The left hemisphere is also the center of analytical thinking, linked to logical abilities. Now, a study published in the journal "Neuron" has identified the brain areas where speech sounds acquire their abstract meaning, rather than being perceived as a mere stream of sounds. This discovery shows that human speech comprehension goes beyond the lower-level detection of speech sounds and requires specialized sensory areas.
To locate these areas, the team had subjects listen to a series of simple speech sounds while watching a video of people pronouncing them. The speech sounds could either match the video or not.
By presenting various combinations of speech sounds and videos, and using fMRI imaging, the team identified the brain areas involved in the abstract processing of speech sounds, not just their sensory perception: two regions in the left hemisphere's (no surprise here) speech-related area, named the pars opercularis and the planum polare.
"We have shown that there are neurophysiological substrates that code properties of an audiovisual utterance at a level of abstraction that corresponds to the speech category that is 'heard,' which can be independent of its sensory properties. We set out from the observation that there is no need to posit the existence of abstract coding to explain emergent features of audiovisual speech, because these features may just be the result of joint activity in lower-level unisensory regions. Yet, our results indicate that neural activity in left-hemisphere regions does indeed track the experienced speech percept, independent of its sensory properties," the authors wrote.