The finding could help answer long-standing language mysteries

Dec 23, 2009 19:01 GMT  ·  By
One of the sensor-laden palates used in the study, which allows the position of the tongue to be measured about 100 times per second

English researchers in London have determined in a new study that our motor systems, which govern movement, activate automatically when we hear speech. They say the new data could help experts understand and treat a wide range of speech difficulties that plague children and adults alike, PhysOrg reports. Details of the work appear in this week's issue of the respected journal Proceedings of the National Academy of Sciences (PNAS), in a paper called “Activation of Articulatory Information in Speech Perception.”

The team, which is based at Royal Holloway, University of London, says that our motor systems are “recruited” every time we hear speech, whether we want them to be or not, and regardless of whether we pay attention to what we hear. Professor Kathleen Rastle of the RHUL Department of Psychology led the new investigation, which relied on custom-made acrylic palates. The devices were fitted to willing participants and measured the points of contact between the tongue and the roof of the mouth.

The instruments sampled these contacts about 100 times per second, so the researchers are confident that the data they gathered are accurate and relevant. The goal was to track the position of the tongue throughout speech. Participants were asked to read aloud syllables such as “koob” while also listening to distracting spoken syllables, such as “toob.”

“Our key question was whether there would be evidence that participants had constructed programs for the movements involved in speaking from the spoken distractor syllables. We hypothesized that if motor systems are recruited when we listen to speech, then the way that target syllables were produced would be influenced by the characteristics of the spoken distractors,” Rastle explains. The results showed that the position of the tongue inside the mouth differed depending on which distractor syllables participants heard as they read.

“These findings provide the first evidence that when we hear speech, we activate the movements involved in speaking in an automatic and involuntary manner. Research must now focus on precisely how the motor system impacts on speech perception and on why the motor system is recruited when we listen to speech,” the team leader concludes.