Their brains process sound differently

Apr 16, 2008 08:32 GMT

You don't have to wait until a child starts learning to speak. A team led by April Benasich, professor of neuroscience at the Infancy Studies Laboratory at Rutgers University in Newark, has found that the way the brains of 3-month-old infants differentiate sounds can signal future language issues.

The methods developed by the team can assess, as early as 3 to 6 months of age, whether a baby is likely to face language problems. A key factor is the developing brain's ability to process and distinguish acoustic differences arriving in rapid succession; for example, infants must tell apart similar sounds, like "ma" and "na", within intervals of just 40 milliseconds.

During the first months of life, the brain builds an acoustic map of the sounds of the mother tongue, a step crucial to acquiring language. In some infants, however, this process is impaired.

"About 5 to 10% of all children beginning school are estimated to have language-learning impairments (LLI) leading to reading, speaking and comprehension problems. In families with a history of LLI, 40 to 50% of children are likely to have a similar problem. Many of these children go on to develop dyslexia," said Benasich.

Dense-sensor-array EEG/ERP recordings allowed the team to investigate EEG activity, event-related potentials (ERPs) and the proportion of gamma waves in infant brains. A soft bonnet of sensors was placed on each baby's head while the infant listened to various series of rapid tone sequences.
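
As a rough illustration of how ERPs are generally derived from such recordings (this is a minimal sketch of the standard averaging idea, not the lab's actual analysis pipeline; the channel count, sampling rate, epoch window and variable names are all assumptions for the example):

    import numpy as np

    # Assumed example values: a 62-channel recording sampled at 250 Hz,
    # with epochs spanning -100 ms to +500 ms around each tone onset.
    sfreq = 250
    epoch_window = (-0.1, 0.5)  # seconds relative to tone onset

    def extract_erp(eeg, tone_onsets):
        """Average EEG epochs time-locked to tone onsets to obtain an ERP.

        eeg         : array of shape (n_channels, n_samples), continuous recording
        tone_onsets : iterable of sample indices at which tones were presented
        """
        start = int(epoch_window[0] * sfreq)  # negative offset (pre-stimulus)
        stop = int(epoch_window[1] * sfreq)   # positive offset (post-stimulus)
        epochs = []
        for onset in tone_onsets:
            segment = eeg[:, onset + start : onset + stop]
            # Baseline-correct each epoch using the pre-stimulus interval.
            baseline = segment[:, :-start].mean(axis=1, keepdims=True)
            epochs.append(segment - baseline)
        # The ERP is the mean across epochs; activity not time-locked to the
        # tones tends to average out, leaving the stimulus-evoked response.
        return np.mean(epochs, axis=0)

Averaging many such epochs is what makes the small stimulus-evoked response visible against the much larger ongoing EEG background.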

"We are finding that children who have difficulty processing rapid auditory input are not just showing a simple maturational lag, but are actually processing incoming acoustic information differently," said Benasich.

These children used different brain areas than typically developing children for the same task: the researchers detected less left-hemisphere activity in children who had trouble with rapid auditory processing. Knowing precisely how brain activity differs in response to acoustic input could enable specialists to guide the brains of babies with language issues toward more efficient processing, even before the age at which children normally start to speak.

"We can predict with about 90% accuracy what a baby's language capabilities will be just by their response to tones," said Benasich.

The team has even developed a magnetic resonance imaging (MRI) protocol for scanning naturally sleeping healthy babies, to detect those prone to language issues.

Children of that age cannot lie still in a scanner for long, so the team performed the scans in the evening and had the parents repeat their child's normal bedtime routine: reading a story, nursing, rocking and snuggling. The sleeping child wore headphones playing a steady stream of lullabies while the sensor bonnet recorded brain activity.

"Our goal is not only to develop training techniques to correct rapid auditory processing problems, but to identify the period during infant development when the brain is most "plastic," or most able to change through learning," said Benasich.