Developmental psychologists at the University of Miami (UM) and computer scientists at the University of California, San Diego (UC San Diego) are trying a new approach to better understand how human behavior develops.
They are studying mother-child interactions in order to build a robot capable of learning social skills.
Not only is this experiment pushing the boundaries of robotics, it will also help reveal some of the mysteries of human cognitive development. The first phase of the project is to closely study face-to-face interactions between mother and child, to learn how predictable early communication is and what babies need in order to act intentionally.
The researchers observed 13 mothers and their babies, aged between one and six months, during weekly five-minute play sessions videotaped in the lab over approximately 14 weeks.
The results showed that during their first six months of life, babies develop turn-taking skills, which are the first step to more complex human interactions.
Daniel Messinger, associate professor of Psychology in the UM College of Arts and Sciences and principal investigator of the study, explains that babies and mothers find a pattern in their play, one that becomes more stable and predictable as the babies age.
“As babies get older, they develop a pattern with their moms.
“When the baby smiles, the mom smiles; then the baby stops smiling and the mom stops smiling, and the babies learn to expect that someone will respond to them in a particular manner,” he says.
“Eventually the baby also learns to respond to the mom.”
The second phase of the project is to program a baby robot according to these findings, so that it will have basic social skills and be able to learn complex interactions.
The robot already exists: its name is Diego-San, and it stands 1.3 meters tall, modeled after a one-year-old child.
Diego-San was built by Kokoro Dreams and the Machine Perception Laboratory at UC San Diego. One of the things it will have to do is shift its gaze from people to objects, just as a real baby does while learning.
Messinger said that “one important finding here is that infants are most likely to shift their gaze if they are the last ones to do so during the interaction.
“What matters most is how long a baby looks at something, not what they are looking at.”
What is rather interesting is that the babies are, in effect, teaching the researchers how to program the robot, while the robot will give them insight into the development of human behavior and allow them to better understand babies.
Paul Ruvolo, a sixth-year graduate student in the Computer Science Department at UC San Diego and co-author of the study, said that “a unique aspect of this project is that we have state-of-the-art tools to study development on both the robotics and developmental psychology sides.
“On the robotics side, we have a robot that closely approximates the mechanical complexity of the human motor system, and on the developmental psychology side we have fine-grained motion capture and video recordings that show the mother-infant interaction in great detail.
“It is the interplay of these two methods for studying the process of development that has us so excited.”
This study is funded by the National Science Foundation, and its results are published in the current issue of the journal Neural Networks.