Robots With Social Skills May Result from a Child-mother Interactions Study

by Tanya Thomas on October 31, 2010 at 9:58 AM | News on IT in Healthcare
Researchers are studying infant-mother interactions and working to implement their findings in a baby robot capable of learning social skills.

The first phase of the project by University of Miami (UM) developmental psychologists and computer scientists from the University of California, San Diego (UC San Diego) was to study face-to-face interactions between mother and child, to learn how predictable early communication is and to understand what babies need in order to act intentionally.

The scientists observed 13 mother-infant pairs in weekly five-minute play sessions while the babies were between 1 and 6 months of age, with approximately 14 sessions per dyad. The laboratory sessions were videotaped, and the researchers applied an interdisciplinary approach to understanding the behavior.

The researchers found that in the first six months of life, babies develop turn-taking skills, the first step toward more complex human interactions. According to the study, babies and mothers find a pattern in their play, and that pattern becomes more stable and predictable with age, explains Daniel Messinger, associate professor of Psychology in the UM College of Arts and Sciences and principal investigator of the study.

"When the baby smiles, the mom smiles; then the baby stops smiling and the mom stops smiling, and the babies learn to expect that someone will respond to them in a particular manner," he says.

"Eventually the baby also learns to respond to the mom."

The next phase of the project is to use the findings to program a baby robot, with basic social skills and with the ability to learn more complicated interactions. The robot's name is Diego-San. He is 1.3 meters tall and modeled after a 1-year-old child. The construction of the robot was a joint venture between Kokoro Dreams and the Machine Perception Laboratory at UC San Diego.

The robot will need to shift its gaze from people to objects based on the same principles babies seem to use as they play and develop.

"One important finding here is that infants are most likely to shift their gaze, if they are the last ones to do so during the interaction," says Messinger.

The findings are published in the current issue of the journal Neural Networks in a study titled "Applying machine learning to infant interaction: The development is in the details."

Source: ANI