UCSD introduces Diego-san, a baby robot with ‘tude
January 10, 2013

“I’m what? A robot? … Cool! … Uh, maybe not.” Diego-san demonstrating different facial expressions, using 27 moving parts in the head alone (credit: UCSD)
UCSD has introduced Diego-san, a new humanoid robot that mimics the expressions of a one-year-old child.
Demonstrated at CES and in a video, the robot will be used in studies on sensory-motor and social development — how babies “learn” to control their bodies and to interact with other people.
Diego-san’s hardware was developed by two leading robot manufacturers: the head by Hanson Robotics and the body by Japan’s Kokoro Co. The project is led by University of California, San Diego full research scientist Javier Movellan.
Movellan directs the Institute for Neural Computation’s Machine Perception Laboratory, based in the UCSD division of the California Institute for Telecommunications and Information Technology (Calit2). The Diego-san project is also a joint collaboration with the Early Play and Development Laboratory of professor Dan Messinger at the University of Miami, and with professor Emo Todorov’s Movement Control Laboratory at the University of Washington.
Movellan and his colleagues are developing the software that allows Diego-san to learn to control his body and to learn to interact with people.

An earlier (apparently bummed-out) version of the Diego-san robot (credit: UCSD)
“We developed machine-learning methods to analyze face-to-face interaction between mothers and infants, to extract the underlying social controller used by infants, and to port it to Diego-san,” said Movellan. “We then analyzed the resulting interaction between Diego-san and adults.”
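To make the idea concrete, here is a minimal, purely illustrative sketch (not the UCSD code) of what "extracting a social controller and porting it to a robot" could look like in the simplest case: fit a one-dimensional model of how an infant's smile responds to an adult's smile, then reuse that model to choose the robot's expression. The function names, the data format, and the linear model are all assumptions for illustration.

```python
# Hypothetical sketch: learn a toy "social controller" from logged
# mother-infant interaction, then reuse it to drive a robot's smile.
# Assumes the log is a list of (adult_smile_t, infant_smile_t+1)
# pairs with intensities in [0, 1].

def fit_social_controller(pairs):
    """Fit infant_response = a * adult_smile + b by least squares."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    var = sum((x - mx) ** 2 for x, _ in pairs)
    a = cov / var
    b = my - a * mx
    return a, b

def robot_expression(controller, adult_smile):
    """'Port' the learned controller: map the adult's observed smile
    to a smile command, clamped to the actuator range [0, 1]."""
    a, b = controller
    return max(0.0, min(1.0, a * adult_smile + b))

# Toy interaction log: this infant roughly mirrors adult smiles.
log = [(0.0, 0.1), (0.2, 0.25), (0.5, 0.5), (0.8, 0.7), (1.0, 0.9)]
ctrl = fit_social_controller(log)
print(robot_expression(ctrl, 0.9))  # a smile command near 0.8
```

The real project works with far richer signals (full facial-expression and motion-capture data) and learned controllers, but the pipeline shape is the same: log interaction, fit a response model, and run that model on the robot.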
“This robotic baby boy was built with funding from the National Science Foundation and serves cognitive A.I. and human-robot interaction research,” wrote Hanson.
“With high-definition cameras in the eyes, Diego San sees people, gestures, and expressions, and uses A.I. modeled on human babies to learn from people, the way that a baby hypothetically would. The facial expressions are important to establish a relationship and to communicate intuitively with people.”
Social robots
Diego-san is the next step in the development of “emotionally relevant” robotics, building on Hanson’s previous work with the Machine Perception Lab, such as the emotionally responsive Albert Einstein head.
The robot is a product of the “Developing Social Robots” project launched in 2008. The goal of the project was “to make progress on computational problems that elude the most sophisticated computers and Artificial Intelligence approaches, but that infants solve seamlessly during their first year of life.”
For that reason, the robot’s sensors and actuators were built to approximate the levels of complexity of human infants, including actuators to replicate dynamics similar to those of human muscles. The technology should allow Diego-san to learn and autonomously develop sensory-motor and communicative skills typical of one-year-old infants.
“Its main goal is to try and understand the development of sensory motor intelligence from a computational point of view,” explained principal investigator Movellan. “It brings together researchers in developmental psychology, machine learning, neuroscience, computer vision and robotics. Basically we are trying to understand the computational problems that a baby’s brain faces when learning to move its own body and use it to interact with the physical and social worlds.”
The researchers are interested in studying Diego-san’s interaction with the physical world via reaching, grasping, etc., and with the social world through pointing, smiling and other gestures or facial expressions.
As outlined in the original proposal to the NSF, the project is “grounded in developmental research with human infants, using motion capture and computer vision technology to characterize the statistics of early physical and social interaction. An important goal is to foster the conceptual shifts needed to rigorously think, explore, and formalize intelligent architectures that learn and develop autonomously by interaction with the physical and social worlds.”
According to UCSD’s Movellan, the expression recognition technology his team developed for Diego-san has spawned a startup called Machine Perception Technologies (MPT). The company is currently looking for undergraduate interns and postgraduate programmers.
The project may also open new avenues to the computational study of infant development and potentially offer new clues for the understanding of developmental disorders such as autism and Williams syndrome.
This spring, Swiss researchers will demonstrate their nearly 4-foot-tall Roboy robot toddler (with a face selected via a Facebook contest).
Comments (13)
by Greg
Does it Pee and Poop?
Serious question. I think peeing and pooping (and the cleanup thereof) is an important factor in a baby’s social development. How does the parent handle the dirty diaper (not to mention the occasional ‘fountain’ of open peeing)?
Does it get a tummy-ache?
by a
incredible and also a bit scary…
by Bri
It would be interesting to see how the robot would relate to children, although it’s a little big for a toddler. By its size it appears more like a 6 to 8 year old. Its facial actions are a bit slow but appear to be more like a 2 to 3 year old.

I’m not quite sure why there is so much fascination with child robots. Their interactive relationships tend to be almost exclusively with family members. Adults tend to act silly or childish around young children. I remember being very mad at my mother for not talking like an adult and giving me watered-down or childish answers to my questions. I also hated children’s books. I wanted the real information. It’s almost a surreal time. I remember my parents bringing me to see Santa Claus. I was like, WTF. I remember his seedy suit and all the other parents and their kids all wrapped up in it, and I was thinking the whole thing was just a sham.

We infantilize children. The way adults act around children is contrived and forced. The real story is in how children relate to other kids. I remember being eight years old and observing how other kids would “try out” behaviors that they learned from significant others. I remember doing it consciously on my own. We see what works for ourselves in our social interactions. It would be interesting to observe these differences from the standpoint of this AI learning platform. In a nutshell, kids act natural, adults act goofy.
by brandon
Commence nightmares
by Editor
Yea, though I walk through the uncanny valley of bots I shall fear no evil…
by anthrobotic
Well, here we have what has got to be the worst-named and ugliest/creepiest anthropomimetic android robot yet. How is it that a project gets this far with NO ONE saying “Hey, hey guys – that name is stupid as hell, and why does the robot baby have to be so damn ugly?”
…aside from all that it’s an awesome project.
Not completely precedented by MIT’s Humanoid Robotics Projects at all. http://www.ai.mit.edu/projects/humanoid-robotics-group/
-Reno at Anthrobotic.com
by Cybernettr
With the unfinished body it does look a little creepy but I don’t see anything wrong with the face.
by Mike
The Turing test for realistic emotions has been aced!
by Editor
Yes, a realistic emotion is everything. If you can fake that, you’ve got it made. (To mangle the old joke.)
by Ian Clarke
I think that’s the most realistic robotic face I’ve yet seen. Great stuff!
by Gorden Russell
This is no joke, this is really happening. These robo-kids will grow up to take all of our jobs…and they’ll do it with such a charming smile.
by Bob Vasquez
At last: if our kids don’t behave, we will be able to replace them. Let them know.
by Tom B.
Poor kid needs some skin on his body.