We all know that babies learn about the world around them from humans, but now scientists from the University of Washington have discovered that robots can learn in the same way, as demonstrated in their study published in the November edition of PLOS ONE. Although roboticists typically teach robots by writing code or physically moving their limbs to show them how to perform an action, the recent findings open up a whole new realm for "robot learning."

"You can look at this as a first step in building robots that can learn from humans in the same way that infants learn from humans," Rajesh Rao, senior author of the study, said in a press release.

The scientists showed that robots can learn both by gathering data through firsthand exploration and by observing humans perform tasks, then figuring out how to carry out those tasks on their own.

"If you want people who don't know anything about computer programming to be able to teach a robot, the way to do it is through demonstration -showing the robot how to clean your dishes, fold your clothes, or do household chores," said Rao. "But to achieve that goal, you need the robot to be able to understand those actions and perform them on their own."

Rao and his team used prior research on babies to create machine-learning algorithms that give robots the ability to examine and contrast their own actions and the outcomes they produce. Using this information, a robot can infer what it is supposed to do and even ask for help when it needs it.
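The study itself is described at the level of models rather than published code, but the basic loop is easy to picture. The toy Python sketch below is purely illustrative and not the authors' implementation: the action names, the simulated world, and the confidence threshold are all invented for the example. The robot first explores its own actions to learn which outcomes they produce, then, after seeing a demonstrated outcome, infers which of its actions best reproduces it, and signals that it needs help when its confidence is too low.

```python
import random

# Hypothetical sketch, not the authors' code: self-exploration followed by
# inference from a human demonstration, with a fallback to asking for help.

ACTIONS = ["push", "pull", "lift"]                     # invented toy action set
OUTCOMES = ["slides_away", "slides_closer", "rises"]   # invented toy outcomes

def explore(world, trials=200):
    """Self-exploration: count how often each action produces each outcome."""
    counts = {a: {o: 1 for o in OUTCOMES} for a in ACTIONS}  # Laplace smoothing
    for _ in range(trials):
        a = random.choice(ACTIONS)
        counts[a][world(a)] += 1
    # Normalize the counts into P(outcome | action).
    return {a: {o: c / sum(row.values()) for o, c in row.items()}
            for a, row in counts.items()}

def infer_action(model, observed_outcome, confidence_threshold=0.6):
    """Imitation: choose the action whose learned outcomes best explain the
    demonstration; return None (ask for help) if the robot is unsure."""
    likelihoods = {a: model[a][observed_outcome] for a in ACTIONS}
    total = sum(likelihoods.values())
    posterior = {a: l / total for a, l in likelihoods.items()}
    best = max(posterior, key=posterior.get)
    if posterior[best] < confidence_threshold:
        return None, posterior   # robot asks the human for help
    return best, posterior

def toy_world(action):
    """Noisy invented mapping from actions to outcomes."""
    mapping = {"push": "slides_away", "pull": "slides_closer", "lift": "rises"}
    return mapping[action] if random.random() > 0.1 else random.choice(OUTCOMES)

model = explore(toy_world)
action, belief = infer_action(model, "rises")
print(action or "robot asks for help", belief)
```

In this sketch the "ask for help" branch is simply a low-confidence check on the inferred action, which mirrors, in a very simplified way, the article's point that the robot can recognize when it does not know what to do.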

Although the current study focused on learning through inference and imitation, the team's next step is to use the same model to teach robots more complex tasks.

"Babies learn through their own play and by watching others," says Andrew Meltzoff, who collaborated with the researchers. "And they are the best learners on the planet - why not design robots that learn as effortlessly as a child?"