ROBO SPACE
UW roboticists learn to teach robots from babies
by Staff Writers
Seattle WA (SPX) Dec 08, 2015


A collaboration between UW developmental psychologists and computer scientists aims to enable robots to learn in the same way that children naturally do. The team used research on how babies follow an adult's gaze to 'teach' a robot to perform the same task. Image courtesy University of Washington.

Babies learn about the world by exploring how their bodies move in space, by grabbing toys and pushing things off tables, and by watching and imitating what adults do. But when roboticists want to teach a robot how to do a task, they typically either write code or physically move the robot's arm or body to show it how to perform an action.

Now a collaboration between University of Washington developmental psychologists and computer scientists has demonstrated that robots can "learn" much like kids - by amassing data through exploration, watching a human perform a task and determining how best to carry out that task on its own.

"You can look at this as a first step in building robots that can learn from humans in the same way that infants learn from humans," said senior author Rajesh Rao, a UW professor of computer science and engineering.

"If you want people who don't know anything about computer programming to be able to teach a robot, the way to do it is through demonstration - showing the robot how to clean your dishes, fold your clothes, or do household chores. But to achieve that goal, you need the robot to be able to understand those actions and perform them on their own."

The research, which combines child development findings from the UW's Institute for Learning and Brain Sciences (I-LABS) with machine learning approaches, was published in November in the journal PLOS ONE.

In the paper, the UW team developed a new probabilistic model aimed at solving a fundamental challenge in robotics: building robots that can learn new skills by watching people and imitating them.

The roboticists collaborated with UW psychology professor and I-LABS co-director Andrew Meltzoff, whose seminal research has shown that children as young as 18 months can infer the goal of an adult's actions and develop alternate ways of reaching that goal themselves.

In one example, infants saw an adult try to pull apart a barbell-shaped toy, but the adult failed to achieve that goal because the toy was stuck together and his hands slipped off the ends. The infants watched carefully and then decided to use alternate methods - they wrapped their tiny fingers all the way around the ends and yanked especially hard - duplicating what the adult intended to do.

Children acquire intention-reading skills, in part, through self-exploration that helps them learn the laws of physics and how their own actions influence objects, eventually allowing them to amass enough knowledge to learn from others and to interpret their intentions. Meltzoff thinks that one of the reasons babies learn so quickly is that they are so playful.

"Babies engage in what looks like mindless play, but this enables future learning. It's a baby's secret sauce for innovation," Meltzoff said. "If they're trying to figure out how to work a new toy, they're actually using knowledge they gained by playing with other toys. During play they're learning a mental model of how their actions cause changes in the world. And once you have that model you can begin to solve novel problems and start to predict someone else's intentions."

Rao's team used that research on babies to develop machine learning algorithms that let a robot explore how its own actions lead to different outcomes. The robot then uses the resulting probabilistic model to infer what a human wants it to do and to complete the task, and even to "ask" for help if it is not certain it can succeed on its own.
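The paper's full model is more sophisticated, but the underlying idea, learning action-outcome probabilities through self-exploration and then inferring the most likely goal behind a demonstration, can be sketched in a few lines of Python. Everything below (the ToyForwardModel class, the toy "world", the action names and probabilities) is an illustrative assumption, not the authors' code:

from collections import defaultdict
import random

class ToyForwardModel:
    def __init__(self):
        # counts[action][outcome] accumulated during self-exploration
        self.counts = defaultdict(lambda: defaultdict(int))

    def explore(self, world, actions, trials=1000):
        # Try random actions on the world and record the outcome each produces.
        for _ in range(trials):
            action = random.choice(actions)
            outcome = world(action)
            self.counts[action][outcome] += 1

    def p_outcome_given_action(self, outcome, action):
        total = sum(self.counts[action].values())
        return self.counts[action][outcome] / total if total else 0.0

    def infer_goal(self, observed_outcome, actions):
        # Score each candidate goal by how likely that action is to produce the
        # outcome the demonstrator achieved; a low score could trigger a
        # request for help.
        scores = {a: self.p_outcome_given_action(observed_outcome, a) for a in actions}
        best = max(scores, key=scores.get)
        return best, scores[best]

# Hypothetical usage: a toy world where "push" usually slides an object and
# "grasp" usually lifts it, then inferring the intent behind a "slid" outcome.
world = lambda a: {"push": "slid", "grasp": "lifted"}[a] if random.random() < 0.9 else "nothing"
model = ToyForwardModel()
model.explore(world, ["push", "grasp"])
print(model.infer_goal("slid", ["push", "grasp"]))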

The team tested its robotic model in two different scenarios: a computer simulation experiment in which a robot learns to follow a human's gaze, and another experiment in which an actual robot learns to imitate human actions involving moving toy food objects to different areas on a tabletop.

In the gaze experiment, the robot learns a model of its own head movements and assumes that the human's head is governed by the same rules. The robot tracks the beginning and ending points of a human's head movements as the human looks across the room and uses that information to figure out where the person is looking. The robot then uses its learned model of head movements to fixate on the same location as the human.

The team also recreated one of Meltzoff's tests that showed infants who had experience with visual barriers and blindfolds weren't interested in looking where a blindfolded adult was looking, because they understood the person couldn't actually see. Once the team enabled the robot to "learn" what the consequences of being blindfolded were, it no longer followed the human's head movement to look at the same spot.

"Babies use their own self-experience to interpret the behavior of others - and so did our robot," said Meltzoff.

In the second experiment, the team allowed a robot to experiment with pushing or picking up different objects and moving them around a tabletop. The robot then used the model it had learned from that exploration to imitate a human who moved objects around or cleared everything off the tabletop. Rather than rigidly mimicking the human's actions each time, the robot sometimes used different means to achieve the same ends.

"If the human pushes an object to a new location, it may be easier and more reliable for a robot with a gripper to pick it up to move it there rather than push it," said lead author Michael Jae-Yoon Chung, a UW doctoral student in computer science and engineering. "But that requires knowing what the goal is, which is a hard problem in robotics and which our paper tries to address."

Though the initial experiments involved learning how to infer goals and imitate simple behaviors, the team plans to explore how such a model can help robots learn more complicated tasks.

"Babies learn through their own play and by watching others," says Meltzoff, "and they are the best learners on the planet - why not design robots that learn as effortlessly as a child?"


Related Links
University of Washington