sciencentral news : making sense of science
April 7, 2013

Household Robots


There are dancing robots and robot dogs, but many of us are still waiting for a robot that can do chores. As this ScienCentral News video explains, a new breed of robots may one day lead to a robot in every home.

Mr. Roboto

We live in a world surrounded by robots. While many of the robots today don't look like the ones in the movies "Transformers" or "I, Robot," there's artificial intelligence in our automobiles, toys, and even our microwaves. But the days of humanoid robotic assistants helping us do chores, like Rosie in "The Jetsons," may not be too far away.

Domo uses artificial intelligence to grasp the bag of coffee and locate the shelf.
In a robotics lab at the Massachusetts Institute of Technology (MIT), the torso of a robot grabs a box and holds it out to a roboticist as he cleans up the lab. We may consider it mindless work, but Aaron Edsinger says it takes intelligence for this robot named Domo to lend a helping hand with household chores.

"Our big goal is to have the robot adapt to the world instead of having the world adapt to the robot," he says. This is key, he says, because robots without artificial intelligence can currently perform very complex tasks, like assembling an automobile; but they must be taught beforehand exactly what to do. "A lot of the really advanced robotics that you see particularly coming out of Japan right now, these robots are very pre-scripted," says Edsinger. "The actions they're going to take are sort of figured out beforehand. You hit play, and it sort of does the same thing over and over again."

Edsinger has been getting Domo to work in domestic settings, exposing the robot to objects it hasn't seen before.

"Adaptivity is going to be critical as soon as we want robots to come out of the car factory and into our homes, into our daily lives, because we can't program it with everything it needs to know," says Edsinger. For example, he says, household robots must be able to distinguish the countless objects within our home. "A car factory can be very well understood and predicted ahead of time. Your kitchen and all your dishes in the kitchen sink are much harder for a robot to understand," he says.

Roboticist Aaron Edsinger through Domo's eyes
A key part of having a robot in the home is ensuring that it's safe. Edsinger designed Domo -- named after the lyric "Domo Arigato, Mister Roboto" from the popular '80s rock song by Styx -- to interact with people, giving it big friendly eyes and sensors that respond to human touch. It even says "ouch" if you bend its arm the wrong way. While there is a debate in the robotics community, Edsinger says, over whether robots should look like humans or like ordinary appliances, this "human connection" is key for his robot. "By having the eye contact, it's very intuitive for us to understand what the robot is doing," he says. "When it reaches to me with an open hand, I understand that gesture very intuitively as a request to be handed something. So having a human body is very important for that."

Domo's human-like body also allows it to work well in our settings, filled with objects that are designed for human hands.

But object manipulation has typically been a hard problem in robotics. "You can pick something up, pass it between your hands. You talk on your cell phone, and it's a very manual task ... for a robot to do that, it's really, really hard," he says.

But Domo's software allows it to learn for itself. When putting away groceries, Domo finds the best place to grasp a bag of coffee before placing it on the shelf. "It'll wiggle it around and watch the object while it's moving it ... It learns the length, the size of the object, and the orientation in the hands, so it knows how to adjust the orientation of the object so it can pass it between its hands," he says.
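The geometry behind that trick can be sketched in a few lines. This is a hypothetical toy version, not Domo's actual software: if the wrist rotates by a known angle while the robot tracks the object's tip, the tip sweeps a chord whose length reveals how far the object extends from the hand.

```python
import math

def estimate_length(tip_before, tip_after, wrist_rotation_rad):
    """Estimate how far a rigid object extends from the wrist.

    When the wrist rotates by angle theta, the tracked tip sweeps a
    chord of length 2 * L * sin(theta / 2), so the object's reach is
    L = chord / (2 * sin(theta / 2)).
    """
    dx = tip_after[0] - tip_before[0]
    dy = tip_after[1] - tip_before[1]
    chord = math.hypot(dx, dy)
    return chord / (2 * math.sin(wrist_rotation_rad / 2))

# A tip held 0.3 m from the wrist, after a 30-degree wrist rotation:
theta = math.radians(30)
before = (0.3, 0.0)
after = (0.3 * math.cos(theta), 0.3 * math.sin(theta))
print(round(estimate_length(before, after, theta), 3))  # -> 0.3
```

In practice the robot would track the tip visually over many small wiggles and average the estimates, but the underlying chord-and-angle relation is the same.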

Robot Fetch

In this demonstration, the STanford Artificial Intelligence Robot (STAIR) grabs a stapler after being asked to retrieve it.
image: Andrew Ng, Stanford University
At Stanford University, a robot called "STAIR," which stands for the STanford Artificial Intelligence Robot, is also trying to tackle object manipulation. "If you put a coffee mug in front of a robot, the robot doesn't know how to pick it up; it doesn't have a 3-D model," says Stanford roboticist Andrew Ng. "We took a small set of objects, and we showed the robot how to pick up these objects. Things like a book, cup, and coffee mug. Then, a machine-learning algorithm takes what we showed it and learns what visual properties of an object determine a good place to pick it up."

According to Ng, after the team taught STAIR how to pick up six different items, the robot was 88 percent reliable at picking up unknown objects. Ng says the robot quickly learned, for example, that the semicircular shape of a handle is a particularly good place to grasp.
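The approach Ng describes is supervised learning over image features. The sketch below is an illustrative stand-in for STAIR's system, with made-up feature names and synthetic data: a small logistic-regression classifier is trained on human-labeled patches, then scores candidate patches on a new object and grasps the highest-scoring one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the labeled training set: each image patch is
# reduced to a feature vector (here: edge density, curvature, width),
# labeled 1 if a human marked it as a good grasp point, such as the
# semicircle of a mug handle.
def make_patches(n, good):
    base = np.array([0.8, 0.9, 0.2]) if good else np.array([0.2, 0.1, 0.7])
    return base + 0.1 * rng.standard_normal((n, 3))

X = np.vstack([make_patches(50, True), make_patches(50, False)])
y = np.array([1.0] * 50 + [0.0] * 50)

# Train a tiny logistic-regression classifier by gradient descent.
w, b = np.zeros(3), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = p - y
    w -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean()

# On a new, unseen object, score every candidate patch and pick the
# best one as the grasp point; index 5 is the handle-like patch.
candidates = np.vstack([make_patches(5, False), make_patches(1, True)])
scores = 1.0 / (1.0 + np.exp(-(candidates @ w + b)))
print(int(scores.argmax()))  # -> 5
```

Because the classifier learns from features rather than from object identities, it can generalize to objects it has never seen, which is what lets STAIR hit high reliability on unknown items after training on only six.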

The Stanford team recently demonstrated how STAIR could be asked to retrieve a stapler from one room and deliver it to a person in another. This pulls in not only manipulation research but also voice recognition, navigation, and other previously separate fields. "For the STAIR project, it was time to revisit bringing these artificial intelligence tools back together, taking tools from different threads of AI for a general-purpose robot," Ng says.

A Robotic Future

Both robot designers say these robots aren't just for tedious chores around the house; they could also help people like the elderly stay self-sufficient. "So doing a lot of chores around the house can enable them to be self-sufficient for a longer period of time," Edsinger says. "And I think that would be a huge win not just for the people that get to live at home longer, but also for the healthcare industry in general."

Outside of the home, this artificial intelligence technology could help in manual labor. "One thing would be in agriculture ... picking fruit that's currently picked by hand," Edsinger says. "One would be to actually help people in a warehouse lift up heavy boxes. So if things that currently require a lot of physical, manual labor, hopefully we can assist people with these types of things in the future with the technology that's behind Domo."

Both researchers stress that these robots are still works in progress. Andrew Ng's team will continue working on having STAIR do various tasks, like emptying a dishwasher or cooking simple meals in a kitchen. For the first few meals, Ng says, "We'll have the robot heat up frozen meals. That's a start." Edsinger is founding a new company, Meka Robotics, to find commercial applications for the core technologies behind Domo.

Aaron Edsinger's research was published in the "IEEE Robotics & Automation Magazine," 2007 and the "Proceedings of the IEEE/RSJ International Conference on Humanoid Robotics," 2006, and funded by NASA and Toyota Motor Corporation.

Andrew Ng's research is in press with the "Proceedings of the Twentieth International Joint Conference on Artificial Intelligence," and "Neural Information Processing Systems," 2007, and funded by the National Science Foundation, DARPA, Powerset, Ricoh Innovations, Google, and Intel.

by Victor Limjoco

ScienCentral News is a production of ScienCentral, Inc. in collaboration with The Center for Science and the Media 248 West 35th St., 17th Fl., NY, NY 10001 USA (212) 244-9577. The contents of these WWW sites © ScienCentral, 2000-2013. All rights reserved. This material is based on work supported by the National Science Foundation under Grant No. ESI-0515449 The views expressed in this website are not necessarily those of The National Science Foundation or any of our other sponsors. Image Credits National Science Foundation