Robots Learning To Cook Like Rachael Ray

In our home, we have 5 kids, 1 dog and 2 robots. The robots mainly help us clean the floors. While my wife loves to cook, it can get a little demanding being a short-order cook for a variety of household diets. I can’t wait to tell her what the people at the University of Maryland have created: robotic chefs!

Researchers at the University of Maryland, funded by the Defense Advanced Research Projects Agency’s (DARPA) Mathematics of Sensing, Exploitation and Execution (MSEE) program, are teaching robots to process visual data and learn from what they see. According to DARPA, robots that were shown cooking videos (via YouTube!) were able to grab and manipulate the correct kitchen tools and use them to complete specific tasks with great accuracy. And this exercise required no additional programming from humans: the robots learned how to complete tasks, such as picking up a pitcher, and put that new knowledge into practice in the physical world.

“We are trying to create a technology so that robots eventually can interact with humans,” Cornelia Fermüller, an associate research scientist at the University of Maryland Institute for Advanced Computer Studies, said in a statement.

The robots use computer vision software to identify objects, tools and hand movements in the cooking videos. That information is broken down into smaller chunks of simple actions that, when combined, allow the robots to perform complex actions, said Yiannis Aloimonos, a professor of computer science and director of the university’s Computer Vision Lab. Aloimonos compared the process to that of creating a sentence: each chunk of information is a word, and the robots learn the appropriate grammar to organize that information.

“For example, if you want to cut a cucumber, then first you have to grasp knife. Then you have to bring the knife to the cucumber. Then you have to do the cutting movement. Then you have to look to make sure that you cut and see two pieces,” Aloimonos said.
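The cucumber sequence Aloimonos describes can be sketched as a tiny "action grammar": a high-level goal expands into a sentence of primitive action words. This is a minimal illustrative sketch only; the class and function names here are my own invention, not the actual Maryland/DARPA system.

```python
# Toy sketch of the action-grammar idea: a complex task is a
# "sentence" built from primitive action "words".
from dataclasses import dataclass

@dataclass
class Action:
    verb: str    # primitive action, e.g. "grasp"
    target: str  # object or tool the action applies to

def cut(tool: str, obj: str) -> list[Action]:
    """Expand the goal 'cut obj with tool' into primitives,
    mirroring the cucumber example in the quote above."""
    return [
        Action("grasp", tool),                      # grasp the knife
        Action("bring", f"{tool} to {obj}"),        # bring knife to cucumber
        Action("slice", obj),                       # do the cutting movement
        Action("verify", f"{obj} in two pieces"),   # look to confirm the cut
    ]

plan = cut("knife", "cucumber")
for step in plan:
    print(step.verb, step.target)
```

The payoff of the grammar view is reuse: the same `grasp` primitive learned for a knife can slot into sentences for other tools without reprogramming.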

The big question is: what is DARPA doing with RoboChefs in its arsenal? Reza Ghanadan, a program manager in DARPA’s Defense Sciences Office, said of the research, “The MSEE program initially focused on sensing, which involves perception and understanding of what’s happening in a visual scene, not simply recognizing and identifying objects. We’ve now taken the next step to execution, where a robot processes visual cues through a manipulation action-grammar module and translates them into actions.”

“This system allows robots to continuously build on previous learning — such as types of objects and grasps associated with them — which could have a huge impact on teaching and training,” explained Ghanadan. “Instead of the long and expensive process of programming code to teach robots to do tasks, this research opens the potential for robots to learn much faster, at much lower cost and, to the extent they are authorized to do so, share that knowledge with other robots. This learning-based approach is a significant step towards developing technologies that could have benefits in areas such as military repair and logistics.”

The researchers aren’t interested in just copying movements. They want to give robots the ability to complete goal-oriented actions without the time-intensive process of programming that act into the robot’s system. This could potentially save humans from doing dangerous work by sending robots in their place. Aloimonos envisions a future in which robots are able to defuse bombs and clean up nuclear disasters.

However, it is still too early to tell if a robot will help my wife or replace the Barefoot Contessa. According to Aloimonos, “We have gone as far as a simple Greek salad because that involves tomatoes, cucumbers and olive oil and oregano and salt. So you have to do lots of not-so-trivial actions.” I can only imagine the cost of that salad…
