Fashion Week: Zipperbot Hits The Runway

Removing myself from city life last week, I couldn’t help noticing inefficiencies that have yet to be solved by technology.  For example, United Airlines’ inability to deliver my luggage to my destination on time. Watching my wife encounter the same frustration, I had an epiphany: women are far more patient than men with life’s hiccups.

Women encounter one of the biggest inefficiencies almost daily: the zippers on the back of their dresses.  Contorted and frustrated, my wife constantly begs me to lend a helping hand.  Adam Whiton of MIT must have come to a similar conclusion in his aptly named sartorial robotics thesis, which produced “Zipperbot.”

In an email to Mashable, Whiton says, “fashion is a form of play with our identities and it will be important for robots/machines to have an understanding of that.”

Zipperbot

Whiton, who just received his Ph.D. from MIT’s Personal Robots group, is so convinced of the future of sartorial robotics that he founded a company (Betazip LLC) dedicated to the area. Its first product is, naturally, Zipperbot.

The robot, by the way, does far more than just zip up your jacket. It uses optical sensors to properly mesh the zipper teeth and motion sensors to zip and unzip at the right time. In one test, Whiton put Zipperbot on a form-fitting hobble skirt. When the wearer began to walk, Zipperbot detected the motion and slightly unzipped to make it easier for her to move.
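Whiton hasn’t published Zipperbot’s control code, but the sense-act loop described above (motion readings driving small unzips) can be sketched roughly. Everything below, from the function name to the thresholds, is a hypothetical illustration, not the actual implementation:

```python
# Hypothetical sketch of a Zipperbot-style sense-act loop: a motion
# reading above a threshold triggers a small unzip to ease movement,
# and the zipper creeps closed again when the wearer is still.
def zipper_controller(motion_readings, threshold=0.5, step=0.1):
    """Return the zipper position (0.0 = open, 1.0 = closed) after each reading."""
    position = 1.0  # start fully zipped
    history = []
    for reading in motion_readings:
        if reading > threshold:
            position = max(0.0, position - step)   # unzip slightly on motion
        else:
            position = min(1.0, position + step)   # re-zip when still
        history.append(round(position, 2))
    return history
```

A real controller would of course debounce the sensor and rate-limit the motor, but the loop structure, sense then adjust, is the core idea.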

Originally working as a researcher focused on robot skin, Whiton soon switched to clothing and intelligent fabrics for wearables. Eventually, he turned to fashion.

“As robots become more and more sophisticated and work more closely with people, robots will need to understand social signaling which of course includes understanding fashion and sartorial cues,” says Whiton.

Along with Zipperbot, Whiton also built a computer vision system that analyzes a person’s preferred color palette based on the clothes they’re wearing. It can then suggest new patterns and colors based on that palette analysis.  Together, these open up an entirely new category of wearable technology and sensors for smart clothes and accessories.
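Whiton’s palette analyzer isn’t public, but the basic step of reducing an image’s pixels to a few dominant colors can be sketched with a simple quantize-and-count approach (an assumption of mine; his system may work very differently):

```python
from collections import Counter

# Illustrative sketch (not Whiton's actual system): quantize each pixel's
# RGB values into coarse buckets, then rank the most frequent buckets
# to approximate the garment's dominant color palette.
def dominant_palette(pixels, bucket=64, top=3):
    """pixels: iterable of (r, g, b) tuples with 0-255 channels.
    Returns the `top` most common colors, snapped to bucket midpoints."""
    counts = Counter(
        tuple((c // bucket) * bucket + bucket // 2 for c in px) for px in pixels
    )
    return [color for color, _ in counts.most_common(top)]
```

A production system would likely cluster in a perceptual color space rather than raw RGB, but the output, a ranked handful of representative colors, is the same kind of palette a recommender could match patterns against.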

For Whiton, fashion is about more than just looking good; it may be the key to harmonious robot-human interaction. “[Robots] should understand simple differences like formal business attire versus casual in order to give context to an interaction or something more complex like the act of loosening a tie, which might indicate relaxation.”

Zipperbot 2

It’ll be a while before a robot quotes Billy Crystal’s character Fernando to tell you “you look marvelous.”  In the meantime, Whiton thinks Zipperbot could start a trend in “assistive clothing,” helping people with disabilities dress themselves and proving useful in situations where touching any part of one’s clothing (for example, chemical and biohazard suits) could be detrimental to one’s health.

Whiton proves once again that creativity is THE killer app. Now, if he can just help those drones in Denver find my lost suitcase…

Robots Learning To Cook Like Rachael Ray

In our home, we have five kids, one dog and two robots.  The robots primarily help us clean the floors.  While my wife loves to cook, it can get a little demanding being a short-order cook for a variety of household diets.  I can’t wait to tell her what the people at the University of Maryland have created: robotic chefs!

Researchers at the University of Maryland, funded by DARPA’s Mathematics of Sensing, Exploitation and Execution (MSEE) program, are teaching robots to process visual data and learn from what they see. According to the Defense Advanced Research Projects Agency (DARPA), robots that were shown cooking videos (via YouTube!) were able to grab and manipulate the correct kitchen tools and use them to complete specific tasks with great accuracy.  And this exercise required no additional programming from humans. The robots learned how to complete tasks, such as picking up a pitcher, and put that new knowledge into practice in the physical world.

“We are trying to create a technology so that robots eventually can interact with humans,” Cornelia Fermüller, an associate research scientist at the University of Maryland Institute for Advanced Computer Studies, said in a statement.

 

The robots use computer vision software to identify objects, tools and hand movements in the cooking videos. That information is broken down into smaller chunks of simple actions that, when combined, result in the robots performing complex actions, explains Yiannis Aloimonos, a professor of computer science and director of the university’s Computer Vision Lab.  Aloimonos compared the process to that of creating a sentence. In this case, each chunk of information is a word, and the robots learn the appropriate grammar to organize that information.


 

“For example, if you want to cut a cucumber, then first you have to grasp knife. Then you have to bring the knife to the cucumber. Then you have to do the cutting movement. Then you have to look to make sure that you cut and see two pieces,” Aloimonos said.
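The team’s actual action grammar isn’t reproduced in this article, but the idea of composing primitive action “words” into a task “sentence” can be sketched; the primitives and their ordering below simply mirror Aloimonos’s cucumber example and are my own simplification:

```python
# Illustrative sketch of the action-grammar idea: primitive actions are
# "words," and a task is a "sentence" that must follow a valid ordering.
PRIMITIVES = {"grasp", "move_to", "cut", "verify"}

# The cucumber task from Aloimonos's example, written as an ordered sentence.
CUT_CUCUMBER = [
    ("grasp", "knife"),
    ("move_to", "cucumber"),
    ("cut", "cucumber"),
    ("verify", "two pieces"),
]

def is_valid_sentence(actions):
    """Check that every action is a known primitive and that a grasp
    occurs before the tool is moved or used to cut."""
    seen = set()
    for verb, _ in actions:
        if verb not in PRIMITIVES:
            return False
        if verb in {"move_to", "cut"} and "grasp" not in seen:
            return False  # can't move or cut before grasping a tool
        seen.add(verb)
    return True
```

The payoff of the grammar framing is exactly what Ghanadan describes below: once a robot has learned the words and the grammar, new tasks are new sentences rather than new programs.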

The big question is: what is DARPA doing with RoboChefs in its arsenal?  Reza Ghanadan, program manager in DARPA’s Defense Sciences Office, said about the research, “The MSEE program initially focused on sensing, which involves perception and understanding of what’s happening in a visual scene, not simply recognizing and identifying objects. We’ve now taken the next step to execution, where a robot processes visual cues through a manipulation action-grammar module and translates them into actions.”

“This system allows robots to continuously build on previous learning — such as types of objects and grasps associated with them — which could have a huge impact on teaching and training,” explained Ghanadan. “Instead of the long and expensive process of programming code to teach robots to do tasks, this research opens the potential for robots to learn much faster, at much lower cost and, to the extent they are authorized to do so, share that knowledge with other robots. This learning-based approach is a significant step towards developing technologies that could have benefits in areas such as military repair and logistics.”

The researchers aren’t interested in just copying movements. They want to give robots the ability to complete goal-oriented actions without the time-intensive process of programming that act into the robot’s system. This could potentially save humans from doing dangerous work by sending robots in their place. Aloimonos envisions a future in which robots are able to defuse bombs and clean up nuclear disasters.

However, it is still too early to tell whether a robot will help my wife or replace the Barefoot Contessa.  According to Aloimonos, “We have gone as far as a simple Greek salad because that involves tomatoes, cucumbers and olive oil and oregano and salt. So you have to do lots of not-so-trivial actions.” I can only imagine the cost of that salad…

The Newest Superhero – BatBot

Today is Super Bowl Sunday, and the world is waiting with bated breath to see the outcome of Deflategate.  In other news, a drunk drone operator crashed his UAV into the White House.  This is bad news for the robot industry, as paranoia is sure to spread across the capital.  Now, the new drone “villain” is a versatile “Deployable Air-Land Exploration Robot,” or DALER.

DALER is the creation of the Swiss Federal Institute of Technology in Lausanne (EPFL), inspired by Desmodus rotundus, the vampire bat.  Unlike Count Dracula, this bat adapts easily to different terrains, and the robot mimics it by automatically changing shape to switch quickly between walking and flying:

By studying and emulating the behavior of the vampire bat, the team created a wing covered in soft fabric that folds into a smaller space when on the ground and rotates around a hinge attaching the whegs to the body. This deformable and retractable wing morphology solves a key problem in producing a drone capable of ground ambulation: flying and walking place different requirements on the center of mass.

“The robot consists of a flying wing with adaptive morphology that can perform both long distance flight and walking in cluttered environments for local exploration,” said Ludovic Daler, lead researcher and Ph.D. student at EPFL. “The robot’s design is inspired by the common vampire bat Desmodus rotundus, which can perform aerial and terrestrial locomotion with limited trade-offs.”

To achieve these limited trade-offs, the researchers experimented until they found the ideal distance between the drone’s center of mass and the axis of rotation of the wingerons, in order to improve energy efficiency. As a result, this optimum balance of masses allows the DALER to reach speeds of about 70 km/h (45 mph) in the air and around 6 cm/s (2.5 in/s) on the ground, with a maximum step distance of approximately 6 cm (2.5 in).
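To put those figures in perspective, a quick back-of-the-envelope comparison using the speeds quoted above (the 1 km distance is an arbitrary choice of mine):

```python
# Back-of-the-envelope comparison using the speeds quoted above:
# ~70 km/h in the air versus ~6 cm/s on the ground.
FLY_SPEED_M_S = 70 * 1000 / 3600   # ~19.4 m/s
WALK_SPEED_M_S = 0.06              # 6 cm/s

def travel_time_s(distance_m, speed_m_s):
    """Seconds to cover a distance at a constant speed."""
    return distance_m / speed_m_s

# Flying 1 km takes under a minute; walking the same distance
# would take more than four and a half hours.
fly_1km = travel_time_s(1000, FLY_SPEED_M_S)
walk_1km = travel_time_s(1000, WALK_SPEED_M_S)
```

The roughly 300-to-1 ratio is the whole point of the design: fly to cover distance, then walk only the last few meters of cluttered terrain.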

The EPFL team believes that the versatility of a walking and flying drone would be of great assistance in helping to locate survivors in dangerous or unstable areas after a natural disaster. The researchers see DALER remotely deployed to an affected area, where it would fly to an area of damaged buildings or destroyed infrastructure and then land to begin walking around to find victims, thus leaving human rescue teams to concentrate their efforts on moving large numbers of people in open areas.

 

The researchers also claim that potential future developments of their drone will include possible hover capabilities and the ability to take off autonomously from the ground after a mission and return to base automatically. The DALER is still in the prototype stage, and no announcement has been made as to any future commercial development.  The research was published in the journal Bioinspiration & Biomimetics.

The big question is what people fear more: vampires or drones?  If one wayward drone could send the Secret Service around Washington into hysteria, imagine the reaction to a hybrid vampire bot…

There’s A New Sheriff In Town

One of my favorite scenes in Blazing Saddles is when Cleavon Little rides into town to become the new sheriff of Rock Ridge.  In many ways, that movie broke down racial barriers through its comedic genius.  The question I ask today is whether we need a movie to make the police and citizens trust robots.

Earlier this week, CNN reported that a smuggler’s drone flying from Mexico crash-landed just south of the U.S. border city of San Ysidro, California, in a failed drug delivery, Tijuana Municipal Police said.  The incident showed that smugglers aren’t just going underground anymore, using tunnels beneath the U.S.-Mexico border to transport drugs and migrants.  Now, the smugglers are trying to do business using unmanned aerial vehicles.

 

“To date, U.S. Customs and Border Protection has not intercepted any drones smuggling narcotics across the borders into the United States,” said Alberto Vallina, supervisory Border Patrol agent in San Diego. “In collaboration with our federal, state, local and international law enforcement partners, CBP remains vigilant against emerging trends and ever-changing tactics employed by transnational criminal organizations behind illegal attempts to smuggle narcotics into the U.S.”

The drone was loaded with more than six pounds of the synthetic drug crystal meth, Tijuana police said.  “In San Diego, the street value, at last account, for a six-pound load would be about $48,000,” DEA Special Agent Matt Barden said. “Once you get it across the border, that stuff’s like gold.”

Drones are emerging as the latest technological gadget used by cartels and smugglers in trying to outfox border authorities. The crashed drone was a prototype that used a global positioning system, or GPS, to send it to a particular destination, Tijuana police said on the department’s Facebook page.

“The cartels have been using drones for surveillance. Transporting drugs is a bit more complicated,” said Sylvia Longmire, a leading drug war analyst. “This is further evidence that the cartels have unlimited funds and creativity.”

In response to the pressure from the cartels, the US government has been employing a fleet of robots to scour tunnels looking for drugs.  The nation’s increasingly high-tech battle against drug smuggling along the Southwest border just got another ally: a wireless, compact, camera-equipped robot.  Since 1990, authorities have discovered 168 tunnels in Arizona and California used mostly to smuggle drugs. More than half were dug up along the border stretch in Nogales, Ariz., where covert diggers often breach an underground flood-control system to enter the US.

“We’ve found all types of contraband in Nogales,” border patrol Agent Kevin Hecht says. “We’ve had marijuana, we’ve had cocaine, we’ve had heroin, we’ve had some meth.”

“That is not an option we needed right now… Once you determine there’s no threats and it’s safe for the agent to make entry, then the agent can clear the tunnel and investigate further beyond what the robot was able to do,” said Agent Hecht.

The military-grade Pointman Tactical Robot is only 19 inches wide and can flip, negotiate rough terrain, and climb stairs (shown left with shotgun).

“Predominantly SWAT teams use them to get a look inside buildings before they enter,” says Alex Kaufman, who works for Applied Research Associates, Inc., of Albuquerque, N.M., which sells the robot.  The robot’s range and mobility will allow it to be more effective than the tethered robots currently used in the sometimes rudimentary, sometimes elaborate tunnels found along the border, he says.

So the war on drugs has come down to a war between robots: cartels with drones versus DEA agents with tactical mobile units.  As previously written, the concept of RoboCop is within reach.  Sheriff Bart, you can now retire.

VolcanoBot: Journey to the Center of the Earth

A few weeks ago, I camped opposite an active volcano in Costa Rica.  As I was traveling with my family to Arenal, all the kids were starting to feel a little anxious about a possible eruption.  Volcanoes are one of the last frontiers on Earth to explore; however, the extreme conditions make them next to impossible (for a human) to investigate.

Earlier this week, NASA announced its latest mission: exploring inner Earth. Space may be vast, but the planets can be pretty cramped, especially when it comes to volcanoes.  This is unfortunate, because the difficult-to-navigate fissures that are a major volcanic feature contain clues about the interiors of planets and moons and the mechanisms that formed them.  To help learn more, NASA is dropping miniature robots, called VolcanoBots, down crevices inaccessible to humans as a way of extracting information about volcanoes on and off the Earth.

VolcanoBot 1 in a lava tube (Image: NASA/JPL/Caltech)

VolcanoBot 1 (shown above) was based on NASA’s Durable Reconnaissance and Observation Platform (DROP) and was created by NASA postdoctoral fellow Carolyn Parcheta and robotics researcher Aaron Parness at NASA’s Jet Propulsion Laboratory (JPL) in Pasadena, California. Measuring 12 in (30 cm) long with 6.7 in (17 cm) wheels, it was designed to navigate the narrow fissures that are a common feature of volcanoes on Earth, Mars and Mercury, as well as the moons Enceladus and Europa, and to retrieve data that may provide insights into how these volcanoes formed.

In May of last year, VolcanoBot 1 was sent down a fissure at the Kilauea volcano on the island of Hawaii (see video below). During the test, it descended to 82 ft (25 m) at two locations, which was the limit of its tether, though not the bottom of the unexpectedly deep fissure. In addition, the robot built up a 3D map of the crevice and discovered that surface bulges on the volcano were reflected underground.
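JPL hasn’t published the mapping pipeline, but the basic idea of turning range readings taken during a tethered descent into a fissure profile can be sketched. The data format and function below are entirely hypothetical, chosen only to illustrate the kind of depth-versus-width map described above:

```python
# Hypothetical sketch: convert (depth, left_range, right_range) readings
# logged during a tethered descent into a depth -> fissure-width profile.
def fissure_profile(readings):
    """readings: list of (depth_m, left_range_m, right_range_m) tuples,
    where the ranges are distances from the robot to each fissure wall.
    Returns (depth_m, width_m) pairs sorted from shallow to deep."""
    profile = [(depth, left + right) for depth, left, right in readings]
    return sorted(profile)
```

Plotting such a profile at centimeter resolution is what lets researchers see features like the surface bulges reappearing underground.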

“In order to eventually understand how to predict eruptions and conduct hazard assessments, we need to understand how the magma is coming out of the ground,” says Parcheta. “This is the first time we have been able to measure it directly, from the inside, to centimeter-scale accuracy.”

The JPL team’s next step will be to build an improved robot called VolcanoBot 2, which will have a longer tether, stronger motors, better communications, and the ability to store data in onboard memory.  It will also be smaller, at only 10 in (25 cm) long with 5 in (12 cm) wheels. According to Parcheta, these modifications will allow the robot to go deeper while dragging fewer cords behind it. In addition, it will have improved cameras, which will be able to pan and tilt. VolcanoBot 2 is scheduled to be deployed at Kilauea in March.

VolcanoBot is just one example of the many possibilities robots have opened up for exploration.  Once the tether is removed, the possibilities could be literally endless; who knows, they may even discover the hidden world of the Zarn.