Meet the Robotic Octopus

Last month, I toured CES with a CTA board member who painted a picture of the show 30 years ago, folding tables and all. One qualitative read on how fast a market is growing is the number and size of its trade shows. My inbox has been flooded lately with new robotics conferences, from RoboUniverse to RoboBusiness to XPONENTIAL, to the newest entry – Soft Robotics Week in Livorno, Italy.

Soft robotics has become a hot segment of the robotics industry. Earlier this year, a start-up out of Cambridge, Massachusetts, named (originally) Soft Robotics raised $4.5 million in venture capital to market its unique gripper to the logistics and agricultural industries. As CEO Carl Vause explains, “we use no force sensors, no feedback systems and we don’t do a lot of planning…we just go and grab an object.” Last year, I personally witnessed the value proposition of Soft Robotics at the inaugural RoboUniverse show. For those who missed it, here is a quick product demo:

However, the genesis of the idea of gripping objects with air instead of metal dates back nearly ten years, to when Cecilia Laschi asked her father to catch a live octopus for her seaside lab in Livorno, Italy. He thought she was crazy: as a recreational fisherman, he considered the octopus so easy to catch that it must be a very stupid animal. And what did a robotics researcher who worked with metal and microprocessors want with a squishy cephalopod anyway?

Nevertheless, the elder Laschi caught an octopus off the Tuscan coast and gave it to his daughter, who works for the Sant’Anna School of Advanced Studies in Pisa, Italy. She and her students placed the creature in a saltwater tank where they could study how it grasped titbits of anchovy and crab. The team then set about building robots that could mimic those motions.


Prototype by prototype, they created an artificial tentacle with internal springs and wires that mirrored an octopus’s muscles, until the device could undulate, elongate, shrink, stiffen and curl in a lifelike manner. “It’s a completely different way of building robots,” says Laschi.

This approach has become a major research front for robotics in the past ten years. Scientists and engineers in the field have long worked on hard-bodied robots, often inspired by humans and other animals with hard skeletons. These machines have the virtue of moving in mathematically predictable ways, with rigid limbs that can bend and straighten only around fixed joints. But they also require meticulous programming and extensive feedback to avoid smacking into things; even then, their motions often become erratic or even dangerous when dealing with humans, new objects, bumpy terrain or other unpredictable situations.

Robots inspired by flexible creatures such as octopuses, caterpillars or fish offer a solution. Instead of requiring intensive (and often imperfect) computations, soft robots built of mostly pliable or elastic materials can just mould themselves to their surroundings. Although some of these machines use wires or springs to mimic muscles and tendons, as a group, soft robots have ditched the skeletons that defined previous robot generations. With nothing resembling bones or joints, these machines can stretch, twist, scrunch and squish in completely new ways. They can transform in shape or size, wrap around objects and even touch people more safely than ever before.

Inspired by the octopus, engineers are creating robots that can twist their way around problems that rigid robots can’t handle. Researchers have already produced a wide variety of such machines, including crawling robotic caterpillars, swimming fish-bots and undulating artificial jellyfish.

“If you look in biology, and you ask what Darwinian evolution has coughed up, there are all kinds of incredible solutions to movement, sensing, gripping, feeding, hunting, swimming, walking and gliding that have not been open to hard robots,” says chemist George Whitesides, a soft-robotics researcher at Harvard University in Cambridge, Massachusetts. “The idea of building fundamentally new classes of machines is just very interesting.”

The millions of industrial robots around the world today are all derived from the same basic blueprint. The metal-bound machines use their hefty, rigid limbs to shoulder the grunt work in car-assembly lines and industrial plants with speed, force and mindless repetition that humans simply can’t match. But standard robots require specialized programming, tightly controlled conditions and continuous feedback of their own movements to know precisely when and how to move each of their many joints. They can fail spectacularly at tasks that fall outside their programming parameters, and they can malfunction entirely in unpredictable environments. Most must stay behind fences that protect their human co-workers from inadvertent harm.

“Think about how hard it is to tie shoelaces,” says Daniela Rus, director of the Computer Science and Artificial Intelligence Laboratory at the Massachusetts Institute of Technology in Cambridge. “That’s the kind of capability we’d like to have in robotics.”

Over the past decade, that desire has triggered an increased interest in lighter, cheaper machines that can handle fiddly or unpredictable situations and collaborate directly with humans. Some roboticists, including Laschi, think that soft materials and bioinspired designs can provide an answer.

That idea was a tough sell at first, Laschi says. “In the beginning, very traditional robotics conferences didn’t want to accept my papers,” she says. “But now there are entire sessions devoted to this topic.” Helping to fuel the surge in interest are recent advances in polymer science, especially the development of techniques for casting, moulding or 3D printing polymers into custom shapes. This has enabled roboticists to experiment more freely and quickly with making soft forms.

As a result, more than 30 institutions have now joined the RoboSoft collaboration, which kicked off in 2013. The following year saw the launch of a dedicated journal, Soft Robotics, and of an open-access resource called the Soft Robotics Toolkit: a website developed by researchers at Trinity College Dublin and at Harvard that allows researchers and amateurs to share tips and find downloadable designs and other information. 

Perhaps the most fundamental challenge is getting the robots’ soft structures to curl, scrunch and stretch. Laschi’s robotic tentacle houses a network of thin metal cables and springs made of shape-memory alloys: easily bendable metals that return to their original shapes when heated. Laid lengthwise along the ‘arm’, some of these components simulate an octopus’s longitudinal muscles, which shorten or bend the tentacle when they contract. Others radiate out from the tentacle’s core, simulating transverse muscles that shrink the arm’s diameter. Researchers can make the tentacle wave, or even curl around a human hand, by pulling certain combinations of cables with external motors, or by heating springs with electrical currents.
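To make that actuation scheme concrete, here is a toy sketch in Python, my own illustrative simplification rather than anything from Laschi’s lab, of how combinations of longitudinal and transverse ‘muscle’ contractions map onto gross tentacle motions:

```python
# Toy model of the antagonistic "muscles" described above: longitudinal
# cables shorten or bend the arm, transverse elements shrink its diameter
# (which, in a constant-volume arm, elongates it), and co-contraction
# stiffens it. Illustrative only -- not Laschi's actual control code.

def tentacle_motion(longitudinal: float, transverse: float) -> str:
    """Classify the gross motion for contraction levels in [0, 1]."""
    if longitudinal > 0.5 and transverse > 0.5:
        return "stiffen"          # co-contraction raises stiffness
    if longitudinal > 0.5:
        return "shorten or bend"  # lengthwise cables pull the arm in
    if transverse > 0.5:
        return "elongate"         # squeezing the core lengthens the arm
    return "relax"

# A lifelike wave is just a travelling pattern of such contractions
# along the arm, driven by external motors or heated SMA springs.
for lon, tra in [(0.9, 0.1), (0.1, 0.9), (0.8, 0.8), (0.0, 0.0)]:
    print(f"longitudinal={lon}, transverse={tra} -> {tentacle_motion(lon, tra)}")
```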

A similar system helps to drive the soft-robotic caterpillars that neurobiologist Barry Trimmer has modelled on his favourite experimental organism, the tobacco hornworm (Manduca sexta). At his lab at Tufts University in Medford, Massachusetts, 20 hornworms are born each day, and Trimmer 3D prints a handful of robotic ones as well. The mechanical creatures wriggle along the lab bench much like the real ones, and they can even copy the caterpillar’s signature escape move: with a pull here and a tug there on the robot’s internal ‘muscles’, the machine snaps into a circle that wheels away. Trimmer, who is editor-in-chief of Soft Robotics, hopes that this wide range of movements could one day turn the robot into an aide for emergency responders that can rapidly cross fields of debris and burrow through rubble to locate survivors of disasters.

Whitesides, meanwhile, is pioneering robots powered by air, among them a family of polymer-based devices inspired by the starfish (see previous blog posts). Each limb consists of an internal network of pockets and channels sandwiched between two materials of differing elasticity. Whitesides’ research led to Soft Robotics (the company that just got funded), whose grippers, made mainly of soft, stretchy materials, can envelop and conform to objects of different shapes and sizes.
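The bending principle is simple enough to capture in a few lines. The sketch below is a back-of-envelope model assuming a linear pressure-to-curvature relation (the gain constant is invented for illustration), not Whitesides’ published mechanics:

```python
# A pneu-net limb bends because inflating channels in an elastic layer
# bonded to a stiffer layer lengthens one side but not the other.
# The linear gain below is an illustrative assumption, not measured data.

GAIN = 0.004  # curvature (1/m) per kPa of channel pressure -- assumed

def curvature(pressure_kpa: float) -> float:
    """Toy model: limb curvature at a given channel pressure."""
    return GAIN * pressure_kpa

def pressure_to_wrap(object_radius_m: float) -> float:
    """Pressure needed for the limb to curl to an object's radius."""
    return (1.0 / object_radius_m) / GAIN

for radius in (0.05, 0.02):  # a 10 cm and a 4 cm diameter object
    print(f"radius {radius} m -> ~{pressure_to_wrap(radius):.0f} kPa")
```

The appeal of this design is exactly what Vause describes above: the gripper needs no force sensors or feedback loop, because the material itself conforms to whatever it touches.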

The RoboSoft challenge in April could help to spur more development. There, the entries will be put through their paces: challenges include racing across a sand pit, opening a door by its handle, grabbing a number of mystery objects and avoiding fragile obstacles under water. The goal, says Laschi, is to demonstrate that soft robots can accomplish some of the same tasks that stiff robots do, as well as others that they cannot.

“I don’t think soft robotics is going to replace traditional robotics, but it will be a combination of the two in the future,” says Laschi. Many researchers think that rigid robots might retain their superiority in jobs requiring great strength, speed or precision. But for a growing number of applications involving close interactions with people, or other unpredictable situations, soft robots could find a niche.

At King’s College London, for example, Laschi’s collaborators are developing a surgical endoscope based on her tentacle technology. And her team in Italy is developing a full-bodied robot octopus that swims by fluid propulsion, and could one day be used for underwater research and exploration. The prototype already pulses silently through a tank in her lab, as the real octopuses swim in the salty waters just outside.

“When I started with the octopus, people asked me what it was for,” says Laschi. “I said, ‘I don’t know, but I’m sure if it succeeds there could be many, many applications’.”

Robots build launch pads on Mars to explore the Galaxy

The Martian is a great plane movie: long enough to pass the time for at least half a Vegas flight. We know the story: astronaut Mark Watney is presumed dead after a fierce storm and left behind by his crew. But Watney has survived and finds himself stranded and alone on the hostile planet. With only meager supplies, he must draw upon his ingenuity, wit and spirit to subsist and find a way to signal to Earth that he is alive…


However, humans have never built a structure on another planet. So far, everything hurled beyond our atmosphere and into the great beyond has been constructed on Earth, made by human hands or human-built machines using resources from sweet mother Terra herself. If we want to venture forth into the cosmos like Watney and, say, launch a return rocket home, it would be nice to have a launch pad in place on the alien planet. Instead of hauling a launch pad there, why not make a machine that can use local materials to build one?


Over the course of several months, a remotely controlled robot from the Pacific International Space Center for Exploration Systems (PISCES) did just that. And now, thanks to Project Manager Rodrigo Romo (and Honeybee Robotics of NYC), we can watch that construction in all its impressive, tedious glory:

The project, a first of its kind, aims to robotically build a vertical take-off and landing pad using basalt found on Hawaii’s Big Island. The goal is to successfully build a landing pad on Earth using local materials, proving that the same can be done in space.

Landing pads will be crucial in future space missions. Spacecraft can kick up high-velocity dust storms during take-off and landing, blasting planetary dust in all directions. These jet-propelled sandblasts could cause significant damage to neighboring structures and space equipment. Landing pads mitigate the problem by offering a flat, stable surface.

This was part of NASA’s larger Additive Construction with Mobile Emplacement (ACME) project, which aims to use materials found on alien worlds, builder robots like this one, and 3D printing to erect structures without bringing all the parts from Earth. So if a robot can flatten and smooth a tract of Hawaii into a serviceable landing pad and then cover it in durable interlocking tiles, a robot on Mars or elsewhere could likely do the same for a future mission. Good news for space exploration, even better news for getting space explorers home. Run the video tape:

One last point: if robots can build structures on Mars, imagine what they could do with Manhattan real estate.

Biodegradable Robots

My favorite robot movie, hands down, is Blade Runner (the film adaptation of “Do Androids Dream of Electric Sheep?”). As Blade Runner takes place in the near future, just three years from now, it raises big ethical questions about disposing of humanoids once they reach their expiration date…


Scientists in the Smart Materials Group at the Italian Institute of Technology (IIT) in Genoa have been grappling with robotic death for quite some time. Robots are getting ever more life-like, but underneath their synthetic skin it’s a different story: their insides are still made mostly from metal and plastic, materials that are hard to dispose of. But these researchers in Italy have developed ‘smart materials’ that could allow robots to be built from substances that biodegrade when they reach the end of their life-span (no need to call Deckard).

“We are infusing any material with nanotechnology. So what we are doing apart from making these new composite materials – smart materials – we’re also using them to change the properties of other materials, other existing materials like paper or cotton or different foams; from synthetic foams like polyurethane or forms of cotton. So like this, in all these existing materials we are giving new properties that these materials don’t have so we can open up their application range,” explained Athanassia Athanassiou, who leads the Smart Materials Group at IIT.

The researchers say their ‘smart materials’ could eventually replace conventional plastic, which is made from petroleum, a fossil fuel, and contributes to climate change. Bioplastics are made from plant material but are more energy-intensive to produce. Athanassiou’s team has developed a way to create bioplastic from food waste, hoping to offset that extra energy cost by using resources that would normally go to waste.

In particular, robotics could be an important application for their research, according to Athanassiou, “these biodegradable materials, natural materials, they are very flexible so they can be used for robotic skins. But they can be also very hard so they can be used for internal parts of a robot. And also, in this flexible skin – robotic skin let’s say – we can incorporate sensors so they have this tactile sensing that the robots need, but with biodegradable materials.”


Nikos Tsagarakis, lead researcher on a humanoid robot project at the IIT, said that roboticists will have to move on from metal in order to build the next generation of robots.

“The main issue is it’s actually difficult to see how you can achieve the properties that you want to have; say matching more the properties of the human body. So going to alternative materials would be this advantage – it will help us to make lighter robots, more efficient and, finally, also recyclable,” said Tsagarakis, who is developing the Walk-Man humanoid robot to operate human tools and interact with its environment in the same way a person would.


Robots made from biodegradable materials would certainly be more human-like, and perhaps more easily accepted in the real world. And if robots are ever to become truly ubiquitous, they also need to be easily disposed of once they reach the end of their useful life-span (commonly known as the Blade Runner issue). While Athanassiou believes biodegradable materials are imminent for the skin-like outer layer, she expects that eventually the entire robot body could decompose just as if it were flesh and blood.

As Deckard drives off into the sunset with his humanoid love Rachael, he recalls Gaff’s parting words: “It’s too bad she won’t live! But then again, who does?” How true…

Robots lurking below the high seas

My kids are obsessed with the Titanic, the ship, not the movie. While we do live near Straus Park (dedicated to the victims), I think there is something bigger and more mysterious when one reflects on falling into the giant void of the sea. Oceans cover over 70% of the Earth’s surface, yet more than 95% of the underwater world remains unexplored. As an example of the magnitude of Neptune’s dungeons, the deepest part of the ocean is more than 36,000 feet below the surface. While Elon Musk looks beyond the skies, the real final frontier is right off the coast, waiting for mechanical explorers.

Now, for the first time, deep-sea mining robots are poised to plunge into the Bismarck Sea near Papua New Guinea, diving more than 5,000 feet below the waves in search of gold. These massive machines, which are to be beta-tested sometime this year, are part of a high-stakes gamble for the Toronto-based mining company Nautilus Minerals. Nautilus’s machines have been ready to go since 2012, when a dispute between the firm and the Papua New Guinean government stalled the project. What broke the impasse was the company’s offer, in 2014, to provide Papua New Guinea with certain intellectual property from the mining project.


The deal enabled Nautilus to get financing to build a $200 million ship, the first of its kind, which will deploy the subsea mining robots and process the ore they recover. This 227-meter-long production vessel is now being built in a Chinese shipyard and is scheduled to depart for Papua New Guinea in early 2018.

The mining robots were built for Nautilus by Soil Machine Dynamics, based in the United Kingdom, which supplies construction equipment for laying undersea cables, servicing offshore oil platforms, and other heavy-duty deep-sea jobs. The main robots are a pair of tractor-trailer-size excavators. One uses 4-meter-wide counterrotating heads studded with tungsten carbide picks to chew through the metal-rich chimneys that form around superhot water spewing from sulfurous vents in the seafloor. Its partner adds brute strength, using a studded drum that is 2.5 meters in diameter and 4 meters wide to pulverize rock walls.

Dredge pumps built into these machines will push the smashed ore back to a central pile on the seafloor, where a third Nautilus robot will feed a slurry of crushed rock and water up a pipe dangling from the production vessel. There the water will be wrung out from the ore, which will be loaded on another ship and carried to China for processing.

As 2015 drew to a close, Nautilus was still negotiating for access to a shallow-water site for an initial subsea test of these machines, which it hoped to begin in mid-2016. The plan is to do some rock cutting, though in an interview Nautilus’s CEO, Michael Johnston, says it is “difficult getting materials that are a good proxy for the materials we’ll be mining.” If time allows, the machines will also get a deep-sea trial before they are integrated with the production vessel, Johnston adds. Barring that, they will have to prove their stuff at Nautilus’s first mining site, called Solwara 1, which is located some 30 kilometers from shore in Papua New Guinea’s New Ireland province.

Assuming all goes well, the robotic diggers will spend 30 months scouring the Solwara 1 site, bringing up 2.5 million metric tons of ore containing metals worth more than US $1.5 billion at today’s prices. Next, the robots will likely set to work on one of Nautilus’s 18 other prospects in the Bismarck Sea or one of its 19 discoveries off the shores of the Polynesian archipelago of Tonga.
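Those headline numbers imply a punishing duty cycle for the machines. A quick back-of-envelope check, using only the figures quoted above:

```python
# Back-of-envelope check on the Solwara 1 figures quoted above.
ore_tonnes = 2_500_000          # ore to be recovered over the campaign
months = 30                     # planned duration of mining
value_usd = 1_500_000_000       # quoted metal value at today's prices

tonnes_per_day = ore_tonnes / (months * 30.4)  # ~30.4 days per month
print(f"~{tonnes_per_day:,.0f} tonnes of ore per day")  # roughly 2,700
print(f"~${value_usd / ore_tonnes:,.0f} per tonne")     # roughly $600
```

In other words, the excavators need to cut, crush and pump nearly 3,000 tonnes of rock a day, every day, for two and a half years.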

Competitors are staking out deep-sea mining sites of their own, with much of the development activity focused on rich deposits of polymetallic nodules in a vast region southeast of Hawaii known as the Clarion-Clipperton Fracture Zone. The potato-size nodules, found in waters more than 4 km deep, contain manganese along with nickel, cobalt, and other metals.

But some marine biologists warn that deep-sea mining interests are outpacing the readiness of scientists and governments to assess and manage the environmental impact. Verena Tunnicliffe, a specialist in deep-sea vent ecosystems at the University of Victoria, in British Columbia, Canada, says robo-miners will strip away deep-sea ecosystems that are as unique as they are poorly understood.

Johnston points out that Nautilus is taking pains to study these ecosystems and will protect them to the extent possible. A refuge zone within the leased area, for example, will provide a source of local fauna for recolonization of the company’s deep-sea strip mine.

Tunnicliffe worries that this vision for recolonization could prove wildly optimistic: “The habitat is going to be pulverized, and the energy flow of the system will be completely altered. I do not believe recolonization of these types of populations is going to happen.”

Other marine biologists are more sanguine, however. With luck, the mining will prove no more devastating to these vent communities over the long term than the frequent earthquakes and outpourings of lava that these amazing deep-sea creatures are somehow able to survive.

CES 2016 Diary: Drones and Other Stupid Pet Tricks…

“What happens in Vegas stays in Vegas,” so what I am about to share is really hush-hush… Walking the halls of every part of CES 2016, three pervasive themes rang out almost too loud and clear: drones, virtual reality, and IoT/smart homes. As a Frontier Tech investor, I find this very encouraging for the speed of the marketplace; at the same time, the glut of me-too companies in the industry is scary. I mean, really, how many companies can the drone hobby market support?

One of the biggest impressions CES 2016 left on me was the prominence of Intel’s RealSense technology in the company’s future strategy. RealSense is a computer vision technology that leverages three cameras acting as one (a 1080p HD camera, an infrared camera, and an infrared laser projector) to “see” like a human eye, sensing depth and tracking human motion. According to Intel’s website, “RealSense technology redefines how we interact with our devices for a more natural, intuitive and immersive experience, supported by the powerful performance of Intel® processors.”
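For developers who want a feel for what that depth stream looks like, here is a minimal sketch using pyrealsense2, the Python binding for Intel’s open-source librealsense SDK (a later toolchain than the 2016-era RealSense SDK, used here purely to make the idea concrete):

```python
# Minimal depth-sensing sketch with pyrealsense2 (librealsense).
# Reads one depth frame from an attached RealSense camera and reports
# the distance to whatever sits at the center of the image.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    # get_distance() returns meters at the given pixel coordinates.
    print(f"center distance: {depth.get_distance(320, 240):.2f} m")
finally:
    pipeline.stop()
```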


The bigger story of RealSense is the evolution of the technology from mobile to virtual reality to drone navigation.  According to Intel, the new Intel RealSense Smartphone featuring Project Tango “represents the best in depth and motion sensing technology integrated into a sleek and thin smartphone form factor. The prototype will allow Android developers to create new applications and experiences for the Intel RealSense technology and Project Tango ecosystems including 3-D scanning, indoor navigation, depth-enabled photography and video, measurements and immersive augmented reality and virtual reality.”

Intel has been trying for years to catch up in the smartphone game, with less than stellar success. While there are several notable Intel-inside smartphones on the market from Lenovo, Motorola, and others, the lion’s share of smartphones run on ARM-based chips from the likes of Qualcomm. The phone demonstrated last week at the Intel Developer Forum (IDF) sported an Intel Atom x5 quad-core Z8500 processor and ran a specially developed version of the Android Lollipop operating system.

RealSense is Intel’s version of advanced gesture-based computing. Built into a camera designed for smartphones, tablets, or any other “Internet of Things” networked device, appliance, or object, the system lets users interact with the camera and computers, for example changing the TV channel by moving their fingers in the air. Tango supplies the 3D side of the integrated system, with technology to understand movement, depth, and space.

Another example for RealSense beyond VR and drones is robotics. Segway’s new hoverboard has an Intel RealSense eye, which comes in handy when the self-balancing board morphs into a personal robot. The company showed off the multipurpose hoverboard during a presentation at CES 2016 in Las Vegas on Tuesday.


It’s a new leaf for Segway, which was recently acquired by China’s Ninebot. The Ninebot Segway employs a bar-less build, giving users hands-free control over the self-balancing board. But beyond that, the Ninebot Segway has a mind of its own.

“For years, there has been the promise of a personal robot that would provide real help in and around the house,” Ninebot says. “This Tuesday at CES, Segway took a big step towards making that dream a reality with its Segway Robot showcase in the opening demo for the CES16 Intel Keynote.”

When a rider hops off of the hoverboard, the Ninebot Segway can shape shift into a robot that’ll follow that person around, taking pictures and commands. The robot uses Intel’s RealSense camera to make its way around dynamic environments and it can interact with both users and sensors in the home. The robot also has an Intel Atom processor inside.
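Conceptually, turning a depth camera into that follow-me behavior takes little more than the tracked person’s offset and distance plus a simple control loop. The sketch below is my own toy proportional controller, not Ninebot’s actual software:

```python
# Illustrative follow-me loop: given the tracked person's horizontal
# offset and distance (e.g., from a depth camera like RealSense),
# compute drive commands. Toy logic only -- not Segway/Ninebot code.

FOLLOW_DISTANCE_M = 1.5  # how far behind the person to stay -- assumed

def follow_step(person_offset_m: float, person_distance_m: float):
    """Return (forward_speed, turn_rate), each clamped to [-1, 1],
    that keep the person centered at the follow distance."""
    forward = 0.8 * (person_distance_m - FOLLOW_DISTANCE_M)
    turn = 1.2 * person_offset_m  # steer toward the person
    clamp = lambda v: max(min(v, 1.0), -1.0)
    return clamp(forward), clamp(turn)

print(follow_step(0.3, 2.5))   # person ahead-right: drive forward, turn right
print(follow_step(-0.1, 1.5))  # at follow distance, slightly left: just turn
```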

The Ninebot Segway also includes a depth-sensing camera, a fisheye tracking camera and a camera for taking photos. And it can get smarter: the robot’s platform is open and compatible with Android, and the company has issued a call for developers to have a go at it.

“Segway plans to make the robot commercially available and will initially introduce a developer kit based on Android platform in second half of this year,” Ninebot says. “Developers worldwide will be able to use this SDK to allow the robot to perform new applications and to interact with other devices.”

Intel hopes that RealSense will enable device makers to use more Intel chips by offering a new ecosystem of computer vision for the post-PC era.

According to Intel representatives, “The combination brings a wide-ranging set of computer vision technologies into a single mobile platform…the solution is for Android developers to create new applications and experiences for the Intel RealSense technology and Project Tango ecosystems including 3-D scanning, indoor navigation, depth-enabled photography and video, measurements and immersive augmented reality and virtual reality. This complementary set of technologies enables Android developers to experiment with and create a new class of end-user applications on a single mobile platform.”

Of course, walking across the hall, Qualcomm had its own drone mapping technology on display. The difference between the two feature sets is that RealSense is not limited to photography and mapping; it is a robust platform for the new age of autonomous mobility, robotics, virtual reality and, yes, IoT.

One last snapshot: readers, what do you think of this self-driving eco-car?

