There’s a new sheriff in town

One of my favorite scenes in Blazing Saddles is the one in which Cleavon Little rides into town to become the new sheriff of Rock Ridge.  In many ways that movie broke down racial barriers through its comedic genius.  The question I ask today is whether we need a movie to make police and citizens trust robots.

Earlier this week CNN reported that a smuggler’s drone flying from Mexico crash-landed just south of the U.S. border city of San Ysidro, California, in a failed drug delivery, Tijuana Municipal Police said.  The incident showed that smugglers aren’t just going underground anymore, using tunnels beneath the U.S.-Mexico border to transport drugs and migrants.  Now, the smugglers are trying to do business using unmanned aerial vehicles.


“To date, U.S. Customs and Border Protection has not intercepted any drones smuggling narcotics across the borders into the United States,” said Alberto Vallina, supervisory Border Patrol agent in San Diego. “In collaboration with our federal, state, local and international law enforcement partners, CBP remains vigilant against emerging trends and ever-changing tactics employed by transnational criminal organizations behind illegal attempts to smuggle narcotics into the U.S.”

The drone was loaded with more than six pounds of the synthetic drug crystal meth, Tijuana police said.  “In San Diego, the street value, at last account, for a six-pound load would be about $48,000,” DEA Special Agent Matt Barden said. “Once you get it across the border, that stuff’s like gold.”

Drones are emerging as the latest technological gadget used by cartels and smugglers in trying to outfox border authorities. The crashed drone was a prototype that used a global positioning system, or GPS, to send it to a particular destination, Tijuana police said on the department’s Facebook page.
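In principle, GPS waypoint guidance like the prototype's comes down to repeatedly computing the distance and bearing from the craft's current fix to the target coordinates, then steering along that bearing. The sketch below is a generic illustration of that calculation (the haversine distance and initial-bearing formulas), not anything recovered from the smuggling drone, whose actual firmware is unknown:

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (meters) and initial bearing (degrees,
    clockwise from true north) between two GPS fixes."""
    R = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    # Haversine formula for great-circle distance
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    # Initial bearing toward the destination
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing
```

A flight controller would call this on every GPS update and turn toward the returned bearing until the distance falls below some threshold.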

“The cartels have been using drones for surveillance. Transporting drugs is a bit more complicated,” said Sylvia Longmire, a leading drug war analyst. “This is further evidence that the cartels have unlimited funds and creativity.”

In response to the pressure from the cartels, the US government has been employing a fleet of robots to scour tunnels looking for drugs.  The nation’s increasingly high-tech battle against drug smuggling along the Southwest border just got another ally: a wireless, compact, camera-equipped robot.  Since 1990, authorities have discovered 168 tunnels in Arizona and California used mostly to smuggle drugs. More than half were found along the border stretch in Nogales, Ariz., where covert diggers often breach an underground flood-control system to enter the US.

“We’ve found all types of contraband in Nogales,” border patrol Agent Kevin Hecht says. “We’ve had marijuana, we’ve had cocaine, we’ve had heroin, we’ve had some meth.”

“That is not an option we needed right now. … Once you determine there’s no threats and it’s safe for the agent to make entry, then the agent can clear the tunnel and investigate further beyond what the robot was able to do,” said Agent Hecht.

The military-grade Pointman Tactical Robot is only 19 inches wide and can flip, negotiate rough terrain, and climb stairs (shown at left with a shotgun).

“Predominantly SWAT teams use them to get a look inside buildings before they enter,” says Alex Kaufman, who works for Applied Research Associates, Inc., of Albuquerque, N.M., which sells the robot.  The robot’s range and mobility will allow it to be more effective than the tethered robots currently used in the sometimes rudimentary, sometimes elaborate tunnels found along the border, he says.

So the war on drugs has come down to a war between robots: cartels with drones versus DEA agents with tactical mobile units.  As previously written, the concept of RoboCop is within reach.  Sheriff Bart, you can now retire.

VolcanoBot: Journey to the Center of the Earth

A few weeks ago, I encamped opposite an active volcano in Costa Rica.  As I was traveling with my family to Arenal, all the kids were starting to feel a little anxious about a possible eruption.  Volcanoes are one of the last frontiers on Earth to explore; however, the extreme conditions make them next to impossible (for a human) to investigate.

Earlier this week, NASA announced its latest mission: exploring inner Earth. Space may be vast, but the planets can be pretty cramped – especially when it comes to volcanoes.  This is unfortunate because the difficult-to-navigate fissures that are a major volcanic feature contain clues to the interior of planets and moons and the mechanisms that formed them.  To help learn more, NASA is dropping miniature robots, called VolcanoBots, down crevices inaccessible to humans as a way of extracting information about volcanoes on and off the Earth.

VolcanoBot 1 in a lava tube (Image: NASA/JPL/Caltech)

VolcanoBot 1 (shown above) was based on NASA’s Durable Reconnaissance and Observation Platform (DROP) and was created by NASA postdoctoral fellow Carolyn Parcheta and robotics researcher Aaron Parness at NASA’s Jet Propulsion Laboratory (JPL) in Pasadena, California. Measuring 12 in (30 cm) long with 6.7 in (17 cm) wheels, it was designed to navigate into the narrow fissures that are a common feature of volcanoes on Earth, Mars, and Mercury, as well as the moons Enceladus and Europa, and to retrieve data that may provide insights into how these volcanoes formed.

In May of last year, VolcanoBot 1 was sent down a fissure at Mount Kilauea volcano on the island of Hawaii (see video below). It descended to 82 ft (25 m) at two locations, which was the limit of its tether, though not the bottom of the unexpectedly deep fissure. In addition, the robot built up a 3D map of the crevice and discovered that surface bulges on the volcano were mirrored underground.
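Conceptually, building a 3D map from a descent like this means turning each range reading, taken at a known tether depth and sensor angle, into a point in a common coordinate frame. The snippet below is a hypothetical sketch of that geometry; NASA has not published VolcanoBot's sensing payload in this post, so the assumed inputs (depth, sweep angle, range to the fissure wall) are mine:

```python
import math

def fissure_points(scans):
    """Convert (depth_m, sweep_angle_deg, range_m) readings into 3D points.

    Hypothetical model: a range sensor sweeps in the horizontal plane
    while the robot descends on its tether. z is depth below the opening.
    """
    points = []
    for depth, angle_deg, rng in scans:
        theta = math.radians(angle_deg)
        x = rng * math.cos(theta)   # offset toward the wall along the sweep
        y = rng * math.sin(theta)
        z = -depth                  # negative: below the fissure opening
        points.append((x, y, z))
    return points
```

Accumulating these points over the full descent yields the kind of point-cloud map from which wall bulges become visible.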

“In order to eventually understand how to predict eruptions and conduct hazard assessments, we need to understand how the magma is coming out of the ground,” says Parcheta. “This is the first time we have been able to measure it directly, from the inside, to centimeter-scale accuracy.”

The JPL team’s next step will be to build an improved robot called VolcanoBot 2, which will have a longer tether, stronger motors, better communications, and the ability to store data in onboard memory.  It will also be smaller at only 10 in (25 cm) long with 5 in (12 cm) wheels. According to Parcheta, these modifications will allow the robot to go deeper while dragging fewer cords behind it. In addition, it will have improved cameras, which will be able to pan and tilt. VolcanoBot 2 is scheduled to be deployed at Kilauea in March.

VolcanoBot is just one example of the many possibilities robots have now enabled us to explore.  Once the tether is removed, the possibilities could be endless; who knows, they may even discover the hidden world of the Zarn.

Report From CES: Mercedes Rocks Autonomous Driving

We have seen many robotic innovations at CES this year, from GrillBots (to clean your BBQ) to wearable drones that fly off your wrist. While these are cute novelties, the one innovator that blew my mind was Mercedes-Benz.  The German car manufacturer is terrified of losing market share to Google in the new realm of driverless vehicles, so earlier this week it introduced the world to the new luxury of self-driving vehicles (also read on for its truck version).

What was even more encouraging is that Mercedes was not the only car company at CES focused on the robotic revolution.   Ford’s CEO Mark Fields stated that he expects to see self-driving cars on America’s roads by the end of the decade. Fields didn’t say if Ford would be among the first to produce an autonomous vehicle.

“Fully autonomous vehicles are a real possibility,” Fields told Automotive News. “Probably, in the next five years, you’ll see somebody introduce autonomous vehicles.”


Fields’ thunder was muffled by the Mercedes F 015 concept car on the opening day of CES (read the Car & Driver review). The company’s vision of a driverless future includes features like a steering wheel that can be completely stowed within the dashboard. It should be noted that several other automakers have vowed to have autonomous vehicles on the road within the next few years, including General Motors’ Cadillac, Nissan, and Audi.

Based upon the F 015 unveiling, Daimler-Benz is one of the only automobile companies to truly dream of a future that is fully automated.  This evolution would eliminate the need for traffic signs and could do away with pedestrian accidents entirely.  In addition to the new level of luxury and leisure (see video), the F 015 is fully programmed to deliver its occupants to their destination and find its own parking spot, potentially well outside the city, in order to keep high-population zones free of excessive traffic.

Benz is not only thinking of the consumer autonomous car market (which would dramatically improve the lives of homebound individuals), but the commercial space as well.  According to a statement made by the company earlier this year, “in 10 years trucks will be able to drive autonomously on the motorways and highways of Europe… The Mercedes-Benz Future Truck 2025 constitutes a revolution in efficiency and safety, a revolution for road traffic and its infrastructure, for professional driving and for the road transport sector.”


Like the F 015, the new truck bleeds innovation: LEDs replace traditional headlights, and cameras are mounted on side wings instead of mirrors, making the design more aerodynamic (and very cool).  Radar sensors and camera technology allow the truck to drive autonomously, creating a Highway Pilot system similar to the autopilot function used by aircraft.  Sensors positioned all around the truck register both moving and stationary objects in the vicinity to form a three-dimensional map of the surroundings.  Using this information, the steering is automatically controlled to keep the vehicle in the center of the lane, and fuel consumption is kept to a minimum as the truck self-navigates different topographies.  A feature named Blind Spot Assist employs radar sensors to monitor the sides of the truck and alert the driver to the presence of other road users who may not be immediately visible.
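The lane-centering behavior described above can be pictured as a feedback loop: measure the lateral offset from the lane center and the heading error, then steer proportionally against both. The toy controller below is my own illustrative sketch of that idea; the real Highway Pilot control laws are far more sophisticated and are not public, and all gains and limits here are made up:

```python
def lane_keeping_step(lateral_offset_m, heading_error_rad,
                      kp_offset=0.1, kp_heading=0.8, max_steer_rad=0.05):
    """Return a steering command that nudges the vehicle back to lane center.

    lateral_offset_m: distance left (+) or right (-) of the lane center line.
    heading_error_rad: angle between the vehicle heading and the lane direction.
    Gains and the steering limit are illustrative, not real parameters.
    """
    steer = -(kp_offset * lateral_offset_m + kp_heading * heading_error_rad)
    # Clamp to a small maximum angle, as a highway-speed vehicle would
    return max(-max_steer_rad, min(max_steer_rad, steer))
```

Run at sensor rate, such a loop continuously trims the steering so the perceived lane center and the vehicle's path coincide.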

Mercedes Benz Future Truck 2025

The truck also has the ability to connect with other vehicles on the road, increasing the awareness of their speed and proximity to help prevent collisions, but doesn’t require this information to drive autonomously. The design still features a driver’s cab with a steering wheel, but instruments and switches are replaced with touch-screen displays. The driver’s seat can be swiveled 45 degrees to allow the operator to perform other tasks while the vehicle is driving itself, including processing data or communicating with family and friends during long trips. Daimler Trucks demonstrated the vehicle’s autonomous driving capabilities at up to 80 kilometres per hour on a section of motorway in Germany in July.

“The challenge now is to leverage this momentum and to continue our open dialogue with all parties involved, so that in 10 years’ time the autonomously driving truck will indeed have become an accepted feature on our roads,” said Wolfgang Bernhard, head of Daimler Trucks.

Last March, Google revealed its self-driving car, which will change the definition of car ownership as well as the driving experience. Mercedes obviously does not want to be caught sleeping at the wheel, so it is building a fleet of consumer and enterprise vehicles for an autonomous future.  The big question is when these innovations will cross the chasm into mainstream adoption, from both a regulatory and a psychological perspective.  If the highway changed how Americans experienced the road in the 1950s, the robot will revolutionize the interior experience into a new social platform – sit back, relax, and enjoy the ride…

Using Our Mind To Control Robots

The holiday season is a time for miracles.  We often take simple tasks like walking, grasping, and eating for granted, as ultimately living a healthy life is a miracle.  Allow me to introduce you to Jan Scheuermann, 55, a mother of two and a successful entrepreneur who was diagnosed with spinocerebellar degeneration in 1998. As a result, Jan became a quadriplegic, losing all ability to use her limbs. Jan recently completed a study at the University of Pittsburgh to regain her independence through a robotic arm, which she calls Hector, controlled by 96 sensors in her brain (see video).

Hector, the University of Pittsburgh’s robotic arm, connects to the brain using electrodes that enable the wearer to control it with their thoughts. The original 2012 model could grasp and even do a high-five, but the new 10D model can work almost like real digits, abducting the fingers, scooping, pinching, and extending the thumb.


Jan signed up for the University of Pittsburgh’s study in 2012 and was fitted with two quarter-inch electrode grids.  Each grid has 96 contact points covering regions of her brain that are responsible for right arm and hand movements. After the electrode grids in Ms Scheuermann’s brain were connected to a computer, creating a brain-machine interface (BMI), the contact points picked up the pulses of electricity fired between neurons.

Computer algorithms were used to decode the brain signals and identify the patterns associated with a particular arm movement. The researchers then used a virtual reality computer program to calibrate Ms Scheuermann’s control over the robotic arm. Following this training, Ms Scheuermann was able to make the robotic arm reach for objects, move in a number of directions, and flex and rotate the wrist, simply by thinking the movements.  It also enabled her to give the researchers a high-five and even feed herself a chocolate bar (which is a major feat!).
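The decoding step is commonly modeled in BMI work as a linear map: subtract each channel's baseline firing rate, then multiply the centered rate vector by a weight matrix fitted during calibration to get a movement command for each controlled dimension. The sketch below illustrates that general idea; the study's actual algorithm and fitted parameters are in the Journal of Neural Engineering paper, and every number here is illustrative:

```python
def decode_velocity(firing_rates, weights, baseline):
    """Linear decoder: velocity[i] = sum_j weights[i][j] * (rate[j] - baseline[j]).

    firing_rates: per-channel spike rates from the electrode grid.
    weights: one row of per-channel weights for each movement dimension
             (e.g. 10 rows for 10D control), fit during VR calibration.
    baseline: per-channel resting rates subtracted before decoding.
    """
    centered = [r - b for r, b in zip(firing_rates, baseline)]
    return [sum(w * c for w, c in zip(row, centered)) for row in weights]
```

In a real system this would run many times per second on all 96 channels per grid, with the output velocities driving the arm's joints.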


The new and improved Hector offers a range of motions that enables Jan to pick up, grasp and move more objects, more precisely than before.  It is hoped the results of Jan’s study, published in the Journal of Neural Engineering, can build on previous demonstrations and eventually allow robotic arms to restore natural arm and hand movements in people with upper limb paralysis.

Co-author of the study Dr Jennifer Collinger said, “10D control allowed Jan to interact with objects in different ways, just as people use their hands to pick up objects depending on their shapes and what they intend to do with them. We hope to repeat this level of control with additional participants and to make the system more robust, so that people who might benefit from it will one day be able to use brain-machine interfaces in daily life. We also plan to study whether the incorporation of sensory feedback, such as the touch and feel of an object, can improve neuroprosthetic control.”

Jan Scheuermann added,  “This has been a fantastic, thrilling, wild ride, and I am so glad I’ve done this. This study has enriched my life, given me new friends and co-workers, helped me contribute to research and taken my breath away. For the rest of my life, I will thank God every day for getting to be part of this team.”

Jan’s story is just one of the many ways today’s robotic age is actively improving the lives of our most vulnerable citizens.  As we gather during these weeks with family and friends, we should remember to celebrate every blessing we have been given and hope that we can continue to share in these celebrations for years to come.  Happy 2015!