In the Star Trek episode "The Menagerie," Captain Christopher Pike (not Kirk) lands on a distant planet and falls in love with a woman, Vina, who has been captured by aliens that communicate telepathically. Pike attempts to rescue her but quickly learns that her beauty is merely an illusion created by the aliens. Later in the episode, Pike is injured and becomes as disfigured as Vina. Now confined to a wheelchair, the Captain returns to the planet to reunite with his true love under the cover of the aliens' ruse.

A team of researchers at the Swiss Federal Institute of Technology's Defitech Foundation Chair in Brain-Machine Interface in Lausanne, Switzerland, is working on creating the ultimate joystick: a brain-to-computer interface that could enable disabled people to control robots with their minds. According to the project lead, Professor José del R. Millán, "we have been developing brain-computer interfaces for people who suffer different kinds of motor disabilities so that they can translate their mental intentions into commands for the robots."
The project, which has been underway for a year, has tested the interface with nine people who are disabled and 10 able-bodied people across Italy, Germany and Switzerland. First, the user has to train to communicate with the robot: a particular thought activates a particular area of the brain, so to steer the robot left or right, the user must learn to think in a specific way. These electrical brain signals are picked up by a non-invasive cap fitted with electrodes. After learning to use the cap, the users then controlled, in real time, a telepresence robot located in a laboratory in Switzerland while remaining in their own homes, sometimes in different countries.

The robot, still in its early stages, consists of a laptop on a wheeled frame. A camera lets the user see the environment around the robot, while a display shows the user's face via Skype, letting the user hold conversations with people at the robot's location (imagine if this technology could work with Vgo or iRobot's Cisco telepresence device). Additionally, the robot is fitted with sensors that detect the proximity of objects in the room, allowing it to avoid collisions on its own without being micromanaged by the user.

So far, the laboratory is reporting a 100 percent success rate. Users were able to direct the robot from room to room from the very first trial, said Professor Millán. "But then we went over to compare their performance against 10 people without any kind of motor disabilities, and we saw that their performance was essentially the same." Each of the disabled users was able to easily control the telepresence robot with less than 10 days of training.
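The control scheme described above can be pictured as a simple "shared control" loop: the user supplies coarse steering intentions decoded from EEG signals, and the robot's proximity sensors override those intentions when a collision is imminent. The sketch below illustrates that idea only; the function names, classifier scores, and distance thresholds are illustrative assumptions, not the team's actual implementation.

```python
def decode_intent(eeg_scores):
    """Pick the user's command from per-class classifier scores
    (e.g. probabilities produced by an EEG classifier)."""
    return max(eeg_scores, key=eeg_scores.get)

def shared_control(eeg_scores, proximity_cm, safety_cm=30):
    """Combine the decoded intent with a sensor-based collision override."""
    intent = decode_intent(eeg_scores)
    # If an obstacle lies ahead and the user is driving toward it,
    # the robot autonomously steers toward the clearer side.
    if intent == "forward" and proximity_cm["front"] < safety_cm:
        return "left" if proximity_cm["left"] > proximity_cm["right"] else "right"
    return intent

# Example: the user imagines "forward", but a wall sits 20 cm ahead,
# so the robot swerves toward the more open left side on its own.
command = shared_control(
    {"left": 0.2, "right": 0.1, "forward": 0.7},
    {"front": 20, "left": 80, "right": 40},
)
print(command)  # → left
```

The division of labor is the point: the user never has to micromanage every wheel movement, which is what makes the interface practical with only a noisy, low-bandwidth EEG signal.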
The project, part of the European Commission-funded Tools for Brain-Computer Interface project, is still in the testing phase and is not yet available to users. The team hopes to bring the technology to the public, but some hurdles remain. "We would like to see this technology at the user's site, not confined to the laboratory," Professor Millán said. "For this to happen, insurance companies will have to help finance these technologies."
Unlike Star Trek, this telepathic control is no illusion but a real quality-of-life improvement, one that uses robots to augment the abilities of people with disabilities.