Wouldn't it be cool to command an entire army of enslaved robots just by thinking their marching orders? Honda may not be going quite that far, but one of its research and development teams claims to have created a new thought-based interface for sending instructions to an ASIMO robot. Honda Research Institute Japan has partnered with Advanced Telecommunications Research and Shimadzu Corporation to develop the system, which may have a wide range of applications.
Hopefully, this story has nothing to do with the upcoming April Fools' Day. Since the groups published research on controlling a robotic hand with a functional magnetic resonance imaging (fMRI) scanner three years ago, we decided to run with the story today.
The engineering team abandoned fMRI for the time being because the machine's "large size and powerful magnetic field" put severe limitations on where and when the technology could be used. The study then focused on a new brain-machine interface based on electroencephalography (EEG), which measures the electrical activity produced in the brain by firing neurons. Honda and the team also added near-infrared spectroscopy (NIRS) to their latest interface. NIRS tracks changes in blood hemoglobin caused by neural activity and has been used in medical fields such as cognitive psychology for the past two decades.
In the end, the newly created, non-invasive interface pairs the sensors with software that interprets the data and translates it into commands coming directly from the brain. After EEG and NIRS sensors are placed on the user's scalp, the individual is asked to think about moving either the left hand, right hand, tongue, or feet. The computer detects the resulting change in blood flow and electrical activity in the brain, then moves the corresponding part of the ASIMO robot.
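Honda hasn't published how its software maps brain readings to robot commands, but the pipeline described above can be sketched in miniature. The snippet below is purely illustrative: it assumes each trial is reduced to a small feature vector (made-up EEG power and NIRS hemoglobin-change values) and uses a simple nearest-centroid rule to pick one of the four imagined movements; the centroid values and the `move:` command format are invented for the example.

```python
# Illustrative sketch only -- Honda's actual algorithm is unpublished.
# Assumption: each trial yields a 4-element feature vector, e.g.
# [EEG power ch1, EEG power ch2, NIRS delta-Hb ch1, NIRS delta-Hb ch2].
import math

COMMANDS = ["left_hand", "right_hand", "tongue", "feet"]

# Hypothetical per-class mean feature vectors ("centroids") that a real
# system would learn from training trials for each imagined movement.
CENTROIDS = {
    "left_hand":  [0.9, 0.2, 0.8, 0.1],
    "right_hand": [0.2, 0.9, 0.1, 0.8],
    "tongue":     [0.5, 0.5, 0.9, 0.9],
    "feet":       [0.1, 0.1, 0.2, 0.2],
}

def classify(features):
    """Return the imagined movement whose centroid is nearest (Euclidean)."""
    return min(COMMANDS, key=lambda c: math.dist(features, CENTROIDS[c]))

def to_asimo_command(features):
    """Map the classified thought to a hypothetical robot instruction."""
    return f"move:{classify(features)}"

# A reading dominated by left-hand-like activity selects the left hand.
print(to_asimo_command([0.85, 0.25, 0.75, 0.15]))  # move:left_hand
```

A production system would of course replace the hand-picked centroids with a classifier trained on many labeled trials per user, which is also where the reported accuracy figure would come from.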
Honda claims the system reached up to 90% accuracy in its tests.