The Robot That Can Be Controlled By Human Brainwaves

A couple of years ago I wrote about a fascinating project by researchers at Johns Hopkins University, which aimed to control a robotic arm using nothing but our thoughts.  It sounds like science fiction, but a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have shown how much progress has been made since then.

The system monitors brain activity and detects, in real time, whether a human observer has spotted an error made by a robot.  It also measures muscle activity, so that the wearer can make hand gestures telling the robot the correct action to execute.

The system has already been put through its paces on a number of tasks, such as one involving the use of a power drill.  Crucially, it also worked with people the robot had not encountered before, making it suitable for a range of real-world settings.

“This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback,” the team say. “By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”

Human-robot interaction

Previous work in this area has resulted in systems that were able to recognize brain signals from people who had been trained to think in a very specific way, so that the system could itself be trained to recognize those signals.  Such systems have obvious flaws, however, as both the machines and we humans are pretty unreliable in how we think.

To overcome this, the MIT team focused on a specific kind of brain signal, known as ‘error-related potentials’ (ErrPs).  These naturally occur whenever we notice mistakes being made, so if the system detects an ErrP, it’s trained to stop so that it can be corrected.
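The detect-and-halt behaviour described above can be sketched in a few lines.  Note this is purely illustrative: the function names, the toy ErrP detector, and the supervision loop below are my own stand-ins, not the CSAIL team's actual code, and a real detector would be a classifier trained on EEG recordings rather than a simple amplitude threshold.

```python
def looks_like_errp(window, threshold=0.5):
    """Hypothetical ErrP detector: in the real system this would be a
    classifier trained on EEG data; here, a stand-in that flags any
    window whose mean amplitude exceeds a threshold."""
    return sum(window) / len(window) > threshold

def supervise(actions, eeg_windows):
    """Execute actions in order, halting at the first one whose EEG
    window suggests the observer spotted an error.  Returns the actions
    completed and the index where it halted (or None if it never did)."""
    done = []
    for i, (action, window) in enumerate(zip(actions, eeg_windows)):
        if looks_like_errp(window):
            return done, i   # stop here and wait for the human's correction
        done.append(action)
    return done, None
```

The key design point is that the robot pauses the moment the error signal appears, rather than completing the faulty action and fixing it afterwards.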

“What’s great about this approach is that there’s no need to train users to think in a prescribed way,” the team say. “The machine adapts to you, and not the other way around.”

The team were able to improve the performance of the Baxter robot (from Rethink Robotics) from 70% accuracy to 97% accuracy via this brainwave-controlled feedback.  The feedback was delivered via both electroencephalography (EEG) for brain activity and electromyography (EMG) for muscle activity.

“By looking at both muscle and brain signals, we can start to pick up on a person’s natural gestures along with their snap decisions about whether something is going wrong,” the team say. “This helps make communicating with a robot more like communicating with another person.”
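The division of labour between the two channels can be sketched as follows.  Again, this is a hedged illustration under my own assumptions: the gesture classifier and the fusion function are hypothetical placeholders, standing in for trained models on real EMG and EEG data.

```python
def classify_gesture(emg_window):
    """Hypothetical EMG gesture classifier: maps muscle activity to a
    spatial command.  Stand-in rule: the sign of the mean signal
    chooses a direction."""
    mean = sum(emg_window) / len(emg_window)
    return "left" if mean < 0 else "right"

def correction_command(errp_detected, emg_window):
    """Fuse the two channels: EEG flags *that* an error occurred,
    while the EMG gesture says *which* correction to make."""
    if not errp_detected:
        return None                      # no error noticed: carry on
    return classify_gesture(emg_window)  # redirect toward the gestured side
```

This reflects the nuance the team describes: EEG alone only gives a binary "something is wrong" signal, while adding EMG lets the person specify the correction spatially.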

They believe there are a number of possible use cases for the system, including providing care for the elderly or supporting people with language disorders.

“We’d like to move away from a world where people have to adapt to the constraints of machines,” they conclude. “Approaches like this show that it’s very much possible to develop robotic systems that are a more natural and intuitive extension of us.”

You can see the system in action via the video below.
