Correcting robots using brain signals and hand gestures

What if we could control robots more intuitively, using just hand gestures and brainwaves?

video (brainwaves + gestures) | videos (brainwaves) | poster | publications | news

Collaborators: Andres F. Salazar-Gomez, Stephanie Gil,
Ramin M. Hasani, Frank H. Guenther, Daniela Rus

Robots are becoming more common in settings ranging from factories and labs to classrooms and homes, yet there's still something of a language barrier when we try to communicate with them.  Instead of writing code or learning specific keywords and new interfaces, we'd like to interact with robots the way we do with other people.  This is especially important in safety-critical scenarios, where we want to detect and correct mistakes before they happen.

We use brain and muscle signals that a person naturally generates to create a fast and intuitive interface for supervising a robot.  In our experiments, the robot chooses from multiple targets for a mock drilling task.  We process brain signals to detect whether the person thinks the robot is making a mistake, and we process muscle signals to detect when they gesture to the left or right; together, this lets the person stop the robot immediately just by mentally evaluating its choices, and then indicate the correct choice by scrolling through the options with gestures.
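
To make the flow concrete, here is a minimal Python sketch of how the two signal streams might be combined into a supervisory loop; the classifier stubs (detect_errp, classify_gesture), the target list, and the printed messages are illustrative placeholders rather than the project's actual code.

    import random

    # Illustrative stubs standing in for the trained EEG/EMG classifiers
    # described on this page; they are NOT the project's real pipelines.
    def detect_errp(eeg_epoch):
        """Return True if the time-locked EEG epoch looks like an error-related potential."""
        return random.random() < 0.3

    def classify_gesture(emg_window):
        """Return 'left', 'right', or None for one rolling window of EMG samples."""
        return random.choice(['left', 'right', None])

    TARGETS = ['target_A', 'target_B', 'target_C']  # assumed mock drilling targets

    def supervise_choice(initial_choice, eeg_epoch, emg_stream):
        """One round of supervision: veto the robot's choice via EEG,
        then scroll to the correct target with EMG gestures."""
        choice = initial_choice
        if detect_errp(eeg_epoch):          # the person thinks this choice is wrong
            print('ErrP detected: stopping the robot')
            for emg_window in emg_stream:   # the person scrolls through the options
                gesture = classify_gesture(emg_window)
                if gesture == 'left':
                    choice = TARGETS[(TARGETS.index(choice) - 1) % len(TARGETS)]
                elif gesture == 'right':
                    choice = TARGETS[(TARGETS.index(choice) + 1) % len(TARGETS)]
        return choice

    print(supervise_choice('target_B', eeg_epoch=None, emg_stream=[None] * 5))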

Photos by Joseph DelPreto, MIT CSAIL

Two neural network pipelines classify the brain and muscle signals. One classifies brain signals (EEG) to detect naturally occurring error-related potentials at the moment the robot makes a choice, and the other continuously classifies forearm muscle activity (EMG) to detect left or right gestures at any time. Both networks are trained only on data from previous users, so a person who hasn't used the system before can immediately start controlling the robot without any additional training.
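
As a rough sketch of what such a two-pipeline setup could look like (not the networks from the papers), the snippet below defines one small convolutional classifier per signal stream; the channel counts, window lengths, and layer sizes are assumptions made only for illustration.

    import torch
    import torch.nn as nn

    # Assumed signal dimensions, chosen only for illustration.
    EEG_CHANNELS, EEG_SAMPLES = 48, 200   # one epoch time-locked to the robot's choice
    EMG_CHANNELS, EMG_SAMPLES = 8, 100    # one short rolling window of forearm activity

    def make_classifier(n_channels, n_samples, n_classes):
        """A small 1-D convolutional classifier over a multichannel time window."""
        return nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=5), nn.ReLU(),
            nn.Conv1d(16, 16, kernel_size=5), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(16 * (n_samples - 8), n_classes),
        )

    errp_net = make_classifier(EEG_CHANNELS, EEG_SAMPLES, n_classes=2)     # error / no error
    gesture_net = make_classifier(EMG_CHANNELS, EMG_SAMPLES, n_classes=3)  # left / right / rest

    # The EEG network runs once per robot decision on a single time-locked epoch;
    # the EMG network runs continuously on overlapping windows of the live stream.
    eeg_epoch = torch.randn(1, EEG_CHANNELS, EEG_SAMPLES)
    emg_window = torch.randn(1, EMG_CHANNELS, EMG_SAMPLES)
    print(errp_net(eeg_epoch).softmax(dim=1))
    print(gesture_net(emg_window).softmax(dim=1))

Training only on data from earlier users, as described above, would then amount to fitting both networks on pooled recordings from previous sessions and freezing them before a new person uses the system.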

Poster presented at RSS 2018: Supervise robots using brainwaves and hand gestures

Using brain signals to detect when you think the robot made a mistake

This portion of the project aims to let people detect robot mistakes with nothing more than their brain signals. An EEG cap measures brain activity, and a classifier detects error-related potentials (ErrPs). These ErrPs occur naturally in the brain whenever we notice something going wrong, so they can be used to stop a robot whenever the person thinks it's making the wrong choice.
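
A minimal sketch of the detection step might look like the following, with an assumed sampling rate, filter band, and epoch length, and a toy threshold rule standing in for the trained ErrP classifier:

    import numpy as np
    from scipy.signal import butter, filtfilt

    FS = 256          # assumed EEG sampling rate (Hz)
    EPOCH_S = 0.8     # assumed epoch length after the robot's choice (s)

    def extract_epoch(eeg, event_sample, fs=FS, epoch_s=EPOCH_S):
        """Band-pass filter the EEG and cut an epoch time-locked to the event."""
        b, a = butter(4, [1.0, 40.0], btype='bandpass', fs=fs)
        filtered = filtfilt(b, a, eeg, axis=1)   # eeg has shape (channels, samples)
        return filtered[:, event_sample:event_sample + int(epoch_s * fs)]

    def looks_like_errp(epoch, threshold=3.5):
        """Toy stand-in for a trained ErrP classifier: flag a large deflection
        in the channel-averaged waveform relative to its own variability."""
        erp = epoch.mean(axis=0)                 # average over channels
        z = (erp - erp.mean()) / erp.std()
        return bool(np.abs(z).max() > threshold)

    eeg = np.random.randn(48, 5 * FS)                  # 5 s of fake 48-channel EEG
    epoch = extract_epoch(eeg, event_sample=2 * FS)    # robot made its choice at t = 2 s
    print('ErrP suspected:', looks_like_errp(epoch))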

This project in particular allows a human operator to supervise a robot as it chooses between two targets for a sorting or selection task – if the person simply thinks it's choosing wrongly, the robot immediately switches to the other option. An interesting result is that a stronger ErrP signal is generated when the classifier makes a mistake: the robot seemingly ignores the person's feedback, so the person is probably more surprised and engaged. Using this ‘secondary interactive error-related potential’ generated during the closed-loop robot task could therefore improve classification performance in the future, and it suggests new ways in which robots can acquire human feedback.
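
One way the closed-loop switching and the secondary ErrP check could be wired together is sketched below; the two detector callbacks are hypothetical and simply stand in for classifier outputs:

    def choose_target(initial, detect_primary_errp, detect_secondary_errp):
        """Binary selection task: switch to the other target if either the
        primary ErrP (at the robot's choice) or the stronger secondary ErrP
        (after the robot seemingly ignores the feedback) is detected."""
        other = {'left': 'right', 'right': 'left'}
        choice = initial
        if detect_primary_errp():        # person disagrees with the initial choice
            choice = other[choice]
        elif detect_secondary_errp():    # primary ErrP was missed by the classifier;
            choice = other[choice]       # the stronger secondary ErrP catches it
        return choice

    # Example: the classifier misses the primary ErrP but catches the secondary one.
    print(choose_target('left',
                        detect_primary_errp=lambda: False,
                        detect_secondary_errp=lambda: True))   # -> 'right'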

Media Version | Conference Version (ICRA 2017)

Publications

  • J. DelPreto, A. F. Salazar-Gomez, S. Gil, R. Hasani, F. H. Guenther, and D. Rus, “Plug-and-Play Supervisory Control Using Muscle and Brain Signals for Real-Time Gesture and Error Detection,” Autonomous Robots, 2020. doi:10.1007/s10514-020-09916-x
    [BibTeX] [Abstract] [Download PDF]

    Effective human supervision of robots can be key for ensuring correct robot operation in a variety of potentially safety-critical scenarios. This paper takes a step towards fast and reliable human intervention in supervisory control tasks by combining two streams of human biosignals: muscle and brain activity acquired via EMG and EEG, respectively. It presents continuous classification of left and right hand-gestures using muscle signals, time-locked classification of error-related potentials using brain signals (unconsciously produced when observing an error), and a framework that combines these pipelines to detect and correct robot mistakes during multiple-choice tasks. The resulting hybrid system is evaluated in a “plug-and-play” fashion with 7 untrained subjects supervising an autonomous robot performing a target selection task. Offline analysis further explores the EMG classification performance, and investigates methods to select subsets of training data that may facilitate generalizable plug-and-play classifiers.

    @article{delpreto2020emgeegsupervisory,
    title={Plug-and-Play Supervisory Control Using Muscle and Brain Signals for Real-Time Gesture and Error Detection},
    author={DelPreto, Joseph and Salazar-Gomez, Andres F. and Gil, Stephanie and Hasani, Ramin and Guenther, Frank H. and Rus, Daniela},
    journal={Autonomous Robots},
    year={2020},
    month={August},
    publisher={Springer},
    doi={10.1007/s10514-020-09916-x},
    url={https://link.springer.com/article/10.1007/s10514-020-09916-x},
    abstract={Effective human supervision of robots can be key for ensuring correct robot operation in a variety of potentially safety-critical scenarios. This paper takes a step towards fast and reliable human intervention in supervisory control tasks by combining two streams of human biosignals: muscle and brain activity acquired via EMG and EEG, respectively. It presents continuous classification of left and right hand-gestures using muscle signals, time-locked classification of error-related potentials using brain signals (unconsciously produced when observing an error), and a framework that combines these pipelines to detect and correct robot mistakes during multiple-choice tasks. The resulting hybrid system is evaluated in a ``plug-and-play'' fashion with 7 untrained subjects supervising an autonomous robot performing a target selection task. Offline analysis further explores the EMG classification performance, and investigates methods to select subsets of training data that may facilitate generalizable plug-and-play classifiers.}
    }

  • J. DelPreto, A. F. Salazar-Gomez, S. Gil, R. M. Hasani, F. H. Guenther, and D. Rus, “Plug-and-Play Supervisory Control Using Muscle and Brain Signals for Real-Time Gesture and Error Detection,” in Robotics: Science and Systems (RSS), 2018. doi:10.15607/RSS.2018.XIV.063
    [BibTeX] [Abstract] [Download PDF]

    Control of robots in safety-critical tasks and situations where costly errors may occur is paramount for realizing the vision of pervasive human-robot collaborations. For these cases, the ability to use human cognition in the loop can be key for recuperating safe robot operation. This paper combines two streams of human biosignals, electrical muscle and brain activity via EMG and EEG, respectively, to achieve fast and accurate human intervention in a supervisory control task. In particular, this paper presents an end-to-end system for continuous rolling-window classification of gestures that allows the human to actively correct the robot on demand, discrete classification of Error-Related Potential signals (unconsciously produced by the human supervisor’s brain when observing a robot error), and a framework that integrates these two classification streams for fast and effective human intervention. The system also allows ‘plug-and-play’ operation, demonstrating accurate performance even with new users whose biosignals have not been used for training the classifiers. The resulting hybrid control system for safety-critical situations is evaluated with 7 untrained human subjects in a supervisory control scenario where an autonomous robot performs a multi-target selection task.

    @inproceedings{delpreto2018emgeegsupervisory,
    title={Plug-and-Play Supervisory Control Using Muscle and Brain Signals for Real-Time Gesture and Error Detection},
    author={DelPreto, Joseph and Salazar-Gomez, Andres F. and Gil, Stephanie and Hasani, Ramin M. and Guenther, Frank H. and Rus, Daniela},
    booktitle={Robotics: Science and Systems (RSS)},
    year={2018},
    month={June},
    doi={10.15607/RSS.2018.XIV.063},
    url={http://groups.csail.mit.edu/drl/wiki/images/d/d8/delpreto_rss2018_emg_eeg.pdf},
    abstract={Control of robots in safety-critical tasks and situations where costly errors may occur is paramount for realizing the vision of pervasive human-robot collaborations. For these cases, the ability to use human cognition in the loop can be key for recuperating safe robot operation. This paper combines two streams of human biosignals, electrical muscle and brain activity via EMG and EEG, respectively, to achieve fast and accurate human intervention in a supervisory control task. In particular, this paper presents an end-to-end system for continuous rolling-window classification of gestures that allows the human to actively correct the robot on demand, discrete classification of Error-Related Potential signals (unconsciously produced by the human supervisor's brain when observing a robot error), and a framework that integrates these two classification streams for fast and effective human intervention. The system also allows 'plug-and-play' operation, demonstrating accurate performance even with new users whose biosignals have not been used for training the classifiers. The resulting hybrid control system for safety-critical situations is evaluated with 7 untrained human subjects in a supervisory control scenario where an autonomous robot performs a multi-target selection task.}
    }

  • A. F. Salazar-Gomez, J. DelPreto, S. Gil, F. H. Guenther, and D. Rus, “Correcting Robot Mistakes in Real Time Using EEG Signals,” in 2017 IEEE International Conference on Robotics and Automation (ICRA), 2017. doi:10.1109/ICRA.2017.7989777
    [BibTeX] [Abstract] [Download PDF]

    Communication with a robot using brain activity from a human collaborator could provide a direct and fast feedback loop that is easy and natural for the human, thereby enabling a wide variety of intuitive interaction tasks. This paper explores the application of EEG-measured error-related potentials (ErrPs) to closed-loop robotic control. ErrP signals are particularly useful for robotics tasks because they are naturally occurring within the brain in response to an unexpected error. We decode ErrP signals from a human operator in real time to control a Rethink Robotics Baxter robot during a binary object selection task. We also show that utilizing a secondary interactive error-related potential signal generated during this closed-loop robot task can greatly improve classification performance, suggesting new ways in which robots can acquire human feedback. The design and implementation of the complete system is described, and results are presented for realtime closed-loop and open-loop experiments as well as offline analysis of both primary and secondary ErrP signals. These experiments are performed using general population subjects that have not been trained or screened. This work thereby demonstrates the potential for EEG-based feedback methods to facilitate seamless robotic control, and moves closer towards the goal of real-time intuitive interaction.

    @inproceedings{salazar2017eegcorrecting,
    title={Correcting Robot Mistakes in Real Time Using {EEG} Signals},
    author={Salazar-Gomez, Andres F and DelPreto, Joseph and Gil, Stephanie and Guenther, Frank H and Rus, Daniela},
    booktitle={2017 IEEE International Conference on Robotics and Automation (ICRA)},
    organization={IEEE},
    year={2017},
    month={May},
    doi={10.1109/ICRA.2017.7989777},
    url={http://groups.csail.mit.edu/drl/wiki/images/e/ec/Correcting_Robot_Mistakes_in_Real_Time_Using_EEG_Signals.pdf},
    abstract={Communication with a robot using brain activity from a human collaborator could provide a direct and fast feedback loop that is easy and natural for the human, thereby enabling a wide variety of intuitive interaction tasks. This paper explores the application of EEG-measured error-related potentials (ErrPs) to closed-loop robotic control. ErrP signals are particularly useful for robotics tasks because they are naturally occurring within the brain in response to an unexpected error. We decode ErrP signals from a human operator in real time to control a Rethink Robotics Baxter robot during a binary object selection task. We also show that utilizing a secondary interactive error-related potential signal generated during this closed-loop robot task can greatly improve classification performance, suggesting new ways in which robots can acquire human feedback. The design and implementation of the complete system is described, and results are presented for realtime closed-loop and open-loop experiments as well as offline analysis of both primary and secondary ErrP signals. These experiments are performed using general population subjects that have not been trained or screened. This work thereby demonstrates the potential for EEG-based feedback methods to facilitate seamless robotic control, and moves closer towards the goal of real-time intuitive interaction.}
    }

In the News

Brainwaves and hand gestures (EEG + EMG)

Special thanks to the MIT CSAIL communications team,
especially Adam Conner-Simons and Rachel Gordon.

  • MIT News – How to control robots with brainwaves and hand gestures
  • CSAIL News – How to control robots with brainwaves and hand gestures
  • Popular Mechanics – Humans Can Now Correct Robots With Brainwaves
  • Inverse – Brain-Reading Robots Can Now Collaborate With Humans on Multiple Choice Tests
  • TechCrunch – New system connects your mind to a machine to help stop mistakes
  • Fast Company – Oops! This MIT robot knows it made a mistake by reading human brainwaves
  • Engadget – MIT uses brain signals and hand gestures to control robots
  • New Atlas – Brain/gesture-reading system lets users stop and correct errant robots
  • BBC – The robot controlled by your thoughts
  • International Business Times – Mind control: Robots can be controlled via brainwaves, muscle gestures by humans now
  • New York Post – Mind-reading robots will exist much sooner than you think
  • Co.Design – Mind-reading robots are no longer science fiction
  • Geek – MIT Scientists Control Robots With Their Minds
  • Science News – With this new system, robots can "read" your mind
  • Digital Trends – MIT researchers develop a robot system controlled by brainwaves
  • BGR – Robots that read our minds will actually exist sooner than you think
  • VentureBeat – MIT develops a system that lets operators control robots with their minds
  • SlashGear – MIT researchers control robots with brainwaves and gestures
  • TechXplore – Controlling robots with brainwaves and hand gestures
  • Science Daily – Controlling robots with brainwaves and hand gestures
  • The Engineer – US team controls robot with brainwaves and hand gestures
  • Tech Times – This Technology Will Allow You To Control Robots With BrainWaves And Hand Gestures
  • Gadgets360 – MIT Develops Technology to Control Robots With Brainwaves and Hand Gestures
  • Express UK – Mind-reading robot controlled by humans using nothing but THOUGHTS and hand gestures
  • Yahoo! – MIT researchers develop a robot system controlled by brainwaves
  • IEEE Spectrum – Video Friday: Robot World Cup, New Co-Bots, and 1000 SpotMinis
  • Daily Mail – The remarkable mind-reading robot servant that humans can control using nothing but their THOUGHTS and hand gestures
  • RT – Power of thought and flick of a finger: MIT creates mind & gesture controlled robot (VIDEO)
  • ECN Mag – Controlling Robots With Brainwaves And Hand Gestures
  • GearBrain – Watch how this robot is controlled by brain waves and hand gestures
  • Times of India – New system lets you control robots with brain waves and hand gestures
  • DNA India – New system lets you control robots with brain waves and hand gestures
  • Express Tribune Pakistan – MIT researchers control robots using brain signals

Brain signals (EEG)

Special thanks to the MIT CSAIL communications team,
especially Adam Conner-Simons, Tom Buehler, and Jason Dorfman.

  • MIT News – Brain-controlled robots
  • CSAIL News – Brain-controlled robots
  • TechCrunch – MIT programs a robot to self-correct when a human detects a mistake
  • Mashable – Researchers use brain waves to correct robot mistakes
  • Newsweek – Mind-controlled robots make machines ‘natural extension of us’
  • Digital Trends – Scientists figure out a way to correct a robot's mistakes via brain waves
  • Huffington Post – This Mind-Controlled Robot Corrects Its Mistakes When It Senses It's Done Wrong
  • Forbes – Mind-Reading Robot Can Tell From Your Brainwaves When It's Made A Mistake
  • Fast Company – MIT’s Mind-Reading Robots Understand The One Command That Matters
  • New Scientist – Humans control robots with their minds by watching for mistakes
  • Wired – Baxter The Robot Fixes Its Mistakes By Reading Your Mind
  • NPR – Researchers Take A Step Toward Mind-Controlled Robots
  • Discover – How to Train Your Robot with Brain Oops Signals
  • Hackaday – How to telepathically tell a robot it screwed up
  • BBC – Brain Wave for Controlling Bots
  • Engadget – MIT finds an easy way to control robots with your brain
  • New Atlas – Correcting robot mistakes using mind control without thinking
  • Gizmodo – You Can Shame This Robot With Your Mind When It Screws Up
  • Inverse – Welcome to the Age of Brain-Controlled Robots
  • Financial Times – Scientists make step towards humans guiding robots by telepathy
  • Tech Times – Scientists Programming Robots To Become 'More Natural Extension Of Us'
  • The Inquirer – MIT robot uses mind control to correct errors
  • PC Mag – Researchers Show Off 'Mind-Reading' Robot
  • Quartz – MIT’s new robot reads your thoughts and knows when it made a mistake
  • CNN Business – MIT's latest robot gets embarrassed
  • Tech Republic – How MIT's new AI system lets you control a robot with your mind
  • International Business Times – Humans will soon be able to control robots with mind, says research
  • Recode – You can control this robot with your mind
  • CBC – MIT, Boston University working on brain-controlled robots
