MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has created a system that lets people control robots using nonverbal cues such as gestures, with minimal setup or calibration.
The system, called Conduct-a-Bot, detects human gestures from wearable muscle and motion sensors. Electromyography (EMG) sensors worn on the biceps and triceps detect when the upper-arm muscles are tensed, while a wireless device with EMG and motion sensors is worn on the forearm.
Machine learning processes those muscle and motion signals to classify eight predefined navigational gestures at any time, and Gaussian Mixture Models (GMMs) are continuously updated to cluster the streaming data. This allows the system to calibrate itself to each person’s signals while that person makes gestures to control the robot.
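To make the idea concrete, here is a minimal sketch of per-user calibration via a streaming GMM, assuming scikit-learn and synthetic data. The feature dimensions, window sizes, and update schedule are illustrative assumptions, not CSAIL’s actual pipeline, and real use would need a step that maps cluster indices to gesture labels.

```python
# A hedged sketch: cluster streaming EMG/motion features with a GMM that
# is periodically refit so it adapts to the current user's signals.
import numpy as np
from sklearn.mixture import GaussianMixture

N_GESTURES = 8    # eight predefined navigational gestures
FEATURE_DIM = 6   # e.g. EMG envelope + 3-axis motion features (assumed)

# One component per gesture class; warm_start=True makes each refit
# continue from the previous parameters rather than starting over.
gmm = GaussianMixture(n_components=N_GESTURES, warm_start=True, max_iter=20)

buffer = []       # rolling window of recent, unlabeled feature vectors

def on_new_sample(features: np.ndarray):
    """Accumulate streaming features; periodically refit, then classify."""
    buffer.append(features)
    if len(buffer) < 50:                 # wait for enough data to calibrate
        return None
    if len(buffer) % 25 == 0:            # refit periodically, not per sample
        gmm.fit(np.asarray(buffer[-200:]))  # bounded rolling window
    # Returns a cluster index; mapping clusters to named gestures is a
    # separate step omitted here.
    return int(gmm.predict(features.reshape(1, -1))[0])

# Example: feed synthetic samples as if streaming from the wearable.
rng = np.random.default_rng(0)
for _ in range(100):
    cluster = on_new_sample(rng.normal(size=FEATURE_DIM))
```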
The wearable pairs its sensors with plug-and-play algorithms and builds on an expandable vocabulary for communicating with robot assistants or other electronic devices. Additional scenarios and evaluations could be created with more users and robots; a sketch of such a vocabulary follows.
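The sketch below shows one plausible shape for an expandable gesture vocabulary: a plain mapping from recognized gestures to commands that new entries, or new devices, can extend without touching the recognition code. The gesture names and commands are hypothetical.

```python
# A hypothetical gesture-to-command vocabulary; names are illustrative.
from typing import Callable

def move_forward() -> None: print("robot: move forward")
def turn_left() -> None:    print("robot: turn left")

VOCABULARY: dict[str, Callable[[], None]] = {
    "forearm_tense": move_forward,
    "wrist_left": turn_left,
}

# Extending the vocabulary to another device is a one-line addition.
VOCABULARY["fist_clench"] = lambda: print("drone: hover")

def dispatch(gesture: str) -> None:
    """Route a recognized gesture to its command; ignore unknown ones."""
    action = VOCABULARY.get(gesture)
    if action:
        action()

dispatch("wrist_left")   # -> robot: turn left
```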