
Video: Wearable can control robots through gesture recognition

30 April 2020

MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has created a system that allows robots to be controlled using nonverbal cues, such as gestures, with minimal setup or calibration.

The system, called Conduct-a-Bot, detects human gestures using wearable sensors: electromyography (EMG) muscle sensors worn on the biceps and triceps detect when the upper-arm muscles are tensed, and a wireless device with EMG and motion sensors is worn on the forearm.

Machine learning algorithms process the muscle and motion signals to classify eight predefined navigational gestures in real time, while Gaussian Mixture Models (GMMs) are continuously updated to cluster the streaming data. This allows the system to calibrate itself to each person’s signals as they make gestures to control robots.
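
The article does not publish the underlying code, but the described idea, windowed sensor features clustered online with a GMM so the model adapts to each wearer, can be sketched roughly as below. This is a minimal illustration under stated assumptions, not CSAIL’s implementation: the feature dimension, gesture labels, refit schedule, and class names are all hypothetical.

```python
# A minimal sketch of the adaptive-clustering idea described above -- not
# CSAIL's published Conduct-a-Bot code. The feature dimension, gesture
# labels, and refit schedule are assumptions for illustration.
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical gesture vocabulary; a real system would map cluster
# indices to labels during a short calibration phase.
GESTURES = ["forward", "back", "left", "right",
            "up", "down", "stop", "hover"]

class StreamingGestureClusterer:
    def __init__(self, n_gestures=8, refit_every=50):
        self.gmm = GaussianMixture(n_components=n_gestures,
                                   covariance_type="diag", random_state=0)
        self.buffer = []
        self.refit_every = refit_every
        self.fitted = False

    def update(self, features):
        """Buffer one windowed EMG/motion feature vector and periodically
        refit the GMM so the clusters track this wearer's signals."""
        self.buffer.append(features)
        if len(self.buffer) >= self.refit_every:
            self.gmm.fit(np.asarray(self.buffer))
            self.fitted = True
            # Keep half the window so the clusters evolve smoothly.
            self.buffer = self.buffer[-self.refit_every // 2:]

    def classify(self, features):
        """Return the label of the most likely cluster, or None before
        the first fit."""
        if not self.fitted:
            return None
        idx = int(self.gmm.predict(np.asarray(features).reshape(1, -1))[0])
        return GESTURES[idx]

# Synthetic data standing in for streamed sensor features
# (e.g. 4 EMG channels + 2 motion features per window).
rng = np.random.default_rng(0)
clf = StreamingGestureClusterer()
for _ in range(120):
    clf.update(rng.normal(size=6))
print(clf.classify(rng.normal(size=6)))
```

Refitting on a sliding buffer is what lets the clusters drift toward each individual user’s signal statistics, which is the self-calibration behavior the researchers describe.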

The wearable combines these sensors with plug-and-play algorithms and builds on an expandable vocabulary for communicating with robot assistants or other electronic devices. Additional scenarios and evaluations could be created with more users and robots.
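
To give a concrete sense of what an expandable gesture vocabulary could look like, the hypothetical sketch below maps recognized gesture labels to robot commands, so new gestures can be added without changing the recognition pipeline. All gesture and command names here are illustrative assumptions, not part of the published system.

```python
# Hypothetical gesture-to-command dispatch table; names are illustrative.
from typing import Callable, Dict

vocabulary: Dict[str, Callable[[], None]] = {
    "clench_fist": lambda: print("robot: stop"),
    "wave_left":   lambda: print("robot: turn left"),
    "wave_right":  lambda: print("robot: turn right"),
    "tense_arm":   lambda: print("robot: move forward"),
}

# Extending the vocabulary is a dictionary insert -- the dispatch
# layer needs no retraining.
vocabulary["rotate_wrist"] = lambda: print("drone: ascend")

for gesture in ("wave_left", "rotate_wrist"):  # e.g. classifier output
    vocabulary[gesture]()
```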

To contact the author of this article, email PBrown@globalspec.com

