
The Next Step in Artificial Intelligence: a Keyboard that Can Gauge Emotions

27 December 2016

How many times has a written communication been misinterpreted—sometimes with comedic results? The exchange could just as easily have ended in anger or disappointment. A new keyboard developed by three students at the Jacobs Technion-Cornell Institute, part of Cornell University in New York, may help alleviate written misunderstandings by reading—and, eventually, conveying to the person on the other side—the user’s emotional state as he or she types the message. The technology represents the next phase in artificial intelligence (AI) and machine learning—algorithms that can process data and learn from the information received.

How It Works

Jacobs Technion-Cornell Institute students and application developers Claire Opila, Hsiao-Ching Lin and Huai-Che Lu (left to right)

The application, called Keymochi and developed for both Android and iOS phones, collects information on behaviors such as typing speed, punctuation changes, the amount of phone movement and a rough sentiment analysis of the words typed to assess emotions. Users can also choose one of 16 pictures to indicate their mood. The data is then encrypted and transmitted to a database, where the team builds a user-specific, machine-learning algorithm. The data is sent anonymously, and the application does not store the content of the message—just the signals used to gauge sentiment.

User-specific algorithms are necessary because not everyone conveys emotions the same way while typing. So far, using data collected from the three developers themselves, the application can report emotions with 82% accuracy.
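To make the idea concrete, here is a minimal sketch of the kind of per-user model described above: a handful of derived typing signals (speed, punctuation, device motion, rough word sentiment) fed to a simple classifier trained only on that user's labeled samples. The feature names, the nearest-centroid approach, and all values below are illustrative assumptions, not the Keymochi team's actual code; note that, as in the app, only derived signals appear—never the message text.

```python
# Hypothetical per-user mood model, NOT the actual Keymochi implementation.
# Features mirror the signals named in the article; the classifier is a
# simple nearest-centroid model chosen here for illustration only.
from dataclasses import dataclass
from math import dist


@dataclass
class TypingSample:
    chars_per_sec: float  # typing speed
    punct_ratio: float    # share of keystrokes that were punctuation
    motion: float         # rough phone-movement magnitude
    sentiment: float      # crude word-sentiment score in [-1, 1]

    def features(self):
        return (self.chars_per_sec, self.punct_ratio, self.motion, self.sentiment)


class PerUserMoodModel:
    """Nearest-centroid classifier trained on one user's labeled samples."""

    def __init__(self):
        self.centroids = {}  # mood label -> running mean feature vector
        self.counts = {}     # mood label -> number of samples seen

    def train(self, sample, mood):
        n = self.counts.get(mood, 0)
        old = self.centroids.get(mood, (0.0,) * 4)
        # Incremental mean update so the model adapts as the user labels moods.
        self.centroids[mood] = tuple(
            (o * n + x) / (n + 1) for o, x in zip(old, sample.features())
        )
        self.counts[mood] = n + 1

    def predict(self, sample):
        # Pick the mood whose centroid is closest to this sample's features.
        f = sample.features()
        return min(self.centroids, key=lambda m: dist(self.centroids[m], f))


model = PerUserMoodModel()
model.train(TypingSample(6.0, 0.02, 0.1, 0.6), "happy")
model.train(TypingSample(2.5, 0.10, 0.8, -0.5), "frustrated")

# A slow, jittery, negatively worded message:
print(model.predict(TypingSample(2.8, 0.09, 0.7, -0.3)))  # → frustrated
```

Because the model is trained per user, the same typing pattern can legitimately map to different moods for different people—which is exactly why, as the article notes, a shared global model would not work.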

Potential Applications for Affective Computing

The Cornell Institute students originally developed the application to assist in the mental health field, but the technology could have even broader uses. It could help people “read” others better in situations such as customer service or tech support. It could also, potentially, help internet marketers and social media specialists better gauge sentiment and customer intent at various stages of the sales funnel.

The technology could even enable smart-home systems to gauge a resident’s mood when he or she returns home and adjust the lights and entertainment systems accordingly. For instance, on the train ride home, your smartphone detects that you have had a bad day—based not on the words in the texts you have sent, but on the way you typed those words. When you get home, your streaming music system plays your favorite song and the lights in the entryway and kitchen shine a little brighter, in an attempt to cheer you up.

The Future of Affective Computing and AI

Affective computing systems like this one, developed in parallel with other AI systems, might also help AI systems better read the humans they are serving. And although many of today’s popular AI systems—such as Amazon’s Alexa—are powered by voice control, the keyboard interface is not going away any time soon.
In the short term, at the very least, this application could help couples, friends, parents and children communicate their intent better in short texts. So when you text your teenage child, “Come home,” for the third time in an hour, he or she will know you are definitely not joking.
