How many times has a written message been misinterpreted, sometimes with comedic results? The exchange could just as easily have ended in anger or disappointment. A new keyboard developed by three students at the Jacobs Technion-Cornell Institute, part of Cornell University in New York, may help alleviate such misunderstandings by reading the user's emotional state as he or she types, and eventually conveying that state to the person on the other side. The technology represents the next phase in artificial intelligence (AI) and machine learning: algorithms that process data and learn from the information they receive.
How It Works
The application, called Keymochi and developed for both Android and iOS phones, collects behavioral signals such as typing speed, punctuation changes, and the amount of phone movement, along with a rough sentiment analysis of the words typed, to assess emotions. Users can also choose one of 16 pictures to indicate their mood. The data is then encrypted and transmitted to a database, where the team builds a user-specific, machine-learning model. The data is sent anonymously, and the application does not store the content of the message, just the signals used to gauge sentiment.
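Keymochi's exact feature set and code are not public, but the pipeline described above can be sketched in a few lines: raw typing signals are reduced to a numeric feature vector that a machine-learning model can consume. The field names and formulas below are illustrative assumptions, not the app's actual schema.

```python
from dataclasses import dataclass

@dataclass
class TypingSample:
    """One typing session's raw signals (illustrative, not Keymochi's schema)."""
    chars_typed: int
    duration_sec: float
    backspace_count: int      # a proxy for punctuation and text corrections
    punctuation_edits: int
    accel_variance: float     # phone-movement proxy from the accelerometer
    sentiment_score: float    # rough lexicon-based polarity, -1.0 to 1.0

def to_feature_vector(s: TypingSample) -> list:
    """Turn raw signals into the kind of numeric feature vector
    a per-user emotion classifier could be trained on."""
    typing_speed = s.chars_typed / s.duration_sec if s.duration_sec > 0 else 0.0
    correction_rate = s.backspace_count / max(s.chars_typed, 1)
    return [typing_speed, correction_rate, s.punctuation_edits,
            s.accel_variance, s.sentiment_score]

sample = TypingSample(chars_typed=120, duration_sec=60.0, backspace_count=6,
                      punctuation_edits=2, accel_variance=0.4, sentiment_score=-0.3)
print(to_feature_vector(sample))  # [2.0, 0.05, 2, 0.4, -0.3]
```

Note that only these derived numbers, not the message text itself, would need to leave the phone, which matches the privacy design described above.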
User-specific models are necessary because not everyone conveys emotion the same way while typing. So far, using data collected from the three developers, the application can identify emotions with 82% accuracy.
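One simple way to see why per-user models work is a nearest-centroid classifier trained only on one person's labeled sessions: the mood picture the user picks supplies the label, and predictions are made against that user's own averages. This is a minimal sketch of the idea, not Keymochi's actual algorithm, and the feature values and mood labels are made up for illustration.

```python
import math
from collections import defaultdict

class PerUserEmotionModel:
    """A minimal nearest-centroid classifier, one instance per user.
    Illustrative only; Keymochi's actual model is not specified."""

    def __init__(self):
        self.sums = {}                   # mood label -> summed feature vector
        self.counts = defaultdict(int)   # mood label -> number of samples

    def add_sample(self, features, mood_label):
        """Record one typing session; mood_label comes from the
        picture the user picked to describe his or her mood."""
        if mood_label not in self.sums:
            self.sums[mood_label] = [0.0] * len(features)
        for i, v in enumerate(features):
            self.sums[mood_label][i] += v
        self.counts[mood_label] += 1

    def predict(self, features):
        """Return the mood whose centroid is closest to this session."""
        best, best_dist = None, math.inf
        for label, total in self.sums.items():
            centroid = [t / self.counts[label] for t in total]
            dist = math.dist(features, centroid)
            if dist < best_dist:
                best, best_dist = label, dist
        return best

# Each user gets his or her own model instance.
model = PerUserEmotionModel()
model.add_sample([3.1, 0.02, 0.1], "calm")
model.add_sample([1.2, 0.15, 0.9], "frustrated")
print(model.predict([1.3, 0.12, 0.8]))  # frustrated
```

Because the centroids are built from one person's data alone, a fast, jittery typist and a slow, deliberate one end up with different decision boundaries, which is exactly the point of training per user.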
Potential Applications for Affective Computing
The Cornell Institute students originally developed the application to assist in the mental health field, but the technology could have even broader uses. It could help people "read" others better in situations such as customer service or tech support. It could also, potentially, help internet marketers and social media specialists better gauge sentiment and customer intent at various stages of the sales funnel.
The Future of Affective Computing and AI
Affective computing systems like this one, developed in parallel with other AI technologies, might help those systems better read the humans they serve. And although many of today's popular AI assistants, such as Amazon's Alexa, are voice controlled, the keyboard interface is not going away any time soon.