Researchers at the University of Colorado Boulder knew that machine learning systems can recognize objects in images. They wondered: can such systems also recognize the emotions those images evoke? The team built a machine learning system that connects images with emotions and can even categorize movies by genre. Their work, part brain-imaging study and part machine learning innovation, is a step toward building emotion into neural networks.
Before the study, the team catalogued the stereotypical emotional responses people have to certain kinds of images. They then took an existing neural network, AlexNet, which helps computers recognize objects in images, and recalibrated it to predict how a person would feel when viewing an image. The new network was named EmoNet.
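In machine learning terms, this kind of recalibration is transfer learning: keep the network's pretrained visual features and replace its final object-recognition layer with a new emotion classifier. The sketch below illustrates the idea in PyTorch, assuming the article's 20 emotion categories; the frozen layers, optimizer settings, and training data shown here are hypothetical, not the paper's exact recipe.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_EMOTIONS = 20  # the article's 20 emotion categories

# Load AlexNet pretrained for object recognition.
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)

# Freeze the pretrained visual features so only the new head is trained
# (an assumption; the original study's training details may differ).
for param in model.parameters():
    param.requires_grad = False

# Replace the final 1000-way object classifier with a 20-way emotion head.
model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_EMOTIONS)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.classifier[6].parameters(), lr=1e-4)

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One hypothetical training step: `images` is a batch of preprocessed
    photos, `labels` their human-annotated emotion categories."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```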
The team showed EmoNet 25,000 images ranging from nature scenes to erotic images. The network was asked to organize the images into 20 emotion-based categories, like craving, sexual desire, horror, awe and surprise.
EmoNet could accurately and consistently categorize 11 of the emotion types. The system was better at recognizing some emotions than others: it was 95% accurate at identifying craving and sexual desire, but it struggled with confusion, awe, and surprise. When asked to categorize movies by genre, EmoNet was correct about three-quarters of the time. It could even assign an emotion to a single-color screen; shown a black screen, for example, it returned anxiety. Even the smallest details affected how EmoNet categorized an image. When shown an image of a puppy, EmoNet categorized it as amusement, but when the image contained two puppies, it categorized the image as romance.
The team then tested EmoNet against human subjects. Eighteen participants underwent functional MRI scans that measured their brain activity as they viewed four-second flashes of 112 images. EmoNet was shown the same images, acting in effect as the study's 19th subject. The comparison showed that patterns of activity in EmoNet's units corresponded to patterns of activity in the participants' brains.
"We found a correspondence between patterns of brain activity in the occipital lobe and units in EmoNet that code for specific emotions. This means that EmoNet learned to represent emotions in a way that is biologically plausible, even though we did not explicitly train it to do so," said lead author Philip Kragel, a postdoctoral research associate at the Institute of Cognitive Science.
The team hopes EmoNet could one day help people digitally screen out negative images and replace them with positive ones. It could also improve how humans and computers interact and advance research on emotion.
The paper was published in Science Advances.