
"EmoNet" Tells How You Feel Just by Looking at Images You're Seeing

Olena Yakobchuk / Shutterstock.com

Feelings and emotions, along with the capacities our brains have developed to manage them, like empathy, sympathy, and compassion, have helped humans cement social behavior and build complex societies.

Generally speaking, there’s a fellow feeling between people. We understand what others might be feeling based on shared experiences. It’s like: “I get it! I’ve been there!”

A war photo depicting harrowing scenes doesn’t have the same emotional impact as, let’s say, a picture of butterflies flying in a colorful meadow.

And just as your brain knows the difference, it turns out that machine learning can, too.

EmoNet Can Tell How You Feel

Machine learning systems are known for their ability to decode the content of images and recognize objects, and they keep getting better as they train on larger and more varied datasets.

But how would AI fare with emotions? Would it be able to guess feelings accurately? The answer is yes, according to a team of neuroscientists from the University of Colorado Boulder.

For their experiment, the team repurposed an existing convolutional neural network architecture called AlexNet, originally designed for object recognition.

Drawing on prior research into stereotypical emotional responses to images, they adapted AlexNet to identify emotional situations rather than objects and to predict the feelings an image may trigger in a viewer.
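To make that adaptation concrete, here is a minimal sketch of what repurposing a pretrained object-recognition network for emotion categories could look like. This is not the CU Boulder team's actual code; the choice of PyTorch, the frozen layers, and the training setup are all assumptions for illustration.

```python
# Hypothetical sketch: adapting AlexNet from 1000-way object recognition
# to 20-way emotion classification (not the CU Boulder team's actual code).
import torch.nn as nn
from torchvision import models

# Load AlexNet pretrained on ImageNet object recognition.
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)

# Swap the final 1000-way object classifier for a 20-way emotion head
# (categories such as craving, horror, awe, and surprise).
NUM_EMOTIONS = 20
model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_EMOTIONS)

# One plausible transfer-learning choice: freeze the convolutional
# features and retrain only the new emotion classifier on labeled images.
for param in model.features.parameters():
    param.requires_grad = False
```

Freezing the convolutional layers would reuse the visual features learned for object recognition while retraining only the final layers on emotion labels, which mirrors the idea of repurposing an existing network for a new task.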

“A lot of people assume that humans evaluate their environment in a certain way and emotions follow from specific, ancestrally older brain systems like the limbic system,” said lead author Philip Kragel, a postdoctoral research associate at the Institute of Cognitive Science. “We found that the visual cortex itself also plays an important role in the processing and perception of emotion.”

The CU Boulder team dubbed the computational model they developed EmoNet, which they tested using a large dataset that contains 25,000 images of pretty much everything, “from erotic photos to nature scenes.” They then asked EmoNet to categorize the images “into 20 categories such as craving, sexual desire, horror, awe, and surprise.”

EmoNet accurately and consistently categorized 11 of the 20 emotion types, although it fared better with some categories than others.

For instance, EmoNet “identified photos that evoke craving or sexual desire with more than 95 percent accuracy. But it had a harder time with more nuanced emotions like confusion, awe, and surprise. Even a simple color elicited a prediction of an emotion: When EmoNet saw a black screen, it registered anxiety. Red conjured craving. Puppies evoked amusement. If there were two of them, it picked romance.”
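For a sense of how such a model would assign an emotion to a single image, here is a hedged inference sketch using the adapted model from the block above. The preprocessing values are the standard ImageNet ones that AlexNet expects, and the file name is a placeholder.

```python
# Hypothetical inference sketch: score one image against the 20 emotion
# categories with the adapted "model" defined in the previous sketch.
import torch
from PIL import Image
from torchvision import transforms

# Standard ImageNet preprocessing for AlexNet-style inputs.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model.eval()
with torch.no_grad():
    # "photo.jpg" is a placeholder path, not a file from the study.
    x = preprocess(Image.open("photo.jpg").convert("RGB")).unsqueeze(0)
    probs = torch.softmax(model(x), dim=1)  # one probability per emotion

top_category = probs.argmax(dim=1).item()  # most likely emotion index
```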

The researchers see EmoNet as an essential step toward applying brain-inspired neural networks to the study of emotion.

Systems like EmoNet that can accurately identify emotions from the visual information in images have a place in many applications, from social robots and emotional chatbots to healthcare and entertainment.

Another important finding of the study, which was "part machine-learning innovation, part human brain-imaging," is that visual stimuli, even those we see only briefly, could have a greater and swifter impact on our emotions than we might assume.

Read More: Wearable Device Provides Real-Time Insight Into People’s Emotions


Zayan Guedim

Trilingual poet, investigative journalist, and novelist. Zed loves tackling the big existential questions and all-things quantum.
