
Researchers are Planning to Introduce Emotion Recognition to VR Gaming

Researchers from Yonsei University are proposing a new approach that would allow developers to integrate emotion recognition on VR gaming consoles.

Image courtesy of Shutterstock

The advent of Virtual Reality (VR) ushered in an era of video game development unlike any we've seen before. Not only are games getting increasingly realistic and interactive, but players now enjoy a far more immersive experience too.

Thanks to VR headsets, gamers now feel as though they are almost inside the game, and issues such as latency and display resolution have become far less of a barrier to immersion.

Now, a paper presented at the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces suggests a smart way to take the VR gaming experience to the next level: integrating emotion recognition into headsets. That way, developers could create games that respond to a user's emotional state in real time.

In the paper, researchers at Yonsei University and Motion Device Inc. propose a deep-learning-based method for integrating emotion recognition into the VR gaming experience.

Using Convolutional Neural Networks For Emotion Recognition

VR works through head-mounted displays (HMDs). Users wear the headset so that the game content is presented directly in front of their eyes.

Since most machine learning models for predicting emotions work by analyzing a person's face, merging emotion recognition tools with VR gaming is challenging: in VR, part of the user's face is covered by the head-mounted display.

To overcome this challenge, the Yonsei University and Motion Device team trained three convolutional neural network (CNN) architectures to predict a gamer's emotions from a partial image of their face: DenseNet, ResNet, and Inception-ResNet-V2.

The researchers took 8,040 face images from 67 subjects and covered the parts that an HMD would occlude during VR use. That means the CNNs had to determine each user's emotion from the uncovered portions of the face alone.
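As a rough sketch of this pre-processing step, the occlusion can be simulated by blacking out a horizontal band of each face image before training. The band coordinates below are placeholder assumptions for illustration only; the actual study would mask the specific region a real HMD covers.

```python
import numpy as np

def mask_hmd_region(face: np.ndarray, top_frac: float = 0.25,
                    bottom_frac: float = 0.60) -> np.ndarray:
    """Zero out the horizontal band an HMD would cover.

    The band extent (25%-60% of image height) is a made-up
    placeholder, not the region used in the paper.
    """
    masked = face.copy()
    h = face.shape[0]
    masked[int(h * top_frac):int(h * bottom_frac), :] = 0
    return masked

# Example: a dummy 64x64 grayscale "face"
face = np.ones((64, 64), dtype=np.float32)
masked = mask_hmd_region(face)
# Rows 16..37 are zeroed; the rest of the image is untouched.
```

A CNN trained on images masked this way is forced to learn emotion cues only from the regions that remain visible.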

The deep learning models worked, with DenseNet attaining an impressive average accuracy of 90 percent.
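For context, "average accuracy" here can be read as the mean of per-class accuracies, and the best architecture is simply the one with the highest mean. A minimal sketch in Python, using made-up per-class figures (only the 90 percent DenseNet average matches the article; the number of emotion classes and every other value are placeholders):

```python
# Hypothetical per-emotion accuracies for each architecture.
# Only DenseNet's 0.90 average comes from the article; the rest
# are illustrative placeholders, not results from the paper.
results = {
    "DenseNet":            [0.93, 0.88, 0.91, 0.88],
    "ResNet":              [0.90, 0.84, 0.87, 0.85],
    "Inception-ResNet-V2": [0.89, 0.86, 0.88, 0.84],
}

averages = {name: sum(acc) / len(acc) for name, acc in results.items()}
best = max(averages, key=averages.get)
print(best, round(averages[best], 2))  # prints: DenseNet 0.9
```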

In their paper, the researchers wrote:

“Our study showed the possibility of estimating emotions from images of humans wearing HMDs using machine vision.”

The study suggests that emotion recognition could become part of VR technology in the near future. The researchers also noted that their CNNs could inspire new emotion recognition techniques that provide a richer experience for VR gamers.

Read More: How the Oculus Quest Will Change the VR Game Forever

Sumbo Bello

Sumbo Bello is a creative writer who enjoys creating data-driven content for news sites. In his spare time, he plays basketball and listens to Coldplay.
