
UCLA Researchers Develop Sign Language Translation Gloves

Photographee.eu / Shutterstock.com


A team of bioengineers from UCLA has designed an electronic glove that translates sign language into English speech via a smartphone app.

Roughly one million people use American Sign Language (ASL) as their primary way to communicate. It’s a visual gesture language that serves as the predominant sign language of deaf communities in the U.S.

Recently, American Sign Language has enjoyed more resources and attention in both pop culture and research. For example, some wearable systems now offer translation from ASL to English.

Unfortunately, these devices tend to be heavy and bulky, which makes them uncomfortable to wear.

Researchers at the University of California Los Angeles have developed an ASL translation glove that addresses this issue. The new glove uses lightweight, inexpensive polymers that are stretchable and long-lasting.

In a statement on the project, principal investigator Jun Chen said:

“Our hope is that this opens up an easy way for people who use sign language to communicate directly with non-signers without needing someone else to translate for them.”

The team described how the electronic gloves work in a paper published in the journal Nature Electronics.

Translating American Sign Language into Speech in Real-Time

To create the system, the researchers equipped a pair of gloves with thin, stretchable sensors that run the length of all five fingers.

The sensors detect hand motion and convert it into electrical signals, which travel to a circuit board worn on the wrist. In turn, the board transmits the signals wirelessly to a smartphone app that translates them into spoken words.
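The dataflow described above can be sketched in a few lines. This is purely an illustration, not the actual UCLA software: every function name, sensor value, and vocabulary entry here is hypothetical.

```python
# Illustrative sketch of the glove's dataflow: finger-sensor readings are
# digitized on the wrist board, sent to the phone, and mapped to a spoken
# word. All names and values are made up for illustration.

def sample_fingers():
    # One reading per finger; stand-in for the stretchable-sensor hardware.
    return (0.9, 0.1, 0.1, 0.1, 0.9)

def transmit(signal):
    # Stand-in for the wrist board's wireless link to the smartphone.
    return signal  # in reality a radio packet, not a direct return

def translate(signal, vocabulary):
    # The phone app looks the gesture up in its learned vocabulary.
    return vocabulary.get(signal, "<unknown sign>")

# Hypothetical one-entry vocabulary mapping a sensor pattern to a word.
vocabulary = {(0.9, 0.1, 0.1, 0.1, 0.9): "hello"}

word = translate(transmit(sample_fingers()), vocabulary)
print(word)  # prints "hello"
```

In the real device, the final step would feed the recognized word to the phone's text-to-speech engine rather than printing it.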

Aside from the sensors on the gloves, the researchers also added adhesive sensors to testers’ faces — between the eyebrows and on one side of the mouth. That way, the system can capture the facial expressions that are an essential part of ASL.

In a test, the researchers worked with four deaf volunteers who use American Sign Language. The volunteers repeated each hand gesture 14 times so that a machine-learning algorithm could learn to convert the signals into letters, numbers, and words.
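One simple way to see why repeated demonstrations help is a nearest-template recognizer: average the repetitions of each sign into a template, then match new readings to the closest one. The actual study used a more sophisticated machine-learning model; the sketch below, with invented sensor values, only illustrates the idea.

```python
# Hypothetical nearest-template recognizer: each sign's repeated recordings
# are averaged into a template, and a new reading is matched to the closest
# template. Sensor values are invented for illustration.
from statistics import mean

def make_template(repetitions):
    # Average the recorded sensor vectors for one sign, channel by channel.
    return [mean(channel) for channel in zip(*repetitions)]

# Illustrative recordings: two signs, a few noisy repetitions each
# (the volunteers in the study repeated each gesture 14 times).
recordings = {
    "A": [[0.90, 0.10, 0.10], [0.80, 0.20, 0.10], [0.85, 0.15, 0.12]],
    "B": [[0.10, 0.90, 0.80], [0.20, 0.80, 0.90], [0.15, 0.85, 0.82]],
}
templates = {sign: make_template(reps) for sign, reps in recordings.items()}

def recognize(reading):
    # Return the sign whose template is nearest (squared distance).
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda sign: dist(reading, templates[sign]))

print(recognize([0.88, 0.12, 0.10]))  # prints "A"
```

Averaging smooths out the natural variation between repetitions of the same gesture, which is why collecting many examples per sign improves recognition.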

Findings from the test suggest that the system was able to recognize 660 signs, including letters of the alphabet and the numbers 0 through 9.

UCLA has filed for a patent on the technology. However, more work is required before the device becomes commercially available.

According to Chen, the system needs to learn more words and translate faster.


Read More: New Electronic Gloves Allow Users to Sense Virtual Objects





Sumbo Bello

Sumbo Bello is a creative writer who enjoys creating data-driven content for news sites. In his spare time, he plays basketball and listens to Coldplay.
