
New Brain-Machine Interface Could Give Power of Speech to ALS Patients

Image: Geralt | Pixabay.com

Scientists may soon develop a brain-machine interface that could help people with communication disorders.

A new study by researchers from Northwestern Medicine and the Weinberg College of Arts and Sciences has shed light on how the brain controls speech. The discovery could aid the development of a brain-machine interface (BMI) that helps people with speech impairments communicate.

For years, motor diseases have left millions of people around the world paralyzed. Diseases like amyotrophic lateral sclerosis (ALS), which confined the famous theoretical physicist Stephen Hawking to a wheelchair, not only restrict the movement of affected individuals but can also leave them with speech disorders.

However, paralyzed or ‘locked-in’ individuals now have a small ray of hope, as the latest progress in brain research could lead to the creation of such BMIs. The information gathered by the Northwestern University researchers brings this once-impossible feat closer to reality.

“This can help us build better speech decoders for BMIs, which will move us closer to our goal of helping people that are locked-in speak again,” said Marc Slutzky, lead author of the study and an associate professor of neurology and physiology at Northwestern University Feinberg School of Medicine.

The researchers discovered that the brain controls speech production in much the same way it controls arm and hand movements. For their experiment, they recorded signals from two parts of the brain.

The brain signals were recorded using electrodes placed on the cortical surface. The researchers worked with patients undergoing brain tumor surgery, since they needed subjects who were awake during the operation. While on the operating table, the patients read a handful of words aloud from a screen.
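The article doesn’t say how such cortical-surface recordings are turned into decoder inputs, but a common first step in speech-decoding work is extracting the high-gamma band envelope from each electrode. The sketch below is a hypothetical illustration of that idea in Python; the channel count, sampling rate, and frequency band are assumptions for the example, not details from the study.

```python
# Hypothetical feature extraction for cortical-surface (ECoG) recordings.
# The high-gamma band envelope (roughly 70-150 Hz) is a common choice in
# speech-decoding work; the study itself does not specify its features.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def high_gamma_envelope(signal, fs=1000.0, band=(70.0, 150.0)):
    """Band-pass filter each channel and take the analytic amplitude."""
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, signal, axis=-1)   # zero-phase filtering
    return np.abs(hilbert(filtered, axis=-1))    # instantaneous amplitude

# Toy usage: 8 channels, 2 seconds of simulated data at 1 kHz.
rng = np.random.default_rng(1)
ecog = rng.normal(size=(8, 2000))
features = high_gamma_envelope(ecog)
print(features.shape)  # (8, 2000)
```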

“We studied two parts of the brain that help to produce speech. The precentral cortex represented gestures to a greater extent than phonemes. The inferior frontal cortex, which is a higher level speech area, represented both phonemes and gestures,” Slutzky explained.

The scientists’ findings revealed that the brain represents both the goals of what a person is trying to say and the individual movements used to achieve those goals, such as movements of the lips, palate, tongue, and larynx.

“We hypothesized speech motor areas of the brain would have a similar organization to arm motor areas of the brain. The precentral cortex would represent movements (gestures) of the lips, tongue, palate, and larynx, and the higher level cortical areas would represent the phonemes to a greater extent,” Slutzky said.

The team next plans to develop a machine learning algorithm that would not only enable the brain interface to decode gestures but also use the decoded gestures to form words and, eventually, speech.
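Put together, such a decoder is often framed as a two-stage pipeline: classify articulatory gestures from neural features, then map gesture sequences onto phonemes and words. Here is a minimal, hypothetical sketch of that idea in Python; the feature dimensions, gesture labels, and gesture-to-phoneme table are all invented for illustration and are not from the Northwestern study.

```python
# Hypothetical two-stage speech-BMI decoder sketch.
# Stage 1: classify articulatory gestures from neural feature frames.
# Stage 2: map the decoded gesture sequence to phonemes.
# All dimensions, labels, and mappings below are illustrative only.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

GESTURES = ["lip_closure", "tongue_raise", "palate_open", "larynx_voice"]

# --- Stage 1: gesture classifier, trained on synthetic stand-in data ---
n_trials, n_features = 400, 64          # e.g., 64 electrode features
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, len(GESTURES), size=n_trials)
for cls in range(len(GESTURES)):
    X[y == cls, cls] += 2.0             # weak per-class signature so the toy model learns

clf = LogisticRegression(max_iter=1000).fit(X, y)

# --- Stage 2: gesture sequence -> phoneme lookup (invented table) ---
GESTURE_TO_PHONEME = {
    ("lip_closure", "larynx_voice"): "b",
    ("tongue_raise", "larynx_voice"): "d",
    ("lip_closure",): "p",
}

def decode(frames):
    """Decode a sequence of neural feature frames into a phoneme string."""
    gestures = [GESTURES[g] for g in clf.predict(frames)]
    phonemes, i = [], 0
    while i < len(gestures):            # naive greedy grouping of gestures
        pair = tuple(gestures[i:i + 2])
        if pair in GESTURE_TO_PHONEME:
            phonemes.append(GESTURE_TO_PHONEME[pair]); i += 2
        elif pair[:1] in GESTURE_TO_PHONEME:
            phonemes.append(GESTURE_TO_PHONEME[pair[:1]]); i += 1
        else:
            i += 1
    return "".join(phonemes)

# Toy usage: bias one frame toward "lip_closure", the next toward
# "larynx_voice" -- the lookup table maps that pair to the phoneme "b".
demo = rng.normal(size=(2, n_features))
demo[0, 0] += 4.0   # feature signature of GESTURES[0]
demo[1, 3] += 4.0   # feature signature of GESTURES[3]
print(decode(demo))  # most likely prints "b"
```

A real system would replace the lookup table with a sequence model trained on paired neural and acoustic data, but the two-stage structure mirrors the gestures-then-phonemes organization the study describes.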

Do you believe that it’s possible to create a brain-machine interface that could decode gestures and convert them into speech?

Chelle Fuertes

Chelle is the Product Management Lead at INK. She's an experienced SEO professional as well as a UX researcher and designer. She enjoys traveling and spending time anywhere near the sea with her family and friends.
