Paralysed woman talks after brain signals turned into world-first talking avatar

Ann, 48, suffered her brainstem stroke 18 years ago (Image: Noah Berger / SWNS)

A paralysed woman whose brain signals were intercepted to create a talking avatar with facial expressions is now able to speak once again in an incredible world first.

Ann, 48, was struck down with a brainstem stroke when she was 30, leaving her paralysed and unable to speak. Eighteen years later, scientists in San Francisco carried out a world-first experiment that allows Ann to speak once again - through a tiny chip in her brain.

The scientists placed a thin rectangle of 253 electrodes on a part of her brain that is vital for speech. The electrodes intercept the ‘talking’ brain signals, which enter a computer system via a cable, plugged into a port fixed to her head.

After converting the brain signals into text at a rate of 80 words a minute, the computer uses an audio recording from Ann’s wedding to simulate her voice. It adds the voice to an avatar that includes facial expressions.

Scientists from the University of California (UC) at San Francisco and Berkeley used AI to create the brain-computer interface (BCI). They say it is the first time that speech and facial expressions have been synthesised from brain signals. The signals intercepted by the electrodes are those that would otherwise have gone to the muscles around her face and mouth.
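For readers curious how the pieces fit together, the sketch below lays out the pipeline in simplified Python: neural recordings go in, and text, synthesised audio and avatar movements come out. Every name, function and data structure here is a hypothetical stand-in for illustration, not the UCSF team's actual software.

```python
# A minimal sketch of the decoding pipeline described above, assuming a
# pre-trained decoder and voice synthesiser. All names and signatures are
# hypothetical illustrations, not the study's real code.
from dataclasses import dataclass

@dataclass
class NeuralFrame:
    """One time-step of activity from the 253-electrode array."""
    voltages: list[float]  # one value per electrode

def decode_text(frames: list[NeuralFrame]) -> str:
    """Map a window of neural activity to text (stand-in for the AI decoder)."""
    return "hello"  # placeholder output

def synthesise_voice(text: str) -> bytes:
    """Render text as audio in a voice based on a pre-injury recording."""
    return text.encode()  # placeholder waveform

def animate_avatar(frames: list[NeuralFrame]) -> dict:
    """Convert the same signals into facial-muscle movements for the avatar."""
    return {"jaw_open": 0.4, "lip_purse": 0.1}  # placeholder pose

# All three outputs are produced from one stream of brain activity.
window = [NeuralFrame(voltages=[0.0] * 253)]
text = decode_text(window)
audio = synthesise_voice(text)
pose = animate_avatar(window)
print(text, len(audio), pose)
```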



Developing the speech system took weeks of repetitive work from Ann and the team (Noah Berger / SWNS)

Dr Edward Chang, chair of neurological surgery at UCSF, has worked on this technology for over 10 years - and hopes to “restore a full, embodied way of communicating” and make it a “real solution for patients”. He hopes a system that enables speech from brain signals will be available to patients in the near future.

Ann worked with the team of scientists for weeks to help train the AI algorithms to recognise her unique brain signals for speech. She had to repeat different phrases from a wide-ranging 1,024-word conversational vocabulary over and over again - until the computer recognised the brain activity patterns associated with different sounds.

The researchers produced a system that decodes words from phonemes - the sub-units of speech which create full words in the same way letters create written words. For example, the phonemes “HH”, “AH”, “L”, and “OW” create the word “hello”.

This made the system faster and more accurate, as the computer needed to learn just 39 phonemes to decipher any English word. The developer of the text decoder, Sean Metzger of the Bioengineering Programme at UC, said: “The accuracy, speed and vocabulary are crucial. It’s what gives a user the potential, in time, to communicate almost as fast as we do, and to have much more naturalistic and normal conversations.”
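As a rough illustration of the phoneme approach, the toy example below assembles whole words from decoded phoneme labels using a small lookup dictionary. The dictionary and the decoded sequences are invented for illustration and are not the study's real vocabulary or decoder.

```python
# A toy illustration of phoneme-based decoding, assuming the decoder emits
# ARPABET-style phoneme labels. The dictionary and lookup are simplified
# stand-ins, not the study's actual system.
PHONEME_DICT = {
    ("HH", "AH", "L", "OW"): "hello",
    ("W", "ER", "L", "D"): "world",
}

def phonemes_to_word(phonemes: tuple[str, ...]) -> str:
    """Look up the word spelled by a decoded phoneme sequence."""
    return PHONEME_DICT.get(phonemes, "<unknown>")

decoded = [("HH", "AH", "L", "OW"), ("W", "ER", "L", "D")]
sentence = " ".join(phonemes_to_word(p) for p in decoded)
print(sentence)  # hello world
```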

The monitor shows the neural signals from Ann's brain - which the computer translates to sounds (Noah Berger / SWNS)

The animated avatar was made using software which simulates and animates muscle movements in the face. A machine-learning process was developed to allow the software to mesh with Ann’s brain signals.

The signals were converted into movements on the avatar’s face, including lips protruding and pursing, the tongue going up and down, and facial expressions for happiness, sadness and surprise.
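A minimal sketch of that final step might look like the following, assuming the decoder outputs a per-frame estimate for each facial movement. The parameter names and the clamping step are illustrative assumptions rather than the actual animation software used in the study.

```python
# A minimal sketch of driving an avatar face from decoded signals, assuming
# the decoder outputs per-frame articulator estimates in arbitrary units.
# Parameter names and the linear clamping are illustrative assumptions.
FACE_PARAMS = ["jaw_open", "lip_purse", "lip_protrude", "tongue_up", "smile"]

def signals_to_face(decoded: dict[str, float]) -> dict[str, float]:
    """Clamp decoded articulator values into the 0-1 range the avatar expects."""
    return {name: min(max(decoded.get(name, 0.0), 0.0), 1.0) for name in FACE_PARAMS}

frame = signals_to_face({"jaw_open": 0.7, "tongue_up": 1.3, "smile": -0.2})
print(frame)  # jaw_open stays at 0.7, tongue_up is clamped to 1.0, smile to 0.0
```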

Graduate student Kaylo Littlejohn added: “We’re making up for the connections between the brain and vocal tract that have been severed by the stroke. When the subject first used this system to speak and move the avatar’s face in tandem, I knew that this was going to be something that would have a real impact.”

The next step for the team is to create a version which doesn’t require the user to be connected to computers. A wireless version could allow those unable to speak to go about their daily lives talking through their avatar.

Ann watches on as the incredible avatar replicates facial expressions and her own voice (Noah Berger / SWNS)

Co-first author Dr David Moses, an adjunct professor in neurological surgery, said: "Giving people the ability to freely control their own computers and phones with this technology would have profound effects on their independence and social interactions."


The study is published in the journal Nature, adding to previous research in which Dr Chang’s team decoded brain signals into text from a different man who had suffered a brainstem stroke years earlier. Decoding the signals into speech and facial expressions, as they have done in the research with Ann, is a major step forward.

In a separate study published in Nature, another method was devised to allow a disabled person to ‘speak’ in text. Pat Bennett, 68, a former human resources director, equestrian and daily jogger, developed amyotrophic lateral sclerosis, a neurodegenerative disease that will eventually leave her paralysed.

A team from Stanford Medicine implanted four small sensors in her brain, which transmit signals from two speech-related regions to software that decodes the brain activity and converts it to text. After four months of training the software, her attempted words were being converted into text at 62 words per minute.

Currently Ann needs to be hooked up to the computer - the next step is to make the system wireless (Noah Berger / SWNS)

This was over three times as fast as the previous record for BCI-assisted communication. The surgeon behind the implants, Dr Jaimie Henderson, said: “We’ve shown you can decode intended speech by recording activity from a very small area on the brain’s surface.”

At 62 words per minute, the system begins to approach the roughly 160-words-per-minute pace of natural conversation. The vocabulary was expanded to 125,000 words, covering most of the English language.

But this caused a rise in the error rate to 23.8 percent. The team says this is far from perfect, but is a large step from previous attempts. One of the researchers, Dr Frank Willett, said: “This is a scientific proof of concept, not an actual device people can use in everyday life. But it’s a big advance toward restoring rapid communication to people with paralysis who can’t speak.”
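For context on what a figure like 23.8 percent means, an error rate of this kind is typically computed as the word-level edit distance between what the system produced and what the speaker intended, divided by the number of intended words. The short example below shows that calculation on made-up sentences; it is an illustration of the metric, not the study's evaluation code.

```python
# A small illustration of computing a word-level error rate: edit distance
# between the decoded and intended word sequences, divided by the number of
# intended words. The sentences below are invented for the example.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution and one deletion across six intended words: about 0.33.
print(word_error_rate("i would like some water please", "i would like sam water"))
```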

Mrs Bennett wrote: “Imagine how different conducting everyday activities like shopping, attending appointments, ordering food, going into a bank, talking on a phone, expressing love or appreciation or even arguing will be when nonverbal people can communicate their thoughts in real time.”

Jim Leffman
