Saturday, April 4, 2020

Scientists develop AI that can turn brain activity into text

Researchers in the US tracked neural data from people while they were speaking
Reading minds has just come a step closer to reality: scientists have developed artificial intelligence that can turn brain activity into text.
While the system currently works on neural patterns detected while someone is speaking aloud, experts say it could eventually aid communication for patients who are unable to speak or type, such as those with locked-in syndrome.
“We are not there yet but we think this could be the basis of a speech prosthesis,” said Dr Joseph Makin, co-author of the research from the University of California, San Francisco.

Writing in the journal Nature Neuroscience, Makin and colleagues reveal how they developed their system by recruiting four participants who had electrode arrays implanted in their brain to monitor epileptic seizures.
These participants were asked to read a set of 50 sentences aloud multiple times, including “Tina Turner is a pop singer” and “Those thieves stole 30 jewels”. The team tracked their neural activity while they were speaking.
This data was then fed into a machine-learning algorithm, a type of artificial intelligence system that converted the brain activity data for each spoken sentence into a string of numbers.
To make sure the numbers related only to aspects of speech, the system compared sounds predicted from small chunks of the brain activity data with actual recorded audio. The string of numbers was then fed into a second part of the system which converted it into a sequence of words.
At first the system spat out nonsense sentences. But as it compared each sequence of words with the sentences that were actually read aloud, it improved, learning how the string of numbers related to words and which words tend to follow each other.
The team then tested the system, generating written text just from brain activity during speech.
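The description above corresponds to a standard encoder-decoder (sequence-to-sequence) design: one network compresses the brain recordings into a latent sequence tied to speech, and a second network unrolls that sequence into words. The short Python sketch below is only an illustration of that general idea, not the researchers’ published code; the use of GRU layers, the layer sizes and the audio-feature head are assumptions made for readability.

# Minimal sketch of an encoder-decoder mapping neural recordings to word
# sequences, with an auxiliary branch that predicts speech audio features
# (the "comparison with recorded audio" the article describes).
# All layer choices and sizes are illustrative assumptions.
import torch
import torch.nn as nn

class BrainToText(nn.Module):
    def __init__(self, n_electrodes, vocab_size, hidden=256, n_audio_features=13):
        super().__init__()
        # Encoder: turns each chunk of brain activity into a "string of numbers"
        self.encoder = nn.GRU(n_electrodes, hidden, batch_first=True)
        # Auxiliary head: predicts audio features so the latent codes stay
        # tied to aspects of speech rather than unrelated brain activity
        self.audio_head = nn.Linear(hidden, n_audio_features)
        # Decoder: turns the latent sequence into a sequence of words
        self.embed = nn.Embedding(vocab_size, hidden)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.word_head = nn.Linear(hidden, vocab_size)

    def forward(self, ecog, prev_words):
        # ecog: (batch, time, n_electrodes); prev_words: (batch, n_words)
        enc_out, enc_state = self.encoder(ecog)
        audio_pred = self.audio_head(enc_out)        # compared with the recorded audio
        dec_out, _ = self.decoder(self.embed(prev_words), enc_state)
        word_logits = self.word_head(dec_out)        # compared with the sentence read aloud
        return word_logits, audio_pred

Training such a model would minimise a word-prediction loss (word_logits against the sentence actually read aloud) plus an audio-prediction loss (audio_pred against the recorded audio), which is the two-way comparison described above.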
The system was not perfect. Among its mistakes, “Those musicians harmonise marvellously” was decoded as “The spinach was a famous singer”, and “A roll of wire lay near the wall” became “Will robin wear a yellow lily”.
However, the team found the accuracy of the new system was far higher than that of previous approaches. While accuracy varied from person to person, for one participant just 3% of each sentence on average needed correcting – better than the word error rate of 5% for professional human transcribers. But, the team stress, unlike a human transcriber, the algorithm only handles a small number of sentences.
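The 3% and 5% figures are word error rates: the number of word-level substitutions, insertions and deletions needed to turn the decoded sentence into the true sentence, divided by the number of words in the true sentence. A minimal, self-contained way to compute this in Python (a standard edit-distance calculation, not the study’s own evaluation code) looks like this:

def word_error_rate(reference, hypothesis):
    """Word-level edit distance divided by the reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edits needed to turn the first i reference words
    # into the first j hypothesis words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# One of the article's decoding mistakes: no words match and the decoded
# sentence is longer than the original, so this prints 1.5 (a 150% error rate).
print(word_error_rate("those musicians harmonise marvellously",
                      "the spinach was a famous singer"))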
“If you try to go outside the [50 sentences used] the decoding gets much worse,” said Makin, adding that the system is likely relying on a combination of learning particular sentences, identifying words from brain activity, and recognising general patterns in English.
The team also found that training the algorithm on one participant’s data meant less training data was needed from the final user – something that could make training less onerous for patients.
Dr Christian Herff, an expert in the field from Maastricht University who was not involved in the study, said the research was exciting because the system used less than 40 minutes of training data for each participant, and a limited collection of sentences, rather than the millions of hours typically needed.
“By doing so they achieve levels of accuracy that haven’t been achieved so far,” he said.
However, he noted the system was not yet usable for many severely disabled patients as it relied on the brain activity recorded from people speaking a sentence out loud.
“Of course this is fantastic research but those people could just use ‘OK Google’ as well,” he said. “This is not translation of thought [but of brain activity involved in speech].”
Herff said people should not worry about others reading their thoughts just yet: the brain electrodes must be implanted, while imagined speech is very different to inner voice.
But Dr Mahnaz Arvaneh, an expert in brain-machine interfaces at Sheffield University, said it was important to consider ethical issues now. “We [are still] very, very far away from the point that machines can read our minds,” she said. “But it doesn’t mean that we should not think about it and we should not plan about it.”


