Scientists develop world’s first ‘mind-reading helmet’ that translates brainwaves into words

  • Scientists have announced the world's first mind-reading AI that's also portable
  • This technology translates brain waves into written text using sensors on the head

Scientists have developed the world's first mind-reading AI, which translates brainwaves into readable text.

It works by using a helmet covered with a sensor that monitors specific electrical activity in the brain as the wearer thinks, and converts it into words.

The technology was pioneered by a team at the University of Technology Sydney, which says it could revolutionize the care of patients rendered mute by stroke or paralysis.

The portable, non-invasive system is a milestone, offering a transformative communication solution for individuals affected by stroke or paralysis.

A demonstration video shows a person thinking about a sentence displayed on a screen, which then switches to what the AI model has decoded – and the results are almost identical.

The team also believes the innovation will allow seamless control of devices, such as bionic limbs and robots, allowing humans to give directions simply by thinking about them.

Lead researcher Professor CT Lin said: “This research represents a pioneering effort in translating raw EEG waves directly into language, representing a major advance in the field.”

“It is the first to integrate discrete coding techniques into the brain-to-text translation process, offering an innovative approach to neural decoding.”

“Integration with large language models also opens new horizons in neuroscience and artificial intelligence.”

Previous technologies for translating brain signals into language required either surgery to implant electrodes in the brain, as with Elon Musk's Neuralink, or scanning with an MRI machine, which is bulky, expensive, and difficult to use in everyday life.

An explanatory video shows a person thinking about a sentence displayed on the screen

The screen then switched to what the AI model had decoded, and the results were almost identical

The new technology, by contrast, uses a simple helmet worn over the head to read what a person is thinking.

To test the technology, Lin and his team conducted experiments with 29 participants who were shown a sentence or phrase on a screen and had to think about reading it.

The AI model then displayed what it had interpreted from the subject's brainwaves.

One example asked the participant to think: “Good evening! I hope you are doing well. I'll start with a cappuccino, please, with an extra shot of espresso.”

The screen then showed the AI “thinking” before displaying its response: “Afternoon! You are fine? Cappuccino, shot extra. Espresso.”

The technology, known as DeWave, translates EEG signals into words using a large language model trained on large amounts of EEG data and built on BART, which combines the bidirectional context of BERT with a left-to-right decoder like that of ChatGPT.
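To make the “discrete coding” step concrete: the idea is to snap each window of continuous EEG-derived features to the nearest entry in a learned codebook (vector quantization), so the brain signal becomes a sequence of discrete codes that a language-model decoder can translate into text. The sketch below is purely illustrative – the codebook values are random stand-ins, not DeWave's learned representations, and the real system's feature extraction and decoder are far more involved.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical learned codebook: 8 code vectors, 4 features each.
codebook = rng.normal(size=(8, 4))

# Hypothetical EEG-derived features: 5 time windows, 4 features per window.
eeg_features = rng.normal(size=(5, 4))

def quantize(features, codebook):
    """Map each feature vector to the index of its nearest codebook entry."""
    # Pairwise Euclidean distances, shape (windows, codes).
    dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=-1)
    return dists.argmin(axis=1)

codes = quantize(eeg_features, codebook)
print(codes)  # one codebook index per EEG window, ready for a text decoder
```

In the full pipeline, a sequence of such codes would play the role that word or subword tokens play for an ordinary language model, which is what lets an LLM-style decoder be applied to brain signals at all.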

The team notes that translation accuracy is currently around 40 percent, but is continuing its work to boost that to 90 percent.
