Baynard Media

Meta scientists use AI to decode magnetic brain scans, revealing how thoughts translate into typed sentences

By Editor | March 10, 2025

Scientists at Meta have used artificial intelligence (AI) and noninvasive brain scans to unravel how thoughts are translated into typed sentences, two new studies show.

In one study, scientists developed an AI model that decoded brain signals to reproduce sentences typed by volunteers. In the second study, the same researchers used AI to map how the brain actually produces language, turning thoughts into typed sentences.

The findings could one day support a noninvasive brain-computer interface that helps people with brain lesions or injuries communicate, the scientists said.

“This was a real step in decoding, especially with noninvasive decoding,” Alexander Huth, a computational neuroscientist at the University of Texas at Austin who was not involved in the research, told Live Science.


Brain-computer interfaces that use similar decoding techniques have been implanted in the brains of people who have lost the ability to communicate, but the new studies could support a potential path to wearable devices.

In the first study, the researchers used a technique called magnetoencephalography (MEG), which measures the magnetic field created by electrical impulses in the brain, to track neural activity while participants typed sentences. Then, they trained an AI language model to decode the brain signals and reproduce the sentences from the MEG data.
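The study's actual decoder is a large trained neural network; as a hedged illustration only, the sketch below substitutes a much simpler nearest-centroid classifier on entirely synthetic "sensor" data. The alphabet size, channel count, noise level, and all patterns here are assumptions for the toy, not the study's setup.

```python
import numpy as np

# Toy stand-in for the study's decoder: nearest-centroid classification
# of synthetic "MEG" windows. All numbers here are invented assumptions.
rng = np.random.default_rng(0)
CHARS = list("abcdefghij")   # toy alphabet (assumption)
N_CHANNELS = 32              # pretend sensor count (assumption)

# Each character gets a characteristic "neural" pattern plus noise.
patterns = {c: rng.normal(size=N_CHANNELS) for c in CHARS}

def meg_window(char):
    """One noisy sensor snapshot recorded while `char` is typed."""
    return patterns[char] + 0.3 * rng.normal(size=N_CHANNELS)

# "Training": average many windows per character into a centroid.
centroids = {c: np.mean([meg_window(c) for _ in range(50)], axis=0)
             for c in CHARS}

def decode(window):
    """Predict the typed character for one sensor window."""
    return min(centroids, key=lambda c: np.linalg.norm(window - centroids[c]))

typed = "badge"
decoded = "".join(decode(meg_window(c)) for c in typed)
print(decoded)
```

The point of the toy is only the shape of the pipeline — record windows while known sentences are typed, fit a model mapping windows to characters, then decode unseen windows character by character.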


The model decoded the letters that participants typed with 68% accuracy. Frequently occurring letters were decoded correctly more often, while less-common letters, like Z and K, came with higher error rates. When the model made mistakes, it tended to substitute characters that were physically close to the target letter on a QWERTY keyboard, suggesting that the model uses motor signals from the brain to predict which letter a participant typed.
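The QWERTY-adjacency pattern in the errors can be checked mechanically. The sketch below shows one simple way to test whether an (intended, decoded) substitution pair lands on neighboring keys; the error pairs are invented examples, not data from the paper, and the row-index adjacency rule ignores the real keyboard's physical stagger.

```python
# Check whether decoding errors land on QWERTY neighbors of the intended
# key. Error pairs below are invented examples, not the study's data.
QWERTY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def key_pos(ch):
    """(row, column) of a letter on a simplified, unstaggered QWERTY grid."""
    for r, row in enumerate(QWERTY_ROWS):
        if ch in row:
            return r, row.index(ch)
    raise ValueError(ch)

def qwerty_adjacent(a, b):
    """True if two distinct keys sit next to each other on the grid."""
    (r1, c1), (r2, c2) = key_pos(a), key_pos(b)
    return (a != b) and abs(r1 - r2) <= 1 and abs(c1 - c2) <= 1

# (intended, decoded) substitution errors — hypothetical examples
errors = [("s", "a"), ("k", "l"), ("t", "y"), ("q", "p")]
adjacent_share = sum(qwerty_adjacent(t, d) for t, d in errors) / len(errors)
print(adjacent_share)  # 3 of the 4 example errors are keyboard neighbors
```

A high adjacent share among substitutions is what you would expect if the model is picking up the motor plan for the keypress rather than the letter's identity.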

The team’s second study built on these results to show how language is produced in the brain while a person types. The scientists collected 1,000 MEG snapshots per second as each participant typed a few sentences. From these snapshots, they decoded the different phases of sentence production.

Decoding your thoughts with AI

They found that the brain first generates information about the context and meaning of the sentence, and then produces increasingly granular representations of each word, syllable and letter as the participant types.

“These results confirm the long-standing predictions that language production requires a hierarchical decomposition of sentence meaning into progressively smaller units that ultimately control motor actions,” the authors wrote in the study.

To prevent the representation of one word or letter from interfering with the next, the brain uses a “dynamic neural code” to keep them separate, the team found. This code constantly shifts where each piece of information is represented in the language-producing parts of the brain.
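The idea of a shifting code can be made concrete with a deliberately schematic toy: the same item is represented by a pattern that moves to different units at each time step, so two successive letters can coexist in one state without overwriting each other. The dimensions, unit patterns, and shift rule below are assumptions for illustration, not measurements from the study.

```python
import numpy as np

# Schematic "dynamic code": an item's pattern shifts to new units each
# time step, so successive letters don't collide. Purely illustrative.
DIM = 8
base = {"a": np.eye(DIM)[0], "b": np.eye(DIM)[2]}  # toy unit patterns

def code(char, t):
    """Representation of `char` at time step `t`: base pattern, shifted."""
    return np.roll(base[char], t)

state = code("a", 0) + code("b", 1)       # both letters active at once
# Readout: match the state against each letter's time-tagged code.
read_a = float(state @ code("a", 0))      # "a" present at step 0
read_b = float(state @ code("b", 1))      # "b" present at step 1
read_wrong = float(state @ code("a", 1))  # no "a" at step 1
print(read_a, read_b, read_wrong)
```

Because each letter's pattern occupies different units at different steps, the readout can recover both which letter was active and when, which is the separability the article attributes to the brain's shifting code.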

That lets the brain link successive letters, syllables, and words while maintaining information about each over longer periods of time. However, the MEG experiments were not able to pinpoint exactly where in those brain regions each of these representations of language arises.

Taken together, these two studies, which have not yet been peer-reviewed, could help scientists design noninvasive devices that improve communication for people who have lost the ability to speak.

Although the current setup is too bulky and too sensitive to work properly outside a controlled lab environment, advances in MEG technology may open the door to future wearable devices, the researchers wrote.

“I think they’re really at the cutting edge of methods here,” Huth said. “They are definitely doing as much as we can do with current technology in terms of what they can pull out of these signals.”
