AI ‘brain decoder’ can read a person’s thoughts with just a quick brain scan and almost no training

By Editor | February 17, 2025

Scientists have made new improvements to a “brain decoder” that uses artificial intelligence (AI) to convert thoughts into text.

Their new converter algorithm can quickly adapt an existing decoder to another person's brain, the team reported in a new study. The findings could one day support people with aphasia, a brain disorder that affects a person's ability to communicate, the scientists said.

A brain decoder uses machine learning to translate a person’s thoughts into text, based on their brain’s responses to stories they’ve listened to. However, past iterations of the decoder required participants to listen to stories inside an MRI machine for many hours, and these decoders worked only for the individuals they were trained on.
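
As a rough, illustrative sketch (an assumption about the general approach, not the study's published pipeline), a decoder of this kind can be framed as a search: an encoding model predicts the brain response a candidate sentence should evoke, and the candidate whose prediction best matches the recorded scan is kept. In Python:

    import numpy as np

    def embed(text):
        # Stand-in for a real semantic feature extractor (e.g., language-model
        # embeddings); hashing just gives each sentence a repeatable vector.
        rng = np.random.default_rng(abs(hash(text)) % (2**32))
        return rng.standard_normal(512)

    def predict_response(features, W):
        # Linear encoding model: semantic features -> predicted voxel responses.
        return features @ W

    def decode(recorded, candidates, W):
        # Keep the candidate whose predicted response is most similar to the
        # recording; real decoders search candidates with a language model.
        def cos(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
        scores = [cos(predict_response(embed(c), W), recorded) for c in candidates]
        return candidates[int(np.argmax(scores))]

    W = np.random.default_rng(1).standard_normal((512, 1000))  # fit during training
    recorded = predict_response(embed("not that job"), W)      # pretend fMRI scan
    print(decode(recorded, ["not that job", "a sunny beach"], W))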

“People with aphasia oftentimes have some trouble understanding language as well as producing language,” said study co-author Alexander Huth, a computational neuroscientist at the University of Texas at Austin (UT Austin). “So if that’s the case, then we might not be able to build models for their brain at all by watching how their brain responds to stories they listen to.”

In the new research, published Feb. 6 in the journal Current Biology, Huth and co-author Jerry Tang, a graduate student at UT Austin, investigated how they might overcome this limitation. “In this study, we were asking, can we do things differently?” Huth said. “Can we essentially transfer a decoder that we built for one person’s brain to another person’s brain?”

The researchers first trained the brain decoder on a few reference participants the long way — by collecting functional MRI data while the participants listened to 10 hours of radio stories.

Then, they trained two converter algorithms on the reference participants and on a different set of “goal” participants: one using data collected while the participants spent 70 minutes listening to radio stories, and the other while they spent 70 minutes watching silent Pixar short films unrelated to the radio stories.

Using a technique called functional alignment, the team mapped out how the reference and goal participants’ brains responded to the same audio or film stories. They used that information to train the decoder to work with the goal participants’ brains, without needing to collect multiple hours of training data.
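
A minimal sketch of that transfer step, assuming a simple linear converter (the study's actual model may differ), is to regress the goal participant's responses onto the reference participant's responses during the shared stimuli, then route converted scans into the already-trained decoder:

    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    n_scans, n_goal_vox, n_ref_vox = 2100, 800, 1000  # ~70 min of fMRI (made-up sizes)

    goal_resp = rng.standard_normal((n_scans, n_goal_vox))  # goal participant's scans
    ref_resp = rng.standard_normal((n_scans, n_ref_vox))    # reference participant's scans

    # Fit the converter: map the goal brain space into the reference brain space.
    converter = Ridge(alpha=1.0).fit(goal_resp, ref_resp)

    # At test time, convert a new goal-participant scan and hand it to the
    # decoder that was trained on the reference participant's brain.
    new_goal_scan = rng.standard_normal((1, n_goal_vox))
    pseudo_ref_scan = converter.predict(new_goal_scan)       # shape (1, n_ref_vox)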

Next, the team tested the decoders using a short story that none of the participants had heard before. Although the decoder’s predictions were slightly more accurate for the original reference participants than for the ones who used the converters, the words it predicted from each participant’s brain scans were still semantically related to those used in the test story.

For example, a section of the test story included someone discussing a job they didn’t enjoy, saying “I’m a waitress at an ice cream parlor. So, um, that’s not…I don’t know where I want to be but I know it’s not that.” The decoder using the converter algorithm trained on film data predicted: “I was at a job I thought was boring. I had to take orders and I did not like them so I worked on them every day.” Not an exact match — the decoder doesn’t read out the exact sounds people heard, Huth said — but the ideas are related.
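
To make “semantically related” concrete, here is a toy illustration (not the paper’s actual metric): represent words as vectors in which related ideas sit near one another, then score decoded text against the real story by cosine similarity.

    import numpy as np

    # Hand-made toy embeddings; real evaluations use learned vectors (e.g., GloVe).
    emb = {
        "waitress": np.array([0.9, 0.1, 0.0]), "job": np.array([0.8, 0.2, 0.1]),
        "orders": np.array([0.7, 0.3, 0.1]), "boring": np.array([0.2, 0.9, 0.0]),
    }

    def sentence_vec(words):
        # Average the vectors of the words we have embeddings for.
        return np.mean([emb[w] for w in words if w in emb], axis=0)

    actual = sentence_vec(["waitress"])                  # from the real story
    decoded = sentence_vec(["job", "orders", "boring"])  # from the decoder output
    cos = float(actual @ decoded / (np.linalg.norm(actual) * np.linalg.norm(decoded)))
    print(round(cos, 2))  # closer to 1.0 means the ideas overlap, even if words differ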

“The really surprising and cool thing was that we can do this even not using language data,” Huth told Live Science. “So we can have data that we collect just while somebody’s watching silent videos, and then we can use that to build this language decoder for their brain.”

Using the video-based converters to transfer existing decoders to people with aphasia may help them express their thoughts, the researchers said. It also reveals some overlap between the ways humans represent ideas from language and from visual narratives in the brain.

“This study suggests that there’s some semantic representation which does not care from which modality it comes,” Yukiyasu Kamitani, a computational neuroscientist at Kyoto University who was not involved in the study, told Live Science. In other words, it helps reveal how the brain represents certain concepts in the same way, even when they’re presented in different formats.

The team’s next steps are to test the converter on participants with aphasia and “build an interface that would help them generate language that they want to generate,” Huth said.
