Brain imaging reveals the movies in our mind

September 22, 2011
This set of paired images provided by Shinji Nishimoto of the University of California, Berkeley on Wednesday, Sept. 21, 2011 shows original video images, upper row, and those images reconstructed by computer from brain scans. While volunteers watched movie clips, a scanner watched their brains. And from their brain activity, a computer made rough reconstructions of what they viewed. Scientists reported that result Thursday, Sept. 22, 2011 and speculated such an approach might be able to reveal dreams and hallucinations someday. In the future, it might help stroke victims or others who have no other way to communicate, said Jack Gallant, a neuroscientist at the University of California, Berkeley, and co-author of the paper. (University of California, Berkeley, Shinji Nishimoto)

Imagine tapping into the mind of a coma patient, or watching one's own dream on YouTube. With a cutting-edge blend of brain imaging and computer simulation, scientists at the University of California, Berkeley, are bringing these futuristic scenarios within reach.

Using functional magnetic resonance imaging (fMRI) and computational models, UC Berkeley researchers have succeeded in decoding and reconstructing people's dynamic visual experiences – in this case, watching Hollywood movie trailers.

As yet, the technology can only reconstruct movie clips people have already viewed. However, the breakthrough paves the way for reproducing the movies inside our heads that no one else sees, such as dreams and memories, according to researchers.

"This is a major leap toward reconstructing internal imagery," said Professor Jack Gallant, a UC Berkeley neuroscientist and coauthor of the study to be published online Sept. 22 in the journal Current Biology. "We are opening a window into the movies in our minds."

Eventually, practical applications of the technology could include a better understanding of what goes on in the minds of people who cannot communicate verbally, such as stroke victims, coma patients and people with neurodegenerative diseases.

It may also lay the groundwork for brain-machine interfaces, so that people with cerebral palsy or paralysis, for example, can guide computers with their minds.

However, researchers point out that the technology is decades from allowing users to read others' thoughts and intentions, as portrayed in such sci-fi classics as "Brainstorm," in which scientists recorded a person's sensations so that others could experience them.

Previously, Gallant and fellow researchers recorded brain activity in the visual cortex while a subject viewed black-and-white photographs. They then built a computational model that enabled them to predict with overwhelming accuracy which picture the subject was looking at.

In their latest experiment, researchers say they have solved a much more difficult problem by actually decoding brain signals generated by moving pictures.

"Our natural visual experience is like watching a movie," said Shinji Nishimoto, lead author of the study and a post-doctoral researcher in Gallant's lab. "In order for this technology to have wide applicability, we must understand how the brain processes these dynamic visual experiences."

Nishimoto and two other research team members served as subjects for the experiment, because the procedure requires volunteers to remain still inside the MRI scanner for hours at a time.

They watched two separate sets of Hollywood movie trailers, while fMRI was used to measure blood flow through the visual cortex, the part of the brain that processes visual information. On the computer, the brain was divided into small, three-dimensional cubes known as volumetric pixels, or "voxels."
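As a rough computational illustration of that division (not the study's actual preprocessing), a recording can be treated as a four-dimensional array and flattened so that each column holds one voxel's signal over time; the array sizes below are placeholders.

```python
# Illustrative only: flatten a 4-D fMRI recording (x, y, z, time) so that
# each column is the time course of a single voxel. Sizes are placeholders.
import numpy as np

scan = np.random.rand(64, 64, 30, 600)     # stand-in for an (x, y, z, time) recording
n_voxels = 64 * 64 * 30
responses = scan.reshape(n_voxels, -1).T   # (time, n_voxels): one column per voxel
```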

"We built a model for each voxel that describes how shape and motion information in the movie is mapped into brain activity," Nishimoto said.

The brain activity recorded while subjects viewed the first set of clips was fed into a computer program that learned, second by second, to associate visual patterns in the movie with the corresponding brain activity.
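In broad strokes, that training step can be thought of as fitting a regularized linear regression from second-by-second stimulus features to each voxel's measured response. The sketch below is a simplified stand-in for the study's encoding models, not the authors' code; the feature extraction (for example, motion-energy filter outputs) is assumed to have been done already, and the function names are hypothetical.

```python
# Simplified per-voxel encoding model (illustrative, not the authors' code).
# `features` holds one stimulus feature vector per second of the training
# movies; `responses` holds the matching fMRI signal for every voxel.
import numpy as np
from sklearn.linear_model import Ridge

def fit_voxel_models(features, responses, alpha=1.0):
    """Fit one regularized linear model per voxel.

    features  : (n_seconds, n_features) stimulus features
    responses : (n_seconds, n_voxels) measured voxel activity
    """
    models = []
    for v in range(responses.shape[1]):
        model = Ridge(alpha=alpha)            # regularization guards against overfitting
        model.fit(features, responses[:, v])
        models.append(model)
    return models

def predict_brain_activity(models, features):
    """Predict every voxel's response to new stimulus features."""
    return np.column_stack([m.predict(features) for m in models])
```

Fitting each voxel independently keeps the model simple, and the same machinery can then be run in the other direction to predict responses to clips the subject has never seen, which is what the testing step below relies on.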

Brain activity evoked by the second set of clips was used to test the movie reconstruction algorithm. This was done by feeding 18 million seconds of random YouTube videos into the computer program so that it could predict the brain activity that each film clip would most likely evoke in each subject.

Finally, the 100 clips that the computer program decided were most similar to the clip that the subject had probably seen were merged to produce a blurry yet continuous reconstruction of the original movie.
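A minimal sketch of that selection-and-averaging step is shown below. It assumes a precomputed library of candidate one-second clips (a feature vector and a representative frame for each) and reuses the hypothetical predict_brain_activity helper from the previous sketch; the names, shapes, and the correlation-based matching score are all illustrative assumptions rather than the published method.

```python
# Illustrative reconstruction step: pick the candidate clips whose predicted
# brain activity best matches the measured activity, then average their frames.
import numpy as np

def reconstruct_frame(models, candidate_features, candidate_frames, observed, top_k=100):
    """Merge the frames of the best-matching candidate clips into one image.

    candidate_features : (n_candidates, n_features) features of library clips
    candidate_frames   : (n_candidates, height, width, 3) one frame per clip
    observed           : (n_voxels,) measured activity at one time point
    """
    predicted = predict_brain_activity(models, candidate_features)  # (n_candidates, n_voxels)
    # Score each candidate by correlating its predicted activity with the observation.
    pred_z = (predicted - predicted.mean(axis=1, keepdims=True)) / predicted.std(axis=1, keepdims=True)
    obs_z = (observed - observed.mean()) / observed.std()
    scores = pred_z @ obs_z / observed.size
    best = np.argsort(scores)[-top_k:]               # indices of the 100 closest matches
    # Averaging their frames gives a blurry but continuous reconstruction.
    return candidate_frames[best].astype(float).mean(axis=0)
```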

Reconstructing movies using brain scans has been challenging because the blood flow signals measured using fMRI change much more slowly than the neural signals that encode dynamic information in movies, researchers said. For this reason, most previous attempts to decode brain activity have focused on static images.

"We addressed this problem by developing a two-stage model that separately describes the underlying neural population and blood flow signals," Nishimoto said.

Ultimately, Nishimoto said, scientists need to understand how the brain processes dynamic visual events that we experience in everyday life.

"We need to know how the brain works in naturalistic conditions," he said. "For that, we need to first understand how the works while we are watching movies."

More information:
Current Biology: http://www.cell.com/current-biology

Movie reconstructions: http://bit.ly/nzm9Tw

8 comments

shwhjw
5 / 5 (1) Sep 22, 2011
Another application for being able to reconstruct a person's mental image is art: you look at a blank screen and imagine a picture, and a blurry shape comes out. You visualise detail on this blurry shape, and it appears. Repeat for more and more detail!
krundoloss
5 / 5 (1) Sep 22, 2011
That's great, just don't let my wife see the images in my head of all her hot friends...
Isaacsname
not rated yet Sep 22, 2011
.....Ruh-roh.

I can just see it now. In the future we'll have all sorts of people plugged in, vacantly staring at the sky like turkeys, and the word "interweb" will take on a whole new meaning.
Birthmark
not rated yet Sep 22, 2011
The pictures formed from the brain are beautiful. It will be very interesting if this technique can be mastered in the future; it could have far-reaching implications.
antonima
not rated yet Sep 22, 2011
I don't know, but from the pictures it seems that the machine is reading the person's subconscious view rather than the conscious image. The actor is seen just like an ordinary guy; some people will try to see movies through an actor's eyes.
The plane looks like a building on the horizon; perhaps the brain just subcategorizes images into more manageable paradigms.
When someone sees an elephant, they just think - hey, it's an 'elephant'. They rarely have actual experience with these creatures; instead they think they know, or rather, they can identify what it is and that suffices for their purposes. That's why the elephant is just a blob! The background is lush - blue and green. In most people's eyes elephants are exotic creatures in exotic locales.

It would be very very interesting to see the same clips through different people's eyes!
Jeddy_Mctedder
1 / 5 (2) Sep 23, 2011
I call shenanigans on this. They are promising much more than they will ever be able to deliver. Remote sensing (fMRI) will NEVER be able to do anything like this. It is total bullshit. Wake me up when there's some sort of set of implanted multi-pin electrodes and maybe I'll believe this crap. Yeah, 'simulating' what a person is seeing, when people themselves frequently don't know what they are consciously seeing.
Bob_Kob
not rated yet Sep 23, 2011
@antonima

That's not the case. They said the reason the elephant didn't come out well was that none of the YouTube videos used to reconstruct the images contained elephants!
krundoloss
5 / 5 (1) Sep 23, 2011
That's really cool! It's amazing! We are "analog" machines, and that sort of output looks pretty good! The thing about our "inner eye" that makes it harder to reproduce is that we only create the images we require: no extra detail, just enough.
