Movies of Your Dreams, Fantasies and Memories

by Kathy Jones on Sep 25 2011 3:11 PM

Thanks to the efforts of scientists at the University of California, Berkeley, your dreams could one day be replayed like Hollywood movie trailers on YouTube.
Using functional Magnetic Resonance Imaging (fMRI) and computational models, they have succeeded in decoding, then recreating, people's dynamic visual experiences - a breakthrough they say "opens a window into the movie of our minds".

The researchers believe that the finding could be used to reproduce dreams, fantasies and memories from inside our heads.

It is hoped the revolutionary process could eventually be used to understand the minds of those who cannot communicate verbally, such as stroke victims, coma patients and people with neurodegenerative diseases.

Experts, however, warn that the technology is decades from allowing users to read others' thoughts and intentions, as portrayed in such sci-fi classics as 'Brainstorm'.

"This is a major leap toward reconstructing internal imagery," said co-author Professor Jack Gallant, a UC Berkeley neuroscientist.

Gallant and fellow researchers had previously recorded brain activity in the visual cortex while a subject viewed black-and-white photographs.

They then built a computational model that enabled them to predict with striking accuracy which picture the subject was looking at.

In their latest experiment, the researchers say they have solved a much more difficult problem by actually decoding brain signals generated by moving pictures.

Test subjects watched two separate sets of Hollywood movie trailers while fMRI was used to measure blood flow through the visual cortex, the part of the brain that processes visual information.

On the computer, the brain was divided into small three-dimensional cubes, known in computer imaging as volumetric pixels, or 'voxels'.

The brain activity recorded while subjects viewed clips was fed into a computer program that learned, second by second, to associate visual patterns in a particular film with the corresponding brain activity.
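To make that step concrete, here is a minimal, purely illustrative sketch in Python. It is not the study's actual code (the published work fitted a motion-energy encoding model); instead, a simple regularized linear model is fitted for every voxel, mapping the visual features of each second of film to the fMRI response recorded at that second. All names and data sizes below are invented.

```python
# Hypothetical sketch: learn a per-voxel mapping from movie features to fMRI responses.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_seconds, n_features, n_voxels = 600, 128, 2000          # illustrative sizes only
movie_features = rng.normal(size=(n_seconds, n_features))  # visual features per second of film
voxel_responses = rng.normal(size=(n_seconds, n_voxels))   # measured activity per voxel, per second

# One regularized linear model per voxel: predicted_response = features @ weights
encoding_model = Ridge(alpha=1.0)
encoding_model.fit(movie_features, voxel_responses)

# Given a new clip's features, the model predicts the brain activity it should evoke.
predicted = encoding_model.predict(movie_features[:10])
print(predicted.shape)  # (10, n_voxels)
```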

Brain activity evoked by the second set of clips was used to test the movie reconstruction algorithm.

To do this, 18 million seconds of random YouTube video were fed into the program, which predicted the brain activity each clip would be likely to evoke.

Finally, the 100 clips whose predicted responses most closely matched the subject's measured brain activity were merged to produce a blurry yet continuous reconstruction of the original movie.
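A hedged sketch of that final step, again with invented names and toy data shapes: the fitted model predicts the activity each library clip would evoke, the predictions are ranked by correlation with the measured activity, and the frames of the 100 best-matching clips are averaged.

```python
# Hypothetical sketch: reconstruct one second of viewing by averaging the
# library clips whose predicted brain activity best matches the measurement.
import numpy as np

def reconstruct(measured_activity, library_features, library_frames, model, top_k=100):
    """measured_activity: (n_voxels,) fMRI pattern for one second of viewing.
    library_features:  (n_clips, n_features) features of candidate YouTube clips.
    library_frames:    (n_clips, H, W) one representative frame per clip.
    model:             fitted encoding model with a .predict() method."""
    predicted = model.predict(library_features)                 # (n_clips, n_voxels)
    # Correlate each clip's predicted activity with the measured pattern.
    pred_z = (predicted - predicted.mean(1, keepdims=True)) / predicted.std(1, keepdims=True)
    meas_z = (measured_activity - measured_activity.mean()) / measured_activity.std()
    scores = pred_z @ meas_z / meas_z.size                      # (n_clips,) correlation scores
    best = np.argsort(scores)[-top_k:]                          # indices of the top matches
    return library_frames[best].mean(axis=0)                    # blurry averaged "reconstruction"
```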

The study is published online Sept. 22 in the journal Current Biology.

Source: ANI

