Legions of science-fiction authors have imagined a future that includes mind-reading technology. Although the ability to play back memories like a movie remains a distant dream, a new study has taken a provocative step in that direction by decoding neural signals for images.

Neuroscientist Kendrick Kay and his colleagues at the University of California, Berkeley, were able to determine which of a large group of never-before-seen photographs a subject was viewing based purely on functional MRI data. By analyzing fMRI scans of viewers as they looked at thousands of images, Kay's team created a computer model that uses picture elements such as angles and brightness to predict the neural activity elicited by a novel black-and-white photograph. Then the researchers scanned subjects while showing them new snapshots. Most of the time Kay's model could single out which image the subject was viewing by matching its prediction of brain activity to the actual activity measured by the fMRI scanner, although very similar pictures tended to baffle the program.
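The matching step the team describes can be illustrated with a minimal sketch, not the authors' actual code: assume an encoding model has already produced a predicted voxel-activity pattern for each candidate image, and pick the candidate whose prediction correlates best with the measured scan.

```python
# Hypothetical sketch of the identification step: choose the candidate
# image whose predicted fMRI pattern best matches the measured pattern.
import numpy as np

def identify_image(predicted: np.ndarray, measured: np.ndarray) -> int:
    """predicted: (n_images, n_voxels) model-predicted activity patterns;
    measured: (n_voxels,) actual scan. Returns the index of the candidate
    with the highest Pearson correlation to the measured pattern."""
    p = predicted - predicted.mean(axis=1, keepdims=True)
    m = measured - measured.mean()
    corr = (p @ m) / (np.linalg.norm(p, axis=1) * np.linalg.norm(m))
    return int(np.argmax(corr))

# Toy usage with 3 made-up candidate images and 5 voxels:
rng = np.random.default_rng(0)
preds = rng.standard_normal((3, 5))
scan = preds[1] + 0.1 * rng.standard_normal(5)  # noisy view of image 1
print(identify_image(preds, scan))
```

As the article notes, very similar pictures baffled the program, and the sketch shows why: two candidates with nearly identical predicted patterns yield nearly identical correlations, so noise can flip the winner.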

Kay's reproduction of the age-old "pick a card, any card" trick is intriguing to visual neuroscience researchers because of his algorithm's versatility. Perhaps more interesting to science-fiction buffs is Kay's opinion that someday his algorithm might perform at least some degree of [image] reconstruction based on fMRI data. Starting from brain activity alone, his model should be able to deduce, for example, an image's overall brightness. The team has not yet studied the model in this capacity, however; Kay says it is too early to gauge exactly how much information the program can glean from a brain scan.

As for truly reading people's thoughts, Kay does not foresee anything of that nature in this century. Technological improvement, he explains, may yield piles of brain data. Without sufficient insight into the brain's workings, however, we will have no idea what it all means.

This story was originally printed with the title "Can You Read My Mind?"