Standardization is likely to be necessary for many of the talked-about applications of brain decoding — those that would involve reading someone's hidden or unconscious thoughts. And although such applications are not yet possible, companies are taking notice. Haynes says that he was recently approached by a representative from the car company Daimler asking whether one could decode hidden consumer preferences of test subjects for market research. In principle it could work, he says, but the current methods cannot work out which of, say, 30 different products someone likes best. Marketers, he says, should stick to what they know for now. “I'm pretty sure that with traditional market research techniques you're going to be much better off.”
Companies looking to serve law enforcement have also taken notice. No Lie MRI in San Diego, California, for example, claims that it can use decoding-related techniques in a brain scan to distinguish a lie from a truth. Law scholar Hank Greely at Stanford University in California has written in the Oxford Handbook of Neuroethics (Oxford University Press, 2011) that the legal system could benefit from better ways of detecting lies, checking the reliability of memories, or even revealing the biases of jurors and judges. Some ethicists have argued that privacy laws should shield a person's inner thoughts and desires, but Julian Savulescu, a neuroethicist at the University of Oxford, UK, sees no problem in principle with deploying decoding technologies. “People have a fear of it, but if it's used in the right way it's enormously liberating.” Brain data, he says, are no different from other types of evidence. “I don't see why we should privilege people's thoughts over their words,” he says.
Haynes has been working on a study in which participants tour several virtual-reality houses, and then have their brains scanned while they tour another selection. Preliminary results suggest that the team can identify which houses their subjects had visited before. The implication is that such a technique might reveal whether a suspect had been to the scene of a crime. The results are not yet published, and Haynes is quick to point out the limitations of using such a technique in law enforcement. What if a person has been in the building, but doesn't remember? Or what if they visited a week before the crime took place? Suspects may even be able to fool the scanner. “You don't know how people react with countermeasures,” he says.
Other scientists also dismiss the implication that buried memories could be reliably uncovered through decoding. Apart from anything else, you need a 15-tonne, US$3-million fMRI machine and a person willing to lie very still inside it and actively think secret thoughts. Even then, says Gallant, “just because the information is in someone's head doesn't mean it's accurate”. Right now, psychologists have more reliable, cheaper ways of getting at people's thoughts. “At the moment, the best way to find out what someone is going to do,” says Haynes, “is to ask them.”