To Tell the Truth: Brain Scans Are Not Ready for the Courtroom

Brain scans should not be used for lie detection unless their reliability is proved


Neuroscientists have been using brain scans to learn how to read minds. This research is deepening our basic understanding of the human brain and offering hope for medical breakthroughs. We should all applaud this work. Commercial firms, however, are beginning to sell lie-detection services based on this research. The technology is tempting, but before we accept it, we need to think hard about it—and go slow.

The trouble is not with the pace of research. Neuroscientists have been publishing articles about detecting lies with functional magnetic resonance imaging (fMRI) for nearly 10 years. About 25 published studies have found correlations between lying and patterns of blood flow in experimental subjects' brains. The trouble is that different studies, using different methods, have drawn conclusions based on the activity of different brain regions. And all the studies so far have taken place in the artificial environment of the laboratory, using people who knew they were taking part in an experiment and who were following instructions to lie. None of the studies examined lie detection in real-world situations. No government agency has found that this method works; no independent bodies have tested the approach. Yet people are buying lie-detection reports, wrapped in the glamour of science, to try to prove their honesty. In May two separate cases wound up in the courts.

One case hinged on whether the technology works. In a federal district court in Tennessee, the defendant in a Medicare fraud case wanted to introduce an fMRI lie-detection report into evidence to prove that he had not intended to commit fraud. After more than 12 hours of expert testimony, the judge concluded that the evidence should not be admitted. He found, correctly, that the accuracy of the method was unknown in real-world settings, that there were no standards for how the method should be applied, and that the scientific community did not generally accept this application of the technology.

The other case turned on the question of whether we should use the technology, even if it worked. The plaintiff in a state court civil case in Brooklyn, N.Y., wanted to introduce an fMRI report to show that her main witness was telling the truth. The judge in that case ruled that the credibility of a fact witness was solely a question for the jury; expert testimony about the witness’s credibility was inadmissible, whether or not it was reliable.

These judges made good decisions, but tens of thousands of trial judges in America may have to rule on this technology, sometimes after hearing from good lawyers and expert witnesses and sometimes not. More important, millions of lives may be affected by the use of these lie-detection reports outside the courtroom—in criminal investigations, in business deals, perhaps in the military or the intelligence community, even in love and marriage.

Before the technology gets a foothold in society, we must answer, more broadly, the questions these judges confronted. We should ban nonresearch use of neuroimaging for lie detection until the method has been proved effective by rigorous, independent, scientific testing. Otherwise we risk hurting people and tarnishing the good name of neuroscience.

I don’t know if fMRI will ever pass that test. If it does, when and how would we use it? Would we force defendants to submit to it? What about suspects, terrorists, misbehaving students, unruly passengers in airport security lines, or teenage children? Lie detection isn’t the only mind-reading application of brain scans that could interest the legal profession—scientists are working on detecting pain, biases and memories. We may ultimately decide to reject or accept these technologies. Either way, we must prepare for them.

This article was published with the title “To Tell the Truth: Brain Scans Are Not Ready for the Courtroom” in Scientific American Magazine Vol. 303 No. 6.
doi:10.1038/scientificamerican122010-WtCoYOkSHmisS4WNySvcY
