Rationality is the crowning achievement of our species. The ability to use evidence is the cornerstone of science, medicine, and our legal system. We use rational methods in daily life, too – we assess an applicant’s resume, a child’s IQ, or the mileage of a used car to predict the likelihood of good performance later on. Given that we so often use information to make decisions both large and small, how good are we at assessing evidence?
There is a line of psychological research that studies precisely this by measuring how accurate we are at making probability judgments. One way to do so is to control the nature of the information itself and see whether people are accurate judges of its strength. Interestingly, people’s responses tend to be conservative: they are less sure of their conclusions than the evidence justifies. Yet we are affected not just by the strength of the evidence but also by how it is presented. A recent study by Jennifer Whitman and Todd Woodward found that when pieces of evidence are doled out one at a time, instead of being shown all at once, people conclude that the evidence is stronger.
Imagine, for example, that you are in a library (assuming people still do such things), and you’ve become lost. Are you in the Science Fiction or the Fantasy section? Of course, you could wander the shelves until you find a helpful sign, but it’s faster to simply look at the books on the shelf next to you. You see:
Book 1: Piers Anthony’s Blue Adept: The Apprentice Adept
Book 2: J. K. Rowling’s Harry Potter
Book 3: J. R. R. Tolkien’s The Hobbit
You’re not sure how to categorize Book 1, so it’s not good evidence for either Science Fiction or Fantasy. Books 2 and 3, however, have wizards or elves on their covers, and you might firmly classify them as Fantasy. By now, you’ve weighed the evidence and concluded you’re in the Fantasy section.
Here’s where things get interesting. If someone had simply handed all three books to you at the same time, you might feel that it’s somewhat likely you are in the Fantasy section. But if someone had handed the books to you one at a time, you might conclude very strongly that you’re in the Fantasy section. Even though the books are the same, you would judge the evidence to be stronger when you processed them in turn rather than all at once.
Researchers Whitman and Woodward recently demonstrated this effect in a controlled laboratory setting. In their study, people looked at a display on a computer screen. At the bottom of the screen was a little pond connected to two big lakes. The pond contained three fish – say, two white ones and one black one. The two big lakes were then filled with different proportions of white, black, and yellow fish. People looked at the lakes and the pond, and used a sliding scale to judge the probability that the fish in the pond came from Lake 1 or Lake 2. Sometimes there was strong evidence, or a high probability, that the fish were from Lake 1 (Lake 1 had mostly white fish, and some black fish, like the pond). Sometimes there was weak evidence, or a low probability, that the fish were from Lake 1. Also, sometimes the fish in the lakes were added in sequence: all the white fish appeared, then the black ones, then the yellow ones. Other times, all the fish appeared at once. When the fish were added one at a time, people perceived the evidence to be stronger.
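To see what the evidence in a task like this actually justifies, here is a minimal sketch in Python, using hypothetical lake proportions that are not taken from the study. It applies Bayes’ rule under the assumption of equal prior odds for the two lakes, and it shows that the rational answer is identical whether the pond’s fish are weighed all at once or one at a time – so any extra confidence felt in the sequential condition comes from the presentation, not from the evidence itself.

```python
# A minimal sketch (hypothetical numbers, not from the paper): what probability
# does the evidence actually support, and does it depend on presentation order?
from functools import reduce

# Assumed proportions of white, black, and yellow fish in each lake.
lake1 = {"white": 0.6, "black": 0.3, "yellow": 0.1}
lake2 = {"white": 0.2, "black": 0.3, "yellow": 0.5}

pond = ["white", "white", "black"]  # the three fish observed in the pond

def posterior_lake1(observations):
    """P(Lake 1 | observations), assuming equal priors for the two lakes."""
    like1 = reduce(lambda p, fish: p * lake1[fish], observations, 1.0)
    like2 = reduce(lambda p, fish: p * lake2[fish], observations, 1.0)
    return like1 / (like1 + like2)

# All the evidence at once:
print(round(posterior_lake1(pond), 2))   # 0.9

# One fish at a time, updating the probability after each observation:
p = 0.5                                  # start from the equal prior
for fish in pond:
    p = (p * lake1[fish]) / (p * lake1[fish] + (1 - p) * lake2[fish])
print(round(p, 2))                       # 0.9 -- the same answer
```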
This is an intriguing finding about how our minds work: even in dry laboratory studies, we are imperfect rationalists who judge evidence not just by its actual strength, but also by how it was fed to us.
Changing how information is displayed may be something we do without realizing it. If you’ve recently given a PowerPoint talk, you likely presented your bullet points in sequence, rather than all at once, so that people could better feel the weight of each statement. But there are hazards. Richard Feynman, the late physicist and Nobel laureate, argued that this one-by-one bullet-point style helped lead NASA to the critical misjudgments that resulted in the Challenger disaster. Big bullet points in support of the shuttle’s safety were shown one by one, while smaller bullet points below them suggested caution. The findings here suggest that had the bullet points been shown all at once, confidence would have dropped. Would this have been enough to prevent the Challenger’s launch, and the national tragedy that followed? It’s hard to say. But when you make a decision, it’s always worth considering whether you’ve been swayed by a particular style of presentation rather than by the facts themselves.
Are you a scientist who specializes in neuroscience, cognitive science, or psychology? And have you read a recent peer-reviewed paper that you would like to write about? Please send suggestions to Mind Matters editor Gareth Cook, a Pulitzer prize-winning journalist at the Boston Globe. He can be reached at garethideas AT gmail.com or Twitter @garethideas.