“I go with my gut feelings,” says investor Judith Williams. Sure, you might think, “so do I”—if the choice is between chocolate and vanilla ice cream. But Williams is dealing with real money in the five and six figures.
Williams is one of the lions on the program The Lions’ Den, a German television show akin to Shark Tank. She and other participants invest their own money in business ideas presented by contestants. She is not the only one who trusts her gut. Intuition, it seems, is on a roll: bookstores are full of guides advising us how to heal, eat or invest intuitively. They promise to unleash our inner wisdom and strengths we do not yet know we have.
But can we really rely on intuition, or is it a recipe for failure? Although researchers have been debating the value of intuition in decision-making for decades, they continue to disagree.
A Source of Error?
Intuition can be thought of as insight that arises spontaneously without conscious reasoning. Daniel Kahneman, who won a Nobel prize in economics for his work on human judgment and decision-making, has proposed that we have two different thought systems: system 1 is fast and intuitive; system 2 is slower and relies on reasoning. The fast system, he holds, is more prone to error. It has its place: it may increase the chance of survival by enabling us to anticipate serious threats and recognize promising opportunities. But the slower thought system, by engaging critical thinking and analysis, is less susceptible to producing bad decisions.
Kahneman, who acknowledges that both systems usually operate when people think, has described many ways that the intuitive system can cloud judgment. Consider, for example, the framing effect: the tendency to be influenced by the way a problem is posed or a question is asked. In the 1980s Kahneman and his colleague Amos Tversky presented a hypothetical public health problem to volunteers and framed the set of possible solutions in different ways to different volunteers. In all cases, the volunteers were told to imagine that the U.S. was preparing for an outbreak of an unusual disease expected to kill 600 people and that two alternative programs for combating the disease had been proposed.
For one group, the choices were framed by Tversky and Kahneman in terms of gains—how many people would be saved:
- If Program A is adopted, 200 people will be saved.
- If Program B is adopted, there is 1/3 probability that 600 people will be saved, and 2/3 probability that no people will be saved.
The majority of volunteers selected the first option, Program A.
For another group, the choices were framed in terms of losses—how many people would die:
- If Program C is adopted 400 people will die.
- If Program D is adopted there is 1/3 probability that nobody will die, and 2/3 probability that 600 people will die.
In this case, the vast majority of volunteers were willing to gamble and selected the second option, Program D.
In fact, the options presented to both groups were the same: The first program would save 200 people and lose 400. The second program offered a one-in-three chance that everyone would live and a two-in-three chance that everyone would die. Framing the alternatives in terms of lives saved or lives lost is what made the difference. When choices are framed in terms of gains, people often become risk-averse, whereas when choices are framed in terms of losses, people often become more willing to take risks.
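The arithmetic behind that equivalence is easy to check. Here is a minimal sketch; the variable names are ours, not from the study:

```python
# Expected lives saved in the Tversky-Kahneman disease problem.
# All four programs describe the same two outcomes; only the framing
# (lives saved vs. lives lost) differs.
total = 600

# Gain framing: how many are saved.
a = 200                          # Program A: 200 saved with certainty
b = (1/3) * total + (2/3) * 0   # Program B: 1/3 chance all 600 saved

# Loss framing: how many die (converted here to equivalent lives saved).
c = total - 400                  # Program C: 400 die with certainty
d = (1/3) * total + (2/3) * 0   # Program D: 1/3 chance nobody dies

# Every option has the same expected value: 200 lives saved.
for value in (a, b, c, d):
    print(round(value))
```

Despite identical expected values, the wording alone flips most people's preference from the sure thing to the gamble.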
Other cognitive scientists argue that intuition can lead to effective decision-making more commonly than Kahneman suggests. Gerd Gigerenzer of the Max Planck Institute for Human Development in Berlin is among them. He, too, says that people rarely make decisions on the basis of reason alone, especially when the problems faced are complex. But he thinks intuition’s merit has been vastly underappreciated. He views intuition as a form of unconscious intelligence.
Intuitive decisions can be grounded in heuristics: simple rules of thumb. Heuristics screen out large amounts of information, thereby limiting how much needs to be processed. Such rules of thumb may be applied consciously, but in general we simply follow them without being aware that we are doing so. Although they can lead to mistakes, as Kahneman points out, Gigerenzer emphasizes that they can be based on reliable information while leaving out unnecessary information. For example, an individual who wants to buy a good pair of running shoes might bypass research and brain work by simply purchasing the same running shoes used by an acquaintance who is an experienced runner.
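The running-shoe example is itself a one-cue heuristic: ignore every product attribute and simply copy the most experienced runner you know. A toy sketch with made-up data (all names and values here are hypothetical):

```python
# An "imitate the expert" heuristic: a single cue (experience) decides,
# and all other information about the shoes is screened out.
runners = [
    {"name": "Alex", "years_running": 2,  "shoe": "Model X"},
    {"name": "Bea",  "years_running": 15, "shoe": "Model Y"},
    {"name": "Cal",  "years_running": 7,  "shoe": "Model Z"},
]

def imitate_the_expert(acquaintances):
    # Pick the acquaintance with the most experience and copy their choice.
    expert = max(acquaintances, key=lambda r: r["years_running"])
    return expert["shoe"]

print(imitate_the_expert(runners))  # "Model Y"
```

The point of such a rule is not that it is always right but that it trades a small loss in accuracy for a large saving in information processing.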
In 2006 a paper by Ap Dijksterhuis and his colleagues, then at the University of Amsterdam, came to a similarly favorable view of intuition’s value. The researchers tested what they called the “deliberation without attention” hypothesis: although conscious thought makes the most sense for simple decisions (for example, what size skillet to use), it can actually be detrimental when considering more complex matters, such as buying a house.
In one of their experiments, test subjects were asked to select which of four cars was the best, taking into account four characteristics, among them gas consumption and luggage space. One set of subjects had four minutes to think about the decision; another set was distracted by solving brainteasers. The distracted group made the wrong choice (according to the researchers’ criteria for the best car) more often than those who were able to think without being distracted. But if participants were asked to assess 12 characteristics, the opposite happened: undisturbed reflection had a negative effect on decision-making; only 25 percent selected the best car. In contrast, 60 percent of the subjects distracted by brainteasers got it right.
Investigators have been unable to replicate these findings, however. And in a 2014 review Ben R. Newell of the University of New South Wales and David R. Shanks of University College London concluded that the effect of intuition has been overrated by many researchers and that there is little evidence that conscious thought arrives at worse solutions in complex situations.
What about Real Life?
Of course, problems in the real world can be considerably more complicated than the artificially constructed ones often presented in laboratory experiments. In the late 1980s this difference sparked the Naturalistic Decision Making movement, which seeks to determine how people make decisions in real life. With questionnaires, videos and observations, it studies how firefighters, nurses, managers and pilots use their experience to deal with challenging situations involving time pressure, uncertainty, unclear goals and organizational constraints.
Researchers in the field found that highly experienced individuals tend to compare patterns when making decisions. They are able to recognize regularities, repetitions and similarities between the information available to them and their past experiences. They then imagine how a given situation might play out. This combination enables them to make relevant decisions quickly and competently. It also became evident that decision makers’ confidence did not necessarily grow as more information accumulated. On the contrary: too much information can prove detrimental.
Gary Klein, one of the movement’s founders, has called pattern matching “the intuitive part” and mental simulation “the conscious, deliberate and analytical part.” He has explained the benefits of the combination this way: “A purely intuitive strategy relying only on pattern matching would be too risky because sometimes the pattern matching generates flawed options. A completely deliberative and analytic strategy would be too slow.” In the case of firefighters, he notes, if a slow, systematic approach were used, “the fires would be out of control by the time the commanders finished deliberating.”
Intuition Is Not Irrational
Kamila Malewska of the Poznań University of Economics and Business in Poland has also studied intuition in real-world settings and likewise finds that people often apply a combination of strategies. She asked managers at a food company how they use intuition in their everyday work. Almost all of them stated that, in addition to rational analyses, they tapped gut feelings when making decisions. More than half tended to lean on rational approaches; about a quarter used a strategy that blended rational and intuitive elements; and about a fifth generally relied on intuition alone. Interestingly, the more senior the managers were, the more they tended toward intuition.
Malewska thinks that intuition is neither irrational nor the opposite of logic. Rather it is a quicker and more automatic process that draws on the deep reserves of experience and knowledge that people gather over the course of their lives. Intuition, she believes, is an ability that can be trained and can play a constructive role in decision-making.
Field findings published in 2017 by Lutz Kaufmann of the Otto Beisheim School of Management in Germany and his co-workers support the view that a mixture of thinking styles can be helpful in decision-making. The participants in their study, all purchasing managers, indicated how strongly they agreed or disagreed with various statements relating to their decision-making over the prior three months. For example: “I looked extensively for information before making a decision” (rational), “I did not have time to decide analytically, so I relied on my experience” (experience-based), or “I was not completely sure how to decide, so I decided based on my gut feeling” (emotional). The researchers, who consider experience-based and emotional processes as “two dimensions of intuitive processing,” also rated the success of a manager based on the unit price the person negotiated for a purchased product, as well as on the quality of the product and the punctuality of delivery.
Rational decision-making was associated with good performance. A mixture of intuitive and rational approaches also proved useful; however, a purely experience-based and a purely emotional approach did not work well. In other words, a blending of styles, which is frequently seen in everyday life, seems beneficial.
Economists Marco Sahm of the University of Bamberg and Robert K. von Weizsäcker of the Technical University of Munich study the extent to which our background knowledge determines whether rationality or gut feeling is more effective. Both Sahm and Weizsäcker are avid chess players, and they brought this knowledge to bear on their research. As children, they both learned intuitively by imitating the moves of their opponents and seeing where they led. Later, they approached the game more analytically, by reading chess books that explained and illustrated promising moves. Over time Weizsäcker became a very good chess player and has won international prizes. These days he bases his play mainly on intuition.
The two economists developed a mathematical model that takes the costs and benefits of both strategies into account. They have come to the conclusion that whether it is better to rely more on rational assessments or intuition depends both on the complexity of a particular problem and on the prior knowledge and cognitive abilities of the person. Rational decisions are more precise but entail higher costs than intuitive ones—for example, they involve more effort spent gathering and then analyzing information. This additional cost can decrease over time, but it will never disappear. The cost may be worth it if the problem is multifaceted and the decision maker gains a lot of useful information quickly (if the decision maker’s “learning curve is steep”). Once a person has had enough experience with related problems, though, intuitive decision-making that draws on past learning is more likely to yield effective decisions, Sahm and Weizsäcker say. The intuitive approach works better in that case because relying on accumulated experience and intuitive pattern recognition spares one the high costs of rational analysis.
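The published model is a formal cost-benefit analysis; the qualitative trade-off it describes can be illustrated with a deliberately simplified toy model. Everything below, including the functions, numbers and saturation curve, is our own assumption for illustration, not taken from Sahm and von Weizsäcker's paper:

```python
def net_benefit_rational(accuracy=0.95, analysis_cost=0.2, value=1.0):
    # Rational analysis: precise, but every decision pays a fixed cost
    # of gathering and analyzing information that never fully disappears.
    return accuracy * value - analysis_cost

def net_benefit_intuitive(experience, value=1.0):
    # Intuition: no analysis cost, but accuracy depends on accumulated
    # experience (modeled here as saturating toward 0.95).
    accuracy = 0.5 + 0.45 * min(experience / 10, 1.0)
    return accuracy * value

# A novice does better paying the cost of deliberate analysis;
# a seasoned expert does better trusting pattern recognition.
novice = net_benefit_intuitive(experience=1)
expert = net_benefit_intuitive(experience=10)
rational = net_benefit_rational()
print(rational > novice)  # True: analysis wins for the novice
print(expert > rational)  # True: intuition wins for the expert
```

The crossover point is the heart of the argument: once experience makes intuitive accuracy comparable to analytic accuracy, the fixed cost of deliberation is all that separates the two strategies.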
One thing is clear: intuition and rationality are not necessarily opposites. Rather it is advantageous to master both intuition and analytic skills. Let us not follow our inner voice blindly, but let us not underestimate it either.