Was President Barack Obama born in Hawaii? I find the question so absurd, not to mention possibly racist in its motivation, that when I am confronted with “birthers” who believe otherwise, I find it difficult even to focus on their arguments about the difference between a birth certificate and a certificate of live birth. The reason is that once I formed an opinion on the subject, it became a belief, subject to a host of cognitive biases that ensure its verisimilitude. Am I being irrational? Possibly. In fact, this is how most belief systems work for most of us most of the time.
We form our beliefs for a variety of subjective, emotional and psychological reasons in the context of environments created by family, friends, colleagues, culture and society at large. After forming our beliefs, we then defend, justify and rationalize them with a host of intellectual reasons, cogent arguments and rational explanations. Beliefs come first; explanations for beliefs follow. In my new book The Believing Brain (Holt, 2011), I call this process, wherein our perceptions about reality are dependent on the beliefs that we hold about it, belief-dependent realism. Reality exists independent of human minds, but our understanding of it depends on the beliefs we hold at any given time.
I patterned belief-dependent realism after model-dependent realism, presented by physicists Stephen Hawking and Leonard Mlodinow in their book The Grand Design (Bantam Books, 2010). There they argue that because no one model is adequate to explain reality, “one cannot be said to be more real than the other.” When these models are coupled to theories, they form entire worldviews.
Once we form beliefs and make commitments to them, we maintain and reinforce them through a number of powerful cognitive biases that distort our percepts to fit belief concepts. Among them are:
Anchoring Bias. Relying too heavily on one reference anchor or piece of information when making decisions.
Authority Bias. Valuing the opinions of an authority, especially in the evaluation of something we know little about.
Belief Bias. Evaluating the strength of an argument based on the believability of its conclusion.
Confirmation Bias. Seeking and finding confirming evidence in support of already existing beliefs and ignoring or reinterpreting disconfirming evidence.
On top of all these biases, there is the in-group bias, in which we place more value on the beliefs of those whom we perceive to be fellow members of our group and less on the beliefs of those from different groups. This is a result of our evolved tribal brains, which lead us not only to pass such value judgments on beliefs but also to demonize out-group beliefs and dismiss them as nonsense or evil, or both.
Belief-dependent realism is driven even deeper by a meta-bias called the bias blind spot, or the tendency to recognize the power of cognitive biases in other people but to be blind to their influence on our own beliefs. Even scientists are not immune, subject to experimenter-expectation bias, or the tendency for observers to notice, select and publish data that agree with their expectations for the outcome of an experiment and to ignore, discard or disbelieve data that do not.
This dependency on belief and its host of psychological biases is why, in science, we have built-in self-correcting machinery. Strict double-blind controls are required, in which neither the subjects nor the experimenters know the conditions during data collection. Collaboration with colleagues is vital. Results are vetted at conferences and in peer-reviewed journals. Research is replicated in other laboratories. Disconfirming evidence and contradictory interpretations of data are included in the analysis. If you don’t seek data and arguments against your theory, someone else will, usually with great glee and in a public forum. This is why skepticism is a sine qua non of science, the only escape we have from the belief-dependent realism trap created by our believing brains.
This article was originally published with the title “The Believing Brain.”