Why Bad Beliefs Don’t Die
Wednesday, May 12, 2010 at 02:36PM
Skeptic in Epistemology

An article with this title by psychologist Gregory W. Lester appeared in the November/December 2000 issue of Skeptical Inquirer and does a beautiful and persuasive job of explaining why a person's beliefs, whether bad, good, or merely trivial, are so hard to change. I speculated here on why beliefs are so difficult to change, focusing on ways that persistent beliefs are socially useful. Lester goes well beyond that, arguing that the human brain has a system for developing beliefs and that this system has survival value precisely because it is independent of the sensory system.

Functionally, our brains treat beliefs as internal "maps" of those parts of the world with which we do not have immediate sensory contact. As I sit in my living room I cannot see my car. Although I parked it in my driveway some time ago, using only immediate sensory data I do not know if it is still there. As a result, at this moment sensory data is of very little use to me regarding my car. In order to find my car with any degree of efficiency my brain must ignore the current sensory data (which, if relied on in a strictly literal sense, not only fails to help me in locating my car but actually indicates that it no longer exists) and turn instead to its internal map of the location of my car. This is my belief that my car is still in my driveway where I left it. By referring to my belief rather than to sensory data, my brain can "know" something about the world with which I have no immediate sensory contact. This "extends" my brain's knowledge of and contact with the world.

The ability of belief to extend contact with the world beyond the range of our immediate senses substantially improves our ability to survive. A caveman has a much greater ability to stay alive if he is able to maintain a belief that dangers exist in the jungle even when his sensory data indicate no immediate threat.

Not only do beliefs make up for gaps in sensory perception, but they also give meaning and coherence to our environment, providing explanations for causes and effects and giving us bases upon which to make predictions. Lester goes on to talk about the power and independence of beliefs:

Because senses and beliefs are both tools for survival and have evolved to augment one another, our brain considers them to be separate but equally important purveyors of survival information. The loss of either one endangers us. Without our senses we could not know about the world within our perceptual realm. Without our beliefs we could not know about the world outside our senses or about meanings, reasons, or causes.

This means that beliefs are designed to operate independent of sensory data. In fact, the whole survival value of beliefs is based on their ability to persist in the face of contradictory evidence. Beliefs are not supposed to change easily or simply in response to disconfirming evidence. If they did, they would be virtually useless as tools for survival. . . .

As far as our brain is concerned, there is absolutely no need for data and belief to agree. They have each evolved to augment and supplement one another by contacting different sections of the world. They are designed to be able to disagree. This is why scientists can believe in God and people who are generally quite reasonable and rational can believe in things for which there is no credible data such as flying saucers, telepathy, and psychokinesis.

When data and belief come into conflict, the brain does not automatically give preference to data. This is why beliefs, even bad beliefs, irrational beliefs, silly beliefs, or crazy beliefs, often don't die in the face of contradictory evidence. The brain doesn't care whether or not the belief matches the data. It cares whether the belief is helpful for survival. Period. So while the scientific, rational part of our brains may think that data should supersede contradictory beliefs, on a more fundamental level of importance our brain has no such bias. . . .

Lester says attacks on even trivial beliefs may be perceived by the brain as threats when such beliefs are part of an integrated worldview. Then he suggests how skeptics should, and should not, engage other people's beliefs with facts: don't get angry, frustrated, or demeaning; don't think it's easy for the other fellow; and don't expect too much. He concludes:

Finally, it should be comforting to all skeptics to remember that the truly amazing part of all of this is not that so few beliefs change or that people can be so irrational, but that anyone’s beliefs ever change at all. Skeptics’ ability to alter their own beliefs in response to data is a true gift; a unique, powerful, and precious ability. It is genuinely a “higher brain function” in that it goes against some of the most natural and fundamental biological urges. Skeptics must appreciate the power and, truly, the dangerousness that this ability bestows upon them. They have in their possession a skill that can be frightening, life-changing, and capable of inducing pain. In turning this ability on others it should be used carefully and wisely. Challenging beliefs must always be done with care and compassion.

Should I believe this?

Update on Sunday, June 27, 2010 at 01:43PM by Skeptic

If there is an evolutionary and neurological basis for beliefs persisting despite a person's perception of overwhelming contradictory evidence, perhaps it has to do with the left-brain/right-brain dichotomy. In Part 4 of Errol Morris' 5-part series The Anosognosic’s Dilemma: Something’s Wrong but You’ll Never Know What It Is (anosognosia is unawareness, self-deception, or denial regarding one's own disability, such as paralysis or ignorance), Morris reports on his interview of Dr. V. S. Ramachandran, an expert in anosognosia and author of Phantoms in the Brain: Probing the Mysteries of the Human Mind:

Ramachandran has used the notion of layered belief — the idea that some part of the brain can believe something and some other part of the brain can believe the opposite (or deny that belief) — to help explain anosognosia. In a 1996 paper [54], he speculated that the left and right hemispheres react differently when they are confronted with unexpected information. The left brain seeks to maintain continuity of belief, using denial, rationalization, confabulation and other tricks to keep one’s mental model of the world intact; the right brain, the “anomaly detector” or “devil’s advocate,” picks up on inconsistencies and challenges the left brain’s model in turn. When the right brain’s ability to detect anomalies and challenge the left is somehow damaged or lost (e.g., from a stroke), anosognosia results.

In Ramachandran’s account, then, we are treated to the spectacle of different parts of the brain — perhaps even different selves — arguing with one another.

We are overshadowed by a nimbus of ideas. There is our physical reality and then there is our conception of ourselves, our conception of self — one that is as powerful as, perhaps even more powerful than, the physical reality we inhabit. A version of self that can survive even the greatest bodily tragedies. We are creatures of our beliefs. This is at the heart of Ramachandran’s ideas about anosognosia — that the preservation of our fantasy selves demands that we often must deny our physical reality. Self-deception is not enough. Something stronger is needed. Confabulation triumphs over organic disease. The hemiplegic’s anosognosia is a stark example, but we all engage in the same basic process. But what are we to make of this? Is the glass half-full or half-empty? For Dunning, anosognosia masks our incompetence; for Ramachandran, it makes existence palatable, perhaps even possible.

I don't see how Ramachandran's left-brain/right-brain conjecture fits with the conventional assignment of characteristics to each hemisphere, but that doesn't invalidate the idea that different parts of our brains tend to generate different perceptions in different ways, and that somehow we pick one and reject the others. It seems plausible that different people would tend to give different weights to these conflicting signals.

This seems like a good place to quote the line attributed to Josh Billings (and others), which was the subtitle of this blog for its first two-plus years: "The trouble with people is not that they don't know but that they know so much that ain't so."
