Why we acquire beliefs and refuse to change them
Friday, August 28, 2009 at 01:34PM
Skeptic in Epistemology, Favorites, Politics

John Maynard Keynes concluded his magisterial General Theory of Employment, Interest and Money (1936) with these observations about the power and persistence of beliefs:

[T]he ideas of economists and political philosophers, both when they are right and when they are wrong, are more powerful than is commonly understood. Indeed the world is ruled by little else. Practical men, who believe themselves to be quite exempt from any intellectual influences, are usually the slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back. I am sure that the power of vested interests is vastly exaggerated compared with the gradual encroachment of ideas. Not, indeed, immediately, but after a certain interval; for in the field of economic and political philosophy there are not many who are influenced by new theories after they are twenty-five or thirty years of age, so that the ideas which civil servants and politicians and even agitators apply to current events are not likely to be the newest. But, soon or late, it is ideas, not vested interests, which are dangerous for good or evil.

Although Keynes did not expect those old enough to hold political power to change their beliefs, he expressed optimism that subsequent generations can and will adopt better ones. In contrast, the quotation from Josh Billings in the subtitle of this blog reflects frustration at how slowly new ideas take hold because old dogs cannot learn new tricks. To a large extent, the purpose of this blog has been to confront prevailing beliefs with contrary facts to see what, if anything, would happen. In this post, I speculate a bit on why beliefs are so persistent. [UPDATE 10/15/2010: The Josh Billings quotation, which has been replaced, was "The trouble with people is not that they don't know but that they know so much that ain't so."]

By "belief" I mean any idea we are reluctant to change or re-examine. Such a belief may have been acquired via a rigorous analytical process based on measurements, by received wisdom or folklore, by experience, or in other ways. We may have forgotten the details of how and why we acquired the belief and be unable to explain it to others. A belief may have deep emotional importance such as religious dogma or be devoid of emotional content such as a belief that there are 50 States in the USA or that 8 x 9 = 72. A belief may be true or false, but either way we resist consideration of contrary proofs because we "know" we are right.

Perhaps the tendency toward unchanging beliefs has evolutionary advantages. For example:

Aids Learning. Our minds seem to discern, create, and use patterns to see, hear, think, etc., and things that don't fit a familiar pattern are likely to be rejected or distorted.  Random facts like nonsense syllables are more difficult to learn and recall than facts that can be anchored to an existing body of knowledge.  If a new idea contradicts the existing knowledge, one may have to do a great deal of mental work to reorganize and/or prune old ideas in order to incorporate the new idea.

Constitutes Intellectual Capital. To the extent our beliefs derive from education and experience, they constitute part of our "intellectual capital," which increases our efficiency. We can make more decisions faster if we can plug a few facts into a template instead of having to start anew to solve a problem as though we'd never seen anything like it before. In large part, this is the difference between an experienced lawyer (or other professional) and a newbie.  A smart newbie given enough time can perhaps solve any problem that is presented, but the valuable professional is one who already has solved 80% of the problem by having worked through similar situations in the past. 

Minimizes the Discomfort of Uncertainty. Perhaps it is stressful to feel confused and in a strange and meaningless situation with no patterns that seem familiar.  Perhaps such stress aids survival by keeping us away from situations that may be dangerous because we don't know how to handle them. If gaining knowledge to eliminate that stressful feeling is a deep psychological need, perhaps people would feel better about having an objectively false or foolish belief than about saying to themselves, "I don't know."  If so, when a belief is challenged, there is a threat to remove a "security blanket." 

Provides Tribal Glue. There is practical value in believing, or pretending to believe, what people important to us believe, even if such beliefs are entirely false and utterly stupid. Churches are typically upset by heretics in their midst even if they are very tolerant of disbelievers outside the church. Our public beliefs are part of our social identity, and it doesn't matter whether the beliefs are about "facts," about theory, or about something unverifiable. It may well be difficult to trust or rely on a tribe member whose behavior we cannot predict. Perhaps this social cohesion function explains the odd finding that scientists as a group are very unlikely to believe in God. "Most polls show that about 90% of the general public believes in a personal God; yet 93% of the members of the National Academy of Sciences do not."—Sam Harris here. That doesn't mean scientists function without beliefs. They just have a different set of beliefs, including beliefs that facts observed in the physical world are sacrosanct, that the world is orderly and (eventually) understandable by mortals, and that there are no miracles or interventions by a Supreme Being.

Improves Social Status. Belief may be better than actual knowledge in a contest for leadership of the pack. It's hard to be the leader when you admit you need to study the issue while a rival confidently expresses a belief. A blog commenter, writing about why very smart economists often publish dumb economic analysis and advice, said he was told by a "smirking" professor that "it's better to be wrong than to be irrelevant."

Reinforces Authority. Beliefs may be useful for maintaining hierarchical or social control and discipline. One whose conduct has been inconsistent with orthodox beliefs is not permitted to challenge the correctness of the beliefs. He may argue that his conduct was in fact consistent with the beliefs, or he may reaffirm the beliefs and explain why the lapse won't happen again. On such occasions, Homer tells us, miscreant Greeks blamed "ate," translated as blindness or confusion temporarily thrust upon them by the gods.

Update on Monday, August 31, 2009 at 12:25PM by Registered Commenter Skeptic

 An email this morning asks why I lump "facts" with "beliefs." My response:

I was looking at belief subjectively, from inside the head of the believer. From that perspective, I doubt there is any difference between beliefs that are objectively true and those that are objectively false, because I doubt we choose to “believe” things we recognize as false. I expect that, neurologically, we process true ideas identically to false ideas we erroneously believe to be true. Of course, Realitybase concerns itself with the social problems created by false beliefs, but I can imagine that true beliefs can cause problems also, at least for individuals. For example, if you believe the US has 50 States, you may have difficulty fitting into a cult where everybody believes there are only 48 or 13.

Update on Saturday, September 19, 2009 at 04:00PM by Registered Commenter Skeptic

Joel Stein believes in vaccinating babies. His wife wasn't so sure. So they attended a seminar by a doctor who gave all the reasons not to vaccinate. They left at the break, and their infant got vaccinated. Joel explains:

. . . . I'm pretty confident in the way I get my knowledge. Even in the age of Google and Wikipedia, we still receive almost all of our information through our peers. I believe in evolution not because I've read Darwin but because everyone I know thinks it's true. When presented with doubts, I don't search for detailed information from my side. I go with the consensus of mainstream media, academia and the government. Not because they're always right but because they're right far more often than not, and I have a TiVo to watch. Also, unlike antivaccination people, they usually shut up after a little while.

That sounds like belief as tribal glue.

Update on Saturday, October 24, 2009 at 08:24AM by Registered Commenter Skeptic

The following, about the tribal glue aspect of belief, is copied from an earlier post on the problems created by how economics is taught. Is who you are defined by what you believe?

There are millions of citizens and public officials who graduated from college believing that what they were taught in two semesters of economics was "true." That's how the textbooks are written, and teaching the current standard model may be the easiest way to cover a lot of material. For the rest of their lives, most of these graduates will start their thinking about every economic issue from these same core beliefs--including beliefs that professional consensus later finds objectively false or rarely relevant in the real world. Pete Peterson is an egregious but typical example.

In a little while, the standard model in economics texts will be revised, and some different set of "truths" will be instilled into the next generation of collegians. Whatever they are, these new truths will eventually be spectacularly contradicted by the real world and will be replaced, leaving millions of angry and distrusting believers unable to adjust. So updating the standard model does not get at the fundamental problem that economics is often taught, and even more often received, as a secular religion. Professors, you're more responsible for this than your alumni.

Perhaps if the first (for most students, the only) year of economics were taught as history of economic thought and political philosophy, professors would portray less certainty and, thus, do less long-lasting damage. Or teach the course by the case method in which students are exposed to a full range of conflicting arguments and materials and come to the realization that we don't know for sure why what happened happened, that subtle differences in the facts might have led to different outcomes, and that there is a lot more uncertainty in the real world than in the virtual reality models of economic "science." Perhaps graduates of such programs would be more apt to analyze unique real world problems with the tools of economics rather than to thoughtlessly apply the same "solution" to every problem.

Update on Saturday, February 6, 2010 at 07:48AM by Registered Commenter Skeptic

John Quiggin asks why some economists are still talking about The Great Moderation in the present tense, and gives his answer:

The answer can be sought in the internal dynamics of the economics profession. The Great Moderation vanished in 2008 and 2009, but the academic industry built to analyze it did not. Research projects based on explaining, measuring and projecting the Great Moderation, were not abandoned, and the careers based on those projects could not be diverted quickly into other ends.

Perhaps this is an example of beliefs as intellectual capital. Objectively, the investment one has made has become worthless, or worth less, and needs to be written down. A graduate student and his professor are confronted by the need to write off several years of hard work and just can't bring themselves to do it. Probably, there are tribal glue and hierarchical maintenance aspects to this too--if the professor were to declare the field dead and move on, that abandonment would upset many social relations, not least with his Ph.D. candidates.

Update on Thursday, February 18, 2010 at 09:15AM by Registered Commenter Skeptic

Macroeconomic Resilience discusses here the more general question of why we believe things in the face of contrary evidence, in a post about how it was advantageous for banksters engaged in extremely risky trading to convince themselves that their strategies were safe, because a true believer is better able to convince others. The post suggests two reasons, which are similar but not identical to some of my six: First, being a true believer makes you more persuasive, and being persuasive makes you more successful. Second, ignoring contrary evidence avoids the expenditure of time and energy that would be necessary to revise your beliefs, and it avoids the "directionless indecision" of trying to live and act while aware that you don't know something important.

There is another question which although not necessary for the above analysis to hold is still intriguing: How and why do people transform into true believers? Of course we can assume a purely selective environment where a small population of true believers merely outcompete the rest. But we can do better. There is ample evidence from many fields of study that we tend to cling onto our beliefs even in the face of contradictory pieces of information. Only after the anomalous information crosses a significant threshold do we revise our beliefs. For a neurological explanation of this phenomenon, the aforementioned paper by V.S. Ramachandran analyses how and why patients with right hemisphere strokes vehemently deny their paralysis with the aid of numerous self-deceiving defence mechanisms.

Jeffrey Friedman’s analysis of how Cioffi and Tannin clung to their beliefs in the face of mounting evidence to the contrary until the “threshold” was cleared and they finally threw in the towel is a perfect example of this phenomenon. In Ramachandran’s words, “At any given moment in our waking lives, our brains are flooded with a bewildering variety of sensory inputs, all of which have to be incorporated into a coherent perspective based on what stored memories already tell us is true about ourselves and the world. In order to act, the brain must have some way of selecting from this superabundance of detail and ordering it into a consistent ‘belief system’, a story that makes sense of the available evidence. When something doesn’t quite fit the script, however, you very rarely tear up the entire story and start from scratch. What you do, instead, is to deny or confabulate in order to make the information fit the big picture. Far from being maladaptive, such everyday defense mechanisms keep the brain from being hounded into directionless indecision by the ‘combinational explosion’ of possible stories that might be written from the material available to the senses.” However, once a threshold is passed, the brain finds a way to revise the model completely. Ramachandran’s analysis also provides a neurological explanation for Thomas Kuhn’s phases of science where the “normal” period is overturned once anomalies accumulate beyond a threshold. It also provides further backing for the thesis that we follow simple rules and heuristics in the face of significant uncertainty which I discussed here.
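
To make the threshold idea concrete, here is a toy simulation of my own devising--not Ramachandran's neurology or Kuhn's formalism, and not anything from the Macroeconomic Resilience post--in which an agent explains away each contradictory observation until the accumulated anomalies cross a threshold, and only then replaces its belief wholesale:

    # Hypothetical toy model of threshold-triggered belief revision.
    # Anomalies are explained away until their accumulated weight crosses
    # a threshold; then the whole "story" is replaced at once.

    def observe(belief, evidence, anomaly, threshold=5.0):
        """Return the (possibly revised) belief and the updated anomaly score."""
        if evidence == belief:
            return belief, max(0.0, anomaly - 1.0)  # confirmation lets the score decay
        if anomaly + 1.0 < threshold:
            return belief, anomaly + 1.0            # deny/confabulate; keep the story
        return evidence, 0.0                        # threshold crossed: revise wholesale

    belief, anomaly = "our strategy is safe", 0.0
    stream = ["our strategy is safe"] * 3 + ["losses are mounting"] * 7
    for evidence in stream:
        belief, anomaly = observe(belief, evidence, anomaly)
        print(f"saw {evidence!r:22} -> belief={belief!r}, anomaly={anomaly}")

Run on a stream of mostly contradictory evidence, the agent denies the first few anomalies and then flips all at once, which is at least the qualitative pattern Ramachandran and Kuhn describe.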

Update on Monday, March 1, 2010 at 01:17PM by Registered Commenter Skeptic

Lawrence M. Krauss discusses the function of belief for theoretical physicists in Hiding in the Mirror, Viking Press (2005). After describing the development of theoretical models of the universe that attempt to tie all the known forces of nature (especially gravity) and quantum theory together into one model that explains everything, Krauss says none of these conjectures has been proven experimentally but that the efforts have nevertheless been worthwhile.

It does a disservice to the most remarkable century in the history of human intellectual investigation to diminish the profound theoretical and experimental discoveries we have made in favor of what is at the present time essentially well-motivated, educated speculation.  It is also simply disingenuous to claim that there is any definitive evidence that any of the ideas associated with string theory yet bear a clear connection to reality, or that they will even survive in their present form for very much longer.  Perhaps more to the point, the deeper we probe these theories, the hazier they seem to have become.

At 247. Krauss goes on, at 247-49, to discuss the role of Edward Witten, who has been the leading force driving string theory since the mid-1980s.

Ed may be a “true believer” in string theory, but that simply reflects the very nature of his position on the theoretical forefront.  It is, as I have stressed, very difficult to devote the incredible intellectual energy and focus that are required over long periods of time in the attempt to unravel the hidden realities of nature if one does not have great personal conviction that one has a good chance of being on the right track. . . .

The same level of personal conviction is required of artists and writers, as well.  But what makes science somewhat different, I believe, is that great scientists are prepared to follow an idea for as long as decades, but at the same time are equally prepared to dispense with all of this effort in a New York minute if a better idea or a contradictory experimental result comes along. 

Two comments: First, the “belief” discussed by Krauss and attributed to Witten is probably not a belief that the various conjectures are actually and literally true but perhaps only that they are promising avenues for investigation and that they may very well turn out eventually to have an important element of objective, proven truth in them. However, as Krauss points out at 249, when you talk to some theoretical physicists, their enthusiasm may well sound like belief in “truth.” Second, if Thomas Kuhn is right in The Structure of Scientific Revolutions (1962), then it takes a great deal more than a “New York minute” to dispense with all the work of one’s career. Doing that goes against all six of the reasons for belief outlined in the initial post: Learning the new paradigm is difficult because it doesn’t accrete to the existing body of knowledge. One has to “write down” intellectual capital. One must live with the stress of uncertainty--at least until one adopts as certain a replacement belief. One tends to lose his/her tribal affiliations (unless the whole tribe migrates together from old belief to new belief). And one is likely to suffer diminution of social status and authority because others will have greater expertise in the new paradigm and will be perceived as having been "right" instead of "wrong."

Update on Monday, July 12, 2010 at 12:33PM by Registered Commenter Skeptic

John F. Kennedy said that believing we know makes us comfortable.

Update on Sunday, February 20, 2011 at 09:39PM by Registered Commenter Skeptic

David Warsh suggests many beliefs are shibboleths, verbal means of protecting tribal integrity. His example: the belief that the Smoot-Hawley tariff act caused not only the Great Depression but also its precipitating event, the stock market crash that occurred 9 months before enactment. This loyalty test didn't really exist until WSJ editorial writer Jude Wanniski invented it in 1978. It has been thoroughly refuted by students of the facts and disowned even by Milton Friedman, but it is still a shibboleth of conservative politicians.

Update on Saturday, December 28, 2013 at 11:20AM by Registered Commenter Skeptic

Mathbabe has an interesting post on how the opinions of others affect our own opinions and conviction levels. How do opinions and convictions propagate? She discusses a scientific paper reporting the results of experiments on a small sample of people, concerning opinions not likely to be laden with strong emotions. It appears people tend toward consensus and gain increasing conviction from either a small group with strong convictions or a large group of people weakly holding the same opinion.

The next result they found was that the dynamical system that was the opinion making soup had two kinds of attractors. Namely, small groups of “experts,” defined as people with very strong convictions but who were not necessarily correct (think Larry Summers), and large groups of people with low convictions but who all happen to agree with each other.

Mathbabe explores the implications and wonders, for example, how these findings might be relevant to political opinions.

I'm not sure how this report fits into the taxonomy of my original post. Tribalism?  Avoiding uncertainty?  Something else?
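
For what it's worth, the attractor behavior is easy to mimic with a crude sketch of my own devising--this is not the model from the paper Mathbabe discusses--in which each agent moves toward a conviction-weighted average of all opinions, with high-conviction agents weighted heavily as sources but barely moving themselves:

    # Hypothetical sketch of conviction-weighted opinion averaging, assuming
    # opinions are scalars and conviction is a weight in [0, 1].
    import random

    def step(opinions, convictions, lr=0.5):
        """Move each agent toward the conviction-weighted mean opinion.
        Conviction amplifies an agent's pull on others and damps how far
        the agent itself moves."""
        total = sum(convictions)
        mean = sum(o * c for o, c in zip(opinions, convictions)) / total
        return [o + lr * (1.0 - c) * (mean - o)
                for o, c in zip(opinions, convictions)]

    random.seed(0)
    # Three "experts" certain the answer is 1.0; 97 agents weakly holding ~0.0.
    opinions = [1.0] * 3 + [random.gauss(0.0, 0.1) for _ in range(97)]
    convictions = [0.95] * 3 + [0.1] * 97

    for _ in range(50):
        opinions = step(opinions, convictions)

    crowd = opinions[3:]
    print("crowd mean after 50 rounds:", round(sum(crowd) / len(crowd), 2))

In this toy version, three nearly immovable "experts" drag 97 weakly convinced agents most of the way to their position, which matches at least the flavor of the small-strong-group attractor. A large group that already agrees pulls a lone dissenter the same way, since its collective weight dominates the mean.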

Update on Monday, January 13, 2014 at 07:21AM by Registered Commenter Skeptic

Noah Smith writes about shared falsehoods as tribal glue and speculates about the origins of that tendency. Before technology, nobody had a superior ability to observe reality and there was no reason to doubt the reports of a tribe member with no apparent motive to dissemble. Then we began to have people with superior observational powers enhanced by technology.

At first, Western society reacted the way any sensible low-tech society would - by punishing the people who claimed to have some special knowledge. But as it became clear that This Time Really Was Different, Western civilization grew to embrace the iconoclast. The lone inventor, the brave whistleblower, the brilliant scientist, the skeptic - these are some of our greatest heroes. "Think different", our billboards tell us. "If all your friends jumped off a bridge, would you jump off a bridge too?" is a question we ask our kids, and we teach them to say "No!" (even though if you think about it, the sensible answer is probably "Yes").

. . . .

What Western civilization had done is to discover a kind of reality that was beyond Tribal Reality, and was beyond even the more general phenomenon of Consensus Reality. They discovered Extant Reality, the reality that hits you in the head even if you and everyone you know fails to believe in it. Extant Reality is a pitiless, cold, frightening thing. It's a monster from an H.P. Lovecraft story. It is not something you would choose to have exist, but that's the point - you don't get to choose. Western civilization, by being the first to get past that technological critical point, was the first to be forced to reconcile itself to the inability of Tribal Reality to stave off Extant Reality. So the West was the first to begin the hard, painstaking, uncertain task of bending Extant Reality to the will of humankind.

Noah goes on to discuss the Wisdom of Crowds and the Efficient Market Hypothesis.
