Christian Belief Through the Lens of Cognitive Science: Part 5.5 of 6

By Valerie Tarico

How Beliefs Resist Change

The Jesuits have a saying sometimes attributed to Francis Xavier: “Give me the child until he is seven, and I will give you the man.” The Jesuits were a tad optimistic, but ample research on identity formation shows that religious, cultural, and political identities become established by early adulthood and rarely change thereafter except in response to crisis. In fact, even in the face of crisis, core beliefs about who we are and why we are here can be remarkably resilient.

This is due in part to the fact that individual beliefs do not exist in isolation. Rather, each exists as part of a whole network of other beliefs, memories, and attitudes. The more central or important a given belief is, the more it is entangled with the rest of our world view. And the more it is tied into the tangle, the harder it is to change. Because religious views are so central, they are particularly resistant to change.

To make things even more complicated, each religion has what can be called an immune system. Because traditional Christianity is centered on orthodoxy, meaning right belief, the immune system consists of a set of teachings that guard against other beliefs or loss of belief. Christianity’s immune system includes the following teachings:

· Doubt is a sign of weakness or temptation by Satan, the father of lies.
· False teachers (those whose theology differs) should be cast out.
· Believers should not be unequally yoked (partnered) with nonbelievers.
· Nonbelievers have no basis for morality, so their motives are suspect.
· If Christians act badly, the flaw is in the persons, not the religion.

Given that core beliefs are naturally resilient and given the power of messages such as these, it will come as no surprise that people go to extreme lengths psychologically to defend religious dogmas.

Cognitive dissonance theory helps us to understand what happens when people are confronted with contradictory beliefs. If, for example, I believe the world is fair (called the Just World Hypothesis), but a kind, generous neighbor gets assaulted and hurt, I am faced with a contradiction. I can revise my view of the world (it isn’t so fair), the neighbor (she isn’t so good), or the harm done (it wasn’t so bad). Surprisingly often, people resolve such contradictions in favor of a treasured belief rather than in favor of the evidence – even if this requires blaming victims for their own suffering or coming up with elaborate justifications for catastrophes. When the catastrophe is the apparent failure of a prophecy or the moral failure of a religious leader, such justifications can be spectacular.

In Doubting Jesus' Resurrection: What Happened In The Black Box?, Kris Komarnitsky offers a nice overview of cognitive dissonance concepts followed by a series of jaw-dropping stories from history – each showing the extreme contradictions believers can accommodate. Small apocalyptic cults suffer the devastating failure of end-of-the-world prophecies, and yet each, faced with crushing disappointment, finds some interpretation that leaves the cult belief system intact. In this light, Komarnitsky examines the pressures faced by Jesus' followers when his triumphal entry into Jerusalem was followed by torture and death.

A small, close-knit cult adjusting to the disappointment of another ordinary sunrise is just an extraordinary example of something ordinary – the human tendency toward confirmatory thinking. All of us are biased to seek information that fits what we already believe. Confirmatory evidence jumps out at us, and we find it emotionally appealing. It’s as if our minds set up filters – contradictory evidence stuck in gray tones on the outside, confirmatory evidence flowing through in bright and shining color.

Unfortunately, confirmatory thinking causes all kinds of problems. Corporate leaders fall into groupthink about the best competitive strategy. Jurors assume an accused criminal is guilty. Politicians fabricate reasons for war – sure that the real evidence must be there somewhere. Confirmation bias is so built into human thinking that the whole scientific endeavor is structured essentially to get around it. The scientific method has been called “what we know about how not to fool ourselves.” And yet, as we know, even scientists end up embarrassing themselves from time to time by getting a little too eager to confirm their pet theories and forgetting how easy it is to fall prey to our own filters.

Even outside our personal information filters stands a further ring of defenses: our communities. Who forwards you email? What magazines do you subscribe to? What shows do you watch? Because confirmation is so satisfying and contradiction is so uncomfortable, we surround ourselves with friends, colleagues, and coreligionists who think like us. Often, we join groups that do the filtering for us: Democracy for America, The Nature Conservancy, Assemblies of God, the National Rifle Association. These groups provide a steady flow of information confirming and elaborating what we think we know – and ensuring that a lot of contradictory information never makes it anywhere near our brains. They let us take a shortcut: instead of weighing the quality of arguments and evidence, we look at the source and either raise or lower a drawbridge.

In an even more impervious form of this filtering, we form a group identity: I’m a Catholic. I’m a Republican. I’m an American. I’m a woman. I’m Hispanic. I’m a Calvinist. Each of these identities creates what I call a tribal information boundary (TIB). TIBs are remarkable efficiency devices, allowing us to weave coherent story lines about the world around us. But for someone seeking to understand complicated realities, they can be tremendously costly.

When we actually allow ourselves to bump up against the limitations of our world view, when we acknowledge we’ve hit a wall and then find a way over or around it – that is when growth is most likely to occur. In the 1998 comedy “The Truman Show,” the protagonist, played by Jim Carrey, pushes past an information boundary and realizes he is living in the artificial world of a television set. From childhood, Truman has accepted the explanations and roles offered to him. But he is confronted with small discrepancies, and one day he ignores his own fears and the barriers his community has erected, and punches through to the world outside. The movie’s message to us all: It is possible.


