How Cognitive Dissonance runs your life! (pt 2A)
Last week, I spoke about what Cognitive Dissonance is. Basically, the theory breaks down to this: when we hold two opposing thoughts about something, our brain automatically tries to justify our decisions. This is true even when we logically know we're doing something we shouldn't be doing. Eventually that "justification" becomes so self-evident, so automatic for us, that we start to trust and believe the justification more than we trust sound logic.
Today's post will talk about just how pervasive cognitive dissonance is in our everyday lives, and how small justifications typically build into much larger ones.
Cognitive Dissonance is a psychological theory that touches every single part of our lives, yet it remains one of our blind spots. It starts out innocently and has the potential to work wonders in our lives or to end in disaster.
A woman and a bad relationship
Let’s look at an example of how Cognitive Dissonance starts. Have you ever seen a relationship where the guy treats a woman like shit, yet the woman keeps going back to him? This is happening while you’re on the outside looking in asking yourself, “Why the hell is she still with him?!?”
This theory helps explain how these situations occur. When a woman stays in a dysfunctional relationship, the relationship usually didn't start that way. It started like any other relationship: boy meets girl, girl likes boy, and they start hanging out.
Over time, something small happens that puts the woman in a moral bind. It's usually not a large grievance; it's something so small and seemingly imperceptible that the woman justifies it away in a moment. Yet that small instance is something she normally wouldn't tolerate. She justifies it away with something like, "Oh, he was tired," or, "He's really nice when he doesn't drink."
The Snowball Effect
And that’s how it all starts. With that small justification comes the pyramid of choice.
What's the pyramid of choice? It's a metaphor for how, when faced with a morally ambiguous, gray situation, we stand at the top of the pyramid, able to see both the positive and negative aspects of a choice. Once a choice is made, we start to justify what were once moral dilemmas into reasons that let us believe we are backed by fundamentally sound logic. You stand at the top of this pyramid any time you embark on something new and can see the benefits both of taking the action and of not taking it.
For example, a study in the late 1950s by social psychologist Judson Mills showed how cognitive dissonance starts. In the study, he first measured students' attitudes toward cheating. He then gave them a test, with prizes as rewards. The problem for the students was that he had rigged the test, making it impossible to "win" without cheating. He also left the room, making it easier for them to cheat (though he was secretly watching to see who did). The results showed that half the students cheated. Nothing earth-shattering so far. What was surprising, though, is that the students who had cheated came to see cheating as a less serious offense than they originally did (they became more lenient toward it), while those who didn't cheat came to see it as a more serious offense than they originally did (they adopted a harsher attitude toward cheating).
Take this example a bit further and you can see how an instance like this starts us snowballing down the pyramid of choice. The students began with an ambiguous decision. They could see both the positives of cheating (the prizes) and the negatives (cheating isn't "right"). Literally by the next day, however, whatever action they took in that one ambiguous situation had already started to become "what they believe." And if you've been reading this blog, you know that "what you believe" is what you do. Period.
If you were to extend this study by taking one of the students who cheated and having him take a test the following week that he didn't study for, he would be much more apt to cheat. If he gets away with cheating again and gets a good grade, he falls further down the pyramid. Each action that was once morally ambiguous becomes easier and easier to justify. Beyond that, the decision goes from being morally ambiguous to something he feels entitled to do. If, after a couple of years of cheating and getting away with it, a teacher catches him, he will not only feel justified in cheating, he will actually be upset at the teacher for "being so strict." If she tries explaining that "cheating is wrong," the words will fall on deaf ears.
Beyond that, he is more likely to dislike the teacher and tune out anything she says, because in his mind, "She's such a witch." (Yes, I was going to use a different word.)
In the book Mistakes Were Made (But Not by Me), the authors sum up the pyramid of choice perfectly:
“Often, standing at the top of the pyramid, we are faced not with a black-and-white, go/no-go decision, but with a gray choice whose consequences are shrouded. The first steps along the path are morally ambiguous, and the right decision is not always clear. We make an early, apparently inconsequential decision, and then we justify it to reduce the ambiguity of the choice. This starts a process of entrapment – action, justification, further action – that increases our intensity and commitment, and may end up taking us far from our original intentions or principles.” (pgs. 33-34)
Once a decision is made, be prepared to defend it
“Once we make a decision, we have all kinds of tools at our disposal to defend it.” (pg. 21)
Once we've left the top of the pyramid of choice and the snowball effect starts to take place, how do we go about justifying our behavior, not only to others but primarily to ourselves? One of the main reasons we're blind to our own errors and hypocrisy is confirmation bias. Confirmation bias basically means that we favor information that aligns with our beliefs, whether that information is true or not. Confirmation bias also helps explain why we get stuck in our beliefs: we ignore conflicting information, or we dismiss it as "stupid."
This is a major problem in modern-day politics. In Newsweek's article, How Dumb Are We?, the author lists stats about what Americans want versus what they believe is standing in the way of what they want.
“The current conflict over government spending illustrates the new dangers of ignorance. Every economist knows how to deal with the debt: cost-saving reforms to big-ticket entitlement programs; cuts to our bloated defense budget; and (if growth remains slow) tax reforms designed to refill our depleted revenue coffers. But poll after poll shows that voters have no clue what the budget actually looks like. A 2010 World Public Opinion survey found that Americans want to tackle deficits by cutting foreign aid from what they believe is the current level (27 percent of the budget) to a more prudent 13 percent. The real number is under 1 percent. A Jan. 25 CNN poll, meanwhile, discovered that even though 71 percent of voters want smaller government, vast majorities oppose cuts to Medicare (81 percent), Social Security (78 percent), and Medicaid (70 percent). Instead, they prefer to slash waste—a category that, in their fantasy world, seems to include 50 percent of spending, according to a 2009 Gallup poll.
“Needless to say, it’s impossible to balance the budget by listening to these people. But politicians pander to them anyway, and even encourage their misapprehensions.”
The article goes on to say there's hope: when people were taught how the budget truly works, they were willing to change their opinions. And although that's somewhat hopeful, the real question is how you get people to listen in the first place. The thing is, you don't, because they're not listening.
How do I know? Reality. The reality is that these are the statistics. People are this ignorant not because the information isn't out there and easily accessible, but because they feel overwhelmed by it, with little to no hope of actually being able to change the situation even if they did obtain it. So what do they do? They confirm those beliefs by not looking for the information. They get lost in reality TV shows like Jersey Shore and stop listening to politicians altogether. This is confirmation bias in action.
Tomorrow I'll be back with part 2B: Our Self-Concept. I'll talk about how our self-concept is formed and how Cognitive Dissonance plays a part in it.