Here is an essay about one way our minds work, apparently one of a series on the subject. Might be valuable.
--Kim
Okay - before I get into this long-ass essay from the Critical Thinking series - let's address the elephant in the room. After rereading the book this essay is based on and writing most of the following words, I went back and did some light research on the authors, only to realize that Carol Tavris has been actively involved in anti-transgender activism, publishing pieces in Skeptic magazine with transphobic talking points and being associated with gender-critical groups.
This book came out in 2007 and I read it around 2010 - long before the 81-year-old Tavris wrote her first anti-trans screed.
I could find another book that says some of the same stuff - but let's be real - that's way too much work for an essay I'll probably make 5 bucks on. So, I'm not going to redo all the work I've already done - I am just going to focus on the co-author, Elliot Aronson, the harshest critique of whom seems to be that he is too left-leaning. And from now on, I guess I have to look up the authors before I start one of these.
Lesson learned - moving on.
This is Essay #7 - you can find 1-6 by clicking on the hashtag at the bottom of this post. Okay, let's go!
In the last essay, we talked about confirmation bias. How your brain filters information to show you only the parts that confirm what you already believe. How you accept evidence that supports your position while scrutinizing evidence that contradicts it.
But here's the question that probably occurred to you while reading that: why? Why is your brain so desperate to prove you were right all along?
The answer is in a book called "Mistakes Were Made (But Not by Me)" by Carol Tavris and Elliot Aronson. And what they reveal is way more unsettling than confirmation bias alone.
The mechanism is called cognitive dissonance.
You've heard the term - now let's really dig into what it means.
Here's the basic setup. When you hold two contradictory beliefs, or when your actions conflict with your self-image, your brain experiences psychological discomfort. Not metaphorical discomfort. Actual, measurable distress. Your mind cannot tolerate the contradiction.
So what does it do?
It resolves the dissonance by changing one of the beliefs. And here's the key part: it almost never changes your self-image or your core beliefs. Instead, it changes how you interpret your actions or the facts.
Aronson calls this self-justification, and it's the reason people can commit terrible acts while maintaining their identity as good people. It's the reason intelligent people believe absurd things. It's the reason someone can be objectively wrong about something and become more confident in their wrongness over time.
Let's break down how this works.
Say you think of yourself as an honest person. That's part of your self-image. Core to your identity. Then you do something dishonest. Maybe you lie on your taxes. Maybe you cheat on a test. Maybe you mislead someone about something important.
Your brain now has a problem. You believe you're honest. You just acted dishonestly. These two facts cannot coexist comfortably.
Cognitive dissonance kicks in. That psychological discomfort starts ramping up. Your brain needs to resolve this contradiction immediately.
So, you have two options. You can change your self-image and admit you're someone who does dishonest things. Or you can reinterpret your dishonest action so it doesn't really count as dishonest.
Guess which one your brain picks. Every single time.
"Everyone cheats on their taxes. I'm actually paying more than most people." "The test was unfair anyway. The professor was terrible. I deserved that grade." "I was protecting their feelings. Telling them the truth would have been cruel."
Watch what just happened. You didn't change your behavior. You didn't admit you were wrong. You changed the story so you're still the good guy.
Aronson explains that this process happens automatically. Your conscious mind doesn't sit there weighing options. System 1 jumps in, rewrites the narrative, and delivers you a version of events where your self-image remains intact. By the time System 2 gets involved, the story is already coherent. You genuinely believe your own justifications.
This is where it gets dangerous.
Once you've justified something once, the next time is easier.
You've already established the mental pathway. Already created the narrative framework. The lies you tell yourself become smoother, more automatic, more convincing. Aronson calls this the pyramid of choice.
Imagine two people starting at the top of a pyramid, close together, both facing a decision. One person makes choice A, the other makes choice B. At first, they're barely different. But as each person justifies their choice, as cognitive dissonance forces them to defend their decision, they move further down opposite sides of the pyramid.
The person who chose A has to convince themselves A was the right choice. So they focus on A's benefits and B's flaws. The person who chose B does the opposite. Over time, they end up at the bottom of the pyramid, miles apart, each absolutely convinced they were right and the other person is crazy.
They didn't start with fundamentally different values. They started with one small choice and then self-justification did the rest.
Here's where this connects to everything we've discussed so far.
Your lazy System 2 doesn't want to admit it was wrong because that requires work (Essay 2). So it uses confirmation bias to only see evidence supporting your choice (Essay 6). That evidence creates cognitive ease - it feels right (Essay 4). You've been primed by your environment to see your choice as correct (Essay 5). And now cognitive dissonance ensures you'll keep justifying that choice forever, getting more extreme in your position with each justification.
As you can now see - you're not moving toward truth. You're moving down the pyramid, away from everyone who made different choices.
This shows up everywhere in American politics right now.
Someone votes for a candidate. That candidate does something questionable. Cognitive dissonance kicks in. "I'm a good person who makes good decisions. I voted for this person. Therefore, this person must not be that bad."
The alternative - admitting you made a mistake, that you supported someone who's doing damage - creates too much dissonance. So your brain resolves it by justifying the candidate. Then justifying the next questionable thing. Then the next. Each justification moving you further down the pyramid until you're defending things that would have horrified you two years ago.
Aronson documents this pattern in everything from wrongful convictions to medical errors to police brutality. Prosecutors who send innocent people to prison don't admit error even when DNA evidence proves them wrong. They double down. They find new theories. They attack the evidence. Because admitting they destroyed an innocent life would create unbearable cognitive dissonance.
Doctors who misdiagnose patients don't easily admit the mistake. They justify the original diagnosis, find ways to explain away contradictory symptoms, blame the patient for not presenting clearly. Because admitting error means confronting their identity as a competent physician.
This is the mechanism behind the lies Trump supporters tell themselves. They didn't start as people willing to defend obvious falsehoods. They started with a choice - supporting Trump. Each time he did something indefensible, cognitive dissonance forced them to either admit they made a bad choice or justify his behavior. They chose justification. Every single time.
Now they're so far down the pyramid they're defending him stealing classified documents, claiming January 6th was peaceful tourism, insisting that election fraud is real despite zero evidence. Not because they're any dumber than most people - but because they've been justifying, justifying, justifying for years, and each justification made the next one easier.
But here's what's critical to understand. Cognitive dissonance affects everyone. It's not a Trump supporter thing. It's a human brain thing.
If you made different political choices, you're justifying those choices too. You're filtering information to support your decisions. You're moving down your side of the pyramid, getting more entrenched in your position, more certain you're right.
The difference - and this is crucial - is whether the things you're justifying are based in reality or not.
Cognitive dissonance can make you justify a policy choice that turns out to have unintended consequences. That's human. But cognitive dissonance making you justify obvious lies, fabricated evidence, and anti-democratic behavior?
That's where the mechanism becomes dangerous.
The problem isn't that one side has cognitive dissonance and the other doesn't. It's that one side is using cognitive dissonance to justify defending reality, while the other side is using it to justify defending lies.
Aronson emphasizes that self-justification is hardest to overcome in people who think of themselves as objective. The more you believe you're rational, the more you trust your own judgments, the harder it is to admit you might be wrong. Your self-image as a clear thinker becomes part of what you need to protect.
This is why smart people can be so stubbornly wrong. Their intelligence gives them better tools for self-justification. They can construct more elaborate narratives, find more sophisticated reasons why they're actually right, poke more convincing holes in contradictory evidence.
That last paragraph felt like a read.
So what can we do about this?
Aronson suggests that the earlier you catch it, the easier it is to correct. Right at the top of that pyramid, when you first make a choice, that's when you have the most flexibility to admit error.
The more you justify, the harder it gets. Each justification moves you further down the pyramid, further from anyone who chose differently, further from being able to change your mind.
This means you have to catch yourself in the act of justifying.
You have to notice when you're rewriting the narrative to protect your self-image. You have to recognize that feeling of defending yourself, of explaining why you're actually right, of finding reasons why the contradictory evidence doesn't count.
I've been working on this a lot over the past few years. I fully realize that I'm not always right about everything - and so as a way of counteracting the immediate uncomfortable feelings of being told I am wrong - I've leaned into asking myself - could I be wrong?
If yes - then I just go ahead and acknowledge it right away. The truth is - most things that people argue over or disagree on could use more context and thought. Someone pointing out a flaw in your thinking really only helps you in the end. You either learn something you did not know, or you get better at articulating your argument by being questioned.
But what I've learned the most from doing this is what I do and don't care about arguing about.
Here are some other easy rules to follow: If you find yourself moving the goalposts, that's a red flag. If your defense of something requires increasingly elaborate explanations, that's a red flag. If you're dismissing evidence you would have accepted before you made your choice, that's a red flag.
Those are signs you're not defending truth. You're defending your ego.
Cognitive dissonance is going to keep operating whether you like it or not. Your brain needs to resolve contradictions. It needs to maintain your self-image. You can't turn that off.
What you can do is recognize when it's happening. When you catch yourself justifying, pause. Ask whether you're defending your position because it's right or because you need it to be right. Ask whether you'd accept this evidence if it confirmed your beliefs instead of contradicting them.
Most people won't do this. It's too uncomfortable. Admitting error is psychologically painful. Way easier to keep justifying, keep moving down that pyramid, keep getting more extreme in your positions.
But that's how you end up miles away from where you started, defending things you never thought you'd defend, trapped in a web of justifications you built yourself.
The only way out is catching yourself early. Before the justifications stack up. Before you're too far down the pyramid to climb back up.
Because once you've justified something enough times, once you've built your identity around being right about this, the cognitive dissonance required to change your mind becomes overwhelming. You'd rather believe the lie than face what you've become.
Okay - good talk.