Nathanael Garrett Novosel, October 1, 2025

On Confirmation Bias

There is a huge irony regarding confirmation bias: people are subject to it when they are not aware of it, and they are still subject to it when they are aware of it (arguably more so).

There’s a lot of talk about confirmation bias today thanks to GenAI tools being taught to be agreeable (they aren’t as agreeable as people claim, though—they often tell you when you are wrong). However, there is, very ironically, a self-serving nature to all of this talk. It’s very surface-level “want to sound intelligent” commentary. Let’s explore the concept in more depth so you don’t just sound good on a LinkedIn or X post but can truly understand what’s going on to make your life better.

There is a set of truths we will break down in this blog post.


Everyone Thinks They Are Right

The first point to dissect is that everyone thinks that they are right. That might seem like an obvious statement, but there is a profundity to its deeper meaning. Why does everyone think they are right? Because if they thought they were wrong, they would change their beliefs to the correct ones and resume their state of being right. Also, people are more likely to speak up when they believe they are right…or when they know they are wrong and are seeking the right answer so they can become right. Again, this all sounds obvious, but the insight lies in the logic of why confirmation bias exists: people wouldn’t hold beliefs they thought were wrong; therefore, in their minds, they must be right.

The reason this is a seemingly obvious but really amazing insight is that this is why people are slow to change their beliefs. If people hold beliefs because they think they are true, then to change a belief, they would need to both admit they were wrong and accept all of the negative implications of that (they might’ve been doing the wrong thing; they might’ve hurt someone else with that false knowledge; they might’ve been wasting their time before; they might have lost or will lose social standing by being wrong). We’ll get to this point more in a moment, but for now, the point is that people look to prove their point because they both think they’re right and can’t handle the psychological and social impact of being wrong.


People Often Cling to the First Information or Belief They Hold

In psychology, there is something known as the Primacy Effect, where initial information carries disproportionate weight on someone’s thoughts, beliefs, and behaviors. Again, this seems obvious when you think about it, but in reality most people truly believe that they would immediately change their minds given new information that contradicts their beliefs. The truth is that people accept the first information they receive more readily than later information layered on top of existing knowledge because, with nothing else to go on, they don’t have much of a choice. Sure, they can take initial information with a grain of salt, but there’s something about accepting the premise of new information that people often don’t notice.

The most famous example of a false premise sneaking into new information is the question, “Have you stopped beating your wife?” Clearly, the question assumes something that is not necessarily true, but people use this technique because of a psychology concept called Anchoring, where any additional conversation on a topic gets influenced by the first statement. It’s why techniques like lowballing, foot-in-the-door, and putting overpriced accessories next to the high-priced main item all work so well: the person receiving the information has their worldview suddenly reframed by someone else’s strategic positioning and conveying of information. Because people disproportionately weight their first beliefs, they accept the first frame of the situation and judge new information relative to it. It’s why Macy’s for years sold many times more of a product when it was positioned as a $10 item at a 50% discount from a $20 item vs. a $10 item at the normal retail price.


A New Belief Requires Disproportionate Evidence to Overturn Initial Beliefs

Once you hold a belief—even one you didn’t know you formed—its correctness becomes internalized as part of your ego and social status. Admitting it was wrong isn’t just a matter of switching to the correct belief; it means questioning yourself and having others question you. Credibility is a valuable commodity, and the idea that someone has a decent probability of being wrong—by definition—hurts that credibility.

It may seem as simple as analyzing your thought process (famously known as “reviewing your work” for those of you who have ever taken a math class), identifying what led you to the wrong conclusion, and adjusting your knowledge and behavior accordingly. But it often isn’t, for a variety of reasons. First, many people don’t care enough to learn new information. For example, if someone telling a story says that the passing car with the funny bumper sticker was blue and someone else corrects them that it was really blue-green, the storyteller may not care enough about the detail to retain it.

Second, many people don’t bother to analyze where they went wrong because they just move forward with the new information and don’t learn from their mistakes. I recall a time when a colleague was leaving a day later from a conference than I expected, and I looked like my brain broke as I mentally replayed everything that transpired that led to me believing that she was leaving a day earlier (it turns out, another colleague was talking about leaving a day earlier, and the one leaving a day later historically left a day earlier and never spoke up to say it had changed this year). This may seem like a silly thing to care about, but I went through it specifically so that I could identify where my logic went wrong and correct it in the future. Most people would not care about such a mundane detail—and it doesn’t hurt their ego or social status to get something so trivial wrong vs. getting something important wrong (which is why people will downplay being wrong by belittling the thing they were wrong about).

The main takeaway from this is that once a belief is held, new information that contradicts it seems to be wrong—and, if it were right, might change your entire perception of something. Because of the impact of the correction, it takes more effort to change a belief than to form it.
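This “disproportionate evidence” effect can be sketched with a toy Bayesian model. This is a minimal illustrative analogy, not something from the post itself: the `update` function and all of the probabilities below are assumptions chosen purely for demonstration.

```python
# Toy Bayesian sketch of belief updating (illustrative assumption only:
# real belief change is messier than Bayes' rule).

def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a belief after seeing one piece of evidence."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# One observation that is 4x more likely if the belief is false than if true.
undecided = update(0.50, 0.2, 0.8)   # undecided observer: 0.50 -> 0.20
committed = update(0.95, 0.2, 0.8)   # committed believer: 0.95 -> ~0.83

print(round(undecided, 2))  # 0.2
print(round(committed, 2))  # 0.83
```

Even in this idealized model, the same contradictory observation moves the undecided observer by thirty percentage points but the committed believer by only about twelve; starting from 0.95, it takes three such independent observations before the belief even dips below 50%. The evidence required to overturn a held belief is disproportionate to the evidence that formed it.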


People Downplay Their Mistakes and Emphasize Their Correctness…While Emphasizing Others’ Mistakes and Downplaying Those Individuals’ Correctness

Attribution Bias is the concept in psychology describing the fact that people generally attribute good things about themselves to who they are and bad things to circumstances…while attributing others’ bad deeds to who they are and their good outcomes to circumstances. In short, if you succeed on a test, it’s because you were good; if others succeed on a test (especially when you didn’t), it’s because they were lucky. Notably, people with low self-esteem reverse this bias: their success is due to luck, and others’ success is due to who they are.

Attribution bias is possibly the number one reason for hypocrisy (both real and perceived) in this world. After all, if someone else speeds or cuts you off in traffic, it’s because they’re a maniac; if you speed or cut someone off in traffic, it’s because you have a very good reason. That’s attribution bias, and it’s why everyone is a hypocrite to some degree. To reference the Louis CK bit made famous in the subsequent Denis Leary song, no one walks around saying, “I’m an asshole, and so of course I do this while you don’t do this. Because I’m an asshole, and you’re not.” Since no one says this (and, as I mentioned above, everyone thinks they’re right or they wouldn’t hold their beliefs), it’s always that they had a good reason to do something but others don’t.

The scariest part about attribution bias is its role in post hoc rationalization. When people who believe they are good do a bad thing, they immediately try to find an explanation for why a good person would need to do a bad thing (or, worse, for how the bad thing wasn’t really bad). This is where confirmation bias hits: because someone holds one belief, the facts and motivations behind another situation get shaped by that first belief without the person realizing it. That’s what makes confirmation bias so dangerous.

But it works both ways: the same reason a religious person would need a disproportionate amount of evidence to become an atheist is why a skeptic would need a disproportionate amount of evidence to become a believer. In some respects (and skeptics would vehemently disagree, ironically making them more susceptible to confirmation bias), skeptics who identify themselves as skeptics are subject to the same techniques of downplaying anything that refutes their beliefs and emphasizing anything that supports them. Just as you almost never hear a believer say, “There might not be a God, and I’m just living in a fantasy world,” you almost never hear a skeptic say, “There might be a God, and I’m just dismissing all contradictory evidence by assigning it to known causes that my limited brain can understand.” (I’m not suggesting either is correct—only that either party would just as readily downplay anything they couldn’t explain that contradicted their identity and worldview; no one is purely objective.) Just as a believer might say, “That bad thing happened because God is testing me,” a skeptic would say, “God didn’t do that; this person did this, which led to that, which led to the thing that happened. Not God.”


Identity, Ego, and Social Status

This idea of holding beliefs in the face of evidence that they’re wrong is, ironically, something most people point out in others as stupid, thanks to attribution and confirmation bias. After all, it’s easy to point out how others hold stupid beliefs that should be changed and to see all the reasons why they’re stupid; it’s another thing to point out one’s own stupid beliefs and see any reason to change them. Again, this goes back to point one: everyone holds beliefs that they think are correct, so why would they change them? Then, people make all kinds of decisions about how to think, act, and even see themselves based on those beliefs. Therefore, changing certain beliefs would require a monumental destruction and reconstruction of one’s identity, worldview, and ego. It’s no wonder people hold beliefs long past the point at which an objective party would tell them to let go.

On top of that, humans are a social species, and all social relationships are built on trust. The fundamental assumption underlying every positive human relationship is that the two parties won’t hurt each other. If you thought someone would hurt you, you would not trust them enough to interact with them, give them information, help them, or be open to receiving help or information from them. Therefore, you need to trust the other party to befriend and listen to them.

As a result of this reality, one’s credibility and reputation are everything to social status. If you can’t trust a person, you can’t be close to them. A lot of trust is around reliability and following through with commitments, of course, but, without one’s deeds to go on, people gauge trust based on the correctness and accuracy of information. Therefore, being wrong is not just a hit to one’s ego but also to one’s credibility and reputation. Again, it’s no wonder people insist that they are right long past the point at which an objective party would conclude that they lost the argument: they are saving face to retain credibility among their social circle.

It’s the same reason why people tell bold lies. The most famous in the last century was O.J. Simpson claiming not to have murdered his ex-wife despite mountains of DNA and other evidence. As long as he kept selling the lie, he was able to maintain his social standing with millions of people; had he told the truth, he’d have removed all doubt that he was a murderer and lost his social standing with those believers.


How This Is Good and Bad for Success

An irony of confirmation bias is that everyone uses it to try to change others’ beliefs but rarely admits to falling for it themselves (despite everyone being subject to it—the same irony occurs with attribution bias). The other irony is that everyone calls for eliminating it when there’s actually a good reason for it to exist, and it shouldn’t be eliminated.

The problem with most conversations about confirmation bias is that they assume that all bias should be eliminated. That’s not necessarily true. The reality is that there are times when truth and happiness are at odds. This conflict was famously analyzed at length in the TV show House, M.D. House was always trying to face the difficult truths in life, mirroring the constant pain in his leg that made him suffer all the time. Everyone around him was lying to feel better or attain a better outcome, and he would point it out in every episode. There is an interesting relationship between truth and happiness: many people seek truth thinking it will make them happy; most people avoid truth to feel happy; many truths make people unhappy.

The interesting thing about episodes of House is that it would usually end with House emphatically stating his truth and being miserable while his patients lived in lies and were happy. Examples included turning to religion when dying, overlooking a partner’s infidelity, or believing in the intrinsic goodness of humanity. House would scoff at such things as denial of reality, while the people around him would choose to believe those things for a happier life.

What they didn’t say explicitly in that show—and what people don’t state during conversations about confirmation bias and attribution bias—is that there are many times when you are right and the world is wrong. For example, there are many wars going on in the world at any given time, but just because that’s true doesn’t mean that you have to accept it and become a warrior. Many people don’t care about others, but that doesn’t mean that you shouldn’t. Everyone knows this as the “If everyone jumped off a bridge, would you?” logic. The point—the most important point of this whole blog—is this: if you changed your beliefs at the drop of a hat, you would likely be lost, misguided, and possibly miserable. For all the possibility that you are wrong about something, there are millions more wrong ways to do things than there are right ones. To change your mind so quickly when you are right would be disastrous on many levels.

The most important part of that idea is that you are ultimately responsible for your life, so if you act based on someone else’s advice and they end up being wrong, you still face the consequences, not them. That’s the problem with blindly accepting new information that might be different from what you would’ve thought or done: you could switch to the wrong information. That’s why holding your beliefs and trusting yourself is important.

Similarly, most success comes through a mountain of setbacks and opposition. If you changed your beliefs every time contradictory information came at you (real or fake), you would never accomplish anything. That’s what people who talk about confirmation bias usually don’t address: holding beliefs through adversity is critical to success, so confirmation bias is a very, very good thing when it helps you keep your resolve—and even your sanity—through difficult times. Yes, I just said that confirmation bias is good—like any tool in life, it can be used in good ways and in bad.

And that is the full picture of confirmation bias: you need to hold beliefs through setbacks to become successful, but you also need to change beliefs when doing so will lead to better results. And, as House showed, sometimes people even change their beliefs to ones that others would call false for better outcomes in their lives. That’s the whole story of confirmation bias, attribution bias, and the beliefs you hold: you hold beliefs because you think they are true; you tell yourself things to keep your ego, wellbeing, and social status intact to make progress toward success in life; you therefore need to hold many beliefs through contradictory information; and you also need to change beliefs that are not leading you to success in life. All of that can be true at once.


So, what can you do tomorrow with this information?

First of all, accept that agreeableness is important to social cohesion. People only disagree when the benefit is worth the social cost. It’s why people disagree more on social media than in person (both because of the anonymity and because they’re talking to people they’ll likely never meet in real life).

Second, accept that you face a lot of contradictory information. It’s okay if you don’t change your beliefs immediately all the time, no matter what intellectuals tell you when they only focus on eliminating confirmation bias.

Third, know that you will face the conflict between truth and happiness, and it’s okay if you choose to tell yourself things to make yourself feel better if it helps you thrive—as long as it’s not at someone else’s expense (e.g., “I was just doing my job” as an excuse for committing atrocities is not the same as “I am human and make mistakes” as a way to forgive yourself for missing an important meeting).

Fourth, try to identify more with values and processes and less with static worldviews and beliefs colored by recent or memorable experiences (known as Recency Bias and the Availability Heuristic, respectively). What I mean by this is that you may wish to identify with being a good person, but what makes a person “good” might evolve in your mind over time. You might identify with continuous improvement instead of accumulating and maintaining a certain set of information as “The Truth” or a certain standard as “The Best”—things change, and new information is coming at you all the time. You might also run into a few bad people and, rather than concluding that all people are bad, acknowledge that there are good and bad people and all you can do is find the good people and avoid the bad people.

Finally, knowing why you hold a belief and where you went wrong can help you determine when you need to hold a belief and when you need to change it. You might initially form a belief based on little information and then later forget that and think it’s a rock-solid belief. Similarly, you might not understand what evidence would have to be presented to change your beliefs, or you might “move the goalposts” when that evidence is presented (funnily enough, moving the goalposts is a good example of being wrong while acting like you’re right—you’ve adjusted your belief given new information but won’t admit to it). If you identify the reasons behind your belief and what needs to be true to change it, you will make better decisions regarding, in the words of Kenny Rogers, when to hold ’em and when to fold ’em.

So, the next time you hear about confirmation bias, accept that information with a new understanding: yes, people usually look only for information that reinforces their beliefs and dismiss contradictory information. Yes, it’s based on ego and social status. But, no, it’s not irrational. It’s very rational, and it’s often beneficial. Don’t let anyone tell you to eliminate all of it. Not only would those people likely claim to change their opinions given new information while still requiring disproportionate evidence to overturn their own beliefs, but their advice would subject you to a bombardment of contradictory facts and opinions regarding everything you do (making you miserable). Instead, know when beliefs benefit you and when they don’t. There are plenty of beliefs that lead you to living a better life even when contradictory information is presented. Again, it’s only when a belief leads to a worse outcome for yourself or others that it becomes negative. It is much better to believe in the intrinsic good of humans if it leads you to spend more time contributing to society than to hold a belief that humans are intrinsically selfish and bad and use that to be selfish and bad.

Most importantly, you can hold complex, somewhat contradictory beliefs in your mind at the same time and choose to focus on the half of the belief that is positive. As long as you don’t forget the other part and act in a dangerous way for yourself or others (e.g., voting for a public policy that assumes that all people would behave well by default if they only had their needs met when they wouldn’t), the way you manage beliefs to optimize your thoughts, actions, and outcomes is for you to decide. Confirmation bias is a tool just like anything else—you may need to reinforce your beliefs during a difficult time in your life to avoid crumbling, and other times you may need to break down your limiting beliefs to avoid holding yourself back in life. The choice is yours, and, hopefully, you’ll be better equipped to make the right one given this information.

