You’re Wrong, I’m Right

We routinely think we are correct, and we are confident in our correctness. This is epitomized by two closely related phenomena: confirmation bias and the Dunning-Kruger effect. We seek out only what we want to hear (subconsciously), and we don’t know enough to realize how much we don’t know (also subconsciously).


You can imagine that this is a dangerous combination with regard to critical thinking and accuracy.


Another way that we simply goof on thinking is through our tendency toward overconfidence. Hey, a little self-confidence is important, but too much can steer you straight into error after error. Recall the tragedy of the RMS Titanic, which was reputed to be unsinkable, to the point where there was a severe shortage of lifeboats onboard.


Overconfidence is when our brains unconsciously deceive us by telling us we are smart, we are correct, or we know better than others. Clearly, we can’t all be right about that.
By simple statistics, half of us fall above the median on any given trait or ability, half fall below, and only a tiny proportion sit exactly in the middle. Yet why do we all insist that we aren’t part of the half that falls below?
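To see the arithmetic behind that claim, here is a minimal sketch with made-up scores (the numbers are invented for illustration, not taken from any study). It shows that most of a group can sit above the mean when a single outlier drags it down, but no more than half can ever sit above the median:

```python
# Made-up scores: one very low outlier drags the mean down.
scores = [1, 9, 9, 9, 9]

mean = sum(scores) / len(scores)           # 7.4
median = sorted(scores)[len(scores) // 2]  # 9 (middle value of five)

above_mean = sum(s > mean for s in scores)      # 4 of 5 are "above average"
above_median = sum(s > median for s in scores)  # 0 of 5 beat the median

print(f"{above_mean} of {len(scores)} score above the mean ({mean})")
print(f"{above_median} of {len(scores)} score above the median ({median})")
```

So whenever “half of us must be below average” is meant strictly, it is the median doing the work; our egos simply refuse the assignment.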



First of all, it’s not how anyone wants to view themselves. No one thinks they are stupid; if they recognize a shortcoming in one aspect, they will find another to make up for it and still be able to consider themselves above average. It’s completely natural; for instance, those who don’t have academic success often say that they are instead “street smart.” It might be true, but it might not. Our egos and sense of pride are desperately at work making sure we have a generally positive view of ourselves. Sometimes, however, we go too far and we protect ourselves so much that we begin to self-deceive. You’ll recognize these as defense mechanisms and excuses.


Second, we know our thoughts and explanations for how we come to certain decisions.
If we make a poor choice or assertion, we still know we had some plausible set of reasons that made it seem not so ridiculous at the time. Essentially, we can explain and justify our faulty thoughts and decisions. When we make a mistake, it’s something we accounted for and can write off as an anomaly. It’s not something that happens often, and when it does, there was a reason for it.


However, when we look at the thoughts and behaviors of others, we can’t read their minds and understand their train of thought. We only see their errors and flaws without any of the redeeming factors. We don’t have any idea of why others have faulty thoughts and decisions, which means we can’t justify them. It was just a bad decision made out of stupidity. We judge others based on bare results, while we judge ourselves based on the thought process and effort.


These two tendencies are best displayed through the Dunning-Kruger effect. The Dunning-Kruger effect is a psychological phenomenon where someone who is below average in a certain aspect believes themselves to be above average. In fact, the more below average they are, the more above average they often rate themselves. This occurs because they don’t know what they don’t know. They don’t have the experience, context, or knowledge to recognize that they are inept or incompetent. This is a dangerous combination.


For instance, if you have just learned how to play soccer and you can complete a pass, you might think soccer is not so difficult or complex. After all, you just kick a ball, right? You hold this simplistic view because you have only been exposed to soccer as a series of kicks and passes, and you haven’t seen the depth and variety of passing, strategy, and coordination involved. Your understanding is limited to a small subset of knowledge, and thus the game necessarily seems simple. You have no idea (though you should probably assume) that there are deeper and more complex levels. It all flies over your head, so you remain in blissful ignorance and label soccer a simple game.


You can apply this same type of ignorance to any field—the specifics may be different, but what remains the same is that only the surface level is ever glanced at. Everything appears easy on the surface.
As another example, painting can seem extremely straightforward. You see something and you reproduce it on paper, like a printer or a camera. That’s all artists do, right? Leonardo da Vinci and Michelangelo were just human printers. Painting, and art in general, appears easy until you actually attempt it, and only then do you start to grasp what lies beneath the surface. Your attempt at a face will look like a Pablo Picasso drawing, but not in a good way. It may be simple, but it is not easy, and both of those aspects matter.


The Dunning-Kruger effect originated from a 1999 Cornell University study in which researchers had participants take tests of logical reasoning, grammar, and humor, then rate how they thought they had performed compared to the other participants. Generally, the participants who performed the worst rated themselves as well above average. Participants whose scores placed them in the bottom quartile, around the 12th percentile, estimated their own performance at around the 62nd percentile. Conversely, people who performed above average rated themselves fairly accurately, or even underestimated their standing.
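To make those percentile figures concrete, here is a minimal sketch of what a percentile rank actually measures (the scores and the self-estimate of 62 are invented placeholders, not the study’s data): your percentile is simply the share of the group you outscored, which is exactly the quantity the participants misjudged.

```python
# Percentile rank: the share of a group scoring strictly below you.
# All numbers here are invented placeholders, not the 1999 study's data.
def percentile_rank(score, group):
    below = sum(s < score for s in group)
    return 100 * below / len(group)

group = [35, 42, 50, 58, 61, 67, 72, 78, 84, 90]  # hypothetical test scores
my_score = 42

actual = percentile_rank(my_score, group)  # 10.0 -> near the bottom
claimed = 62                               # a Dunning-Kruger-style self-estimate
print(f"actual percentile: {actual:.0f}, self-estimate: {claimed}")
```

The gap between those two numbers is the effect itself: the skills you need to do well are the same skills you need to judge how well you did.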


When you have knowledge in a certain domain, you know nothing is truly simple or easy. What someone else might see as three steps is closer to thirty steps to you, because you know what’s involved. If you know these steps exist, you won’t be as confident in your performance or knowledge. If you don’t know these steps exist, you’ll be confident that you can nail three simple steps. Additionally, when people don’t know, they don’t understand the flaws in their thought patterns and fail to grasp the complexities of what they are trying to accomplish. This is why some say that those with true expertise appear more doubtful and less confident—because they know what they’re up against.


The problem with overconfidence quickly becomes this: how can people know when they don’t know? Well, that’s a tough proposition.


This is further compounded by what’s known as confirmation bias, which takes our inflated sense of confidence and makes it feel justified. Confirmation bias occurs when you start with a conclusion in mind and find only the evidence that supports it.


It causes you to disregard, rationalize, deny, or steer clear of evidence that disproves or challenges that belief. It’s not necessarily driven by ego so much as by a desire to be correct—a desire so deeply rooted that your subconscious creates a filter through which whatever you already believe becomes the truth. Confirmation bias is the ultimate stance of seeing what you want to see: you start with a conclusion in mind and work backward to make it your reality, despite evidence directly to the contrary. The simplest example is when you have a particular stance you want to support—for example, that dogs are loyal. So you type “dogs are very loyal” into Google, and obviously this generates results about the loyalty of dogs, whereas if you typed in (1) “are dogs loyal?” (2) “dogs loyalty,” or (3) “dogs are not loyal,” you would get a broader range of the literature on dogs and loyalty.

This particular stance carries no real consequences, but confirmation bias can also turn life-threatening. For instance, you may support the conclusion that you are a world-class skier despite the fact that you have only skied once in your life. Despite the evidence that you fell constantly on that one occasion, you explain it all away as “beginner’s bad luck” and insist that you are ready for a double black diamond course—a type of course with steep cliffs that one could easily slip off of and slide into oblivion. You dismiss other people’s warnings as jealousy, and you seek out anecdotes from famous skiers about how they were amazing after only one lesson.

You find a group of first-time skiers who advanced quickly and take them as inspiration. All your detractors “don’t know who you truly are” and “underestimate your abilities.” Unfortunately, you persist in the belief in your abilities, ski right off a cliff, and perish. That’s how confirmation bias can tilt your interpretation of the world by restricting the flow of information. If you want to believe an opinion, you’ll feverishly seek out sources that buttress your belief—even if it’s false. And you’ll ignore (“No, I didn’t see that!”), deny (“No, I refuse to believe that!”), or rationalize (“No, it’s different here! I’m different!”) sources that counteract or disprove your feelings—even if they’re true. Confirmation bias tends to lock us in an echo chamber, where we listen to only a small number of the same voices and a narrow range of opinions, all of them in support of our view. For all intents and purposes, this becomes your world and your reality; this seems like the majority view, and therefore the truth. With so many people (the people around you, anyway) saying the same thing, how could you go wrong?

It’s never easy (nor much fun) to be diagnosed with confirmation bias. But once you realize where it stems from, it should motivate you to seek a course of action to lessen its impact: argue against yourself. If you’re certain of your opinion, then you should be able to identify the arguments against it. After all, you know exactly what you are ignoring, denying, or rationalizing. The typical sequence of events is that you have your opinion, an opposing argument appears, and then comes your confirmatory reaction. What then? Continue that discussion. Make an honest effort to create a back-and-forth that weighs the merits and weaknesses of both sides. If you give 100% effort to your own opinion, you must give 100% effort to the opposing opinion. Engage with it and ask why it exists. Ask about the different perspective that created that opinion.

Question the evidence you like as harshly as you’d question the evidence you dislike. Hearing the other side of an argument gives you a much better ability to understand a different position, a different worldview or reality, and factors you never considered. Even if you don’t change your opinion, you’ve opened up a channel that wasn’t there before. For example, maybe you’re talking to someone who is bitterly opposed to the construction of a new park that you support in your neighborhood. You think a park would substantially increase the livability and comfort of the neighborhood, but your opponent doesn’t think it’s a good use of money. Instead of trashing the opposing view, ask why that view exists in the first place. Maybe they feel the money is better spent on improving local roads; perhaps they’ll tell a story of a relative who suffered severe injuries on a street badly in need of repair. Or maybe they feel that a park should only be built after other social services are fully funded and operational.

Whatever their reasoning, try to get a story from them and see if there’s a solution you can work toward together. You might find that you start getting defensive with yourself in this process, but try to engage from a perspective of curiosity, self-education, and seeking knowledge. An important step is to write these arguments down so you can truly see for yourself what each side rests on. Try outlining your viewpoints, and then make up arguments against them. Provide the same number of arguments for each side, and directly address the corresponding points. Flip your Google searches, as we did when investigating the loyalty of dogs earlier. If evidence is presented, find it, and search for the opposite if it exists. Remember that evidence is objective, but reasoning and perspective are subjective—yours included! After all this sweat and toil, you may find that you don’t believe your original argument as much as you thought you did.

And that’s the first step to cracking confirmation bias and starting to think openly: the simple realization that you should leave yourself a 1% buffer of doubt and uncertainty, because being 100% certain about something takes work that you probably haven’t performed. Our refusal to hear the opposing side isn’t a sign of inner strength or resolve—it’s the exact opposite. It turns out we’re wrong more often than we think, and we don’t have a clue—sometimes we don’t even know that we don’t know.