Along with the ego, there are a few notable obstacles to pursuing truth and clarity of thought. They are intellectual laziness (I can’t be bothered to understand or research this, so I will accept anything), willful ignorance (I reject and deny that there is something further to understand), and adherence to sacred cows (that topic or stance is simply irrefutable truth; I refuse to question it).
It’s easy to tell someone who is intellectually honest from someone who is dishonest; it’s all about how they process arguments contrary to their view. The intellectually honest focus on understanding and following the evidence where it leads. The intellectually dishonest focus on a narrative they want to preserve, and become defensive and sometimes outright hostile. The intellectually honest can answer questions directly and without elaborate justification; the intellectually dishonest must resort to explanations, evasions, and deflections. Usually, it’s clear that something is being substituted for evidence that shouldn’t be.
Having an opinion is something we all do, but we must recognize that we often do it based on insufficient information and questionable evidence. An opinion is one thing, while forming a well-founded and defensible opinion is quite another. The latter, as Bertrand Russell writes, requires that you be wary of opinions which flatter your self-esteem. Imagine different biases and perspectives, look outside your immediate social circle, and question why an opposing opinion might make you react emotionally. It can be summed up with “Strong opinions which are lightly held.”

- Hear it Here – https://bit.ly/clearthinkingking
- Show notes and/or episode transcripts are available at https://bit.ly/social-skills-shownotes
- Patrick King is an internationally bestselling author and social skills coach specializing in emotional and social intelligence. Learn more or get a free mini-book on conversation tactics at https://bit.ly/pkconsulting
- For narration information visit Russell Newton at https://bit.ly/VoW-home
- For production information visit Newton Media Group LLC at https://bit.ly/newtonmg
#Aristotle #BertrandRussell #Buddha #JackNicholson #LinjiYixuan #Munger #NicolausCopernicus #Opinions #PatrickKing #PatrickKingConsulting #SocialSkillsCoaching #PaulMcCartney #RussellNewton #NewtonMG #WarrenBuffett
Willful ignorance.
It’s one thing to be intellectually dishonest through mental laziness and prioritizing your comfort over the truth, but it’s quite another to know you’re relying on faulty information and misleading others, yet keep on doing it anyway. This is called willful ignorance, and it’s worse than mere intellectual laziness.
Willful ignorance is making a deliberate choice to disregard the truth. Examples include the conspiracy theorist who won’t consider any information that exposes the holes in their argument, like people in the ‘60s who thought Paul McCartney was dead, and rejected clear evidence like his giving new television interviews frequently (“It was an imposter!”) and releasing new music (“It was the same imposter!”).
But willful ignorance happens in less fringe situations as well. In the 1990s, when tobacco companies knew that science had proven their product was harmful, they fought to suppress the data and deny its validity by claiming it was “inconclusive.” Even if you assume the tobacco companies weren’t knowingly poisoning their customers, at a minimum they turned a blind eye to compelling evidence simply because they wanted so badly to believe otherwise. It’s the equivalent of plugging your ears, covering your eyes, and loudly screaming “LA-LA-LA-LA-LA” to deny something.
There’s more than innocent ignorance behind those who practice willful ignorance: They consciously opt to spurn the truth, with statements ranging from the relatively benign (“It’s none of my business”) to the dismissive (“I don’t want to know”). Such brazen refusal is usually a sign that the speaker knows there’s something wrong with their position and merely wants to escape scrutiny.
Several reasons might be at play when someone displays willful ignorance. Remember, denial typically serves the ego. They could just be insecure about their beliefs and want to avoid information that would conflict with them. They may want to escape the responsibility to change that comes with new knowledge—to paraphrase the Jack Nicholson movie quote, they “can’t handle the truth!” Alternatively, they may simply perceive ignorance as the psychologically healthier option: They prefer to “stay positive” and preserve the relative tranquility of “not knowing.”
This harms you because without the truth, and without acknowledging your possible role in it, improvement is impossible. It’s like when the “Check Engine” light goes on in your car. You can rationalize it away by saying, “Oh, that light goes on all the time. It’s irrelevant.” Then you continue to ignore it, until one night you try to start the car and it won’t turn over. More personally, we see willful ignorance when someone refuses to acknowledge hard evidence that their partner might not be totally truthful with them, silently sticking by their side and thinking things will get better if they just pretend nothing’s wrong.
Knowing that your beliefs or facts don’t align with reality is important. Willful ignorance is short-circuited by making the simple yet tough decision to start with facts and then find a conclusion, instead of starting with the conclusion and then finding the facts to support it.
Some reading this will find the risks of giving up willful ignorance too much to endure. Still others will say there’s nothing wrong with being willfully ignorant if it makes them happy. But don’t confuse this comfort zone with clear thinking.
Adherence to sacred cows.
Certain subjects, ideas, people or groups are considered by some to be off-limits when it comes to criticism or even critical analysis. These items are called “sacred cows,” in reference to the Hindu belief that the cow is a holy animal that must not be eaten or disrespected.
Discussing sacred cows can be extremely problematic, because they speak directly to people’s core of faith, belief, and identity. For our purposes, sacred cows can include anything from long-established cultural traditions and religious practices to political beliefs and even industry norms. Anything that is held up as unquestionable truth, or as above truth itself, is a sacred cow. In everyday terms, they are “touchy subjects.”
To say anything critical of those hallowed institutions and figures is considered blasphemy by those who follow them. But are they accurate, truthful, and deserving of such a label? What gives them their status, and what makes them more correct than anything else? Is it simply a result of “doing things for the sake of doing them as they have always been done”?
To be clear, this is not a point about discussing the merits of the Hindu belief regarding the cow. This is a point about questioning your beliefs and separating long-held assumption from fact.
Intellectual honesty dictates that no subject, belief, or person should be free from critical thinking or questioning. If you honestly engage in this process, sooner or later you’re going to step directly onto someone’s sacred cow, perhaps even your own. You will encounter something you believed to be incontrovertible truth, and it will come into conflict with the evidence. How will you react? Will you be able to follow the evidence where it leads, or will you ignore it by deferring to your sacred cow?
But it’s a dangerous discussion. It sparks intense defensiveness. Centuries of chaos and bloodshed have resulted from these attitudes. You might have your own internal battles on the matter. As with many things in life, discomfort here is a sign of something significant occurring.
There is no tenet or belief that should be accepted completely on blind faith. Every single one of them should be open to scrutiny and investigation. The best ideas and principles will stand up to such inquiry—the truth will always be defensible. Only beliefs that rely on falsehoods, outdated thought or misinformation will lose out.
Imagine that you (after having traveled back in time) are working diligently to construct a theory of whether the planets orbit the sun or everything orbits the Earth. You may recognize this as the debate between heliocentrism and geocentrism, respectively. Geocentrism was indeed considered a sacred cow. Where would we be if it hadn’t been taken off its pedestal, intensely questioned, and ultimately proven incorrect by Nicolaus Copernicus?
If you have a sacred cow, the biggest step is to at least recognize and admit that it is a sacred cow rather than a fact. People are free to believe what they want, but they are not free to present what they want as truth or fact.
This idea is behind the famous Zen teaching of Linji Yixuan: “If you meet the Buddha on the road, kill him.” It means that one shouldn’t be so beholden to any person or belief system, even the Buddha himself, that one can’t set it aside, given the opportunity, to gain clarity of thought.
What are your sacred cows? Why do you consider them sacrosanct and beyond reproach?
What beliefs or subjects are off-limits with you?
What are you unwilling to be critical of or criticize?
What are you unwilling to discuss honestly without growing defensive?
What do you feel must not be questioned?
Take time to question and at least identify them. The goal isn’t to change your mind about your beliefs; it’s to gain a better understanding of what your beliefs are built upon. That may actually strengthen them. But don’t be afraid or panicked if doubt creeps in; investigate that too. You’re not betraying yourself if you do; you’re using your brain for its intended purpose. Questioning your sacred cows isn’t about being disrespectful or rude. It’s about knowing that the truth fears no questions, and that it no more needs you to defend it than gravity, logic, or mathematics do.
On Forming Opinions
“Opinions are like mouths, everyone has one.” Have you ever heard this phrase, or a more vulgar version? It means that opinions are natural to have and inescapable. However, this doesn’t say anything about their accuracy or the unfortunate consequence that many people like to substitute their opinions for fact.
Sound opinions can only come from intellectual honesty. Especially in the times we live in, when it seems more important to have loud and quickly delivered beliefs, going out of your way to take deliberate steps in establishing your views is vital.
Philosopher Bertrand Russell identified some of the pitfalls of forming hasty opinions in one of the essays collected in his anthology The Basic Writings of Bertrand Russell. He may not have known it at the time, but he was one of intellectual honesty’s first proponents. His approach was to ensure that opinions aren’t clouded by sentiment, bias, or corrupt thinking. Fittingly, one of Russell’s lasting legacies is his work in the philosophy of logic, a field that began with Aristotle.
“If the matter is one that can be settled by observation, make the observation yourself.” It’s one thing to believe facts and opinions that you’ve read or heard about, and there are some that you can even take for granted. You’re secure in believing that bears hibernate in winter, even if you’ve never personally tracked a bear as he’s preparing to pack it in for the season. Is it possible for you to observe them yourself? Other people have, and it might be safe to take their word on it for this one if you trust them.
When you can—especially when it comes to opinions—you should try out your beliefs yourself. If you believe that a new shopping center near your kid’s school is creating heavy and unsafe traffic when school lets out, take a day or two to actually watch and measure the traffic on the street to back up your opinion. Can it truly be your opinion if you don’t have a basis for it?
Don’t just take others’ opinions for your own, no matter how persuasive your sources. It’s a mistake to assert that you know something when you don’t. The more strongly you believe something, the higher the risk that you’re being swayed by personal bias. If you have a chance to test your beliefs, take it.
“If a contrary opinion makes you angry, you might subconsciously know you have no good reason for your thinking.” The most volatile blow-ups we have in intellectual discourse occur when we’re discussing matters that are, at heart, unprovable. We don’t get angry when we hear a math equation; “2 plus 2 equals 4” will not make someone fly into a vicious rage unless they’re extremely unstable. It’s subjective matters of the spirit that people clash over, be it theology, favorite music styles, or whether their favorite sports team “sucks.”
If you find yourself getting increasingly angry in a debate with someone, stop and think about why you’re getting incensed. Russell suggests that you may subliminally know that your viewpoint isn’t backed up by the strongest proof, and you are dreading the inevitable feeling of being wrong. The more agitated and heated you become in defending yourself, the higher the chance that you’re standing on shaky intellectual ground. If the ego is awakening, there just might be a reason.
“Become aware of opinions outside your social circle.” In fact, seek them out. Many times we adopt certain beliefs because our friends and family believe them. For all intents and purposes, those opinions become our reality. Then, we fear being ostracized or rejected by the social circles we’re in if we dare express a countering viewpoint. Other times we may sincerely hold those opinions but have no visibility into what a counterpoint might look or sound like. Echo chambers are where strict, dictatorial stances are left free to develop and turn into ruthless dogma.
Seek out the viewpoints of people far outside your immediate group of friends. Don’t argue against them or refute them. Listen. Read or watch the news sources of the opponent if you can’t get out and talk to them personally. Understand that people live in different worlds, despite walking or sitting right next to you on the subway.
In many cases you’ll find they might have some good points. And if you still find their views repugnant or unhealthy—well, that’s how they feel about you. As unlikely as it seems, exposure to the opposition is the best way to find common ground, decrease intolerance, and balance your own opinions.
On a related note, after gaining a bit of understanding of other people, try engaging in the thought exercise of how someone with an alternate perspective might respond to your opinions. There may be zero chance that you actually change your mind on certain things, but at least you’ve gained perspective and hopefully empathy.
“Be wary of opinions that flatter your self-esteem.” Any politician will tell you that the best way to instill a belief in a certain individual is to appeal to their ego. They win over crowds by complimenting their patriotism, emotions and overall profile. This should be self-evident—people don’t get insulted into believing a certain way, but they can be cajoled and seduced into it.
But just because a vendor calls you beautiful or handsome doesn’t mean the price of that jacket will fit your bank account. Beware when an opinion makes you feel validated and righteous all over. Is it honest, or is it pandering and flattery for the purpose of gaining compliance? There’s a chance it’s formed and delivered in such a way that you can’t help but be manipulated or charmed into believing it. No matter how sound or rational the opinion might seem, check that it appeals to your intellect more than to your sense of pride. Thinking clearly means going deeper than your emotional reactions.
For Russell, forming opinions is not something to be taken lightly, and a certain amount of responsibility comes with it. Others may not engage in this process, but that doesn’t mean you shouldn’t.
Charlie Munger, the businessman and philanthropist who is best known as financial partner to Warren Buffett, once said, “I never allow myself to have an opinion on anything that I don’t know the other side’s argument better than they do.” That view goes hand in hand with Russell’s directives above to seek ideas outside your social circle and imagine how someone would argue back to you. Don’t just come up with a bullet list of counteracting opinions—go deeply into the opposition’s point of view. You should become your own toughest and most articulate critic.
We’re not programmed to do this instinctively. The brain has a strong inclination toward confirmation bias (the tendency to hear only opinions that support our own viewpoints), which we’ll explore later. Ours is a brain programmed for a combination of speed and certainty, not accuracy. Acting decisively in the face of a speeding truck can save your life, while pausing to determine the truth can leave you a splatter on the road. But that’s rarely the situation we’re in, is it? In the absence of threats to your life, truth should always be the end goal, and opinions should be formed only after making an honest effort to pursue it.
“Strong opinions which are lightly held” is a helpful rule of thumb. Have certainty in what you know, but also be open to what you don’t know and how it affects your current opinion. Make your opinion a reflection of what you currently know, and keep updating it as you learn. When you aren’t attached to a particular opinion, you’ll find that the truth becomes easier and easier to see and to find. If you do feel an attachment, it’s probably a sign that you are not being guided by intellectual honesty.