Follow The Evidence
18th August 2021 • Social Skills Coaching • Patrick King

Shownotes

Confirmation bias is also the most prominent way that we fail to simply follow the evidence. If we do our research and keep an open mind, our task is simple: just follow the arrows where they point. But all too often, we are seduced into following the wrong arrows. These include the cognitive distortions of focusing on “must” and “should,” black-and-white thinking, the Dunning-Kruger effect, and labeling.


Hear it Here - https://bit.ly/clearthinkingking

Show notes and/or episode transcripts are available at https://bit.ly/social-skills-shownotes

Patrick King is an internationally bestselling author and social skills coach specializing in emotional and social intelligence. Learn more or get a free mini-book on conversation tactics at https://bit.ly/pkconsulting

For narration information visit Russell Newton at https://bit.ly/VoW-home

For production information visit Newton Media Group LLC at https://bit.ly/newtonmg


#Cornell #DavidDunning #Dunning #DunningKruger #DunningKrugerEffect #JustinKruger #Kruger #FollowTheEvidence #PatrickKing #SocialSkill #RussellNewton #NewtonMG


Transcripts

As an extension of denying your confirmation bias, another aspect of openness and mental flexibility is to follow the evidence. Wherever it points is where you go. Inevitably, a narrative begins to unfold as you delve deeper and seek to understand. All you have to do is at least look in that direction without regard to how happy or unhappy it makes you.

You might find real evidence that supports your point of view—great. But you’ll also find evidence that you don’t necessarily want to face, the kind that offers cogent and reasonable arguments against your position. Even people who have devoted themselves to fearless truth-seeking might bristle at this kind of evidence and try to avoid it. To a certain point it’s fine, but especially if it really pierces the shields of their cherished identity—religious, political, social, allegiance to a fictional movie character—they may try to sidestep it and turn it away.

You know what I’m going to say: that’s exactly the kind of evidence you need to follow, and follow it to its utmost. It’s a deceptively simple task—if you can let go.

Hold all the evidence you receive to the same standards of reliability. All of it needs to pass the same sniff test. Be circumspect about all evidence, which means favoring high-quality information over high quantities of information.

Imagine an arrow points in one direction after you review some information and perspectives. This is a green arrow, to symbolize that it is correct. However, you are going to have to be able to pick it out from a multitude of red arrows, which seem helpful but really aren’t. Sometimes it’s more important to be able to eliminate those red arrows, as you can never quite be sure that you are indeed following the green arrow.

Beware of black-and-white thinking. Black-and-white thinking is easy. That’s why people practice it. But it’s also dangerous. Starkly contrasted, right-or-wrong belief systems are the downfall of modern civilization. Unfortunately they’re not going away anytime soon, but to maintain a path of intellectual openness you must learn to avoid black-and-white thinking—and keep it from creeping into your own beliefs.

Black-and-white thinkers only see two options for anything: “You’re with us or against us.” “If it’s not Mars, it’s Venus.” “If I don’t follow that red arrow, it must be this red arrow.” If the evidence doesn’t point one way, then it definitely points the other. The middle ground of maybe doesn’t exist because it’s more important for them to be certain than right.

But that’s an error of epic proportions. If someone doesn’t like the color red, it doesn’t mean they like blue. One discovery does not necessarily rule out another, and very few things are actually causally linked.

Only a few truths are absolute, and they’re the ones that are provable by evidence. But all other truths—more accurately, beliefs—are more nuanced. There’s more to consider and think about when deciding what’s true and what’s not.

Most people, in their natural state, are not sharply one way or the other. The truth works the same way. Perhaps the tendency toward black-and-white thinking comes down to the following point.

It’s okay to be uncertain—it’s not okay to pretend you know what you don’t. Saying “maybe” is a perfectly fine conclusion, and an opinion isn’t mandatory. Yet to many people, being unknowledgeable about current issues simply isn’t an option. Saying “I don’t know” is almost shocking to them, because they’ve internalized that statement as a sign of failure or some kind of shortcoming. Remember: all of the arrows you can currently see might be incorrect.

To avoid that appearance, many of us offer an uninformed opinion off the top of our heads. We think it’s better to have some kind of insight—even if it’s completely off-base—about something we don’t understand than to remain quiet or express doubt. And then, as we do, we stick to that stance for no good reason.

Sure, uncertainty is uncomfortable, or at least can be. But it shouldn’t be so disarming that we try to conquer it by finding something, anything, to believe. The fear of being uncertain is why people accept conspiracy theories or the rantings of a charismatic cult leader. These ideas may be completely without merit, but their followers admire the sureness they provide. Even if the beliefs are absolute rubbish, they feel better than having no beliefs at all.

Almost invariably, the information they’re getting is ill-formed, unsound, slanted, and even flat-out false. But that doesn’t matter, because they feel like they know something. It doesn’t really even matter to them that they’re not right—because it’s important to feel certain.

Your biggest stumbling block in this situation is emotional: the anxiety you feel from a lack of certainty. You need to understand and believe that there’s nothing wrong with being uncertain, that ambiguity is not an affliction. Some might even say that being uncertain or ambiguous is exciting, because it opens up possibilities. In any event, being uncertain is far, far preferable to believing in something false.

De-stigmatize the dreaded three words “I don’t know.” You won’t lose points in the eyes of others. In fact, they might even appreciate that you’re the rare breed who doesn’t feel they have to have a ready set of opinions about something they don’t know anything about. It’s much, much better to be unsure than to be misguided.

Just remember this: your search for truth is rooted in a desire to understand. You’re seeking knowledge—you are not necessarily seeking answers. The key to getting through that uncertainty is to accept the chance to test or confirm your beliefs.

Thinking “must” or “should.” This is one of the leading causes of biased and closed-minded thinking. Like other shortcomings we have discussed, it leads you to look at a set of evidence with a conclusion already in mind, based on how you picture something should occur, and to try to mold the evidence to fit that. It is expecting the world to be different from how it actually is, and it is the opposite of what you should do.

“Must” and “should” thoughts are beliefs that you unknowingly treat as fact. If they don’t materialize, even after you see clear arrows to proceed another way, you’ll still be hesitant to follow the evidence. For example, you could carry the belief that dogs “should” be friendly—how might this “should” hurt you if you encounter a wild dog frothing at the mouth with rabies? “Shoulds” and “musts” masquerade as evidence, and for that reason, they are red arrows to be avoided.

Beware of the Dunning-Kruger effect. The term was coined by Cornell psychologists David Dunning and Justin Kruger in the late ’90s. Simply put, some people aren’t informed or knowledgeable enough to know what they don’t know. Even worse, they’re usually overconfident in their own abilities, because to them there is little nuance—only simple questions and answers. The more they fall prey to this effect, typically the more confident they are in themselves.

Dunning-Kruger occurs in just about any setting where there are people who assume they know best. You might see it during a chess match, where a novice feels that chess is extremely simple—while missing all the behind-the-scenes planning and nuance. It’s just not possible to follow the evidence if you have no idea what you’re looking for (and you don’t know that you don’t know).

Dunning himself recently noted that the effect isn’t necessarily fatal. Many people appear to have it simply because they don’t know what the standards for success or accomplishment are, so they’re essentially flying blind, but giving themselves credit for keeping afloat. Once people become aware they suffer from Dunning-Kruger and acknowledge it, they can always reverse its effect by learning what to do and putting it into practice.

There are a few steps you can take to ameliorate the Dunning-Kruger effect. A lot of it involves things we’ve already talked about: being humble and realistic about your current state of being, not being intellectually lazy, and not thinking that you’re above anybody else in terms of intelligence or accomplishment. The world is rarely devoid of complexity, so if you feel there is little nuance, you are probably missing entire levels of analysis. To combat this, embracing self-challenge is key, because if something appears too simple to be true, you probably just don’t know what to look for.