* Mental Model #13: Peer Review Your Perspectives. Many of the ways we fail at solving problems come down to our inability to consider other perspectives. In fact, we should be continually checking our perspectives through triangulation against those of others. Thinking and solving problems in a vacuum will never work, because anything you haven't experienced firsthand won't fully make sense to you without outside input.
* Mental Model #14: Find Your Own Flaws. This mental model is about resisting the comforting allure of confirmation bias and scrutinizing yourself before others ever get the chance. Assume that you are wrong; this especially applies to interpersonal relationships. If you assume that you are at least 1% responsible for a conflict, then your illusion of superiority and infallibility is broken - an important step in any social interaction.
https://www.buymeacoffee.com/RussellNewton
Hear it Here - https://bit.ly/mentalmodelshollins
Show notes and/or episode transcripts are available at https://bit.ly/self-growth-home
Peter Hollins is a bestselling author, human psychology researcher, and a dedicated student of the human condition.
Visit https://bit.ly/peterhollins to pick up your FREE human nature cheat sheet: 7 surprising psychology studies that will change the way you think.
For narration information visit Russell Newton at https://bit.ly/VoW-home
For production information visit Newton Media Group LLC at https://bit.ly/newtonmg
#PeerReviewYourPerspectives #PeterHollins #ScienceofSelf #MentalModels #RussellNewton #NewtonMG
MM #13: Peer Review Your Perspectives
Use to understand the consensus view and why you might differ.
Peer reviews are conducted in many disciplines. They're most commonly associated with scholarly publications, but almost any endeavor - professional, scientific, or otherwise - has some form of peer review built into its operations. As the name implies, a peer review is an evaluation of your work conducted by other people in your field. Like-minded colleagues within your area of study or expertise review your work and offer feedback and suggestions before submission. Often this devolves into people viciously trying to rip your research apart and find every flaw they can - but the more vicious the review, the more helpful it can be.
The goal of peer reviews is to guard against inaccuracies or omissions in a final work and to offer alternative viewpoints that could make the results clearer, more relevant, or more precise. Examiners review your premise, your methodology, your analysis, your conclusion, and everything that links them together. This scientific and methodical approach is the best way to put your perspectives under scrutiny and make them bulletproof - or at least informed.
The best peer reviews leave no stone unturned and make sure the originator is presenting work that's been subjected to as much examination as possible. You'll come away knowing your weaknesses, your strengths, and where you generally stand.
While this may not be very practicable on a daily basis, the same purpose can be served in a few ways. If you have an opinion or perspective, that's one data point. What about trying to gather three more? And then what about trying to gather two that are opposed to yours and present different and novel angles?
You can gather information, intelligence, and other points of view in as complete a manner as possible to reinforce or fine-tune your thoughts or plans and help you make better decisions in the process of problem-solving. When you can find the consensus opinion, you can then gauge whether you align with it, or determine why and how you differ. Often this will open up new avenues of thought and exploration.
A specific application of this mental model is called triangulation. It's based on, among other things, the military practice of confirming a location by drawing lines from three different points of origin to form a "triangle" around it. The more data points available, the more lines you can draw, and the smaller the area of uncertainty becomes. It's the process of slowly working your way toward a correct range through incremental data collection.
For instance, I may guess that a company is producing ten widgets a day, while a coworker believes that the same company is only producing four widgets a day. An average of our estimates wouldn't be a bad idea. Then my supervisor might suppose that the company produces seven widgets a day. Then her supervisor chimes in and says that the figure is six. Slowly, we close in on a range that is somewhat supported by all the data points.
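If it helps to make the arithmetic concrete, here is a minimal sketch in Python - my own illustration, not from the book, using the hypothetical widget guesses above - showing how each additional, independent estimate pulls the running answer toward a better-supported figure:

```python
# Minimal illustration of triangulating independent estimates.
# The figures are the hypothetical widgets-per-day guesses from the example above.
estimates = [10, 4, 7, 6]  # me, coworker, supervisor, her supervisor

collected = []
for count, guess in enumerate(estimates, start=1):
    collected.append(guess)
    running_average = sum(collected) / count
    print(f"After {count} estimate(s): running average = {running_average:.2f} widgets/day")
```

The exact math isn't the point; the point is that every independent data point nudges the combined answer toward a range that no single guess could justify on its own.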
Now do this same process but with your opinions, stances, and perspectives.
You may feel that lemurs are the most ferocious animals alive (or insert a more inflammatory stance that I would rather not broach). A zoologist you know may assert that, while ferocious, they are third behind honey badgers and cornered cheetahs. A zookeeper you are acquainted with may knock the lemur down to fifth place, under hippos, beavers, cheetahs, and honey badgers. A veterinarian friend may place lemurs between the two as the fourth most ferocious, behind cheetahs, honey badgers, and geese.
What have you gained from this exercise? Well, you now know your initial opinion is probably wrong, and you also have a sense of what the plausible range of answers is.
Officially, triangulation of information requires collecting and verifying information from at least two different sources. Optimally, there are many more. While the "peer review" form of triangulation is potentially the best, you can also obtain it by examining data or theories from other sources (in other words, research).
Subjecting your perspectives and ideas to peer review and triangulation increases your legitimacy and authenticity. It shows that you're confident enough to expose your solutions to outside scrutiny and that you have the humility to listen to other opinions and constructive criticism. And that adds a lot of weight and sureness to the decisions you make: it increases the likelihood that they're sound choices, ones that are well thought-out and tested through trials.
Through that process, you'll gain a sense of what the actual solution is and, related to the main thrust of the chapter, solve problems far more easily and quickly.
MM #14: Find Your Own Flaws
Use to scrutinize yourself before others can.
Requesting the learned opinions of others can be illuminating, especially if they happen to confirm that your opinions and perspectives have been misguided.
But we can also do this for ourselves by invoking the mental model of searching for your own flaws. Treat your perspective or opinion as a hypothesis that must be tested and verified. Key to this is not being emotionally invested in the outcome, or defensive about being correct as opposed to seeking the honest truth.
Instead of approaching a perspective or opinion by seeking to prove it, flip it on its head and seek to prove it wrong (dogs aren't great; dogs are evil).
Instead of maximizing its supposed benefits, minimize them and maximize the shortcomings (dogs may be relatively loyal compared to cats, but they are high-maintenance and can be extremely costly and sometimes even violent).
Instead of imagining smooth sailing and a best-case scenario, paint an apocalyptic worst-case scenario (what if I get a violent dog that I can't properly train and he ruins everything in my home?).
Ask yourself this: if you wanted your perspective or opinion to fail, what is the easiest way for that to happen (if I don't give my dog enough attention or walks, he will go crazy and destroy things)?
Bleak, I know. But otherwise, you fall into the error of confirmation bias. Confirmation bias is rampant; it's the tendency to pursue and listen only to information or evidence in favor of a belief we wish to be true. In doing so, we disregard, rationalize, deny, or steer clear completely of evidence that disproves or challenges that belief. It's not necessarily driven by ego so much as by a simple desire to be correct.
Confirmation bias is the ultimate stance of seeing what you want to see and using that perception to prove a pre-chosen conclusion. In fact, it's where you start with a conclusion in mind and work backward to make it your reality despite evidence directly to the contrary.
The simplest example is when you have a particular stance that you want to support - for example, that dogs are loyal. So you type into Google "dogs are very loyal." Obviously, this is going to generate results about the loyalty of dogs, whereas if you typed in (1) "are dogs loyal?" (2) "dogs loyalty," or (3) "dogs are not loyal," you would get a broader range of the literature on dogs and loyalty. This particular stance has no real consequences, but in higher-stakes situations, confirmation bias can even turn life-threatening.
Finding your own flaws flows in the opposite (and correct) direction: you start with premises and then draw conclusions only from what the evidence honestly points toward. Most of us feel something close to physical pain when we think about admitting our flaws, especially in front of others. But that's the ego talking, and the ego has zero interest in solving problems and thinking clearly. The ego will always have comforting yet detrimental motives.
The mental model of finding your own flaws applies in another important context: in relationships. This particularly arises when you have conflict with someone else. But again, what if you were to shift gears and proactively seek to find your own flaws in your arguments and stances instead of defending them to the death?
When you seek to find your own flaws in an argument, try to find what's known as the third story. The third story is what an objective bystander would say about the conflict. It would be ruthlessly objective and detached. You would probably not be pleased to hear it, and you would definitely not be found blameless or without fault.
This is an important realization in itself. Often we get so wrapped up in intense emotion that we lose track of our goal and simply go on the defensive. That's easier for some people than it is for others, but conceding that you could be mistaken opens many more doors to understanding than entrenching yourself. Recognizing that your point of view may be imperfect is, in fact, usually the first step in solving a problem. It's a sign of strength and confidence, whereas dogged refusal to listen to another outlook is more often perceived as a sign of shakiness or weakness.
In that sense, it's good to handle your perspective as if there's at least something amiss about it - say, starting with 1%. There's almost no interpersonal issue where the answer is utterly black or white; you are not infallible. So what 1% are you probably wrong about in your side of the argument, even if you don't want to admit it?
If you can fully commit to 1% error or flaw, it immediately opens you up to the other things you might be missing. Getting that third perspective is a great bridge to understanding the whole of a problem - because if the third story deviates drastically from both your story and your opponent's story, then you probably aren't even trying to solve the same problem.