
Cognitive Biases

Cognitive Bias Cheat Sheet

Because thinking is hard.

All of the writing on this page has been done by Buster Benson.

“We can’t avoid our biases. The best we can do is maintain an honest dialogue with our blind spots and commit to identifying and repairing inadvertent damage caused by them as efficiently as possible.”

 

Cognitive Bias Codex by John Manoogian III

    Click on the image to see the detail!

 

Here Buster shares a brief but thorough review of the common biases that greatly affect our decisions. https://betterhumans.coach.me/cognitive-bias-cheat-sheet-55a472476b18#.5lbck52kr

Printable Version of the article

 

I’ve spent many years referencing Wikipedia’s list of cognitive biases whenever I have a hunch that a certain type of thinking is an official bias but I can’t recall the name or details. It’s been an invaluable reference for helping me identify the hidden flaws in my own thinking. Nothing else I’ve come across seems to be both as comprehensive and as succinct.

However, honestly, the Wikipedia page is a bit of a tangled mess. Despite trying to absorb the information on this page many times over the years, very little of it seems to stick. I often scan it and feel like I’m not able to find the bias I’m looking for, and then quickly forget what I’ve learned. I think this has to do with how the page has organically evolved over the years. Today, it groups 175 biases into vague categories (decision-making biases, social biases, memory errors, etc.) that don’t really feel mutually exclusive to me, and then lists them alphabetically within categories. There are duplicates aplenty, and many similar biases with different names, scattered willy-nilly.

I’ve taken some time over the last four weeks (I’m on paternity leave) to try to more deeply absorb and understand this list, and to try to come up with a simpler, clearer organizing structure to hang these biases off of. Reading deeply about various biases has given my brain something to chew on while I bounce little Louie to sleep.

I started with the raw list of the 175 biases and added them all to a spreadsheet, then took another pass removing duplicates, and grouping similar biases (like bizarreness effect and humor effect) or complementary biases (like optimism bias and pessimism bias). The list came down to about 20 unique biased mental strategies that we use for very specific reasons.

I made several different attempts to try to group these 20 or so at a higher level, and eventually landed on grouping them by the general mental problem that they were attempting to address. Every cognitive bias is there for a reason — primarily to save our brains time or energy. If you look at them by the problem they’re trying to solve, it becomes a lot easier to understand why they exist, how they’re useful, and the trade-offs (and resulting mental errors) that they introduce.

An easy reference for all of the biases! https://busterbenson.com/biases/

Four problems that biases help us address:

Information overload, lack of meaning, the need to act fast, and how to know what needs to be remembered for later.

Problem 1: Too much information.

There is just too much information in the world; we have no choice but to filter almost all of it out. Our brain uses a few simple tricks to pick out the bits of information that are most likely going to be useful in some way.

Problem 2: Not enough meaning.

The world is very confusing, and we end up only seeing a tiny sliver of it, but we need to make some sense of it in order to survive. Once the reduced stream of information comes in, we connect the dots, fill in the gaps with stuff we already think we know, and update our mental models of the world.

 

Problem 3: Need to act fast.

We’re constrained by time and information, and yet we can’t let that paralyze us. Without the ability to act fast in the face of uncertainty, we surely would have perished as a species long ago. With every piece of new information, we need to do our best to assess our ability to affect the situation, apply it to decisions, simulate the future to predict what might happen next, and otherwise act on our new insight.

Problem 4: What should we remember?

There’s too much information in the universe. We can only afford to keep around the bits that are most likely to prove useful in the future. We need to make constant bets and trade-offs around what we try to remember and what we forget. For example, we prefer generalizations over specifics because they take up less space. When there are lots of irreducible details, we pick out a few standout items to save and discard the rest. What we save here is what is most likely to inform our filters related to problem 1’s information overload, as well as inform what comes to mind during the processes mentioned in problem 2 around filling in incomplete information. It’s all self-reinforcing.

Great, how am I supposed to remember all of this?

You don’t have to. But you can start by remembering these four giant problems our brains have evolved to deal with over the last few million years (and maybe bookmark this page if you want to occasionally reference it for the exact bias you’re looking for):

  1. Information overload sucks, so we aggressively filter. Noise becomes signal.
  2. Lack of meaning is confusing, so we fill in the gaps. Signal becomes a story.
  3. Need to act fast lest we lose our chance, so we jump to conclusions. Stories become decisions.
  4. This isn’t getting easier, so we try to remember the important bits. Decisions inform our mental models of the world.

In order to avoid drowning in information overload, our brains need to skim and filter insane amounts of information and quickly, almost effortlessly, decide which few things in that firehose are actually important and call those out.

In order to construct meaning out of the bits and pieces of information that come to our attention, we need to fill in the gaps, and map it all to our existing mental models. In the meantime we also need to make sure that it all stays relatively stable and as accurate as possible.

In order to act fast, our brains need to make split-second decisions that could impact our chances for survival, security, or success, and feel confident that we can make things happen.

And in order to keep doing all of this as efficiently as possible, our brains need to remember the most important and useful bits of new information and inform the other systems so they can adapt and improve over time, but no more than that.

Sounds pretty useful! So what’s the downside?

In addition to the four problems, it would be useful to remember these four truths about how our solutions to these problems have problems of their own:

  1. We don’t see everything. Some of the information we filter out is actually useful and important.
  2. Our search for meaning can conjure illusions. We sometimes imagine details that were filled in by our assumptions, and construct meaning and stories that aren’t really there.
  3. Quick decisions can be seriously flawed. Some of the quick reactions and decisions we jump to are unfair, self-serving, and counter-productive.
  4. Our memory reinforces errors. Some of the stuff we remember for later just makes all of the above systems more biased, and more damaging to our thought processes.

If we keep in mind the four problems with the world and the four consequences of our brain’s strategies for solving them, the availability heuristic (and, specifically, the Baader-Meinhof phenomenon) will ensure that we notice our own biases more often. If you visit this page to refresh your memory every once in a while, the spacing effect will help underline some of these thought patterns so that our bias blind spot and naïve realism are kept in check.

Nothing we do can make the four problems go away (until we have a way to expand our minds’ computational power and memory storage to match that of the universe), but if we accept that we are permanently biased and that there’s room for improvement, confirmation bias will continue to help us find evidence that supports this, which will ultimately lead us to a better understanding of ourselves.

“Since learning about confirmation bias, I keep seeing it everywhere!”

Cognitive biases are just tools, useful in the right contexts, harmful in others. They’re the only tools we’ve got, and they’re even pretty good at what they’re meant to do. We might as well get familiar with them and even appreciate that we at least have some ability to process the universe with our mysterious brains.

 

Blind Spot Pachinko

Buster Benson, June 3, 2019. Medium article:

What Can We Do About Our Bias? 

A 4-step roadmap for developing an always-on, honest relationship to bias.

https://medium.com/better-humans/what-can-we-do-about-our-bias-73c16eeb7dca

There are many obstacles to seeing things clearly:

Pachinko diagrams (columns I, II, and III)

There’s no way to become completely unbiased.  – Buster Benson

https://betterhumans.pub/what-can-we-do-about-our-bias-73c16eeb7dca

All of the steps to develop honest bias are about continuous maintenance rather than one-time, permanent fixes.

The temptation to seek permanent fixes is great (believe me, I’ve looked for them too), but the 3 conundrums don’t have permanent fixes. If you think you’ve found one, or are on the track of one and will catch it any day now, check yourself. There’s a good chance that it’s intended to resolve your anxiety about the problem rather than fix the problem itself. See the shortcut “treat experience as reality”. Focus on openness, responsiveness, and maintenance instead.

🌀 Four Steps to Developing Honest Bias

Step 1: Opt-in. Developing honest bias requires us to wake up to our own blindness and to stop trying to pretend it doesn’t exist. Only you can decide if you’re up for the challenge of taking it on.

Step 2: Observe (Beginner level). Take steps to reduce the amount of time and energy you spend trying to hide or ignore your biases and blind spots. For example: read articles like this to get familiar with the variety of biases. Notice when your defenses are triggered. Is there an opportunity to learn from a new perspective (even in a small way)?

Step 3: Repair (Intermediate level). Take steps to reduce the time and energy it takes for you to identify and begin to repair inadvertent damage caused by your biases and blind spots. For example: when you notice a blind spot, look into it and identify people and ideas that may have been undervalued or harmed by you and others. Look for ways to reverse that trend and repair damage.

Step 4: Normalize (Advanced level). Take steps to reduce the time and energy others have to spend challenging your blind spots. For example: actively seek out information and perspectives that challenge your own. Invite the best representatives of positions you don’t agree with to productive disagreements. Actively attempt to falsify your own beliefs.

 

The 3 Conundrums & 13 Strategies That Generate Biases

No matter what we do, we can’t escape these conundrums, but 13 strategies help us think within their constraints.

 

3 Conundrums

🧠 1. There’s too much information (so we must filter it).

🧡 2. There’s not enough meaning (so we use stories to make sense).

🖐 3. There’s not enough time (so we motivate ourselves toward action).

 

13 Strategies

🧠 1-5 HELP US FILTER INFORMATION

1. We depend on the context to figure out what to notice and remember.
2. We accept what comes to mind, and don’t worry much about what doesn’t come to mind.
3. We amplify bizarre things.
4. We notice novelty.
5. We seek takeaways to remember and toss the rest.

🧡 6-10 HELP US MAKE SENSE OF THINGS

6. We fill in the gaps with stereotypes and generalities.
7. We favor familiar things over the non-familiar.
8. We treat experience as reality.
9. We simplify mental math.
10. We are overconfident in everything we do.

🖐 11-13 HELP US GET THINGS DONE

11. We stick with things we’ve started.
12. We protect existing beliefs.
13. We will opt to do the safe things, all other things being equal.

Update: A couple of days after posting this, John Manoogian III asked if it would be okay to do a “diagrammatic poster remix” of it, to which I of course said YES.

You can buy a poster version of the codex image online here: https://www.designhacks.co/products/cognitive-bias-codex-poster

I’ll leave you with the first part of this little poem by Emily Dickinson:

The Brain — is wider — than the Sky

For — put them side by side —

The one the other will contain

With ease — and You — beside —

 

Here’s another interesting look at the Cognitive Bias Codex.

Again, click on the image for a large, clear view!

New and Improved Cognitive Bias Codex…with Descriptions! (An extension of the work of John Manoogian III, with definitions added by Brian Morrissette.)

Click on the image…multiple times…to increase the size of the words!!

Or click on this LINK to find it online.