Reading Notes & Thoughts from…
By Jessica Stillman, Inc. Magazine
— — — — — — — — — — — — — — — — — — — — — — — — —
Confirmation Bias. I have definitely written about this one before. Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one’s prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes.
Flawed decisions due to confirmation bias have been found in political, organizational, financial, and scientific contexts. Medicine offers other good examples: you might be more likely to believe a study from one credible institution that confirms your beliefs than contrary findings from an otherwise equally credible institution. This bias can often be seen in the wild alongside Naïve Realism or Naïve Cynicism — as in, assuming contrary information is biased (“probably paid for by XXX industry” or “that’s what they want you to think”).
Backfire Effect. Repeatedly mentioning a false belief in order to disprove it sometimes ends up making people believe it more. It’s a cognitive bias that causes people to hold their beliefs with increased conviction when faced with opposing evidence: disconfirming facts only strengthen their stance.
Is this bad for us? Although keeping our self-identity and worldview intact has some benefits, problems are likely to arise when we cling to incorrect beliefs.
Example: A study was conducted to find out whether positive information about vaccinations encourages parents to vaccinate their kids. The results showed that when parents who are against vaccinations are informed about the benefits that vaccines provide, they sometimes become more convinced that vaccines have negative consequences.
The backfire effect is commonly characterized by statements like “I don’t know about that, all I know is <something else that they think tangentially reinforces their argument>.”