fundamental attribution error: we judge others on their personality or fundamental character, but we judge ourselves on the situation.
self-serving bias: our failures are situational, but our successes are our responsibility.
in-group favoritism: we favor people who are in our in-group as opposed to an out-group.
bandwagon effect: ideas, fads, and beliefs grow as more people adopt them.
groupthink: due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.
halo effect: if you see a person as having a positive trait, that positive impression will spill over into their other traits. (this also works for negative traits.)
moral luck: we assign someone better moral standing after a positive outcome and worse moral standing after a negative outcome, even when the outcome was outside their control.
false consensus: we believe more people agree with us than is actually the case.
curse of knowledge: once we know something, we assume everyone else knows it, too.
spotlight effect: we overestimate how much people are paying attention to our behavior and appearance.
availability heuristic: we rely on immediate examples that come to mind while making judgments.
defensive attribution: as witnesses who secretly fear being vulnerable to a serious mishap, we blame the victim less if we relate to the victim.
just-world hypothesis: we tend to believe the world is just; therefore, we assume acts of injustice are deserved.
naïve realism: we believe that we observe objective reality and that other people are irrational, uninformed, or biased.
naïve cynicism: we believe that we observe objective reality and that other people's intentions and actions are more egocentrically biased than they actually are.
forer effect (aka barnum effect): we easily attribute our personalities to vague statements, even if they can apply to a wide range of people.
dunning-kruger effect: the less you know, the more confident you are. the more you know, the less confident you are.
anchoring: we rely heavily on the first piece of information introduced when making decisions.
automation bias: we rely on automated systems, sometimes trusting them so much that we let automated "corrections" override decisions that were actually correct.
google effect (aka digital amnesia): we tend to forget information that’s easily looked up in search engines.
reactance: we do the opposite of what we’re told, especially when we perceive threats to personal freedoms.
confirmation bias: we tend to find and remember information that confirms our perceptions.
backfire effect: evidence that disproves our beliefs sometimes has the unwarranted effect of strengthening them.
third-person effect: we believe that others are more affected by mass media consumption than we ourselves are.
belief bias: we judge an argument’s strength not by how strongly it supports the conclusion but how plausible the conclusion is in our own minds.
availability cascade: tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.
declinism: we tend to romanticize the past and view the future negatively, believing that societies/institutions are by and large in decline.
status quo bias: we tend to prefer things to stay the same; changes from the baseline are considered to be a loss.
sunk cost fallacy (aka escalation of commitment): we keep investing in things that have already cost us something rather than changing course, even when we face negative outcomes.
gambler’s fallacy: we think the likelihood of independent future events is affected by past events.
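a minimal simulation makes the fallacy concrete: in a fair coin-flip sequence, heads is no less likely just because five heads came before it. (the streak length of 5 and the sample size here are arbitrary illustration choices.)

```python
import random

random.seed(0)
# Simulate a long sequence of fair coin flips (True = heads).
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# Collect the outcome of every flip that immediately follows five heads.
# Under the gambler's fallacy, tails would be "due"; in reality each
# flip is independent, so heads still comes up about half the time.
after_streak = [flips[i] for i in range(5, len(flips)) if all(flips[i - 5:i])]
rate = sum(after_streak) / len(after_streak)
print(f"P(heads after 5 heads) ≈ {rate:.3f}")
```

the printed rate lands near 0.5, not below it, showing that past flips carry no information about the next one.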
zero-risk bias: we prefer to reduce small risks to zero, even if we can reduce more risk overall with another option.
framing effect: we often draw different conclusions from the same information depending on how it’s presented.
stereotyping: we adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.
outgroup homogeneity bias: we perceive out-group members as homogeneous and our own in-groups as more diverse.
authority bias: we trust and are more often influenced by the opinions of authority figures.
placebo effect: if we believe a treatment will work, it often will have a small physiological effect.
survivorship bias: we tend to focus on those things that survived a process and overlook ones that failed.
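a small sketch of how this bias arises in statistics, using made-up funds whose returns are pure luck: if we only look at the top performers that "survived," their average return looks impressive even though no fund had any real edge. (the fund count, horizon, and volatility are arbitrary assumptions for illustration.)

```python
import random

random.seed(2)
# 1000 funds, 10 years each, yearly returns drawn from pure noise
# (mean 0, stdev 10%) -- i.e., no fund has any genuine skill.
totals = [sum(random.gauss(0.0, 0.1) for _ in range(10)) for _ in range(1000)]

# Survivorship bias: we only ever read about the top 5% that "survived".
survivors = sorted(totals, reverse=True)[:50]

print(f"avg total return, all funds:  {sum(totals) / len(totals):+.3f}")
print(f"avg total return, survivors:  {sum(survivors) / len(survivors):+.3f}")
```

the full population averages out near zero, while the surviving top slice shows a strongly positive average, purely because the failures dropped out of view.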
tachypsychia: our perceptions of time shift depending on trauma, drug use, and physical exertion.
law of triviality (aka “bike-shedding”): we give disproportionate weight to trivial issues, often while avoiding more complex issues.
zeigarnik effect: we remember incomplete tasks more than completed ones.
ikea effect: we place higher value on things we partially created ourselves.
ben franklin effect: having done someone a favor makes us more likely to do them another favor than if we had received a favor from them.
bystander effect: the more other people are around, the less likely we are to help a victim.
suggestibility: we, especially children, sometimes mistake ideas suggested by a questioner for memories.
false memory: we mistake imagination for real memories.
cryptomnesia: we mistake real memories for imagination.
clustering illusion: we find patterns and “clusters” in random data.
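a quick way to see why random data fools us: truly random coin flips routinely contain long streaks, which our pattern-seeking minds misread as meaningful clusters. (the sequence length of 1000 is an arbitrary illustration choice.)

```python
import random

random.seed(1)
# Generate 1000 random coin flips and find the longest run of
# identical outcomes. Streaks of 8-10+ are entirely normal in
# random data, even though they look like a "hot hand" or pattern.
seq = [random.choice("HT") for _ in range(1000)]
longest = current = 1
for prev, cur in zip(seq, seq[1:]):
    current = current + 1 if cur == prev else 1
    longest = max(longest, current)
print(f"longest run in 1000 random flips: {longest}")
```

the expected longest run in a sequence of n fair flips grows roughly like log2(n), so chance alone reliably produces streaks we are tempted to explain.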
pessimism bias: we sometimes overestimate the likelihood of bad outcomes.
optimism bias: we are sometimes over-optimistic about good outcomes.
blind spot bias: we don’t think we have biases, and we see bias in others more than in ourselves.