The human brain is a powerful computer that’s capable of incredible things, but it’s also full of bugs and extremely prone to error.
These errors, also called “cognitive biases,” are mental mistakes that can affect both our thinking and our actions.
What Is a Cognitive Bias?
A cognitive bias is an error in cognition that arises in a person’s line of reasoning when decision-making is skewed or flawed by personal beliefs.
In other words:
A cognitive bias is a programmed error in our brains that causes us to make irrational decisions and judgments based on the information we process.
You can think of them as rules of thumb that our brains follow to interpret information in an easy way.
There are many cognitive biases worth taking into account.
These biases, deeply ingrained in our psychology, are systematic deviations from a standard of rationality or good judgment.
They stem from the brain’s attempt to simplify information processing, and when activated, can lead to extrapolating information from the wrong sources, seeking out information to confirm existing beliefs, and/or failing to remember events the way they actually happened.
Of course, in the never-ending drama of human decision-making (which I often call the theater of war), where every day we face a multitude of decisions with potentially infinite outcomes, cognitive biases play the role of both villain and hero masterfully well.
But this is all part of being human.
You see, these biases are evolutionary adaptations (developed over long timescales) that helped our ancestors make quick, life-saving decisions.
But in today’s complex and nuanced world, biases often lead us down the wrong path, subtly skewing our choices and judgments in ways we barely even notice.
This often leads to bad decisions and suboptimal outcomes.
Think of it like this:
Imagine you’re on a road trip, and the road to your final destination is full of twists and turns, forks and forests. There are many ways to get lost if you’re not careful. Now, imagine you’re the driver, and your brain is your well-meaning but slightly bumbling friend (the passenger) who insists on using that ‘gut feeling’ instead of a reliable GPS to get you there.
From the ‘trust me bro’ overconfidence bias (because, of course, we’re always right) to the classic confirmation bias (where we pick facts like cherries, only the ones that suit our pie), our brains are not the most reliable source of directions at all times.
After many hours of research, I have sorted through more than 100 cognitive biases that are known to exist and compiled them into the list below, organized by category.
Note: Although the reality of most of these biases is confirmed by reproducible research, there are often controversies about how to classify these biases or how to explain them.
A list of 100+ of the most relevant cognitive biases in behavioral economics
Perception Biases
- Confirmation Bias: The tendency to search for, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs.
- In-group Bias: The tendency to favor one’s own group over outsiders.
- Halo Effect: The tendency to let one positive trait influence the perception of someone’s other traits.
- Horn Effect: The tendency to let one negative trait influence the perception of someone’s other traits.
- Anchoring Bias: The tendency to rely too heavily on the first piece of information encountered when making decisions.
- Framing Effect: The tendency to draw different conclusions based on how data is presented.
- Selective Perception: The tendency to selectively focus on information that confirms one’s own beliefs.
- Blind Spot Bias: The inability to recognize one’s own cognitive biases.
Memory Biases
- Availability Heuristic: Overestimating the importance of information that comes to mind easily.
- Recency Effect: The tendency to weigh the latest information more heavily than older data.
- Hindsight Bias: The tendency to believe, after an event has occurred, that one would have predicted or expected it.
- Serial Position Effect: The tendency to remember the first (primacy effect) and last (recency effect) items in a series the best, and those in the middle the worst.
- Serial Recall Effect: The tendency to recall items or events in order.
- Memory Inhibition: The ability to suppress or filter out irrelevant information from memory.
- Modality Effect: The tendency to remember information better when it is presented aloud rather than in writing, particularly for the final items in a list or sequence.
- Duration Neglect: The psychological observation that people’s judgments of the unpleasantness or intensity of painful experiences depend very little on the duration of those experiences.
- Peak-End Rule: The tendency to judge an experience largely based on how you felt at its peak (most intense point) and at its end, rather than the total sum or average of every moment of the experience.
Decision-Making Biases
- Dunning-Kruger Effect: The tendency for unskilled individuals to overestimate their abilities.
- Choice-Supportive Bias: The tendency to remember one’s choices as better than they actually were.
- Regret Aversion: The tendency to avoid decision-making due to fear of regret.
- Status Quo Bias: The preference for the current state of affairs.
- Omission Bias: The tendency to judge harmful actions as worse than equally harmful inactions.
- Base Rate Fallacy: Ignoring or underweighting base rate information (general information) and focusing solely on specific information, even when the general information is more relevant to the judgment (a quick worked example follows below).
- Familiarity Bias: The tendency to prefer familiar people, things, or ideas, often leading to a reluctance to explore new or unknown options.
Example: Imagine you’re ordering food at your favorite restaurant, and let’s assume it’s a pizza joint in Rome. You might experience what’s known as “familiarity bias,” which can lead you to repeatedly choose dishes you’ve enjoyed in the past (aka ol’ reliable) rather than trying new items. After all, you know this pizza joint has the best slice in the world (you get it every time) and you don’t need anyone to tell you otherwise, right? This bias is based on the comfort and predictability of familiar choices, which may cause you to miss out on discovering new, potentially more enjoyable menu options. When this happens, your preference for the familiar can overshadow an objective assessment of other available dishes, leading to a repetitive, though comfortable, dining experience. This bias is a classic demonstration of how past positive experiences can disproportionately influence current decision-making, even in the simple act of ordering a meal. A common smoov brain reasoning error.
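As promised in the Base Rate Fallacy entry above, here’s a quick worked example in Python. The prevalence and test error rates are hypothetical numbers chosen purely for illustration:

```python
# Base rate fallacy: hypothetical numbers for illustration only.
# A disease affects 1% of the population; a test catches 90% of true
# cases but also flags 9% of healthy people as positive.
prevalence = 0.01           # base rate: P(disease)
sensitivity = 0.90          # P(positive | disease)
false_positive_rate = 0.09  # P(positive | no disease)

# P(positive) via the law of total probability
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))

# Bayes' rule: P(disease | positive)
p_disease_given_positive = sensitivity * prevalence / p_positive
print(f"P(disease | positive test) = {p_disease_given_positive:.1%}")
# ~9.2% -- most people guess close to 90%, because they ignore the 1% base rate
```

Even with a positive result from a fairly accurate test, the low base rate keeps the actual probability of disease under 10 percent, which is exactly the general information the fallacy throws away.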
Probability and Belief Biases
- Neglect of Probability: Ignoring probability when making a decision under uncertainty.
- Gambler’s Fallacy: The belief that future probabilities are altered by past events, when in reality they are unchanged. For example, believing that a coin is ‘due’ to land on heads after several tails in a row (simulated in the sketch after this list).
- Zero-Risk Bias: Preference for reducing a small risk to zero over a greater reduction in a larger risk.
- Survivorship Bias: The tendency to focus on the survivors of a particular situation rather than those who succumb to it.
- Belief Bias: The tendency to judge the strength of arguments based on the plausibility of their conclusion rather than how strongly they support that conclusion.
- Overconfidence Bias: Overestimating one’s own abilities or the accuracy of one’s beliefs or judgments, often leading to an underestimation of the probability of being wrong.
- Anchoring in Probability Estimates: The tendency to rely too heavily, or “anchor,” on one piece of information (often the first encountered) when making decisions, including probability estimations.
- Insensitivity to Sample Size: The tendency to underweight or ignore the size of a sample, which is crucial in judging the reliability of statistical evidence.
- Representativeness Heuristic: Judging the probability of an event based on how much it resembles other events or a parent population, often leading to disregard of actual probability.
- Illusion of Control: The tendency to overestimate one’s control over events, particularly in random or chance situations, leading to an overestimation of the probability of desired outcomes.
- Conjunction Fallacy: The belief that the conjunction of two events (e.g., A and B happening together) is more probable than a single one of these events (e.g., just A or just B happening), which violates the basic principles of probability.
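As referenced in the Gambler’s Fallacy entry above, here is a minimal simulation sketch, assuming a fair coin (the trial count is arbitrary):

```python
# Gambler's fallacy check: is a fair coin "due" for heads after five tails?
import random

random.seed(42)
streaks = heads_after_streak = 0

for _ in range(1_000_000):
    flips = [random.random() < 0.5 for _ in range(6)]  # True = heads
    if not any(flips[:5]):              # first five flips were all tails
        streaks += 1
        heads_after_streak += flips[5]  # did the sixth flip land heads?

print(f"P(heads | 5 tails in a row) = {heads_after_streak / streaks:.3f}")
# ~0.500 -- the coin has no memory; the streak changes nothing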
A Note On Probabilities: The math behind Bayes’ Theorem is simple, but the application of it is often complex. Here’s an illustrative example, adapted from Nate Silver’s explanation in The Signal and the Noise. “Let’s say that before Sept. 11, 2001, you put the odds that terrorists would crash a plane into a New York skyscraper (x) at 0.005%. After the first plane struck, you figured that the odds of a plane’s hitting, if in fact terrorists were attacking Manhattan with it (y) was 100%, and the odds of a plane’s hitting by random chance (z) was 0.008%. Plug those into Bayes’ formula, xy/(xy+z(1-x)), and the probability that terrorists had just flown that plane into the World Trade Center is 38%. Run the calculation for a second plane, using 38% as the initial probability, and you get a probability of 99.99%.”
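That calculation is easy to verify. Here is a minimal Python sketch that plugs the quoted percentages (as decimals) into the same formula:

```python
# Reproduce the Bayes' Theorem example above: posterior = xy / (xy + z(1 - x))
def bayes_update(x, y, z):
    """x: prior; y: P(evidence | hypothesis); z: P(evidence | not hypothesis)."""
    return (x * y) / (x * y + z * (1 - x))

prior = 0.00005                                  # 0.005% prior
after_first = bayes_update(prior, 1.0, 0.00008)  # y = 100%, z = 0.008%
after_second = bayes_update(after_first, 1.0, 0.00008)

print(f"After the first plane:  {after_first:.0%}")   # ~38%
print(f"After the second plane: {after_second:.2%}")  # ~99.99%
```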
Social Biases
- Groupthink: The tendency to conform to the opinions or decisions of a group.
- Bandwagon Effect: The tendency to adopt certain behaviors or beliefs because many other people are doing the same.
- Not Invented Here Bias: The tendency to reject solutions or ideas developed outside a group.
- Outgroup Homogeneity Bias: The perception that individuals in the outgroup are more similar to each other than they really are.
Action Biases
- Hyperbolic Discounting: The tendency to choose smaller, immediate rewards over larger, future rewards (formalized in the sketch after this list).
- Endowment Effect: The tendency to overvalue something simply because one owns it.
- Effort Justification: The tendency to attribute greater value to outcomes that require more effort.
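As referenced in the Hyperbolic Discounting entry above, here is a minimal sketch using Mazur’s commonly cited one-parameter form, V = A / (1 + kD). The discount rate k below is an illustrative assumption, not an empirical estimate:

```python
# Hyperbolic discounting (Mazur's form): V = A / (1 + k * D)
# k is an illustrative discount rate chosen for demonstration only.
def discounted_value(amount, delay_days, k=0.1):
    return amount / (1 + k * delay_days)

# $100 in 30 days can feel worth less than $60 today:
print(f"${discounted_value(100, 30):.2f}")  # $25.00 perceived value
print(f"${discounted_value(60, 0):.2f}")    # $60.00 -- the smaller-sooner reward wins
```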
Self-Related Biases
- Self-Serving Bias: The tendency to claim more responsibility for successes than failures.
- Optimism Bias: The tendency to overestimate the likelihood of positive outcomes.
- Pessimism Bias: The tendency to overestimate the likelihood of negative outcomes.
- Egocentric Bias: Over-relying on one’s own perspective and ignoring the perspectives of others.
Cognitive Misperceptions
- Apophenia: The tendency to perceive connections between unrelated things.
- Clustering Illusion: The tendency to see patterns where actually none exist.
- Confabulation: The act of filling in memory gaps.
- Déformation Professionnelle: The tendency to look at things according to the conventions of one’s own profession.
- False Consensus: Overestimating how much other people agree with you.
- Focusing Effect: The tendency to place too much importance on one aspect of an event.
- Forer Effect: The tendency for people to give high accuracy ratings to descriptions of their personality that are supposedly tailored specifically to them, but are in fact vague and general enough to apply to a wide range of people.
- Framing: Using an approach or description of the situation or issue that is too narrow.
- Fundamental Attribution Error: The tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior.
- Illusory Correlation: Inaccurately perceiving a relationship between two unrelated events.
Economic and Financial Biases
- Money Illusion: The tendency to think in terms of nominal rather than real monetary values.
- Mental Accounting: The tendency to categorize and treat money differently depending on its source or intended use.
- Disposition Effect: The tendency to sell assets that have increased in value and keep assets that have decreased in value.
- Sunk Cost Fallacy: The tendency to continue an endeavor once an investment of money, effort, or time has been made.
- House Money Effect: The tendency to take more risks when money is perceived as a “windfall” or “house money.”
- Loss Aversion: The tendency to prefer avoiding losses over acquiring equivalent gains, with the psychological impact of a loss generally felt more strongly than that of an equivalent gain.
A Note on Loss Aversion: In their seminal paper “Prospect Theory: An Analysis of Decision under Risk,” Kahneman and Tversky (1) illustrate how investors often exhibit loss aversion by holding onto losing stocks for too long, hoping to avoid realizing a loss, and selling winning stocks too quickly to secure gains.
This behavior contradicts the rational investment strategy of cutting losses and letting winners run, illustrating how the fear of losses can disproportionately influence financial decision-making compared to the attraction of equivalent gains.
Kahneman and Tversky’s work provides foundational insight into how loss aversion affects investor behavior and market dynamics.
Loss aversion is a cognitive bias commonly observed in economic and financial decision-making, where individuals exhibit a preference for avoiding losses rather than acquiring equivalent gains.
For example, an investor might choose not to sell a stock that has decreased in value, preferring to avoid realizing a loss, even though rational analysis suggests that selling and reallocating funds to a more promising investment would likely lead to better financial outcomes. This bias is rooted in the psychological impact of a loss, which is typically perceived as more significant and unpleasant than the joy of an equivalent gain.
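One way to make this asymmetry concrete is the value function from Tversky and Kahneman’s later (1992) work on cumulative prospect theory. The sketch below uses their commonly cited median parameter estimates purely for illustration:

```python
# Prospect-theory value function (Tversky & Kahneman, 1992 median estimates):
# gains are valued as x^alpha, losses as -lambda * (-x)^alpha.
def value(x, alpha=0.88, lam=2.25):
    return x**alpha if x >= 0 else -lam * (-x)**alpha

print(f"Felt value of a $100 gain: {value(100):+.1f}")   # ~ +57.5
print(f"Felt value of a $100 loss: {value(-100):+.1f}")  # ~ -129.5
# The loss looms roughly 2.25x larger than the equivalent gain.
```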
Affective Biases
- Affect Heuristic: The tendency to make decisions based on emotions rather than logical analysis.
- Empathy Gap: The tendency to underestimate the influence of emotional states on decision-making.
- Hot-hand Fallacy: The belief that a person who has experienced success has a higher chance of further success in additional attempts.
- Mood-Congruent Memory Bias: The tendency to recall information that is congruent with one’s current mood.
Time-Related Biases
- Planning Fallacy: The tendency to underestimate the time needed to complete a future task.
- Time-Saving Bias: Overestimating the amount of time that could be saved when increasing speed (see the quick arithmetic after this list).
- Temporal Discounting: The devaluation of future rewards.
- Procrastination: The act of delaying tasks that should be done immediately.
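As referenced in the Time-Saving Bias entry above, the effect is easy to see with simple arithmetic; the distance and speeds below are arbitrary:

```python
# Time-saving bias: minutes saved over a 10-mile trip by driving 10 mph faster.
def minutes(distance_miles, speed_mph):
    return 60 * distance_miles / speed_mph

for low, high in [(40, 50), (80, 90)]:
    saved = minutes(10, low) - minutes(10, high)
    print(f"{low}->{high} mph saves {saved:.1f} minutes")
# 40->50 saves 3.0 min, but 80->90 saves only 0.8 min --
# the same +10 mph saves far less time at higher speeds.
```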
Complexity and Ambiguity Biases
- Ambiguity Aversion: The tendency to avoid options for which the probability of a favorable outcome is unknown.
- Curse of Knowledge: The tendency to assume that others have the same background or knowledge as oneself.
- Need for Closure: The desire to quickly conclude decision-making or problem-solving tasks.
- Paralysis by Analysis: The state of over-analyzing a situation so that a decision or action is never taken.
Ethical and Moral Biases
- Moral Credential Effect: The tendency to allow oneself to act less morally after past behavior has established one’s moral credentials.
- Ethical Fading: The tendency for the ethical dimensions of a decision to fade from view, allowing people to act unethically without recognizing it.
- Just-World Hypothesis: The belief that actions have morally fair and fitting consequences.
- Self-Righteousness: The perception that one’s beliefs, values, or actions are superior to those of others.
Communication Biases
- Silent Majority Fallacy: The belief that one’s own opinions are in the minority when statistical evidence is lacking.
- Egocentric Bias in Attribution: The tendency to attribute communication behavior to personality traits rather than situational factors.
- Illusion of Transparency: Overestimating the degree to which one’s mental state is known by others.
- Spiral of Silence: The tendency to remain silent when one believes that one’s views are in the minority.
Learning and Education Biases
- Forgetting Curve: The exponential loss of information that one has learned (a stylized formula follows this list).
- Generation Effect: The phenomenon where information is better remembered if it is generated from one’s own mind rather than simply read.
- Spacing Effect: The phenomenon where information is better recalled if exposure to it is repeated over a longer span of time.
- Von Restorff Effect: The tendency to remember a unique item in a list better than common items.
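As referenced in the Forgetting Curve entry above, the curve is often modeled as a simple exponential decay, R = e^(−t/S), where S is the memory’s “strength.” The strength value below is an illustrative assumption:

```python
# Ebbinghaus-style forgetting curve: retention R = exp(-t / S).
# S (memory strength, in days) is an illustrative assumption.
import math

def retention(t_days, strength=2.0):
    return math.exp(-t_days / strength)

for t in [0, 1, 2, 7]:
    print(f"Day {t}: {retention(t):.0%} retained")
# Spaced repetition works by boosting S with each review,
# flattening the curve (see the spacing effect above).
```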
Environmental and Contextual Biases
- Broken Windows Theory: The tendency for small signs of disorder to lead to larger-scale neglect and disorder.
- Context Effect: The tendency to recall information better when in the same context in which it was learned.
- Distinction Bias: The tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.
- NIMBY (Not In My Back Yard): The tendency to oppose unwanted developments or hazards in one’s own neighborhood while accepting them elsewhere.
Philosophical and Existential Biases
- Existential Bias: The tendency to focus on life’s immediate challenges and ignore existential questions that lack immediate practical application.
- Fatalism: The belief that all events are predetermined and therefore inevitable, often leading to resignation and disinclination to act.
Self Consistency Biases
- The IKEA Effect: The tendency to value an object more if you make (or assemble) it yourself, and how we tend to like things more if we’ve expended effort to create them.
- Cognitive Dissonance: The mental discomfort experienced when holding two or more contradictory beliefs, values, or ideas, leading to an alteration in one’s beliefs to reduce the discomfort.
- Consistency Bias: The tendency to retrospectively ascribe one’s current views and behaviors to their past self, even when the past self did not hold those views or behaviors.
A Note on the IKEA Effect: Imagine you’re the proud owner of a 3-bedroom house with a huge yard. The market has been pretty good the past few years and you have considered selling the place and upgrading to a bigger house.
But you’ve never done anything to the yard.
It’s just a boring plot of unkempt grass.
Before you list the place, you’ll need to spruce up the yard to increase the marketability and curb appeal, so you decide to spend the next several months landscaping and gardening.
You spend hundreds of hours tilling the soil, planting seeds, adding trees, bushes, and flowers, watering them diligently, and carefully tending to your creations as they grow.
Sometime later, you decide the time is right and list your home online.
After doing some Googling, you see that similar houses are being sold for $700,000, but you decide to list yours for $1,700,000.
Of course, your garden oasis probably didn’t turn out as professionally manicured as those done by expert landscapers (there are some patches where plants didn’t thrive, and it isn’t totally visually appealing) but to you, this space is invaluable.
All that work you put in has got to be worth something, right?
So why does this happen?
As humans, we often derive immense satisfaction and joy from seeing the fruits of our labor, quite literally in some cases.
This sense of accomplishment and personal investment can make the garden feel more special than a professionally designed one… so our brains tend to blind us to the imperfections and focus more of our attention on the time and effort we personally invested in it, causing us to drastically overestimate its market value.
The emotional connection created here (labor equals value) is a classic manifestation of the IKEA effect.
Related: Due to the IKEA effect, we are also often willing to fork over extra money for experiences that require us to put in more work, like assembling furniture ourselves instead of buying it pre-assembled.
Other Biases
- Authority Bias: The tendency to attribute greater accuracy to the opinion of an authority figure.
- Contrast Effect: The enhancement or diminishment of a weight or other measurement when compared with a recently observed contrasting object.
- Affinity Bias: The tendency to get along better with people like us.
- Attribution Bias: The tendency to attribute one’s own actions to external causes while attributing other people’s behaviors to internal causes.
- Conservatism Bias: The tendency to insufficiently revise one’s belief when presented with new evidence.
- Information Bias: The tendency to seek information even when it cannot affect action.
- Ostrich Effect: The tendency to ignore dangerous or negative information by “burying” one’s head in the sand.
- Outcome Bias: Judging a decision based on the outcome rather than how exactly the decision was made.
- Overreaction: The tendency to react more strongly to new information than is warranted.
- Paradox of Choice: The tendency to become paralyzed when faced with too many options.
- Placebo Effect: The tendency to experience a perceived improvement in conditions when receiving an inert treatment.
- Pro-Innovation Bias: The belief that an innovation should be adopted by the whole society without the need for its alteration.
- Reciprocity: The tendency to think that favors offered should be reciprocated.
- Salience: The tendency to focus on the most easily recognizable features of a person or concept.
- Selective Exposure Theory: The tendency to favor information which reinforces one’s pre-existing views while avoiding contradictory information.
- Stereotyping: Expecting a member of a group to have certain characteristics without having actual information about that individual.
- Triviality: The tendency to give disproportionate weight to trivial issues.
- Unit Bias: The tendency to want to finish a given unit of a task or item (e.g., a full plate or a whole episode), regardless of its size.
- Weber’s Law: The tendency to perceive changes in proportional terms (formalized below).
- Wishful Thinking: The formation of beliefs based on what might be pleasing to imagine.
- Zeigarnik Effect: The tendency to remember uncompleted tasks better than completed ones.
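As referenced in the Weber’s Law entry above, the law has a simple formal statement: the just-noticeable change ΔI in a stimulus is a constant fraction of its current intensity I:

$$\frac{\Delta I}{I} = k$$

This is why a $10 discount feels significant on a $40 book but imperceptible on a $40,000 car.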
When making decisions, humans are influenced by factors that carry a lot more weight than logic.
Identifying and measuring the impact of those factors (heuristics) is the goal of every behavioral economist.
Tversky and Kahneman’s seminal work on heuristics and biases in the 1970s, for example, initiated a significant shift in understanding how the human brain processes information and makes decisions, revealing that human decision making is not always rational or optimal but is frequently influenced by these biases (2).
Here are a few examples of how biases can substantially influence human behavior across a variety of contexts, from business and strategic military decisions to personal relationships:
In Business Strategy:
Business leaders fall victim to biases quite often, and the one that usually tops the list is overconfidence bias, where they overestimate their own abilities or the accuracy of their predictions.
When this happens, it can often lead to overly aggressive business strategies that do not account for real-world uncertainties.
For example, confirmation bias may cause a company to favor information that supports its existing strategy and disregard dissenting perspectives, potentially leading to miscalculations and strategic blunders.
Also, the sunk cost fallacy often results in continued investment in dead-end or failing projects because of the resources already spent, rather than cutting losses and reallocating resources to a campaign or venture with a higher likelihood of success (3).
In Military and War:
In military strategy, biases can have grave consequences: human cost and loss of life, social and economic damage, destabilization, government overthrow, financial ruin, the loss of a kingdom. You get the idea.
War is a high-stakes game, and mistakes are costly. The stress involved can lead even seasoned military strategists to succumb to bias.
One such bias that has been consistently observed in the theater of war is the anchoring effect, which can lead strategists to rely too heavily on the first piece of intelligence they receive, shaping subsequent decisions around that anchor even after it has proven unreliable.
A common smoov brain reasoning error.
Another particularly hazardous error one can make in a war room setting is groupthink, where the pressure for group cohesion may suppress dissenting opinions and lead to suboptimal decision-making and outcomes (4).
Additionally, hindsight bias (where people convince themselves after an event that they accurately predicted it, and conclude that they can predict other events just as well) can be a significant impediment to strategic evaluation. It can stop decision-makers from learning from past mistakes, feeding future strategies the wrong variables and increasing the probability of suboptimal outcomes.
In Personal Relationships:
We have all been there: we notice someone and let a cognitive bias like the halo effect take over, allowing one impressive trait (like physical attractiveness or professional success) to positively influence our evaluation of that person’s other traits (5).
On the flip side of that coin, the horn effect can lead to a disproportionately negative perception of a person based on one trait.
Along the same lines, our brains can trick us into familiarity bias, leading us to prefer things simply because we know them, which can shape who we form relationships with and how we react to new people entering our established social circles.
And lastly, the false consensus effect leads us to assume that others share our beliefs and behaviors to a greater extent than they actually do, which can result in misunderstandings and conflicts in personal interactions.
An all-too-common story.
Final Thoughts
Today we talked about cognitive biases: the deeply embedded tendencies of our brains to acquire and process information by filtering it through our own likes, dislikes, and experiences.
Cognitive biases are pervasive and can significantly influence an array of decisions and behaviors, often without us being consciously aware that it’s even happening.
They’re like the brain’s own quirky software glitches, endearing in their predictability, yet baffling in their logic.
From the overconfident swagger of the “trust me bro, I’m always right” bias to the nostalgic rose-colored glasses of the “good old days” fallacy, our ‘smoov brain’ reasoning errors are not just annoying software glitches or fascinating quirks; they can actually be quite useful, reminding us that even at our most sophisticated, we’re still delightfully human.
Understanding these biases is not just an academic exercise; it’s a crucial step towards improving our decision-making processes, fostering better judgment, and navigating the world with a clearer, more rational perspective.
So, whether you’re a CEO making a strategic business decision, a general planning a military operation, or just a common plebeian navigating a personal relationship, being aware of these biases and actively working to mitigate their influence can lead to more rational decision-making and better outcomes.
What do you recognize?
I’m curious… what cognitive biases do you recognize in yourself or others?
Let me know on X/Twitter: twitter.com/jaminthompson.
P.S. Would you like additional help? Our Lab can develop and design custom models for your business to help you avoid costly miscalculations and strategic errors. Contact us for more information about how you can improve decision making with Big Data Engineering.
References
- Tversky, A., & Kahneman, D. (1979). Prospect Theory: An Analysis of Decision Making Under Risk. Decision Research.
- Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185(4157), 1124–1131.
- Arkes, H. R., & Blumer, C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35(1), 124–140.
- Janis, I. L. (1972). Victims of Groupthink. Houghton Mifflin.
- Thorndike, E. L. (1920). A constant error in psychological ratings. Journal of Applied Psychology, 4(1), 25–29.