
Cognitive Biases and Logical Fallacies

Updated: Sep 6, 2023

Humans are imperfect. We make decisions many times each day: what to eat for lunch, how to handle a coworker's conflicting idea on a project, how to react to shocking news, whom to vote for, how to spend money, what to believe about climate change. Much of what drives those decisions involves little deliberate thought at all; instead, we run our lives on shortcuts so our brains don't have to work hard all the time. These shortcuts come in the form of heuristics and cognitive biases that save us time, but they carry a cost: they leave us open to deception, by ourselves and by others, and to poor decisions.

What Is Cognitive Bias?

A cognitive bias is a systematic error in thinking that occurs as people process and interpret information in the world around them, and it affects the decisions and judgments they make.

Logic is the systematic study of valid rules of inference. Here we will explore the common ways that people inadvertently defy logic when forming opinions and making decisions.


Confirmation Bias:

Favoring information that conforms to your existing beliefs and discounting evidence that does not conform.

  • Continued Influence Effect: The tendency to believe previously learned misinformation even after it has been corrected. Misinformation can still influence inferences one generates after a correction has occurred.

  • Experimenter's / Expectation Bias: The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.

  • Observer Expectancy Effect: When a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it.

  • Selective Perception: The tendency for expectations to affect perception.

  • Semmelweis Reflex: The tendency to reject new evidence that contradicts a paradigm.


Apophenia:

The tendency to see meaningful connections between unrelated things, or to seek patterns in random information.

  • Clustering Illusion: The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).

  • Illusory Correlation: Inaccurately perceiving a relationship between two unrelated events.

  • Pareidolia: The tendency for incorrect perception of a stimulus as an object, pattern or meaning known to the observer, such as seeing shapes in clouds, seeing faces in inanimate objects or abstract patterns, or hearing hidden messages in music.
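
The clustering illusion above is easy to demonstrate numerically: genuinely random sequences routinely contain runs long enough to look deliberate. A minimal Python sketch (the helper name `longest_streak` is mine, not from any library):

```python
import random

def longest_streak(flips):
    """Length of the longest run of identical outcomes in a sequence."""
    if not flips:
        return 0
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

random.seed(0)  # fixed seed so the demo is repeatable
flips = [random.choice("HT") for _ in range(100)]
# In 100 fair coin flips, a streak of 5 or more is very likely,
# yet observers tend to read such runs as a meaningful pattern.
print(longest_streak(flips))
```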

Logical Fallacy:

The use of invalid or otherwise faulty reasoning in the construction of an argument or idea.

  • Berkson's Paradox: The tendency to misinterpret statistical experiments involving conditional probabilities.

  • Gambler's Fallacy: The tendency to think that future probabilities are altered by past events, when in reality they are unchanged.

  • Hot-Hand Fallacy: The belief that a person who has experienced success with a random event has a greater chance of further success in additional attempts.

  • Illicit Transference: When the distributive sense of a term (referring to every member of a class) and its collective sense (referring to the class itself as a whole) are treated as equivalent.

  • Irrational Escalation: The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong. Also known as the sunk cost fallacy.

  • Plan Continuation Bias: Failure to recognize that the original plan of action is no longer appropriate for a changing situation or for a situation that is different than anticipated.

  • Subadditivity Effect: The tendency to judge the probability of the whole to be less than the probabilities of the parts.

  • Zero-Sum Bias: A bias whereby a situation is incorrectly perceived to be like a zero-sum game.
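
The gambler's fallacy lends itself to a quick simulation: in a fair-coin sequence, the flip that immediately follows a long run of heads is still heads about half the time. A sketch under that setup (the function name and counting scheme are my own):

```python
import random

def heads_rate_after_streak(streak_len, trials=100_000, seed=1):
    """Estimate P(heads) on flips that immediately follow a run of
    `streak_len` consecutive heads in a long fair-coin sequence."""
    rng = random.Random(seed)
    heads_after, after_streak, run = 0, 0, 0
    for _ in range(trials):
        flip = rng.random() < 0.5   # True means heads
        if run >= streak_len:       # this flip follows a qualifying run
            after_streak += 1
            heads_after += flip
        run = run + 1 if flip else 0
    return heads_after / after_streak

# Stays near 0.5: past flips do not change future probabilities.
print(round(heads_rate_after_streak(5), 2))
```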

Prospect Theory:

A theory of the psychology of choice, developed by Daniel Kahneman and Amos Tversky in 1979, that describes how individuals assess prospective losses and gains asymmetrically. It finds application in behavioral economics and behavioral finance.

  • Ambiguity Effect: The tendency to avoid options for which the probability of a favorable outcome is unknown.

  • Disposition Effect: The tendency to sell an asset that has risen in value while resisting selling an asset that has declined in value.

  • Dread Aversion: Just as losses yield double the emotional impact of gains, dread yields double the emotional impact of savoring.

  • Endowment Effect: The tendency for people to demand much more to give up an object than they would be willing to pay to acquire it.

  • Loss Aversion: The perceived disutility of giving up an object is greater than the utility associated with acquiring it.

  • Pseudocertainty Effect: The tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.

  • Status Quo Bias: The tendency to like things to stay relatively the same.
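
Prospect theory's asymmetry is often summarized with a value function that is concave for gains, convex for losses, and steeper for losses. A sketch using the median parameter estimates Tversky and Kahneman later reported (alpha ≈ 0.88, lambda ≈ 2.25); treat the exact numbers as illustrative:

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky value function: v(x) = x**alpha for gains,
    v(x) = -lam * (-x)**alpha for losses (lam > 1 encodes loss aversion)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

gain, loss = prospect_value(100), prospect_value(-100)
# A $100 loss carries more than twice the psychological weight of
# an equal gain, which is the core of loss aversion.
print(round(gain, 1), round(loss, 1))
```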

Availability Bias:

A mental shortcut that relies on immediate examples that come to mind when evaluating a specific topic, concept, method or decision.

  • Attentional Bias: This is the tendency to pay attention to some things while simultaneously ignoring others. For example, when making a decision on which car to buy, you may pay attention to the look and feel of the exterior and interior, but ignore the safety record and gas mileage.

  • Availability Heuristic: Placing greater value on information that comes to your mind quickly. You give greater credence to this information and tend to overestimate the probability and likelihood of similar things happening in the future.

  • Frequency Illusion: Once something has been noticed, then every instance of that thing is noticed, leading to the belief it has a high frequency of occurrence.

  • Salience Bias: The tendency to focus on items that are more prominent or emotionally striking and ignore those that are unremarkable.

  • Survivorship Bias: Concentrating on the people or things that "survived" some process and inadvertently overlooking those that didn't because of their lack of visibility.

  • Observational Selection Bias: The tendency to notice something more when something causes us to be more aware of it.

Extension Neglect:

When the sample size is ignored while evaluating a study in which the sample size is logically relevant.

  • Base Rate Fallacy: The tendency to ignore general information and focus on information only pertaining to the specific case, even when the general information is more important.

  • Compassion Fade: The predisposition to behave more compassionately towards a small number of identifiable victims than to a large number of anonymous ones.

  • Conjunction Fallacy: The tendency to assume that specific conditions are more probable than a more general version of those same conditions.

  • Duration Neglect: The neglect of the duration of an episode in determining its value.

  • Hyperbolic Discounting: The tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs. Hyperbolic discounting leads to choices that are inconsistent over time – people make choices today that their future selves would prefer not to have made, despite using the same reasoning.

  • Insensitivity to Sample Size: The tendency to under-expect variation in small samples.

  • Neglect of Probability: The tendency to completely disregard probability when making a decision under uncertainty.

  • Scope Neglect: The tendency to be insensitive to the size of a problem when evaluating it.

  • Zero-Risk Bias: Preference for reducing a small risk to zero over a greater reduction in a larger risk.
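
The base rate fallacy above has a classic numeric illustration: a rare condition paired with an accurate-sounding test. A short Bayes' rule sketch (the scenario numbers are hypothetical):

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' rule."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# A 1-in-1000 condition and a 99%-sensitive test with a 5% false-positive
# rate: even after a positive result, the chance of actually having the
# condition stays under 2%, because the low base rate dominates.
print(round(posterior(0.001, 0.99, 0.05), 3))
```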

Framing Effect:

When people decide on options based on whether the options are presented with positive or negative connotations; e.g. as a loss or as a gain.

  • Decoy Effect: Preferences for either option A or B change in favor of option B when option C is presented, which is completely dominated by option B (inferior in all respects) and partially dominated by option A.

  • Default Effect: When given a choice between several options, the tendency to favor the default one.

  • Distinction Bias: The tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.

Egocentric Bias:

The tendency to rely too heavily on one's own perspective and/or have a higher opinion of oneself than reality.

  • False Consensus Effect: The tendency to overestimate how much other people agree with you.

  • Barnum Effect: The observation that individuals will give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. This effect can provide a partial explanation for the widespread acceptance of some beliefs and practices, such as astrology, fortune telling, graphology, and some types of personality tests.

  • Illusion of Control: The tendency to overestimate one's degree of influence over external events.

  • Planning Fallacy: The tendency to underestimate one's own task-completion times.


Truthiness:

The belief or assertion that a particular statement is true based on the intuition or perceptions of some individual or individuals, without regard to evidence, logic, intellectual examination, or facts.

  • Belief Bias: An effect where someone's evaluation of the logical strength of an argument is biased by the believability of the conclusion.

  • Illusory Truth Effect: A tendency to believe that a statement is true if it is easier to process, or if it has been stated multiple times, regardless of its actual veracity.

  • Rhyme as Reason Effect: Rhyming statements are perceived as more truthful.

  • Subjective Validation: Perception that something is true if a subject's belief demands it to be true. Also assigns perceived connections between coincidences.

Anchoring Bias:

The tendency to rely too heavily on the first piece of information you learn when making subsequent judgments during decision making.

  • Conservatism Bias: The tendency to revise one's belief insufficiently when presented with new evidence.

  • Functional Fixedness: Limits a person to using an object only in the way it is traditionally used.

  • Law of the Instrument: An over-reliance on a familiar tool or method, ignoring or under-valuing alternative approaches. "If all you have is a hammer, everything looks like a nail."

Familiarity Principle:

A psychological phenomenon by which people develop a preference for things merely because they are familiar with them.

  • Mere Exposure Effect: The tendency to express undue liking for things merely because of familiarity with them.

Cognitive Dissonance:

The psychological stress a person experiences when holding contradictory beliefs, ideas, or values, typically felt when they take an action that goes against one or more of them.

  • Ben Franklin Effect: A person who has performed a favor for someone is more likely to do another favor for that person than they would be if they had received a favor from that person.

  • Normalcy Bias: The refusal to plan for, or react to, a disaster which has never happened before.


Other Notable Biases:

  • Actor-Observer Bias: The tendency to attribute your own actions to external causes while attributing other people's behaviors to internal causes. For example, you attribute your high cholesterol level to genetics while you consider others to have a high level due to poor diet and lack of exercise.

  • Authority Bias: The tendency to attribute greater accuracy to the opinion of an authority figure (unrelated to its content) and be more influenced by that opinion.

  • Bandwagon Effect: The tendency to do (or believe) things because many other people do (or believe) the same.

  • Courtesy Bias: The tendency to give an opinion that is more socially correct than one's true opinion, so as to avoid offending anyone.

  • Curse of Knowledge: When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people.

  • Stereotyping: Expecting a member of a group to have certain characteristics without having actual information about that individual.

  • Declinism: Viewing the past favorably (rosy retrospection) and the future negatively.

  • Dunning-Kruger Effect: The tendency for unskilled individuals to overestimate their own ability and the tendency for experts to underestimate their own ability.

  • Exaggerated Expectation: The tendency to expect or predict more extreme outcomes than those outcomes that actually happen.

  • Fading Affect Bias: A bias in which the emotion associated with unpleasant memories fades more quickly than the emotion associated with positive events.

  • Halo Effect: When your overall impression of a person influences how you feel and think about their character. This especially applies to physical attractiveness influencing how you rate their other qualities.

  • Hindsight Bias: The tendency to see past events as being predictable at the time those events happened.

  • IKEA Effect: The tendency for people to place a disproportionately high value on objects that they partially assembled themselves, such as furniture from IKEA, regardless of the quality of the end product.

  • Information Bias: The tendency to seek information even when it cannot affect action.

  • Ingroup Bias: The tendency for people to give preferential treatment to others they perceive to be members of their own groups.

  • Interoceptive Bias: The tendency for sensory input about the body itself to affect one's judgement about external, unrelated circumstances. (For example, in parole judges who are more lenient when fed and rested.)

  • Misinformation Effect: Memory becoming less accurate because of interference from post-event information.

  • Money Illusion: The tendency to concentrate on the nominal value (face value) of money rather than its value in terms of purchasing power.

  • Moral Credential Effect: Occurs when someone who does something good gives themselves permission to be less good in the future.

  • Non-Adaptive Choice Switching: After experiencing a bad outcome with a decision problem, the tendency to avoid the choice previously made when faced with the same decision problem again, even though the choice was optimal.

  • Omission Bias: The tendency to judge harmful actions (commissions) as worse, or less moral, than equally harmful inactions (omissions).

  • Optimism Bias: The tendency to be over-optimistic, underestimating greatly the probability of undesirable outcomes and overestimating favorable and pleasing outcomes.

  • Ostrich Effect: Ignoring an obvious (negative) situation.

  • Outcome Bias: The tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.

  • Pessimism Bias: The tendency for some people, especially those suffering from depression, to overestimate the likelihood of negative things happening to them.

  • Pro-Innovation Bias: The tendency to have an excessive optimism towards an invention or innovation's usefulness throughout society, while often failing to identify its limitations and weaknesses.

  • Projection Bias: The tendency to overestimate how much our future selves share one's current preferences, thoughts and values, thus leading to suboptimal choices.

  • Proportionality Bias: The innate tendency to assume that big events have big causes; it may also explain why people accept conspiracy theories.

  • Recency Illusion: The illusion that a phenomenon one has noticed only recently is itself recent.

  • Risk Compensation Effect: The tendency to take greater risks when perceived safety increases.

  • Surrogation: Losing sight of the strategic construct that a measure is intended to represent, and subsequently acting as though the measure is the construct of interest.

  • Parkinson's Law of Triviality: The tendency to give disproportionate weight to trivial issues. Also known as bikeshedding, this bias explains why an organization may avoid specialized or complex subjects, such as the design of a nuclear reactor, and instead focus on something easy to grasp or rewarding to the average participant, such as the design of an adjacent bike shed.

  • Unconscious Bias: Also known as implicit bias; the underlying attitudes and stereotypes that people unconsciously attribute to another person or group of people, affecting how they understand and engage with them.

  • Unit Bias: The standard suggested amount of consumption (e.g., food serving size) is perceived to be appropriate.

  • Weber-Fechner Law: Difficulty in comparing small differences in large quantities.

