Cognitive Confusion & Brain Biases

We are all guilty of being biased at times; we believe what we want to believe. We accept evidence that supports our worldview and outright ignore evidence that contradicts it. Some call this cognitive dissonance, but why do we reject information that counters our ideology? Basically, because our brains are biased. Consequently, cognitive confusion can be catastrophic when attempting to separate fact from fiction, frankly.

Alliteration aside, let’s dig in.

Cognitive Biases: Decision-making, belief & behavioral biases

  • Ambiguity effect – the tendency to avoid options for which missing information makes the probability seem “unknown.”
  • Anchoring – the tendency to rely too heavily, or “anchor,” on a past reference or on one trait or piece of information when making decisions (also called “insufficient adjustment”).
  • Attentional Bias – the tendency of emotionally dominant stimuli in one’s environment to preferentially draw and hold attention and to neglect relevant data when making judgments of a correlation or association.
  • Availability heuristic – estimating what is more likely by what is more available in memory, which is biased toward vivid, unusual, or emotionally charged examples.
  • Availability cascade – a self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or “repeat something long enough and it will become true”).
  • Backfire effect – the tendency for evidence disconfirming our beliefs to strengthen them instead.
  • Bandwagon effect – the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior.
  • Base rate neglect or Base rate fallacy – the tendency to base judgments on specifics, ignoring general statistical information.
  • Belief bias – an effect where someone’s evaluation of the logical strength of an argument is biased by the believability of the conclusion.
  • Bias blind spot – the tendency to see oneself as less biased than other people.
  • Choice-supportive bias – the tendency to remember one’s choices as better than they actually were.
  • Clustering illusion – the tendency to see patterns where actually none exist. Also referred to as “patternicity” by author Michael Shermer.
  • Confirmation bias – the tendency to search for or interpret information in a way that confirms one’s preconceptions.

  • Congruence bias – the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
  • Conjunction fallacy – the tendency to assume that specific conditions are more probable than general ones.
  • Conservatism or Regressive Bias – tendency to underestimate high values and high likelihoods/probabilities/frequencies and overestimate low ones. Based on the observed evidence, estimates are not extreme enough.
  • Contrast effect – the enhancement or diminishing of a weight or other measurement when compared with a recently observed contrasting object.
  • Denomination effect – the tendency to spend more money when it is denominated in small amounts (e.g. coins) rather than large amounts (e.g. bills).
  • Distinction bias – the tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.
  • Empathy gap – the tendency to underestimate the influence or strength of feelings, in either oneself or others.
  • Endowment effect – the fact that people often demand much more to give up an object than they would be willing to pay to acquire it.
  • Exaggerated expectation – based on the estimates, real-world evidence turns out to be less extreme than our expectations (conditionally inverse of the conservatism bias).
  • Experimenter’s or Expectation bias – the tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.
  • Focusing effect – the tendency to place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.

  • Forward Bias – the tendency to create models based on past data which are validated only against that past data.
  • Framing effect – drawing different conclusions from the same information, depending on how that information is presented.
  • Frequency illusion – the illusion in which a word, a name or other thing that has recently come to one’s attention suddenly appears “everywhere” with improbable frequency (see also recency illusion). AKA “The Baader-Meinhof phenomenon”.

  • Gambler’s fallacy – the tendency to think that future probabilities are altered by past events, when in reality they are unchanged. Results from an erroneous conceptualization of the Law of large numbers. For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads.”
  • Hard-easy effect – based on a specific level of task difficulty, the confidence in judgments is too conservative and not extreme enough.
  • Hindsight bias – sometimes called the “I-knew-it-all-along” effect, the tendency to see past events as being predictable at the time those events happened.
  • Hostile media effect – the tendency to see a media report as being biased due to one’s own strong partisan views.
  • Hyperbolic discounting – the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, where the tendency increases the closer to the present both payoffs are.
  • Illusion of control – the tendency to overestimate one’s degree of influence over other external events.
  • Illusory correlation – inaccurately perceiving a relationship between two unrelated events.
  • Impact bias – the tendency to overestimate the length or the intensity of the impact of future feeling states.
  • Information bias – tendency to seek information even when it cannot affect action.
  • Irrational escalation – the phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong.
  • Just-world hypothesis – the tendency for people to want to believe that the world is fundamentally just, causing them to rationalize an otherwise inexplicable injustice as deserved by the victim(s).
  • Loss aversion – “the disutility of giving up an object is greater than the utility associated with acquiring it”. (see also Sunk cost effects and Endowment effect).
  • Mere exposure effect – the tendency to express undue liking for things merely because of familiarity with them.
  • Money illusion – the tendency to concentrate on the nominal (face value) of money rather than its value in terms of purchasing power.
  • Moral credential effect – the tendency of a track record of non-prejudice to increase subsequent prejudice.
  • Negativity bias – the tendency to pay more attention and give more weight to negative than positive experiences or other kinds of information.
  • Neglect of probability – the tendency to completely disregard probability when making a decision under uncertainty.
  • Normalcy bias – the refusal to plan for, or react to, a disaster which has never happened before.
  • Observer-expectancy effect – when a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
  • Omission bias – the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
  • Optimism bias – the tendency to be over-optimistic, overestimating favorable and pleasing outcomes (see also wishful thinking, valence effect, positive outcome bias).
  • Ostrich effect – ignoring an obvious (negative) situation.
  • Outcome bias – the tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
  • Overconfidence effect – excessive confidence in one’s own answers to questions. For example, for certain types of questions, answers that people rate as “99% certain” turn out to be wrong 40% of the time.
  • Pareidolia – a vague and random stimulus (often an image or sound) is perceived as significant, e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing hidden messages on records played in reverse.
  • Pessimism bias – the tendency for some people, especially those suffering from depression, to overestimate the likelihood of negative things happening to them.
  • Placement bias – tendency to believe ourselves to be better than others at tasks at which we rate ourselves above average (also Illusory superiority or Better-than-average effect) and tendency to believe ourselves to be worse than others at tasks at which we rate ourselves below average (also Worse-than-average effect).
  • Planning fallacy – the tendency to underestimate task-completion times.
  • Post-purchase rationalization – the tendency to persuade oneself through rational argument that a purchase was a good value.
  • Primacy effect – the greater ease of recall of initial items in a sequence compared to items in the middle of the sequence.
  • Pro-innovation bias – the tendency to reflect a personal bias towards an invention/innovation, while often failing to identify limitations and weaknesses or address the possibility of failure.
  • Pseudocertainty effect – the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
  • Reactance – the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice.
  • Recency bias – a cognitive bias that results from disproportionate salience of recent stimuli or observations — the tendency to weigh recent events more than earlier events (see also peak-end rule).
  • Recency illusion – the illusion that a phenomenon, typically a word or language usage, that one has just begun to notice is a recent innovation (see also frequency illusion).
  • Regressive Bayesian likelihood – estimates of conditional probabilities are conservative and not extreme enough.
  • Restraint bias – the tendency to overestimate one’s ability to show restraint in the face of temptation.
  • Selective perception – the tendency for expectations to affect perception.
  • Semmelweis reflex – the tendency to reject new evidence that contradicts a paradigm.
  • Social comparison bias – the tendency, when making hiring decisions, to favour potential candidates who don’t compete with one’s own particular strengths.
  • Status quo bias – the tendency to like things to stay relatively the same (see also loss aversion, endowment effect, and system justification).
  • Stereotyping – expecting a member of a group to have certain characteristics without having actual information about that individual.
  • Subadditivity effect – the tendency to estimate that the likelihood of an event is less than the sum of its (more than two) mutually exclusive components.
  • Subjective validation – perception that something is true if a subject’s belief demands it to be true. Also assigns perceived connections between coincidences.
  • Unit bias — the tendency to want to finish a given unit of a task or an item. Strong effects on the consumption of food in particular.
  • Well travelled road effect – underestimation of the duration taken to traverse oft-traveled routes and over-estimate the duration taken to traverse less familiar routes.
  • Zero-risk bias – preference for reducing a small risk to zero over a greater reduction in a larger risk.
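Several of the biases above, the gambler’s fallacy in particular, are violations of basic probability, and a short simulation makes the point concrete. Here is a minimal sketch in Python (the fair coin and the five-heads streak come from the example in the list; the function name and trial count are my own):

```python
import random

random.seed(42)

# Gambler's fallacy check: after five consecutive heads, is the
# sixth flip really biased toward tails? Simulate and see.
def sixth_flip_after_five_heads(trials=200_000):
    heads_after_streak = 0
    streaks = 0
    for _ in range(trials):
        flips = [random.random() < 0.5 for _ in range(6)]
        if all(flips[:5]):          # first five flips were all heads
            streaks += 1
            heads_after_streak += flips[5]
    return heads_after_streak / streaks

p = sixth_flip_after_five_heads()
print(f"P(heads on 6th flip | five heads) ~ {p:.3f}")  # stays around 0.5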

Social biases

  1. Actor–observer bias – the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation (see also Fundamental attribution error), and for explanations of one’s own behaviors to do the opposite (that is, to overemphasize the influence of our situation and underemphasize the influence of our own personality).
  2. Defensive attribution hypothesis – defensive attributions are made when individuals witness or learn of a mishap happening to another person. In these situations, attributions of responsibility to the victim or harm-doer for the mishap will depend upon the severity of the outcomes of the mishap and the level of personal and situational similarity between the individual and victim. More responsibility will be attributed to the harm-doer as the outcome becomes more severe, and as personal or situational similarity decreases.
  3. Dunning–Kruger effect – an effect in which incompetent people fail to realize they are incompetent, because they lack the skill to distinguish between competence and incompetence.
  4. Egocentric bias – occurs when people claim more responsibility for themselves for the results of a joint action than an outside observer would.
  5. Forer effect (aka Barnum effect) – the tendency to give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. For example, horoscopes.
  6. False consensus effect – the tendency for people to overestimate the degree to which others agree with them.
  7. Fundamental attribution error – the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior.
  8. Halo effect – the tendency for a person’s positive or negative traits to “spill over” from one area of their personality to another in others’ perceptions of them (see also physical attractiveness stereotype).
  9. Illusion of asymmetric insight – people perceive their knowledge of their peers to surpass their peers’ knowledge of them.
  10. Illusion of transparency – people overestimate others’ ability to know them, and they also overestimate their ability to know others.
  11. Illusory superiority – overestimating one’s desirable qualities, and underestimating undesirable qualities, relative to other people. (Also known as “Lake Wobegon effect,” “better-than-average effect,” or “superiority bias”).
  12. Ingroup bias – the tendency for people to give preferential treatment to others they perceive to be members of their own groups.
  13. Just-world phenomenon – the tendency for people to believe that the world is just and therefore people “get what they deserve.”
  14. Moral luck – the tendency for people to ascribe greater or lesser moral standing based on the outcome of an event rather than the intention.
  15. Outgroup homogeneity bias – individuals see members of their own group as being relatively more varied than members of other groups.
  16. Projection bias – the tendency to unconsciously assume that others (or one’s future selves) share one’s current emotional states, thoughts and values.
  17. Self-serving bias – the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests (see also group-serving bias).
  18. System justification – the tendency to defend and bolster the status quo. Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged sometimes even at the expense of individual and collective self-interest. (See also status quo bias.)
  19. Trait ascription bias – the tendency for people to view themselves as relatively variable in terms of personality, behavior, and mood while viewing others as much more predictable.
  20. Ultimate attribution error – similar to the fundamental attribution error, in this error a person is likely to make an internal attribution to an entire group instead of the individuals within the group.

Memory errors

  • Cryptomnesia – a form of misattribution where a memory is mistaken for imagination.
  • Egocentric bias – recalling the past in a self-serving manner, e.g., remembering one’s exam grades as being better than they were, or remembering a caught fish as being bigger than it was.
  • False memory – a form of misattribution where imagination is mistaken for a memory.
  • Hindsight bias – filtering memory of past events through present knowledge, so that those events look more predictable than they actually were; also known as the “I-knew-it-all-along effect.”
  • Positivity effect – older adults remember relatively more positive than negative things, compared with younger adults.
  • Reminiscence bump – the effect that people tend to recall more personal events from adolescence and early adulthood than from other lifetime periods.
  • Rosy retrospection – the tendency to rate past events more positively than they had actually rated them when the event occurred.
  • Self-serving bias – perceiving oneself responsible for desirable outcomes but not responsible for undesirable ones.
  • Suggestibility – a form of misattribution where ideas suggested by a questioner are mistaken for memory.
  • Telescoping effect – the effect that recent events appear to have occurred more remotely and remote events appear to have occurred more recently.
  • Von Restorff effect – the tendency for an item that “stands out like a sore thumb” to be more likely to be remembered than other items.

List of memory biases

  1. Choice-supportive bias: remembering chosen options as having been better than rejected options.
  2. Change bias: after an investment of effort in producing change, remembering one’s past performance as more difficult than it actually was.
  3. Childhood amnesia: the retention of few memories from before the age of four.
  4. Consistency bias: incorrectly remembering one’s past attitudes and behaviour as resembling present attitudes and behaviour.
  5. Context effect: that cognition and memory are dependent on context, such that out-of-context memories are more difficult to retrieve than in-context memories (e.g., recall time and accuracy for a work-related memory will be lower at home, and vice versa).
  6. Cross-race effect: the tendency for people of one race to have difficulty identifying members of a race other than their own.
  7. Cryptomnesia: a form of misattribution where a memory is mistaken for imagination, because there is no subjective experience of it being a memory.
  8. Egocentric bias: recalling the past in a self-serving manner, e.g., remembering one’s exam grades as being better than they were, or remembering a caught fish as bigger than it really was.
  9. Fading affect bias: a bias in which the emotion associated with unpleasant memories fades more quickly than the emotion associated with positive events.
  10. Generation effect (self-generation effect): that self-generated information is remembered best. For instance, people are better able to recall memories of statements that they have generated than similar statements generated by others.
  11. Google effect: the tendency to forget information that can be easily found online.
  12. Hindsight bias: the inclination to see past events as being predictable; also called the “I-knew-it-all-along” effect.
  13. Humor effect: that humorous items are more easily remembered than non-humorous ones, which might be explained by the distinctiveness of humor, the increased cognitive processing time to understand the humor, or the emotional arousal caused by the humor.
  14. Illusion-of-truth effect: that people are more likely to identify as true statements those they have previously heard (even if they cannot consciously remember having heard them), regardless of the actual validity of the statement. In other words, a person is more likely to believe a familiar statement than an unfamiliar one.
  15. Leveling and sharpening: memory distortions introduced by the loss of details in a recollection over time, often concurrent with sharpening or selective recollection of certain details that take on exaggerated significance in relation to the details or aspects of the experience lost through leveling. Both biases may be reinforced over time, and by repeated recollection or re-telling of a memory.
  16. Levels-of-processing effect: that different methods of encoding information into memory have different levels of effectiveness.
  17. List-length effect: a smaller percentage of items are remembered in a longer list, but as the length of the list increases, the absolute number of items remembered increases as well.
  18. Misinformation effect: misinformation affects people’s reports of their own memory.
  19. Misattribution: when information is retained in memory but the source of the memory is forgotten. One of Schacter’s (1999) seven sins of memory, misattribution is divided into source confusion, cryptomnesia and false recall/false recognition.
  20. Modality effect: that memory recall is higher for the last items of a list when the list items were received via speech than when they were received via writing.
  21. Mood-congruent memory bias: the improved recall of information congruent with one’s current mood.
  22. Next-in-line effect: that a person in a group has diminished recall for the words of others who spoke immediately before or after this person.
  23. Osborn effect: that being intoxicated with a mind-altering substance makes it harder to retrieve motor patterns from the basal ganglia.
  24. Part-list cueing effect: being shown some items from a list makes it harder to retrieve the other items.
  25. Peak-end effect: that people seem to perceive not the sum of an experience but the average of how it was at its peak (e.g. pleasant or unpleasant) and how it ended.
  26. Persistence: the unwanted recurrence of memories of a traumatic event.
  27. Picture superiority effect: that concepts are much more likely to be remembered experientially if they are presented in picture form than if they are presented in word form.
  28. Positivity effect: older adults favor positive over negative information in their memories.
  29. Primacy effect, recency effect & serial position effect: that items near the end of a list are the easiest to recall, followed by the items at the beginning of a list; items in the middle are the least likely to be remembered.
  30. Processing difficulty effect: that information that takes longer to read and is thought about more (processed with more difficulty) is more easily remembered.
  31. Reminiscence bump: the recalling of more personal events from adolescence and early adulthood than personal events from other lifetime periods.
  32. Rosy retrospection: the remembering of the past as having been better than it really was.
  33. Self-relevance effect: that memories relating to the self are better recalled than similar information relating to others.
  34. Source confusion: misattributing the source of a memory, e.g. misremembering that one saw an event personally when actually it was seen on television.
  35. Spacing effect: that information is better recalled if exposure to it is repeated over a longer span of time.
  36. Stereotypical bias: memory distorted towards stereotypes (e.g. racial or gender), e.g. “black-sounding” names being misremembered as names of criminals.
  37. Suffix effect: the weakening of the recency effect when an item that the subject is not required to recall is appended to the list.
  38. Suggestibility: a form of misattribution where ideas suggested by a questioner are mistaken for memory.
  39. Telescoping effect: the tendency to displace recent events backward in time and remote events forward in time, so that recent events appear more remote, and remote events more recent.
  40. Testing effect: frequent testing of material that has been committed to memory improves memory recall.
  41. Tip-of-the-tongue phenomenon: when a subject is able to recall parts of an item, or related information, but is frustratingly unable to recall the whole item. This is thought to be an instance of “blocking,” where multiple similar memories are being recalled and interfere with each other.
  42. Verbatim effect: that the “gist” of what someone has said is better remembered than the verbatim wording.
  43. Von Restorff effect: that an item that sticks out is more likely to be remembered than other items.
  44. Zeigarnik effect: uncompleted or interrupted tasks are remembered better than completed ones.

A formal fallacy is an error in logic that can be seen in the argument’s form without an understanding of the argument’s content. All formal fallacies are specific types of non sequiturs.

  • Appeal to authority (argumentum ad verecundiam) – deductively fallacious; even legitimate authorities speaking on their areas of expertise may affirm a falsehood. Outside a deductive argument, however, the fallacy arises only when the source is not a legitimate expert on the topic at hand, or when its conclusions directly oppose the wider expert consensus. Citing an authority does not, by itself, oblige one to accept the argument.
  • Appeal to probability – assumes that because something could happen, it is inevitable that it will happen.
  • Argument from fallacy – assumes that if an argument for some conclusion is fallacious, then the conclusion itself is false.
  • Base rate fallacy – making a probability judgment based on conditional probabilities, without taking into account the effect of prior probabilities.
  • Conjunction fallacy – assumption that an outcome simultaneously satisfying multiple conditions is more probable than an outcome satisfying a single one of them.
  • Masked man fallacy (illicit substitution of identicals) – the substitution of identical designators in a true statement can lead to a false one.
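The base rate fallacy above is easiest to see with numbers. Here is a minimal sketch of Bayes’ rule in Python, with illustrative figures: a 1-in-1,000 condition, a 99%-sensitive test, and a 5% false-positive rate (all three numbers are assumptions chosen for the example):

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' rule."""
    # Total probability of testing positive: true positives + false positives.
    p_pos = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_pos

# Illustrative numbers: rare condition, fairly accurate test.
p = posterior(prior=0.001, sensitivity=0.99, false_positive_rate=0.05)
print(f"P(condition | positive) = {p:.3f}")  # prints 0.019
```

Even a seemingly accurate test yields mostly false positives when the prior is tiny; the general statistical information being ignored is precisely that prior.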

Quantificational fallacies

Existential fallacy – an argument has two universal premises and a particular conclusion.

Formal syllogistic fallacies – logical fallacies that occur in syllogisms.

Informal fallacies — arguments that are fallacious for reasons other than structural (formal) flaws and which usually require examination of the argument’s content.

  • Argument from ignorance (appeal to ignorance, argumentum ad ignorantiam) – assuming that a claim is true (or false) because it has not been proven false (true) or cannot be proven false (true).
  • Argument from repetition (argumentum ad nauseam) – repeating an argument until nobody cares to discuss it anymore, then treating that exhaustion as if it settled the matter.
  • Argument from silence (argumentum e silentio) – where a conclusion is drawn from an opponent’s silence or failure to provide evidence, i.e. from a “lack of evidence.”
  • Argumentum verbosium – See Proof by verbosity, below.
  • Begging the question (petitio principii) – where the conclusion of an argument is implicitly or explicitly assumed in one of the premises
  • (shifting the) Burden of proof (see onus probandi) – “I need not prove my claim; you must prove it is false.”
  • Circular cause and consequence – where the consequence of the phenomenon is claimed to be its root cause
  • Continuum fallacy (fallacy of the beard, line-drawing fallacy, sorites fallacy, fallacy of the heap, bald man fallacy) – improperly rejecting a claim for being imprecise.
  • Correlation does not imply causation (cum hoc ergo propter hoc) – a faulty assumption that correlation between two variables implies that one causes the other.
  • Correlative-based fallacies
  • Equivocation – the misleading use of a term with more than one meaning (by glossing over which meaning is intended at a particular time)
  • Ecological fallacy – inferences about the nature of specific individuals are based solely upon aggregate statistics collected for the group to which those individuals belong.
  • Etymological fallacy – which reasons that the original or historical meaning of a word or phrase is necessarily similar to its actual present-day meaning.
  • Fallacy of composition – assuming that something true of part of a whole must also be true of the whole
  • Fallacy of division – assuming that something true of a thing must also be true of all or some of its parts
  • False dilemma (false dichotomy, fallacy of bifurcation, black-or-white fallacy) – two alternative statements are held to be the only possible options, when in reality there are more.
  • If-by-whiskey – an argument that supports both sides of an issue by using terms that are selectively emotionally sensitive.
  • Fallacy of many questions (complex question, fallacy of presupposition, loaded question, plurium interrogationum) – someone asks a question that presupposes something that has not been proven or accepted by all the people involved. This fallacy is often used rhetorically, so that the question limits direct replies to those that serve the questioner’s agenda.
  • Ludic fallacy – the belief that the outcomes of non-regulated random occurrences can be encapsulated by a statistic; a failure to take into account unknown unknowns in determining the probability of an event’s taking place.
  • Fallacy of the single cause (causal oversimplification) – it is assumed that there is one, simple cause of an outcome when in reality it may have been caused by a number of only jointly sufficient causes.
  • False attribution – an advocate appeals to an irrelevant, unqualified, unidentified, biased or fabricated source in support of an argument
    • Fallacy of quoting out of context (contextomy) – refers to the selective excerpting of words from their original context in a way that distorts the source’s intended meaning.
  • Argument to moderation (false compromise, middle ground, fallacy of the mean) – assuming that the compromise between two positions is always correct
  • Gambler’s fallacy – the incorrect belief that separate, independent events can affect the likelihood of another random event.
  • Historian’s fallacy – occurs when one assumes that decision makers of the past viewed events from the same perspective and having the same information as those subsequently analyzing the decision. (Not to be confused with presentism, which is a mode of historical analysis in which present-day ideas, such as moral standards, are projected into the past.)
  • Homunculus fallacy – where a “middle-man” is used for explanation; this usually leads to a regress of middle-men. It explains a concept in terms of the concept itself, without first defining or explaining the real nature of the function or process.
  • Incomplete comparison – where not enough information is provided to make a complete comparison
  • Inconsistent comparison – where different methods of comparison are used, leaving one with a false impression of the whole comparison
  • Intentional fallacy – addresses the assumption that the meaning intended by the author of a literary work is of primary importance
  • Ignoratio elenchi (irrelevant conclusion, missing the point) – an argument that may in itself be valid, but does not address the issue in question.
  • Kettle logic – using multiple inconsistent arguments to defend a position.
  • Mind projection fallacy – when one considers the way he sees the world as the way the world really is.
  • Moving the goalposts (raising the bar) – argument in which evidence presented in response to a specific claim is dismissed and some other (often greater) evidence is demanded
  • Nirvana fallacy (perfect solution fallacy) – when solutions to problems are rejected because they are not perfect.
  • Onus probandi – from the Latin “onus probandi incumbit ei qui dicit, non ei qui negat”: the burden of proof is on the person who makes the claim, not on the person who denies (or questions) the claim. It is a particular case of the argumentum ad ignorantiam fallacy, in which the burden is shifted onto the person defending against the assertion.
  • Petitio principii – see begging the question
  • Post hoc ergo propter hoc (false cause, coincidental correlation, correlation not causation) – X happened then Y happened; therefore X caused Y
  • Proof by verbosity (argumentum verbosium, proof by intimidation) – submission of others to an argument too complex and verbose to reasonably deal with in all its intimate details. (See also Gish Gallop and argument from authority.)
  • Prosecutor’s fallacy – a low probability of false matches does not mean a low probability of some false match being found
  • Psychologist’s fallacy – an observer presupposes the objectivity of his own perspective when analyzing a behavioral event
  • Red herring – a speaker attempts to distract an audience by deviating from the topic at hand by introducing a separate argument which the speaker believes will be easier to speak to.
  • Regression fallacy – ascribes cause where none exists. The flaw is failing to account for natural fluctuations. It is frequently a special kind of the post hoc fallacy.
  • Reification (hypostatization) – a fallacy of ambiguity, when an abstraction (abstract belief or hypothetical construct) is treated as if it were a concrete, real event or physical entity. In other words, it is the error of treating as a “real thing” something which is not a real thing, but merely an idea.
  • Retrospective determinism – the argument that because some event has occurred, its occurrence must have been inevitable beforehand
  • Special pleading – where a proponent of a position attempts to cite something as an exemption to a generally accepted rule or principle without justifying the exemption
  • Straw man – an argument based on misrepresentation of an opponent’s position, twisting their words, or relying on false assumptions.
  • Wrong direction – cause and effect are reversed. The cause is said to be the effect and vice versa.
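The prosecutor’s fallacy above is, at bottom, an arithmetic error, so a small numerical sketch may help. The match probability and database size here are hypothetical, chosen only for illustration:

```python
# Prosecutor's fallacy: a tiny chance that one *specific* innocent person
# matches by coincidence does NOT imply a tiny chance that *some* innocent
# person matches, once a large pool of people is searched.

p_false_match = 1e-6        # hypothetical: 1-in-a-million chance that a given
                            # innocent person matches by coincidence
database_size = 10_000_000  # hypothetical: number of people searched

# Probability of at least one coincidental match somewhere in the pool
p_some_false_match = 1 - (1 - p_false_match) ** database_size

print(f"{p_some_false_match:.5f}")  # very close to 1: some false match is near-certain
```

Presenting the 1-in-a-million figure as if it were the probability that a matched defendant is innocent is exactly the inversion this fallacy names.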

Faulty generalizations reach a conclusion from weak premises. Unlike fallacies of relevance, in fallacies of defective induction, the premises are related to the conclusions yet only weakly buttress the conclusions. A faulty generalization is thus produced.

  • Accident – an exception to a generalization is ignored.
    • No true Scotsman – when a generalization is made true only when a counterexample is ruled out on shaky grounds.
  • Cherry picking (suppressed evidence, incomplete evidence) – act of pointing at individual cases or data that seem to confirm a particular position, while ignoring a significant portion of related cases or data that may contradict that position.
  • False analogy – an argument by analogy in which the analogy is poorly suited.
  • Hasty generalization (fallacy of insufficient statistics, fallacy of insufficient sample, fallacy of the lonely fact, leaping to a conclusion, hasty induction, secundum quid, converse accident) – basing a broad conclusion on a small sample.
  • Misleading vividness – involves describing an occurrence in vivid detail, even if it is an exceptional occurrence, to convince someone that it is a problem.
  • Overwhelming exception – an accurate generalization that comes with qualifications which eliminate so many cases that what remains is much less impressive than the initial statement might have led one to assume.
  • Pathetic fallacy – when an inanimate object is declared to have characteristics of animate objects.
  • Thought-terminating cliché – a commonly used phrase, sometimes passing as folk wisdom, used to quell cognitive dissonance, conceal a lack of thought, move on to other topics, etc.; in any case, it ends the debate with a cliché rather than a point.

Red herring fallacies – an argument given in response to another argument that is irrelevant and draws attention away from the subject of the argument. See also irrelevant conclusion.

  • Ad hominem – attacking the arguer instead of the argument.
    • Poisoning the well – a type of ad hominem where adverse information about a target is presented with the intention of discrediting everything that the target person says
    • Abusive fallacy – a subtype of “ad hominem” when it turns into name-calling rather than arguing about the originally proposed argument.
  • Argumentum ad baculum (appeal to the stick, appeal to force, appeal to threat) – an argument made through coercion or threats of force to support a position.
  • Argumentum ad populum (appeal to belief, appeal to the majority, appeal to the people) – where a proposition is claimed to be true or good solely because many people believe it to be so
  • Appeal to equality – where an assertion is deemed true or false based on an assumed pretense of equality.
  • Association fallacy (guilt by association) – arguing that because two things share a property they are the same
  • Appeal to authority – where an assertion is deemed true because of the position or authority of the person asserting it.
  • Appeal to consequences (argumentum ad consequentiam) – the conclusion is supported by a premise that asserts positive or negative consequences from some course of action in an attempt to distract from the initial discussion
  • Appeal to emotion – where an argument is made due to the manipulation of emotions, rather than the use of valid reasoning
    • Appeal to fear – a specific type of appeal to emotion where an argument is made by increasing fear and prejudice towards the opposing side
    • Appeal to flattery – a specific type of appeal to emotion where an argument is made due to the use of flattery to gather support.
    • Appeal to pity (argumentum ad misericordiam) – an argument attempts to induce pity to sway opponents
    • Appeal to ridicule – an argument is made by presenting the opponent’s argument in a way that makes it appear ridiculous
    • Appeal to spite – a specific type of appeal to emotion where an argument is made through exploiting people’s bitterness or spite towards an opposing party
    • Wishful thinking – a specific type of appeal to emotion where a decision is made according to what might be pleasing to imagine, rather than according to evidence or reason.
  • Appeal to motive – where a premise is dismissed by calling into question the motives of its proposer
  • Appeal to novelty (argumentum ad novitatem) – where a proposal is claimed to be superior or better solely because it is new or modern.
  • Appeal to poverty (argumentum ad Lazarum) – supporting a conclusion because the arguer is poor (or refuting because the arguer is wealthy).
  • Appeal to tradition (argumentum ad antiquitatem) – a conclusion supported solely because it has long been held to be true.
  • Appeal to wealth (argumentum ad crumenam) – supporting a conclusion because the arguer is wealthy (or refuting because the arguer is poor). (Sometimes taken together with the appeal to poverty as a general appeal to the arguer’s financial situation.)
  • Argument from silence (argumentum ex silentio) – a conclusion based on silence or lack of contrary evidence
  • Chronological snobbery – where a thesis is deemed incorrect because it was commonly held when something else, clearly false, was also commonly held
  • Genetic fallacy – where a conclusion is suggested based solely on something or someone’s origin rather than its current meaning or context.
  • Judgmental language – insulting or pejorative language to influence the recipient’s judgment
  • Naturalistic fallacy (is–ought fallacy) – claims about what ought to be on the basis of statements about what is.
  • Reductio ad Hitlerum (playing the Nazi card) – comparing an opponent or their argument to Hitler or Nazism in an attempt to associate a position with one that is universally reviled (See also – Godwin’s law)
  • Straw man – an argument based on misrepresentation of an opponent’s position
  • Texas sharpshooter fallacy – improperly asserting a cause to explain a cluster of data
  • Tu quoque (“you too”, appeal to hypocrisy) – the argument states that a certain position is false or wrong and/or should be disregarded because its proponent fails to act consistently in accordance with that position.
  • Two wrongs make a right – occurs when it is assumed that if one wrong is committed, another wrong will cancel it out.

Conditional or questionable fallacies

  1. Black swan blindness – the argument that ignores low-probability, high-impact events, thus downplaying the role of chance and underrepresenting known risks.
  2. Broken window fallacy – an argument which disregards lost opportunity costs (typically non-obvious, difficult to determine or otherwise hidden) associated with destroying property of others, or other ways of externalizing costs onto others. For example, an argument that states breaking a window generates income for a window fitter, but disregards the fact that the money spent on the new window cannot now be spent on new shoes.
  3. Definist fallacy – involves the confusion between two notions by defining one in terms of the other.
  4. Naturalistic fallacy – attempts to prove a claim about ethics by appealing to a definition of the term “good” in terms of either one or more claims about natural properties (sometimes also taken to mean the appeal to nature)
  5. Slippery slope (thin edge of the wedge, camel’s nose) – asserting that a relatively small first step inevitably leads to a chain of related events culminating in some significant impact
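The black-swan entry above is ultimately about expected-value arithmetic: a very unlikely outcome can still dominate total risk if its impact is large enough. A minimal sketch, with made-up figures chosen only for illustration:

```python
# Black swan blindness: comparing the expected losses of a frequent small
# loss and a rare catastrophic one (all figures hypothetical).

p_common, common_cost = 0.10, 1_000        # 10% chance of losing $1,000
p_rare,   rare_cost   = 0.001, 1_000_000   # 0.1% chance of losing $1,000,000

expected_common = p_common * common_cost   # roughly $100
expected_rare   = p_rare * rare_cost       # roughly $1,000

# The "negligible" event carries about 10x the expected loss of the common one.
print(expected_common, expected_rare)
```

Dismissing the rare event because it "almost never happens" understates the risk by an order of magnitude in this toy example.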

Public relations methods and approaches

  • Airborne leaflet propaganda
  • Astroturfing / Astroturf PR: fake grassroots
  • Atrocity story
  • Bandwagon effect
  • Big lie
  • Black propaganda
  • Buzzword
  • Card stacking
  • Code word
  • Communist propaganda
  • Corporate image
  • Corporate propaganda
  • Cult of personality
  • Demonization
  • Doublespeak
  • Disinformation: providing false information
  • Dog-whistle politics
  • Enterperience: fusing entertainment and experience together
  • Euphemisms, to advance a cause or position (see also Political correctness)
  • Factoid
  • Fedspeak
  • Framing
  • Front organization
  • Glittering generality
  • Indoctrination
  • Information warfare: the practice of disseminating information in an attempt to advance your agenda relative to a competing viewpoint
  • Junk science
  • Lesser of two evils principle
  • Loaded language
  • Marketing: commercial and business techniques
  • Media bias
  • Media manipulation: the attempt to influence broadcast media decisions in an attempt to present your view to a mass audience
  • Misuse of statistics
  • News management: PR techniques concerned with the news media
  • News propaganda
  • Newspeak
  • Plain folks
  • Propaganda film
  • Public service announcement
  • Revolutionary propaganda
  • Self propaganda
  • Social marketing: techniques used in behavioral change, such as health promotion
  • Sound science
  • Rebuttal: a type of news management technique
  • Rhetoric
  • Slogan
  • Transfer (propaganda)
  • Video news release
  • Weasel word
  • White propaganda
  • Yellow journalism

Cognitive distortion

  • All-or-nothing thinking (splitting) – Conception in absolute terms, like “always”, “every”, “never”, and “there is no alternative”. (See also “false dilemma” or “false dichotomy”.)
  • Overgeneralization – Extrapolating limited experiences and evidence to broad generalizations. (See also faulty generalization and misleading vividness.)
  • Magical thinking – Expectation of certain outcomes based on performance of unrelated acts or utterances. (See also wishful thinking.)
  • Mental filter – Inability to view positive or negative features of an experience, for example, noticing only a tiny imperfection in a piece of otherwise useful clothing.
  • Disqualifying the positive – Discounting positive experiences for arbitrary, ad hoc reasons.
  • Jumping to conclusions – Reaching conclusions (usually negative) from little (if any) evidence. Two specific subtypes are also identified:
    • Mind reading – Sense of access to special knowledge of the intentions or thoughts of others.
    • Fortune telling – Inflexible expectations for how things will turn out before they happen.
  • Magnification and minimization – Magnifying or minimizing a memory or situation such that they no longer correspond to objective reality. This is common enough in the normal population to popularize idioms such as “make a mountain out of a molehill.” In depressed clients, often the positive characteristics of other people are exaggerated and negative characteristics are understated. There is one subtype of magnification:
    • Catastrophizing – Inability to foresee anything other than the worst possible outcome, however unlikely, or experiencing a situation as unbearable or impossible when it is just uncomfortable.
  • Emotional reasoning – Experiencing reality as a reflection of emotions, e.g. “I feel it, therefore it must be true.”
  • Should statements – Patterns of thought which imply the way things “should” or “ought” to be rather than the actual situation the person is faced with, or having rigid rules which the person believes will “always apply” no matter what the circumstances are. Albert Ellis termed this “Musturbation”.
  • Labeling and mislabeling – Limited thinking about behaviors or events due to reliance on names; related to overgeneralization. Rather than describing the specific behavior, the person assigns a label to someone or himself that implies absolute and unalterable terms. Mislabeling involves describing an event with language that is highly colored and emotionally loaded.
  • Personalization – Attribution of personal responsibility (or causal role or blame) for events over which a person has no control.

In Conclusion

Nobody is immune to these biases and logical fallacies. We are all guilty of this illogical thinking, whether we admit it or not. The best way to circumvent our fallacious logic and reasoning is to educate ourselves and catch ourselves in the act.

When we do so, we are much less likely to use these lines of false reasoning in the future. We become more aware of our reality; we become more ‘woke’, if you’d like. We get better at spotting liars and deceivers; we tune our mental instrument to recognize fact from fiction, and fiction from ‘fact’.

Resources

https://en.wikipedia.org/wiki/List_of_cognitive_biases

https://en.wikipedia.org/wiki/Cognitive_bias

http://energyskeptic.com/2013/cognitive-bias/

https://quizlet.com/10255153/cognitive-errors-flash-cards/

https://www.businessinsider.com/cognitive-biases-that-affect-decisions-2015-8

https://uxknowledgebase.com/cognitive-bias-part-1-8191decf703a
https://uxknowledgebase.com/cognitive-bias-part-2-fab5b7717179
https://uxknowledgebase.com/cognitive-bias-part-3-239cd2f0fc8c
https://uxknowledgebase.com/cognitive-bias-part-4-ea2361328bce
https://uxknowledgebase.com/cognitive-bias-part-5-437321f4f39d

Thanks for reading. Please subscribe and share!

Joe Dubs

I write about philosophy, geometry, health, politics and other stuff that interests me.

7 Comments:

  1. The last line tho …. “recognizing fact from fiction, and fiction from fact.” Ha. After reading this article I think maybe it is ALL fiction! None of it’s real and we make everything up…..

    • “ ‘Tis strange – but true; for truth is always strange;
      Stranger than fiction.”
      — Byron

      “Fiction is the lie through which we tell the truth.”
      Albert Camus

      A lie is much more effective if it is mixed with 80% or more truth.

  2. I find this is a very interesting blog article, and although I skimmed over most of the numerous fascinating psychological details in it the first time I read it, I couldn’t avoid recognizing a small anomaly in the otherwise consistently presented information of the subject at hand… the anomaly stands out to me in the very last sentence, where it says, quote; “we tune our mental instrument to the insusceptibility of recognizing fact from fiction, and fiction from fact.”, unquote…

    It is the phrase “unsusceptibility of recognizing” that stands out to me, since my intuition tells me that if I were to tune my own mental instrument towards an unsusceptibility of my natural ability to recognize and discern fact from fiction and vice versa, then I believe that over time I will become worse at performing this particular mental skill, because my instrument will thus become tuned towards becoming more and more unsusceptible to the recognition of any difference between fact and fiction… whilst if the sentence instead would read; “we tune our mental instrument to the susceptibility of recognizing…etc.”, or; “we tune our mental instrument to the insusceptibility of confusing fact with fiction…etc”, then the phrase would make more sense to me in the particular context that it appears…

    What kind of thinking this particular human ability represents, i.e. finding and pointing out minute logical inconsistencies in other peoples logic, would of course also be of interest to know, since I suspect that it has something to do with the tendency to keep believing what we want to believe…

  3. Wow, you made my day here, when I actually had an impact on this fascinating multifaceted blog… struggling to learn something new myself from this experience, so not to come across as captious, overly nitpicking or even hypercritical, I’ve examined the ‘cognitive bias codex’ more closely and notice that you yourself most likely do not suffer from any “choice-supportive bias”, whereas none of the biases in the “too much information”-quadrant seems to fit with my own underlying tendency for scrutinizing otherwise excellent Internet-texts… for some reason, the closest I came to any learning experience was the resonance and/or resemblance with the “money-illusion” bias… it seems to me that the very minor fault I found in your text would be the equivalent of a “nominal value”, whereas the entire information contained in the whole article would be the “real value”, which I hereby acknowledge… so, thank you for your shown courtesy…

  4. Thanks Roland, I remember reading that one about ‘too much information’ as well and thinking it sounded a little off. Now, I can’t find it, lol.

    It’s not this one I don’t think….

    Focusing effect – the tendency to place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.
    (towards the top)

    As far as the money illusion bias, this is a little off topic, but thought I’d link it for you
    http://joedubs.com/fiat-current-sea-maritime-money/

    Thanks for your comment

  5. Thanks, I’ve already read it, and have yet to figure out any contribution…:) …I did give it a reasonable try after absorbing the fascinating article, but as of yet to no avail…:)
