
Critical Thinking and Decision-Making: Logical Fallacies

Lesson 7: Logical Fallacies


Logical fallacies

If you think about it, vegetables are bad for you. I mean, after all, the dinosaurs ate plants, and look at what happened to them...

illustration of a dinosaur eating leaves while a meteor falls in the background

Let's pause for a moment: That argument was pretty ridiculous. And that's because it contained a logical fallacy.

A logical fallacy is any kind of error in reasoning that renders an argument invalid. Fallacies can involve distorting or manipulating facts, drawing false conclusions, or distracting you from the issue at hand. In theory, they seem like they'd be pretty easy to spot, but this isn't always the case.


Sometimes logical fallacies are used intentionally to try to win a debate. In these cases, they're often presented by the speaker with a certain level of confidence, and that confidence makes them more persuasive: if the speaker sounds like they know what they're talking about, we're more likely to believe them, even if their stance doesn't make complete logical sense.

illustration of a politician saying, "I know for a fact..."

False cause

One common logical fallacy is the false cause . This is when someone incorrectly identifies the cause of something. In my argument above, I stated that dinosaurs became extinct because they ate vegetables. While these two things did happen, a diet of vegetables was not the cause of their extinction.

illustration showing that extinction was not caused by some dinosaurs being vegetarians

Maybe you've heard false cause more commonly represented by the phrase "correlation does not equal causation," meaning that just because two things occurred around the same time, it doesn't necessarily mean that one caused the other.

Straw man

A straw man is when someone takes an argument and misrepresents it so that it's easier to attack. For example, let's say Callie is advocating that sporks should be the new standard for silverware because they're more efficient. Madeline responds that she's shocked Callie would want to outlaw spoons and forks, and put millions out of work at the fork and spoon factories.

illustration of Maddie accusing Callie of wanting to outlaw spoons and forks

A straw man is frequently used in politics in an effort to discredit another politician's views on a particular issue.

Begging the question

Begging the question is a type of circular argument where someone includes the conclusion as a part of their reasoning. For example, George says, “Ghosts exist because I saw a ghost in my closet!"

illustration of George claiming that ghosts exists and him seeing one in his closet

George concluded that “ghosts exist”. His premise also assumed that ghosts exist. Rather than assuming that ghosts exist from the outset, George should have used evidence and reasoning to try and prove that they exist.

illustration of George using math and reasoning to try and prove that ghosts exist

Since George assumed that ghosts exist, he was less likely to see other explanations for what he saw. Maybe the ghost was nothing more than a mop!

illustration of a splitscreen showing a ghost in a closet on the left, and that same closet with a mop in it on the right

False dilemma

The false dilemma (or false dichotomy) is a logical fallacy where a situation is presented as being an either/or option when, in reality, there are more possible options available than just the chosen two. Here's an example: Rebecca rings the doorbell but Ethan doesn't answer. She then thinks, "Oh, Ethan must not be home."

illustration showing the false dilemma of either Ethan being home or his home being empty

Rebecca posits that either Ethan answers the door or he isn't home. In reality, he could be sleeping, doing some work in the backyard, or taking a shower.

illustration of Ethan sleeping, doing yard work, and taking a shower

Most logical fallacies can be spotted by thinking critically. Make sure to ask questions: Is logic at work here, or is it simply rhetoric? Does their "proof" actually lead to the conclusion they're proposing? By applying critical thinking, you'll be able to detect logical fallacies in the world around you and prevent yourself from using them as well.



Unit 1: What Is Philosophy?

LOGOS: Critical Thinking, Arguments, and Fallacies

Heather Wilburn, Ph.D

Critical Thinking:

With respect to critical thinking, it seems that everyone uses this phrase. Yet, there is a fear that this is becoming a buzz-word (i.e. a word or phrase you use because it’s popular or enticing in some way). Ultimately, this means that we may be using the phrase without a clear sense of what we even mean by it. So, here we are going to think about what this phrase might mean and look at some examples. As a former colleague of mine, Henry Imler, explains:

By critical thinking, we refer to thinking that is recursive in nature. Any time we encounter new information or new ideas, we double back and rethink our prior conclusions on the subject to see if any other conclusions are better suited. Critical thinking can be contrasted with Authoritarian thinking. This type of thinking seeks to preserve the original conclusion. Here, thinking and conclusions are policed, as to question the system is to threaten the system. And threats to the system demand a defensive response. Critical thinking is short-circuited in authoritarian systems so that the conclusions are conserved instead of being open for revision. [1]

A condition for being recursive is to be open and not arrogant. If we come to a point where we think we have a handle on what is True, we are no longer open to consider, discuss, or accept information that might challenge our Truth. One becomes closed off and rejects everything that is different or strange, out of sync with one's own Truth. To be open and recursive entails a sense of thinking about your beliefs in a critical and reflective way, so that you have a chance to either strengthen your belief system or revise it if needed. I have been teaching philosophy and humanities classes for nearly 20 years; critical thinking is the single most important skill you can develop. Communication comes in a close second. In my view, communication skills follow as a natural result of critical thinking because you are attempting to think through and articulate stronger and rationally justified views. At the risk of sounding cliché, education isn't about instilling content; it is about learning how to think.

In your philosophy classes your own ideas and beliefs will very likely be challenged. This does not mean that you will be asked to abandon your beliefs, but it does mean that you might be asked to defend them. Additionally, your mind will probably be twisted and turned about, which can be an uncomfortable experience. Yet, if at all possible, you should cherish these experiences and allow them to help you grow as a thinker. To be challenged and perplexed is difficult; however, it is worthwhile because it compels deeper thinking and more significant levels of understanding. In turn, thinking itself can transform us not only in thought, but in our beliefs, and our actions. Hannah Arendt, a social and political philosopher that came to the United States in exile during WWII, relates the transformative elements of philosophical thinking to Socrates. She writes:

Socrates…who is commonly said to have believed in the teachability of virtue, seems to have held that talking and thinking about piety, justice, courage, and the rest were liable to make men more pious, more just, more courageous, even though they were not given definitions or “values” to direct their further conduct. [2]

Thinking and communication are transformative insofar as these activities have the potential to alter our perspectives and, thus, change our behavior. In fact, Arendt connects the ability to think critically and reflectively to morality. As she notes above, morality does not have to give a predetermined set of rules to affect our behavior. Instead, morality can also be related to the open and sometimes perplexing conversations we have with others (and ourselves) about moral issues and moral character traits. Theodor W. Adorno, another philosopher that came to the United States in exile during WWII, argues that autonomous thinking (i.e. thinking for oneself) is crucial if we want to prevent the occurrence of another event like Auschwitz, a concentration camp where over 1 million individuals died during the Holocaust. [3] To think autonomously entails reflective and critical thinking—a type of thinking rooted in philosophical activity and a type of thinking that questions and challenges social norms and the status quo. In this sense thinking is critical of what is, allowing us to think beyond what is and to think about what ought to be, or what ought not be. This is one of the transformative elements of philosophical activity and one that is useful in promoting justice and ethical living.

With respect to the meaning of education, the German philosopher Hegel uses the term bildung, which means education or upbringing, to indicate the differences between the traditional type of education that focuses on facts and memorization, and education as transformative. Allen Wood explains how Hegel uses the term bildung: it is "a process of self-transformation and an acquisition of the power to grasp and articulate the reasons for what one believes or knows." [4] If we think back through all of our years of schooling, particularly those subject matters that involve the teacher passing on information that is to be memorized and repeated, most of us would be hard-pressed to recall anything substantial. However, if the focus of education is on how to think and on developing skills such as analyzing, synthesizing, and communicating ideas and problems, most of us will use those skills whether we are in the field of philosophy, politics, business, nursing, computer programming, or education. In this sense, philosophy can help you develop a strong foundational skill set that will be marketable for your individual paths. While philosophy is not the only subject that will foster these skills, its method is one that heavily focuses on the types of activities that will help you develop such skills.

Let’s turn to discuss arguments. Arguments consist of a set of statements, which are claims that something is or is not the case, or is either true or false. The conclusion of your argument is a statement that is being argued for, or the point of view being argued for. The other statements serve as evidence or support for your conclusion; we refer to these statements as premises. It’s important to keep in mind that a statement is either true or false, so questions, commands, or exclamations are not statements. If we are thinking critically we will not accept a statement as true or false without good reason(s), so our premises are important here. Keep in mind the idea that supporting statements are called premises and the statement that is being supported is called the conclusion. Here are a couple of examples:

Example 1: Capital punishment is morally justifiable since it restores some sense of balance to victims or victims' families.

Let’s break it down so it’s easier to see in what we might call a typical argument form:

Premise: Capital punishment restores some sense of balance to victims or victims’ families.

Conclusion: Capital punishment is morally justifiable.

Example 2: Because innocent people are sometimes found guilty and potentially executed, capital punishment is not morally justifiable.

Premise: Innocent people are sometimes found guilty and potentially executed.

Conclusion: Capital punishment is not morally justifiable.

It is worth noting the use of the terms “since” and “because” in these arguments. Terms or phrases like these often serve as signifiers that we are looking at evidence, or a premise.

Check out another example:

Example 3: All human beings are mortal. Heather is a human being. Therefore, Heather is mortal.

Premise 1: All human beings are mortal.

Premise 2: Heather is a human being.

Conclusion: Heather is mortal.

In this example, there are a couple of things worth noting: First, there can be more than one premise. In fact, you could have a rather complex argument with several premises. If you’ve written an argumentative paper you may have encountered arguments that are rather complex. Second, just as the arguments prior had signifiers to show that we are looking at evidence, this argument has a signifier (i.e. therefore) to demonstrate the argument’s conclusion.

So many arguments!!! Are they all equally good?

No, arguments are not equally good; there are many ways to make a faulty argument. In fact, there are a lot of different types of arguments and, to some extent, the type of argument can help us figure out if the argument is a good one. For a full elaboration of arguments, take a logic class! Here’s a brief version:

Deductive Arguments: in a deductive argument the conclusion necessarily follows from the premises. Take Example 3 above. It is absolutely necessary that Heather is mortal, if she is a human being and if mortality is a specific condition for being human. We know that all humans die, so that's tight evidence. This argument would be a very good argument; it is valid (i.e., the conclusion necessarily follows from the premises) and it is sound (i.e., it is valid and all the premises are true).
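To see why Example 3 is valid, it can help to write the argument in standard predicate-logic notation. This is just a sketch; the predicate names Human and Mortal and the constant Heather are simply abbreviations for the English phrases above:

\[
\frac{\forall x\,\big(\mathrm{Human}(x) \rightarrow \mathrm{Mortal}(x)\big) \qquad \mathrm{Human}(\mathrm{Heather})}{\mathrm{Mortal}(\mathrm{Heather})}
\]

Whatever we substitute for Human, Mortal, and Heather, the conclusion follows from the premises by universal instantiation and modus ponens. That form is what makes the argument valid; the fact that both premises are also true is what makes it sound.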

Inductive Arguments: in an inductive argument the conclusion, at best, likely follows from the premises. Let's have an example:

Example 4: 98.9% of all TCC students like pizza. You are a TCC student. Thus, you like pizza.

Premise 1: 98.9% of all TCC students like pizza

Premise 2: You are a TCC student.

Conclusion: You like pizza. (*"Thus" is a conclusion indicator.)

In this example, the conclusion doesn’t necessarily follow; it likely follows. But you might be part of that 1.1% for whatever reason. Inductive arguments are good arguments if they are strong. So, instead of saying an inductive argument is valid, we say it is strong. You can also use the term sound to describe the truth of the premises, if they are true. Let’s suppose they are true and you absolutely love Hideaway pizza. Let’s also assume you are a TCC student. So, the argument is really strong and it is sound.
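One way to see the difference from a deductive argument is to read the first premise of Example 4 as a conditional probability. This is only a rough sketch using the figure given above:

\[
P(\text{you like pizza} \mid \text{you are a TCC student}) = 0.989
\]

Even if both premises are true, the conclusion holds only with probability 0.989; there remains a 1.1% chance that it is false. That is why we grade the argument as strong rather than valid.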

There are many types of inductive argument, including: causal arguments, arguments based on probabilities or statistics, arguments that are supported by analogies, and arguments that are based on some type of authority figure. So, when you encounter an argument based on one of these types, think about how strong the argument is. If you want to see examples of the different types, a web search (or a logic class!) will get you where you need to go.

Some arguments are faulty, not necessarily because of the truth or falsity of the premises, but because they rely on psychological and emotional ploys. These are bad arguments because people shouldn’t accept your conclusion if you are using scare tactics or distracting and manipulating reasoning. Arguments that have this issue are called fallacies. There are a lot of fallacies, so, again, if you want to know more a web search will be useful. We are going to look at several that seem to be the most relevant for our day-to-day experiences.

  • Inappropriate Appeal to Authority : We are definitely going to use authority figures in our lives (e.g. doctors, lawyers, mechanics, financial advisors, etc.), but we need to make sure that the authority figure is a reliable one.

Things to look for here might include: reputation in the field, not holding widely controversial views, experience, education, and the like. So, if we take an authority figure’s word and they’re not legit, we’ve committed the fallacy of appeal to authority.

Example 5 : I think I am going to take my investments to Voya. After all, Steven Adams advocates for Voya in an advertisement I recently saw.

If we look at the criteria for evaluating arguments that appeal to authority figures, it is pretty easy to see that Adams is not an expert in the finance field. Thus, this is an inappropriate appeal to authority.

  • Slippery Slope Arguments : Slippery slope arguments are found everywhere it seems. The essential characteristic of a slippery slope argument is that it uses problematic premises to argue that doing ‘x’ will ultimately lead to other actions that are extreme, unlikely, and disastrous. You can think of this type of argument as a faulty chain of events or domino effect type of argument.

Example 6 : If you don’t study for your philosophy exam you will not do well on the exam. This will lead to you failing the class. The next thing you know you will have lost your scholarship, dropped out of school, and will be living on the streets without any chance of getting a job.

While you should certainly study for your philosophy exam, if you don’t it is unlikely that this will lead to your full economic demise.

One challenge to evaluating slippery slope arguments is that they are predictions, so we cannot be certain about what will or will not actually happen. But this chain-of-events type of argument should be assessed in terms of whether the outcome will likely follow if action 'x' is pursued.

  • Faulty Analogy : We often make arguments based on analogy and these can be good arguments. But we often use faulty reasoning with analogies and this is what we want to learn how to avoid.

When evaluating an argument that is based on an analogy here are a few things to keep in mind: you want to look at the relevant similarities and the relevant differences between the things that are being compared. As a general rule, if there are more differences than similarities the argument is likely weak.

Example 7 : Alcohol is legal. Therefore, we should legalize marijuana too.

So, the first step here is to identify the two things being compared, which are alcohol and marijuana. Next, note relevant similarities and differences. These might include effects on health, community safety, economic factors, criminal justice factors, and the like.

This is probably not the best argument in support of marijuana legalization. It would seem that one could just as easily conclude that since marijuana is illegal, alcohol should be too. In fact, one might find that alcohol is an often abused and highly problematic drug for many people, so it is too risky to legalize marijuana if it is similar to alcohol.

  • Appeal to Emotion : Arguments should be based on reason and evidence, not emotional tactics. When we use an emotional tactic, we are essentially trying to manipulate someone into accepting our position by evoking pity or fear, when our positions should actually be backed by reasonable and justifiable evidence.

Example 8 : Officer please don’t give me a speeding ticket. My girlfriend broke up with me last night, my alarm didn’t go off this morning, and I’m late for class.

While this is a really horrible start to one's day, being broken up with and having an alarm malfunction are not justifiable reasons for speeding.

Example 9 : Professor, I’d like you to remember that my mother is a dean here at TCC. I’m sure that she will be very disappointed if I don’t receive an A in your class.

This is a scare tactic and is not a good way to make an argument. Scare tactics can come in the form of psychological or physical threats; both forms are to be avoided.

  • Appeal to Ignorance : This fallacy occurs when our argument relies on lack of evidence when evidence is actually needed to support a position.

Example 10 : No one has proven that sasquatch doesn’t exist; therefore it does exist.

Example 11 : No one has proven God exists; therefore God doesn’t exist.

The key here is that lack of evidence against something cannot be an argument for something. Lack of evidence can only show that we are ignorant of the facts.

  • Straw Man : A straw man argument is a specific type of argument that is intended to weaken an opponent’s position so that it is easier to refute. So, we create a weaker version of the original argument (i.e. a straw man argument), so when we present it everyone will agree with us and denounce the original position.

Example 12 : Women are crazy arguing for equal treatment. No one wants women hanging around men’s locker rooms or saunas.

This is a misrepresentation of arguments for equal treatment. Women (and others arguing for equal treatment) are not trying to obtain equal access to men’s locker rooms or saunas.

The best way to avoid this fallacy is to make sure that you are not oversimplifying or misrepresenting others’ positions. Even if we don’t agree with a position, we want to make the strongest case against it and this can only be accomplished if we can refute the actual argument, not a weakened version of it. So, let’s all bring the strongest arguments we have to the table!

  • Red Herring : A red herring is a distraction or a change in subject matter. Sometimes this is subtle, but if you find yourself feeling lost in the argument, take a close look and make sure there is not an attempt to distract you.

Example 13 : Can you believe that so many people are concerned with global warming? The real threat to our country is terrorism.

It could be the case that both global warming and terrorism are concerns for us. But the red herring fallacy is committed when someone tries to distract you from the argument at hand by bringing up another issue or side-stepping a question. Politicians are masters at this, by the way.

  • Appeal to the Person : This fallacy is also referred to as the ad hominem fallacy. We commit this fallacy when we dismiss someone’s argument or position by attacking them instead of refuting the premises or support for their argument.

Example 14 : I am not going to listen to what Professor ‘X’ has to say about the history of religion. He told one of his previous classes he wasn’t religious.

The problem here is that the student is dismissing course material based on the professor’s religious views and not evaluating the course content on its own ground.

To avoid this fallacy, make sure that you target the argument or their claims and not the person making the argument in your rebuttal.

  • Hasty Generalization : We make and use generalizations on a regular basis and in all types of decisions. We rely on generalizations when trying to decide which schools to apply to, which phone is the best for us, which neighborhood we want to live in, what type of job we want, and so on. Generalizations can be strong and reliable, but they can also be fallacious. There are three main ways in which a generalization can commit a fallacy: your sample size is too small, your sample size is not representative of the group you are making a generalization about, or your data could be outdated.

Example 15 : I had horrible customer service at the last Starbucks I was at. It is clear that Starbucks employees do not care about their customers. I will never visit another Starbucks again.

The problem with this generalization is that the claim made about all Starbucks is based on one experience. While it is tempting to not spend your money where people are rude to their customers, this is only one employee and presumably doesn’t reflect all employees or the company as a whole. So, to make this a stronger generalization we would want to have a larger sample size (multiple horrible experiences) to support the claim. Let’s look at a second hasty generalization:

Example 16 : I had horrible customer service at the Starbucks on 81st street. It is clear that Starbucks employees do not care about their customers. I will never visit another Starbucks again.

The problem with this generalization mirrors the previous problem in that the claim is based on only one experience. But there’s an additional issue here as well, which is that the claim is based off of an experience at one location. To make a claim about the whole company, our sample group needs to be larger than one and it needs to come from a variety of locations.

  • Begging the Question : An argument begs the question when the argument’s premises assume the conclusion, instead of providing support for the conclusion. One common form of begging the question is referred to as circular reasoning.

Example 17: Of course everyone wants to see the new Marvel movie, because it is the most popular movie right now!

The conclusion here is that everyone wants to see the new Marvel movie, but the premise simply assumes that is the case by claiming it is the most popular movie. Remember the premise should give reasons for the conclusion, not merely assume it to be true.

  • Equivocation: In the English language there are many words that have different meanings (e.g. bank, good, right, steal, etc.). When we use the same word but shift its meaning without explaining this move to our audience, we equivocate the word, and this is a fallacy. So, if you must use the same word more than once and with more than one meaning, you need to explain that you're shifting the meaning you intend. Most of the time, though, it is just easier to use a different word.

Example 18 : Yes, philosophy helps people argue better, but should we really encourage people to argue? There is enough hostility in the world.

Here, argue is used in two different senses. The meaning of the first refers to the philosophical meaning of argument (i.e. premises and a conclusion), whereas the second sense is in line with the common use of argument (i.e. yelling between two or more people, etc.).

  • Henry Imler, ed., Phronesis: An Ethics Primer with Readings (2018), 7-8.
  • Hannah Arendt, "Thinking and Moral Considerations," Social Research 38:3 (Autumn 1971): 431.
  • Theodor W. Adorno, "Education After Auschwitz," in Can One Live After Auschwitz, ed. Rolf Tiedemann, trans. Rodney Livingstone (Stanford: Stanford University Press, 2003), 23.
  • Allen W. Wood, "Hegel on Education," in Philosophers on Education: New Historical Perspectives, ed. Amelie O. Rorty (London: Routledge, 1998), 302.

LOGOS: Critical Thinking, Arguments, and Fallacies Copyright © 2020 by Heather Wilburn, Ph.D is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.


Everyday Psychology. Critical Thinking and Skepticism.

What are Logical Fallacies? | Critical Thinking Basics

Logical fallacies are errors in reasoning or flawed arguments that can mislead or deceive. They often appear plausible but lack sound evidence or valid reasoning, undermining the credibility of an argument. These errors can be categorized into various types, such as ad hominem attacks, strawman arguments, and false cause correlations.

Impact on Critical Thinking, Communication, and Social Interactions

The presence of logical fallacies hampers critical thinking by leading individuals away from rational and evidence-based conclusions. In communication, they can create confusion, weaken the persuasiveness of an argument, and hinder the exchange of ideas.

In social interactions, reliance on fallacious reasoning can strain relationships, impede collaboration, and contribute to misunderstandings.

Benefits of Identifying and Managing Logical Fallacies

Learning to identify logical fallacies enhances critical thinking skills, enabling individuals to analyze arguments more effectively and make informed decisions. In communication, recognizing fallacies empowers individuals to construct more compelling and convincing arguments, fostering clearer and more meaningful exchanges.

Moreover, the ability to manage logical fallacies promotes healthier social interactions by minimizing misunderstandings, encouraging constructive dialogue, and fostering a more intellectually robust and collaborative environment.




Social Sci LibreTexts

7.4: Fallacies


  • Jim Marteney
  • Los Angeles Valley College via ASCCC Open Educational Resources Initiative (OERI)


A fallacy is an error in reasoning. A fallacy indicates there is a problem with the logic of deductive or inductive reasoning. This differs from a factual error, which is simply being wrong about the facts. To be more specific, a fallacy is an “argument” in which the premises given for the conclusion do not provide the needed degree of support.

A fallacy is a mistake in the way that the final conclusion of the argument, or any intermediate conclusion, is logically related to its supporting premises. When there is a fallacy in an argument, the argument is said to be unsound or invalid.

The presence of a logical fallacy in an argument does not necessarily imply anything about the argument’s premises or its conclusion. Both may actually be correct, but the argument is still invalid because the conclusion does not follow from the premises using the inference principles of the argument.

Recognizing fallacies is often difficult, and indeed fallacious arguments often persuade their intended audience. Detecting and avoiding fallacious reasoning will at least prevent adoption of some erroneous conclusions.

Types of Fallacies

Fallacies are usually recognized in isolation, but woven into the context of an argument they may pass unnoticed, unless the critical thinker is on guard against them. Some advocates openly use fallacies in order to exploit an unknowing audience, but many times we use fallacies unintentionally. Many fallacies exist. Here are a few of the most common ones used in everyday argumentation.

False Dilemma

The False Dilemma fallacy occurs when an argument offers a false range of choices and requires that you pick one of them. Usually, the False Dilemma fallacy takes this form: Either A or B is true. If A is not true, then B is true. "Either you love me or hate me." The range is false because there may be other, unstated choices which would only serve to undermine the original argument. If you agree to pick one of those choices, you accept the premise that those choices are indeed the only ones possible. Seeing something as "black and white" is an example of a false dilemma.
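To make the form explicit, here is a sketch in standard propositional notation, where A and B stand for the two offered choices:

\[
\frac{A \lor B \qquad \lnot A}{B}
\]

The inference step itself is legitimate (disjunctive syllogism); the fallacy lies in the first premise, which falsely presents A and B as the only possibilities when other, unstated options exist.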

Appeal to Emotion

This fallacy is committed when someone manipulates people's emotions in order to get them to accept a claim. More formally, this sort of "reasoning" involves the substitution of various means of producing strong emotions in place of evidence for a claim. Here the attempt is to transfer a positive emotion you have about one thing to the object or belief that is being argued for.

This sort of “reasoning” is very common in politics and it serves as the basis for a large portion of modern advertising. Most political speeches are aimed at generating feelings in people, so that these feelings will get them to vote or act a certain way. How many times will you see pictures of American flags in a political commercial? The flag and other traditional images are aimed at getting the audience emotionally involved. In the case of advertising, the commercials are aimed at evoking emotions that will influence people to buy certain products. Beer commercials frequently include people at parties to get the potential consumers excited about the product. In many cases, such speeches and commercials are notoriously free of real evidence.

Non-sequitur

The phrase "non-sequitur" is Latin for "it does not follow." If an inference is made that does not logically follow from the premises of the preceding argument, then the inference is a non-sequitur. For example, "I am wearing my lucky hat today, nothing can go wrong." Though the term "non-sequitur" can be used broadly as an informal fallacy to describe any unwarranted conclusion, it is most often used when a statement openly contradicts itself and just makes no sense.

Slippery Slope

This fallacy reduces an argument to absurdity by extending it beyond its reasonable limits. This is an abuse of causal reasoning by trying to link events that normally have very little to do with each other. For example: legalizing marijuana will lead to the legalization of cocaine. If you legalize cocaine, you'll be able to buy crack and every other drug at your local 7-11. In this argument, it is asserted that the legalization of marijuana will eventually lead to purchasing crack at local 7-11s. Once one accepts the legalization of marijuana, then one is assumed to be on the slippery slope towards the legalization and availability of every other drug. In a Slippery Slope argument, you suggest that a series of events will occur leading to an undesirable conclusion, instead of just one step as in Causal Reasoning.

Ad Hominem

Translated from Latin to English, "Ad Hominem" means "against the man" or "against the person." An ad hominem fallacy consists of saying that someone's argument is wrong purely because of something about the person rather than about the argument itself. You will hear people on the radio and television dismiss comments by people they label as a conservative or a liberal, just because of how they label that person. Merely insulting another person or questioning the credibility of someone does not necessarily constitute an ad hominem fallacy. For this fallacy to exist it must be clear that the purpose of the characterization is to discredit the person offering the argument, in an attempt to invite others to then discount his or her arguments.

The Ad Hominem fallacy was employed by those who wanted to silence 16-year-old Climate Change activist Greta Thunberg. Those who disagreed with her argued that she should be ignored as she is just a child.

Hasty Generalization

This fallacy occurs when an arguer bases a conclusion on too few examples, ones that are not necessarily typical of the group the conclusion is about. For instance, "My two boyfriends have never shown any concern for my feelings. Therefore, all men are insensitive, selfish, and emotionally uncaring." Or, "I read about this man who got worms from eating sushi. I always knew that sushi was not good to eat." Without more examples, these arguments can be considered fallacies.

Circular Reasoning

The fallacy of circular reasoning is the assertion or repeated assertion of a conclusion, without giving reasons in its support. In other words, the conclusion is offered as its own support rather than being backed by independent premises. It may imply that the conclusion is self-evident or rephrase the conclusion to sound like a reason. Circular reasoning creates an illusion of support by simply asserting its conclusion as though it were a reason, or by reasserting the same claim in different words. For example, "Kerosene is combustible; therefore, it burns." Or, "George Clooney is the best actor we have ever had, because he is the greatest actor of all time."

Appeal to Ignorance

In this fallacy, the arguer claims that something is true only because it hasn't been proven false (or false only because it hasn't been proven true). This fallacy errs by trying to make this argument in a context in which the burden of proof falls on the arguer to show that his or her position is actually accurate, not just that it has not yet been shown false. The argument mistakes lack of evidence for evidence to the contrary. In effect, the argument says, "No one has shown this is false; therefore, it is true" (or, just as fallaciously, the reverse). For example, "There is no proof that hand gun legislation will reduce crime. Therefore, outlawing handguns would be a futile gesture." Or, "We have no evidence that God doesn't exist; therefore, God must exist." Ignorance about something says nothing about its existence or non-existence.

Plato and a Platypus Walk into a Bar


In their book, authors Thomas Cathcart and Daniel Klein illustrate logical principles and fallacies using classic jokes. For example, to illustrate the fallacy of post hoc ergo propter hoc, they use the following:

“In general, we’re deceived by post hoc ergo propter hoc because we fail to notice that there’s another cause at work.

A New York boy is being led through the swamps of Louisiana by his cousin. ‘Is it true that an alligator won’t attack you if you carry a flashlight?’ asks the city boy.

His cousin replies, ‘Depends on how fast you carry the flashlight.’

The city boy saw the flashlight as a propter when it was only a prop.” 1

Bandwagon

The name "bandwagon fallacy" comes from the phrase "jump on the bandwagon" or "climb on the bandwagon," a bandwagon being a wagon big enough to hold a band of musicians. In past political campaigns, candidates would ride a bandwagon through town, and people would show support for the candidate by climbing aboard the wagon. The phrase has come to refer to joining a cause because of its popularity. For example, trying to convince you that you should do something because everyone else is doing it is a bandwagon fallacy. "Everybody is buying a Tesla car, so should you."

Post hoc ergo propter hoc

The post hoc ergo propter hoc ("after this, therefore because of this") fallacy is based upon the mistaken notion that simply because one thing happens after another, the first event was a cause of the second event. Post hoc reasoning is the basis for many superstitions and erroneous beliefs.

For example, California earthquakes always happen after unusual weather patterns. Or, Allison always scores a goal when she wears her red and white soccer shoes. Or, I wore my Packers shirt and my Packers team won. I now wear my Packers shirt for every game. These are all post hoc ergo propter hoc fallacies.

Appeal to Pity

With this fallacy, the arguer tries to get people to agree with his or her conclusion by evoking pity and sympathy either with the situation or with the situation of some third party. By appealing to people's ability to sympathize with others, a powerful emotive force can be created. Unfortunately, however serious another person's problems are, that does not automatically make their claims any more logical. My sympathy for that situation does not create a reasonable basis for believing his or her claims. For example, "I really need this job since my grandmother is sick" or "I should receive an 'A' in this class. After all, if I don't get an 'A' I won't get the scholarship that I need." These appeals evoke emotions, but are not necessarily logical.

Straw-Man Fallacy

The arguer attacks an argument that is different from, and usually weaker than, the opposition's best argument. To distort or misrepresent an argument one is trying to refute is called the straw man fallacy. In a straw man fallacy, the opponent's argument is distorted, misquoted, exaggerated, misrepresented, or simply made up. This makes the argument easier to defeat, and can also be used to make opponents look like ignorant extremists. The refutation may appear to be a good one to someone unfamiliar with the original argument.

Logical fallacies are errors of reasoning, errors which may be recognized and corrected by critical thinkers. Fallacies may be created unintentionally, or they may be created intentionally in order to deceive other people. The vast majority of the commonly identified fallacies involve arguments, although some involve explanations, or definitions, or other products of reasoning. Sometimes the term fallacy is used even more broadly to indicate any false belief or cause of a false belief. A fallacy is an argument that sometimes fools human reasoning, but is not logically valid.

In his book, Persuasion: Theory and Practice, Kenneth Anderson writes,

“Logical appeals are powerful forces in persuasion. However, logic alone is rarely sufficient to yield persuasion. Desires and needs of receivers affect and determine what they will accept as logical demonstration. Thus, it is possible for one person to report that he or she is convinced by the logic used while another person remains horrified at the lack of logic presented.” 2

You can have high-quality evidence but still reach incorrect conclusions because your argument has poor reasoning. You always want to create the "soundest" or most logical argument possible. And you also want to examine the logic of others' presentations to determine what fallacies might be evident.

  • Cathcart, Thomas, and Daniel Klein. Plato and a Platypus Walk into a Bar. New York: Penguin Books, 2007.
  • Anderson, Kenneth. Persuasion: Theory and Practice. Boston: American Press, 1983.

The Writing Center • University of North Carolina at Chapel Hill

What this handout is about

This handout discusses common logical fallacies that you may encounter in your own writing or the writing of others. The handout provides definitions, examples, and tips on avoiding these fallacies.

Most academic writing tasks require you to make an argument—that is, to present reasons for a particular claim or interpretation you are putting forward. You may have been told that you need to make your arguments more logical or stronger. And you may have worried that you simply aren’t a logical person or wondered what it means for an argument to be strong. Learning to make the best arguments you can is an ongoing process, but it isn’t impossible: “Being logical” is something anyone can do, with practice.

Each argument you make is composed of premises (this is a term for statements that express your reasons or evidence) that are arranged in the right way to support your conclusion (the main claim or interpretation you are offering). You can make your arguments stronger by:

  • using good premises (ones you have good reason to believe are both true and relevant to the issue at hand),
  • making sure your premises provide good support for your conclusion (and not some other conclusion, or no conclusion at all),
  • checking that you have addressed the most important or relevant aspects of the issue (that is, that your premises and conclusion focus on what is really important to the issue), and
  • not making claims that are so strong or sweeping that you can’t really support them.

You also need to be sure that you present all of your ideas in an orderly fashion that readers can follow. See our handouts on argument and organization for some tips that will improve your arguments.

This handout describes some ways in which arguments often fail to do the things listed above; these failings are called fallacies. If you’re having trouble developing your argument, check to see if a fallacy is part of the problem.

It is particularly easy to slip up and commit a fallacy when you have strong feelings about your topic—if a conclusion seems obvious to you, you’re more likely to just assume that it is true and to be careless with your evidence. To help you see how people commonly make this mistake, this handout uses a number of controversial political examples—arguments about subjects like abortion, gun control, the death penalty, gay marriage, euthanasia, and pornography. The purpose of this handout, though, is not to argue for any particular position on any of these issues; rather, it is to illustrate weak reasoning, which can happen in pretty much any kind of argument. Please be aware that the claims in these examples are just made-up illustrations—they haven’t been researched, and you shouldn’t use them as evidence in your own writing.

What are fallacies?

Fallacies are defects that weaken arguments. By learning to look for them in your own and others’ writing, you can strengthen your ability to evaluate the arguments you make, read, and hear. It is important to realize two things about fallacies: first, fallacious arguments are very, very common and can be quite persuasive, at least to the casual reader or listener. You can find dozens of examples of fallacious reasoning in newspapers, advertisements, and other sources. Second, it is sometimes hard to evaluate whether an argument is fallacious. An argument might be very weak, somewhat weak, somewhat strong, or very strong. An argument that has several stages or parts might have some strong sections and some weak ones. The goal of this handout, then, is not to teach you how to label arguments as fallacious or fallacy-free, but to help you look critically at your own arguments and move them away from the “weak” and toward the “strong” end of the continuum.

So what do fallacies look like?

For each fallacy listed, there is a definition or explanation, an example, and a tip on how to avoid committing the fallacy in your own arguments.

Hasty generalization

Definition: Making assumptions about a whole group or range of cases based on a sample that is inadequate (usually because it is atypical or too small). Stereotypes about people (“librarians are shy and smart,” “wealthy people are snobs,” etc.) are a common example of the principle underlying hasty generalization.

Example: “My roommate said her philosophy class was hard, and the one I’m in is hard, too. All philosophy classes must be hard!” Two people’s experiences are, in this case, not enough on which to base a conclusion.

Tip: Ask yourself what kind of “sample” you’re using: Are you relying on the opinions or experiences of just a few people, or your own experience in just a few situations? If so, consider whether you need more evidence, or perhaps a less sweeping conclusion. (Notice that in the example, the more modest conclusion “Some philosophy classes are hard for some students” would not be a hasty generalization.)

Missing the point

Definition: The premises of an argument do support a particular conclusion—but not the conclusion that the arguer actually draws.

Example: “The seriousness of a punishment should match the seriousness of the crime. Right now, the punishment for drunk driving may simply be a fine. But drunk driving is a very serious crime that can kill innocent people. So the death penalty should be the punishment for drunk driving.” The argument actually supports several conclusions—”The punishment for drunk driving should be very serious,” in particular—but it doesn’t support the claim that the death penalty, specifically, is warranted.

Tip: Separate your premises from your conclusion. Looking at the premises, ask yourself what conclusion an objective person would reach after reading them. Looking at your conclusion, ask yourself what kind of evidence would be required to support such a conclusion, and then see if you’ve actually given that evidence. Missing the point often occurs when a sweeping or extreme conclusion is being drawn, so be especially careful if you know you’re claiming something big.

Post hoc (also called false cause)

This fallacy gets its name from the Latin phrase “post hoc, ergo propter hoc,” which translates as “after this, therefore because of this.”

Definition: Assuming that because B comes after A, A caused B. Of course, sometimes one event really does cause another one that comes later—for example, if I register for a class, and my name later appears on the roll, it's true that the first event caused the one that came later. But sometimes two events that seem related in time aren't really related as cause and effect. That is, correlation isn't the same thing as causation.

Examples: “President Jones raised taxes, and then the rate of violent crime went up. Jones is responsible for the rise in crime.” The increase in taxes might or might not be one factor in the rising crime rates, but the argument hasn’t shown us that one caused the other.

Tip: To avoid the post hoc fallacy, the arguer would need to give us some explanation of the process by which the tax increase is supposed to have produced higher crime rates. And that’s what you should do to avoid committing this fallacy: If you say that A causes B, you should have something more to say about how A caused B than just that A came first and B came later.

Slippery slope

Definition: The arguer claims that a sort of chain reaction, usually ending in some dire consequence, will take place, but there’s really not enough evidence for that assumption. The arguer asserts that if we take even one step onto the “slippery slope,” we will end up sliding all the way to the bottom; they assume we can’t stop partway down the hill.

Example: “Animal experimentation reduces our respect for life. If we don’t respect life, we are likely to be more and more tolerant of violent acts like war and murder. Soon our society will become a battlefield in which everyone constantly fears for their lives. It will be the end of civilization. To prevent this terrible consequence, we should make animal experimentation illegal right now.” Since animal experimentation has been legal for some time and civilization has not yet ended, it seems particularly clear that this chain of events won’t necessarily take place. Even if we believe that experimenting on animals reduces respect for life, and loss of respect for life makes us more tolerant of violence, that may be the spot on the hillside at which things stop—we may not slide all the way down to the end of civilization. And so we have not yet been given sufficient reason to accept the arguer’s conclusion that we must make animal experimentation illegal right now.

Like post hoc, slippery slope can be a tricky fallacy to identify, since sometimes a chain of events really can be predicted to follow from a certain action. Here’s an example that doesn’t seem fallacious: “If I fail English 101, I won’t be able to graduate. If I don’t graduate, I probably won’t be able to get a good job, and I may very well end up doing temp work or flipping burgers for the next year.”

Tip: Check your argument for chains of consequences, where you say “if A, then B, and if B, then C,” and so forth. Make sure these chains are reasonable.

Weak analogy

Definition: Many arguments rely on an analogy between two or more objects, ideas, or situations. If the two things that are being compared aren’t really alike in the relevant respects, the analogy is a weak one, and the argument that relies on it commits the fallacy of weak analogy.

Example: “Guns are like hammers—they’re both tools with metal parts that could be used to kill someone. And yet it would be ridiculous to restrict the purchase of hammers—so restrictions on purchasing guns are equally ridiculous.” While guns and hammers do share certain features, these features (having metal parts, being tools, and being potentially useful for violence) are not the ones at stake in deciding whether to restrict guns. Rather, we restrict guns because they can easily be used to kill large numbers of people at a distance. This is a feature hammers do not share—it would be hard to kill a crowd with a hammer. Thus, the analogy is weak, and so is the argument based on it.

If you think about it, you can make an analogy of some kind between almost any two things in the world: “My paper is like a mud puddle because they both get bigger when it rains (I work more when I’m stuck inside) and they’re both kind of murky.” So the mere fact that you can draw an analogy between two things doesn’t prove much, by itself.

Arguments by analogy are often used in discussing abortion—arguers frequently compare fetuses with adult human beings, and then argue that treatment that would violate the rights of an adult human being also violates the rights of fetuses. Whether these arguments are good or not depends on the strength of the analogy: do adult humans and fetuses share the properties that give adult humans rights? If the property that matters is having a human genetic code or the potential for a life full of human experiences, adult humans and fetuses do share that property, so the argument and the analogy are strong; if the property is being self-aware, rational, or able to survive on one’s own, adult humans and fetuses don’t share it, and the analogy is weak.

Tip: Identify what properties are important to the claim you’re making, and see whether the two things you’re comparing both share those properties.

Appeal to authority

Definition: Often we add strength to our arguments by referring to respected sources or authorities and explaining their positions on the issues we’re discussing. If, however, we try to get readers to agree with us simply by impressing them with a famous name or by appealing to a supposed authority who really isn’t much of an expert, we commit the fallacy of appeal to authority.

Example: “We should abolish the death penalty. Many respected people, such as actor Guy Handsome, have publicly stated their opposition to it.” While Guy Handsome may be an authority on matters having to do with acting, there’s no particular reason why anyone should be moved by his political opinions—he is probably no more of an authority on the death penalty than the person writing the paper.

Tip: There are two easy ways to avoid committing appeal to authority: First, make sure that the authorities you cite are experts on the subject you’re discussing. Second, rather than just saying “Dr. Authority believes X, so we should believe it, too,” try to explain the reasoning or evidence that the authority used to arrive at their opinion. That way, your readers have more to go on than a person’s reputation. It also helps to choose authorities who are perceived as fairly neutral or reasonable, rather than people who will be perceived as biased.

Ad populum

Definition: The Latin name of this fallacy means "to the people." There are several versions of the ad populum fallacy, but in all of them, the arguer takes advantage of the desire most people have to be liked and to fit in with others and uses that desire to try to get the audience to accept their argument. One of the most common versions is the bandwagon fallacy, in which the arguer tries to convince the audience to do or believe something because everyone else (supposedly) does.

Example: “Gay marriages are just immoral. 70% of Americans think so!” While the opinion of most Americans might be relevant in determining what laws we should have, it certainly doesn’t determine what is moral or immoral: there was a time when a substantial number of Americans were in favor of segregation, but their opinion was not evidence that segregation was moral. The arguer is trying to get us to agree with the conclusion by appealing to our desire to fit in with other Americans.

Tip: Make sure that you aren’t recommending that your readers believe your conclusion because everyone else believes it, all the cool people believe it, people will like you better if you believe it, and so forth. Keep in mind that the popular opinion is not always the right one.

Ad hominem and tu quoque

Definitions: Like the appeal to authority and ad populum fallacies, the ad hominem (“against the person”) and tu quoque (“you, too!”) fallacies focus our attention on people rather than on arguments or evidence. In both of these arguments, the conclusion is usually “You shouldn’t believe So-and-So’s argument.” The reason for not believing So-and-So is that So-and-So is either a bad person (ad hominem) or a hypocrite (tu quoque). In an ad hominem argument, the arguer attacks their opponent instead of the opponent’s argument.

Examples: “Andrea Dworkin has written several books arguing that pornography harms women. But Dworkin is just ugly and bitter, so why should we listen to her?” Dworkin’s appearance and character, which the arguer has characterized so ungenerously, have nothing to do with the strength of her argument, so using them as evidence is fallacious.

In a tu quoque argument, the arguer points out that the opponent has actually done the thing they are arguing against, and so the opponent’s argument shouldn’t be listened to. Here’s an example: imagine that your parents have explained to you why you shouldn’t smoke, and they’ve given a lot of good reasons—the damage to your health, the cost, and so forth. You reply, “I won’t accept your argument, because you used to smoke when you were my age. You did it, too!” The fact that your parents have done the thing they are condemning has no bearing on the premises they put forward in their argument (smoking harms your health and is very expensive), so your response is fallacious.

Tip: Be sure to stay focused on your opponents’ reasoning, rather than on their personal character. (The exception to this is, of course, if you are making an argument about someone’s character—if your conclusion is “President Jones is an untrustworthy person,” premises about her untrustworthy acts are relevant, not fallacious.)

Appeal to pity

Definition: The appeal to pity takes place when an arguer tries to get people to accept a conclusion by making them feel sorry for someone.

Examples: “I know the exam is graded based on performance, but you should give me an A. My cat has been sick, my car broke down, and I’ve had a cold, so it was really hard for me to study!” The conclusion here is “You should give me an A.” But the criteria for getting an A have to do with learning and applying the material from the course; the principle the arguer wants us to accept (people who have a hard week deserve A’s) is clearly unacceptable. The information the arguer has given might feel relevant and might even get the audience to consider the conclusion—but the information isn’t logically relevant, and so the argument is fallacious. Here’s another example: “It’s wrong to tax corporations—think of all the money they give to charity, and of the costs they already pay to run their businesses!”

Tip: Make sure that you aren’t simply trying to get your audience to agree with you by making them feel sorry for someone.

Appeal to ignorance

Definition: In the appeal to ignorance, the arguer basically says, “Look, there’s no conclusive evidence on the issue at hand. Therefore, you should accept my conclusion on this issue.”

Example: “People have been trying for centuries to prove that God exists. But no one has yet been able to prove it. Therefore, God does not exist.” Here’s an opposing argument that commits the same fallacy: “People have been trying for years to prove that God does not exist. But no one has yet been able to prove it. Therefore, God exists.” In each case, the arguer tries to use the lack of evidence as support for a positive claim about the truth of a conclusion. There is one situation in which doing this is not fallacious: if qualified researchers have used well-thought-out methods to search for something for a long time, they haven’t found it, and it’s the kind of thing people ought to be able to find, then the fact that they haven’t found it constitutes some evidence that it doesn’t exist.

Tip: Look closely at arguments where you point out a lack of evidence and then draw a conclusion from that lack of evidence.
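
To see the pattern at a glance, here is a bare schematic of the fallacy (added here as an illustration; it is not part of the original handout), alongside its mirror image:

\[
\text{No one has proven } P \;\therefore\; \lnot P
\qquad\qquad
\text{No one has proven } \lnot P \;\therefore\; P
\]

Both inferences lean on the very same absence of evidence, and if both were allowed they would license contradictory conclusions about P. That is a quick way to see that neither inference is valid on its own, apart from the special well-searched-for case described above.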

Straw man

Definition: One way of making our own arguments stronger is to anticipate and respond in advance to the arguments that an opponent might make. In the straw man fallacy, the arguer sets up a weak version of the opponent’s position and tries to score points by knocking it down. But just as being able to knock down a straw man (like a scarecrow) isn’t very impressive, defeating a watered-down version of your opponent’s argument isn’t very impressive either.

Example: “Feminists want to ban all pornography and punish everyone who looks at it! But such harsh measures are surely inappropriate, so the feminists are wrong: porn and its fans should be left in peace.” The feminist argument is made weak by being overstated. In fact, most feminists do not propose an outright “ban” on porn or any punishment for those who merely view it or approve of it; often, they propose some restrictions on particular things like child porn, or propose to allow people who are hurt by porn to sue publishers and producers—not viewers—for damages. So the arguer hasn’t really scored any points; they have just committed a fallacy.

Tip: Be charitable to your opponents. State their arguments as strongly, accurately, and sympathetically as possible. If you can knock down even the best version of an opponent’s argument, then you’ve really accomplished something.

Red herring

Definition: Partway through an argument, the arguer goes off on a tangent, raising a side issue that distracts the audience from what’s really at stake. Often, the arguer never returns to the original issue.

Example: “Grading this exam on a curve would be the most fair thing to do. After all, classes go more smoothly when the students and the professor are getting along well.” Let’s try our premise-conclusion outlining to see what’s wrong with this argument:

Premise: Classes go more smoothly when the students and the professor are getting along well.

Conclusion: Grading this exam on a curve would be the most fair thing to do.

When we lay it out this way, it’s pretty obvious that the arguer went off on a tangent—the fact that something helps people get along doesn’t necessarily make it more fair; fairness and justice sometimes require us to do things that cause conflict. But the audience may feel like the issue of teachers and students agreeing is important and be distracted from the fact that the arguer has not given any evidence as to why a curve would be fair.

Tip: Try laying your premises and conclusion out in an outline-like form. How many issues do you see being raised in your argument? Can you explain how each premise supports the conclusion?

False dichotomy

Definition: In false dichotomy, the arguer sets up the situation so it looks like there are only two choices. The arguer then eliminates one of the choices, so it seems that we are left with only one option: the one the arguer wanted us to pick in the first place. But often there are really many different options, not just two—and if we thought about them all, we might not be so quick to pick the one the arguer recommends.

Example: “Caldwell Hall is in bad shape. Either we tear it down and put up a new building, or we continue to risk students’ safety. Obviously we shouldn’t risk anyone’s safety, so we must tear the building down.” The argument neglects to mention the possibility that we might repair the building or find some way to protect students from the risks in question—for example, if only a few rooms are in bad shape, perhaps we shouldn’t hold classes in those rooms.

Tip: Examine your own arguments: if you’re saying that we have to choose between just two options, is that really so? Or are there other alternatives you haven’t mentioned? If there are other alternatives, don’t just ignore them—explain why they, too, should be ruled out. Although there’s no formal name for it, assuming that there are only three options, four options, etc. when really there are more is similar to false dichotomy and should also be avoided.
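
A point worth separating out: the reasoning pattern behind a false dichotomy ("either A or B; not B; therefore A") is a perfectly valid form, so the problem is always the truth of the either/or premise, not the inference drawn from it. For readers who like to check such things mechanically, here is a minimal Python sketch (not part of the original handout; the names A and B are just placeholders for "tear the building down" and "risk students' safety"):

    # Brute-force truth-table check of validity: an argument form is valid
    # when no assignment of truth values makes all premises true while
    # making the conclusion false.
    from itertools import product

    def valid(premises, conclusion, variables):
        for values in product([True, False], repeat=len(variables)):
            env = dict(zip(variables, values))
            if all(p(env) for p in premises) and not conclusion(env):
                return False
        return True

    # Hypothetical placeholders: A = "tear the building down", B = "risk safety".
    premises = [
        lambda env: env["A"] or env["B"],   # Premise 1: either A or B
        lambda env: not env["B"],           # Premise 2: not B
    ]
    conclusion = lambda env: env["A"]       # Conclusion: therefore A

    print(valid(premises, conclusion, ["A", "B"]))  # prints True: the form is valid

The script prints True, confirming that the form itself is valid; what makes the Caldwell Hall argument fallacious is that "either A or B" is false, since repairing the building is a third option the premise quietly excludes.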

Begging the question

Definition: A complicated fallacy; it comes in several forms and can be harder to detect than many of the other fallacies we’ve discussed. Basically, an argument that begs the question asks the reader to simply accept the conclusion without providing real evidence; the argument either relies on a premise that says the same thing as the conclusion (which you might hear referred to as “being circular” or “circular reasoning”), or simply ignores an important (but questionable) assumption that the argument rests on. Sometimes people use the phrase “beg the question” as a sort of general criticism of arguments, to mean that an arguer hasn’t given very good reasons for a conclusion, but that’s not the meaning we’re going to discuss here.

Examples: “Active euthanasia is morally acceptable. It is a decent, ethical thing to help another human being escape suffering through death.” Let’s lay this out in premise-conclusion form:

Premise: It is a decent, ethical thing to help another human being escape suffering through death.

Conclusion: Active euthanasia is morally acceptable.

If we “translate” the premise, we’ll see that the arguer has really just said the same thing twice: “decent, ethical” means pretty much the same thing as “morally acceptable,” and “help another human being escape suffering through death” means something pretty similar to “active euthanasia.” So the premise basically says, “active euthanasia is morally acceptable,” just like the conclusion does. The arguer hasn’t yet given us any real reasons why euthanasia is acceptable; instead, they have left us asking “well, really, why do you think active euthanasia is acceptable?” Their argument “begs” (that is, evades) the real question.

Here’s a second example of begging the question, in which a dubious premise which is needed to make the argument valid is completely ignored: “Murder is morally wrong. So active euthanasia is morally wrong.” The premise that gets left out is “active euthanasia is murder.” And that is a debatable premise—again, the argument “begs” or evades the question of whether active euthanasia is murder by simply not stating the premise. The arguer is hoping we’ll just focus on the uncontroversial premise, “Murder is morally wrong,” and not notice what is being assumed.

Tip: One way to try to avoid begging the question is to write out your premises and conclusion in a short, outline-like form. See if you notice any gaps, any steps that are required to move from one premise to the next or from the premises to the conclusion. Write down the statements that would fill those gaps. If the statements are controversial and you’ve just glossed over them, you might be begging the question. Next, check to see whether any of your premises basically says the same thing as the conclusion (but in different words). If so, you’re probably begging the question. The moral of the story: you can’t just assume or use as uncontroversial evidence the very thing you’re trying to prove.
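
One more compact way to see the difference (a schematic illustration, not part of the original handout): a question-begging argument has the shape on the left below, while an informative argument has something like the shape on the right,

\[
P^{\ast} \;\therefore\; P
\qquad\text{versus}\qquad
Q,\; Q \rightarrow P \;\therefore\; P
\]

where P* merely restates P in different words, while Q is a claim the audience can accept independently of P. The left-hand "argument" is trivially valid, but it gives a skeptic no new reason to accept P; the right-hand one does, provided Q and the connection between Q and P can each be supported on their own.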

Equivocation

Definition: Equivocation is sliding between two or more different meanings of a single word or phrase that is important to the argument.

Example: “Giving money to charity is the right thing to do. So charities have a right to our money.” The equivocation here is on the word “right”: “right” can mean both something that is correct or good (as in “I got the right answers on the test”) and something to which someone has a claim (as in “everyone has a right to life”). Sometimes an arguer will deliberately, sneakily equivocate, often on words like “freedom,” “justice,” “rights,” and so forth; other times, the equivocation is a mistake or misunderstanding. Either way, it’s important that you use the main terms of your argument consistently.

Tip: Identify the most important words and phrases in your argument and ask yourself whether they could have more than one meaning. If they could, be sure you aren’t slipping and sliding between those meanings.

So how do I find fallacies in my own writing?

Here are some general tips for finding fallacies in your own arguments:

  • Pretend you disagree with the conclusion you’re defending. What parts of the argument would now seem fishy to you? What parts would seem easiest to attack? Give special attention to strengthening those parts.
  • List your main points; under each one, list the evidence you have for it. Seeing your claims and evidence laid out this way may make you realize that you have no good evidence for a particular claim, or it may help you look more critically at the evidence you’re using.
  • Learn which types of fallacies you’re especially prone to, and be careful to check for them in your work. Some writers make lots of appeals to authority; others are more likely to rely on weak analogies or set up straw men. Read over some of your old papers to see if there’s a particular kind of fallacy you need to watch out for.
  • Be aware that broad claims need more proof than narrow ones. Claims that use sweeping words like “all,” “no,” “none,” “every,” “always,” “never,” “no one,” and “everyone” are sometimes appropriate—but they require a lot more proof than less-sweeping claims that use words like “some,” “many,” “few,” “sometimes,” “usually,” and so forth.
  • Double check your characterizations of others, especially your opponents, to be sure they are accurate and fair.

Can I get some practice with this?

Yes, you can. The Writing Center’s website includes a sample argument that’s full of fallacies, along with an explanation of each one, as well as a better-constructed argument on the same topic.

Works consulted

We consulted these works while writing this handout. This is not a comprehensive list of resources on the handout’s topic, and we encourage you to do your own research to find additional publications. Please do not use this list as a model for the format of your own reference list, as it may not match the citation style you are using. For guidance on formatting citations, please see the UNC Libraries citation tutorial. We revise these tips periodically and welcome feedback.

Copi, Irving M., Carl Cohen, and Victor Rodych. 1998. Introduction to Logic . London: Pearson Education.

Hurley, Patrick J. 2000. A Concise Introduction to Logic , 7th ed. Belmont, CA: Wadsworth.

Lunsford, Andrea A., and John J. Ruszkiewicz. 2016. Everything’s an Argument , 7th ed. Boston: Bedford/St Martin’s.

You may reproduce it for non-commercial use if you use the entire handout and attribute the source: The Writing Center, University of North Carolina at Chapel Hill


Introduction to Logic and Critical Thinking

(10 reviews)


Matthew Van Cleave, Lansing Community College

Copyright Year: 2016

Publisher: Matthew J. Van Cleave

Language: English

Conditions of use: Attribution

Reviewed by "yusef" Alexander Hayes, Professor, North Shore Community College on 6/9/21

Comprehensiveness rating: 5

Formal and informal reasoning, argument structure, and fallacies are covered comprehensively, meeting the author's goal of both depth and succinctness.

Content Accuracy rating: 5

The book is accurate.

Relevance/Longevity rating: 5

While many modern examples are used, and they are helpful, they are not necessarily needed. The usefulness of logical principles and skills have proved themselves, and this text presents them clearly with many examples.

Clarity rating: 5

It is obvious that the author cares about their subject, audience, and students. The text is comprehensible and interesting.

Consistency rating: 5

The format is easy to understand and is consistent in framing.

Modularity rating: 5

This text would be easy to adapt.

Organization/Structure/Flow rating: 5

The organization is excellent; my one suggestion would be a concluding chapter.

Interface rating: 5

I accessed the PDF version and it would be easy to work with.

Grammatical Errors rating: 5

The writing is excellent.

Cultural Relevance rating: 5

This is not an offensive text.

Reviewed by Susan Rottmann, Part-time Lecturer, University of Southern Maine on 3/2/21

Comprehensiveness rating: 4

I reviewed this book for a course titled "Creative and Critical Inquiry into Modern Life." It won't meet all my needs for that course, but I haven't yet found a book that would. I wanted to review this one because it states in the preface that it fits better for a general critical thinking course than for a true logic course. I'm not sure that I'd agree. I have been using Browne and Keeley's "Asking the Right Questions: A Guide to Critical Thinking," and I think that book is a better introduction to critical thinking for non-philosophy majors. However, the latter is not open source so I will figure out how to get by without it in the future. Overall, the book seems comprehensive if the subject is logic. The index is on the short-side, but fine. However, one issue for me is that there are no page numbers on the table of contents, which is pretty annoying if you want to locate particular sections.

Content Accuracy rating: 4

I didn't find any errors. In general the book uses great examples. However, they are very much based in the American context, not for an international student audience. Some effort to broaden the chosen examples would make the book more widely applicable.

Relevance/Longevity rating: 4

I think the book will remain relevant because of the nature of the material that it addresses, however there will be a need to modify the examples in future editions and as the social and political context changes.

Clarity rating: 3

The text is lucid, but I think it would be difficult for introductory-level students who are not philosophy majors. For example, in Browne and Keeley's "Asking the Right Questions: A Guide to Critical Thinking," the sub-headings are very accessible, such as "Experts cannot rescue us, despite what they say" or "wishful thinking: perhaps the biggest single speed bump on the road to critical thinking." By contrast, Van Cleave's "Introduction to Logic and Critical Thinking" has more subheadings like this: "Using your own paraphrases of premises and conclusions to reconstruct arguments in standard form" or "Propositional logic and the four basic truth functional connectives." If students are prepared very well for the subject, it would work fine, but for students who are newly being introduced to critical thinking, it is rather technical.

It seems to be very consistent in terms of its terminology and framework.

Modularity rating: 4

The book is divided into 4 chapters, each having many sub-chapters. In that sense, it is readily divisible and modular. However, as noted above, there are no page numbers on the table of contents, which would make assigning certain parts rather frustrating. Also, I'm not sure why the book is only four chapters and has so many subheadings (for instance 17 in Chapter 2) and a length of 242 pages. Wouldn't it make more sense to break up the book into shorter chapters? I think this would make it easier to read and to assign in specific blocks to students.

Organization/Structure/Flow rating: 4

The organization of the book is fine overall, although I think adding page numbers to the table of contents and breaking it up into more separate chapters would help it to be more easily navigable.

Interface rating: 4

The book is very simply presented. In my opinion it is actually too simple. There are few boxes or diagrams that highlight and explain important points.

The text seems fine grammatically. I didn't notice any errors.

The book is written with an American audience in mind, but I did not notice culturally insensitive or offensive parts.

Overall, this book is not for my course, but I think it could work well in a philosophy course.


Reviewed by Daniel Lee, Assistant Professor of Economics and Leadership, Sweet Briar College on 11/11/19

Comprehensiveness rating: 3

This textbook is not particularly comprehensive (4 chapters long), but I view that as a benefit. In fact, I recommend it for use outside of traditional logic classes, but rather for interdisciplinary classes that evaluate arguments.

To the best of my ability, I regard this content as accurate, error-free, and unbiased.

The book is broadly relevant and up-to-date, with a few stray temporal references (Sydney Olympics, particular presidencies). I don't view these time-dated examples as problematic, as the logical underpinnings are still there and easily assessed.

Clarity rating: 4

My only pushback on clarity is I didn't find the distinction between argument and explanation particularly helpful/useful/easy to follow. However, this experience may have been unique to my class.

To the best of my ability, I regard this content as internally consistent

I found this text quite modular, and was easily able to integrate other texts into my lessons and disregard certain chapters or sub-sections

The book had a logical and consistent structure, but to the extent that there are only 4 chapters, there isn't much scope for alternative approaches here

No problems with the book's interface

The text is grammatically sound

Cultural Relevance rating: 4

Perhaps the text could have been more universal in its approach. While I didn't find the book insensitive per se, logic can be tricky here because the point is to evaluate meaningful (non-trivial) arguments, but any argument with that sense of gravity can also be traumatic to students (abortion, death penalty, etc.).

No additional comments

Reviewed by Lisa N. Thomas-Smith, Graduate Part-time Instructor, CU Boulder on 7/1/19

The text covers all the relevant technical aspects of introductory logic and critical thinking, and covers them well. A separate glossary would be quite helpful to students. However, the terms are clearly and thoroughly explained within the text, and the index is very thorough.

The content is excellent. The text is thorough and accurate with no errors that I could discern. The terminology and exercises cover the material nicely and without bias.

The text should easily stand the test of time. The exercises are excellent and would be very helpful for students to internalize correct critical thinking practices. Because of the logical arrangement of the text and the many sub-sections, additional material should be very easy to add.

The text is extremely clearly and simply written. I anticipate that a diligent student could learn all of the material in the text with little additional instruction. The examples are relevant and easy to follow.

The text did not confuse terms or use inconsistent terminology, which is very important in a logic text. The discipline often uses multiple terms for the same concept, but this text avoids that trap nicely.

The text is fairly easily divisible. Since there are only four chapters, those chapters include large blocks of information. However, the chapters themselves are very well delineated and could be easily broken up so that parts could be left out or covered in a different order from the text.

The flow of the text is excellent. All of the information is handled solidly in an order that allows the student to build on the information previously covered.

The PDF Table of Contents does not include links or page numbers which would be very helpful for navigation. Other than that, the text was very easy to navigate. All the images, charts, and graphs were very clear

I found no grammatical errors in the text.

Cultural Relevance rating: 3

The text including examples and exercises did not seem to be offensive or insensitive in any specific way. However, the examples included references to black and white people, but few others. Also, the text is very American specific with many examples from and for an American audience. More diversity, especially in the examples, would be appropriate and appreciated.

Reviewed by Leslie Aarons, Associate Professor of Philosophy, CUNY LaGuardia Community College on 5/16/19

This is an excellent introductory (first-year) Logic and Critical Thinking textbook. The book covers the important elementary information, clearly discussing such things as the purpose and basic structure of an argument; the difference between an argument and an explanation; validity; soundness; and the distinctions between an inductive and a deductive argument in accessible terms in the first chapter. It also does a good job introducing and discussing informal fallacies (Chapter 4). The incorporation of opportunities to evaluate real-world arguments is also very effective. Chapter 2 also covers a number of formal methods of evaluating arguments, such as Venn Diagrams and Propositional logic and the four basic truth functional connectives, but to my mind, it is much more thorough in its treatment of Informal Logic and Critical Thinking skills, than it is of formal logic. I also appreciated that Van Cleave’s book includes exercises with answers and an index, but there is no glossary; which I personally do not find detracts from the book's comprehensiveness.

Overall, Van Cleave's book is error-free and unbiased. The language used is accessible and engaging. There were no glaring inaccuracies that I was able to detect.

Van Cleave's Textbook uses relevant, contemporary content that will stand the test of time, at least for the next few years. Although some examples use certain subjects like former President Obama, it does so in a useful manner that inspires the use of critical thinking skills. There are an abundance of examples that inspire students to look at issues from many different political viewpoints, challenging students to practice evaluating arguments, and identifying fallacies. Many of these exercises encourage students to critique issues, and recognize their own inherent reader-biases and challenge their own beliefs--hallmarks of critical thinking.

As mentioned previously, the author has an accessible style that makes the content relatively easy to read and engaging. He also does a suitable job explaining jargon/technical language that is introduced in the textbook.

Van Cleave uses terminology consistently and the chapters flow well. The textbook orients the reader by offering effective introductions to new material, step-by-step explanations of the material, as well as offering clear summaries of each lesson.

This textbook's modularity is really quite good. Its language and structure are not overly convoluted or too-lengthy, making it convenient for individual instructors to adapt the materials to suit their methodological preferences.

The topics in the textbook are presented in a logical and clear fashion. The structure of the chapters is such that it is not necessary to follow the chapters in their sequential order, and coverage of material can be adapted to individual instructors' preferences.

The textbook is free of any problematic interface issues. Topics, sections and specific content are accessible and easy to navigate. Overall it is user-friendly.

I did not find any significant grammatical issues with the textbook.

The textbook is not culturally insensitive, making use of a diversity of inclusive examples. Materials are especially effective for first-year critical thinking/logic students.

I intend to adopt Van Cleave's textbook for a Critical Thinking class I am teaching at the Community College level. I believe that it will help me facilitate student-learning, and will be a good resource to build additional classroom activities from the materials it provides.

Reviewed by Jennie Harrop, Chair, Department of Professional Studies, George Fox University on 3/27/18

While the book is admirably comprehensive, its extensive details within a few short chapters may feel overwhelming to students. The author tackles an impressive breadth of concepts in Chapters 1, 2, 3, and 4, which leads to 50-plus-page chapters that are dense with statistical analyses and critical vocabulary. These topics are likely better broached in manageable snippets rather than hefty single chapters.

The ideas addressed in Introduction to Logic and Critical Thinking are accurate but at times notably political. While politics are effectively used to exemplify key concepts, some students may be distracted by distinct political leanings.

The terms and definitions included are relevant, but the examples are specific to the current political, cultural, and social climates, which could make the materials seem dated in a few years without intentional and consistent updates.

While the reasoning is accurate, the author tends to complicate rather than simplify -- perhaps in an effort to cover a spectrum of related concepts. Beginning readers are likely to be overwhelmed and under-encouraged by his approach.

Consistency rating: 3

The four chapters are somewhat consistent in their play of definition, explanation, and example, but the structure of each chapter varies according to the concepts covered. In the third chapter, for example, key ideas are divided into sub-topics numbering from 3.1 to 3.10. In the fourth chapter, the sub-divisions are further divided into sub-sections numbered 4.1.1-4.1.5, 4.2.1-4.2.2, and 4.3.1 to 4.3.6. Readers who are working quickly to master new concepts may find themselves mired in similarly numbered subheadings, longing for grounded concepts on which to hinge other key principles.

Modularity rating: 3

The book's four chapters make it mostly self-referential. The author would do well to break this text down into additional subsections, easing readers' accessibility.

The content of the book flows logically and well, but the information needs to be better sub-divided within each larger chapter, easing the student experience.

The book's interface is effective, allowing readers to move from one section to the next with a single click. Additional sub-sections would ease this interplay even further.

Grammatical Errors rating: 4

Some minor errors throughout.

For the most part, the book is culturally neutral, avoiding direct cultural references in an effort to remain relevant.

Reviewed by Yoichi Ishida, Assistant Professor of Philosophy, Ohio University on 2/1/18

This textbook covers enough topics for a first-year course on logic and critical thinking. Chapter 1 covers the basics as in any standard textbook in this area. Chapter 2 covers propositional logic and categorical logic. In propositional logic, this textbook does not cover suppositional arguments, such as conditional proof and reductio ad absurdum. But other standard argument forms are covered. Chapter 3 covers inductive logic, and here this textbook introduces probability and its relationship with cognitive biases, which are rarely discussed in other textbooks. Chapter 4 introduces common informal fallacies. The answers to all the exercises are given at the end. However, the last set of exercises is in Chapter 3, Section 5. There are no exercises in the rest of the chapter. Chapter 4 has no exercises either. There is an index, but no glossary.

The textbook is accurate.

The content of this textbook will not become obsolete soon.

The textbook is written clearly.

The textbook is internally consistent.

The textbook is fairly modular. For example, Chapter 3, together with a few sections from Chapter 1, can be used as a short introduction to inductive logic.

The textbook is well-organized.

There are no interface issues.

I did not find any grammatical errors.

This textbook is relevant to a first semester logic or critical thinking course.

Reviewed by Payal Doctor, Associate Professor, LaGuardia Community College on 2/1/18

This text is a beginner textbook for arguments and propositional logic. It covers the basics of identifying arguments, building arguments, and using basic logic to construct propositions and arguments. It is quite comprehensive for a beginner book, but seems to be a good text for a course that needs a foundation for arguments. There are exercises on creating truth tables and proofs, so it could work as a logic primer in short sessions or with the addition of other course content.

The book is accurate in the information it presents. It does not contain errors and is unbiased. It covers the essential vocabulary clearly and gives ample examples and exercises to ensure the student understands the concepts.

The content of the book is up to date and can be easily updated. Some examples are very current for analyzing the argument structure in a speech, but for this sort of text understandable examples are important and the author uses good examples.

The book is clear and easy to read. In particular, this is a good text for community college students who often have difficulty with reading comprehension. The language is straightforward and concepts are well explained.

The book is consistent in terminology, formatting, and examples. It flows well from one topic to the next, but it is also possible to jump around the text without losing the voice of the text.

The book is broken down into subunits that make it easy to assign short blocks of content at a time. Later in the text, it does refer to a few concepts that appear early in that text, but these are all basic concepts that must be used to create a clear and understandable text. No sections are too long and each section stays on topic and relates the topic to those that have come before when necessary.

The flow of the text is logical and clear. It begins with the basic building blocks of arguments, and practice identifying more and more complex arguments is offered. Each chapter builds up from the previous chapter in introducing propositional logic, truth tables, and logical arguments. A select number of fallacies are presented at the end of the text, but these are related to topics that were presented before, so it makes sense to have these last.

The text is free of interface issues. I used the PDF and it worked fine on various devices without losing formatting.

The book contains no grammatical errors.

The text is culturally sensitive, but examples used are a bit odd and may be objectionable to some students. For instance, President Obama's speech on Syria is used to evaluate an extended argument. This is an excellent example and it is explained well, but some who disagree with Obama's policies may have trouble moving beyond their own politics. However, other examples look at issues from all political viewpoints and ask students to evaluate the argument, fallacy, etc. and work towards looking past their own beliefs. Overall this book does use a variety of examples that most students can understand and evaluate.

My favorite part of this book is that it seems to be written for community college students. My students have trouble understanding readings in the New York Times, so it is nice to see a logic and critical thinking text use real language that students can understand and follow without the constant need of a dictionary.

Reviewed by Rebecca Owen, Adjunct Professor, Writing, Chemeketa Community College on 6/20/17

This textbook is quite thorough--there are conversational explanations of argument structure and logic. I think students will be happy with the conversational style this author employs. Also, there are many examples and exercises using current events, funny scenarios, or other interesting ways to evaluate argument structure and validity. The third section, which deals with logical fallacies, is very clear and comprehensive. My only critique of the material included in the book is that the middle section may be a bit dense and math-oriented for learners who appreciate the more informal, informative style of the first and third section. Also, the book ends rather abruptly--it moves from a description of a logical fallacy to the answers for the exercises earlier in the text.

The content is very reader-friendly, and the author writes with authority and clarity throughout the text. There are a few surface-level typos (Starbuck's instead of Starbucks, etc.). None of these small errors detract from the quality of the content, though.

One thing I really liked about this text was the author's wide variety of examples. To demonstrate different facets of logic, he used examples from current media, movies, literature, and many other concepts that students would recognize from their daily lives. The exercises in this text also included these types of pop-culture references, and I think students will enjoy the familiarity--as well as being able to see the logical structures behind these types of references. I don't think the text will need to be updated to reflect new instances and occurrences; the author did a fine job at picking examples that are relatively timeless. As far as the subject matter itself, I don't think it will become obsolete any time soon.

The author writes in a very conversational, easy-to-read manner. The examples used are quite helpful. The third section on logical fallacies is quite easy to read, follow, and understand. A student in an argument writing class could benefit from this section of the book. The middle section is less clear, though. A student learning about the basics of logic might have a hard time digesting all of the information contained in chapter two. This material might be better in two separate chapters. I think the author loses the balance of a conversational, helpful tone and focuses too heavily on equations.

Consistency rating: 4

Terminology in this book is quite consistent--the key words are highlighted in bold. Chapters 1 and 3 follow a similar organizational pattern, but chapter 2 is where the material becomes more dense and equation-heavy. I also would have liked a closing passage--something to indicate to the reader that we've reached the end of the chapter as well as the book.

I liked the overall structure of this book. If I'm teaching an argumentative writing class, I could easily point the students to the chapters where they can identify and practice identifying fallacies, for instance. The opening chapter is clear in defining the necessary terms, and it gives the students an understanding of the toolbox available to them in assessing and evaluating arguments. Even though I found the middle section to be dense, smaller portions could be assigned.

The author does a fine job connecting each defined term to the next. He provides examples of how each defined term works in a sentence or in an argument, and then he provides practice activities for students to try. The answers for each question are listed in the final pages of the book. The middle section feels like the heaviest part of the whole book--it would take the longest time for a student to digest if assigned the whole chapter. Even though this middle section is a bit heavy, it does fit the overall structure and flow of the book. New material builds on previous chapters and sub-chapters. It ends abruptly--I didn't realize that it had ended, and all of a sudden I found myself in the answer section for those earlier exercises.

The simple layout is quite helpful! There is nothing distracting, image-wise, in this text. The table of contents is clearly arranged, and each topic is easy to find.

Tiny edits could be made (Starbuck's/Starbucks, for one). Otherwise, it is free of distracting grammatical errors.

This text is quite culturally relevant. For instance, there is one example that mentions the rumors of Barack Obama's birthplace as somewhere other than the United States. This example is used to explain how to analyze an argument for validity. The more "sensational" examples (like the Obama one above) are helpful in showing argument structure, and they can also help students see how rumors like this might gain traction--as well as help to show students how to debunk them with their newfound understanding of argument and logic.

The writing style is excellent for the subject matter, especially in the third section explaining logical fallacies. Thank you for the opportunity to read and review this text!

Reviewed by Laurel Panser, Instructor, Riverland Community College on 6/20/17

This is a review of Introduction to Logic and Critical Thinking, an open source book version 1.4 by Matthew Van Cleave. The comparison book used was Patrick J. Hurley’s A Concise Introduction to Logic 12th Edition published by Cengage as well as the 13th edition with the same title. Lori Watson is the second author on the 13th edition.

Competing with Hurley is difficult with respect to comprehensiveness. For example, Van Cleave’s book is comprehensive to the extent that it probably covers at least two-thirds or more of what is dealt with in most introductory, one-semester logic courses. Van Cleave’s chapter 1 provides an overview of argumentation including discerning non-arguments from arguments, premises versus conclusions, deductive from inductive arguments, validity, soundness and more. Much of Van Cleave’s chapter 1 parallels Hurley’s chapter 1. Hurley’s chapter 3 regarding informal fallacies is comprehensive while Van Cleave’s chapter 4 on this topic is less extensive. Categorical propositions are a topic in Van Cleave’s chapter 2; Hurley’s chapters 4 and 5 provide more instruction on this, however. Propositional logic is another topic in Van Cleave’s chapter 2; Hurley’s chapters 6 and 7 provide more information on this, though. Van Cleave did discuss messy issues of language meaning briefly in his chapter 1; that is the topic of Hurley’s chapter 2.

Van Cleave’s book includes exercises with answers and an index. A glossary was not included.

Reviews of open source textbooks typically include criteria besides comprehensiveness. These include comments on accuracy of the information, whether the book will become obsolete soon, jargon-free clarity to the extent that is possible, organization, navigation ease, freedom from grammar errors and cultural relevance; Van Cleave’s book is fine in all of these areas. Further criteria for open source books include modularity and consistency of terminology. Modularity is defined as including blocks of learning material that are easy to assign to students. Hurley’s book has a greater degree of modularity than Van Cleave’s textbook. The prose Van Cleave used is consistent.

Van Cleave’s book will not become obsolete soon.

Van Cleave’s book has accessible prose.

Van Cleave used terminology consistently.

Van Cleave’s book has a reasonable degree of modularity.

Van Cleave’s book is organized. The structure and flow of his book is fine.

Problems with navigation are not present.

Grammar problems were not present.

Van Cleave’s book is culturally relevant.

Van Cleave’s book is appropriate for some first semester logic courses.

Table of Contents

Chapter 1: Reconstructing and analyzing arguments

  • 1.1 What is an argument?
  • 1.2 Identifying arguments
  • 1.3 Arguments vs. explanations
  • 1.4 More complex argument structures
  • 1.5 Using your own paraphrases of premises and conclusions to reconstruct arguments in standard form
  • 1.6 Validity
  • 1.7 Soundness
  • 1.8 Deductive vs. inductive arguments
  • 1.9 Arguments with missing premises
  • 1.10 Assuring, guarding, and discounting
  • 1.11 Evaluative language
  • 1.12 Evaluating a real-life argument

Chapter 2: Formal methods of evaluating arguments

  • 2.1 What is a formal method of evaluation and why do we need them?
  • 2.2 Propositional logic and the four basic truth functional connectives
  • 2.3 Negation and disjunction
  • 2.4 Using parentheses to translate complex sentences
  • 2.5 “Not both” and “neither nor”
  • 2.6 The truth table test of validity
  • 2.7 Conditionals
  • 2.8 “Unless”
  • 2.9 Material equivalence
  • 2.10 Tautologies, contradictions, and contingent statements
  • 2.11 Proofs and the 8 valid forms of inference
  • 2.12 How to construct proofs
  • 2.13 Short review of propositional logic
  • 2.14 Categorical logic
  • 2.15 The Venn test of validity for immediate categorical inferences
  • 2.16 Universal statements and existential commitment
  • 2.17 Venn validity for categorical syllogisms

Chapter 3: Evaluating inductive arguments and probabilistic and statistical fallacies

  • 3.1 Inductive arguments and statistical generalizations
  • 3.2 Inference to the best explanation and the seven explanatory virtues
  • 3.3 Analogical arguments
  • 3.4 Causal arguments
  • 3.5 Probability
  • 3.6 The conjunction fallacy
  • 3.7 The base rate fallacy
  • 3.8 The small numbers fallacy
  • 3.9 Regression to the mean fallacy
  • 3.10 Gambler's fallacy

Chapter 4: Informal fallacies

  • 4.1 Formal vs. informal fallacies
  • 4.1.1 Composition fallacy
  • 4.1.2 Division fallacy
  • 4.1.3 Begging the question fallacy
  • 4.1.4 False dichotomy
  • 4.1.5 Equivocation
  • 4.2 Slippery slope fallacies
  • 4.2.1 Conceptual slippery slope
  • 4.2.2 Causal slippery slope
  • 4.3 Fallacies of relevance
  • 4.3.1 Ad hominem
  • 4.3.2 Straw man
  • 4.3.3 Tu quoque
  • 4.3.4 Genetic
  • 4.3.5 Appeal to consequences
  • 4.3.6 Appeal to authority

Answers to exercises

Glossary/Index

Ancillary Material

About the book.

This is an introductory textbook in logic and critical thinking. The goal of the textbook is to provide the reader with a set of tools and skills that will enable them to identify and evaluate arguments. The book is intended for an introductory course that covers both formal and informal logic. As such, it is not a formal logic textbook, but is closer to what one would find marketed as a “critical thinking textbook.”

About the Contributors

Matthew Van Cleave, PhD, Philosophy, University of Cincinnati, 2007. VAP at Concordia College (Moorhead), 2008-2012. Assistant Professor at Lansing Community College, 2012-2016. Professor at Lansing Community College, 2016-


Critical Thinking

Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.

Entry Contents

  • 2.1 Dewey’s Three Main Examples
  • 2.2 Dewey’s Other Examples
  • 2.3 Further Examples
  • 2.4 Non-Examples
  • 3. The Definition of Critical Thinking
  • 4. Its Value
  • 5. The Process of Thinking Critically
  • 6. Components of the Process
  • 7. Contributory Dispositions and Abilities
  • 8.1 Initiating Dispositions
  • 8.2 Internal Dispositions
  • 9. Critical Thinking Abilities
  • 10. Required Knowledge
  • 11. Educational Methods
  • 12.1 The Generalizability of Critical Thinking
  • 12.2 Bias in Critical Thinking Theory and Pedagogy
  • 12.3 Relationship of Critical Thinking to Other Types of Thinking
  • Other Internet Resources
  • Related Entries

1. History

Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as

active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)

and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.

In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.

Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment. Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment.

For details on this history, see the Supplement on History.

2. Examples and Non-Examples

Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.

Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.

Transit : “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o’clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68–69; 1933: 91–92)

Ferryboat : “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.

“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.

“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot’s position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69–70; 1933: 92–93)

Bubbles : “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).
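(A gloss not found in Dewey’s text: the principle the student is groping toward can be stated compactly with the combined gas law for a fixed quantity of gas,

\[ \frac{PV}{T} = \text{constant}, \]

where P is pressure, V volume, and T absolute temperature. With the pressure roughly fixed, the volume of trapped air rises and falls with its temperature: air warmed by the hot tumbler expands and escapes as bubbles on the outside, and as the tumbler cools the air within contracts, so that bubbles are drawn inside.)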

Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.

Weather : A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).

Disorder : A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).

Typhoid : A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).

Blur : A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).

Suction pump : In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).
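(A gloss not found in Dewey’s text: the 33-foot figure follows from elementary hydrostatics. The greatest height h of a water column that atmospheric pressure P can support against gravity is

\[ h = \frac{P}{\rho g} \approx \frac{101{,}325\ \text{Pa}}{1000\ \text{kg/m}^{3} \times 9.81\ \text{m/s}^{2}} \approx 10.3\ \text{m} \approx 34\ \text{ft}, \]

so a pump working by suction alone cannot raise water much beyond this height at sea level, and the limit falls at higher elevations, where the atmospheric pressure is lower.)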

Diamond : A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond lane from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.

Rash : A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; against the new cream as the cause was the mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.

Candidate : Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as

a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)

A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.

Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.

3. The Definition of Critical Thinking

What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as

a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)

Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features.

  • It is done for the purpose of making up one’s mind about what to believe or do.
  • The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
  • The thinking fulfills the relevant standards to some threshold level.

One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.

If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses. As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal. As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b).

As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992). Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully.

As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009, 2021), others on the resulting judgment (Facione 1990a), and still others on responsiveness to reasons (Siegel 1988). Kuhn (2019) takes critical thinking to be more a dialogic practice of advancing and responding to arguments than an individual ability.

In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.

Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).

4. Its Value

Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.

5. The Process of Thinking Critically

Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:

  • suggestions , in which the mind leaps forward to a possible solution;
  • an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
  • the use of one suggestion after another as a leading idea, or hypothesis , to initiate and guide observation and other operations in collection of factual material;
  • the mental elaboration of the idea or supposition as an idea or supposition ( reasoning , in the sense in which reasoning is a part, not the whole, of inference); and
  • testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107; italics in original)

The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).

The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time.

Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events.

Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once. These component events might include:

  • (1) noticing a difficulty,
  • (2) defining the problem,
  • (3) dividing the problem into manageable sub-problems,
  • (4) formulating a variety of possible solutions to the problem or sub-problem,
  • (5) determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem,
  • (6) devising a plan of systematic observation or experiment that will uncover the relevant evidence,
  • (7) carrying out the plan of systematic observation or experimentation,
  • (8) noting the results of the systematic observation or experiment,
  • (9) gathering relevant testimony and information from others,
  • (10) judging the credibility of testimony and information gathered from others,
  • (11) drawing conclusions from gathered evidence and accepted testimony, and
  • (12) accepting a solution that the evidence adequately supports (cf. Hitchcock 2017: 485).

Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.

6. Components of the Process

If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.

  • Observing : One notices something in one’s immediate environment (sudden cooling of temperature in Weather , bubbles forming outside a glass and then going inside in Bubbles , a moving blur in the distance in Blur , a rash in Rash ). Or one notes the results of an experiment or systematic observation (valuables missing in Disorder , no suction without air pressure in Suction pump ).
  • Feeling : One feels puzzled or uncertain about something (how to get to an appointment on time in Transit , why the diamonds vary in spacing in Diamond ). One wants to resolve this perplexity. One feels satisfaction once one has worked out an answer (to take the subway express in Transit , diamonds closer when needed as a warning in Diamond ).
  • Wondering : One formulates a question to be addressed (why bubbles form outside a tumbler taken from hot water in Bubbles , how suction pumps work in Suction pump , what caused the rash in Rash ).
  • Imagining : One thinks of possible answers (bus or subway or elevated in Transit , flagpole or ornament or wireless communication aid or direction indicator in Ferryboat , allergic reaction or heat rash in Rash ).
  • Inferring : One works out what would be the case if a possible answer were assumed (valuables missing if there has been a burglary in Disorder , earlier start to the rash if it is an allergic reaction to a sulfa drug in Rash ). Or one draws a conclusion once sufficient relevant evidence is gathered (take the subway in Transit , burglary in Disorder , discontinue blood pressure medication and new cream in Rash ).
  • Knowledge : One uses stored knowledge of the subject-matter to generate possible answers or to infer what would be expected on the assumption of a particular answer (knowledge of a city’s public transit system in Transit , of the requirements for a flagpole in Ferryboat , of Boyle’s law in Bubbles , of allergic reactions in Rash ).
  • Experimenting : One designs and carries out an experiment or a systematic observation to find out whether the results deduced from a possible answer will occur (looking at the location of the flagpole in relation to the pilot’s position in Ferryboat , putting an ice cube on top of a tumbler taken from hot water in Bubbles , measuring the height to which a suction pump will draw water at different elevations in Suction pump , noticing the spacing of diamonds when movement to or from a diamond lane is allowed in Diamond ).
  • Consulting : One finds a source of information, gets the information from the source, and makes a judgment on whether to accept it. None of our 11 examples include searching for sources of information. In this respect they are unrepresentative, since most people nowadays have almost instant access to information relevant to answering any question, including many of those illustrated by the examples. However, Candidate includes the activities of extracting information from sources and evaluating its credibility.
  • Identifying and analyzing arguments : One notices an argument and works out its structure and content as a preliminary to evaluating its strength. This activity is central to Candidate . It is an important part of a critical thinking process in which one surveys arguments for various positions on an issue.
  • Judging : One makes a judgment on the basis of accumulated evidence and reasoning, such as the judgment in Ferryboat that the purpose of the pole is to provide direction to the pilot.
  • Deciding : One makes a decision on what to do or on what policy to adopt, as in the decision in Transit to take the subway.

7. Contributory Dispositions and Abilities

By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.

Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.

Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able to do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.

Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175).

8. Critical Thinking Dispositions

Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016a) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).

On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.

A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.

Facione (1990a: 25) divides “affective dispositions” of critical thinking into approaches to life and living in general and approaches to specific issues, questions or problems. Adapting this distinction, one can usefully divide critical thinking dispositions into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.

Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify as initiating dispositions for thinking critically attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.

  • Attentiveness : One will not think critically if one fails to recognize an issue that needs to be thought through. For example, the pedestrian in Weather would not have looked up if he had not noticed that the air was suddenly cooler. To be a critical thinker, then, one needs to be habitually attentive to one’s surroundings, noticing not only what one senses but also sources of perplexity in messages received and in one’s own beliefs and attitudes (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Habit of inquiry : Inquiry is effortful, and one needs an internal push to engage in it. For example, the student in Bubbles could easily have stopped at idle wondering about the cause of the bubbles rather than reasoning to a hypothesis, then designing and executing an experiment to test it. Thus willingness to think critically needs mental energy and initiative. What can supply that energy? Love of inquiry, or perhaps just a habit of inquiry. Hamby (2015) has argued that willingness to inquire is the central critical thinking virtue, one that encompasses all the others. It is recognized as a critical thinking disposition by Dewey (1910: 29; 1933: 35), Glaser (1941: 5), Ennis (1987: 12; 1991: 8), Facione (1990a: 25), Bailin et al. (1999b: 294), Halpern (1998: 452), and Facione, Facione, & Giancarlo (2001).
  • Self-confidence : Lack of confidence in one’s abilities can block critical thinking. For example, if the woman in Rash lacked confidence in her ability to figure things out for herself, she might just have assumed that the rash on her chest was the allergic reaction to her medication against which the pharmacist had warned her. Thus willingness to think critically requires confidence in one’s ability to inquire (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Courage : Fear of thinking for oneself can stop one from doing it. Thus willingness to think critically requires intellectual courage (Paul & Elder 2006: 16).
  • Open-mindedness : A dogmatic attitude will impede thinking critically. For example, a person who adheres rigidly to a “pro-choice” position on the issue of the legal status of induced abortion is likely to be unwilling to consider seriously the issue of when in its development an unborn child acquires a moral right to life. Thus willingness to think critically requires open-mindedness, in the sense of a willingness to examine questions to which one already accepts an answer but which further evidence or reasoning might cause one to answer differently (Dewey 1933; Facione 1990a; Ennis 1991; Bailin et al. 1999b; Halpern 1998, Facione, Facione, & Giancarlo 2001). Paul (1981) emphasizes open-mindedness about alternative world-views, and recommends a dialectical approach to integrating such views as central to what he calls “strong sense” critical thinking. In three studies, Haran, Ritov, & Mellers (2013) found that actively open-minded thinking, including “the tendency to weigh new evidence against a favored belief, to spend sufficient time on a problem before giving up, and to consider carefully the opinions of others in forming one’s own”, led study participants to acquire information and thus to make accurate estimations.
  • Willingness to suspend judgment : Premature closure on an initial solution will block critical thinking. Thus willingness to think critically requires a willingness to suspend judgment while alternatives are explored (Facione 1990a; Ennis 1991; Halpern 1998).
  • Trust in reason : Since distrust in the processes of reasoned inquiry will dissuade one from engaging in it, trust in them is an initiating critical thinking disposition (Facione 1990a, 25; Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001; Paul & Elder 2006). In reaction to an allegedly exclusive emphasis on reason in critical thinking theory and pedagogy, Thayer-Bacon (2000) argues that intuition, imagination, and emotion have important roles to play in an adequate conception of critical thinking that she calls “constructive thinking”. From her point of view, critical thinking requires trust not only in reason but also in intuition, imagination, and emotion.
  • Seeking the truth : If one does not care about the truth but is content to stick with one’s initial bias on an issue, then one will not think critically about it. Seeking the truth is thus an initiating critical thinking disposition (Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001). A disposition to seek the truth is implicit in more specific critical thinking dispositions, such as trying to be well-informed, considering seriously points of view other than one’s own, looking for alternatives, suspending judgment when the evidence is insufficient, and adopting a position when the evidence supporting it is sufficient.

Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions .

9. Critical Thinking Abilities

Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit , has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.

Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as those in (Glaser 1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).

The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5 . The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.

Observational abilities : Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). Norris and King (1983, 1985, 1990a, 1990b) is a test of ability to appraise observation reports.

Emotional abilities : The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.

Questioning abilities : A critical thinking process needs transformation of an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).

Imaginative abilities : Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.

Inferential abilities : The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit , Ferryboat and Disorder ), others from something observed (as in Weather and Rash ). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge—that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic. Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452). Items testing inferential abilities constitute two of the five subtests of the Watson Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).
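(An illustrative gloss, added here rather than drawn from the sources just cited: an inference of the form

\[ p \rightarrow q, \quad p, \quad \text{therefore}\ q \qquad (\textit{modus ponens}) \]

is formally valid, holding whatever statements p and q may be. By contrast, the inference in Transit from “the trip downtown took an hour” to “the return trip will take about an hour” is licensed only by the substantive, defeasible warrant that such trips usually take about the same time in each direction, and could be overturned by further information, such as news of a service disruption.)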

Experimenting abilities : Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash . Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998). The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.

Consulting abilities : Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate . Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The Collegiate Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).

Argument analysis abilities : The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate . The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The Collegiate Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.

Judging skills and deciding skills : Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.

Lists and tests of critical thinking abilities often include two more abilities: identifying assumptions and constructing and evaluating definitions.

10. Required Knowledge

In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.

We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), Black (2012), and Blair (2021).
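(An illustrative gloss: if a conditional “if p then q” holds, then p is a sufficient condition for q and q is a necessary condition for p; in symbols,

\[ p \rightarrow q: \quad p\ \text{suffices for}\ q, \qquad q\ \text{is necessary for}\ p. \]

Confusing a sufficient with a necessary condition amounts to treating “if p then q” as though it were “if q then p”.)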

According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implements this principle in one’s practice than that one is able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive for one to make oneself aware of them and try consciously to counteract them or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases—for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work. It is also helpful to be aware of the prevalence of “noise” (unwanted unsystematic variability of judgments), of how to detect noise (through a noise audit), and of how to reduce noise: make accuracy the goal, think statistically, break a process of arriving at a judgment into independent tasks, resist premature intuitions, in a group get independent judgments first, favour comparative judgments and scales (Kahneman, Sibony, & Sunstein 2021). It is helpful as well to be aware of the concept of “bounded rationality” in decision-making and of the related distinction between “satisficing” and optimizing (Simon 1956; Gigerenzer 2001).

Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.

11. Educational Methods

Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment .

What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? In a comprehensive meta-analysis of experimental and quasi-experimental studies of strategies for teaching students to think critically, Abrami et al. (2015) found that dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.

Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods .

12. Controversies

Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.

McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), Bailin et al. (1999b), and Willingham (2019).

McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not however extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.
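
One of the inter-field abilities mentioned above, recognizing the confusion of necessary with sufficient conditions, can be made concrete with a small domain-neutral check (an illustrative sketch, not drawn from the cited literature). It enumerates the truth table to show that "P implies Q" (P is sufficient for Q) does not yield "Q implies P" (P is necessary for Q).

```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    """Material implication: 'a implies b' is false only when a is true and b is false."""
    return (not a) or b

# If P is sufficient for Q (P implies Q), it does not follow that P is also
# necessary for Q (which would require Q to imply P). Check every assignment:
for p, q in product([True, False], repeat=2):
    sufficient = implies(p, q)  # "P is sufficient for Q"
    necessary = implies(q, p)   # "P is necessary for Q"
    if sufficient and not necessary:
        print(f"Counterexample: P={p}, Q={q}: 'P implies Q' holds, but 'Q implies P' fails.")
```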

The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.

It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.

Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favoring of certain ways of knowing over others, frequently alleging that the unjustly favoured ways are those of a dominant sex or culture (Bailin 1995). These ways favour:

  • reinforcement of egocentric and sociocentric biases over dialectical engagement with opposing world-views (Paul 1981, 1984; Warren 1988)
  • distancing from the object of inquiry over closeness to it (Martin 1992; Thayer-Bacon 1992)
  • indifference to the situation of others over care for them (Martin 1992)
  • orientation to thought over orientation to action (Martin 1992)
  • being reasonable over caring to understand people’s ideas (Thayer-Bacon 1993)
  • being neutral and objective over being embodied and situated (Thayer-Bacon 1995a)
  • doubting over believing (Thayer-Bacon 1995b)
  • reason over emotion, imagination and intuition (Thayer-Bacon 2000)
  • solitary thinking over collaborative thinking (Thayer-Bacon 2000)
  • written and spoken assignments over other forms of expression (Alston 2001)
  • attention to written and spoken communications over attention to human problems (Alston 2001)
  • winning debates in the public sphere over making and understanding meaning (Alston 2001)

A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as

thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)

Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should

be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)

Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.

The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections. They can be summarized as follows:

  • Focus on argument networks with dialectical exchanges reflecting contesting points of view rather than on atomic arguments, so as to develop “strong sense” critical thinking that transcends egocentric and sociocentric biases (Paul 1981, 1984).
  • Foster closeness to the subject-matter and feeling connected to others in order to inform a humane democracy (Martin 1992).
  • Develop “constructive thinking” as a social activity in a community of physically embodied and socially embedded inquirers with personal voices who value not only reason but also imagination, intuition and emotion (Thayer-Bacon 2000).
  • In developing critical thinking in school subjects, treat as important neither skills nor dispositions but opening worlds of meaning (Alston 2001).
  • Attend to the development of critical thinking dispositions as well as skills, and adopt the “critical pedagogy” practised and advocated by Freire (1968 [1970]) and hooks (1994) (Dalgleish, Girard, & Davies 2017).

A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker. One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami et al. (2015) found to be most effective in improving critical thinking skills and dispositions.

What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint with problem solving and decision making, which are constructive.

Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History.

As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat, requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate, requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.

  • Abrami, Philip C., Robert M. Bernard, Eugene Borokhovski, David I. Waddington, C. Anne Wade, and Tonje Person, 2015, “Strategies for Teaching Students to Think Critically: A Meta-analysis”, Review of Educational Research , 85(2): 275–314. doi:10.3102/0034654314551063
  • Aikin, Wilford M., 1942, The Story of the Eight-year Study, with Conclusions and Recommendations , Volume I of Adventure in American Education , New York and London: Harper & Brothers. [ Aikin 1942 available online ]
  • Alston, Kal, 1995, “Begging the Question: Is Critical Thinking Biased?”, Educational Theory , 45(2): 225–233. doi:10.1111/j.1741-5446.1995.00225.x
  • –––, 2001, “Re/Thinking Critical Thinking: The Seductions of Everyday Life”, Studies in Philosophy and Education , 20(1): 27–40. doi:10.1023/A:1005247128053
  • American Educational Research Association, 2014, Standards for Educational and Psychological Testing / American Educational Research Association, American Psychological Association, National Council on Measurement in Education , Washington, DC: American Educational Research Association.
  • Anderson, Lorin W., David R. Krathwohl, Peter W. Airiasian, Kathleen A. Cruikshank, Richard E. Mayer, Paul R. Pintrich, James Raths, and Merlin C. Wittrock, 2001, A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives , New York: Longman, complete edition.
  • Bailin, Sharon, 1987, “Critical and Creative Thinking”, Informal Logic , 9(1): 23–30. [ Bailin 1987 available online ]
  • –––, 1988, Achieving Extraordinary Ends: An Essay on Creativity , Dordrecht: Kluwer. doi:10.1007/978-94-009-2780-3
  • –––, 1995, “Is Critical Thinking Biased? Clarifications and Implications”, Educational Theory , 45(2): 191–197. doi:10.1111/j.1741-5446.1995.00191.x
  • Bailin, Sharon and Mark Battersby, 2009, “Inquiry: A Dialectical Approach to Teaching Critical Thinking”, in Juho Ritola (ed.), Argument Cultures: Proceedings of OSSA 09 , CD-ROM (pp. 1–10), Windsor, ON: OSSA. [ Bailin & Battersby 2009 available online ]
  • –––, 2016a, “Fostering the Virtues of Inquiry”, Topoi , 35(2): 367–374. doi:10.1007/s11245-015-9307-6
  • –––, 2016b, Reason in the Balance: An Inquiry Approach to Critical Thinking , Indianapolis: Hackett, 2nd edition.
  • –––, 2021, “Inquiry: Teaching for Reasoned Judgment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 31–46. doi: 10.1163/9789004444591_003
  • Bailin, Sharon, Roland Case, Jerrold R. Coombs, and Leroi B. Daniels, 1999a, “Common Misconceptions of Critical Thinking”, Journal of Curriculum Studies , 31(3): 269–283. doi:10.1080/002202799183124
  • –––, 1999b, “Conceptualizing Critical Thinking”, Journal of Curriculum Studies , 31(3): 285–302. doi:10.1080/002202799183133
  • Blair, J. Anthony, 2021, Studies in Critical Thinking , Windsor, ON: Windsor Studies in Argumentation, 2nd edition. [Available online at https://windsor.scholarsportal.info/omp/index.php/wsia/catalog/book/106]
  • Berman, Alan M., Seth J. Schwartz, William M. Kurtines, and Steven L. Berman, 2001, “The Process of Exploration in Identity Formation: The Role of Style and Competence”, Journal of Adolescence , 24(4): 513–528. doi:10.1006/jado.2001.0386
  • Black, Beth (ed.), 2012, An A to Z of Critical Thinking , London: Continuum International Publishing Group.
  • Bloom, Benjamin Samuel, Max D. Engelhart, Edward J. Furst, Walter H. Hill, and David R. Krathwohl, 1956, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain , New York: David McKay.
  • Boardman, Frank, Nancy M. Cavender, and Howard Kahane, 2018, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Boston: Cengage, 13th edition.
  • Browne, M. Neil and Stuart M. Keeley, 2018, Asking the Right Questions: A Guide to Critical Thinking , Hoboken, NJ: Pearson, 12th edition.
  • Center for Assessment & Improvement of Learning, 2017, Critical Thinking Assessment Test , Cookeville, TN: Tennessee Technological University.
  • Cleghorn, Paul, 2021, “Critical Thinking in the Elementary School: Practical Guidance for Building a Culture of Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 150–167. doi: 10.1163/9789004444591_010
  • Cohen, Jacob, 1988, Statistical Power Analysis for the Behavioral Sciences , Hillsdale, NJ: Lawrence Erlbaum Associates, 2nd edition.
  • College Board, 1983, Academic Preparation for College. What Students Need to Know and Be Able to Do , New York: College Entrance Examination Board, ERIC document ED232517.
  • Commission on the Relation of School and College of the Progressive Education Association, 1943, Thirty Schools Tell Their Story , Volume V of Adventure in American Education , New York and London: Harper & Brothers.
  • Council for Aid to Education, 2017, CLA+ Student Guide . Available at http://cae.org/images/uploads/pdf/CLA_Student_Guide_Institution.pdf ; last accessed 2022 07 16.
  • Dalgleish, Adam, Patrick Girard, and Maree Davies, 2017, “Critical Thinking, Bias and Feminist Philosophy: Building a Better Framework through Collaboration”, Informal Logic , 37(4): 351–369. [ Dalgleish et al. available online ]
  • Dewey, John, 1910, How We Think , Boston: D.C. Heath. [ Dewey 1910 available online ]
  • –––, 1916, Democracy and Education: An Introduction to the Philosophy of Education , New York: Macmillan.
  • –––, 1933, How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process , Lexington, MA: D.C. Heath.
  • –––, 1936, “The Theory of the Chicago Experiment”, Appendix II of Mayhew & Edwards 1936: 463–477.
  • –––, 1938, Logic: The Theory of Inquiry , New York: Henry Holt and Company.
  • Dominguez, Caroline (coord.), 2018a, A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century , Vila Real, Portugal: UTAD. Available at http://bit.ly/CRITHINKEDUO1 ; last accessed 2022 07 16.
  • ––– (coord.), 2018b, A European Review on Critical Thinking Educational Practices in Higher Education Institutions , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDUO2 ; last accessed 2022 07 16.
  • ––– (coord.), 2018c, The CRITHINKEDU European Course on Critical Thinking Education for University Teachers: From Conception to Delivery , Vila Real: UTAD. Available at http:/bit.ly/CRITHINKEDU03; last accessed 2022 07 16.
  • Dominguez Caroline and Rita Payan-Carreira (eds.), 2019, Promoting Critical Thinking in European Higher Education Institutions: Towards an Educational Protocol , Vila Real: UTAD. Available at http:/bit.ly/CRITHINKEDU04; last accessed 2022 07 16.
  • Ennis, Robert H., 1958, “An Appraisal of the Watson-Glaser Critical Thinking Appraisal”, The Journal of Educational Research , 52(4): 155–158. doi:10.1080/00220671.1958.10882558
  • –––, 1962, “A Concept of Critical Thinking: A Proposed Basis for Research on the Teaching and Evaluation of Critical Thinking Ability”, Harvard Educational Review , 32(1): 81–111.
  • –––, 1981a, “A Conception of Deductive Logical Competence”, Teaching Philosophy , 4(3/4): 337–385. doi:10.5840/teachphil198143/429
  • –––, 1981b, “Eight Fallacies in Bloom’s Taxonomy”, in C. J. B. Macmillan (ed.), Philosophy of Education 1980: Proceedings of the Thirty-seventh Annual Meeting of the Philosophy of Education Society , Bloomington, IL: Philosophy of Education Society, pp. 269–273.
  • –––, 1984, “Problems in Testing Informal Logic, Critical Thinking, Reasoning Ability”, Informal Logic , 6(1): 3–9. [ Ennis 1984 available online ]
  • –––, 1987, “A Taxonomy of Critical Thinking Dispositions and Abilities”, in Joan Boykoff Baron and Robert J. Sternberg (eds.), Teaching Thinking Skills: Theory and Practice , New York: W. H. Freeman, pp. 9–26.
  • –––, 1989, “Critical Thinking and Subject Specificity: Clarification and Needed Research”, Educational Researcher , 18(3): 4–10. doi:10.3102/0013189X018003004
  • –––, 1991, “Critical Thinking: A Streamlined Conception”, Teaching Philosophy , 14(1): 5–24. doi:10.5840/teachphil19911412
  • –––, 1996, “Critical Thinking Dispositions: Their Nature and Assessability”, Informal Logic , 18(2–3): 165–182. [ Ennis 1996 available online ]
  • –––, 1998, “Is Critical Thinking Culturally Biased?”, Teaching Philosophy , 21(1): 15–33. doi:10.5840/teachphil19982113
  • –––, 2011, “Critical Thinking: Reflection and Perspective Part I”, Inquiry: Critical Thinking across the Disciplines , 26(1): 4–18. doi:10.5840/inquiryctnews20112613
  • –––, 2013, “Critical Thinking across the Curriculum: The Wisdom CTAC Program”, Inquiry: Critical Thinking across the Disciplines , 28(2): 25–45. doi:10.5840/inquiryct20132828
  • –––, 2016, “Definition: A Three-Dimensional Analysis with Bearing on Key Concepts”, in Patrick Bondy and Laura Benacquista (eds.), Argumentation, Objectivity, and Bias: Proceedings of the 11th International Conference of the Ontario Society for the Study of Argumentation (OSSA), 18–21 May 2016 , Windsor, ON: OSSA, pp. 1–19. Available at http://scholar.uwindsor.ca/ossaarchive/OSSA11/papersandcommentaries/105 ; last accessed 2022 07 16.
  • –––, 2018, “Critical Thinking Across the Curriculum: A Vision”, Topoi , 37(1): 165–184. doi:10.1007/s11245-016-9401-4
  • Ennis, Robert H., and Jason Millman, 1971, Manual for Cornell Critical Thinking Test, Level X, and Cornell Critical Thinking Test, Level Z , Urbana, IL: Critical Thinking Project, University of Illinois.
  • Ennis, Robert H., Jason Millman, and Thomas Norbert Tomko, 1985, Cornell Critical Thinking Tests Level X & Level Z: Manual , Pacific Grove, CA: Midwest Publication, 3rd edition.
  • –––, 2005, Cornell Critical Thinking Tests Level X & Level Z: Manual , Seaside, CA: Critical Thinking Company, 5th edition.
  • Ennis, Robert H. and Eric Weir, 1985, The Ennis-Weir Critical Thinking Essay Test: Test, Manual, Criteria, Scoring Sheet: An Instrument for Teaching and Testing , Pacific Grove, CA: Midwest Publications.
  • Facione, Peter A., 1990a, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.
  • –––, 1990b, California Critical Thinking Skills Test, CCTST – Form A , Millbrae, CA: The California Academic Press.
  • –––, 1990c, The California Critical Thinking Skills Test--College Level. Technical Report #3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST , ERIC Document ED326584.
  • –––, 1992, California Critical Thinking Skills Test: CCTST – Form B, Millbrae, CA: The California Academic Press.
  • –––, 2000, “The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skill”, Informal Logic , 20(1): 61–84. [ Facione 2000 available online ]
  • Facione, Peter A. and Noreen C. Facione, 1992, CCTDI: A Disposition Inventory , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Noreen C. Facione, and Carol Ann F. Giancarlo, 2001, California Critical Thinking Disposition Inventory: CCTDI: Inventory Manual , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Carol A. Sánchez, and Noreen C. Facione, 1994, Are College Students Disposed to Think? , Millbrae, CA: The California Academic Press. ERIC Document ED368311.
  • Fisher, Alec, and Michael Scriven, 1997, Critical Thinking: Its Definition and Assessment , Norwich: Centre for Research in Critical Thinking, University of East Anglia.
  • Freire, Paulo, 1968 [1970], Pedagogia do Oprimido . Translated as Pedagogy of the Oppressed , Myra Bergman Ramos (trans.), New York: Continuum, 1970.
  • Gigerenzer, Gerd, 2001, “The Adaptive Toolbox”, in Gerd Gigerenzer and Reinhard Selten (eds.), Bounded Rationality: The Adaptive Toolbox , Cambridge, MA: MIT Press, pp. 37–50.
  • Glaser, Edward Maynard, 1941, An Experiment in the Development of Critical Thinking , New York: Bureau of Publications, Teachers College, Columbia University.
  • Groarke, Leo A. and Christopher W. Tindale, 2012, Good Reasoning Matters! A Constructive Approach to Critical Thinking , Don Mills, ON: Oxford University Press, 5th edition.
  • Halpern, Diane F., 1998, “Teaching Critical Thinking for Transfer Across Domains: Disposition, Skills, Structure Training, and Metacognitive Monitoring”, American Psychologist , 53(4): 449–455. doi:10.1037/0003-066X.53.4.449
  • –––, 2016, Manual: Halpern Critical Thinking Assessment , Mödling, Austria: Schuhfried. Available at https://pdfcoffee.com/hcta-test-manual-pdf-free.html; last accessed 2022 07 16.
  • Hamby, Benjamin, 2014, The Virtues of Critical Thinkers , Doctoral dissertation, Philosophy, McMaster University. [ Hamby 2014 available online ]
  • –––, 2015, “Willingness to Inquire: The Cardinal Critical Thinking Virtue”, in Martin Davies and Ronald Barnett (eds.), The Palgrave Handbook of Critical Thinking in Higher Education , New York: Palgrave Macmillan, pp. 77–87.
  • Haran, Uriel, Ilana Ritov, and Barbara A. Mellers, 2013, “The Role of Actively Open-minded Thinking in Information Acquisition, Accuracy, and Calibration”, Judgment and Decision Making , 8(3): 188–201.
  • Hatcher, Donald and Kevin Possin, 2021, “Commentary: Thinking Critically about Critical Thinking Assessment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 298–322. doi: 10.1163/9789004444591_017
  • Haynes, Ada, Elizabeth Lisic, Kevin Harris, Katie Leming, Kyle Shanks, and Barry Stein, 2015, “Using the Critical Thinking Assessment Test (CAT) as a Model for Designing Within-Course Assessments: Changing How Faculty Assess Student Learning”, Inquiry: Critical Thinking Across the Disciplines , 30(3): 38–48. doi:10.5840/inquiryct201530316
  • Haynes, Ada and Barry Stein, 2021, “Observations from a Long-Term Effort to Assess and Improve Critical Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 231–254. doi: 10.1163/9789004444591_014
  • Hiner, Amanda L., 2021, “Equipping Students for Success in College and Beyond: Placing Critical Thinking Instruction at the Heart of a General Education Program”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 188–208. doi: 10.1163/9789004444591_012
  • Hitchcock, David, 2017, “Critical Thinking as an Educational Ideal”, in his On Reasoning and Argument: Essays in Informal Logic and on Critical Thinking , Dordrecht: Springer, pp. 477–497. doi:10.1007/978-3-319-53562-3_30
  • –––, 2021, “Seven Philosophical Implications of Critical Thinking: Themes, Variations, Implications”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 9–30. doi: 10.1163/9789004444591_002
  • hooks, bell, 1994, Teaching to Transgress: Education as the Practice of Freedom , New York and London: Routledge.
  • –––, 2010, Teaching Critical Thinking: Practical Wisdom , New York and London: Routledge.
  • Johnson, Ralph H., 1992, “The Problem of Defining Critical Thinking”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 38–53.
  • Kahane, Howard, 1971, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Belmont, CA: Wadsworth.
  • Kahneman, Daniel, 2011, Thinking, Fast and Slow , New York: Farrar, Straus and Giroux.
  • Kahneman, Daniel, Olivier Sibony, & Cass R. Sunstein, 2021, Noise: A Flaw in Human Judgment , New York: Little, Brown Spark.
  • Kenyon, Tim, and Guillaume Beaulac, 2014, “Critical Thinking Education and Debasing”, Informal Logic , 34(4): 341–363. [ Kenyon & Beaulac 2014 available online ]
  • Krathwohl, David R., Benjamin S. Bloom, and Bertram B. Masia, 1964, Taxonomy of Educational Objectives, Handbook II: Affective Domain , New York: David McKay.
  • Kuhn, Deanna, 1991, The Skills of Argument , New York: Cambridge University Press. doi:10.1017/CBO9780511571350
  • –––, 2019, “Critical Thinking as Discourse”, Human Development, 62 (3): 146–164. doi:10.1159/000500171
  • Lipman, Matthew, 1987, “Critical Thinking–What Can It Be?”, Analytic Teaching , 8(1): 5–12. [ Lipman 1987 available online ]
  • –––, 2003, Thinking in Education , Cambridge: Cambridge University Press, 2nd edition.
  • Loftus, Elizabeth F., 2017, “Eavesdropping on Memory”, Annual Review of Psychology , 68: 1–18. doi:10.1146/annurev-psych-010416-044138
  • Makaiau, Amber Strong, 2021, “The Good Thinker’s Tool Kit: How to Engage Critical Thinking and Reasoning in Secondary Education”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 168–187. doi: 10.1163/9789004444591_011
  • Martin, Jane Roland, 1992, “Critical Thinking for a Humane World”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 163–180.
  • Mayhew, Katherine Camp, and Anna Camp Edwards, 1936, The Dewey School: The Laboratory School of the University of Chicago, 1896–1903 , New York: Appleton-Century. [ Mayhew & Edwards 1936 available online ]
  • McPeck, John E., 1981, Critical Thinking and Education , New York: St. Martin’s Press.
  • Moore, Brooke Noel and Richard Parker, 2020, Critical Thinking , New York: McGraw-Hill, 13th edition.
  • Nickerson, Raymond S., 1998, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”, Review of General Psychology , 2(2): 175–220. doi:10.1037/1089-2680.2.2.175
  • Nieto, Ana Maria, and Jorge Valenzuela, 2012, “A Study of the Internal Structure of Critical Thinking Dispositions”, Inquiry: Critical Thinking across the Disciplines , 27(1): 31–38. doi:10.5840/inquiryct20122713
  • Norris, Stephen P., 1985, “Controlling for Background Beliefs When Developing Multiple-choice Critical Thinking Tests”, Educational Measurement: Issues and Practice , 7(3): 5–11. doi:10.1111/j.1745-3992.1988.tb00437.x
  • Norris, Stephen P. and Robert H. Ennis, 1989, Evaluating Critical Thinking (The Practitioners’ Guide to Teaching Thinking Series), Pacific Grove, CA: Midwest Publications.
  • Norris, Stephen P. and Ruth Elizabeth King, 1983, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1984, The Design of a Critical Thinking Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland. ERIC Document ED260083.
  • –––, 1985, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1990a, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • –––, 1990b, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • OCR [Oxford, Cambridge and RSA Examinations], 2011, AS/A Level GCE: Critical Thinking – H052, H452 , Cambridge: OCR. Past papers available at https://pastpapers.co/ocr/?dir=A-Level/Critical-Thinking-H052-H452; last accessed 2022 07 16.
  • Ontario Ministry of Education, 2013, The Ontario Curriculum Grades 9 to 12: Social Sciences and Humanities . Available at http://www.edu.gov.on.ca/eng/curriculum/secondary/ssciences9to122013.pdf ; last accessed 2022 07 16.
  • Passmore, John Arthur, 1980, The Philosophy of Teaching , London: Duckworth.
  • Paul, Richard W., 1981, “Teaching Critical Thinking in the ‘Strong’ Sense: A Focus on Self-Deception, World Views, and a Dialectical Mode of Analysis”, Informal Logic , 4(2): 2–7. [ Paul 1981 available online ]
  • –––, 1984, “Critical Thinking: Fundamental to Education for a Free Society”, Educational Leadership , 42(1): 4–14.
  • –––, 1985, “McPeck’s Mistakes”, Informal Logic , 7(1): 35–43. [ Paul 1985 available online ]
  • Paul, Richard W. and Linda Elder, 2006, The Miniature Guide to Critical Thinking: Concepts and Tools , Dillon Beach, CA: Foundation for Critical Thinking, 4th edition.
  • Payette, Patricia, and Edna Ross, 2016, “Making a Campus-Wide Commitment to Critical Thinking: Insights and Promising Practices Utilizing the Paul-Elder Approach at the University of Louisville”, Inquiry: Critical Thinking Across the Disciplines , 31(1): 98–110. doi:10.5840/inquiryct20163118
  • Possin, Kevin, 2008, “A Field Guide to Critical-Thinking Assessment”, Teaching Philosophy , 31(3): 201–228. doi:10.5840/teachphil200831324
  • –––, 2013a, “Some Problems with the Halpern Critical Thinking Assessment (HCTA) Test”, Inquiry: Critical Thinking across the Disciplines , 28(3): 4–12. doi:10.5840/inquiryct201328313
  • –––, 2013b, “A Serious Flaw in the Collegiate Learning Assessment (CLA) Test”, Informal Logic , 33(3): 390–405. [ Possin 2013b available online ]
  • –––, 2013c, “A Fatal Flaw in the Collegiate Learning Assessment Test”, Assessment Update , 25 (1): 8–12.
  • –––, 2014, “Critique of the Watson-Glaser Critical Thinking Appraisal Test: The More You Know, the Lower Your Score”, Informal Logic , 34(4): 393–416. [ Possin 2014 available online ]
  • –––, 2020, “CAT Scan: A Critical Review of the Critical-Thinking Assessment Test”, Informal Logic , 40 (3): 489–508. [Available online at https://informallogic.ca/index.php/informal_logic/article/view/6243]
  • Rawls, John, 1971, A Theory of Justice , Cambridge, MA: Harvard University Press.
  • Rear, David, 2019, “One Size Fits All? The Limitations of Standardised Assessment in Critical Thinking”, Assessment & Evaluation in Higher Education , 44(5): 664–675. doi: 10.1080/02602938.2018.1526255
  • Rousseau, Jean-Jacques, 1762, Émile , Amsterdam: Jean Néaulme.
  • Scheffler, Israel, 1960, The Language of Education , Springfield, IL: Charles C. Thomas.
  • Scriven, Michael, and Richard W. Paul, 1987, Defining Critical Thinking , Draft statement written for the National Council for Excellence in Critical Thinking Instruction. Available at http://www.criticalthinking.org/pages/defining-critical-thinking/766 ; last accessed 2022 07 16.
  • Sheffield, Clarence Burton Jr., 2018, “Promoting Critical Thinking in Higher Education: My Experiences as the Inaugural Eugene H. Fram Chair in Applied Critical Thinking at Rochester Institute of Technology”, Topoi , 37(1): 155–163. doi:10.1007/s11245-016-9392-1
  • Siegel, Harvey, 1985, “McPeck, Informal Logic and the Nature of Critical Thinking”, in David Nyberg (ed.), Philosophy of Education 1985: Proceedings of the Forty-First Annual Meeting of the Philosophy of Education Society , Normal, IL: Philosophy of Education Society, pp. 61–72.
  • –––, 1988, Educating Reason: Rationality, Critical Thinking, and Education , New York: Routledge.
  • –––, 1999, “What (Good) Are Thinking Dispositions?”, Educational Theory , 49(2): 207–221. doi:10.1111/j.1741-5446.1999.00207.x
  • Simon, Herbert A., 1956, “Rational Choice and the Structure of the Environment”, Psychological Review , 63(2): 129–138. doi: 10.1037/h0042769
  • Simpson, Elizabeth, 1966–67, “The Classification of Educational Objectives: Psychomotor Domain”, Illinois Teacher of Home Economics , 10(4): 110–144, ERIC document ED0103613. [ Simpson 1966–67 available online ]
  • Skolverket, 2018, Curriculum for the Compulsory School, Preschool Class and School-age Educare , Stockholm: Skolverket, revised 2018. Available at https://www.skolverket.se/download/18.31c292d516e7445866a218f/1576654682907/pdf3984.pdf; last accessed 2022 07 15.
  • Smith, B. Othanel, 1953, “The Improvement of Critical Thinking”, Progressive Education , 30(5): 129–134.
  • Smith, Eugene Randolph, Ralph Winfred Tyler, and the Evaluation Staff, 1942, Appraising and Recording Student Progress , Volume III of Adventure in American Education , New York and London: Harper & Brothers.
  • Splitter, Laurance J., 1987, “Educational Reform through Philosophy for Children”, Thinking: The Journal of Philosophy for Children , 7(2): 32–39. doi:10.5840/thinking1987729
  • Stanovich, Keith E., and Paula J. Stanovich, 2010, “A Framework for Critical Thinking, Rational Thinking, and Intelligence”, in David D. Preiss and Robert J. Sternberg (eds), Innovations in Educational Psychology: Perspectives on Learning, Teaching and Human Development , New York: Springer Publishing, pp. 195–237.
  • Stanovich, Keith E., Richard F. West, and Maggie E. Toplak, 2011, “Intelligence and Rationality”, in Robert J. Sternberg and Scott Barry Kaufman (eds.), Cambridge Handbook of Intelligence , Cambridge: Cambridge University Press, 3rd edition, pp. 784–826. doi:10.1017/CBO9780511977244.040
  • Tankersley, Karen, 2005, Literacy Strategies for Grades 4–12: Reinforcing the Threads of Reading , Alexandria, VA: Association for Supervision and Curriculum Development.
  • Thayer-Bacon, Barbara J., 1992, “Is Modern Critical Thinking Theory Sexist?”, Inquiry: Critical Thinking Across the Disciplines , 10(1): 3–7. doi:10.5840/inquiryctnews199210123
  • –––, 1993, “Caring and Its Relationship to Critical Thinking”, Educational Theory , 43(3): 323–340. doi:10.1111/j.1741-5446.1993.00323.x
  • –––, 1995a, “Constructive Thinking: Personal Voice”, Journal of Thought , 30(1): 55–70.
  • –––, 1995b, “Doubting and Believing: Both are Important for Critical Thinking”, Inquiry: Critical Thinking across the Disciplines , 15(2): 59–66. doi:10.5840/inquiryctnews199515226
  • –––, 2000, Transforming Critical Thinking: Thinking Constructively , New York: Teachers College Press.
  • Toulmin, Stephen Edelston, 1958, The Uses of Argument , Cambridge: Cambridge University Press.
  • Turri, John, Mark Alfano, and John Greco, 2017, “Virtue Epistemology”, in Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Winter 2017 Edition). URL = < https://plato.stanford.edu/archives/win2017/entries/epistemology-virtue/ >
  • Vincent-Lancrin, Stéphan, Carlos González-Sancho, Mathias Bouckaert, Federico de Luca, Meritxell Fernández-Barrerra, Gwénaël Jacotin, Joaquin Urgel, and Quentin Vidal, 2019, Fostering Students’ Creativity and Critical Thinking: What It Means in School. Educational Research and Innovation , Paris: OECD Publishing.
  • Warren, Karen J., 1988, “Critical Thinking and Feminism”, Informal Logic , 10(1): 31–44. [ Warren 1988 available online ]
  • Watson, Goodwin, and Edward M. Glaser, 1980a, Watson-Glaser Critical Thinking Appraisal, Form A , San Antonio, TX: Psychological Corporation.
  • –––, 1980b, Watson-Glaser Critical Thinking Appraisal: Forms A and B; Manual , San Antonio, TX: Psychological Corporation.
  • –––, 1994, Watson-Glaser Critical Thinking Appraisal, Form B , San Antonio, TX: Psychological Corporation.
  • Weinstein, Mark, 1990, “Towards a Research Agenda for Informal Logic and Critical Thinking”, Informal Logic , 12(3): 121–143. [ Weinstein 1990 available online ]
  • –––, 2013, Logic, Truth and Inquiry , London: College Publications.
  • Willingham, Daniel T., 2019, “How to Teach Critical Thinking”, Education: Future Frontiers , 1: 1–17. [Available online at https://prod65.education.nsw.gov.au/content/dam/main-education/teaching-and-learning/education-for-a-changing-world/media/documents/How-to-teach-critical-thinking-Willingham.pdf.]
  • Zagzebski, Linda Trinkaus, 1996, Virtues of the Mind: An Inquiry into the Nature of Virtue and the Ethical Foundations of Knowledge , Cambridge: Cambridge University Press. doi:10.1017/CBO9781139174763

Logical Fallacies

Fallacies are common errors in reasoning that undermine the logic of your argument. Fallacies can be either illegitimate arguments or irrelevant points, and they can often be identified because they lack evidence to support their claims. Avoid these common fallacies in your own arguments and watch for them in the arguments of others.

Slippery Slope: This is a conclusion based on the premise that if A happens, then, through a series of small steps (B, C, ..., X, Y), Z will eventually happen too, in effect equating A with Z. So, if we don't want Z to occur, A must not be allowed to occur either. Example:

If we ban Hummers because they are bad for the environment, eventually the government will ban all cars, so we should not ban Hummers.

In this example, the author is equating banning Hummers with banning all cars, which is not the same thing.

Hasty Generalization: This is a conclusion based on insufficient or biased evidence. In other words, you are rushing to a conclusion before you have all the relevant facts. Example:

Even though it's only the first day, I can tell this is going to be a boring course.

In this example, the author is basing his evaluation of the entire course on only the first day, which is notoriously boring and full of housekeeping tasks for most courses. To make a fair and reasonable evaluation, the author must attend not one but several classes, and possibly even examine the textbook, talk to the professor, or talk to others who have previously finished the course in order to have sufficient evidence to base a conclusion on.

Post hoc ergo propter hoc: This is a conclusion that assumes that if 'A' occurred after 'B' then 'B' must have caused 'A.' Example:

I drank bottled water and now I am sick, so the water must have made me sick.

In this example, the author assumes that if one event chronologically follows another the first event must have caused the second. But the illness could have been caused by the burrito the night before, a flu bug that had been working on the body for days, or a chemical spill across campus. There is no reason, without more evidence, to assume the water caused the person to be sick.
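
A toy simulation (with invented rates, purely for illustration) shows what the missing evidence would look like: unless getting sick is more common after drinking the water than without it, "I drank it and then got sick" tells us nothing about causation.

```python
import random

random.seed(1)

# Toy model: 100,000 person-days. Drinking bottled water and getting sick are
# independent events here (illness strikes 5% of days regardless of the water).
N = 100_000
drank = [random.random() < 0.5 for _ in range(N)]
sick = [random.random() < 0.05 for _ in range(N)]

sick_and_drank = sum(1 for d, s in zip(drank, sick) if d and s)
sick_no_water = sum(1 for d, s in zip(drank, sick) if not d and s)
drinkers = sum(drank)
abstainers = N - drinkers

print(f"P(sick | drank bottled water) ≈ {sick_and_drank / drinkers:.3f}")
print(f"P(sick | no bottled water)    ≈ {sick_no_water / abstainers:.3f}")
# Both rates hover around 0.05. Thousands of people in this model got sick right
# after drinking the water, yet the water changed nothing. "After" only becomes
# evidence of "because of" when the rate with the supposed cause exceeds the rate without it.
```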

Genetic Fallacy: This conclusion is based on an argument that the origins of a person, idea, institution, or theory determine its character, nature, or worth. Example:

The Volkswagen Beetle is an evil car because it was originally designed by Hitler's army.

In this example the author is equating the character of a car with the character of the people who built the car. However, the two are not inherently related.

Begging the Claim: The conclusion that the writer should prove is validated within the claim. Example:

Filthy and polluting coal should be banned.

Arguing that coal pollutes the earth and thus should be banned would be logical. But the very conclusion that should be proved, that coal causes enough pollution to warrant banning its use, is already assumed in the claim by referring to it as "filthy and polluting."

Circular Argument: This restates the argument rather than actually proving it. Example:

George Bush is a good communicator because he speaks effectively.

In this example, the conclusion that Bush is a "good communicator" and the evidence used to prove it "he speaks effectively" are basically the same idea. Specific evidence such as using everyday language, breaking down complex problems, or illustrating his points with humorous stories would be needed to prove either half of the sentence.

Either/or: This is a conclusion that oversimplifies the argument by reducing it to only two sides or choices. Example:

We can either stop using cars or destroy the earth.

In this example, the two choices are presented as the only options, yet the author ignores a range of choices in between such as developing cleaner technology, car-sharing systems for necessities and emergencies, or better community planning to discourage daily driving.

Ad hominem: This is an attack on the character of a person rather than his or her opinions or arguments. Example:

Green Peace's strategies aren't effective because they are all dirty, lazy hippies.

In this example, the author doesn't even name particular strategies Green Peace has suggested, much less evaluate those strategies on their merits. Instead, the author attacks the characters of the individuals in the group.

Ad populum/Bandwagon Appeal: This is an appeal that presents what most people, or a group of people, think in order to persuade one to think the same way. Getting on the bandwagon is one such instance of an ad populum appeal. Example:

If you were a true American you would support the rights of people to choose whatever vehicle they want.

In this example, the author equates being a "true American," a concept that people want to be associated with, particularly in a time of war, with allowing people to buy any vehicle they want even though there is no inherent connection between the two.

Red Herring: This is a diversionary tactic that avoids the key issues, often by avoiding opposing arguments rather than addressing them. Example:

The level of mercury in seafood may be unsafe, but what will fishers do to support their families?

In this example, the author switches the discussion away from the safety of the food and talks instead about an economic issue, the livelihood of those catching fish. While one issue may affect the other it does not mean we should ignore possible safety issues because of possible economic consequences to a few individuals.

Straw Man: This move oversimplifies an opponent's viewpoint and then attacks that hollow argument. Example:

People who don't support the proposed state minimum wage increase hate the poor.

In this example, the author attributes the worst possible motive to an opponent's position. In reality, however, the opposition probably has more complex and sympathetic arguments to support their point. By not addressing those arguments, the author is not treating the opposition with respect or refuting their position.

Moral Equivalence: This fallacy compares minor misdeeds with major atrocities, suggesting that both are equally immoral. Example:

That parking attendant who gave me a ticket is as bad as Hitler.

In this example, the author is comparing the relatively harmless actions of a person doing their job with the horrific actions of Hitler. This comparison is unfair and inaccurate.

Guide to the Most Common Logical Fallacies

Logical fallacies are flaws in reasoning that weaken or invalidate an argument. Whether they’re used intentionally or unintentionally, they can be quite persuasive. Learning how to identify fallacies is an excellent way to avoid being fooled or manipulated by faulty arguments. It will also help you avoid making fallacious arguments yourself! 

Scroll through to learn to identify some of the more common fallacies. Once you can recognize them, you’ll see them everywhere!

A note on how to use this post: This page is a resource of the most common logical fallacies, and is not intended to be read from top to bottom. Feel free to share the graphics to help educate others about errors in reasoning. Hopefully together we can encourage more productive (and logical!) dialogue.

Graphic summary: The ad hominem fallacy attempts to discredit an argument by attacking the source. Examples shown in the graphic:

  • "Vaccines are safer than the disease." / "You're in bed with big pharma!"
  • "Humans are causing climate change." / "Scientists are in it for the money!"
  • "GMOs are safe to eat." / "You're a Monsanto shill!"

Not everyone who endorses a position is a paid "shill." Instead, arguments should be evaluated on their merits.

Other names: Personal attack, name-calling

Definition and explanation: Latin for “to the person,” the ad hominem fallacy is a personal attack. Essentially, instead of addressing the substance of an argument, someone is attempting to discredit the argument by attacking the source.

The ad hominem is one of the most common logical fallacies. While it can take many forms — from name calling and insults, to attacking a person’s character, to questioning their motives, to calling them hypocrites — any argument that targets the source, rather than the argument, is an ad hominem. Unfortunately, ad hominem arguments are often quite effective because they appeal to people’s emotions and biases. 

No matter the form, the ad hominem is fallacious because the source of the argument is irrelevant to the substance of the argument. It’s a diversion tactic.

How to counter: While there are many sub-types of ad hominem fallacies, identifying the exact kind is less important than recognizing someone is attempting to divert from your argument back onto you in some way. It might be tempting to respond to their attack in kind. However, by resorting to an ad hominem fallacy, an arguer is essentially admitting they lack a substantive argument. Instead of letting them get under your skin, point out the irrelevance of their argument. Or, you could ignore it and move on!


Anecdotal fallacy

Graphic summary: The anecdotal fallacy uses a personal experience or story as evidence for a claim. Examples shown in the graphic:

  • "I've seen a ghost, so I know they're real."
  • "Removing GMOs from my diet cured my ADHD."
  • "This is a crazy snowstorm. What global warming?"
  • "My parents spanked me and I turned out fine."
  • "Acupuncture cured my headaches. That's all the proof I need."

For a more detailed explanation: Four ways your personal experiences can lead you astray

Definition and explanation: The anecdotal fallacy uses a personal experience or story as evidence for a claim.

Many people think personal experiences provide the best kind of evidence. “We’ll believe it when we see it!” But anecdotes are infamously unreliable. 

  • We can misperceive our experiences. While there is an objective reality outside of our heads, our perception of that reality is a subjective interpretation. Our brains filter and interpret stimuli and fill in any gaps based on expectations. Yet even though our perceptions are flawed and incomplete, we’re convinced we “know” what we saw or experienced.  For example, eyewitness testimony tends to be among the most valued forms of evidence in a trial…yet it’s also the leading cause of wrongful convictions. 
  • Anecdotes aren’t controlled. Imagine you have a headache and take a supplement. Your headache gets better! Was it due to the supplement? The correct answer is: you don’t know. Any number of things could’ve been the cause. (And also… nearly all headaches go away on their own.) That’s why treatments are tested in carefully controlled trials that compare the treatment to a placebo, as the short simulation below illustrates.
  • Anecdotes often aren’t typical. Small samples are often not representative of normal conditions. Yet because the human brain doesn’t intuitively grasp probabilities, we rely on stories or experiences when deciding what to believe. Consider the following examples: My cousin was mugged in Sydney, so Australia is dangerous. Toyotas are unreliable because I once had a Toyota that was always in the shop. This winter seemed really cold, so there’s no global warming.
  • Finally, people aren’t always truthful.  Did that person on social media REALLY lose weight eating bacon and ice cream? Did your dad exaggerate his “ghost” story? Did the person in the testimonial really “cure” their acne with the supplement? It’s the hard truth, but people can lie.

The bottom line: Anecdotes aren’t good evidence. 
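
To make the point about controlled comparisons concrete, here is a toy simulation of the headache example from the list above (all numbers invented): a supplement that does nothing still looks impressive if you only hear from the people who took it, because most headaches resolve on their own.

```python
import random

random.seed(7)

# Toy randomized trial: 1,000 people with headaches, half assigned at random to
# a supplement that does nothing, half to a placebo. In this model, 80% of
# headaches simply resolve on their own within a day, in both groups.
def improved(n, natural_recovery_rate=0.8):
    return sum(random.random() < natural_recovery_rate for _ in range(n))

supplement_improved = improved(500)
placebo_improved = improved(500)

print(f"Felt better on the supplement: {supplement_improved} / 500")
print(f"Felt better on the placebo:    {placebo_improved} / 500")
# An anecdote only ever reports the first line ("I took it and my headache went
# away"). Only the comparison with the placebo group reveals that the supplement
# added nothing beyond natural recovery.
```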

How to counter: First, it’s important to not fall for this fallacy yourself. Remember that your brain prefers stories – especially vivid and emotional ones – to data. So if your goal is to align your beliefs with reality, be skeptical about the conclusions you can draw from your lived experiences and be humble enough to admit that you might be wrong.

In conversations with others, be empathetic. Most people are convinced that anecdotes are a sure-fire way of knowing what’s true, and it can be quite difficult to convince someone that they might be wrong. We don’t like to admit that we can be fooled, especially by ourselves. If the conversation allows, gently explain why anecdotes aren’t good evidence.

Appeal to authority

Graphic summary: The appeal to authority fallacy asserts that a claim is true based on the position of an assumed authority. Examples shown in the graphic: a friend who is a nurse says vaccines cause autism; a senator says climate change is a hoax; an actress says eating cheese causes acne. These are all false authorities, not actual experts. Note: deferring to the expert consensus is not fallacious!

Other names: Argument from authority, appeal to false authority

Definition and explanation: The appeal to authority fallacy claims that something is true based on the position of an assumed authority.

This fallacy is very common, and can be quite tricky, so it’s important to be able to recognize it and understand when it is and isn’t fallacious. In general, arguments should have evidence, and authorities aren’t evidence. However, since it’s impossible for any of us to verify every single thing ourselves, we often rely on experts. For example, we trust mechanics to fix cars, doctors to treat diseases, and pilots to fly planes. Due to their experience, training, and knowledge, experts are better able to evaluate evidence related to their areas of expertise. Therefore, citing actual experts isn’t an appeal to authority. However, citing authorities can only support an argument, not prove it, as experts can be wrong… though they are more likely to be right than non-experts. 

An appeal to authority IS fallacious when:

  • The “authority” isn’t an expert. This is probably the most common form of the fallacy, and easiest to spot. For example, Jenny McCarthy isn’t a vaccine expert, the Food Babe isn’t a nutrition expert, and Tom Brady isn’t a health expert, yet all use their celebrity to promote products or services. Or consider a senator who says climate change is a hoax, or your math teacher who claims aliens built the pyramids. The point is, these supposed “authorities” aren’t experts, so their testimony doesn’t hold any weight.
  • The “authority” is an expert in another area. The opinion of experts is only relevant within their area of expertise. For example, a heart surgeon is not an expert in climate change and a physicist is not an expert in cancer. An advanced degree does not make someone an expert in all areas, and there will always be someone with a doctorate who has crackpot ideas. So when an authority makes a claim, it’s important to ask whether they are in a position to make such a claim.
  • The “authority” makes a claim that is contrary to the expert consensus in that particular area. The expert consensus is the collective agreed-upon knowledge of experts in a specialized field. For non-experts, the consensus is the most reliable form of knowledge. However, it can be difficult for the average person to sort out, especially when someone who appears to be a legitimate expert makes claims contrary to the consensus. If you’re cherry-picking an authority with a minority position to make an argument while neglecting the authority of the expert consensus, you’re committing an appeal to authority.

The bottom line is that if you’re going to cite an authority to make an argument, it needs to be an actual expert on the topic at hand, and one who is in line with the expert consensus; otherwise you’ve committed an appeal to authority fallacy.

How to counter: We’re often unaware of how many of our beliefs originated from trusted authority figures, such as teachers, parents, and various online sources. Only when we have to defend a belief do we realize how much of our knowledge we’ve outsourced… which can result in us committing an appeal to authority fallacy. Get in the habit of evaluating authorities to determine if they have the expertise to make various claims. And empathize with others who may not yet know how to discern who is and isn’t a reliable authority.

To learn more: The Logic of Science: Appealing to Authority vs Deferring to Experts

Appeal to emotions

The appeal to emotions fallacy uses emotions, either negative or positive, to persuade in place of evidence. Fear: Don't vote for them. You won't even be safe in your own home! Anger: This crime was heinous. The defendant must be guilty! Pity: Please don't fail me. I need this class to graduate! Happiness: Buy this product. It will make you feel great! TIP: Emotions can drive our reasoning without our awareness. Ask yourself: What is the evidence?

Definition and explanation : The appeal to emotion fallacy uses emotion to persuade, instead of evidence. 

This fallacy is actually a broad category of fallacies that includes appeals to any number of emotions, ranging from negative (anger, jealousy, shame, hate, and disgust) to positive (happiness, hope, courage, and love). Recognizing which specific emotion was used isn’t as important as realizing that an argument has substituted logical reasons with emotional ones.

However, it’s important to note that the use of emotions in-and-of-itself is not fallacious, as they can be relevant to an argument. It’s only fallacious when feelings are used instead of facts as evidence that something is true or false. 

The appeal to emotion fallacy is often used alongside other fallacies and rhetorical techniques in an attempt to bolster an argument. For example, loaded, hyperbolic, or even inflammatory language is common, as are ad hominem attacks, name-calling, YELLING, etc.

As you might expect, the appeal to emotion is common in advertisements, politics, propaganda…and quite frankly, everywhere. It can also be very effective, as many of us are unaware of how much our emotions can drive our reasoning. This not only leaves us vulnerable to emotional manipulation by others, it also makes us prone to committing this fallacy ourselves. (When we’re emotional we tend to make emotional arguments.)

If you notice that you’re emotionally triggered in some way, practice emotional skepticism. Instead of allowing your emotions to control you, try to take a step back and evaluate the arguments on their merits.

How to counter : As always, it’s important not to commit this fallacy yourself. One of the most important – and difficult – critical thinking skills is emotional awareness. Ask yourself: are my emotions driving my reasoning? And if so, am I using emotions to argue? In short, try to argue with facts, not feelings.

If someone else is appealing to your emotions, consider that it might not be a deliberate attempt to manipulate your reasoning. Sometimes they’re a reflection of the emotions of the person making the argument. We all do it from time to time, so have empathy! They may not realize what they’re doing. Instead, kindly point out the error and ask for evidence.

The bottom line is that arguments should be supported by evidence – and emotions aren’t evidence.

Appeal to nature

The appeal to nature fallacy claims that natural is good and/or unnatural is bad. Advertisements: all natural, herbal supplement, nature's best remedies, plant-based cleaner, organic non-GMO water, chemical-free and non-toxic beauty cream. It's a myth because nature can be harmful (examples: the Black Death, arsenic, tapeworms, asbestos, smallpox, black widow spiders) and synthetic can be good (examples: vaccines, antibiotics, toilets, fertilizers, pasteurization, chemotherapy, satellites, drugs).

Definition and explanation : The appeal to nature fallacy argues that something is good because it’s natural or bad because it’s unnatural. 

There are two major reasons the appeal to nature is fallacious:

  • “Natural” is difficult to define. Many things that are “unnatural” come from nature, so when does something stop being natural? For example, oil is natural, and plastic is made from oil. Is plastic natural? Many antibiotics originally came from nature, so are antibiotics natural? It’s also often unclear what someone means by “natural.”  People will often use vague terms like “chemicals” to denote that something is unnatural. But literally everything is made of chemicals. The point is, dividing things into natural and synthetic is often a false choice. Like most things in life, it’s not black and white, but shades of gray. 
  • We can’t assume something is good based on whether it’s natural or unnatural. I have bad news: nature doesn’t care if you live. Nature is full of things that will harm or kill you, from typhoons to botulinum to viruses to sharks.  Also, human advancements that are considered “unnatural” have greatly improved our quality and quantity of life. Flushing toilets, treated drinking water, antibiotics, and vaccines have nearly doubled our life expectancy over the last century. 

Appeals to nature are very common, as advertisers are acutely aware of our tendency to think natural equals good. For example, “all natural” labels are ubiquitous on food packages; personal care products claim they are “plant based” and “chemical free”; and alternative medicine routinely claims its “natural” treatments are safer than the “chemicals” and “pharmaceuticals” used in modern medicine. 

The point is, the appeal to nature is fallacious because we simply cannot determine if something is good based on whether it’s perceived to be natural.

How to counter: First, try not to fall for the fallacy yourself! Notice how often it’s used to sell products. Then ask yourself what they mean by “natural,” and remember that its supposed naturalness doesn’t inherently mean it’s good. 

When countering the appeal to nature fallacy, keep in mind how pervasive the belief is. Consider asking questions to get to the root of someone’s misunderstandings, and based on their responses, offer gentle explanations.

To learn more: The Logic of Science: Dying the way that nature intended: Appeal to nature fallacies

Appeal to tradition

The appeal to tradition fallacy assumes something is good or true because it's old or the way it's always been done. Examples: opposing gay marriage because marriage has historically been between a man and a woman, or expecting women to change their names when they marry. If tradition is your only reason, you don't have a good reason.

Other names: Appeal to antiquity, appeal to age, appeal to custom

Definition and explanation : The appeal to tradition fallacy asserts that something is good or true because it’s old, or because it’s the way it’s “always been done.” 

People have a fascination with the past, from ancient civilizations to old cultural traditions to long-standing familial practices. We tend to stick with the familiar, and assume it’s the right, or better, way of doing things. We have believing brains , and often believe what we’re told, perpetuating beliefs over generations, even if the beliefs aren’t true.

The appeal to tradition fallacy is very common, and if you’re not paying attention it can be quite convincing. We equate being around for a long time with standing the test of time. For example, astrology has been used for millennia, and acupuncture is an ancient practice… so they must work! Sometimes that’s true. But we’ve also weeded out a lot of bad ideas. The “ancients” believed the earth was flat and diseases were caused by witches. We can thank science and progress for discovering that germs cause diseases and bloodletting isn’t an effective treatment. (Or burning witches at the stake!)

The appeal to tradition is fallacious because it substitutes actual reasons with historical ones. If something truly works, provide the evidence. “Ancient wisdom” and “this is the way it’s always been done” are NOT sufficient.

How to counter: First, try not to fall for the fallacy yourself! Notice how often it’s used to sell products and services, or to justify existing social, cultural, or religious practices. Then ask yourself: Other than tradition, what is the evidence?

When countering the appeal to tradition fallacy, remember how pervasive the belief is, then consider explaining why it’s fallacious. It might even be a good opportunity to explain the value of evidence!

Argument from ignorance

Argument from ignorance logical fallacy asserts a claim is true because we don't know that it's not. We don't know exactly how the pyramids were built. It must've been aliens. That had to have been a ghost! You can't prove it wasn't. Just because science can't explain chi yet doesn't mean it isn't real. Claims require evidence. A lack of evidence simply means we don't know.

Other names: Appeal to ignorance

Definition and explanation : The argument from ignorance fallacy asserts that something is true, because we don’t know that it’s not. A foundation of critical thinking is that claims require evidence, and claims without evidence can be dismissed without evidence. A lack of evidence simply demonstrates our ignorance, and is not a reason to believe. 

This fallacy is commonly used by science deniers, pseudoscience proponents, conspiracy theorists, and believers in the supernatural who point to anomalies that aren’t immediately explainable as “evidence” of their beliefs. Was that a UFO? Bigfoot? A ghost? A miracle? We don’t know that it wasn’t, so it must have been! 

The argument from ignorance is fallacious because when we don’t know, we simply don’t know! A more rational response is to maintain a healthy level of skepticism and demand sufficient evidence before accepting a claim.

This fallacy is frequently committed alongside the burden of proof fallacy , which is when someone makes a claim without evidence then demands their opponent disprove their claim. They then claim their belief is true because you can’t disprove it.

How to counter: If someone is unfamiliar with critical thinking, they may not understand the importance of evidence. Kindly remind them that claims require evidence. And if they demand that you disprove their claim, remind them that they bear the burden of proof. 

Bandwagon

The bandwagon fallacy (appeal to the masses, appeal to popularity) asserts a claim is true because many people believe it. Examples: “Everyone was speeding, so I shouldn't get a ticket”; half of Americans believe in ghosts; best-seller lists; “the election was rigged”; millions of people use homeopathy.

Other names: Appeal to the masses, appeal to common belief, argumentum ad populum, appeal to popularity

Definition and explanation : The bandwagon fallacy argues that a claim is true because a lot of people believe it’s true, or conversely a claim is not true because a lot of people believe it’s not. 

Because humans are social animals, the bandwagon fallacy can be quite convincing. A common tactic is to claim “everybody knows” or “people say/think.” Unsurprisingly, advertisers use our desire to belong by touting the popularity of their products. Likewise, politicians appeal to their popularity when trying to earn our support. But bandwagon is fallacious because popularity doesn’t determine truth. A lot of people can be, and have been, wrong. Your parents may have warned you about the potential harm of peer pressure when they asked rhetorically, “If all of your friends jumped off a bridge, would you do it, too?!?!”

How to counter: It’s important to remember that the truth of a claim isn’t determined by its popularity but by the evidence supporting it. Therefore, once you recognize the use of the bandwagon fallacy, call it out, and ask for evidence.

Burden of proof

The burden of proof fallacy: someone makes a claim that requires evidence, then shifts the burden of proof onto another to prove it wrong. Example: “You claimed unicorns exist. Where's the evidence?” Responses: “Do your research.” “Prove they aren't!” “You have Google. Look it up!” TIP: Remember that extraordinary claims require extraordinary evidence, and claims made without evidence can be dismissed without evidence.

Other names: Shifting the burden of proof

Definition and explanation : A person who makes a claim bears the burden of proof to provide sufficient evidence for their claim. This standard is the foundation of our legal system, the process of science, and critical thinking. How much evidence is sufficient depends on the claim, but more extraordinary claims require more extraordinary evidence. And claims made without evidence can be dismissed without evidence.

The burden of proof fallacy occurs when someone makes a claim, but instead of providing evidence, they demand their opponent disprove their claim. To illustrate why this is fallacious, imagine a prosecutor charging a suspect with murder, but instead of presenting evidence to establish the suspect’s guilt beyond a reasonable doubt, they demand the suspect prove their innocence. The point is, it’s never up to anyone else to prove your claim wrong. You made the claim, you need to provide evidence.

The burden of proof fallacy frequently appears alongside the argument from ignorance fallacy , which is when someone asserts a claim is true because no one has proven it false. But remember that the truth of a claim is determined by the amount of evidence supporting it. A lack of evidence simply means we don’t know.

How to counter: Evasion of the burden of proof can take many forms, from shifting the burden to their opponent (e.g., “prove me wrong”), to making claims that can’t be disproved (e.g., supernatural or subjective claims), to shifting the burden onto vague sources (e.g., “they say”). People unfamiliar with critical thinking are often unaware of the burden of proof concept, or even evidence-based thinking. To respond, point out that they’ve made a claim that requires evidence and that they bear the burden of proof. 

Cherry picking

Cherry picking fallacy: Selectively chooses evidence to support a claim instead of considering all available evidence. Shown is a cherry tree full of red cherries with two blue cherries. A hand is picking the blue cherries and says "all cherries are blue!"

Definition and explanation : The cherry picking fallacy occurs when evidence is selectively chosen to support a claim instead of considering all available evidence. 

Imagine a cherry tree, where each of the cherries represent a piece of evidence for a claim. If the goal is to determine the truth of the claim, it’s essential to look at the body of evidence. But by cherry picking favorable evidence – and ignoring contradictory evidence – the overall understanding of an issue can be distorted.

It’s possible to selectively choose evidence to support nearly any position and miss the bigger picture. For example, every living thing needs liquid water. In fact, water is so essential to life that, when looking for life outside of Earth, we look for evidence of water. But what if I told you that all serial killers have admitted to drinking water? Or that it’s the primary ingredient in many toxic pesticides? Or that drinking too much water can lead to death? By selectively choosing these facts, we’re left with a distorted, inaccurate view of water’s importance for life.

Sometimes cherry picking is a deliberate attempt to deceive. However, sometimes it’s not purposeful, but a result of the arguer’s desire to believe. (No one can lie to us like we can.) In either case, the more motivated we are the “better” we are at cherry picking, and the more fallacious the argument. 

How to counter : This fallacy can be convincing and can even lead to overconfidence in a position. After all, cherry picked evidence is still evidence! That said, if the goal is to find the truth, all evidence must be considered.

As always, be sure you don’t commit this fallacy yourself! Be aware of potential emotions and biases that could be driving your reasoning.

And if someone else commits this fallacy, apply the principle of charity. It’s possible that their selection of evidence was caused by a desire to believe, not a desire to deceive. Then gently point out evidence they might not have considered.

False Cause

The false cause fallacy, or confusing correlation with causation, assumes that because two things occurred together, one must have caused the other. Examples given are vaccines and autism, crystal necklaces and catching colds, and a St. Joseph statue and selling a house.

Other names: Confusing correlation and causation, Questionable cause, Mistaking correlation for causation, Post hoc ergo propter hoc

Definition and explanation : The false cause fallacy assumes that two events are causally connected when they aren’t. 

You may have heard the phrase, “Correlation doesn’t prove causation.” And it’s true… but what does that mean? Let’s start with a couple of definitions:

  • Correlation : A co-relationship or link exists between things or events, so that they occur together
  • Causation : A change in one thing or event causes a change in another

The point is that just because two things occur together doesn’t mean one caused the other. Sometimes it’s just a coincidence, and sometimes there’s something else causing both events.

For example, every morning the rooster crows and the sun rises. These two events are correlated. But the rooster doesn’t cause the sunrise.

While that may be an obvious example, it’s very easy to confuse correlation and causation. And if we’re not careful, we can easily be fooled. It’s human nature to search for explanations by finding patterns and connecting the dots. Maybe you ate a sandwich for lunch and threw up later? Or maybe you wore your lucky socks and hit a home run? But are these events causally related? We often don’t know, yet assume we do. 

Science can help sort correlation from causation, but it’s complicated, as only carefully controlled studies can demonstrate causation. Let’s say we want to find out if Echinacea treats the common cold.  We randomly divide subjects into two groups, where one gets Echinacea and the other a placebo. (Importantly, we don’t let the participants know which group they’re in!) If the group receiving Echinacea reports shorter and less severe colds than the placebo group, we can say that Echinacea has a causal effect on colds. This research has been done , and Echinacea does not treat colds. However, many people will swear that it does, because they’ve tried it and their cold went away. Some of this is probably a placebo effect. But also… colds go away. That’s a correlation.  
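To make the placebo-group logic concrete, here is a minimal Python sketch (my own illustration, not from the original article; the numbers and the "remedy" are entirely hypothetical) that simulates a treatment with no real effect. Every simulated cold gets better on its own, so each individual recovery looks like the remedy "worked", but comparing the two randomized groups shows no difference.

```python
import random

random.seed(42)

# Hypothetical illustration only: colds last about 7 days on average,
# and the "remedy" has NO real effect on duration.
def cold_duration_days():
    return max(1, round(random.gauss(7, 2)))

participants = [cold_duration_days() for _ in range(1000)]
random.shuffle(participants)

remedy_group = participants[:500]   # randomly assigned to the remedy
placebo_group = participants[500:]  # randomly assigned to the placebo

def mean(values):
    return sum(values) / len(values)

print(f"Remedy group:  {mean(remedy_group):.2f} days on average")
print(f"Placebo group: {mean(placebo_group):.2f} days on average")
# Both averages come out nearly identical. Everyone "got better",
# but only the comparison with the placebo group reveals that the
# remedy made no difference; a single recovery is just a correlation.
```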

However, it’s not always possible (or ethical) to run controlled experiments. Therefore much of science involves observational studies, where data is collected in the “real world.” Because the real world is messy, these studies only provide correlations. For example, people observed that smokers tended to get lung cancer. We can’t ethically ask study participants to smoke to see if they get sick… so we had to study the issue in other ways, such as epidemiological studies. Through lots of research exploring the different possible variables that could be causing the link between smoking and lung cancer, eventually we concluded smoking almost certainly causes lung cancer. 

The false cause fallacy is extremely common, so it’s important to learn how to recognize it so we don’t fall for it ourselves. Not only do we see patterns everywhere, but much of science is based on studies that provide “links.” We should always keep in mind the limits of what we can say about the relationship between events. Your best bet is to be skeptical and hold off on assuming causation until you know more.

How to counter:  First, try not to fall for the fallacy yourself! Notice how often you perceive correlations. Then think through what else might be behind the relationship. We all want explanations, and understanding the difference between correlation and causation can be an empowering way to make better decisions.

In conversations with others, remember that they may not know the difference between correlation and causation. So if someone commits this fallacy give them the benefit of the doubt, and try to gently explain it to them. We all would benefit from understanding the difference!

False choice

The false choice fallacy oversimplifies a complex issue to two options. Examples: “The choice is clear: socialism or freedom.” “Which came first, the chicken or the egg?” “You're either with me or against me.” Be wary of black-or-white thinking. Ask: are these really my only choices?

Other names: False dilemma, false dichotomy, black-or-white, either/or, fallacy of the excluded middle

Definition and explanation : The false choice oversimplifies a complex issue into two options, ignoring the possibility that other options may exist. In this fallacy, the arguer frames their opponent’s choices as black or white, this or that, when in reality there is likely a spectrum of options in between. Often, the choices presented are used to shut down debate and force an opponent into the arguer’s preferred position.

False choice fallacies are quite common, and they can be very powerful, so it’s important to be able to recognize them. Binary thinking can force us into positions we wouldn’t normally take. The false choice is fallacious because the vast majority of the time there are more than two positions available to us. 

How to counter: First, consider the possibility that the false choice was unintentional, due to an emotional attachment to one of the choices or a lack of knowledge on the issue. Sometimes, however, the false choice was a purposeful attempt to strengthen the arguer’s position, by presenting their view as the only reasonable option. Depending on the circumstances, it can be difficult to propose an alternative, as the two options may have been presented to force choosing a side. 

The most important thing is to not let a false choice limit your options. So ask yourself, are these really my only two choices?

False equivalence

The false equivalence fallacy argues that two or more things are the same, despite key differences. Shown are an apple and a tomato. The apple says, “Apples and tomatoes are both fruits, so they're the same!” The tomato responds, “I think we're missing something important.”

Definition and explanation : The false equivalence fallacy argues that two or more things are the same, despite having important differences.

Comparisons can be a powerful way to understand new concepts, as they help us to make connections with things we already know.  

But the devil is in the details. While superficial similarities are often easier to see, they may be masking important differences…and if we’re not careful we could be misled.

The false equivalence fallacy can take many different forms, some of which include:

  • Oversimplifying a comparison by exaggerating a shared characteristic, downplaying important differences, or removing essential context or nuance, such as comparing things that have differing orders of magnitude. Or in other words, comparing apples to oranges .
  • Presenting a false balance by giving equal weight to “both sides” of an argument despite one side being supported by significantly more evidence. Sometimes called bothsidesism , this type of false equivalence is often committed by the media, especially when covering science. In their attempt to avoid bias, opposing “sides” of an issue are given equal air time, creating the impression of a balanced debate.
  • Drawing a moral equivalence between actions that aren’t comparable. Sometimes referred to as whataboutism , it implies all parties are equally wrong or to blame. Essentially, this is a diversion tactic, used to justify someone’s behavior and/or deflect criticism. 

False equivalences are very common and can be quite persuasive if you’re not paying attention. It’s often easier to see similarities than it is to dive into the weeds and analyze how things are different, but just because they seem to share an attribute does not make them equivalent.  So if someone makes a comparison, stop and ask yourself: Are there important differences that would invalidate the conclusion? 

How to counter : First, it’s important to not commit this fallacy yourself! If you’re making a comparison, don’t just think about how they’re similar, try to think about how they’re different.  Make sure you’re able to justify why things are equivalent in a way that’s relevant to the conclusion.

In conversations with others, (kindly) explain to them why their argument is flawed. Point out that, while there are similarities, there are key differences that impact their conclusion. 

Galileo gambit

Galileo gambit: Someone claims they must be right because they've been criticized or mocked, often for disagreeing with a scientific consensus. A flat earther says, "They mocked Galileo and called him a denier... so I'm right, too!" Galileo responds, "That's not how it works." Tip: Feeling persecuted isn't the same as being correct.

Definition and explanation: The Galileo gambit claims that someone must be correct because they’ve been criticized, often for disagreeing with a scientific consensus. The argument suggests that, since Galileo was thought to be wrong but was later proven right, people who are thought to be wrong today will also be proven right.

Galileo Galilei (1564-1642) was an astronomer who was convicted by the church for his support of heliocentrism, the idea that the sun (and not the earth) was the center of the universe.

Those who deny science are quite fond of pointing out that “Galileo was branded a denier” and often point to this famous quote: “In questions of science, the authority of a thousand is not worth the humble reasoning of a single individual.”

But many who use Galileo’s name are missing a few important things.

The Galileo gambit is a type of false equivalence, which compares things that have important differences. For example, Galileo was suppressed by the Church, not the “scientific establishment”. While scientists continued to disagree and collect evidence, they didn’t ban his teachings or put him under house arrest for heresy. Conversely, the Church didn’t clear Galileo’s name or admit he was right until 1992!

And importantly, disagreeing with a consensus doesn’t make a position correct. The vast majority of people (scientists included!) who refuse to accept well-supported scientific conclusions are just wrong. 

Those who compare themselves to Galileo imagine themselves as being persecuted or censored for bravely standing up to a scientific establishment. But feeling persecuted isn’t the same as being correct. And ironically, those who deny overwhelming evidence bear more resemblance to the ideologically-motivated church than to the scientific community.

This fallacy is also based on the idea, largely a myth, that scientific consensus is overturned by lone geniuses. Galileo was building on the work of Nicolaus Copernicus. Around the same time, Johannes Kepler was developing the laws of planetary motion. And in 1687, Isaac Newton explained planetary motion using the universal law of gravitation.

For every fringe idea that turns out to be true, there are countless others that are just wrong. As Carl Sagan said, “The fact that some geniuses were laughed at does not imply that all who are laughed at are geniuses. They laughed at Columbus, they laughed at Fulton, they laughed at the Wright brothers. But they also laughed at Bozo the Clown.”

Circling back to Galileo’s quote, the “authority” he was speaking of was the religious authorities in the Church, not scientists. He also wasn’t saying that disagreeing with authorities is what makes someone right, but their “humble reasoning.” In short, evidence trumps authority.

The point is, denying a consensus doesn’t make you like Galileo. Revolutionary ideas do get accepted, but they require evidence. We don’t remember Galileo because he disagreed, but because he was right. 

How to counter: While science denial relies heavily on the use of logical fallacies, the Galileo gambit is one of the most reliable indicators . It’s easier to think of ourselves as outsider geniuses heroically standing up for the truth than it is to admit the much more likely possibility that it’s us who’s wrong. No one likes to think of themselves as a science denier. 

Science denial usually stems from a desire to avoid accepting conclusions that conflict with our identity or ideology, or those in which we don’t like the solutions (i.e., solution aversion ). Thus, if you see this fallacy in use, consider the underlying motivations. 

One option is to explain why the comparison with Galileo doesn’t stand up to scrutiny.  Or you could simply ask, what evidence would change their mind ?

Hasty generalizations

Hasty generalizations fallacy draws a broad conclusion based on a small sample. Examples: Kids are all so rude these days! Global warming isn't real. It's cold outside! A dog bit me, so dogs are dangerous. Also known as: Jumping to conclusions, overgeneralization, argument from small numbers, biased generalization, sweeping generalizations. Avoid exaggerations and overconfidence. And when possible, use reliable data!

Other names : Jumping to conclusions, argument from small numbers, overgeneralization, biased generalization

Definition and explanation : The hasty generalization fallacy draws a broad conclusion based on a small sample. 

All day, every day, we use limited information to generalize. By definition we have to, as we will never have fully complete data sets to analyze for each and every decision. However, the strength of our conclusion depends on our sample: those based on larger and more representative samples are significantly stronger than those based on smaller or atypical samples.
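As a rough illustration of why sample size matters, here is a minimal Python sketch (my own example, using a made-up population in which exactly 10% of drivers are "bad"). Tiny samples swing wildly, which is how one or two experiences turn into sweeping claims, while larger samples settle near the true value.

```python
import random

random.seed(0)

# Hypothetical population for illustration: 10% of drivers are "bad".
population = [1] * 10 + [0] * 90  # 1 = bad driver, 0 = not

def estimated_percentage(sample_size):
    sample = [random.choice(population) for _ in range(sample_size)]
    return 100 * sum(sample) / sample_size

for n in (3, 30, 300, 3000):
    print(f"Sample of {n:4d}: estimated {estimated_percentage(n):5.1f}% bad drivers (true value: 10%)")
# Small samples can easily produce 0% or 33%, while large samples
# cluster close to the real 10%, so conclusions drawn from a handful
# of anecdotes deserve far less confidence.
```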

The hasty generalization fallacy occurs when we use limited evidence (i.e. a small sample) to make a broad claim. Often the examples we use to support our conclusion (i.e. our “sample”) are simply anecdotes , such as our own personal experiences or stories we’ve heard from others. Armed with our “evidence”, we jump to a conclusion. Essentially, we assume our “sample” is representative of reality. But not only are our experiences limited, our biases can influence the examples we choose, resulting in a conclusion that isn’t justified.

For example:

  • Anthony was bitten by a cat when he was young, so he concludes that “all cats bite.” 
  • Leslie claims “smoking isn’t bad for you,” because his dad smoked his whole life “and was fine.” 
  • Monica was mugged on her vacation in Sydney, so she claims that “Australia is a dangerous country.” 
  • Jason doesn’t like the news coverage of his favorite politician, so he says “all media is biased.”

Hasty generalizations can lead to the formation of stereotypes, as we assume our limited experiences with people can be generalized to their supposed “group.” For example:

  • Mauve hasn’t had the best experience with her younger co-workers, so she claims that “all Millennials are lazy.”
  • Sean was cut off in traffic by a female, so he thinks that “all women are bad drivers.”
  • Marjorie saw a news story about an illegal immigrant who committed murder, so she says that “all illegal immigrants are dangerous.”
  • Jaimie thinks that “all Muslims are terrorists” because of the way she sees them portrayed on TV.

Hasty generalization fallacies are very common and can be quite convincing. They are supported by “evidence,” after all. (Though they’re often unreliable, anecdotes can be very powerful.) Even more, once we’ve jumped to a conclusion, confirmation bias kicks in and we “see” supporting evidence everywhere…so we become even more convinced we’re right.

How to counter : First, it’s important not to fall for this fallacy yourself. While making generalizations is unavoidable, we should remember that the resulting conclusions are tentative, and therefore avoid overconfidence and exaggerated language (e.g. all/none, always/never). In short, don’t make arguments you can’t support!

If someone else makes a hasty generalization, ask for evidence! (As always, the burden of proof is on the person making the claim.) Based on their response, consider pointing out the fallacy in their argument and that there’s insufficient evidence to justify their claim. Remember to be kind, as they honestly might not know their argument was flawed. 

Either way, consider looking up reliable data (if it’s available), and proportion your acceptance of the conclusion accordingly.

I’m entitled to my opinion

"I'm entitled to my opinion" asserts a right to a position without justification or accountability, treats opinions as fact, suggests all opinions are equal, and implies further disagreement is disrespectful. The arguer hides behind an "opinion shield": they don't have evidence, but they're not going to change their mind.

Definition and explanation : It is true that everyone is entitled to their opinion. However, “I’m entitled to my opinion,” is often used to shield an unsupported position. And being entitled to your opinion doesn’t mean you’re right, or that anyone needs to take your opinion seriously.

There are several problems with this fallacy. 

  • An opinion doesn’t mean what they think it means. Opinions are subjective judgements or viewpoints. When someone claims they are “entitled to their opinion,” it’s often because they are trying to shield factually incorrect assertions.  
  • All opinions aren’t equal. Expert opinions are interpretations of evidence based on years of study and experience, and are significantly more robust and reliable than non-expert opinions. “I’m entitled to my opinion” suggests that their opinion is just as good as anyone else’s.
  • We should all be open to changing our minds if the evidence suggests we should. “I’m entitled to my opinion” is essentially an admission that they can’t justify their position… but yet they’re unwilling to entertain contradictory evidence or other points of view. 

In the face of facts that contradict a belief, especially one central to our identity or values, we use  motivated reasoning  to reduce the  cognitive dissonance  that arises when reality and our beliefs are in conflict. When that fails, our Get Out of Jail Free Card is to defend the belief by rendering it unfalsifiable and therefore immune to evidence. We move the goalposts. Discount sources or deny evidence. Proclaim that it’s our opinion.

It’s important to remember that the purpose of arguing isn’t to be right, it’s to get it right. “I’m entitled to my opinion” shuts down any further disagreement. We simply don’t want to be wrong.

How to counter: I tell my students on the first day of class that they are not entitled to their opinions. (At least not in class!) They are only entitled to what they can support with evidence. 

It’s a bit trickier in the “real world.” Remember that people are often unfamiliar with the basics of critical thinking, so they may not be used to having to support their positions with evidence. Consider explaining why “I’m entitled to my opinion” is fallacious, and the benefits of healthy disagreement. The best approach is probably to ask them why they hold this particular opinion, and why they think they’re “entitled” to it.

To learn more: The Conversation: No, you’re not entitled to your opinion

Perfect solution

Perfect solution fallacy: Rejects any solution that's less than ideal despite potential benefits. Shown is someone struggling to write something who says: This is never going to be perfect. I should just give up! Tip: Be wary of unattainable standards and focus on progress instead.

Written by Bryan Montford; edited by Melanie Trecek-King

Definition and explanation: The Perfect Solution Fallacy assumes a perfect solution exists, and rejects any solution that’s less than ideal despite potential benefits. This fallacy relies on black-and-white thinking, oversimplifying complex situations and creating a false choice between perfection and inaction. However, it’s fallacious because solutions don’t need to be perfect to provide benefits. Not only that, perfect solutions are unrealistic, unattainable, and often unnecessary.

The perfect solution fallacy can pop up in many forms in our daily lives. For example, someone might say they don’t want a relationship unless it’s completely conflict-free. Or, they won’t take music lessons because they’ll never be as good as a professional musician. Rejecting solutions because of unrealistic expectations creates inaction and missed opportunities, and dismisses the benefits of incremental improvements.

How to counter: Inaction won’t result in change or help you achieve your goals, so focus on progress instead. Be wary of arguments that hinge on unattainable standards, as solutions don’t need to be perfect to provide benefits. We can achieve progress even if solutions aren’t perfect.

Red Herring

Red herring fallacy: attempts to distract from the main issue by bringing in irrelevant information. One politician asks, “What’s your plan to address climate change?” The other politician responds, “My tax cut boosted the economy!”

Definition and explanation : The red herring fallacy is an attempt to distract from the main issue by bringing in irrelevant information.

The name of this fallacy comes from a story in which the smelly fish were used to cover up the scent of hares to distract hunting hounds. While the story may or may not be true, it can be a helpful way to visualize how this fallacy works.

In argumentation, the red herring fallacy is used to divert attention from the main point by changing the subject. It can be difficult to recognize, as the new information might seem tangentially relevant…but in fact it’s a shift towards something that’s easier or safer for the party committing the fallacy. Basically, they lack a good response, so they confuse and distract.

How to counter : The red herring fallacy is very common, so it’s important to be able to recognize it. Pay close attention to responses to monitor if the information that’s offered is relevant to the original issue. This can be more challenging than it seems, and if you’re not careful you could end up miles away from where you intended. 

If you spot a red herring, you could try to redirect the conversation back to the topic at hand. If they continue to offer distractions, gently point out the red herring and why it’s fallacious, keeping in mind that they might not be aware of their misdirections. Finally, there might be a reason the other person wishes to avoid the topic, so you could choose to move on. There might not be a point in continuing with that particular thread or even the conversation in general.

Single cause

The single cause fallacy oversimplifies a complex issue to a single cause. Example: one person says wildfires are caused by climate change; another says they're caused by poor forest management. Complicated problems tend to have multiple causes.

Other names: Causal reductionism, complex cause 

Definition and explanation : The single cause fallacy oversimplifies a complex and nuanced issue into one simple cause. It’s human nature to prefer simple answers to complex ones. But oversimplification can lead us astray. If we are to have any chance at solving problems, we need to fully understand them.  

The single cause fallacy can take many forms, from assigning blame for a problem to giving credit for a solution. However, it is fallacious because it’s likely that many factors contributed to an outcome. 

How to counter: Phrases like “the reason x happened” or “the cause of x is” are indicators that a single cause fallacy has been committed. However, it’s possible that the single cause fallacy was unintentional, and maybe the other person doesn’t understand the complex nature of the issue. So ask yourself, could there be more to the issue than what is being presented? 

Slippery slope

Slippery slope fallacy argues an action will lead to a chain of events, resulting in undesirable and extreme consequences. Student emails professor: Dear Professor, If I fail this class my life will spiral out of control. I'll end up homeless and have to beg for change. It'll ruin my chances of going to medical school. So if you fail me I'll have to live in a tent and people will die and it will all be your fault.

Definition and explanation : The slippery slope fallacy argues that an action will lead to a chain of events, resulting in undesirable and often extreme consequences. 

Slippery slope arguments are often used to argue against an issue at hand by diverting attention towards negative and unlikely outcomes. In addition, by presenting the horrible sequence of events as inevitable, the arguer uses appeals to fear or anger to persuade (or manipulate, if it’s done intentionally) without evidence. However, the slippery slope is fallacious because moderate positions don’t necessarily lead to extreme outcomes.

Slippery slope arguments are common in political discourse. They can be quite persuasive, as when our emotions are activated we’re less able to think critically. The good news is that slippery slope arguments are easy to identify and avoid once you learn how they work. 

How to counter : Critical thinking can keep us from going down a slippery slope. To identify a slippery slope fallacy, ask yourself how likely it is that the action at hand will lead to the outcome(s) presented. (Keeping tabs on your emotions is also a useful strategy.) 

Once identified, consider pointing out the fallacy to your opponent, and ask them to justify their conclusion by providing evidence. The burden of proof is on the person making the claim, after all.

Straw man

The straw man fallacy misrepresents someone's argument to make it easier to dismiss. Example: one person says evolutionary theory states that all living things have a common ancestor, and the other responds, "So you're saying my grandpa was a monkey?"

Definition and explanation : A straw man argument misrepresents an opponent’s argument to make it easier to dismiss. The straw man can take many forms, but it often involves distorting, exaggerating, oversimplifying, or taking parts out of context, and  can be quite effective if you’re not paying attention. By building a straw version of the original argument they can more easily knock it down and claim victory. However, it is fallacious because they are attacking a position their opponent doesn’t hold. 

Anti-science arguments often employ straw man fallacies, as it’s much easier to “defeat” strongly supported scientific conclusions when those conclusions are misrepresented.

How to counter: First, consider the possibility that the straw man argument was unintentional, and that the other person doesn’t understand the original argument. Sometimes, however, the straw man is a purposeful attempt to persuade others through deception. Either way, point out the use of the straw man by explaining how their version differs from your original position.

Resources to Learn More

  • Logically Fallacious
  • Your Logical Fallacy Is
  • Effectiviology: Logical Fallacies: What they are and how to counter them
  • Campus Explorer: 20 Types of Logical Fallacies and Examples

44 thoughts on “Guide to the Most Common Logical Fallacies”

I love this kind of analytical explanation and I’m looking forward to more.

Thanks for the compliment, and you’re welcome!

Simplified and very useful article.

Thanks for the kind words. That’s exactly what I was going for. 🙂

Thank you…

I think it would be good to list potential exceptions to these fallacies. Here are a few examples:

Slippery slope – I think there are plenty of occasions where people are quite right to be concerned about one thing leading to another, and there have been examples of it, such as dictators rising to power

Anecdotal – if you’re simply trying to prove that something exists or is possible, rather than common or likely, then your own experience is sufficient proof of that

Appeal to tradition – if arguing against the idea of change for the sake of change, rather than change per se, then I don’t think there’s a great deal wrong with it. Essentially it’s the ‘If it ain’t broke, don’t fix it’ approach

You’re not wrong: Exceptions would be good additions. In my attempt at keeping things brief I opted to (mostly) leave them out.

Thanks Melanie

The person that wants you to include “exceptions” may have missed out on understanding fallacies of logic. To say, for example, that you are concerned that a bad person will rise to power, or that you personally experienced something are not exceptions AND they are not acts of fallacious thinking.

I think he may be more concerned with these exceptions being labeled as logical fallacies when they might not be.

I can never remember the correct term for the tactic being used when an agitated conversation goes bad. This helps! Thanks!

You’re welcome! 🙂 Melanie

While I don’t necessarily disagree with any of them, the bias in these graphics speaks loudly.

Could you explain more? I try to point out fallacies, especially those used to deny science (or reality) or justify pseudoscience, wherever I see them.

Excellent stuff!

This may be another fallacy type: Claiming, “I can always tell when someone is lying!” It correlates with ‘clairvoyance’ type fallacies, though not sure what category.

I would love to use an interactive quiz-type method to practice and hone my critical thinking skills and speed up my fallacy-type identification process! Any recommendations?

Yes! The game Cranky Uncle is a wonderful way to learn and practice recognizing fallacies. It’s specifically focused on the techniques of science denial, but fallacies are fallacies!

It’s also free 🙂 Just go to http://www.CrankyUncle.com and have fun!

Thanks for the comment Melanie

Great post, thanks! I especially like the modern, relevant examples in the cartoons. I have been a practicing scientist for 25 years and I learned things here. Once you recognize these fallacies you see them everywhere (confirmation bias? kidding…). The more I think about it, logic and critical thinking should be incorporated in school curricula starting in early grade school. I very much doubt we’d be in the awful position we find ourselves in now as a society if critical thinking was a 4th pillar of education (in addition to the 3 Rs).

Thanks for the kind words. And I couldn’t agree with you more about teaching critical thinking.

Best Melanie

Why do you need critical thinking when all you have to do is look up what the “experts” tell us to think?

An important component of critical thinking is intellectual humility. Since none of us can possibly know everything, knowing where to find reliable knowledge is essential. Experts, by definition, have the skills and knowledge to understand their area of expertise. If experts all generally agree — ie they have reached a consensus — their conclusion is the most reliable we have at the moment. Not trusting experts, or thinking we can do better than an expert consensus, is a great way to be misled.

Thanks for your comment. Melanie

I love this article so much! One of the best I’ve seen so far. Great and fun graphics. Clear and simplified explanations. Just bookmarked this page! :))))

Thanks so much for the kind words! Do come back to the site to check for new material. 🙂 Melanie

This is nicely done. Sharing it with my kids and grandkids. Thank you!

Thank you for the kind words! Melanie

This would be a lot better if the examples chosen didn’t betray a political bias to one side.

I don’t disagree with you, I could stand to have more examples from the ideological left. I’m open to suggestions.

Thanks for the comment. Melanie

Is there a place to find actual examples in media, YouTube channels, or podcasts?

Hi! Thanks for the comment. I’m not sure I’m fully understanding your question, but logical fallacies are ubiquitous. Once you can see them, you can’t “unsee” them.

Brilliant work!

Thanks! 🙂 Melanie

Whoa, great article! Very in depth and I love that responding with kindness is always the first step

Thanks for the kind words! 🙂 Melanie

I agree with Joan, that’s not how fallacies work lol


Pursuing Truth: A Guide to Critical Thinking

Chapter 9 Informal Fallacies

A fallacy is a mistake in reasoning. A formal fallacy is a fallacy that can be identified merely by examining the argument’s form or using a tool like a truth table. An informal fallacy cannot be detected from the argument’s form. There are no foolproof tools for detecting informal fallacies. Unlike validity, these fallacies can occur in degrees. Sometimes, it is clear that a fallacy has been committed, at other times, there can be legitimate questions about whether a fallacy has been committed.
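As a small, self-contained illustration of catching a formal fallacy "merely by examining the argument's form," here is a Python sketch (my own example, not from the text) that builds a truth table for affirming the consequent: "If P then Q; Q; therefore P." The form is invalid because there is a row where both premises are true and the conclusion is false.

```python
from itertools import product

def implies(p, q):
    # Material conditional: "if p then q" is false only when p is true and q is false.
    return (not p) or q

# Affirming the consequent: premises "P -> Q" and "Q"; conclusion "P".
counterexamples = []
for p, q in product([True, False], repeat=2):
    premises_true = implies(p, q) and q
    if premises_true and not p:
        counterexamples.append({"P": p, "Q": q})

print(counterexamples)
# [{'P': False, 'Q': True}] -- a row with true premises and a false
# conclusion, so the argument form is invalid: a formal fallacy.
```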

9.1 Emotions and Critical Thinking

There are two uses of language that play an important role in critical thinking. The first is to convey information, or what is sometimes called cognitive content. The premises and the conclusion of an argument all have cognitive content. It is on the basis of the information they convey that we can evaluate them as true or false. Another use of language, however, is to express emotion, which often tends to evoke similar feelings in the audience. The emotion that is expressed by the language is its emotive content.

Good critical thinkers are persuaded by relevant cognitive content, and are not unduly persuaded by irrelevant emotive content.

9.2 Slanters

Slanters are words or phrases that are used to manipulate through emotive language. Slanters can have both positive and negative connotations. Slanters are types of non-argumentative persuasion. Sometimes, people will use them innocently, maybe because they have passionate feelings about the subject being discussed. Other times, though, they are used because the speaker knows that he does not have a good argument for his position. They are ways for people to affect the beliefs of others without offering reasons for their positions.

9.2.1 Euphemisms and Dysphemisms

Different words and phrases can have the same cognitive content but differ in their emotive content. A euphemism is a positive synonym for some neutral term. A dysphemism is a negative synonym. For example, think of the words that we use to talk about the death of a pet:

  • Neutral: euthanize
  • Euphemism: put to sleep
  • Dysphemism: ?

Euphemisms are common in the military and in advertising. Here are some examples:

  • Enhanced interrogation methods
  • Collateral damage
  • Depopulated area
  • Pre-owned, Pre-loved
  • Genuine imitation leather

In other cases, a person may use a euphemism when they have been caught doing something wrong, but want to minimize the wrongness of the action. I heard a politician once, who was caught telling a falsehood, admit to having “committed terminological inexactitude.”

Dysphemisms convey a negative attitude towards something. Examples of dysphemisms include:

  • Cancer stick
  • Dead tree edition

9.2.2 Innuendoes

Innuendoes imply something by what is not said. A common scene in crime films has a gangster go into a store and say something like “Nice store you got. It would be a shame if anything happened to it.” Notice that he didn’t actually say he would damage the store, although he certainly implied it.

Another use of an innuendo is to condemn with faint praise. Imagine receiving a reference letter for a student applying to graduate school. The letter only says that the student was never late to class. The implication is that her punctuality is the best that can be said of her.

Finally, there is the apophasis, which is mentioning something by saying that it won’t be mentioned. For example, “I’m not going to talk about your failure to turn assignments in on time.”

9.2.3 Weaselers

A weaseler is a way of qualifying a claim in order to avoid criticism. Weaselers include words and phrases like “perhaps,” “there’s a good chance that,” and “it’s possible that.” A weaseler that is common in advertising is “up to” — “By using our diet plan, you can lose up to ten pounds in a month.” Notice that “up to ten pounds” means “no more than ten pounds.” So, if a customer loses no weight at all, then there is no grounds for a lawsuit.

9.2.4 Downplayers

A downplayer is a way of making something seem less important than it is. The most common downplayers are “mere, merely, and so-called.” A person might say, “That’s merely your opinion” to avoid having to respond with facts. A politician might talk about his opponent’s “so-called” plan to cut spending, implying that it isn’t much of a plan at all. Improper use of quotation marks can also serve as a downplayer.

9.2.5 Proof surrogate

A proof surrogate offers no real support, but just claims that support exists. Examples are using “studies show” without saying what those studies are and where they can be found. Another proof surrogate is just to say that “It’s obvious that….” Doing so implies that proof is simply not needed.

9.2.6 Hyperbole

Hyperbole is an inappropriate or extreme exaggeration. “Taking critical thinking is the most exciting thing you’ll do in your whole life!” Since it is an extreme exaggeration, no one will be fooled into believing it, so what’s the danger? The danger of hyperbole is that once the exaggeration is made, the listener is then prepared to accept a weaker version of the statement. The weaker version, compared to the extreme exaggeration, sounds more believable.

9.3 Fallacies of Ambiguity and Vagueness

An ambiguous word or phrase is one that has more than one meaning. “Bank” is an ambiguous term; it can refer to a financial institution, a riverbank, a kind of basketball or pool shot, etc. A vague term is one that does not have a precise meaning. That is, there will be cases where a vague term clearly applies, cases in which it clearly does not apply, and cases in the middle where it’s just not clear whether it applies. Terms like “rich” and “heap” are vague terms.

9.3.1 Equivocation

The fallacy of equivocation is committed by using the same term in two different senses in the same argument. Here is my favorite example:

  • God is love.
  • Love is blind
  • Ray Charles is blind.
  • Ray Charles is God.

There are several things wrong with this argument; one of them is equivocating on “blind.” To say that love is blind is to say that people overlook the faults of those they love. To say that Ray Charles is blind is to say that he cannot physically see anything, not that he just overlooks things.

9.3.2 Amphiboly

Amphibolies rely on syntactic ambiguities. Those are ambiguities that result from the arrangement of the words. Church bulletin bloopers are good places to find amphibolies: “The Rev. Adams spoke briefly, much to the delight of his audience.”

9.3.3 Accent

The fallacy of accent is an equivocation resulting from accenting different words in a sentence. Think about the different meanings that are implied from accenting different words in this sentence: “I didn’t take the exam yesterday.”

9.3.4 Division and Composition

The last two fallacies of ambiguity are division and composition. The fallacy of division improperly attributes a property of the whole to its parts. The fallacy of composition improperly attributes a property of the parts to the whole. Here is an example of division: “That wall weighs more than 500 pounds, so each brick in it weighs more than 500 pounds.” A similar example of composition is “Each brick in that wall weighs less than a pound, so the entire wall weighs less than a pound.”

Some properties, however, can be attributed from the parts to the whole or the whole to the parts. For example, “Each link in that chain is solid gold, so the whole chain is solid gold.”

9.3.5 Line-Drawing Fallacy

The line-drawing fallacy is a fallacy of vagueness, having the form “Since there is no precise line that can be drawn between A and not-A, there is no real difference between A and not-A.” Example: “Since no one can say where the line should be drawn between legitimate uses of force and excessive uses of force, then no one can honestly claim that any use of force is excessive.”

9.4 Fallacies of Relevance

A good critical thinker will offer arguments that have premises that are logically relevant to their conclusions. A fallacy of relevance is committed when the premises of the argument are not logically relevant to the truth of the conclusion. They may, however, be psychologically relevant, so that we can be deceived into thinking that the argument is valid when in fact it is not.

9.4.1 Ad Hominem

The Ad Hominem fallacy is committed by attacking the person giving the argument, rather than responding to the argument itself. There are four common types:

9.4.1.1 Personal Attack

This is also known as an ad hominem abusive. This is committed when one verbally attacks the person giving the argument instead of responding to the argument itself. For example:

Jack: “There are so many unexplored planets out there. Surely, there must be life somewhere out there.”

Jill: “You can’t be right, you’re just a loser who watches too much television.”

9.4.1.2 Circumstantial Ad Hominem

A person commits the circumstantial ad hominem by referring to circumstances that discredit the arguer. This is often a charge of bias or vested interest. A person has a vested interest when they stand to gain, financially or otherwise, by something. For example, a doctor has a vested interest in a pharmaceutical study when she owns stock in the company that produces the drug. Bias or vested interest is a good reason to examine an argument carefully, but not a good reason to simply dismiss it.

Jill: John has made an excellent case for increasing the budget of the church’s youth program.

Jack: Of course he would say that — he’s the youth minister! You can just forget everything he said.

9.4.1.3 Tu Quoque

This is also known as a pseudorefutation. It accuses the arguer of hypocrisy. Example: “Don’t tell me I shouldn’t start smoking. I know how many packs a day you inhale!”

A person’s behavior may very well be inconsistent with their argument, but that doesn’t mean that the argument is bad.

9.4.1.4 Poisoning the Well

Poisoning the well is an ad hominem committed before the arguer has spoken. The goal is to provide harmful information about the speaker to preemptively discredit anything that the speaker might say. Any of the previous examples can be turned into examples of poisoning the well. For example: “John is about to make his case for increasing the church’s youth budget. Don’t pay any attention to him — he’s the youth minister, what else would he say?”

9.4.2 Appeal to Force

The appeal to force, also called scare tactics, is a threat, either explicit or implicit. For example, imagine a student saying to a professor, “I deserve an A because my father is a major donor to this university and a very good friend of the dean.” The appeal to force tries to instill fear in the listener, and to be fallacious, the fear must be irrelevant to the truth of the claim. These are common in both advertising and politics. An example of an appeal to force in advertising would be an ad for a Medicare supplement policy with an elderly woman weeping in front of a pile of unpaid bills. The advertisement works by making the viewer afraid of ending up like the person in the commercial.

9.4.3 Appeal to Pity

The appeal to pity is like the appeal to force, except that the goal is to evoke pity, not fear. For example: “I deserve an A because my mother is very ill, and I had to spend most of my time caring for her this semester.”

9.4.4 Popular Appeal

This is also called appeal to the people. Here, the goal is to use the desire to be loved, admired, accepted, etc. to get others to accept the conclusion. Two important types are the bandwagon fallacy and the appeal to vanity.

9.4.4.1 Bandwagon

The bandwagon fallacy tells the listener that since everyone does, or believes, something, then they should too.

Example: “Everyone supports Smith for president. You need to get with the program and support him too!”

9.4.4.2 Appeal to Vanity

This is a claim that you will be admired if you do this. Unlike bandwagon, which claims that everyone does this, the appeal to vanity is usually about something that not everyone can do or have.

Example: “Wear a Rolex — that way everyone will know that you’re not just anybody.”

9.4.5 Appeal to Ignorance

The appeal to ignorance argues that a claim must be true because it has not been proven false, or that it must be false because it has not been proven true. Whether such an appeal is fallacious often depends on who has the burden of proof.

9.4.5.1 Burden of Proof

On most issues, one side will have the burden of proof. That means that if that side fails to make its case, then the other side wins by default. There are two standard rules for determining burden of proof:

  • Especially for existence claims, the side making the positive case has the burden of proof.
  • The side making the more implausible claim has the burden of proof.

Sometimes, these conditions can conflict. Here is an example:

Jill: Surely, there are species of insects that we have not yet discovered.

Jack: I don’t think that’s true.

Who has the burden of proof? Jill is making a positive existence claim, but it is one that is very plausible. That makes Jack’s claim very implausible. In this case, I’d say that Jack has the burden of proof.

9.4.5.2 The Law

One area where these rules do not apply is the American legal system. There, the prosecution always has the burden of proof. That is, if the prosecution fails to make its case against the defendant, then the defense wins.

Burden of proof should not be confused with standard of proof. Burden of proof is concerned with who needs to make their case. Standard of proof is concerned with how strong a case needs to be made. There are four different levels of standard of proof in the law:

  • Beyond a reasonable doubt
  • Clear and convincing evidence
  • Preponderance of evidence
  • Probable cause

Criminal cases use the highest standard of proof, which is “beyond a reasonable doubt.” This is a high degree of probability. It does not mean that no doubt at all is possible, but that any doubt, given the evidence, would be unreasonable. The next two levels are used in civil cases. Most civil cases are tried at the “preponderance of evidence” level. This means that, given the evidence, it is more likely that the defendant is liable than not. Clear and convincing evidence is a standard of proof between preponderance of evidence and beyond a reasonable doubt. It is used in civil cases that involve the potential loss of important rights or interests, such as the termination of parental rights. The lowest standard of proof is probable cause. This is used to determine if a search or arrest is warranted, and is also used by grand juries to issue indictments.

9.4.6 Straw Man

The straw man fallacy distorts a position so that it can be easily attacked. It does not address the actual view held by the opponent, but responds to a weaker version. It is often committed by making the conclusion of an argument more extreme than it actually is, since extreme views are often easy to attack.

Example: “Senator Snodgrass has argued that there be a mandatory waiting period before any handgun purchase. Obviously, the senator wants to make all firearm ownership illegal.”

Here are some subtle ways of committing the fallacy:

  • Taking words out of context.
  • Treating extreme views as representative.
  • Criticizing early versions of a position.
  • Criticizing deliberately simplified versions of a position.

9.4.7 Red Herring

The goal of the red herring fallacy is to lead the opponent off the track, by subtly changing the issue being discussed. The arguer changes the subject to a different but related one. To determine if something is a red herring, ask yourself if the issue at the beginning of the argument is the same as the issue at the end. Here is an example:

The American Cancer Society has argued that smoking is bad for your health. Many people in the Southeastern United States are dependent upon the tobacco industry for their jobs. Making smoking illegal would have a devastating economic effect on many states. Therefore, the ACS is simply wrong.

Notice that the original issue is whether smoking has bad health consequences. By the end of the paragraph, the issue has been changed to the economic impact of making smoking illegal.

9.4.8 Horse Laugh

This occurs when someone simply ridicules the position held, and offers no real response to the argument at all. For example: “Mr. Jones has argued that watching television is emotionally unhealthy. If you believe that, then I’ve got a great deal on some swampland for you.”

9.5 Fallacies of Unwarranted Assumptions

9.5.1 Begging the Question

An argument begs the question if it is impossible to believe at least one of the premises unless one already believes the conclusion. Note that if this is the case, then the premises cannot serve as reasons to believe the conclusion, since believing the premises requires already believing the conclusion. There are three common types of arguments that beg the question.

The first is a circular argument. That occurs when one explicitly uses the conclusion as support for one of the premises. Here is an example:

  • The Bible says that God exists.
  • The Bible is the inspired word of God.
  • God exists.

The Bible can’t be the inspired word of God unless God exists, so the argument begs the question.

Another type of argument that begs the question is one that simply rephrases the conclusion and uses it as a premise. Example: “If such actions were not illegal, then they would not be prohibited by the law.” In this case, the conclusion is synonymous with one of the premises.

The last type is one that generalizes the conclusion and uses the generalized rule as a premise. Example: “Spanking children is wrong because corporal punishment is wrong.”

9.5.2 Appeal to Authority

The fallacy of appeal to authority is committed by using a pseudo-authority to support a claim. Note that it is not committed by merely appealing to an authority, but by appealing to an unqualified authority. Always ask, “Should this person know more about this subject than the average person?”

9.5.3 Loaded Question

A loaded question suggests something with the question. “When will you stop cheating on exams?” is a loaded question; it implies that the person is cheating. Notice that there is no way to directly answer the question without admitting to cheating on exams.

9.5.4 False Dilemma

This is sometimes called the either-or fallacy. It happens when a person asserts a disjunction, a sentence of the form “either A or B,” when there is at least one more option that is true. Disjunctions are true whenever at least one of the disjuncts, the sentences joined by the ‘or,’ is true. A false dilemma asserts that one of the two sentences must be true when there is really a third alternative. Here are some examples:

“Either buy our personal financial guide or never have control of your finances.”

Child to parent: “Either let me go to the party or I’ll just die.”

False dilemmas are often expressed in pithy slogans on bumper stickers: “It’s my way or the highway” or “America, love it or leave it.”

Consider this example: “My opponent voted against the public schools spending bill. He must think educating our children is not important.” The claim is that either one votes for the bill or one believes that education is not important. This is a false dilemma since there may be many other reasons to vote against a particular bill.

It’s important to remember that a disjunction can be expressed as a conditional: “Either let me go to that party or I’ll die” is equivalent to “If you don’t let me go to that party, then I’ll die.” In general, P or Q is equivalent to if not-P then Q.
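
If you want to check that equivalence for yourself, here is a minimal sketch in Python that brute-forces the truth table for “P or Q” and for “if not-P then Q” (read as a material conditional) and confirms that the two always agree:

    from itertools import product

    # A material conditional "if A then B" is false only when A is true and B is false,
    # so it can be written as "(not A) or B".
    for P, Q in product([True, False], repeat=2):
        disjunction = P or Q                 # "P or Q"
        conditional = (not (not P)) or Q     # "if not-P then Q"
        print(P, Q, disjunction, conditional)
        assert disjunction == conditional    # the two columns match in every row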

9.5.5 Slippery Slope

Slippery slopes rest a conclusion on a chain reaction that is not likely to occur. They generally have this form:

  • If A, then B.
  • If B, then C.
  • …
  • If Y, then Z.
  • Therefore, if A, then Z.

In order for this to be a fallacy, at least one of the conditional statements in the premises must not be likely to be true. Here’s an example: “If I fail this test, then I will fail the course. If I fail the course, then I’ll be expelled from school. If I’m expelled from school, then I’ll never be able to have a good job. If I can’t get a good job, then I can’t support a family…”

To test for a slippery slope, just ask, are there any weak links in this chain of conditionals? Is it really the case that one failed exam will result in an F for a course grade?
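
One informal way to see why a long chain of conditionals is fragile is to assign each link some chance of holding and multiply the chances together (assuming, for simplicity, that the links are independent). The numbers in this Python sketch are made up purely for illustration, not taken from the example above:

    # Hypothetical probabilities that each link in a four-step chain actually holds.
    link_probabilities = [0.30, 0.20, 0.10, 0.25]

    chance_whole_chain_holds = 1.0
    for p in link_probabilities:
        chance_whole_chain_holds *= p

    print(round(chance_whole_chain_holds, 4))   # 0.0015, i.e., about a 0.15% chance

Even if each individual link sounds somewhat plausible, the conclusion at the end of the chain rests on all of the links holding at once, which is far less likely.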


Logical Fallacies | Definition, Types, List & Examples

Published on 20 April 2023 by Kassiani Nikolopoulou. Revised on 9 October 2023.

A logical fallacy is an argument that may sound convincing or true but is actually flawed. Logical fallacies are leaps of logic that lead us to an unsupported conclusion. People may commit a logical fallacy unintentionally, due to poor reasoning, or intentionally, in order to manipulate others.

Because logical fallacies can be deceptive, it is important to be able to spot them in your own argumentation and that of others.




A logical fallacy is an error in reasoning that occurs when invalid arguments or irrelevant points are introduced without any evidence to support them. People often resort to logical fallacies when their goal is to persuade others. Because fallacies appear to be correct even though they are not, people can be tricked into accepting them.

The majority of logical fallacies involve arguments—in other words, one or more statements (called the premises) and a conclusion. The premises are offered in support of the claim being made, which is the conclusion.

There are two types of mistakes that can occur in arguments:

  • A factual error in the premises. Here, the mistake is not one of logic. A premise can be proven or disproven with facts. For example, if you counted 13 people in the room when there were 14, then you made a factual mistake.
  • The premises fail to logically support the conclusion. A logical fallacy is usually a mistake of this type. For example, students arguing that English 101 is a useless course might never prove that it is useless—they merely “beg the question” and move on to the next part of their argument, skipping the most important part.

In other words, a logical fallacy violates the principles of critical thinking because the premises do not sufficiently support the conclusion, while a factual error involves being wrong about the facts.

There are several ways to label and classify fallacies, such as according to the psychological reasons that lead people to use them or according to similarity in their form. Broadly speaking, there are two main types of logical fallacy, depending on what kind of reasoning error the argument contains:

  • Informal logical fallacies
  • Formal logical fallacies

An informal logical fallacy occurs when there is an error in the content of an argument (i.e., it is based on irrelevant or false premises).

Informal fallacies can be further subdivided into groups according to similarity, such as relevance (informal fallacies that raise an irrelevant point) or ambiguity (informal fallacies that use ambiguous words or phrases, the meanings of which change in the course of discussion).

“Some philosophers argue that all acts are selfish. Even if you strive to serve others, you are still acting selfishly because your act is just to satisfy your desire to serve others.”

A formal logical fallacy occurs when there is an error in the logical structure of an argument.

Premise 1: Peter Parker is Spider-Man.

Premise 2: The citizens of New York know that Spider-Man saved their city.

Conclusion: The citizens of New York know that Peter Parker saved their city.  

This argument is invalid, because even though Spider-Man is in fact Peter Parker, the citizens of New York don’t necessarily know Spider-Man’s true identity and therefore don’t necessarily know that Peter Parker saved their city.

A logical fallacy may arise in any form of communication, ranging from debates to writing, but it may also crop up in our own internal reasoning. Here are some examples of common fallacies that you may encounter in the media, in essays, and in everyday discussions.


Red herring logical fallacy

The red herring fallacy is the deliberate attempt to mislead and distract an audience by bringing up an unrelated issue to falsely oppose the issue at hand. Essentially, it is an attempt to change the subject and divert attention elsewhere.

Bandwagon logical fallacy

The bandwagon logical fallacy (or ad populum fallacy) occurs when we base the validity of our argument on how many people believe or do the same thing as we do. In other words, we claim that something must be true simply because it is popular.

This fallacy can easily go unnoticed in everyday conversations because the argument may sound reasonable at first. However, it doesn’t factor in whether or not “everyone” who claims x is in fact qualified to do so.

Straw man logical fallacy

The straw man logical fallacy is the distortion of an opponent’s argument to make it easier to refute. By exaggerating or simplifying someone’s position, one can easily attack a weak version of it and ignore their real argument.

Person 2: “So you are fine with children taking ecstasy and LSD?”

Slippery slope logical fallacy

The slippery slope logical fallacy occurs when someone asserts that a relatively small step or initial action will lead to a chain of events resulting in a drastic change or undesirable outcome. However, no evidence is offered to prove that this chain reaction will indeed happen.

Hasty generalisation logical fallacy

The hasty generalisation fallacy (or jumping to conclusions) occurs when we use a small sample or exceptional cases to draw a conclusion or generalise a rule.

False dilemma logical fallacy

A false dilemma (or either/or fallacy) is a common persuasion technique in advertising. It presents us with only two possible options without considering the broad range of possible alternatives.

An example would be a campaign that suggests animal testing and child mortality are the only two options available: one has to save either animal lives or children’s lives.

People often confuse correlation (i.e., the fact that two things happen one after the other or at the same time) with causation (the fact that one thing causes the other to happen).

For example, a study might find that people with multiple sclerosis (MS) have lower vitamin D levels and conclude that low vitamin D causes MS. It’s possible, however, that people with MS have lower vitamin D levels because of their decreased mobility and sun exposure, rather than the other way around.

It’s important to carefully account for other factors that may be involved in any observed relationship. The fact that two events or variables are associated in some way does not necessarily imply that there is a cause-and-effect relationship between them and cannot tell us the direction of any cause-and-effect relationship that does exist.


An ad hominem (Latin for “to the person”) is a type of informal logical fallacy. Instead of arguing against a person’s position, an ad hominem argument attacks the person’s character or actions in an effort to discredit them.

This rhetorical strategy is fallacious because a person’s character, motive, education, or other personal trait is logically irrelevant to whether their argument is true or false.

Name-calling is common in the ad hominem fallacy (e.g., “environmental activists are ineffective because they’re all lazy tree-huggers”).

An appeal to ignorance (ignorance here meaning lack of evidence) is a type of informal logical fallacy.

It asserts that something must be true because it hasn’t been proven false—or that something must be false because it has not yet been proven true.

For example, “unicorns exist because there is no evidence that they don’t.” The appeal to ignorance is also called the burden of proof fallacy.

People sometimes confuse cognitive bias and logical fallacies because they both relate to flawed thinking. However, they are not the same:

  • Cognitive bias is the tendency to make decisions or take action in an illogical way because of our values, memory, socialization, and other personal attributes. In other words, it refers to a fixed pattern of thinking rooted in the way our brain works.
  • Logical fallacies relate to how we make claims and construct our arguments in the moment. They are statements that sound convincing at first but can be disproven through logical reasoning.

In other words, cognitive bias refers to an ongoing predisposition, while logical fallacy refers to mistakes of reasoning that occur in the moment.



6: Persuasive Appeals



Rhetorical Analysis and the Rhetorical Triangle

A rhetorical analysis refers to the process of analyzing a text, source, or artifact. It may be in written form (like your sources) or some other form of communication like a speech. The goal of a rhetorical analysis is to take into consideration the purpose, audience, and the occasion (or the context in which the text will be written or read).

The term rhetoric is often connected to argument or criticism, and often carries a negative connotation (e.g., "The political rhetoric is so divisive."). But rhetoric is really a neutral term. It’s the effort to use rhetorical appeals to influence an audience and achieve a certain set of purposes and outcomes.

The principles Aristotle laid out in his Rhetoric nearly 2,500 years ago still form the foundation of much of our contemporary practice of argument. Aristotle argued that rhetoric was present in every situation. Have you ever heard the phrase, "Everything's an argument"? This refers to Aristotle's definition of rhetoric as "the faculty of observing in any given case the available means of persuasion" (Aristotle). The rhetorical situation Aristotle argued was present in any piece of communication is often illustrated with a triangle to suggest the interdependent relationships among its three elements:

  • Ethos or the voice of the speaker or writer and how they establish their credibility
  • Pathos or the audience and the emotional connection that's established with the intended listeners or readers
  • Logos or the message and how logically it is conveyed


If each corner of the triangle is represented by one of the three elements of the rhetorical situation, then each side of the triangle depicts a particular relationship between two elements:

  • Tone - The connection established between the speaker and the audience.
  • Attitude - The orientation of the speaker toward the message; the purpose of the message
  • Reception - The manner in which the audience receives the message conveyed

Rhetorical Appeals

In this section, we’ll focus on how the rhetorical triangle can be used in service of argumentation, especially through the balanced use of ethical, logical, and emotional appeals: ethos, logos, and pathos, respectively. In the rhetorical triangle, you’ll note that each appeal has been placed next to the corner of the triangle with which it is most closely associated:

  • Ethos - Appeals to the audience's ethics. Ethos relies on the credibility, reputation, and trustworthiness of the speaker or writer (most closely associated with the voice).
  • Logos - Appeals to the audience's logic. Logos relies on reason, logic, and facts in the argument (most closely associated with the message).
  • Pathos - Appeals to emotion. Pathos relies on stirring the emotions (sympathy, anger, pride, etc.) of the listeners or readers (most closely associated with the audience).

Each of these appeals relies on a certain type of evidence: logical, emotional, or ethical. Based on your audience and purpose, you have to decide what combination of techniques will work best as you present your case. Some of the best arguments use a combination of all three.

The Appeal to Ethics (Ethos)

Ethos is an appeal to the audience's ethics and relies on the credibility of the author. When reading a source, you should ask yourself, "Why should I trust you?" In other words, "How has the author established their credibility?" The same holds true when you are writing. You want to establish reasons for your audience to trust you.

For example, a college professor who places a college logo on a website gains some immediate credibility from being associated with the college. An advertisement for tennis shoes using a well-known athlete gains some credibility. You might create an ethical appeal in an essay on solving a campus problem by noting that you are serving in student government.

Three of the best ways to demonstrate ethos are:

  • By personal experience: Although your lived experience might not set hard-and-fast rules about the world, it is worth noting that you may be an expert on certain facets of your life. For instance, a student who has played rugby for fifteen years of their life is in many ways an authority on the sport.
  • By education or other certifications: Professional achievements demonstrate ethos by revealing status in a certain field or discipline.
  • By citing other experts: The common expression is “Stand on the shoulders of giants.” This is actually the tag line of Google Scholar. You can develop ethos by pointing to other people with authority and saying, “Look, this smart/experienced/qualified/important person agrees with me.”

As a student, you might not have enough experience to persuade your audience that you're an expert. In these cases, you can establish your credibility by using reliable sources and acknowledging those who've helped you learn more about the topic (citing your sources). The appeal to ethics can add an important component to your argument, but keep in mind that ethos is only as strong as the credibility of the association being made. In other words, if you're not citing very good sources, then you're harming your ethical appeal to the audience.

The Appeal to Logic (Logos)

Logos refers to an appeal to the audience's logical reasoning. Logos will often employ statistics, data, or other quantitative facts to demonstrate the validity of an argument.  For example, in an essay proposing that participating in high school athletics helps students develop into more successful students, you could show graphs comparing the grades of athletes and non-athletes, as well as high school graduation rates and post–high school education enrollment. These statistics would support your points in a logical way and would probably work well with a school board that is considering cutting a sports program. 

Keep in mind that stating a fact or a statistic does not alone constitute logos. You need to make sure your interpretation of the logic is sound. If it's not, it's described as a logical fallacy. We'll explore this further in the next page of reading.

The Appeal to Emotions (Pathos)

The goal of an emotional appeal is to garner sympathy, develop anger, instill pride, inspire happiness, or trigger other emotions. When authors choose this method, their goal is for the audience to react emotionally regardless of what they might think logically.

There are two especially effective techniques for cultivating pathos:

  • Make the audience aware of the issue’s relevance to them specifically. “How would you feel if this happened to you? What are we to do about this issue?”
  • Use specific, vivid details rather than abstract statistics. Compare “About 1.5 million pets are euthanized each year” with “Scooter, an energetic and loving former service dog with curly brown hair like a Brillo pad, was put down yesterday.” Both are impactful, but the latter is more memorable and more specific.

Unfortunately, emotional appeals are also often used unethically to sway opinions without solid reasoning.

Whether your argument relies primarily on ethos, logos, pathos, or a combination of these appeals, plan to make your case with your entire arsenal of facts, statistics, examples, anecdotes, illustrations, figurative language, quotations, expert opinions, discountable opposing views, and common ground with the audience. Carefully choosing these supporting details will control the tone of your writing as well as the success of your argument.

How to Use Rhetoric to Get What You Want

This video by Camille Langston from TED-Ed gives you another quick overview of rhetorical appeals.


Inductive and Deductive Reasoning

There are two basic approaches to establishing that something is true. These are called inductive and deductive reasoning.

Inductive Reasoning

Inductive reasoning presents facts and then wraps them up with a general conclusion. For instance, you visit your local grocery store daily to pick up necessary items. You notice that on Friday, two weeks ago, all the clerks in the store were wearing football jerseys. Again, last Friday, the clerks wore their football jerseys. Today, also a Friday, they’re wearing them again. From just these observations, you can conclude that on all Fridays, these supermarket employees will wear football jerseys to support their local team. This type of pattern recognition, leading to a conclusion, is known as inductive reasoning.

The Power of Inductive Reasoning

You have been employing inductive reasoning for a very long time. Inductive reasoning is based on your ability to recognize meaningful patterns and connections. By taking into account both examples and your understanding of how the world works, induction allows you to conclude that something is likely to be true. By using induction, you move from specific data to a generalization that tries to capture what the data “mean.”

Imagine that you ate a dish of strawberries and soon afterward your lips swelled. Now imagine that a few weeks later you ate strawberries and soon afterwards your lips again became swollen. The following month, you ate yet another dish of strawberries, and you had the same reaction as formerly. You are aware that swollen lips can be a sign of an allergy to strawberries. Using induction, you conclude that, more likely than not, you are allergic to strawberries.

  • Data: After I ate strawberries, my lips swelled (1st time).
  • Data: After I ate strawberries, my lips swelled (2nd time).
  • Data: After I ate strawberries, my lips swelled (3rd time).
  • Additional Information: Swollen lips after eating strawberries may be a sign of an allergy.
  • Conclusion: Likely I am allergic to strawberries.

The results of inductive thinking can be skewed if relevant data are overlooked. In the previous example, inductive reasoning was used to conclude that you are likely allergic to strawberries after suffering multiple instances of lips swelling after eating them. Would you be as confident in your conclusion if you were eating strawberry shortcake on each of those occasions? Is it reasonable to assume that the allergic reaction might be due to another ingredient besides strawberries?

This example illustrates that inductive reasoning must be used with care. When evaluating an inductive argument, consider

  • the amount of the data,
  • the quality of the data,
  • the existence of additional data,
  • the relevance of necessary additional information, and
  • the existence of additional possible explanations.

Inductive reasoning can never lead to absolute certainty. Instead, induction allows you to say that, given the examples provided for support, the claim more likely than not is true. Because of the limitations of inductive reasoning, a conclusion will be more credible if multiple lines of reasoning are presented in its support.

Deductive Reasoning

Knowledge can also move in the opposite direction. Deductive reasoning presents a generalization (think thesis statement), then provides supportive facts to back up that generalization.

Say that you read in the news about a tradition in a local grocery store, where employees wore football jerseys on Fridays to support the home team. This time, you’re starting from the overall rule, and you would expect individual evidence to support this rule. Each time you visited the store on a Friday, you would expect the employees to wear jerseys. Such a case, of starting with the overall statement and then identifying examples that support it, is known as deductive reasoning.

The Power of Deductive Reasoning

Deductive reasoning is built on two statements whose logical relationship should lead to a third statement that is an unquestionably correct conclusion. In other words, one fact plus one fact equals a third fact. Here's an example:

  • All raccoons are omnivores.
  • This animal is a raccoon.
  • This animal is an omnivore.

If the first statement is true (All raccoons are omnivores) and the second statement is true (This animal is a raccoon), then the conclusion (This animal is an omnivore) is unavoidable. If a group must have a certain quality, and an individual is a member of that group, then the individual must have that quality.
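
Here is a tiny Python sketch of why the deductive form leaves no wiggle room. The animal data are invented for illustration; the point is only that if every member of the group has the property, and the individual belongs to the group, the conclusion cannot come out false:

    # Hypothetical facts about a few animals.
    animals = {
        "Rocky":  {"species": "raccoon", "omnivore": True},
        "Bandit": {"species": "raccoon", "omnivore": True},
        "Shelly": {"species": "tortoise", "omnivore": False},
    }

    premise_1 = all(a["omnivore"] for a in animals.values() if a["species"] == "raccoon")
    premise_2 = animals["Rocky"]["species"] == "raccoon"
    conclusion = animals["Rocky"]["omnivore"]

    if premise_1 and premise_2:
        print("The conclusion follows:", conclusion)   # prints True whenever both premises hold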

Going back to the very first example from the introduction, we could frame it this way:

  • Grocery store employees wear football jerseys on Fridays.
  • Today is Friday.
  • Grocery store employees will be wearing football jerseys today.

Unlike inductive reasoning, deductive reasoning allows for certainty only as long as the evidence is a true fact.

What this means is that inductive reasoning can often be hidden inside a deductive argument. That is, a generalization reached through inductive reasoning can be turned around and used as a starting “truth” for a deductive argument. Here's an example:

  • Most Labrador retrievers are friendly.
  • Kimber is a Labrador retriever.
  • Therefore, Kimber is friendly.

In this case we cannot know for certain that Kimber is a friendly Labrador retriever. The structure of the argument may look logical, but it is based on observations and generalizations rather than indisputable facts.

One way to test the accuracy of a premise is to apply the same questions asked of inductive arguments. As a recap, you should consider:

  • the amount of the data,
  • the quality of the data,
  • the existence of additional data,
  • the relevance of necessary additional information, and
  • the existence of additional possible explanations.

Determine whether the starting claim is based upon a sample that is both representative and sufficiently large, and ask yourself whether all relevant factors have been taken into account in the analysis of data that leads to a generalization.

Here is a video that briefly explains the difference between inductive and deductive reasoning.


Logical, Emotional, or Ethical Fallacies

Rhetorical appeals used in argumentation have power. They can be used to motivate or to manipulate. When they are used irresponsibly, they lead to fallacies, which are misleading statements and constructions used in argumentation.

Fallacies are, at best, unintentional reasoning errors, and at worst, they are deliberate attempts to deceive. Fallacies are commonly used in advertising and politics, but they are not acceptable in academic arguments and won't serve you well in the workplace either. The following are some examples of three kinds of fallacies that abuse the power of logical, emotional, or ethical appeals (logos, pathos, or ethos).

Do your best to avoid using these examples of fallacious reasoning, and be alert to their use by others so that you aren’t “tricked” into a line of unsound reasoning. Getting into the habit of reading academic, commercial, and political rhetoric carefully will enable you to see through manipulative, fallacious uses of verbal, written, and visual language. Being on guard for these fallacies will make you a more proficient college student, a smarter consumer, and a more careful voter, citizen, and member of your community.

Spotting Fallacies

It takes time and practice to be able to spot fallacies in arguments. This is a fun website you can use to help you think more critically and test your knowledge on spotting fallacies.


Rhetorical Theory of Reading

We got into critical reading last week. It's vital to use critical reading to understand a text and to spot rhetoric. You are constantly making meaning from texts. To understand this further, we're going to look at the rhetorical theory of reading.

The work that best describes and justifies the rhetorical reading theory is Douglas Brent’s 1992 book Reading as Rhetorical Invention: Knowledge, Persuasion, and the Teaching of Research-Based Writing .  Brent’s ideas  do a good job demystifying critical reading’s main claims. 

Brent treats reading not only as a vehicle for transmitting information and knowledge, but also as a means of persuasion. To Brent, knowledge equals persuasion because, in his words, “Knowledge is not simply what one has been told. Knowledge is what one believes, what one accepts as being at least provisionally true.” (xi).

This short passage contains two assertions which are key to the understanding of what's happening when you are reading critically. Notice that simply reading “for the main point” will not necessarily make you “believe” what you read. Surely, such reading can fill our heads with information, but will that information become our knowledge in a true sense, will we be persuaded by it, or will we simply memorize it to pass the test and forget it as soon as we pass it?

All of us can probably recall many instances in which we read a lot to pass a test only to forget, with relief, what we read as soon as we left the classroom where that test was held. The purpose of reading and research, then, is not to get as much information out of a text as possible, but to change and update one’s system of beliefs on a given subject (Brent 55-57).

Brent further states:

The way we believe or disbelieve certain texts clearly varies from one individual to the next. If you present a text that is remotely controversial to a group of people, some will be convinced by it and some not, and those who are convinced will be convinced in different degrees. The task of a rhetoric of reading is to explain systematically how these differences arise— how people are persuaded differently by texts (18).

Critical and active readers not only accept the possibility that the same texts will have different meanings for different people, but welcome this possibility as an inherent and indispensable feature of a strong, engaged, and enjoyable reading process. To answer his own questions about what factors contribute to different readers’ different interpretations of the same texts, Brent offers us the following principles that have been summarized from his book:

  • Readers are guided by personal beliefs, assumptions, and pre-existing knowledge when interpreting texts. This is called information bias and we'll be exploring it further later in the course.
  • Readers react differently to the logical proofs presented by the writers of texts.
  • Readers react differently to emotional and ethical proofs presented by writers. For example, an emotional story told by a writer may resonate with one person more than with another because the first person lived through a similar experience and the second one did not, and so on.

The idea behind the rhetorical theory of reading is that when we read, we not only take in ideas, information, and facts, but instead we “update our view of the world.” You cannot force someone to update their worldview, and therefore, the purpose of writing is persuasion and the purpose of reading is being persuaded. Persuasion is possible only when the reader is actively engaged with the text and understands that much more than simple retrieval of information is at stake when reading.

Applied to research, Brent’s theory of reading means the following:

  • The purpose of research is not simply to retrieve data, but to participate in a conversation about it. Simple summaries of sources are not research, and writers should be aiming for active interpretation of sources instead.
  • There is no such thing as an unbiased source. Writers make claims for personal reasons that critical readers need to learn to understand and evaluate.
  • Feelings can be a source of shareable good reason for belief. Readers and writers need to use, judiciously, ethical and pathetic proofs in interpreting texts and in creating their own.
  • Research is recursive. Critical readers and researchers never stop asking questions about their topic and never consider their research finished.


Original sources used to create content (also licensed under CC BY-NC 4.0 unless otherwise noted):

Abrams, S. (2018). Argumentation. In EmpoWord: A student-centered anthology and handbook for college writers , (pp. 201-241). https://pdxscholar.library.pdx.edu/cgi/viewcontent.cgi?article=1025&context=pdxopen

Inductive and deductive reasoning. (n.d.). English composition I. Lumen Learning. https://courses.lumenlearning.com/engcomp1-wmopen/chapter/text-inductive-reasoning/

Khan Academy. (2010). Difference between inductive and deductive reasoning . https://youtu.be/GEId0GonOZM [NOT LICENSED UNDER CC - FREELY ACCESSIBLE ON YOUTUBE]

Langston, C.A. (2016). How to use rhetoric to get what you want [Video]. https://youtu.be/3klMM9BkW5o [NOT LICENSED UNDER CC - FREELY ACCESSIBLE ON YOUTUBE AND TED-ED]

Recognizing the rhetorical situation. (2012). The writer's handbook, v.1. https://2012books.lardbucket.org/books/writers-handbook/s08-02-recognizing-the-rhetorical-sit.html NOTE: This title is licensed under CC BY-NC-SA 3.0

Richardson, J., Smith, A., Meaden, S. (2019). Thou shalt not commit logical fallacies. https://yourlogicalfallacyis.com/ Note: authors do not specify a CC license, but list Creative Commons Attribution-NonCommercial CC BY-NC

Zemilansky, P. (2016). Research writing and argument: All writing is an argument. J. Kepka (Ed.) In Oregon writes open writing text .  https://openoregon.pressbooks.pub/oregonwrites/

Aristotle. Rhetoric. http://classics.mit.edu/Aristotle/rhetoric.1.i.html

Brent, D. (1992). Reading as rhetorical invention: Knowledge, persuasion, and the teaching of research-based writing. National Council of Teachers of English.


Definition of fallacy

Did you know?

Philosophers are constantly using the word fallacy . For them, a fallacy is reasoning that comes to a conclusion without the evidence to support it. This may have to do with pure logic, with the assumptions that the argument is based on, or with the way words are used, especially if they don't keep exactly the same meaning throughout the argument. There are many classic fallacies that occur again and again through the centuries and everywhere in the world. You may have heard of such fallacies as the "ad hominem" fallacy, the "question-begging" fallacy, the "straw man" fallacy, the "slippery slope" fallacy, the "gambler's" fallacy, or the "red herring" fallacy. Look them up and see if you've ever been guilty of any of them.



Word History

Latin fallacia, from fallac-, fallax deceitful, from fallere to deceive

15th century, in the meaning defined at sense 2b

Phrases Containing fallacy

  • pathetic fallacy



