
Cognitive Distortions: When Your Brain Lies to You

Our brains work hard to interpret the world around us quickly and automatically. Generally, this is a good thing: our brain has been wired to alert us to danger, attract us to potential mates, and find solutions to the problems we encounter every day.

However, there are some occasions when you may want to second-guess what your brain is telling you. It’s not that your brain is purposely lying to you; it’s just that it may have developed some faulty or unhelpful connections over time.

It can be surprisingly easy to create faulty connections in the brain. Our brains are predisposed to making connections between thoughts, ideas, actions, and consequences, whether they are truly connected or not.

This tendency to make connections where there is no true relationship is the basis of a common problem when it comes to interpreting research: the assumption that because two variables are correlated, one causes or leads to the other. The refrain “correlation does not equal causation!” is a familiar one to any student of psychology or the social sciences.

It is all too easy to view a coincidence or a complicated relationship and make false or overly simplistic assumptions in research—just as it is easy to connect two events or thoughts that occur around the same time when there are no real ties between them.

There are many terms for this kind of mistake in social science research, complete with academic jargon and overly complicated phrasing. In the context of our thoughts and beliefs, these mistakes are referred to as “cognitive distortions.”

Before you continue, we thought you might like to download our three Positive CBT Exercises for free. These science-based exercises will provide you with a detailed insight into Positive CBT and will give you additional tools to address cognitive distortions in your therapy or coaching.

This Article Contains:

  • What Are Cognitive Distortions?
  • Experts in Cognitive Distortions: Aaron Beck and David Burns
  • A List of the Most Common Cognitive Distortions
  • Changing Your Thinking: Examples of Techniques to Combat Cognitive Distortions
  • A Take-Home Message

Cognitive distortions are biased perspectives we take on ourselves and the world around us. They are irrational thoughts and beliefs that we unknowingly reinforce over time.

These patterns and systems of thought are often subtle; they are difficult to recognize when they are a regular feature of your day-to-day thoughts. That is part of what makes them so damaging: it’s hard to change what you don’t recognize as something that needs to change!

Cognitive distortions come in many forms (which we’ll cover later in this piece), but they all have some things in common.

All cognitive distortions are:

  • Tendencies or patterns of thinking or believing;
  • That are false or inaccurate;
  • And have the potential to cause psychological damage.

It can be scary to admit that you may fall prey to distorted thinking. You might be thinking, “There’s no way I am holding on to any blatantly false beliefs!” While most people don’t suffer in their daily lives from severe versions of these cognitive distortions, it seems that no one can escape them completely.

If you’re human, you have likely fallen for a few of the numerous cognitive distortions at one time or another. The difference between those who occasionally stumble into a cognitive distortion and those who struggle with them on a more long-term basis is the ability to identify and modify or correct these faulty patterns of thinking.

As with many skills and abilities in life, some are far better at this than others–but with practice, you can improve your ability to recognize and respond to these distortions.

These distortions have been shown to relate positively to symptoms of depression, meaning that where cognitive distortions abound, symptoms of depression are likely to occur as well (Burns, Shaw, & Croker, 1987).

In the words of the renowned psychiatrist and researcher David Burns:

“I suspect you will find that a great many of your negative feelings are in fact based on such thinking errors.”

Errors in thinking, or cognitive distortions, are particularly effective at provoking or exacerbating symptoms of depression. It is still somewhat unclear whether these distortions cause depression or depression brings out these distortions (after all, correlation does not equal causation!), but it is clear that they frequently go hand in hand.

Much of the knowledge around cognitive distortions comes from research by two experts: Aaron Beck and David Burns. Both are prominent in the fields of psychiatry and psychotherapy.

If you dig any deeper into cognitive distortions and their role in depression, anxiety, and other mental health issues, you will find two names over and over again: Aaron Beck and David Burns.

These two psychologists literally wrote the book(s) on depression, cognitive distortions, and the treatment of these problems.

Aaron Beck

Aaron Beck began his career at Yale Medical School, where he graduated in 1946 (GoodTherapy, 2015). His required rotations in psychiatry during his residency ignited his passion for research on depression, suicide, and effective treatment.

In 1954, he joined the University of Pennsylvania’s Department of Psychiatry, where he went on to become Professor Emeritus of Psychiatry.

In addition to his prodigious catalog of publications, Beck founded the Beck Initiative to teach therapists how to conduct cognitive therapy with their patients–an endeavor that has helped cognitive therapy grow into the therapy juggernaut that it is today.

Beck also applied his knowledge as a member of and consultant for the National Institute of Mental Health, as an editor for several peer-reviewed journals, and through lectures and visiting professorships at various academic institutions throughout the world (GoodTherapy, 2015).

While there are clearly many honors, awards, and achievements Beck may be known for, perhaps his greatest contribution to the field of psychology is his role in the development of cognitive therapy.

Beck developed the basis for Cognitive Behavioral Therapy, or CBT, when he noticed that many of his patients struggling with depression were operating on false assumptions and distorted thinking (GoodTherapy, 2015). He connected these distorted thinking patterns with his patients’ symptoms and hypothesized that changing their thinking could change their symptoms.

This is the foundation of CBT: the idea that our thought patterns and deeply held beliefs about ourselves and the world around us drive our experiences. When these patterns are distorted, they can contribute to mental health disorders, but they can also be modified or changed to eliminate troublesome symptoms.

In line with his general research focus, Beck also developed two important scales that are among the most widely used in psychology: the Beck Depression Inventory and the Beck Hopelessness Scale. These scales are used to evaluate symptoms of depression and risk of suicide, and they are still applied decades after their original development (GoodTherapy, 2015).

David Burns

Another big name in depression and treatment research, Dr. David Burns, also spent some time learning and developing his skills at the University of Pennsylvania – it seems that UPenn is particularly good at producing future leaders in psychology!

Burns graduated from Stanford University School of Medicine and moved on to the University of Pennsylvania School of Medicine, where he completed his psychiatry residency and cemented his interest in the treatment of mental health disorders (Feeling Good, n.d.).

He is currently serving as a Professor Emeritus of Psychiatry and Behavioral Sciences at the Stanford University School of Medicine, in addition to continuing his research on treating depression and training therapists to conduct effective psychotherapy sessions (Feeling Good, n.d.). Much of his work is based on Beck’s research revealing the potential impacts of distorted thinking and suggesting ways to correct this thinking.

He is perhaps most well known outside of strictly academic circles for his worldwide best-selling book Feeling Good: The New Mood Therapy. This book has sold more than 4 million copies within the United States alone and is often recommended by therapists to their patients struggling with depression (Summit for Clinical Excellence, n.d.).

This book outlines Burns’ approach to treating depression, which mostly focuses on identifying, correcting, and replacing distorted systems and patterns of thinking. If you are interested in learning more about this book, you can find it on Amazon with over 1,400 reviews to help you evaluate its effectiveness.

Burns has also given a TED talk on his work in the treatment of depression. As he discusses there, his studies of depression have influenced research on joy and self-esteem as well.

His book Feeling Good covers CBT, currently the most researched form of psychotherapy, and aims to provide its tools to the general public.

A List of the Most Common Cognitive Distortions

There are many others who have picked up the torch for this research, often with their own take on cognitive distortions. As such, there are numerous cognitive distortions floating around in the literature, but we’ll limit this list to the most common sixteen.

The first eleven distortions come straight from Burns’ Feeling Good Handbook (1989).

1. All-or-Nothing Thinking / Polarized Thinking

Also known as “Black-and-White Thinking,” this distortion manifests as an inability or unwillingness to see shades of gray. In other words, you see things in terms of extremes: something is either fantastic or awful, and you believe you are either perfect or a total failure.

2. Overgeneralization

This sneaky distortion takes one instance or example and generalizes it to an overall pattern. For example, a student may receive a C on one test and conclude that she is stupid and a failure. Overgeneralizing can lead to overly negative thoughts about yourself and your environment based on only one or two experiences.

3. Mental Filter

Similar to overgeneralization, the mental filter distortion focuses on a single negative piece of information and excludes all the positive ones. An example of this distortion is one partner in a romantic relationship dwelling on a single negative comment made by the other partner and viewing the relationship as hopelessly lost, while ignoring the years of positive comments and experiences.

The mental filter can foster a decidedly pessimistic view of everything around you by focusing only on the negative.

4. Disqualifying the Positive

On the flip side, the “Disqualifying the Positive” distortion acknowledges positive experiences but rejects them instead of embracing them.

For example, a person who receives a positive review at work might reject the idea that they are a competent employee and attribute the positive review to political correctness, or to their boss simply not wanting to talk about their employee’s performance problems.

This is an especially malignant distortion since it can facilitate the continuation of negative thought patterns even in the face of strong evidence to the contrary.

5. Jumping to Conclusions – Mind Reading

This “Jumping to Conclusions” distortion manifests as the inaccurate belief that we know what another person is thinking. Of course, it is possible to have an idea of what other people are thinking, but this distortion refers to the negative interpretations that we jump to.

Seeing a stranger with an unpleasant expression and jumping to the conclusion that they are thinking something negative about you is an example of this distortion.

6. Jumping to Conclusions – Fortune Telling

A sister distortion to mind reading, fortune telling refers to the tendency to draw conclusions and make predictions based on little to no evidence and to hold them as gospel truth.

One example of fortune-telling is a young, single woman predicting that she will never find love or have a committed and happy relationship based only on the fact that she has not found it yet. There is simply no way for her to know how her life will turn out, but she sees this prediction as fact rather than one of several possible outcomes.

7. Magnification (Catastrophizing) or Minimization

Also known as the “Binocular Trick” for its stealthy skewing of your perspective, this distortion involves exaggerating or minimizing the meaning, importance, or likelihood of things.

An athlete who is generally a good player but makes a mistake may magnify the importance of that mistake and believe that he is a terrible teammate, while an athlete who wins a coveted award in her sport may minimize the importance of the award and continue believing that she is only a mediocre player.

8. Emotional Reasoning

This may be one of the most surprising distortions to many readers, and it is also one of the most important to identify and address. What surprises most people is not the logic behind this distortion but the realization that virtually all of us have bought into it at one time or another.

Emotional reasoning refers to the acceptance of one’s emotions as fact. It can be described as “I feel it, therefore it must be true.” Just because we feel something doesn’t mean it is true; for example, we may become jealous and think our partner has feelings for someone else, but that doesn’t make it true. Of course, we know it isn’t reasonable to take our feelings as fact, but it is a common distortion nonetheless.


9. Should Statements

Another particularly damaging distortion is the tendency to make “should” statements. Should statements are statements that you make to yourself about what you “should” do, what you “ought” to do, or what you “must” do. They can also be applied to others, imposing a set of expectations that will likely not be met.

When we hang on too tightly to our “should” statements about ourselves, the result is often guilt that we cannot live up to them. When we cling to our “should” statements about others, we are generally disappointed by their failure to meet our expectations, leading to anger and resentment.

10. Labeling and Mislabeling

These tendencies are basically extreme forms of overgeneralization, in which we assign judgments of value to ourselves or to others based on one instance or experience.

For example, a student who labels herself as “an utter fool” for failing an assignment is engaging in this distortion, as is the waiter who labels a customer “a grumpy old miser” when the customer fails to thank him for bringing the food. Mislabeling refers to the application of highly emotional, loaded, and inaccurate or unreasonable language when labeling.

11. Personalization

As the name implies, this distortion involves taking everything personally or assigning blame to yourself without any logical reason to believe you are to blame.

This distortion covers a wide range of situations, from assuming you are the reason a friend did not enjoy the girls’ night out, to the more severe examples of believing that you are the cause for every instance of moodiness or irritation in those around you.

In addition to these basic cognitive distortions, Beck and Burns have mentioned a few others (Beck, 1976; Burns, 1980):

12. Control Fallacies

A control fallacy manifests as one of two beliefs: (1) that we have no control over our lives and are helpless victims of fate, or (2) that we are in complete control of ourselves and our surroundings, giving us responsibility for the feelings of those around us. Both beliefs are damaging, and both are equally inaccurate.

No one is in complete control of what happens to them, and no one has absolutely no control over their situation. Even in extreme situations where an individual seemingly has no choice in what they do or where they go, they still have a certain amount of control over how they approach their situation mentally.

13. Fallacy of Fairness

While we would all probably prefer to operate in a world that is fair, the assumption of an inherently fair world is not based in reality and can foster negative feelings when we are faced with proof of life’s unfairness.

A person who judges every experience by its perceived fairness has fallen for this fallacy, and will likely feel anger, resentment, and hopelessness when they inevitably encounter a situation that is not fair.

14. Fallacy of Change

Another ‘fallacy’ distortion involves expecting others to change if we pressure or encourage them enough. This distortion is usually accompanied by a belief that our happiness and success rests on other people, leading us to believe that forcing those around us to change is the only way to get what we want.

A man who thinks “If I just encourage my wife to stop doing the things that irritate me, I can be a better husband and a happier person” is exhibiting the fallacy of change.

15. Always Being Right

Perfectionists and those struggling with Imposter Syndrome will recognize this distortion – it is the belief that we must always be right. For those struggling with this distortion, the idea that we could be wrong is absolutely unacceptable, and we will fight to the metaphorical death to prove that we are right.

For example, the internet commenters who spend hours arguing with each other over an opinion or political issue far beyond the point where reasonable individuals would conclude that they should “agree to disagree” are engaging in the “Always Being Right” distortion. To them, it is not simply a matter of a difference of opinion, it is an intellectual battle that must be won at all costs.

16. Heaven’s Reward Fallacy

This distortion is a popular one, and it’s easy to see myriad examples of this fallacy playing out on big and small screens across the world. The “Heaven’s Reward Fallacy” manifests as a belief that one’s struggles, one’s suffering, and one’s hard work will result in a just reward.

It is obvious why this type of thinking is a distortion – how many examples can you think of, just within the realm of your personal acquaintances, where hard work and sacrifice did not pay off?

Sometimes no matter how hard we work or how much we sacrifice, we will not achieve what we hope to achieve. To think otherwise is a potentially damaging pattern of thought that can result in disappointment, frustration, anger, and even depression when the awaited reward does not materialize.

These distortions in our thinking are often subtle, and it is challenging to recognize them when they are a regular feature of our day-to-day thoughts. Importantly, these distortions have been shown to relate positively to symptoms of depression, meaning that where cognitive distortions abound, symptoms of depression are likely to occur (Burns et al., 1987).

But all is not lost. Identifying and being mindful of when we engage in these distorted thoughts can be really helpful. Ways to tackle this include keeping a thought log, checking whether our thoughts are facts or just opinions (our own or other people’s), or even putting our thoughts on trial and actively trying to challenge them.

Attempting to recognize and challenge our cognitive distortions can be difficult, but know that we aren’t alone in this experience. Bringing gentle awareness to our thoughts can be a great first step.

Changing Your Thinking: Examples of Techniques to Combat Cognitive Distortions

These distortions, while common and potentially extremely damaging, are not something we must simply resign ourselves to living with.

Beck, Burns, and other researchers in this area have developed numerous ways to identify, challenge, minimize, or erase these distortions from our thinking.

Some of the most effective and evidence-based techniques and resources are listed below.

Cognitive Distortions Handout

Since you must first identify the distortions you struggle with before you can effectively challenge them, this resource is a must-have.

The Cognitive Distortions handout lists and describes several types of cognitive distortions to help you figure out which ones you might be dealing with.

The distortions listed include:

  • All-or-Nothing Thinking;
  • Overgeneralizing;
  • Discounting the Positive;
  • Jumping to Conclusions;
  • Mind Reading;
  • Fortune Telling;
  • Magnification (Catastrophizing) and Minimizing;
  • Emotional Reasoning;
  • Should Statements;
  • Labeling and Mislabeling;
  • Personalization.

Each distortion is accompanied by a helpful description and a couple of examples.

This information can be found in the Increasing Awareness of Cognitive Distortions exercise in the Positive Psychology Toolkit©.

Automatic Thought Record

This worksheet is an excellent tool for identifying and understanding your cognitive distortions. Our automatic, negative thoughts are often related to a distortion that we may or may not realize we have. Completing this exercise can help you to figure out where you are making inaccurate assumptions or jumping to false conclusions.

The worksheet is split into six columns:

  • Date/Time
  • Situation
  • Automatic Thought(s)
  • Emotion(s)
  • Adaptive Response
  • Outcome

First, you note the date and time of the thought.

In the second column, you will write down the situation. Ask yourself:

  • What led to this event?
  • What caused the unpleasant feelings I am experiencing?

The third component of the worksheet directs you to write down the negative automatic thought, including any images or feelings that accompanied the thought. You will consider the thoughts and images that went through your mind, write them down, and determine how much you believed these thoughts.

After you have identified the thought, the worksheet instructs you to note the emotions that ran through your mind along with the thoughts and images identified. Ask yourself what emotions you felt at the time and how intense the emotions were on a scale from 1 (barely felt it) to 10 (completely overwhelming).

Next, you have an opportunity to come up with an adaptive response to those thoughts. This is where the real work happens, where you identify the distortions that are cropping up and challenge them.

Ask yourself these questions:

  • Which cognitive distortions were you employing?
  • What is the evidence that the automatic thought(s) is true, and what evidence is there that it is not true?
  • You’ve thought about the worst that can happen, but what’s the best that could happen? What’s the most realistic scenario?
  • How likely are the best-case and most realistic scenarios?

Finally, you will consider the outcome of this event. Think about how much you believe the automatic thought now that you’ve come up with an adaptive response, and rate your belief. Determine what emotion(s) you are feeling now and at what intensity you are experiencing them.
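If you like to keep a digital thought log, the six-column structure above can be sketched as a simple record. This is an illustrative sketch of our own, not an official CBT tool; the field names and the 1-10 intensity scale follow the worksheet description, but everything else (the class, the example values) is invented for illustration:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ThoughtRecord:
    """One entry in a six-column automatic thought record."""
    situation: str               # what led to the event / the unpleasant feeling
    automatic_thought: str       # the thought or image that went through your mind
    emotions: dict[str, int]     # emotion -> intensity, 1 (barely felt it) to 10 (overwhelming)
    adaptive_response: str = ""  # the challenged, more balanced thought
    outcome: str = ""            # belief and emotions after the adaptive response
    timestamp: datetime = field(default_factory=datetime.now)

    def __post_init__(self) -> None:
        # Enforce the worksheet's 1-10 intensity scale
        for name, intensity in self.emotions.items():
            if not 1 <= intensity <= 10:
                raise ValueError(f"Intensity for {name!r} must be 1-10, got {intensity}")

# Example entry, reusing the overgeneralization example from earlier in the article
entry = ThoughtRecord(
    situation="Received a C on one test",
    automatic_thought="I am stupid and a failure (overgeneralization)",
    emotions={"shame": 8, "anxiety": 6},
    adaptive_response="One grade does not define my ability; my other results are fine.",
    outcome="Shame dropped to 4/10",
)
```

A list of such entries reviewed weekly makes the recurring distortions, and your progress in answering them, easy to spot.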



Decatastrophizing

This is a particularly good tool for talking yourself out of catastrophizing a situation.

The worksheet begins with a description of cognitive distortions in general and catastrophizing in particular. Catastrophizing is when you distort the importance or meaning of a problem to be much worse than it is, or assume that the worst possible scenario is going to come to pass. It’s a self-reinforcing distortion: the more you think about it, the more anxious you become. However, there are ways to combat it.

First, write down your worry. Identify the issue you are catastrophizing by answering the question, “What are you worried about?”

Once you have articulated the issue that is worrying you, you can move on to thinking about how this issue will turn out.

Think about how terrible it would be if the catastrophe actually came to pass. What is the worst-case scenario? Consider whether a similar event has occurred in your past and, if so, how often it occurred. With the frequency of this catastrophe in mind, make an educated guess of how likely the worst-case scenario is to happen.

After this, think about what is most likely to happen–not the best possible outcome, not the worst possible outcome, but the most likely. Consider this scenario in detail and write it down. Note how likely you think this scenario is to happen as well.

Next, think about your chances of surviving in one piece. How likely is it that you’ll be okay one week from now if your fear comes true? How likely is it that you’ll be okay in one month? How about one year? For all three, write down “Yes” if you think you’d be okay and “No” if you don’t think you’d be okay.

Finally, come back to the present and think about how you feel right now. Are you still just as worried, or did the exercise help you think a little more realistically? Write down how you’re feeling about it.
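For those who prefer a structured record, the decatastrophizing steps above can be captured in a small data structure. This is purely an illustrative sketch of our own; the field names mirror the worksheet prompts, but the class and the example values are not part of any official tool:

```python
from dataclasses import dataclass

@dataclass
class DecatastrophizingEntry:
    """Answers to the decatastrophizing prompts, in worksheet order."""
    worry: str                   # "What are you worried about?"
    worst_case: str              # the worst-case scenario
    worst_case_likelihood: int   # educated guess, 0-100%, based on past frequency
    most_likely: str             # the most likely outcome (not best, not worst)
    most_likely_likelihood: int  # 0-100%
    okay_in_one_week: bool       # would you be okay a week after the feared event?
    okay_in_one_month: bool
    okay_in_one_year: bool

    def summary(self) -> str:
        horizons = [("one week", self.okay_in_one_week),
                    ("one month", self.okay_in_one_month),
                    ("one year", self.okay_in_one_year)]
        okay = [label for label, ok in horizons if ok]
        return (f"Worst case ({self.worst_case_likelihood}% likely): {self.worst_case}. "
                f"Most likely ({self.most_likely_likelihood}% likely): {self.most_likely}. "
                f"Okay within: {', '.join(okay) if okay else 'not yet'}.")

# Example: catastrophizing a feared exam result
e = DecatastrophizingEntry(
    worry="I will fail my exam",
    worst_case="I fail and have to retake the course",
    worst_case_likelihood=10,
    most_likely="I pass with an average grade",
    most_likely_likelihood=75,
    okay_in_one_week=False,
    okay_in_one_month=True,
    okay_in_one_year=True,
)
```

Writing the likelihoods and time horizons down side by side is the point of the exercise: the worst case usually turns out to be both unlikely and survivable.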

This worksheet can be an excellent resource for anyone who is worrying excessively about a potentially negative event.

You can download the Decatastrophizing Worksheet here.

Cataloging Your Inner Rules

Cognitive distortions include assumptions and rules that we hold dearly or have decided we must live by. Sometimes these rules or assumptions help us to stick to our values or our moral code, but often they can limit and frustrate us.

This exercise can help you to think more critically about an assumption or rule that may be harmful.

First, think about a recent scenario where you felt bad about your thoughts or behavior afterward. Write down a description of the scenario and the infraction (what you did to break the rule).

Next, based on your infraction, identify the rule or assumption that was broken. What are the parameters of the rule? How does it compel you to think or act?

Once you have described the rule or assumption, think about where it came from. Consider when you acquired this rule, how you learned about it, and what was happening in your life that encouraged you to adopt it. What makes you think it’s a good rule to have?

Now that you have outlined a definition of the rule or assumption and its origins and impact on your life, you can move on to comparing its advantages and disadvantages. Every rule or assumption we follow will likely have both advantages and disadvantages.

The presence of one advantage does not mean the rule or assumption is necessarily a good one, just as the presence of one disadvantage does not automatically make the rule or assumption a bad one. This is where you must think critically about how the rule or assumption helps and/or hurts you.

Finally, you have an opportunity to think about everything you have listed and decide to either accept the rule as it is, throw it out entirely and create a new one, or modify it into a rule that would suit you better. This may be a small change or a big modification.

If you decide to change the rule or assumption, the new version should maximize the advantages of the rule, minimize or limit the disadvantages, or both. Write down this new and improved rule and consider how you can put it into practice in your daily life.

You can download the Cataloging Your Inner Rules Worksheet.

Facts or Opinions?

This is one of the first lessons that participants in cognitive behavioral therapy (CBT) learn: facts are not opinions. As obvious as this seems, it can be difficult to remember and apply in your day-to-day life.

This exercise can help you learn the difference between fact and opinion, and prepare you to distinguish between your own opinions and facts.

The worksheet lists the following twenty statements and asks the reader to decide whether each is a fact or an opinion:

  • I am a failure.
  • I’m uglier than him/her.
  • I said “no” to a friend in need.
  • A friend in need said “no” to me.
  • I suck at everything.
  • I yelled at my partner.
  • I can’t do anything right.
  • He said some hurtful things to me.
  • She didn’t care about hurting me.
  • This will be an absolute disaster.
  • I’m a bad person.
  • I said things I regret.
  • I’m shorter than him.
  • I am not loveable.
  • I’m selfish and uncaring.
  • Everyone is a way better person than I am.
  • Nobody could ever love me.
  • I am overweight for my height.
  • I ruined the evening.
  • I failed my exam.

Practicing making this distinction between fact and opinion can improve your ability to quickly differentiate between the two when they pop up in your own thoughts.

Here is the Facts or Opinions Worksheet.

In case you’re wondering which is which, here is the key:

  • I am a failure. Opinion
  • I’m uglier than him/her. Opinion
  • I said “no” to a friend in need. Fact
  • A friend in need said “no” to me. Fact
  • I suck at everything. Opinion
  • I yelled at my partner. Fact
  • I can’t do anything right. Opinion
  • He said some hurtful things to me. Fact
  • She didn’t care about hurting me. Opinion
  • This will be an absolute disaster. Opinion
  • I’m a bad person. Opinion
  • I said things I regret. Fact
  • I’m shorter than him. Fact
  • I am not loveable. Opinion
  • I’m selfish and uncaring. Opinion
  • Everyone is a way better person than I am. Opinion
  • Nobody could ever love me. Opinion
  • I am overweight for my height. Fact
  • I ruined the evening. Opinion
  • I failed my exam. Fact

Putting Thoughts on Trial

This exercise uses CBT theory and techniques to help you examine your irrational thoughts. You will act as the defense attorney, prosecutor, and judge all at once, providing evidence for and against the irrational thought and evaluating the merit of the thought based on this evidence.

The worksheet begins with an explanation of the exercise and a description of the roles you will be playing.

The first box to be completed is “The Thought.” This is where you write down the irrational thought that is being put on trial.

Next, you fill out “The Defense” box with evidence that corroborates or supports the thought. Once you have listed all of the defense’s evidence, do the same for “The Prosecution” box. Write down all of the evidence calling the thought into question or instilling doubt in its accuracy.

When you have listed all of the evidence you can think of, both for and against the thought, evaluate the evidence and write down the results of your evaluation in “The Judge’s Verdict” box.

This worksheet is a fun and engaging way to think critically about your negative or irrational thoughts and make good decisions about which thoughts to modify and which to embrace.

Click here to see this worksheet for yourself (TherapistAid).



Hopefully, this piece has given you a good understanding of cognitive distortions. These sneaky, inaccurate patterns of thinking and believing are common, but their potential impact should not be underestimated.

Even if you are not struggling with depression, anxiety, or another serious mental health issue, it doesn’t hurt to evaluate your own thoughts every now and then. The sooner you catch a cognitive distortion and mount a defense against it, the less likely it is to make a negative impact on your life.

What is your experience with cognitive distortions? Which ones do you struggle with? Do you think we missed any important ones? How have you tackled them, whether in CBT or on your own?

Let us know in the comments below. We love hearing from you.

We hope you enjoyed reading this article. For more information, don’t forget to download our three Positive CBT Exercises for free.

  • Beck, A. T. (1976). Cognitive therapy and the emotional disorders. New York, NY: New American Library.
  • Burns, D. D. (1980). Feeling good: The new mood therapy. New York, NY: New American Library.
  • Burns, D. D. (1989). The feeling good handbook. New York, NY: Morrow.
  • Burns, D. D., Shaw, B. F., & Croker, W. (1987). Thinking styles and coping strategies of depressed women: An empirical investigation. Behaviour Research and Therapy, 25, 223–225.
  • Feeling Good. (n.d.). About. Retrieved from https://feelinggood.com/about/
  • GoodTherapy. (2015). Aaron Beck. Retrieved from https://www.goodtherapy.org/famous-psychologists/aaron-beck.html
  • Summit for Clinical Excellence. (n.d.). David Burns, MD. Retrieved from https://summitforclinicalexcellence.com/partners/faculty/david-burns/
  • TherapistAid. (n.d.). Cognitive restructuring: Thoughts on trial. Retrieved from https://www.therapistaid.com/worksheets/putting-thoughts-on-trial.pdf




GCFGlobal
Critical Thinking and Decision-Making: Logical Fallacies

Lesson 7: Logical Fallacies


Logical fallacies

If you think about it, vegetables are bad for you. I mean, after all, the dinosaurs ate plants, and look at what happened to them...

illustration of a dinosaur eating leaves while a meteor falls in the background

Let's pause for a moment: That argument was pretty ridiculous. And that's because it contained a logical fallacy.

A logical fallacy is any kind of error in reasoning that renders an argument invalid. Fallacies can involve distorting or manipulating facts, drawing false conclusions, or distracting you from the issue at hand. In theory, they seem like they'd be pretty easy to spot, but this isn't always the case.


Sometimes logical fallacies are used intentionally to try to win a debate. In these cases, they're often presented by the speaker with a certain level of confidence, and in doing so they become more persuasive: If speakers sound like they know what they're talking about, we're more likely to believe them, even if their stance doesn't make complete logical sense.

illustration of a politician saying, "I know for a fact..."

False cause

One common logical fallacy is the false cause. This is when someone incorrectly identifies the cause of something. In my argument above, I stated that dinosaurs became extinct because they ate vegetables. While these two things did happen, a diet of vegetables was not the cause of their extinction.

illustration showing that extinction was not caused by some dinosaurs being vegetarians

Maybe you've heard false cause more commonly represented by the phrase "correlation does not equal causation," meaning that just because two things occurred around the same time, it doesn't necessarily mean that one caused the other.
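A toy simulation makes this concrete. In the sketch below (our own illustration using NumPy, not part of the lesson), two quantities with no causal link both happen to grow over time, and so they correlate almost perfectly:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(100)  # a shared trend, e.g. passing months

# Two causally unrelated quantities that both grow with the trend
ice_cream_sales = 50 + 2.0 * t + rng.normal(0, 5, 100)
shark_sightings = 3 + 0.1 * t + rng.normal(0, 0.5, 100)

# Pearson correlation between the two series
r = np.corrcoef(ice_cream_sales, shark_sightings)[0, 1]
print(f"correlation: {r:.2f}")  # very high, yet neither causes the other
```

The correlation is an artifact of the shared trend (a "confounding" variable), exactly the situation the phrase warns about.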

Straw man

A straw man is when someone takes an argument and misrepresents it so that it's easier to attack. For example, let's say Callie is advocating that sporks should be the new standard for silverware because they're more efficient. Madeline responds that she's shocked Callie would want to outlaw spoons and forks, and put millions out of work at the fork and spoon factories.

illustration of Maddie accusing Callie of wanting to outlaw spoons and forks

A straw man is frequently used in politics in an effort to discredit another politician's views on a particular issue.

Begging the question

Begging the question is a type of circular argument where someone includes the conclusion as a part of their reasoning. For example, George says, “Ghosts exist because I saw a ghost in my closet!"

illustration of George claiming that ghosts exists and him seeing one in his closet

George concluded that “ghosts exist”. His premise also assumed that ghosts exist. Rather than assuming that ghosts exist from the outset, George should have used evidence and reasoning to try and prove that they exist.

illustration of George using math and reasoning to try and prove that ghosts exist

Since George assumed that ghosts exist, he was less likely to see other explanations for what he saw. Maybe the ghost was nothing more than a mop!

illustration of a splitscreen showing a ghost in a closet on the left, and that same closet with a mop in it on the right

False dilemma

The false dilemma (or false dichotomy) is a logical fallacy where a situation is presented as being an either/or option when, in reality, there are more possible options available than just the chosen two. Here's an example: Rebecca rings the doorbell but Ethan doesn't answer. She then thinks, "Oh, Ethan must not be home."

illustration showing the false dilemma of either Ethan being home or his home being empty

Rebecca posits that either Ethan answers the door or he isn't home. In reality, he could be sleeping, doing some work in the backyard, or taking a shower.

illustration of Ethan sleeping, doing yard work, and taking a shower

Most logical fallacies can be spotted by thinking critically. Make sure to ask questions: Is logic at work here, or is it simply rhetoric? Does their "proof" actually lead to the conclusion they're proposing? By applying critical thinking, you'll be able to detect logical fallacies in the world around you and prevent yourself from using them as well.


Sources of Errors in Thinking and How to Avoid Them

  • First Online: 20 December 2016

Balu H. Athreya & Chrystalla Mouza

In this chapter we discuss ways in which our experiences might mislead our thinking. We identify sources of errors in thinking as well as time-honored strategies to avoid these errors. Being aware of these errors is crucial to developing critical thinking skills.

“Natural intelligence is no barrier to the propagation of error.” —John Dewey (1910, p. 21)

“Distortions in thinking are often due to unconscious bias and unrecognized ignorance.” —Susan Stebbing (1939, p. 5)


Adler, M. J. (1978). Aristotle for everybody. New York: Bantam Books.

Baron, J. (1993). Why teach thinking? Applied Psychology, 42(3), 191–214.

Beall, J. C., & Restall, G. (2013). Logical consequences. Retrieved May 7, 2016, from Stanford Encyclopedia of Philosophy: plato.stanford.edu

Beveridge, W. I. (1957). The art of scientific investigation (3rd ed.). London: William Heinemann.

Boss, J. (2014). Think: Critical thinking and logic skills for everyday life (3rd ed.). Columbus, OH: McGraw-Hill Education.

Browne, N. M., & Keeley, S. M. (2015). Asking the right questions: A guide to critical thinking. Upper Saddle River, NJ: Pearson Education Inc.

Buckwalter, J. A., Tolo, V. T., & O’Keefe, R. J. (2015). How do you know it is true? Integrity in research and publications. Journal of Bone and Joint Surgery, 97, e-2.

Croskerry, P. (2003). The importance of cognitive errors in diagnosis and strategies to minimize them. Academic Medicine, 78(8), 775–780.

Dawes, R. M. (1988). Rational choice in an uncertain world. Orlando, FL: Harcourt Brace College Publishers.

de Bono, E. (1994). De Bono’s thinking course. New York: Facts on File Inc.

De Condorcet, M. (1802). A historical review of the progress of the human mind. Baltimore, MD: J. Frank.

Deutsch, D. (2011). The beginning of infinity. New York: Viking.

Dewey, J. (1910). How we think. Boston, MA: D.C. Heath & Co.

Dimnet, E. (1928). The art of thinking. Greenwich, CT: Fawcett.

Durant, W., & Durant, A. (1961). The story of civilization. New York: Simon and Schuster.

Eddy, D. A., & Clanton, C. H. (1982). The art of diagnosis: Solving the clinicopathological exercise. New England Journal of Medicine, 306, 1263–1268.

Elenjimittam, A. (1974). The yoga philosophy of Patanjali. Allahabad, India: Better Yourself Books.

Etkin, J., Evangelidis, I., & Aaker, J. (2014). Pressed for time? Goal conflict shapes how time is perceived, spent and valued. Journal of Marketing Research. doi:10.1509/jmr.14.0130

Fisher, M., Goddu, M. K., & Keil, F. C. (2015). Searching for explanations: How the Internet inflates estimates of internal knowledge. Journal of Experimental Psychology, 144(3), 674–687.

Flesch, R. (1951). The art of clear thinking. New York, NY: Harper & Row.

Gilovich, T. (1991). How we know what isn’t so: The fallibility of human reason in everyday life. New York: Free Press.

Goodman, S. N. (2016). Aligning statistical and scientific reasoning: Misunderstanding and misuse of statistical significance impede science. Science, 352(6290), 1180–1181.

Graber, M. L., Franklin, N., & Gordon, R. (2005). Diagnostic error in internal medicine. Archives of Internal Medicine, 165, 1493–1499.

Guyatt, G., Drummond, R., Meade, M. O., & Cook, D. J. (2015). User’s guide to the medical literature: Essentials of evidence-based medical practice. New York: McGraw-Hill Education/Medicine.

Habits of the Mind. (1989). Retrieved May 25, 2016, from AAAS: Science for All Americans Online: http://www.project2061.org/publications/sfaa/online/chap12.htm

Hobbs, R. (2010). Digital and media literacy: A plan for action. Aspen, CO: The Aspen Institute.

Hofstadter, D., & Sander, E. (2013). Surfaces and essences: Analogy as the fuel and fire of thinking. New York: Basic Books.

Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus & Giroux.

Kornfield, J. (2008). Meditation for beginners. Boulder, CO: Sounds True Inc.

Kukla, A. (2006). Mental traps: Stupid things that sane people do to mess up their minds. New York: McGraw-Hill.

MacKnight, C. B. (2000). Teaching critical thinking through online discussions. Educause Quarterly, 4, 38–41.

McIntyre, N., & Popper, K. (1983). The critical attitude in medicine: The need for a new ethics. British Medical Journal, 287, 1919–1923.

Metzger, M. (2009). Credibility research to date. Retrieved from Credibility and Media @UCSB: http://www.credibilty.ucsb.edu/past_research.php

Miller, G. A. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63, 81–97.

Ophir, E., Nass, C., & Wagner, A. D. (2009). Cognitive control in media multitaskers. Proceedings of the National Academy of Sciences, 106(37), 15583–15587.

Patten, B. M. (2004). Truth, knowledge, or just plain bull: How to tell the difference. Amherst, NY: Prometheus Books.

Pei, M. (1978). Weasel words: The art of saying what you do not mean. New York: Harper & Row.

Pingdom. (2012). Internet 2012 in numbers. Retrieved June 7, 2016, from http://royal.pingdom.com/2013/01/16/internet-2012-in-numbers/

Potter, J. W. (2013). Media literacy (7th ed.). Thousand Oaks, CA: Sage Publications.

Priest, G. (2000). Logic: A very short introduction. Oxford, UK: Oxford University Press.

Ross, L., Greene, D., & House, P. (1977). The “false consensus” effect: An egocentric bias in social perception and attribution processes. Journal of Experimental Social Psychology, 13, 279–301.

Rupert, C. W. (2012). The comforts of unreason. London: Forgotten Books.

Schulte, B. (2014). Overwhelmed: Work, love and play when no one has the time. New York: Sarah Crichton Books.

Sheridan, H., & Reingold, E. M. (2013). The mechanism and boundary conditions of the Einstellung effect in chess: Evidence from eye movements. PLoS One. doi:10.1371/journal.pone.0075796

Shermer, M. (2015, March). Forging doubt. Scientific American, p. 74.

Stebbing, S. L. (1939). Thinking to some purpose. London, UK: Penguin.

Taleb, N. N. (2010). The black swan: The impact of the highly improbable (2nd ed.). New York: Random House Trade Paperbacks.

The National Council for Excellence in Critical Thinking. (2015). Retrieved October 24, 2016, from http://www.criticalthinking.org/pages/the-national-council-for-excellence-in-critical-thinking/406

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124–1131.

Wasserstein, R. L., & Lazar, N. A. (2016). The ASA’s statement on p-values: Context, process, and purpose. The American Statistician, 70(2), 129–133.
Author information

Balu H. Athreya (Professor Emeritus of Pediatrics, Teaching Consultant): University of Pennsylvania – Perelman School of Medicine and Thomas Jefferson University – Sidney Kimmel Medical College, Philadelphia, PA, USA; Nemours – A.I. duPont Hospital for Children, Wilmington, DE, USA

Chrystalla Mouza: School of Education, University of Delaware, Newark, DE, USA

Copyright information

© 2017 Springer International Publishing Switzerland

About this chapter

Athreya, B.H., Mouza, C. (2017). Sources of Errors in Thinking and How to Avoid Them. In: Thinking Skills for the Digital Generation. Springer, Cham. https://doi.org/10.1007/978-3-319-12364-6_7

DOI: https://doi.org/10.1007/978-3-319-12364-6_7

Published: 20 December 2016

Publisher: Springer, Cham

Print ISBN: 978-3-319-12363-9

Online ISBN: 978-3-319-12364-6

Share this chapter

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

  • Publish with us

Policies and ethics

  • Find a journal
  • Track your research

loading


Article • 8 min read

Critical Thinking

Developing the Right Mindset and Skills

By the Mind Tools Content Team

We make hundreds of decisions every day and, whether we realize it or not, we're all critical thinkers.

We use critical thinking each time we weigh up our options, prioritize our responsibilities, or think about the likely effects of our actions. It's a crucial skill that helps us to cut out misinformation and make wise decisions. The trouble is, we're not always very good at it!

In this article, we'll explore the key skills that you need to develop your critical thinking skills, and how to adopt a critical thinking mindset, so that you can make well-informed decisions.

What Is Critical Thinking?

Critical thinking is the discipline of rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. You'll need to actively question every step of your thinking process to do it well.

Collecting, analyzing and evaluating information is an important skill in life, and a highly valued asset in the workplace. People who score highly in critical thinking assessments are also rated by their managers as having good problem-solving skills, creativity, strong decision-making skills, and good overall performance. [1]

Key Critical Thinking Skills

Critical thinkers possess a set of key characteristics which help them to question information and their own thinking. Focus on the following areas to develop your critical thinking skills:

Curiosity

Being willing and able to explore alternative approaches and experimental ideas is crucial. Can you think through "what if" scenarios, create plausible options, and test out your theories? If not, you'll tend to write off ideas and options too soon, so you may miss the best answer to your situation.

To nurture your curiosity, stay up to date with facts and trends. You'll overlook important information if you allow yourself to become "blinkered," so always be open to new information.

But don't stop there! Look for opposing views or evidence to challenge your information, and seek clarification when things are unclear. This will help you to reassess your beliefs and make a well-informed decision later. Read our article, Opening Closed Minds , for more ways to stay receptive.

Logical Thinking

You must be skilled at reasoning and extending logic to come up with plausible options or outcomes.

It's also important to emphasize logic over emotion. Emotion can be motivating but it can also lead you to take hasty and unwise action, so control your emotions and be cautious in your judgments. Know when a conclusion is "fact" and when it is not. "Could-be-true" conclusions are based on assumptions and must be tested further. Read our article, Logical Fallacies , for help with this.

Use creative problem solving to balance cold logic. By thinking outside of the box you can identify new possible outcomes by using pieces of information that you already have.

Self-Awareness

Many of the decisions we make in life are subtly informed by our values and beliefs. These influences are called cognitive biases and it can be difficult to identify them in ourselves because they're often subconscious.

Practicing self-awareness will allow you to reflect on the beliefs you have and the choices you make. You'll then be better equipped to challenge your own thinking and make improved, unbiased decisions.

One particularly useful tool for critical thinking is the Ladder of Inference . It allows you to test and validate your thinking process, rather than jumping to poorly supported conclusions.

Developing a Critical Thinking Mindset

Combine the above skills with the right mindset so that you can make better decisions and adopt more effective courses of action. You can develop your critical thinking mindset by following this process:

Gather Information

First, collect data, opinions and facts on the issue that you need to solve. Draw on what you already know, and turn to new sources of information to help inform your understanding. Consider what gaps there are in your knowledge and seek to fill them. And look for information that challenges your assumptions and beliefs.

Be sure to verify the authority and authenticity of your sources. Not everything you read is true! Use this checklist to ensure that your information is valid:

  • Are your information sources trustworthy ? (For example, well-respected authors, trusted colleagues or peers, recognized industry publications, websites, blogs, etc.)
  • Is the information you have gathered up to date ?
  • Has the information received any direct criticism ?
  • Does the information have any errors or inaccuracies ?
  • Is there any evidence to support or corroborate the information you have gathered?
  • Is the information you have gathered subjective or biased in any way? (For example, is it based on opinion, rather than fact? Is any of the information you have gathered designed to promote a particular service or organization?)

If any information appears to be irrelevant or invalid, don't include it in your decision making. But don't omit information just because you disagree with it, or your final decision will be flawed and biased.

Analyze the Information

Now observe the information you have gathered, and interpret it. What are the key findings and main takeaways? What does the evidence point to? Start to build one or two possible arguments based on what you have found.

You'll need to look for the details within the mass of information, so use your powers of observation to identify any patterns or similarities. You can then analyze and extend these trends to make sensible predictions about the future.

To help you to sift through the multiple ideas and theories, it can be useful to group and order items according to their characteristics. From here, you can compare and contrast the different items. And once you've determined how similar or different things are from one another, Paired Comparison Analysis can help you to analyze them.
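The pairwise idea behind Paired Comparison Analysis can be sketched in a few lines: compare every option against every other, and rank options by how many comparisons they win. This is our own minimal illustration, not Mind Tools' actual worksheet, and the `prefer` rule below is a deliberately silly stand-in for real judgments:

```python
from itertools import combinations
from collections import Counter

def paired_comparison(options, prefer):
    """Rank options by pairwise comparison.

    `prefer(a, b)` returns whichever of the two options wins the head-to-head
    comparison; options are ranked by total number of wins.
    """
    wins = Counter({o: 0 for o in options})
    for a, b in combinations(options, 2):
        wins[prefer(a, b)] += 1
    return [o for o, _ in wins.most_common()]

# Toy preference rule: the shorter name wins (stands in for a real judgment)
ranked = paired_comparison(["expand", "retrain", "hire"],
                           lambda a, b: min(a, b, key=len))
print(ranked)  # → ['hire', 'expand', 'retrain']
```

In practice the `prefer` step is where the human judgment lives; the tallying merely makes the implied ranking explicit and consistent.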

Evaluate the Information

The final step involves challenging the information and rationalizing its arguments.

Apply the laws of reason (induction, deduction, analogy) to judge an argument and determine its merits. To do this, it's essential that you can determine the significance and validity of an argument to put it in the correct perspective. Take a look at our article, Rational Thinking , for more information about how to do this.

Once you have considered all of the arguments and options rationally, you can finally make an informed decision.

Afterward, take time to reflect on what you have learned and what you found challenging. Step back from the detail of your decision or problem, and look at the bigger picture. Record what you've learned from your observations and experience.

Critical thinking involves rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions and beliefs. It's a useful skill in the workplace and in life.

You'll need to be curious and creative to explore alternative possibilities, but rational to apply logic, and self-aware to identify when your beliefs could affect your decisions or actions.

You can demonstrate a high level of critical thinking by validating your information, analyzing its meaning, and finally evaluating the argument.

Critical Thinking Infographic

See Critical Thinking represented in our infographic: An Elementary Guide to Critical Thinking .



Therapy Central

Counselling, Psychotherapy, CBT & Mindfulness in Central London & Online

9 May 2021

What are Thinking Errors in CBT (and how to manage them)

In discussing how effective CBT Therapy can be, we have noted that two main components are involved in this type of work: Cognitive Therapy, which deals with the way we think, and Behavioural Therapy, which deals with our actions.

In this post, we’ll focus on one aspect of Cognitive Therapy which is crucial to CBT: dealing with Thinking Errors. These are automatic, often unrealistic types of thinking that can rapidly affect our mood and keep us stuck in a cycle of anxiety, sadness or other difficult emotions. In CBT, however, learning how to identify and label them can make the difference between escalating and containing our difficult emotions.

What are Thinking Errors?

Thinking Errors – also known as Cognitive Distortions – are irrational and extreme ways of thinking that can maintain mental and emotional issues. Anxiety, low mood, worry, and anger management issues are often fuelled by this type of thinking.

Thinking errors, first proposed by Aaron Beck (1963), one of the leading figures in CBT, are essential to how CBT works with anxiety and other issues.

Although we all fall prey to irrational and extreme thinking, Thinking Errors are a distinctive feature of everyday life for people who frequently experience unpleasant emotions. In anxiety, for instance, the unpleasant feelings are triggered by frequent negative and unbalanced thinking. This type of thinking then informs decisions on how to act, which are equally unhelpful. This chain of events keeps us stuck in a vicious cycle of anxiety, like the one below.

CBT Vicious Cycle of Anxiety


How do Thinking Errors Affect Anxiety?

Thinking errors play an essential role in keeping us anxious, low or frustrated. They’re what makes the difference between seeing the glass half full or half empty.

Cognitive distortions tend to be consistent with the expectations we have of a situation. For instance, if we have a generally negative outlook on how others see us, it’s more likely that our thinking errors will confirm such negative expectations. For example, after our boss expresses dissatisfaction with our department’s performance, typical thinking errors might be: “She thinks I am rubbish” (Mind Reading) and “I will lose my job” (Catastrophising).

By falling into the trap of believing these thinking errors are factual, we sink deeper into the negative emotions associated with them, such as worry, anxiety or fear.

This is why thinking errors are a critical component in increasing and maintaining our anxiety.

How do Thoughts Affect our Mental Health?

If you missed our main article on CBT therapy for anxiety, let’s briefly recap why thoughts are so crucial to our mental health. CBT holds that we feel anxious, sad or angry because of the thoughts (or images) triggered by the situations we find ourselves in.

With anxiety, for instance, if you notice your heart racing and have the thought: “I’ll have a heart attack!” this is likely to make you feel anxious. Although it’s completely normal to feel anxious if you believe you’re having a heart attack, if you look closely (and if you don’t have a physical condition), the likelihood of that thought being true is generally very low. Yet, by believing in such a thought, you’re accidentally falling into the trap of anxiety. This is the reason we label these kinds of thoughts as negative or ‘unhelpful’. Your unhelpful thoughts become a prime target of CBT Therapy.

How does CBT work with Thinking Errors?

One of the aims of CBT Therapy for Anxiety (and other issues) is to work on challenging and reframing negative, unhelpful thoughts. This almost always involves some form of journaling and keeping a diary of the thoughts that affect your mood daily. The goal here is to help you identify your unhelpful thoughts and label them as irrational. Following that, with the help of a CBT therapist, the work shifts towards generating more balanced and rational, evidence-based alternative thoughts.

When new, realistic and balanced thoughts are adopted, replacing the unhelpful, irrational ones, you will start to notice a reduction in the intensity of your anxiety reaction. The more unhelpful thoughts are recognised and replaced with helpful, realistic ones, the more anxiety loses its grip on you. You’re able to live a life guided by your choices rather than by fear.

The Most Common Thinking Errors in CBT


How Can We Identify Thinking Errors? A Practical Example

Challenging and reframing unhelpful thoughts becomes significantly easier when you can categorise them as thinking errors.

Once you know how they sound, it’s easy to identify if one of your thoughts is a thinking error. When you identify a thought as a thinking error, it’s much more likely to lose its credibility, leading you to feel less anxious, sad or angry.

In other words, recognising and labelling thinking errors when they arise can significantly improve our ability to start escaping the anxiety trap.

Let’s see how it’s done with an example:  

  • Jennifer is at a work meeting with 10 other people; she’s usually a bit shy and tends to not talk too much when there are many people around for fear of being judged.
  • Jennifer’s boss explains that her department did not meet the targets and expresses her disappointment.
  • Jennifer feels anxious and on edge. Although she knows her team has done everything they could to meet the target, she chooses not to say a word for fear of confronting her boss.
  • After the meeting, Jennifer returns to her desk. She notices her anxiety increasing. Her heart is racing. In her mind, many thoughts crop up about her boss and the meeting: “She thinks I am rubbish”, “I will be fired”. Her mind is on a roll, and her thoughts escalate: “I will never get another job”, “I won’t be able to pay the rent and end up living on the street”.
  • The more these thoughts mount up, unchallenged, the more Jennifer’s anxiety grows, to the point that she needs to take the afternoon off to go home and cool down.
  • When she’s back home, Jennifer feels less anxious. However, other negative thoughts crop up, like: “I’m such a failure for having left early”, “others must have noticed my anxiety and believe I am rubbish”. Ultimately these thoughts contribute to maintaining her anxiety.

What’s the issue with Jennifer, then? The main problem is that whenever thoughts naturally arise, she accepts them as facts, no matter how potentially far-fetched or irrational they might be.

If we look closely, many of her thoughts are pretty unhelpful and irrational. Crucially, the most powerful ones are precisely thinking errors! Here they are:

– “She thinks I am rubbish” – Mind Reading

– “I will never get another job” – Catastrophising/Overgeneralising

– “I will end up living on the street” – Catastrophising/Overgeneralising


How to Correct Thinking Errors? Some Practical Tips:

Catching our minds engaging in thinking errors and labelling them can be an effective way to defuse them and to tackle anxiety and other challenging emotions in our everyday lives.

In our example, if Jennifer could recognise and label her thoughts as thinking errors, she would have a chance to dismiss them and start decreasing her anxiety. She could then choose not to escape the situation by leaving work early, and would ultimately feel better about herself.

Over time, Jennifer would feel more confident and would be less likely to fall prey to her thinking errors in the future!

1) Start a daily journal.

Pick your favourite medium (a notebook, your notes app or anything else) and write down the negative emotions you feel daily (e.g., anxiety, worry, sadness, etc.). Then, next to them, jot down the thoughts associated with those emotions (e.g., “my boss thinks I am rubbish”). To do this, ask yourself, “What thought or image is making me feel distressed?”. 

2) Identify and Label your Thinking Errors.

After step one, take a look at the table (or infographic) below, with a list of some of the most well-known thinking errors, and see if any of the thoughts you wrote down can be labelled as thinking errors.

3) Reality Check!

Once you have identified any of your thoughts as thinking errors, it’s time for a reality check. Ask yourself whether they are actually true and remind yourself that these cognitive distortions are known to be unrealistic, extreme and irrational. There is very likely no good reason to believe them. 

Repeating these steps consistently has the potential to help you gradually reduce your anxiety (as well as other negative emotions).

What Are Common Thinking Errors?


Thinking errors (or cognitive distortions) are well-known mechanisms that keep our negative emotions going. Hopefully, this article has helped you learn more about them and get started on tackling them. Keep in mind that there are times when we’re stuck in a vicious cycle of anxiety, sadness or other difficult emotions, and we might not be able to get out of it on our own. If difficult emotions have started to significantly affect your life, get in touch with us for professional help. Our CBT-trained therapists have the expertise to help you work with your thinking errors and negative emotions. Discover more about CBT Therapy.

With our help, you’ll have the chance to make the crucial changes to bring balance and fulfilment back into your life!

Get in touch with us for a FREE 15 min consultation today!

Beck, A. T. (1963). Thinking and depression: I. Idiosyncratic content and cognitive distortions. Archives of General Psychiatry, 9(4), 324–333.

  • Dr. Raffaello Antonino



A Practical Guide to Critical Thinking, 2nd Edition by David A. Hunter


APPENDIX A CRITICAL THINKING MISTAKES

Critical thinking is reasonable and reflective thinking aimed at deciding what to believe and what to do. Throughout this book, we have identified mistakes that a good critical thinker should avoid. Some are mistakes that can arise in clarifying or defining a view. Others are mistakes that can arise as we collect or rely on evidence or reasons for a view. Still others arise when we try to draw conclusions from our evidence. And there are even mistakes that can arise as we assess other people's views or reasons. Knowing what they are will help us to avoid them in our own reasoning. But it will also help to make it clear just what the value is in being a critical thinker: thinking critically is valuable in part because it helps us to avoid some mistakes. This appendix lists all of the mistakes we have discussed.

Personalizing Reasons. It is a mistake to personalize reasons by treating them as if they belonged to someone. That is a mistake for two reasons. First, epistemic reasons are universal: if they are reasons for me to believe something, then they are equally reasons for anyone else to believe it. Second, epistemic reasons are objective: whether a piece of evidence is sufficient or acceptable is an objective matter. It has nothing to do with me or with anyone else. Personalizing reasons can obscure the fact that they are universal and objective. It can also allow emotion to get in the way of thinking critically, if one identifies too ...



Common Critical Thinking Fallacies

Critical thinking is the process of reaching a decision or judgment by analyzing, evaluating, and reasoning with facts and data presented. However, nobody is thinking critically 100% of the time. Logical reasoning can be prone to fallacies.

A fallacy is an error in reasoning. When there is a fallacy in the reasoning, conclusions are less credible and can be deemed invalid.

How can critical thinking fallacies be avoided? The first step is to be aware of the possible fallacies that can be committed. This article will highlight the most common logical fallacies.

Common fallacies fall under two categories:

  • Fallacies of Relevance
  • Fallacies of Unacceptable Premises

In fallacies of relevance, reasons are presented for a conclusion, but these reasons may not be true or genuinely relevant to the argument.

Under Fallacies of Relevance are:

  • Ad Hominem

“Ad Hominem” is Latin for “to the person”. It’s a fallacy that attacks the person making the argument instead of the argument itself.

This is commonly seen in informal arguments where a person’s looks or characteristics are often attacked instead of the argument they’re making.

  • Red Herring

This is a fallacy of distraction. It sidetracks the main argument by offering a different issue and then claims that this new issue is relevant to the current one. People who do this aim to divert the audience or another person from their arguments.

  • Tu Quoque Fallacy

“Tu Quoque” means “you also” in Latin. This fallacy discredits a person’s argument based on the fact that the person does not practice what he or she preaches.

  • Strawman Fallacy

This fallacy occurs when a person refutes another person’s argument by attacking a weakened version of the original argument.

  • Appeal to Authority

The Appeal to Authority fallacy claims that an argument is true because someone with “authority” on the subject believes it’s true. For example: a policeman believes that guns should not have permits, and his argument should be accepted as the truth because policemen know what they are talking about. Policemen know how to use guns properly and can therefore be called “experts” on the subject matter.

  • Appeal to Popularity or Ad Populum

Much like the previous fallacy, Appeal to Popularity claims that something is true because a lot of people or the majority believe that it’s true. We should steer clear of this fallacy because having 100,000 believers doesn’t make a wrong argument true.

This is commonly used in advertising products. If a good number of people are using the product, why shouldn’t you?

  • Appeal to Tradition

This is very similar to appeal to popularity. The only difference is that this fallacy claims that something is true because it has been believed to be true for a long time. It doesn’t depend on how many people believe it, but rather on how long people have believed it.

  • Appeal to Ignorance or Argumentum Ad Ignorantiam

This fallacy claims that arguments are true because they have never been proven false, or false because they have never been proven true. In effect, it treats the absence of evidence as if it were evidence.

  • Appeal to Emotion

As the name suggests, this fallacy appeals to the emotions of the audience. It aims to evoke feelings like sympathy and affection, both of which can be irrelevant to the original argument.

  • Fallacy of Composition and Division

The fallacy of composition claims that because some parts of the whole are true, that means the whole must be true. The fallacy of division claims that because the whole is true, all parts of the whole must be true.

An example of composition: just because Stephen Curry and Kevin Durant are on Golden State and they are good players, the team will surely win championships.

An example of division: “On average, men tend to have longer legs than women. So, this means that Maria has shorter legs than Jerry.” This is false because, for all we know, Maria might have longer legs than Jerry.

  • Equivocation

The fallacy of equivocation uses a key word in an ambiguous way: the word means one thing in one claim and something different in another.

For example, Poppy claims that she has the right to watch whatever she wants to watch. Therefore, it’s just right that she watches television all day.

Fallacies of Unacceptable Premises, on the other hand, introduce premises that are somewhat relevant but do not adequately support the argument’s conclusion.

Under Fallacies of Unacceptable Premises are:

  • False Dichotomy

This fallacy says that there are only two available options and that only one of them is correct. In short, it creates a black-or-white choice: both cannot be correct, and they are the only possible options.

  • Begging the Question

This critical thinking fallacy assumes that the premise under examination is true. It uses this assumed true premise to support other statements.

  • Slippery Slope

Slippery Slope claims that once one step is taken, it will quickly snowball into something much bigger. For example, Joe argues that if Maria stops using straws, the economy will quickly collapse.

  • Hasty Generalizations

This fallacy draws generalizations from hasty samples. The generalization may be based on a sample that is too small or that doesn’t properly represent the group in question.

Knowing and studying fallacies is important because this will help people avoid committing them. The presence of critical thinking fallacies weakens and invalidates arguments, so it’s best to steer clear from them.

When someone knows how to identify these fallacies, it will be easier to point out invalid arguments by other people as well. We live in a world where fallacies are often used in arguments – in fact, up to 13 fallacies were already listed and studied in Ancient Greece thousands of years ago.

This is just a testament that these fallacies have existed before and continue to exist now, so we must be aware of them.


William R. Klemm Ph.D.

Preventing Common Thinking Errors

We would all benefit from thinking more about how we think.

Posted July 18, 2013


Everybody makes thinking errors. But two things have happened recently that prompt me to write on this subject. One is politics, because in the U.S. it seems like almost everything has become politicized. Politicians, of whatever stripe, seemingly can’t speak for more than 30 seconds without making a thinking error. Sometimes it is deliberate; sometimes they are just clueless.

The other prompt comes from the new approach to science-education standards, as promoted by “A Framework for K-12 Science Education” and the “Next Generation Science Standards” (NRC 2012; Achieve, Inc. 2013). Practicing scientists know that the really important part of “doing” science is creative and critical thinking, and it is refreshing to see that science-education policy makers are trying to make such thinking a more prominent feature of K-12 science education. The new standards require students to perform such thinking tasks as “define problems, plan investigations, analyze and interpret data, construct explanations, engage in argument from evidence, and communicate information.” These can be demanding challenges for students, who are not trained in formal logic and who have probably had little explicit instruction in how to think. Teachers may not be ready for this new approach to teaching science, because teacher-education programs do not include formal logic courses.

Science thinking errors also apply to ordinary life. So, in the spirit of my own 50 years of experience as a researcher, let me summarize some of the more common and important thinking errors that even my colleagues and I sometimes make. Hopefully, this listing will improve the sloppy thinking of politicians, help teachers recognize weaknesses in student argumentation, and help students learn to recognize and avoid such errors. All of us could benefit from thinking more about how we think.

Common Thinking Errors

AD HOMINEM ARGUMENT: discounting a position or conclusion on the basis of the person who makes it, rather than the merits of the argument itself.

ALL-OR-NOTHING THINKING: thinking of things in absolute terms, like “always”, “all” or “never”.

ANTHROPOMORPHISM : to attribute qualities and properties that only people can have to non-people. Example: “the purpose of evolution is to ….” Evolution happens, but not because it has a purpose. Only people have purposes.

APPEAL TO AUTHORITY: attempts to justify the conclusion by quoting an authority in its support. Even experts can be wrong. Conclusions should be affirmed by evidence.

APPEAL TO CONSENSUS: arguments defended on the basis that many people hold the same view. This is sometimes called the “Bandwagon Fallacy.” Correctness of a position does not depend on who or how many hold it.

APPEAL TO FINAL CONSEQUENCES: claiming validity for one’s position on the basis of the expected outcome or consequence (also known as a teleological argument). Example: people have free will because otherwise they can’t be held responsible for bad behavior.

APPEAL TO IGNORANCE : using an opponent’s inability to disprove a conclusion as proof of the conclusion’s correctness. Sometimes wrong ideas are so entrenched or hard to disprove that people of special ability are needed to make the case against such ideas.

APPEAL TO LACK OF EVIDENCE: evidence is needed to affirm a concept, but a lack of evidence can occur simply because the evidence has not yet been discovered. Absence of evidence is not evidence of absence.

ARGUMENT SELECTIVITY: using arguments supporting your position while glossing over the weaknesses and leaving out important alternative arguments. This is often called “cherry picking.” A variation of this error is “false dichotomy,” where a set of valid possibilities is reduced to only two. A related inappropriate selectivity is rejecting an idea altogether just because some part of it is wrong.

BEGGING THE QUESTION: an argument simply reasserts the conclusion in another form. This usually occurs when there is a lack of good evidence.

BIASED LABELING: how one labels a position can prejudice objective consideration of the position. For example, calling a position “science-based” does not necessarily make it true. Conversely, calling a position “colloquial” does not necessarily invalidate it.

CIRCULAR REASONING: reasoning where a belief in a central claim is both the starting point and the goal of the argument.

CONFIRMATION BIAS: people have a natural tendency to notice only the facts that support their position while discounting those that do not — in other words, believing what you want to believe.


CONFUSING CORRELATION WITH CAUSATION. When two things happen together, and especially when one occurs just before the other, students commonly think that one thing causes the other. Without other more direct evidence of causation, this assumption is invalid. Both events could be caused by something else. In case students need convincing, just remind them of this example: rain and lightning go together, but neither causes the other.

CONFUSING FORCE OF ARGUMENT WITH ITS VALIDITY: repeating erroneous argument does not validate it. Saying it more elegantly or louder doesn’t help either.

DEDUCTION FALLACIES: a valid deductive argument must have consistent premises and conclusions (they must be true together or false together). Failure to be consistent produces “non-sequiturs,” that is, conclusions that are not logical extensions of the premises.

EMOTIONAL REASONING: Making decisions and arguments based on how you feel rather than objective reality. People who allow themselves to get caught up in emotional reasoning can become completely blinded to the difference between feelings and facts. For example, scientists sometimes unduly value a position because it is “parsimonious,” or elegant, or easily understood (or even complex and sophisticated), etc.

EXCLUSIVITY CONFUSION. When several apparent ideas or facts are examined, it is important to know whether they are independent, compatible, or mutually exclusive. Example: concepts of evolution and creationism, as they are typically used, are mutually exclusive. However, stated in other ways, they might be more compatible.

FALSE ANALOGY: explaining an idea with an analogy that is not parallel, as in comparing apples and oranges.

JUMPING TO CONCLUSIONS. This error occurs under a variety of situations. The most common cause is failure to consider alternatives. An associated cause is failure to question and test assumptions used to arrive at a conclusion.

MAGNIFICATION & MINIMIZATION: exaggerating negatives and understating positives. One should be aware of how easy it is to exaggerate the positives of a position and understate the negatives.

MISSING THE POINT. Sometimes this happens unintentionally. But frequently recognition that one’s argument is weak creates the temptation to shift focus away from the central issue to related areas where one can make a stronger argument.

NOT LISTENING. Have a clear notion of the issue and the stance that others are taking. If you have to read another’s mind or “read between the lines,” seek clarification lest you end up putting your words in somebody else’s mouth.

OVER-GENERALIZATION. It is illogical to assume that what is true for one is true for all. Example: some scientists studying free will claim that the decision-making process for making a button press is the same for more complex decisions.

UNSUPPORTED ASSERTION. Recognize when a claim is made without supporting evidence. This also occurs when one confuses a judgment or opinion for a fact.

National Research Council (NRC). 2012. A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.

Achieve, Inc. 2013. Next generation science standards. www.nextgenscience.org/next-generation-science-standards.


William Klemm , Ph.D ., is a senior professor of Neuroscience at Texas A&M University.


Journal of Intelligence (PMC10672018)

Critical Thinking, Intelligence, and Unsubstantiated Beliefs: An Integrative Review

Associated data.

This research did not involve collection of original data, and hence there are no new data to make available.

A review of the research shows that critical thinking is a more inclusive construct than intelligence, going beyond what general cognitive ability can account for. For instance, critical thinking can more completely account for many everyday outcomes, such as how thinkers reject false conspiracy theories, paranormal and pseudoscientific claims, psychological misconceptions, and other unsubstantiated claims. Deficiencies in the components of critical thinking (in specific reasoning skills, dispositions, and relevant knowledge) contribute to unsubstantiated belief endorsement in ways that go beyond what standardized intelligence tests test. Specifically, people who endorse unsubstantiated claims less tend to show better critical thinking skills, possess more relevant knowledge, and are more disposed to think critically. They tend to be more scientifically skeptical and possess a more rational–analytic cognitive style, while those who accept unsubstantiated claims more tend to be more cynical and adopt a more intuitive–experiential cognitive style. These findings suggest that for a fuller understanding of unsubstantiated beliefs, researchers and instructors should also assess specific reasoning skills, relevant knowledge, and dispositions which go beyond what intelligence tests test.

1. Introduction

Why do some people believe implausible claims, such as the QAnon conspiracy theory, that a cabal of liberals is kidnapping and trafficking many thousands of children each year, despite the lack of any credible supporting evidence? Are believers less intelligent than non-believers? Do they lack knowledge of such matters? Are they more gullible or less skeptical than non-believers? Or, more generally, are they failing to think critically?

Understanding the factors contributing to acceptance of unsubstantiated claims is important, not only to the development of theories of intelligence and critical thinking but also because many unsubstantiated beliefs are false, and some are even dangerous. Endorsing them can have a negative impact on an individual and society at large. For example, false beliefs about the COVID-19 pandemic, such as believing that 5G cell towers induced the spread of the COVID-19 virus, led some British citizens to set fire to 5G towers ( Jolley and Paterson 2020 ). Other believers in COVID-19 conspiracy theories endangered their own and their children’s lives when they refused to socially distance and be vaccinated with highly effective vaccines, despite the admonitions of scientific experts ( Bierwiaczonek et al. 2020 ). Further endangering the population at large, those who believe the false conspiracy theory that human-caused global warming is a hoax likely fail to respond adaptively to this serious global threat ( van der Linden 2015 ). Parents, who uncritically accept pseudoscientific claims, such as the false belief that facilitated communication is an effective treatment for childhood autism, may forego more effective treatments ( Lilienfeld 2007 ). Moreover, people in various parts of the world still persecute other people whom they believe are witches possessing supernatural powers. Likewise, many people still believe in demonic possession, which has been associated with mental disorders ( Nie and Olson 2016 ). Compounding the problems created by these various unsubstantiated beliefs, numerous studies now show that when someone accepts one of these types of unfounded claims, they tend to accept others as well; see Bensley et al. ( 2022 ) for a review.

Studying the factors that contribute to unfounded beliefs is important not only because of their real-world consequences but also because this can facilitate a better understanding of unfounded beliefs and how they are related to critical thinking and intelligence. This article focuses on important ways in which critical thinking and intelligence differ, especially in terms of how a comprehensive model of CT differs from the view of intelligence as general cognitive ability. I argue that this model of CT more fully accounts for how people can accurately decide if a claim is unsubstantiated than can views of intelligence that emphasize general cognitive ability. In addition to general cognitive ability, thinking critically about unsubstantiated claims involves deployment of specific reasoning skills, dispositions related to CT, and specific knowledge, which go beyond the contribution of general cognitive ability.

Accordingly, this article begins with an examination of the constructs of critical thinking and intelligence. Then, it discusses theories proposing that understanding thinking in the real world requires going beyond general cognitive ability. Specifically, the focus is on factors related to critical thinking, such as specific reasoning skills, dispositions, metacognition, and relevant knowledge. I review research showing that this alternative multidimensional view of CT can better account for individual differences in the tendency to endorse multiple types of unsubstantiated claims than can general cognitive ability alone.

2. Defining Critical Thinking and Intelligence

Critical thinking is an almost universally valued educational objective in the US and in many other countries which seek to improve it. In contrast, intelligence, although much valued, has often been viewed as a more stable characteristic and less amenable to improvement through specific short-term interventions, such as traditional instruction or more recently through practice on computer-implemented training programs. According to Wechsler’s influential definition, intelligence is a person’s “aggregate or global capacity to act purposefully, to think rationally, and to deal effectively with his environment” ( Wechsler 1944, p. 3 ).

Consistent with this definition, intelligence has long been associated with general cognitive or intellectual ability and the potential to learn and reason well. Intelligence (IQ) tests measure general cognitive abilities, such as knowledge of words, memory skills, analogical reasoning, speed of processing, and the ability to solve verbal and spatial problems. General intelligence or “g” is a composite of these abilities statistically derived from various cognitive subtests on IQ tests which are positively intercorrelated. There is considerable overlap between g and the concept of fluid intelligence (Gf) in the prominent Cattell–Horn–Carroll model ( McGrew 2009 ), which refers to “the ability to solve novel problems, the solution of which does not depend on previously acquired skills and knowledge,” and crystallized intelligence (Gc), which refers to experience, existing skills, and general knowledge ( Conway and Kovacs 2018, pp. 50–51 ). Although g or general intelligence is based on a higher order factor, inclusive of fluid and crystallized intelligence, it is technically not the same as general cognitive ability, a commonly used, related term. However, in this article, I use “general cognitive ability” and “cognitive ability” because they are the imprecise terms frequently used in the research reviewed.

Although IQ scores have been found to predict performance in basic real-world domains, such as academic performance and job success ( Gottfredson 2004 ), an enduring question for intelligence researchers has been whether g and intelligence tests predict the ability to adapt well in other real-world situations, which concerns the second part of Wechsler’s definition. So, in addition to the search for the underlying structure of intelligence, researchers have been perennially concerned with how general abilities associated with intelligence can be applied to help a person adapt to real-world situations. The issue is largely a question of how cognitive ability and intelligence can help people solve real-world problems and cope adaptively and succeed in dealing with various environmental demands ( Sternberg 2019 ).

Based on broad conceptual definitions of intelligence and critical thinking, both intelligence and CT should aid adaptive functioning in the real world, presumably because they both involve rational approaches. Their common association with rationality gives each term a positive connotation. However, complicating the definition of each of these is the fact that rationality also continues to have a variety of meanings. In this article, in agreement with Stanovich et al. ( 2018 ), rationality is defined in the normative sense, used in cognitive science, as the distance between a person’s response and some normative standard of optimal behavior. As such, degree of rationality falls on a continuous scale, not a categorical one.

Despite disagreements surrounding the conceptual definitions of intelligence, critical thinking, and rationality, a commonality in these terms is that they are value-laden and normative. In the case of intelligence, people are judged based on norms from standardized intelligence tests, especially in academic settings. Although scores on CT tests are seldom, if ever, used to judge individuals in this way, the normative and value-laden basis of CT is apparent in people’s informal judgments. They often judge others who have made poor decisions to be irrational or to have failed to think critically.

This value-laden aspect of CT is also apparent in formal definitions of CT. Halpern and Dunn ( 2021 ) defined critical thinking as “the use of those cognitive skills or strategies that increase the probability of a desirable outcome. It is used to describe thinking that is purposeful, reasoned, and goal-directed.” The positive conception of CT as helping a person adapt well to one’s environment is clearly implied in “desirable outcome”.

Robert Ennis ( 1987 ) has offered a simpler, yet useful definition of critical thinking that also has normative implications. According to Ennis, “critical thinking is reasonable, reflective thinking focused on deciding what to believe or do” ( Ennis 1987, p. 102 ). This definition implies that CT helps people know what to believe (a goal of epistemic rationality) and how to act (a goal of instrumental rationality). This is conveyed by associating “critical thinking” with the positive terms, “reasonable” and “reflective”. Dictionaries commonly define “reasonable” as “rational”, “logical”, “intelligent”, and “good”, all terms with positive connotations.

For critical thinkers, being reasonable involves using logical rules, standards of evidence, and other criteria that must be met for a product of thinking to be considered good. Critical thinkers use these to evaluate how strongly reasons or evidence supports one claim versus another, drawing conclusions which are supported by the highest quality evidence ( Bensley 2018 ). If no high-quality evidence is available for consideration, it would be unreasonable to draw a strong conclusion. Unfortunately, people’s beliefs are too often based on acceptance of unsubstantiated claims. This is a failure of CT, but is it also a failure of intelligence?

3. Does Critical Thinking “Go Beyond” What Is Meant by Intelligence?

Despite the conceptual overlap in intelligence and CT at a general level, one way that CT can be distinguished from the common view of intelligence as general cognitive ability is in terms of what each can account for. Although intelligence tests, especially measures of general cognitive ability, have reliably predicted academic and job performance, they may not be sufficient to predict other everyday outcomes for which CT measures have made successful predictions and have added to the variance accounted for in performance. For instance, replicating a study by Butler ( 2012 ), Butler et al. ( 2017 ) obtained a negative correlation ( r = −0.33) between scores on the Halpern Critical Thinking Appraisal (HCTA) and a measure of 134 negative, real-world outcomes, not expected to befall critical thinkers, such as engaging in unprotected sex or posting a message on social media which the person regretted. They found that higher HCTA scores not only predicted better life decisions, but also predicted better performance beyond a measure of general cognitive ability. These results suggest that CT can account for real-world outcomes and goes beyond general cognitive ability to account for additional variance.
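The incremental-validity logic behind studies like Butler et al. ( 2017 ) can be sketched as a hierarchical regression: fit the outcome on cognitive ability alone, then on ability plus the CT measure, and compare the variance accounted for. The example below is a minimal illustration using simulated data with made-up coefficients (not the actual study data), shown only to make the ΔR² computation concrete.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated scores (illustrative only, not real data):
# CT skill is moderately correlated with cognitive ability,
# and negative life outcomes depend on both.
ability = rng.normal(size=n)
ct_skill = 0.5 * ability + rng.normal(scale=0.9, size=n)
neg_outcomes = -0.3 * ability - 0.4 * ct_skill + rng.normal(size=n)

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_ability = r_squared(ability[:, None], neg_outcomes)
r2_both = r_squared(np.column_stack([ability, ct_skill]), neg_outcomes)
delta_r2 = r2_both - r2_ability  # variance uniquely added by the CT measure
print(f"R^2, ability only: {r2_ability:.3f}")
print(f"R^2, ability + CT: {r2_both:.3f}")
print(f"Incremental R^2:   {delta_r2:.3f}")
```

A positive ΔR² is what "accounts for additional variance beyond general cognitive ability" means operationally; with real data, its reliability would be evaluated with a nested-model significance test rather than simulation.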

Some theorists maintain that standardized intelligence tests do not capture the variety of abilities that people need to adapt well in the real world. For example, Gardner ( 1999 ) has proposed that additional forms of intelligence are needed, such as spatial, musical, and interpersonal intelligences in addition to linguistic and logical–mathematical intelligences, more typically associated with general cognitive ability and academic success. In other theorizing, Sternberg ( 1988 ) has proposed three additional types of intelligence: analytical, practical, and creative intelligence, to more fully capture the variety of intelligent abilities on which people differ. Critical thinking is considered part of analytical skills which involve evaluating the quality and applicability of ideas, products, and options ( Sternberg 2022 ). Regarding adaptive intelligence, Sternberg ( 2019 ) has emphasized how adaptive aspects of intelligence are needed to solve real-world problems both at the individual and species levels. According to Sternberg, core components of intelligence have evolved in humans, but intelligence takes different forms in different cultures, with each culture valuing its own skills for adaptation. Thus, the construct of intelligence must go beyond core cognitive ability to encompass the specific abilities needed for adaptive behavior in specific cultures and settings.

Two other theories propose that other components be added to intelligent and rational thinking. Ackerman ( 2022 ) has emphasized the importance of acquiring domain-specific knowledge for engaging in intelligent functioning in the wide variety of tasks found in everyday life. Ackerman has argued that declarative, procedural, and tacit knowledge, as well as non-ability variables, are needed to better predict job performance and performance of other everyday activities. Taking another approach, Halpern and Dunn ( 2021 ) have proposed that critical thinking is essentially the adaptive application of intelligence for solving real-world problems. Elsewhere, Butler and Halpern ( 2019 ) have argued that dispositions such as open-mindedness are another aspect of CT and that domain-specific knowledge and specific CT skills are needed to solve real-world problems.

Examples are readily available for how CT goes beyond what IQ tests test to include specific rules for reasoning and relevant knowledge needed to execute real-world tasks. Take the example of scientific reasoning, which can be viewed as a specialized form of CT. Drawing a well-reasoned inductive conclusion about a theory or analyzing the quality of a research study both require that a thinker possess relevant specialized knowledge related to the question and specific reasoning skills for reasoning about scientific methodology. In contrast, IQ tests are deliberately designed to be nonspecialized in assessing Gc, broadly sampling vocabulary and general knowledge in order to be fair and unbiased ( Stanovich 2009 ). Specialized knowledge and reasoning skills are also needed in non-academic domains. Jurors must possess specialized knowledge to understand expert, forensic testimony and specific reasoning skills to interpret the law and make well-reasoned judgments about a defendant’s guilt or innocence.

Besides lacking specific reasoning skills and domain-relevant knowledge, people may fail to think critically because they are not disposed to use their reasoning skills to examine such claims and want to preserve their favored beliefs. Critical thinking dispositions are attitudes or traits that make it more likely that a person will think critically. Theorists have proposed numerous CT dispositions (e.g., Bensley 2018 ; Butler and Halpern 2019 ; Dwyer 2017 ; Ennis 1987 ). Some commonly identified CT dispositions especially relevant to this discussion are open-mindedness, skepticism, intellectual engagement, and the tendency to take a reflective, rational–analytic approach. Critical thinking dispositions are clearly value-laden and prescriptive. A good thinker should be open-minded, skeptical, reflective, intellectually engaged, and value a rational–analytic approach to inquiry. Conversely, corresponding negative dispositions, such as “close-mindedness” and “gullibility”, could obstruct CT.

Without the appropriate disposition, individuals will not use their reasoning skills to think critically about questions. For example, the brilliant mystery writer, Sir Arthur Conan Doyle, who was trained as a physician and created the hyper-reasonable detective Sherlock Holmes, was not disposed to think critically about some unsubstantiated claims. Conan Doyle was no doubt highly intelligent in cognitive ability terms, but he was not sufficiently skeptical (disposed to think critically) about spiritualism. He believed that he was talking to his dearly departed son through a medium, despite the warnings of his magician friend, Harry Houdini, who told him that mediums used trickery in their seances. Perhaps influenced by his Irish father’s belief in the “wee folk”, Conan Doyle also believed that fairies inhabited the English countryside, based on children’s photos, despite the advice of experts who said the photos could be faked. Nevertheless, he was skeptical of a new theory of tuberculosis proposed by Koch when he reported on it, despite his wife suffering from the disease. So, in professional capacities, Conan Doyle used his CT skills, but in certain other domains for which he was motivated to accept unsubstantiated claims, he failed to think critically, insufficiently disposed to skeptically challenge certain implausible claims.

This example makes two important points. Conan Doyle’s superior intelligence was not enough for him to reject implausible claims about the world. In general, motivated reasoning can lead people, even those considered highly intelligent, to accept claims with no good evidentiary support. The second important point is that we would not be able to adequately explain cases like this one, considering only the person’s intelligence or even their reasoning skills, without also considering the person’s disposition. General cognitive ability alone is not sufficient, and CT dispositions should also be considered.

Supporting this conclusion, Stanovich and West ( 1997 ) examined the influence of dispositions beyond the contribution of cognitive ability on a CT task. They gave college students an argument evaluation test in which participants first rated their agreement with several claims about real social and political issues made by a fictitious person. Then, they gave them evidence against each claim and finally asked them to rate the quality of a counterargument made by the same fictitious person. Participants’ ratings of the counterarguments were compared to the median ratings of expert judges on the quality of the rebuttals. Stanovich and West also administered a new measure of rational disposition called the Actively Open-minded Thinking (AOT) scale and the SAT as a proxy for cognitive ability. The AOT was a composite of items from several other scales that would be expected to measure CT disposition. They found that both SAT and AOT scores were significant predictors of higher argument analysis scores. Even after partialing out cognitive ability, actively open-minded thinking was significant. These results suggest that general cognitive ability alone was not sufficient to account for thinking critically about real-world issues and that CT disposition was needed to go beyond it.
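The "partialing out" step in the Stanovich and West analysis can be sketched as a partial correlation: regress both the disposition measure and the argument-evaluation scores on the cognitive ability proxy, then correlate the residuals. The example below uses simulated data; the variable names and coefficients are hypothetical stand-ins, not the study's actual measures or values.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400

# Simulated scores (hypothetical, for illustration):
# SAT (ability proxy) and AOT (open-minded thinking) each
# contribute independently to argument-evaluation performance.
sat = rng.normal(size=n)
aot = 0.3 * sat + rng.normal(scale=0.95, size=n)
arg_eval = 0.4 * sat + 0.35 * aot + rng.normal(size=n)

def residualize(y, x):
    """Residuals of y after regressing out x (with an intercept)."""
    X = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Partial correlation of disposition with performance, controlling for ability:
r_zero_order = np.corrcoef(aot, arg_eval)[0, 1]
r_partial = np.corrcoef(residualize(aot, sat), residualize(arg_eval, sat))[0, 1]
print(f"zero-order r(AOT, eval):    {r_zero_order:.3f}")
print(f"partial r(AOT, eval | SAT): {r_partial:.3f}")
```

A partial correlation that remains clearly above zero, as in this simulation, is the pattern that supports the claim that disposition predicts CT performance beyond cognitive ability.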

Further examining the roles of CT dispositions and cognitive ability on reasoning, Stanovich and West ( 2008 ) studied myside bias, a bias in reasoning closely related to one-sided thinking and confirmation bias. A critical thinker would be expected to not show myside bias and instead fairly evaluate evidence on all sides of a question. Stanovich and West ( 2007 ) found that college students often showed myside bias when asked their opinions about real-world policy issues, such as those concerning the health risks of smoking and drinking alcohol. For example, compared to non-smokers, smokers judged the health risks of smoking to be lower. When they divided participants into higher versus lower cognitive ability groups based on SAT scores, the two groups showed little difference on myside bias. Moreover, on the hazards of drinking issue, participants who drank less had higher scores on the CT disposition measure.

Other research supports the need for both reasoning ability and CT disposition in predicting outcomes in the real world. Ren et al. ( 2020 ) found that CT disposition, as measured by a Chinese critical thinking disposition inventory, and a CT skill measure together contributed a significant amount of the variance in predicting academic performance beyond the contribution of cognitive ability alone, as measured by a test of fluid intelligence. Further supporting the claim that CT requires both cognitive ability and CT disposition, Ku and Ho ( 2010 ) found that a CT disposition measure significantly predicted scores on a CT test beyond the significant contribution of verbal intelligence in high school and college students from Hong Kong.

The contribution of dispositions to thinking is related to another way that CT goes beyond the application of general cognitive ability, i.e., by way of the motivation for reasoning. Assuming that all reasoning is motivated ( Kunda 1990 ), then CT is motivated, too, which is implicit within the Halpern and Dunn ( 2021 ) and Ennis ( 1987 ) definitions. Critical thinking is motivated in the sense of being purposeful and directed towards the goal of arriving at an accurate conclusion. For instance, corresponding to pursuit of the goal of accurate reasoning, the CT disposition of “truth-seeking” guides a person towards reaching the CT goal of arriving at an accurate conclusion.

Also, according to Kunda ( 1990 ), a second type of motivated reasoning can lead to faulty conclusions, often by directing a person towards the goal of maintaining favored beliefs and preconceptions, as in illusory correlation, belief perseverance, and confirmation bias. Corresponding to this second type, negative dispositions, such as close-mindedness and self-serving motives, can incline thinkers towards faulty conclusions. This is especially relevant in the present discussion because poorer reasoning, thinking errors, and the inappropriate use of heuristics are related to the endorsement of unsubstantiated claims, all of which are CT failures. The term “thinking errors” is a generic term referring to logical fallacies, informal reasoning fallacies, argumentation errors, and inappropriate uses of cognitive heuristics ( Bensley 2018 ). Heuristics are cognitive shortcuts, commonly used to simplify judgment tasks and reduce mental effort. Yet, when used inappropriately, heuristics often result in biased judgments.

Stanovich ( 2009 ) has argued that IQ tests do not test people’s use of heuristics, but reliance on heuristics has been found to be negatively correlated with CT performance ( West et al. 2008 ). In this same study, they found that college students’ cognitive ability, as measured by performance on the SAT, was not correlated with thinking biases associated with use of heuristics. Although Stanovich and West ( 2008 ) found that susceptibility to biases, such as the conjunction fallacy, framing effect, base-rate neglect, affect bias, and myside bias, was uncorrelated with cognitive ability (using SAT as a proxy), other types of thinking errors were correlated with SAT.

Likewise, two types of knowledge are related to the two forms of motivated reasoning. For instance, inaccurate knowledge, such as misconceptions, can derail reasoning from moving towards a correct conclusion, as when a person reasons from false premises. In contrast, reasoning from accurate knowledge is more likely to produce an accurate conclusion. Taking into account inaccurate knowledge and thinking errors is important to understanding the endorsement of unsubstantiated claims because these are also related to negative dispositions, such as close-mindedness and cynicism, none of which are measured by intelligence tests.

Critical thinking questions are often situated in real-world examples or in simulations of them which are designed to detect thinking errors and bias. As described in Halpern and Butler ( 2018 ), an item like one on the “Halpern Critical Thinking Appraisal” (HCTA) provides respondents with a mock newspaper story about research showing that first-graders who attended preschool were better able to learn how to read. Then the question asks if preschool should be made mandatory. A correct response to this item requires recognizing that correlation does not imply causation, that is, avoiding a common reasoning error people make in thinking about research implications in everyday life. Another CT skills test, “Analyzing Psychological Statements” (APS), assesses the ability to recognize thinking errors and apply argumentation skills and psychology to evaluate psychology-related examples and simulations of real-life situations ( Bensley 2021 ). For instance, besides identifying thinking errors in brief samples of thinking, questions ask respondents to distinguish arguments from non-arguments, find assumptions in arguments, evaluate kinds of evidence, and draw a conclusion from a brief psychological argument. An important implication of the studies just reviewed is that efforts to understand CT can be further informed by assessing thinking errors and biases, which, as the next discussion shows, are related to individual differences in thinking dispositions and cognitive style.

4. Dual-Process Theory Measures and Unsubstantiated Beliefs

Dual-process theory (DPT) and measures associated with it have been widely used in the study of the endorsement of unsubstantiated beliefs, especially as they relate to cognitive style. According to a cognitive style version of DPT, people have two modes of processing, a fast intuitive–experiential (I-E) style of processing and a slower, reflective, rational–analytic (R-A) style of processing. The intuitive cognitive style is associated with reliance on hunches, feelings, personal experience, and cognitive heuristics which simplify processing, while the R-A cognitive style is a reflective, rational–analytic style associated with more elaborate and effortful processing ( Bensley et al. 2022 ; Epstein 2008 ). As such, the rational–analytic cognitive style is consistent with CT dispositions, such as those promoting the effortful analysis of evidence, objective truth, and logical consistency. In fact, CT is sometimes referred to as “critical-analytic” thinking ( Byrnes and Dunbar 2014 ) and has been associated with analytical intelligence ( Sternberg 1988 ) and with rational thinking, as discussed before.

People use both modes of processing, but they show individual differences in which mode they tend to rely upon, although the intuitive–experiential mode is the default ( Bensley et al. 2022 ; Morgan 2016 ; Pacini and Epstein 1999 ), and they accept unsubstantiated claims differentially based on their predominant cognitive style ( Bensley et al. 2022 ; Epstein 2008 ). Specifically, individuals who rely more on an I-E cognitive style tend to endorse unsubstantiated claims more strongly, while individuals who rely more on a R-A cognitive style tend to endorse those claims less. Note, however, that other theorists view the two processes and cognitive styles somewhat differently (e.g., Kahneman 2011 ; Stanovich et al. 2018 ).

Researchers have often assessed the contribution of these two cognitive styles to endorsement of unsubstantiated claims, using variants of three measures: the Cognitive Reflection Test (CRT) of Frederick ( 2005 ), the Rational–Experiential Inventory of Epstein and his colleagues ( Pacini and Epstein 1999 ), and the related Need for Cognition scale of Cacioppo and Petty ( 1982 ). The CRT is a performance-based test which asks participants to solve problems that appear to require simple mathematical calculations, but which actually require more reflection. People typically do poorly on the CRT, which is thought to indicate reliance on an intuitive cognitive style, while better performance is thought to indicate reliance on the slower, more deliberate, and reflective cognitive style. The positive correlation of the CRT with numeracy scores suggests it also has a cognitive skill component ( Patel et al. 2019 ). The Rational–Experiential Inventory (REI) of Pacini and Epstein ( 1999 ) contains one scale designed to measure an intuitive–experiential cognitive style and a second scale intended to measure a rational–analytic (R-A) style. The R-A scale was adapted from the Need for Cognition (NFC) scale of Cacioppo and Petty ( 1982 ), another scale associated with rational–analytic thinking and expected to be negatively correlated with unsubstantiated beliefs. The NFC was found to be related to open-mindedness and intellectual engagement, two CT dispositions ( Cacioppo et al. 1996 ).

The cognitive styles associated with DPT also relate to CT dispositions. Thinking critically requires that individuals be disposed to use their reasoning skills to reject unsubstantiated claims ( Bensley 2018 ) and that they be inclined to take a rational–analytic approach rather than relying on their intuitions and feelings. For instance, Bensley et al. ( 2014 ) found that students who endorsed more psychological misconceptions adopted a more intuitive cognitive style, were less disposed to take a rational–scientific approach to psychology, and scored lower on a psychological critical thinking skills test. Further supporting this connection, West et al. ( 2008 ) found that participants who tended to use cognitive heuristics more, thought to be related to intuitive processing and bias, scored lower on a critical thinking measure. As the Bensley et al. ( 2014 ) results suggest, in addition to assessing reasoning skills and dispositions, comprehensive CT assessment research should assess knowledge and unsubstantiated beliefs because these are related to failures of critical thinking.

5. Assessing Critical Thinking and Unsubstantiated Beliefs

Assessing endorsement of unsubstantiated claims provides another way to assess CT outcomes related to everyday thinking, which goes beyond what intelligence tests test ( Bensley and Lilienfeld 2020 ). From the perspective of the multi-dimensional model of CT, endorsement of unsubstantiated claims could result from deficiencies in a person’s CT reasoning skills, a lack of relevant knowledge, or the engagement of inappropriate dispositions. Suppose an individual endorses an unsubstantiated claim, such as believing the conspiracy theory that human-caused global warming is a hoax. The person may lack the specific reasoning skills needed to critically evaluate the conspiracy. Lantian et al. ( 2020 ) found that scores on a CT skills test were negatively correlated with conspiracy theory beliefs. The person may also lack relevant scientific knowledge, such as the facts that each year humans pump about 40 billion metric tons of carbon dioxide into the atmosphere and that carbon dioxide is a greenhouse gas which traps heat in the atmosphere. Or the person may be insufficiently skeptical of the claim, or too cynical and mistrustful of scientists or governmental officials.

Although endorsing unsubstantiated beliefs is clearly a failure of CT, problems arise in deciding which ones are unsubstantiated, especially when considering conspiracy theories. Typically, the claims which critical thinkers should reject as unsubstantiated are those which are not supported by objective evidence. But of the many conspiracies proposed, few are vigorously examined. Moreover, some conspiracy theories which authorities might initially deny turn out to be real, such as the MK-Ultra theory that the CIA was secretly conducting mind-control research on American citizens.

A way out of this quagmire is to define unsubstantiated beliefs on a continuum which depends on the quality of evidence. This has led to the definition of unsubstantiated claims as assertions which have not been supported by high-quality evidence ( Bensley 2023 ). Those which are supported have the kind of evidentiary support that critical thinkers are expected to value in drawing reasonable conclusions. Instead of insisting that a claim must be demonstrably false to be rejected, we adopt a more tentative acceptance or rejection of claims, based on how much good evidence supports them. Many claims are unsubstantiated because they have not yet been carefully examined and so totally lack support or they may be supported only by low quality evidence such as personal experience, anecdotes, or non-scientific authority. Other claims are more clearly unsubstantiated because they contradict the findings of high-quality research. A critical thinker should be highly skeptical of these.

Psychological misconceptions are one type of claim that can be more clearly unsubstantiated. Psychological misconceptions are commonsense psychological claims (folk theories) about the mind, brain, and behavior that are contradicted by the bulk of high-quality scientific research. Bensley et al. ( 2014 ) developed the Test of Psychological Knowledge and Misconceptions (TOPKAM), a 40-item, forced-choice measure with each item pairing a statement of a psychological misconception with an evidence-based alternative. They found that higher scores on the APS, the argument analysis test applying psychological concepts to analyze real-world examples, were associated with more correct answers on the TOPKAM. Other studies have found positive correlations between CT skills tests and other measures of psychological misconceptions ( McCutcheon et al. 1992 ; Kowalski and Taylor 2004 ). Bensley et al. ( 2014 ) also found that higher correct TOPKAM scores were positively correlated with scores on the Inventory of Thinking Dispositions in Psychology (ITDP) of Bensley ( 2021 ), a measure of the disposition to take a rational and scientific approach to psychology, but were negatively correlated with an intuitive cognitive style.

Bensley et al. ( 2021 ) conducted a multidimensional study, assessing beginner psychology students starting a CT course on their endorsement of psychological misconceptions, recognition of thinking errors, CT dispositions, and metacognition, before and after CT instruction. Two classes received explicit instruction involving considerable practice in argument analysis and scientific reasoning skills, with one class receiving CT instruction focused more on recognizing psychological misconceptions and a second class focused more on recognizing various thinking errors. Bensley et al. assessed both classes before and after instruction on the TOPKAM and on the Test of Thinking Errors (TOTE), a test of the ability to recognize in real-world examples 17 different types of thinking errors, such as confirmation bias, inappropriate use of the availability and representativeness heuristics, reasoning from ignorance/possibility, gambler’s fallacy, and hasty generalization ( Bensley et al. 2021 ). Correct TOPKAM and TOTE scores were positively correlated, and after CT instruction both were positively correlated with the APS, the CT test of argument analysis skills.

Bensley et al. found that after explicit instruction in CT skills, students improved significantly on both the TOPKAM and the TOTE, with those focusing on recognizing misconceptions improving the most. Also, the students who improved the most on the TOPKAM scored higher on the REI rational–analytic scale and on the ITDP, while those improving the most on the TOTE scored higher on the ITDP. The students receiving explicit CT skill instruction in recognizing misconceptions also significantly improved the accuracy of their metacognitive monitoring in estimating their TOPKAM scores after instruction.

Given that the two classes differed in neither GPA nor SAT scores (a proxy for general cognitive ability) before instruction, CT instruction accounted well for the improvement in recognition of thinking errors and misconceptions without recourse to intelligence. However, SAT scores were positively correlated with both TOTE and APS scores, suggesting that cognitive ability contributed to CT skill performance. These results replicated the earlier findings of Bensley and Spero (2014), who showed that explicit CT instruction improved performance on both CT skills tests and metacognitive monitoring accuracy while controlling for SAT scores, which were positively correlated with CT skills test performance.

Taken together, these findings suggest that cognitive ability contributes to performance on CT tasks but that CT instruction goes beyond it to further improve performance. As the results of Bensley et al. (2021) show, and as discussed next, thinking errors and bias from heuristics are CT failures that should also be assessed because they are related to endorsement of unsubstantiated beliefs and cognitive style.

6. Dual-Processing Theory and Research on Unsubstantiated Beliefs

Consistent with DPT, numerous other studies have obtained significant positive correlations between intuitive cognitive style and paranormal belief, often using the REI intuitive–experiential scale and the Revised Paranormal Belief Scale (RPBS) of Tobacyk (2004) (e.g., Genovese 2005; Irwin and Young 2002; Lindeman and Aarnio 2006; Pennycook et al. 2015; Rogers et al. 2018; Saher and Lindeman 2005). Studies have also found positive correlations between superstitious belief and intuitive cognitive style (e.g., Lindeman and Aarnio 2006; Maqsood et al. 2018). REI intuitive–experiential thinking style was also positively correlated with belief in complementary and alternative medicine (Lindeman 2011), conspiracy theory belief (Alper et al. 2020), and endorsement of psychological misconceptions (Bensley et al. 2014; Bensley et al. 2022).

Additional evidence for DPT comes from negative correlations between REI R-A and NFC scores and measures of unsubstantiated beliefs, although studies correlating these with measures of paranormal belief and conspiracy theory belief have shown mixed results. Supporting a relationship, REI rational–analytic and NFC scores significantly and negatively predicted paranormal belief (Lobato et al. 2014; Pennycook et al. 2012). Other studies have also obtained a negative correlation between NFC and paranormal belief (Lindeman and Aarnio 2006; Rogers et al. 2018; Stahl and van Prooijen 2018), but both Genovese (2005) and Pennycook et al. (2015) found that NFC was not significantly correlated with paranormal belief. Swami et al. (2014) found that although REI R-A scores were negatively correlated with conspiracy theory belief, NFC scores were not.

Researchers often refer to people who are doubtful of paranormal and other unfounded claims as “skeptics” and so have tested whether measures related to skepticism are associated with less endorsement of unsubstantiated claims. They typically view skepticism as a stance towards unsubstantiated claims taken by rational people who reject them (e.g., Lindeman and Aarnio 2006; Stahl and van Prooijen 2018), rather than as a disposition inclining a person to think critically about unsubstantiated beliefs (Bensley 2018).

Fasce and Pico (2019) conducted one of the few studies using a measure related to skeptical disposition, the Critical Thinking Disposition Scale (CTDS) of Sosu (2013), in relation to endorsement of unsubstantiated claims. They found that scores on the CTDS were negatively correlated with scores on the RPBS but not significantly correlated with either a measure of pseudoscience acceptance or a measure of conspiracy theory belief. However, the CRT was negatively correlated with both the RPBS and the pseudoscience measure. Because Fasce and Pico (2019) did not examine correlations with the Reflective Skepticism subscale of the CTDS, its contribution apart from the full-scale CTDS could not be determined.

To more directly test skepticism as a disposition, we recently assessed how well three new measures predicted college students’ endorsement of psychological misconceptions, paranormal claims, and conspiracy theories (Bensley et al. 2022). The dispositional measures included a measure of general skeptical attitude; the Scientific Skepticism Scale (SSS), which focused more on withholding acceptance of claims until high-quality scientific evidence supports them; and the Cynicism Scale (CS), which focused on doubting the sincerity of the motives of scientists and people in general. We found that although the general skepticism scale did not predict any of the unsubstantiated belief measures, SSS scores were a significant negative predictor of both paranormal belief and conspiracy theory belief. REI R-A scores were a less consistent negative predictor, REI I-E scores were a more consistent positive predictor, and, surprisingly, CS scores were the most consistent positive predictor of the unsubstantiated beliefs.

Researchers commonly assume that people who accept implausible, unsubstantiated claims are gullible or not sufficiently skeptical. For instance, van Prooijen (2019) has argued that conspiracy theory believers are more gullible (less skeptical) than non-believers, and he reviewed several studies supporting the claim that more gullible people tend to endorse conspiracy theories more. However, he did not report any studies in which a gullible disposition was directly measured.

Recently, we directly tested the gullibility hypothesis in relation to scientific skepticism (Bensley et al. 2023) using the Gullibility Scale of Teunisse et al. (2019), on which people skeptical of the paranormal had been shown to obtain lower scores. We found that scores on the Gullibility Scale and the Cynicism Scale were positively correlated, and both were significant positive predictors of unsubstantiated beliefs in general, consistent with an intuitive–experiential cognitive style. In contrast, we found that scores on the Cognitive Reflection Test, the Scientific Skepticism Scale, and the REI rational–analytic scale were all positively intercorrelated and were significant negative predictors of unsubstantiated beliefs in general, consistent with a rational–analytic/reflective cognitive style. Scientific skepticism scores negatively predicted general endorsement of unsubstantiated claims beyond the REI R-A scale, but neither the CTDS nor its Reflective Skepticism subscale was a significant predictor. These results replicated findings from the Bensley et al. (2022) study and supported an elaborated dual-process model of unsubstantiated belief. The SSS was not only a substantial negative predictor; it was also negatively correlated with the Gullibility Scale, as expected.

These results suggest that both CT-related dispositions and CT skills are related to endorsement of unsubstantiated beliefs. However, a measure of general cognitive ability or intelligence must be examined along with measures of CT and unsubstantiated beliefs to determine whether CT goes beyond intelligence in predicting unsubstantiated beliefs. In one of the few studies that also included a measure of cognitive ability, Stahl and van Prooijen (2018) found that dispositional characteristics helped account for acceptance of conspiracy theories and paranormal claims beyond cognitive ability. Using the Importance of Rationality Scale (IRS), a rational–analytic scale designed to measure skepticism towards unsubstantiated beliefs, they found that the IRS was negatively correlated with both paranormal belief and belief in conspiracy theories. In separate hierarchical regressions, cognitive ability was the strongest negative predictor of both paranormal belief and conspiracy belief; IRS scores in combination with cognitive ability negatively predicted endorsement of paranormal belief but did not significantly predict conspiracy theory belief. These results provided partial support that a measure of rational–analytic cognitive style related to skeptical disposition adds to the variance accounted for beyond cognitive ability in negatively predicting unsubstantiated belief.
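The hierarchical-regression logic described here tests whether a disposition measure accounts for variance in belief beyond cognitive ability by comparing R² before and after the disposition is added. A minimal sketch, assuming simulated data and illustrative effect sizes (the variable names ability/irs/belief mirror the narrative, not the study's dataset):

```python
# Hypothetical sketch of a two-step hierarchical regression:
# Step 1 enters cognitive ability; Step 2 adds a disposition score (e.g., IRS).
# The change in R-squared is the variance explained beyond cognitive ability.
# All data below are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 500
ability = rng.normal(size=n)                       # cognitive ability (z-scored)
irs = 0.4 * ability + rng.normal(size=n)           # disposition, correlated with ability
belief = -0.5 * ability - 0.3 * irs + rng.normal(size=n)  # paranormal belief

def r_squared(y, *predictors):
    """R-squared from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones_like(y), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_step1 = r_squared(belief, ability)              # Step 1: ability only
r2_step2 = r_squared(belief, ability, irs)         # Step 2: ability + disposition
delta_r2 = r2_step2 - r2_step1                     # incremental variance explained
print(f"R2 ability only: {r2_step1:.3f}; with IRS: {r2_step2:.3f}; delta R2: {delta_r2:.3f}")
```

A positive delta R² in step 2 is the pattern the reviewed studies report when a disposition predicts belief beyond cognitive ability; significance testing of the increment (an F-test) is omitted from this sketch.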

In another study that included a measure of cognitive ability, Cavojova et al. (2019) examined how CT-related dispositions and the Scientific Reasoning Scale (SRS) were related to a measure of paranormal, pseudoscientific, and conspiracy theory beliefs. The SRS of Drummond and Fischhoff (2017) likely measures CT skill in that it measures the ability to evaluate scientific research and evidence. As expected, the unsubstantiated belief measure was negatively correlated with the SRS and with a cognitive ability measure similar to Raven’s Progressive Matrices. Unsubstantiated beliefs were positively correlated with dogmatism (the opposite of open-mindedness) but not with REI rational–analytic cognitive style. The SRS was a significant negative predictor of both unsubstantiated belief and susceptibility to bias beyond the contribution of cognitive ability, but neither dogmatism nor analytic thinking was a significant predictor. Nevertheless, this study provides some support that a measure related to CT reasoning skill accounts for variance in unsubstantiated belief beyond cognitive ability.

The failure of this study to show a correlation between rational–analytic cognitive style and unsubstantiated beliefs, when some other studies have found significant correlations with it and related measures, has implications for the multidimensional assessment of unsubstantiated beliefs. One implication is that the REI rational–analytic scale may not be a strong predictor of unsubstantiated beliefs. In fact, we recently found that the Scientific Skepticism Scale was a stronger negative predictor (Bensley et al. 2022; Bensley et al. 2023), which suggests that other measures related to rational–analytic thinking styles should also be examined. This could help triangulate the contribution of self-report cognitive style measures to endorsement of unsubstantiated claims, recognizing that the use of self-report measures has a checkered history in psychological research. A second implication is that, once again, measures of critical thinking skill and cognitive ability were negative predictors of unsubstantiated belief, and so they, too, should be included in future assessments of unsubstantiated beliefs.

7. Discussion

This review provided different lines of evidence supporting the claim that CT goes beyond cognitive ability in accounting for certain real-world outcomes. Participants who thought more critically reported fewer of the problems in everyday functioning that critical thinkers would be expected to avoid. People who endorsed unsubstantiated claims less showed better CT skills, more accurate domain-specific knowledge, and less susceptibility to thinking errors and bias, and they were more disposed to think critically. More specifically, they tended to be more scientifically skeptical and to adopt a more rational–analytic cognitive style. In contrast, those who endorsed such claims more tended to be more cynical and to adopt an intuitive–experiential cognitive style. These characteristics go beyond what standardized intelligence tests test. In some studies, the CT measures accounted for additional variance beyond the variance contributed by general cognitive ability.

That is not to say that measures of general cognitive ability are not useful. As noted by Gottfredson (2004), “g” is a highly successful predictor of academic and job performance. More is known about g and Gf than about many other psychological constructs. On average, g is closely related to Gf, which is highly correlated with working memory (r = 0.70), a correlation that can be as high as r = 0.77 (r² = 0.60) based on a correlated two-factor model (Gignac 2014). Because modern working memory theory is, itself, a powerful theory (Chai et al. 2018), this lends construct validity to the fluid intelligence construct. Although cognitive scientists have clearly made progress in understanding the executive processes underlying intelligence, they have not yet identified the specific cognitive components of intelligence (Sternberg 2022). Moreover, theorists have acknowledged that intelligence must also include components beyond g, including domain-specific knowledge (Ackerman 2022; Conway and Kovacs 2018), which are not yet clearly understood.
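The parenthetical r² figure quoted for the two-factor model follows directly from squaring the correlation coefficient, since r² gives the proportion of variance two measures share:

```latex
% Shared variance is the square of the correlation coefficient:
r = 0.77 \;\Rightarrow\; r^{2} = 0.77^{2} \approx 0.59 \approx 60\%\ \text{shared variance}
```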

This review also pointed to limitations in the research that should be addressed. So far, few studies of unsubstantiated beliefs have included measures of intelligence, and those that did have often used proxies for intelligence test scores, such as SAT scores. Future studies, besides using more and better measures of intelligence, could benefit from the inclusion of more specifically focused measures, such as measures of Gf and Gc. Also, more research should be carried out to develop additional high-quality measures of CT, including ones that assess specific reasoning skills and knowledge relevant to thinking about a subject, which could help resolve perennial questions about the domain-general versus domain-specific nature of intelligence and CT. Overall, the results of this review encourage taking a multidimensional approach to investigating the complex constructs of intelligence, CT, and unsubstantiated belief. Supporting these recommendations are the results of studies in which the improvement accrued from explicit CT skill instruction could be more fully understood when CT skills, relevant knowledge, CT dispositions, metacognitive monitoring accuracy, and a proxy for intelligence were all assessed.

8. Conclusions

Critical thinking, broadly conceived, offers ways to understand real-world outcomes of thinking beyond what general cognitive ability can provide and intelligence tests test. A multidimensional view of CT, which includes specific reasoning and metacognitive skills, CT dispositions, and relevant knowledge, can add to our understanding of why some people endorse unsubstantiated claims more than others do. Although general cognitive ability and domain-general knowledge often contribute to performance on CT tasks, thinking critically about real-world questions also involves applying rules, criteria, and knowledge specific to the question under consideration, as well as the appropriate dispositions and cognitive styles for deploying these.

Despite the advantages of taking this multidimensional approach to CT in helping us more fully understand everyday thinking and irrationality, it presents challenges for researchers and instructors. It implies the need to assess and instruct multidimensionally, using not only measures of reasoning skills but also measures addressing thinking errors and biases, dispositions, the knowledge relevant to a task, and the accuracy of metacognitive judgments. As noted by Dwyer (2023), adopting a more complex conceptualization of CT beyond just skills is needed, but it presents challenges for those seeking to improve students’ CT. Nevertheless, the research reviewed suggests that taking this multidimensional approach to CT can enhance our understanding of the endorsement of unsubstantiated claims beyond what standardized intelligence tests contribute. More research is needed to resolve remaining controversies and to develop evidence-based applications of the findings.

Funding Statement

This research received no external funding.

Institutional Review Board Statement

This research involved no new testing of participants and hence did not require Institutional Review Board approval.

Informed Consent Statement

This research involved no new testing of participants and hence did not require an Informed Consent Statement.

Data Availability Statement

Conflicts of Interest

The author declares no conflict of interest.

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

  • Ackerman Phillip L. Intelligence … Moving beyond the lowest common denominator. American Psychologist. 2022; 78: 283–97. doi: 10.1037/amp0001057.
  • Alper Sinan, Bayrak Fatih, Yilmaz Onurcan. Psychological correlates of COVID-19 conspiracy beliefs and preventive measures: Evidence from Turkey. Current Psychology. 2020; 40: 5708–17. doi: 10.1007/s12144-020-00903-0.
  • Bensley D. Alan. Critical Thinking in Psychology and Everyday Life: A Guide to Effective Thinking. Worth Publishers; New York: 2018.
  • Bensley D. Alan. The Critical Thinking in Psychology Assessment Battery (CTPAB) and Test Guide. 2021. Unpublished manuscript. Frostburg State University, Frostburg, MD, USA.
  • Bensley D. Alan. “I can’t believe you believe that”: Identifying unsubstantiated claims. Skeptical Inquirer. 2023; 47: 53–56.
  • Bensley D. Alan, Spero Rachel A. Improving critical thinking skills and metacognitive monitoring through direct infusion. Thinking Skills and Creativity. 2014; 12: 55–68. doi: 10.1016/j.tsc.2014.02.001.
  • Bensley D. Alan, Lilienfeld Scott O. Assessment of unsubstantiated beliefs. Scholarship of Teaching and Learning in Psychology. 2020; 6: 198–211. doi: 10.1037/stl0000218.
  • Bensley D. Alan, Masciocchi Christopher M., Rowan Krystal A. A comprehensive assessment of explicit critical thinking instruction on recognition of thinking errors and psychological misconceptions. Scholarship of Teaching and Learning in Psychology. 2021; 7: 107. doi: 10.1037/stl0000188.
  • Bensley D. Alan, Watkins Cody, Lilienfeld Scott O., Masciocchi Christopher, Murtagh Michael, Rowan Krystal. Skepticism, cynicism, and cognitive style predictors of the generality of unsubstantiated belief. Applied Cognitive Psychology. 2022; 36: 83–99. doi: 10.1002/acp.3900.
  • Bensley D. Alan, Rodrigo Maria, Bravo Maria, Jocoy Kathleen. Dual-Process Theory and Cognitive Style Predictors of the General Endorsement of Unsubstantiated Claims. 2023. Unpublished manuscript. Frostburg State University, Frostburg, MD, USA.
  • Bensley D. Alan, Lilienfeld Scott O., Powell Lauren. A new measure of psychological misconceptions: Relations with academic background, critical thinking, and acceptance of paranormal and pseudoscientific claims. Learning and Individual Differences. 2014; 36: 9–18. doi: 10.1016/j.lindif.2014.07.009.
  • Bierwiaczonek Kinga, Kunst Jonas R., Pich Olivia. Belief in COVID-19 conspiracy theories reduces social distancing over time. Applied Psychology: Health and Well-Being. 2020; 12: 1270–85. doi: 10.1111/aphw.12223.
  • Butler Heather A. Halpern Critical Thinking Assessment predicts real-world outcomes of critical thinking. Applied Cognitive Psychology. 2012; 26: 721–29. doi: 10.1002/acp.2851.
  • Butler Heather A., Halpern Diane F. Is critical thinking a better model of intelligence? In: Sternberg Robert J., editor. The Nature of Intelligence. Cambridge University Press; Cambridge: 2019. pp. 183–96.
  • Butler Heather A., Pentoney Christopher, Bong Maebelle P. Predicting real-world outcomes: Critical thinking ability is a better predictor of life decisions than intelligence. Thinking Skills and Creativity. 2017; 25: 38–46. doi: 10.1016/j.tsc.2017.06.005.
  • Byrnes James P., Dunbar Kevin N. The nature and development of critical-analytic thinking. Educational Psychology Review. 2014; 26: 477–93. doi: 10.1007/s10648-014-9284-0.
  • Cacioppo John T., Petty Richard E. The need for cognition. Journal of Personality and Social Psychology. 1982; 42: 116–31. doi: 10.1037/0022-3514.42.1.116.
  • Cacioppo John T., Petty Richard E., Feinstein Jeffrey A., Jarvis W. Blair G. Dispositional differences in cognitive motivation: The life and times of individuals varying in need for cognition. Psychological Bulletin. 1996; 119: 197. doi: 10.1037/0033-2909.119.2.197.
  • Cavojova Vladimira, Srol Jakub, Jurkovic Marek. Why we should think like scientists? Scientific reasoning and susceptibility to epistemically suspect beliefs and cognitive biases. Applied Cognitive Psychology. 2019; 34: 85–95. doi: 10.1002/acp.3595.
  • Chai Wen Jia, Abd Hamid Aini Ismafairus, Abdullah Jafri Malin. Working memory from the psychological and neuroscience perspective. Frontiers in Psychology. 2018; 9: 401. doi: 10.3389/fpsyg.2018.00401.
  • Conway Andrew R., Kovacs Kristof. The nature of the general factor of intelligence. In: Sternberg Robert J., editor. The Nature of Human Intelligence. Cambridge University Press; Cambridge: 2018. pp. 49–63.
  • Drummond Caitlin, Fischhoff Baruch. Development and validation of the Scientific Reasoning Scale. Journal of Behavioral Decision Making. 2017; 30: 26–38. doi: 10.1002/bdm.1906.
  • Dwyer Christopher P. Critical Thinking: Conceptual Perspectives and Practical Guidelines. Cambridge University Press; Cambridge: 2017.
  • Dwyer Christopher P. An evaluative review of barriers to critical thinking in educational and real-world settings. Journal of Intelligence. 2023; 11: 105. doi: 10.3390/jintelligence11060105.
  • Ennis Robert H. A taxonomy of critical thinking dispositions and abilities. In: Baron Joan, Sternberg Robert, editors. Teaching Thinking Skills: Theory and Practice. W. H. Freeman; New York: 1987.
  • Epstein Seymour. Intuition from the perspective of cognitive-experiential self-theory. In: Plessner Henning, Betsch Tilmann, editors. Intuition in Judgment and Decision Making. Erlbaum; Washington, DC: 2008. pp. 23–37.
  • Fasce Angelo, Pico Alfonso. Science as a vaccine: The relation between scientific literacy and unwarranted beliefs. Science & Education. 2019; 28: 109–25. doi: 10.1007/s11191-018-00022-0.
  • Frederick Shane. Cognitive reflection and decision making. Journal of Economic Perspectives. 2005; 19: 25–42. doi: 10.1257/089533005775196732.
  • Gardner Howard. Intelligence Reframed: Multiple Intelligence for the 21st Century. Basic Books; New York: 1999.
  • Genovese Jeremy E. C. Paranormal beliefs, schizotypy, and thinking styles among teachers and future teachers. Personality and Individual Differences. 2005; 39: 93–102. doi: 10.1016/j.paid.2004.12.008.
  • Gignac Gilles E. Fluid intelligence shares closer to 60% of its variance with working memory capacity and is a better indicator of general intelligence. Intelligence. 2014; 47: 122–33. doi: 10.1016/j.intell.2014.09.004.
  • Gottfredson Linda S. Life, death, and intelligence. Journal of Cognitive Education and Psychology. 2004; 4: 23–46. doi: 10.1891/194589504787382839.
  • Halpern Diane F., Dunn Dana. Critical thinking: A model of intelligence for solving real-world problems. Journal of Intelligence. 2021; 9: 22. doi: 10.3390/jintelligence9020022.
  • Halpern Diane F., Butler Heather A. Is critical thinking a better model of intelligence? In: Sternberg Robert J., editor. The Nature of Human Intelligence. Cambridge University Press; Cambridge: 2018. pp. 183–96.
  • Irwin Harvey J., Young J. M. Intuitive versus reflective processes in the formation of paranormal beliefs. European Journal of Parapsychology. 2002; 17: 45–55.
  • Jolley Daniel, Paterson Jenny L. Pylons ablaze: Examining the role of 5G COVID-19 conspiracy beliefs and support for violence. British Journal of Social Psychology. 2020; 59: 628–40. doi: 10.1111/bjso.12394.
  • Kahneman Daniel. Thinking, Fast and Slow. Farrar, Straus and Giroux; New York: 2011.
  • Kowalski Patricia, Taylor Annette J. Ability and critical thinking as predictors of change in students’ psychological misconceptions. Journal of Instructional Psychology. 2004; 31: 297–303.
  • Ku Kelly Y. L., Ho Irene T. Dispositional factors predicting Chinese students’ critical thinking performance. Personality and Individual Differences. 2010; 48: 54–58. doi: 10.1016/j.paid.2009.08.015.
  • Kunda Ziva. The case for motivated reasoning. Psychological Bulletin. 1990; 108: 480–98. doi: 10.1037/0033-2909.108.3.480.
  • Lantian Anthony, Bagneux Virginie, Delouvee Sylvain, Gauvrit Nicolas. Maybe a free thinker but not a critical one: High conspiracy belief is associated with low critical thinking ability. Applied Cognitive Psychology. 2020; 35: 674–84. doi: 10.1002/acp.3790.
  • Lilienfeld Scott O. Psychological treatments that cause harm. Perspectives on Psychological Science. 2007; 2: 53–70. doi: 10.1111/j.1745-6916.2007.00029.x.
  • Lindeman Marjaana. Biases in intuitive reasoning and belief in complementary and alternative medicine. Psychology and Health. 2011; 26: 371–82. doi: 10.1080/08870440903440707.
  • Lindeman Marjaana, Aarnio Kia. Paranormal beliefs: Their dimensionality and correlates. European Journal of Personality. 2006; 20: 585–602.
  • Lobato Emilio J., Mendoza Jorge, Sims Valerie, Chin Matthew. Explaining the relationship between conspiracy theories, paranormal beliefs, and pseudoscience acceptance among a university population. Applied Cognitive Psychology. 2014; 28: 617–25. doi: 10.1002/acp.3042.
  • Maqsood Alisha, Jamil Farhat, Khalid Ruhi. Thinking styles and belief in superstitions: Moderating role of gender in young adults. Pakistan Journal of Psychological Research. 2018; 33: 335–48.
  • McCutcheon Lynn E., Apperson Jennifer M., Hanson Esther, Wynn Vincent. Relationships among critical thinking skills, academic achievement, and misconceptions about psychology. Psychological Reports. 1992; 71: 635–39. doi: 10.2466/pr0.1992.71.2.635.
  • McGrew Kevin S. CHC theory and the human cognitive abilities project: Standing on the shoulders of the giants of psychometric intelligence research. Intelligence. 2009; 37: 1–10. doi: 10.1016/j.intell.2008.08.004.
  • Morgan Jonathan. Religion and dual-process cognition: A continuum of styles or distinct types. Religion, Brain & Behavior. 2016; 6: 112–29. doi: 10.1080/2153599X.2014.966315.
  • Nie Fanhao, Olson Daniel V. A. Demonic influence: The negative mental health effects of belief in demons. Journal for the Scientific Study of Religion. 2016; 55: 498–515. doi: 10.1111/jssr.12287.
  • Pacini Rosemary, Epstein Seymour. The relation of rational and experiential information processing styles to personality, basic beliefs, and the ratio-bias phenomenon. Journal of Personality and Social Psychology. 1999; 76: 972–87. doi: 10.1037/0022-3514.76.6.972.
  • Patel Niraj, Baker S. Glenn, Scherer Laura D. Evaluating the cognitive reflection test as a measure of intuition/reflection, numeracy, and insight problem solving, and the implications for understanding real-world judgments and beliefs. Journal of Experimental Psychology: General. 2019; 148: 2129–53. doi: 10.1037/xge0000592.
  • Pennycook Gordon, Cheyne James Allan, Barr Nathaniel, Koehler Derek J., Fugelsang Jonathan A. On the reception and detection of pseudo-profound bullshit. Judgment and Decision Making. 2015; 10: 549–63. doi: 10.1017/S1930297500006999.
  • Pennycook Gordon, Cheyne James Allan, Seli Paul, Koehler Derek J., Fugelsang Jonathan A. Analytic cognitive style predicts religious and paranormal belief. Cognition. 2012; 123: 335–46. doi: 10.1016/j.cognition.2012.03.003.
  • Ren Xuezhu, Tong Yan, Peng Peng, Wang Tengfei. Critical thinking predicts academic performance beyond cognitive ability: Evidence from adults and children. Intelligence. 2020; 82: 101487. doi: 10.1016/j.intell.2020.101487.
  • Rogers Paul, Fisk John E., Lowrie Emma. Paranormal belief, thinking style preference and susceptibility to confirmatory conjunction errors. Consciousness and Cognition. 2018; 65: 182–95. doi: 10.1016/j.concog.2018.07.013.
  • Saher Marieke, Lindeman Marjaana. Alternative medicine: A psychological perspective. Personality and Individual Differences. 2005; 39: 1169–78. doi: 10.1016/j.paid.2005.04.008.
  • Sosu Edward M. The development and psychometric validation of a Critical Thinking Disposition Scale. Thinking Skills and Creativity. 2013; 9: 107–19. doi: 10.1016/j.tsc.2012.09.002.
  • Stahl Tomas, van Prooijen Jan-Willem. Epistemic irrationality: Skepticism toward unfounded beliefs requires sufficient cognitive ability and motivation to be rational. Personality and Individual Differences. 2018; 122: 155–63. doi: 10.1016/j.paid.2017.10.026.
  • Stanovich Keith E. What Intelligence Tests Miss: The Psychology of Rational Thought. Yale University Press; New Haven: 2009.
  • Stanovich Keith E., West Richard F. Reasoning independently of prior belief and individual differences in actively open-minded thinking. Journal of Educational Psychology. 1997; 89: 342–57. doi: 10.1037/0022-0663.89.2.342.
  • Stanovich Keith E., West Richard F. Natural myside bias is independent of cognitive ability. Thinking & Reasoning. 2007; 13: 225–47.
  • Stanovich Keith E., West Richard F. On the failure of cognitive ability to predict myside and one-sided thinking bias. Thinking and Reasoning. 2008; 14: 129–67. doi: 10.1080/13546780701679764.
  • Stanovich Keith E., West Richard F., Toplak Maggie E. The Rationality Quotient: Toward a Test of Rational Thinking. The MIT Press; Cambridge, MA: 2018.
  • Sternberg Robert J. The Triarchic Mind: A New Theory of Intelligence. Penguin Press; London: 1988.
  • Sternberg Robert J. A theory of adaptive intelligence and its relation to general intelligence. Journal of Intelligence. 2019; 7: 23. doi: 10.3390/jintelligence7040023.
  • Sternberg Robert J. The search for the elusive basic processes underlying human intelligence: Historical and contemporary perspectives. Journal of Intelligence. 2022; 10: 28. doi: 10.3390/jintelligence10020028.
  • Swami Viren, Voracek Martin, Stieger Stefan, Tran Ulrich S., Furnham Adrian. Analytic thinking reduces belief in conspiracy theories. Cognition. 2014; 133: 572–85. doi: 10.1016/j.cognition.2014.08.006.
  • Teunisse Alessandra K., Case Trevor I., Fitness Julie, Sweller Naomi. I should have known better: Development of a self-report measure of gullibility. Personality and Social Psychology Bulletin. 2019; 46: 408–23. doi: 10.1177/0146167219858641.
  • Tobacyk Jerome J. A revised paranormal belief scale. The International Journal of Transpersonal Studies. 2004; 23: 94–98. doi: 10.24972/ijts.2004.23.1.94.
  • van der Linden Sander. The conspiracy-effect: Exposure to conspiracy theories (about global warming) decreases pro-social behavior and science acceptance. Personality and Individual Differences. 2015; 87: 173–75. doi: 10.1016/j.paid.2015.07.045.
  • van Prooijen Jan-Willem. Belief in conspiracy theories: Gullibility or rational skepticism? In: Forgas Joseph P., Baumeister Roy F., editors. The Social Psychology of Gullibility: Fake News, Conspiracy Theories, and Irrational Beliefs. Routledge; London: 2019. pp. 319–32.
  • Wechsler David. The Measurement of Adult Intelligence. 3rd ed. Williams & Wilkins; Baltimore: 1944.
  • West Richard F., Toplak Maggie E., Stanovich Keith E. Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology. 2008; 100 :930–41. doi: 10.1037/a0012842. [ CrossRef ] [ Google Scholar ]
