Robert Evans Wilson Jr.

Cognitive Bias Is the Loose Screw in Critical Thinking

Recognizing your biases enhances understanding and communication.

Posted May 17, 2021 | Reviewed by Jessica Schrader

  • People cannot think critically unless they are aware of their cognitive biases, which can alter their perception of reality.
  • Cognitive biases are mental shortcuts people take in order to process the mass of information they receive daily.
  • Cognitive biases include confirmation bias, anchoring bias, bandwagon effect, and negativity bias.

When I was a kid, I was enamored of cigarette-smoking movie stars. When I was a teenager, some of my friends began to smoke; I wanted to smoke too, but my parents forbade it. I was also intimidated by the ubiquitous anti-smoking commercials I saw on television warning me that smoking causes cancer. As much as I wanted to smoke, I was afraid of it.

When I started college as a pre-med major, I also started working in a hospital emergency room. I was shocked to see that more than 90% of the nurses working there were smokers, but that was not quite enough to convince me that smoking was OK. It was the doctors: 11 of the 12 emergency room physicians I worked with were smokers. That was all the convincing I needed. If actual medical doctors thought smoking was safe, then so did I. I started smoking without concern because I had fallen prey to an authority bias, which is a type of cognitive bias. Fortunately for my health, I wised up and quit smoking 10 years later.

It's Likely You're Unaware of These Habits

Have you ever thought someone was intelligent simply because they were attractive? Have you ever dismissed a news story because it ran in a media source you didn’t like? Have you ever thought or said, “I knew that was going to happen!” in reference to a team winning, a stock going up in value, or some other unpredictable event occurring? If you replied “yes” to any of these, then you may be guilty of relying on a cognitive bias.

In my last post, I wrote about the importance of critical thinking, and how in today’s information age, no one has an excuse for living in ignorance. Since then, I recalled a huge impediment to critical thinking: cognitive bias. We are all guilty of leaning on these mental crutches, even though we don’t do it intentionally.

What Are Cognitive Biases?

The Cambridge English Dictionary defines cognitive bias as the way a particular person understands events, facts, and other people, which is based on their own particular set of beliefs and experiences and may not be reasonable or accurate.

PhilosophyTerms.com calls it a bad mental habit that gets in the way of logical thinking.

PositivePsychology.com describes it this way: “We are often presented with situations in life when we need to make a decision with imperfect information, and we unknowingly rely on prejudices or biases.”

And, according to Alleydog.com, a cognitive bias is an involuntary pattern of thinking that produces distorted perceptions of people, surroundings, and situations around us.

In brief, a cognitive bias is a shortcut to thinking. And, it’s completely understandable; the onslaught of information that we are exposed to every day necessitates some kind of time-saving method. It is simply impossible to process everything, so we make quick decisions. Most people don’t have the time to thoroughly think through everything they are told. Nevertheless, as understandable as depending on biases may be, it is still a serious impediment to critical thinking.

Here's What to Watch Out For

Wikipedia lists 197 different cognitive biases. I am going to share with you a few of the more common ones so that in the future, you will be aware of the ones you may be using.

Confirmation bias is when you gravitate toward media and information sources that align with your current beliefs. People do this because it helps maintain their confidence and self-esteem when the information they receive supports what they already believe. Exposing yourself to opposing views and opinions can cause cognitive dissonance and mental stress. On the other hand, exposing yourself to new information and different viewpoints helps open up new neural pathways in your brain, which will enable you to think more creatively (see my post: Surprise: Creativity Is a Skill, Not a Gift!).

Anchoring bias occurs when you become committed or attached to the first thing you learn about a particular subject. A first impression of something or someone is a good example (see my post: Sometimes You Have to Rip the Cover Off). Similar to anchoring is the halo effect, which is when you assume that a person’s positive or negative traits in one area will carry over to some other aspect of their personality. For example, you might think that an attractive person will also be intelligent without seeing any proof to support it.


Hindsight bias is the inclination to see events, after they have occurred, as having been more predictable than they actually were; it is also known as the “I knew it all along” reaction. Examples of this bias would be believing that you knew who was going to win an election, a football or baseball game, or even a coin toss after it occurred.

Misinformation effect is when your memory of an event is affected or influenced by information you received after the event occurred. Researchers have shown that memory is unreliable because it is vulnerable to revision when you receive new information.

Actor-observer bias is when you attribute your actions to external influences and other people's actions to internal ones. You might think you missed a business opportunity because your car broke down, but your colleague failed to get a promotion because of incompetence.

False consensus effect is when you assume more people agree with your opinions and share your values than actually do. This happens because you tend to spend most of your time with others, such as family and friends, who actually do share beliefs similar to yours.

Availability bias occurs when you judge how common or important something is by how easily examples of it come to mind. This happens when you watch or listen to media news sources that tend to run dramatic stories without sharing any balancing statistics on how rare such events may be. For example, if you see several stories on fiery plane crashes, you might start to fear flying because you assume crashes occur with greater frequency than they actually do.

Bandwagon effect, also known as herd mentality or groupthink, is the propensity to accept beliefs or values because many other people hold them. This is a conformity bias that occurs because most people desire acceptance, connection, and belonging, and fear rejection if they hold opposing beliefs. Most people will not think through an opinion and will assume it is correct because so many others agree with it.

Authority bias is when you accept the opinion of an authority figure because you believe they know more than you. You might assume that they have already thought through an issue and reached the right conclusion. And, because they are an authority in their field, you grant more credibility to their viewpoint than you would to anyone else’s. This is especially true in medicine, where experts are frequently seen as infallible. An example would be an advertiser showing a doctor, wearing a lab coat, touting their product.

Negativity bias is when you pay more attention to bad news than good. This is a natural bias that dates back to humanity’s prehistoric days, when noticing threats, risks, and other lethal dangers could save your life. In today’s civilized world, this bias is not as necessary (see my post: Fear: Lifesaver or Manipulator).

Illusion of control is the belief that you have more control over a situation than you actually do. An example of this is when a gambler believes he or she can influence a game of chance.

Understand More and Communicate Better

Learning these biases, and being on the alert for them when you make a decision to accept a belief or opinion, will help you become more effective at critical thinking.

Source: Cognitive Bias Codex by John Manoogian III/Wikimedia Commons

Robert Wilson is a writer and humorist based in Atlanta, Georgia.

Critical Thinking

Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.

1. History

Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as

active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)

and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.

In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.

Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment . Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment.

For details on this history, see the Supplement on History .

2. Examples and Non-Examples

Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.

2.1 Dewey’s Three Main Examples

Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.

Transit : “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o’clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68–69; 1933: 91–92)

Ferryboat : “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.

“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.

“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot’s position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69–70; 1933: 92–93)

Bubbles : “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).
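
The gas-law reasoning in Bubbles can be compressed into one line (a sketch only, assuming the trapped air behaves approximately as an ideal gas):

\[ \frac{P_1 V_1}{T_1} = \frac{P_2 V_2}{T_2} \]

At roughly constant pressure, volume scales with temperature, so air warmed by the hot tumbler expands and escapes as bubbles, while air cooling inside the inverted tumbler contracts, drawing bubbles inward.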

2.2 Dewey’s Other Examples

Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.

Weather : A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).

Disorder : A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).

Typhoid : A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).

Blur : A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).

Suction pump : In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).
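
As a rough numerical check on the 33-foot figure (a sketch assuming standard sea-level pressure and pure water), the maximum height follows from balancing atmospheric pressure against the weight of the water column:

\[ h_{\max} = \frac{P_{\text{atm}}}{\rho g} \approx \frac{101{,}325\ \text{Pa}}{(1000\ \text{kg/m}^3)(9.8\ \text{m/s}^2)} \approx 10.3\ \text{m} \approx 34\ \text{ft} \]

At higher elevations atmospheric pressure is lower, which is why the observed maximum height decreases with altitude, just as the passage describes.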

2.3 Further Examples

Diamond : A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond lane from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.

Rash : A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; against the new cream as the cause was the mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.

Candidate : Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as

a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)

A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.

2.4 Non-Examples

Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.

3. The Definition of Critical Thinking

What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as

a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)

Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features.

  • It is done for the purpose of making up one’s mind about what to believe or do.
  • The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
  • The thinking fulfills the relevant standards to some threshold level.

One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.

If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses. As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal. As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b). As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992). Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully. As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009, 2021), others on the resulting judgment (Facione 1990a), and still others on responsiveness to reasons (Siegel 1988). Kuhn (2019) takes critical thinking to be more a dialogic practice of advancing and responding to arguments than an individual ability.

In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.

Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).

4. Its Value

Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.

5. The Process of Thinking Critically

Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:

  • suggestions , in which the mind leaps forward to a possible solution;
  • an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
  • the use of one suggestion after another as a leading idea, or hypothesis , to initiate and guide observation and other operations in collection of factual material;
  • the mental elaboration of the idea or supposition as an idea or supposition ( reasoning , in the sense in which reasoning is a part, not the whole, of inference); and
  • testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107; italics in original)

The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).

The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time. Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events. Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once. These component events might include (1) noticing a difficulty, (2) defining the problem, (3) dividing the problem into manageable sub-problems, (4) formulating a variety of possible solutions to the problem or sub-problem, (5) determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem, (6) devising a plan of systematic observation or experiment that will uncover the relevant evidence, (7) carrying out the plan of systematic observation or experimentation, (8) noting the results of the systematic observation or experiment, (9) gathering relevant testimony and information from others, (10) judging the credibility of testimony and information gathered from others, (11) drawing conclusions from gathered evidence and accepted testimony, and (12) accepting a solution that the evidence adequately supports (cf. Hitchcock 2017: 485).

Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.

6. Components of the Process

If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.

  • Observing : One notices something in one’s immediate environment (sudden cooling of temperature in Weather , bubbles forming outside a glass and then going inside in Bubbles , a moving blur in the distance in Blur , a rash in Rash ). Or one notes the results of an experiment or systematic observation (valuables missing in Disorder , no suction without air pressure in Suction pump )
  • Feeling : One feels puzzled or uncertain about something (how to get to an appointment on time in Transit , why the diamonds vary in spacing in Diamond ). One wants to resolve this perplexity. One feels satisfaction once one has worked out an answer (to take the subway express in Transit , diamonds closer when needed as a warning in Diamond ).
  • Wondering : One formulates a question to be addressed (why bubbles form outside a tumbler taken from hot water in Bubbles , how suction pumps work in Suction pump , what caused the rash in Rash ).
  • Imagining : One thinks of possible answers (bus or subway or elevated in Transit , flagpole or ornament or wireless communication aid or direction indicator in Ferryboat , allergic reaction or heat rash in Rash ).
  • Inferring : One works out what would be the case if a possible answer were assumed (valuables missing if there has been a burglary in Disorder , earlier start to the rash if it is an allergic reaction to a sulfa drug in Rash ). Or one draws a conclusion once sufficient relevant evidence is gathered (take the subway in Transit , burglary in Disorder , discontinue blood pressure medication and new cream in Rash ).
  • Knowledge : One uses stored knowledge of the subject-matter to generate possible answers or to infer what would be expected on the assumption of a particular answer (knowledge of a city’s public transit system in Transit , of the requirements for a flagpole in Ferryboat , of Boyle’s law in Bubbles , of allergic reactions in Rash ).
  • Experimenting : One designs and carries out an experiment or a systematic observation to find out whether the results deduced from a possible answer will occur (looking at the location of the flagpole in relation to the pilot’s position in Ferryboat , putting an ice cube on top of a tumbler taken from hot water in Bubbles , measuring the height to which a suction pump will draw water at different elevations in Suction pump , noticing the spacing of diamonds when movement to or from a diamond lane is allowed in Diamond ).
  • Consulting : One finds a source of information, gets the information from the source, and makes a judgment on whether to accept it. None of our 11 examples include searching for sources of information. In this respect they are unrepresentative, since most people nowadays have almost instant access to information relevant to answering any question, including many of those illustrated by the examples. However, Candidate includes the activities of extracting information from sources and evaluating its credibility.
  • Identifying and analyzing arguments : One notices an argument and works out its structure and content as a preliminary to evaluating its strength. This activity is central to Candidate . It is an important part of a critical thinking process in which one surveys arguments for various positions on an issue.
  • Judging : One makes a judgment on the basis of accumulated evidence and reasoning, such as the judgment in Ferryboat that the purpose of the pole is to provide direction to the pilot.
  • Deciding : One makes a decision on what to do or on what policy to adopt, as in the decision in Transit to take the subway.

7. Contributory Dispositions and Abilities

By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.

Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.

Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.

Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175).

8. Critical Thinking Dispositions

Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016a) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).

On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.

A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.

Facione (1990a: 25) divides “affective dispositions” of critical thinking into approaches to life and living in general and approaches to specific issues, questions or problems. Adapting this distinction, one can usefully divide critical thinking dispositions into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.

8.1 Initiating Dispositions

Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify as initiating dispositions for thinking critically attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.

  • Attentiveness : One will not think critically if one fails to recognize an issue that needs to be thought through. For example, the pedestrian in Weather would not have looked up if he had not noticed that the air was suddenly cooler. To be a critical thinker, then, one needs to be habitually attentive to one’s surroundings, noticing not only what one senses but also sources of perplexity in messages received and in one’s own beliefs and attitudes (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Habit of inquiry : Inquiry is effortful, and one needs an internal push to engage in it. For example, the student in Bubbles could easily have stopped at idle wondering about the cause of the bubbles rather than reasoning to a hypothesis, then designing and executing an experiment to test it. Thus willingness to think critically needs mental energy and initiative. What can supply that energy? Love of inquiry, or perhaps just a habit of inquiry. Hamby (2015) has argued that willingness to inquire is the central critical thinking virtue, one that encompasses all the others. It is recognized as a critical thinking disposition by Dewey (1910: 29; 1933: 35), Glaser (1941: 5), Ennis (1987: 12; 1991: 8), Facione (1990a: 25), Bailin et al. (1999b: 294), Halpern (1998: 452), and Facione, Facione, & Giancarlo (2001).
  • Self-confidence : Lack of confidence in one’s abilities can block critical thinking. For example, if the woman in Rash lacked confidence in her ability to figure things out for herself, she might just have assumed that the rash on her chest was the allergic reaction to her medication against which the pharmacist had warned her. Thus willingness to think critically requires confidence in one’s ability to inquire (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Courage : Fear of thinking for oneself can stop one from doing it. Thus willingness to think critically requires intellectual courage (Paul & Elder 2006: 16).
  • Open-mindedness : A dogmatic attitude will impede thinking critically. For example, a person who adheres rigidly to a “pro-choice” position on the issue of the legal status of induced abortion is likely to be unwilling to consider seriously the issue of when in its development an unborn child acquires a moral right to life. Thus willingness to think critically requires open-mindedness, in the sense of a willingness to examine questions to which one already accepts an answer but which further evidence or reasoning might cause one to answer differently (Dewey 1933; Facione 1990a; Ennis 1991; Bailin et al. 1999b; Halpern 1998, Facione, Facione, & Giancarlo 2001). Paul (1981) emphasizes open-mindedness about alternative world-views, and recommends a dialectical approach to integrating such views as central to what he calls “strong sense” critical thinking. In three studies, Haran, Ritov, & Mellers (2013) found that actively open-minded thinking, including “the tendency to weigh new evidence against a favored belief, to spend sufficient time on a problem before giving up, and to consider carefully the opinions of others in forming one’s own”, led study participants to acquire information and thus to make accurate estimations.
  • Willingness to suspend judgment : Premature closure on an initial solution will block critical thinking. Thus willingness to think critically requires a willingness to suspend judgment while alternatives are explored (Facione 1990a; Ennis 1991; Halpern 1998).
  • Trust in reason : Since distrust in the processes of reasoned inquiry will dissuade one from engaging in it, trust in them is an initiating critical thinking disposition (Facione 1990a, 25; Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001; Paul & Elder 2006). In reaction to an allegedly exclusive emphasis on reason in critical thinking theory and pedagogy, Thayer-Bacon (2000) argues that intuition, imagination, and emotion have important roles to play in an adequate conception of critical thinking that she calls “constructive thinking”. From her point of view, critical thinking requires trust not only in reason but also in intuition, imagination, and emotion.
  • Seeking the truth : If one does not care about the truth but is content to stick with one’s initial bias on an issue, then one will not think critically about it. Seeking the truth is thus an initiating critical thinking disposition (Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001). A disposition to seek the truth is implicit in more specific critical thinking dispositions, such as trying to be well-informed, considering seriously points of view other than one’s own, looking for alternatives, suspending judgment when the evidence is insufficient, and adopting a position when the evidence supporting it is sufficient.

8.2 Internal Dispositions

Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions .

9. Critical Thinking Abilities

Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit , has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.

Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as those in (Glaser 1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).

The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5 . The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.

Observational abilities : Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). Norris and King (1983, 1985, 1990a, 1990b) is a test of ability to appraise observation reports.

Emotional abilities: The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.

Questioning abilities: A critical thinking process requires transforming an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).

Imaginative abilities: Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.

Inferential abilities: The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit, Ferryboat and Disorder), others from something observed (as in Weather and Rash). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge—that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic. Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452). Items testing inferential abilities constitute two of the five subtests of the Watson-Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).
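
The purely formal side of inference that Ennis (1981a) has in mind can be illustrated with a short sketch. The following code is an illustrative addition, not material from Ennis or from any of the tests cited above: it checks whether a propositional argument form is deductively valid by searching for a truth assignment that makes the premisses true and the conclusion false. The function name is_valid and the lambda encoding of the forms are assumptions introduced for the example.

    from itertools import product

    def is_valid(premises, conclusion, variables):
        """Return True if every truth assignment that makes all premises true
        also makes the conclusion true (i.e., the argument form is valid)."""
        for values in product([True, False], repeat=len(variables)):
            env = dict(zip(variables, values))
            if all(p(env) for p in premises) and not conclusion(env):
                return False  # counterexample: premises true, conclusion false
        return True

    # Modus ponens ("if p then q; p; therefore q") is valid.
    print(is_valid([lambda e: (not e["p"]) or e["q"], lambda e: e["p"]],
                   lambda e: e["q"], ["p", "q"]))  # True

    # Affirming the consequent ("if p then q; q; therefore p") is not.
    print(is_valid([lambda e: (not e["p"]) or e["q"], lambda e: e["q"]],
                   lambda e: e["p"], ["p", "q"]))  # False

A substantive inference like the one in Transit would fail such a formal test; as noted above, its warrant comes from domain knowledge rather than from logical form.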

Experimenting abilities: Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash. Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998). The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.
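
As a purely illustrative sketch of the kind of statistical inference such items presuppose, and not an item from any of the cited tests, the following code evaluates a small two-group experiment with a permutation test; the scenario, the data, and the function name are invented for the example.

    import random

    def permutation_p_value(treatment, control, n_permutations=10_000, seed=0):
        """Estimate how often a difference in group means at least as large as
        the observed one would arise if the group labels were due to chance."""
        rng = random.Random(seed)
        observed = sum(treatment) / len(treatment) - sum(control) / len(control)
        pooled = treatment + control
        extreme = 0
        for _ in range(n_permutations):
            rng.shuffle(pooled)
            new_t, new_c = pooled[:len(treatment)], pooled[len(treatment):]
            diff = sum(new_t) / len(new_t) - sum(new_c) / len(new_c)
            if abs(diff) >= abs(observed):
                extreme += 1
        return extreme / n_permutations

    # Hypothetical data: days until a rash clears, with and without a new cream.
    treated = [3, 4, 2, 5, 3, 4]
    untreated = [6, 5, 7, 4, 6, 5]
    print(permutation_p_value(treated, untreated))  # a small value suggests a real effect

Rerunning the sketch with only two or three observations per group illustrates Halpern's point about sample size: with so little data, even a sizeable observed difference remains compatible with chance.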

Consulting abilities: Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate. Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The Collegiate Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).

Argument analysis abilities: The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate. The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The Collegiate Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.

Judging skills and deciding skills: Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.

Lists and tests of critical thinking abilities often include two more abilities: identifying assumptions and constructing and evaluating definitions.

In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.

We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), Black (2012), and Blair (2021).

According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implements this principle in one’s practice than that one is able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive for one to make oneself aware of them and try consciously to counteract them or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases—for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work. It is also helpful to be aware of the prevalence of “noise” (unwanted unsystematic variability of judgments), of how to detect noise (through a noise audit), and of how to reduce noise: make accuracy the goal, think statistically, break a process of arriving at a judgment into independent tasks, resist premature intuitions, in a group get independent judgments first, favour comparative judgments and scales (Kahneman, Sibony, & Sunstein 2021). It is helpful as well to be aware of the concept of “bounded rationality” in decision-making and of the related distinction between “satisficing” and optimizing (Simon 1956; Gigerenzer 2001).
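
The advice to think statistically and to combine independent judgments can be made concrete with a small simulation. The sketch below is an illustrative addition with invented parameters, not a reconstruction of the noise-audit procedures described by Kahneman, Sibony, and Sunstein (2021); it shows only that averaging independent judgments shrinks unsystematic variability.

    import random
    import statistics

    def noisy_judgment(true_value, noise_sd, rng):
        """One judge's estimate: the true value plus unsystematic error ('noise')."""
        return true_value + rng.gauss(0, noise_sd)

    def compare_group_sizes(true_value=100, noise_sd=20, sizes=(1, 5, 25), trials=2000):
        """Compare the spread of single judgments with the spread of averages
        of independent judgments from groups of several judges."""
        rng = random.Random(0)
        for n in sizes:
            averages = [
                statistics.mean(noisy_judgment(true_value, noise_sd, rng) for _ in range(n))
                for _ in range(trials)
            ]
            print(f"{n:>2} judge(s): spread (std dev) of the judgment ~ {statistics.pstdev(averages):.1f}")

    compare_group_sizes()
    # The spread falls roughly with the square root of the number of independent
    # judges, which is one reason to get independent judgments first and only
    # then combine them.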

Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.

Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment.

What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? In a comprehensive meta-analysis of experimental and quasi-experimental studies of strategies for teaching students to think critically, Abrami et al. (2015) found that dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.

Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods.

12. Controversies

Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.

McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), Bailin et al. (1999b), and Willingham (2019).

McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not however extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.

The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.

It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.

Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favoring of certain ways of knowing over others, frequently alleging that the unjustly favoured ways are those of a dominant sex or culture (Bailin 1995). These ways favour:

  • reinforcement of egocentric and sociocentric biases over dialectical engagement with opposing world-views (Paul 1981, 1984; Warren 1988)
  • distancing from the object of inquiry over closeness to it (Martin 1992; Thayer-Bacon 1992)
  • indifference to the situation of others over care for them (Martin 1992)
  • orientation to thought over orientation to action (Martin 1992)
  • being reasonable over caring to understand people’s ideas (Thayer-Bacon 1993)
  • being neutral and objective over being embodied and situated (Thayer-Bacon 1995a)
  • doubting over believing (Thayer-Bacon 1995b)
  • reason over emotion, imagination and intuition (Thayer-Bacon 2000)
  • solitary thinking over collaborative thinking (Thayer-Bacon 2000)
  • written and spoken assignments over other forms of expression (Alston 2001)
  • attention to written and spoken communications over attention to human problems (Alston 2001)
  • winning debates in the public sphere over making and understanding meaning (Alston 2001)

A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as

thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)

Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should

be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)

Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.

The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections. They can be summarized as follows:

  • Focus on argument networks with dialectical exchanges reflecting contesting points of view rather than on atomic arguments, so as to develop “strong sense” critical thinking that transcends egocentric and sociocentric biases (Paul 1981, 1984).
  • Foster closeness to the subject-matter and feeling connected to others in order to inform a humane democracy (Martin 1992).
  • Develop “constructive thinking” as a social activity in a community of physically embodied and socially embedded inquirers with personal voices who value not only reason but also imagination, intuition and emotion (Thayer-Bacon 2000).
  • In developing critical thinking in school subjects, treat as important neither skills nor dispositions but opening worlds of meaning (Alston 2001).
  • Attend to the development of critical thinking dispositions as well as skills, and adopt the “critical pedagogy” practised and advocated by Freire (1968 [1970]) and hooks (1994) (Dalgleish, Girard, & Davies 2017).

A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker. One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami et al. (2015) found to be most effective in improving critical thinking skills and dispositions.

What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint with problem solving and decision making, which are constructive.

Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History .

As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat, requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate, requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.

  • Abrami, Philip C., Robert M. Bernard, Eugene Borokhovski, David I. Waddington, C. Anne Wade, and Tonje Person, 2015, “Strategies for Teaching Students to Think Critically: A Meta-analysis”, Review of Educational Research , 85(2): 275–314. doi:10.3102/0034654314551063
  • Aikin, Wilford M., 1942, The Story of the Eight-year Study, with Conclusions and Recommendations , Volume I of Adventure in American Education , New York and London: Harper & Brothers. [ Aikin 1942 available online ]
  • Alston, Kal, 1995, “Begging the Question: Is Critical Thinking Biased?”, Educational Theory , 45(2): 225–233. doi:10.1111/j.1741-5446.1995.00225.x
  • –––, 2001, “Re/Thinking Critical Thinking: The Seductions of Everyday Life”, Studies in Philosophy and Education , 20(1): 27–40. doi:10.1023/A:1005247128053
  • American Educational Research Association, 2014, Standards for Educational and Psychological Testing / American Educational Research Association, American Psychological Association, National Council on Measurement in Education , Washington, DC: American Educational Research Association.
  • Anderson, Lorin W., David R. Krathwohl, Peter W. Airiasian, Kathleen A. Cruikshank, Richard E. Mayer, Paul R. Pintrich, James Raths, and Merlin C. Wittrock, 2001, A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives , New York: Longman, complete edition.
  • Bailin, Sharon, 1987, “Critical and Creative Thinking”, Informal Logic , 9(1): 23–30. [ Bailin 1987 available online ]
  • –––, 1988, Achieving Extraordinary Ends: An Essay on Creativity , Dordrecht: Kluwer. doi:10.1007/978-94-009-2780-3
  • –––, 1995, “Is Critical Thinking Biased? Clarifications and Implications”, Educational Theory , 45(2): 191–197. doi:10.1111/j.1741-5446.1995.00191.x
  • Bailin, Sharon and Mark Battersby, 2009, “Inquiry: A Dialectical Approach to Teaching Critical Thinking”, in Juho Ritola (ed.), Argument Cultures: Proceedings of OSSA 09 , CD-ROM (pp. 1–10), Windsor, ON: OSSA. [ Bailin & Battersby 2009 available online ]
  • –––, 2016a, “Fostering the Virtues of Inquiry”, Topoi , 35(2): 367–374. doi:10.1007/s11245-015-9307-6
  • –––, 2016b, Reason in the Balance: An Inquiry Approach to Critical Thinking , Indianapolis: Hackett, 2nd edition.
  • –––, 2021, “Inquiry: Teaching for Reasoned Judgment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 31–46. doi: 10.1163/9789004444591_003
  • Bailin, Sharon, Roland Case, Jerrold R. Coombs, and Leroi B. Daniels, 1999a, “Common Misconceptions of Critical Thinking”, Journal of Curriculum Studies , 31(3): 269–283. doi:10.1080/002202799183124
  • –––, 1999b, “Conceptualizing Critical Thinking”, Journal of Curriculum Studies , 31(3): 285–302. doi:10.1080/002202799183133
  • Blair, J. Anthony, 2021, Studies in Critical Thinking , Windsor, ON: Windsor Studies in Argumentation, 2nd edition. [Available online at https://windsor.scholarsportal.info/omp/index.php/wsia/catalog/book/106]
  • Berman, Alan M., Seth J. Schwartz, William M. Kurtines, and Steven L. Berman, 2001, “The Process of Exploration in Identity Formation: The Role of Style and Competence”, Journal of Adolescence , 24(4): 513–528. doi:10.1006/jado.2001.0386
  • Black, Beth (ed.), 2012, An A to Z of Critical Thinking , London: Continuum International Publishing Group.
  • Bloom, Benjamin Samuel, Max D. Engelhart, Edward J. Furst, Walter H. Hill, and David R. Krathwohl, 1956, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain , New York: David McKay.
  • Boardman, Frank, Nancy M. Cavender, and Howard Kahane, 2018, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Boston: Cengage, 13th edition.
  • Browne, M. Neil and Stuart M. Keeley, 2018, Asking the Right Questions: A Guide to Critical Thinking , Hoboken, NJ: Pearson, 12th edition.
  • Center for Assessment & Improvement of Learning, 2017, Critical Thinking Assessment Test , Cookeville, TN: Tennessee Technological University.
  • Cleghorn, Paul, 2021, “Critical Thinking in the Elementary School: Practical Guidance for Building a Culture of Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment, Leiden: Brill, pp. 150–167. doi: 10.1163/9789004444591_010
  • Cohen, Jacob, 1988, Statistical Power Analysis for the Behavioral Sciences , Hillsdale, NJ: Lawrence Erlbaum Associates, 2nd edition.
  • College Board, 1983, Academic Preparation for College. What Students Need to Know and Be Able to Do , New York: College Entrance Examination Board, ERIC document ED232517.
  • Commission on the Relation of School and College of the Progressive Education Association, 1943, Thirty Schools Tell Their Story , Volume V of Adventure in American Education , New York and London: Harper & Brothers.
  • Council for Aid to Education, 2017, CLA+ Student Guide . Available at http://cae.org/images/uploads/pdf/CLA_Student_Guide_Institution.pdf ; last accessed 2022 07 16.
  • Dalgleish, Adam, Patrick Girard, and Maree Davies, 2017, “Critical Thinking, Bias and Feminist Philosophy: Building a Better Framework through Collaboration”, Informal Logic , 37(4): 351–369. [ Dalgleish et al. available online ]
  • Dewey, John, 1910, How We Think , Boston: D.C. Heath. [ Dewey 1910 available online ]
  • –––, 1916, Democracy and Education: An Introduction to the Philosophy of Education , New York: Macmillan.
  • –––, 1933, How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process , Lexington, MA: D.C. Heath.
  • –––, 1936, “The Theory of the Chicago Experiment”, Appendix II of Mayhew & Edwards 1936: 463–477.
  • –––, 1938, Logic: The Theory of Inquiry , New York: Henry Holt and Company.
  • Dominguez, Caroline (coord.), 2018a, A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century , Vila Real, Portugal: UTAD. Available at http://bit.ly/CRITHINKEDUO1 ; last accessed 2022 07 16.
  • ––– (coord.), 2018b, A European Review on Critical Thinking Educational Practices in Higher Education Institutions , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDUO2 ; last accessed 2022 07 16.
  • ––– (coord.), 2018c, The CRITHINKEDU European Course on Critical Thinking Education for University Teachers: From Conception to Delivery , Vila Real: UTAD. Available at http:/bit.ly/CRITHINKEDU03; last accessed 2022 07 16.
  • Dominguez Caroline and Rita Payan-Carreira (eds.), 2019, Promoting Critical Thinking in European Higher Education Institutions: Towards an Educational Protocol , Vila Real: UTAD. Available at http:/bit.ly/CRITHINKEDU04; last accessed 2022 07 16.
  • Ennis, Robert H., 1958, “An Appraisal of the Watson-Glaser Critical Thinking Appraisal”, The Journal of Educational Research , 52(4): 155–158. doi:10.1080/00220671.1958.10882558
  • –––, 1962, “A Concept of Critical Thinking: A Proposed Basis for Research on the Teaching and Evaluation of Critical Thinking Ability”, Harvard Educational Review , 32(1): 81–111.
  • –––, 1981a, “A Conception of Deductive Logical Competence”, Teaching Philosophy , 4(3/4): 337–385. doi:10.5840/teachphil198143/429
  • –––, 1981b, “Eight Fallacies in Bloom’s Taxonomy”, in C. J. B. Macmillan (ed.), Philosophy of Education 1980: Proceedings of the Thirty-seventh Annual Meeting of the Philosophy of Education Society , Bloomington, IL: Philosophy of Education Society, pp. 269–273.
  • –––, 1984, “Problems in Testing Informal Logic, Critical Thinking, Reasoning Ability”, Informal Logic , 6(1): 3–9. [ Ennis 1984 available online ]
  • –––, 1987, “A Taxonomy of Critical Thinking Dispositions and Abilities”, in Joan Boykoff Baron and Robert J. Sternberg (eds.), Teaching Thinking Skills: Theory and Practice , New York: W. H. Freeman, pp. 9–26.
  • –––, 1989, “Critical Thinking and Subject Specificity: Clarification and Needed Research”, Educational Researcher , 18(3): 4–10. doi:10.3102/0013189X018003004
  • –––, 1991, “Critical Thinking: A Streamlined Conception”, Teaching Philosophy , 14(1): 5–24. doi:10.5840/teachphil19911412
  • –––, 1996, “Critical Thinking Dispositions: Their Nature and Assessability”, Informal Logic , 18(2–3): 165–182. [ Ennis 1996 available online ]
  • –––, 1998, “Is Critical Thinking Culturally Biased?”, Teaching Philosophy , 21(1): 15–33. doi:10.5840/teachphil19982113
  • –––, 2011, “Critical Thinking: Reflection and Perspective Part I”, Inquiry: Critical Thinking across the Disciplines , 26(1): 4–18. doi:10.5840/inquiryctnews20112613
  • –––, 2013, “Critical Thinking across the Curriculum: The Wisdom CTAC Program”, Inquiry: Critical Thinking across the Disciplines , 28(2): 25–45. doi:10.5840/inquiryct20132828
  • –––, 2016, “Definition: A Three-Dimensional Analysis with Bearing on Key Concepts”, in Patrick Bondy and Laura Benacquista (eds.), Argumentation, Objectivity, and Bias: Proceedings of the 11th International Conference of the Ontario Society for the Study of Argumentation (OSSA), 18–21 May 2016 , Windsor, ON: OSSA, pp. 1–19. Available at http://scholar.uwindsor.ca/ossaarchive/OSSA11/papersandcommentaries/105 ; last accessed 2022 07 16.
  • –––, 2018, “Critical Thinking Across the Curriculum: A Vision”, Topoi , 37(1): 165–184. doi:10.1007/s11245-016-9401-4
  • Ennis, Robert H., and Jason Millman, 1971, Manual for Cornell Critical Thinking Test, Level X, and Cornell Critical Thinking Test, Level Z , Urbana, IL: Critical Thinking Project, University of Illinois.
  • Ennis, Robert H., Jason Millman, and Thomas Norbert Tomko, 1985, Cornell Critical Thinking Tests Level X & Level Z: Manual , Pacific Grove, CA: Midwest Publication, 3rd edition.
  • –––, 2005, Cornell Critical Thinking Tests Level X & Level Z: Manual , Seaside, CA: Critical Thinking Company, 5th edition.
  • Ennis, Robert H. and Eric Weir, 1985, The Ennis-Weir Critical Thinking Essay Test: Test, Manual, Criteria, Scoring Sheet: An Instrument for Teaching and Testing , Pacific Grove, CA: Midwest Publications.
  • Facione, Peter A., 1990a, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.
  • –––, 1990b, California Critical Thinking Skills Test, CCTST – Form A , Millbrae, CA: The California Academic Press.
  • –––, 1990c, The California Critical Thinking Skills Test--College Level. Technical Report #3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST , ERIC Document ED326584.
  • –––, 1992, California Critical Thinking Skills Test: CCTST – Form B, Millbrae, CA: The California Academic Press.
  • –––, 2000, “The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skill”, Informal Logic , 20(1): 61–84. [ Facione 2000 available online ]
  • Facione, Peter A. and Noreen C. Facione, 1992, CCTDI: A Disposition Inventory , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Noreen C. Facione, and Carol Ann F. Giancarlo, 2001, California Critical Thinking Disposition Inventory: CCTDI: Inventory Manual , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Carol A. Sánchez, and Noreen C. Facione, 1994, Are College Students Disposed to Think? , Millbrae, CA: The California Academic Press. ERIC Document ED368311.
  • Fisher, Alec, and Michael Scriven, 1997, Critical Thinking: Its Definition and Assessment , Norwich: Centre for Research in Critical Thinking, University of East Anglia.
  • Freire, Paulo, 1968 [1970], Pedagogia do Oprimido . Translated as Pedagogy of the Oppressed , Myra Bergman Ramos (trans.), New York: Continuum, 1970.
  • Gigerenzer, Gerd, 2001, “The Adaptive Toolbox”, in Gerd Gigerenzer and Reinhard Selten (eds.), Bounded Rationality: The Adaptive Toolbox , Cambridge, MA: MIT Press, pp. 37–50.
  • Glaser, Edward Maynard, 1941, An Experiment in the Development of Critical Thinking , New York: Bureau of Publications, Teachers College, Columbia University.
  • Groarke, Leo A. and Christopher W. Tindale, 2012, Good Reasoning Matters! A Constructive Approach to Critical Thinking , Don Mills, ON: Oxford University Press, 5th edition.
  • Halpern, Diane F., 1998, “Teaching Critical Thinking for Transfer Across Domains: Disposition, Skills, Structure Training, and Metacognitive Monitoring”, American Psychologist , 53(4): 449–455. doi:10.1037/0003-066X.53.4.449
  • –––, 2016, Manual: Halpern Critical Thinking Assessment , Mödling, Austria: Schuhfried. Available at https://pdfcoffee.com/hcta-test-manual-pdf-free.html; last accessed 2022 07 16.
  • Hamby, Benjamin, 2014, The Virtues of Critical Thinkers , Doctoral dissertation, Philosophy, McMaster University. [ Hamby 2014 available online ]
  • –––, 2015, “Willingness to Inquire: The Cardinal Critical Thinking Virtue”, in Martin Davies and Ronald Barnett (eds.), The Palgrave Handbook of Critical Thinking in Higher Education , New York: Palgrave Macmillan, pp. 77–87.
  • Haran, Uriel, Ilana Ritov, and Barbara A. Mellers, 2013, “The Role of Actively Open-minded Thinking in Information Acquisition, Accuracy, and Calibration”, Judgment and Decision Making , 8(3): 188–201.
  • Hatcher, Donald and Kevin Possin, 2021, “Commentary: Thinking Critically about Critical Thinking Assessment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 298–322. doi: 10.1163/9789004444591_017
  • Haynes, Ada, Elizabeth Lisic, Kevin Harris, Katie Leming, Kyle Shanks, and Barry Stein, 2015, “Using the Critical Thinking Assessment Test (CAT) as a Model for Designing Within-Course Assessments: Changing How Faculty Assess Student Learning”, Inquiry: Critical Thinking Across the Disciplines , 30(3): 38–48. doi:10.5840/inquiryct201530316
  • Haynes, Ada and Barry Stein, 2021, “Observations from a Long-Term Effort to Assess and Improve Critical Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 231–254. doi: 10.1163/9789004444591_014
  • Hiner, Amanda L., 2021, “Equipping Students for Success in College and Beyond: Placing Critical Thinking Instruction at the Heart of a General Education Program”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment, Leiden: Brill, pp. 188–208. doi: 10.1163/9789004444591_012
  • Hitchcock, David, 2017, “Critical Thinking as an Educational Ideal”, in his On Reasoning and Argument: Essays in Informal Logic and on Critical Thinking , Dordrecht: Springer, pp. 477–497. doi:10.1007/978-3-319-53562-3_30
  • –––, 2021, “Seven Philosophical Implications of Critical Thinking: Themes, Variations, Implications”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 9–30. doi: 10.1163/9789004444591_002
  • hooks, bell, 1994, Teaching to Transgress: Education as the Practice of Freedom , New York and London: Routledge.
  • –––, 2010, Teaching Critical Thinking: Practical Wisdom , New York and London: Routledge.
  • Johnson, Ralph H., 1992, “The Problem of Defining Critical Thinking”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking, New York: Teachers College Press, pp. 38–53.
  • Kahane, Howard, 1971, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Belmont, CA: Wadsworth.
  • Kahneman, Daniel, 2011, Thinking, Fast and Slow , New York: Farrar, Straus and Giroux.
  • Kahneman, Daniel, Olivier Sibony, & Cass R. Sunstein, 2021, Noise: A Flaw in Human Judgment , New York: Little, Brown Spark.
  • Kenyon, Tim, and Guillaume Beaulac, 2014, “Critical Thinking Education and Debasing”, Informal Logic , 34(4): 341–363. [ Kenyon & Beaulac 2014 available online ]
  • Krathwohl, David R., Benjamin S. Bloom, and Bertram B. Masia, 1964, Taxonomy of Educational Objectives, Handbook II: Affective Domain , New York: David McKay.
  • Kuhn, Deanna, 1991, The Skills of Argument , New York: Cambridge University Press. doi:10.1017/CBO9780511571350
  • –––, 2019, “Critical Thinking as Discourse”, Human Development, 62 (3): 146–164. doi:10.1159/000500171
  • Lipman, Matthew, 1987, “Critical Thinking–What Can It Be?”, Analytic Teaching , 8(1): 5–12. [ Lipman 1987 available online ]
  • –––, 2003, Thinking in Education , Cambridge: Cambridge University Press, 2nd edition.
  • Loftus, Elizabeth F., 2017, “Eavesdropping on Memory”, Annual Review of Psychology , 68: 1–18. doi:10.1146/annurev-psych-010416-044138
  • Makaiau, Amber Strong, 2021, “The Good Thinker’s Tool Kit: How to Engage Critical Thinking and Reasoning in Secondary Education”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 168–187. doi: 10.1163/9789004444591_011
  • Martin, Jane Roland, 1992, “Critical Thinking for a Humane World”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 163–180.
  • Mayhew, Katherine Camp, and Anna Camp Edwards, 1936, The Dewey School: The Laboratory School of the University of Chicago, 1896–1903 , New York: Appleton-Century. [ Mayhew & Edwards 1936 available online ]
  • McPeck, John E., 1981, Critical Thinking and Education , New York: St. Martin’s Press.
  • Moore, Brooke Noel and Richard Parker, 2020, Critical Thinking , New York: McGraw-Hill, 13th edition.
  • Nickerson, Raymond S., 1998, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”, Review of General Psychology , 2(2): 175–220. doi:10.1037/1089-2680.2.2.175
  • Nieto, Ana Maria, and Jorge Valenzuela, 2012, “A Study of the Internal Structure of Critical Thinking Dispositions”, Inquiry: Critical Thinking across the Disciplines , 27(1): 31–38. doi:10.5840/inquiryct20122713
  • Norris, Stephen P., 1985, “Controlling for Background Beliefs When Developing Multiple-choice Critical Thinking Tests”, Educational Measurement: Issues and Practice , 7(3): 5–11. doi:10.1111/j.1745-3992.1988.tb00437.x
  • Norris, Stephen P. and Robert H. Ennis, 1989, Evaluating Critical Thinking (The Practitioners’ Guide to Teaching Thinking Series), Pacific Grove, CA: Midwest Publications.
  • Norris, Stephen P. and Ruth Elizabeth King, 1983, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1984, The Design of a Critical Thinking Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland. ERIC Document ED260083.
  • –––, 1985, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1990a, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • –––, 1990b, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • OCR [Oxford, Cambridge and RSA Examinations], 2011, AS/A Level GCE: Critical Thinking – H052, H452 , Cambridge: OCR. Past papers available at https://pastpapers.co/ocr/?dir=A-Level/Critical-Thinking-H052-H452; last accessed 2022 07 16.
  • Ontario Ministry of Education, 2013, The Ontario Curriculum Grades 9 to 12: Social Sciences and Humanities . Available at http://www.edu.gov.on.ca/eng/curriculum/secondary/ssciences9to122013.pdf ; last accessed 2022 07 16.
  • Passmore, John Arthur, 1980, The Philosophy of Teaching , London: Duckworth.
  • Paul, Richard W., 1981, “Teaching Critical Thinking in the ‘Strong’ Sense: A Focus on Self-Deception, World Views, and a Dialectical Mode of Analysis”, Informal Logic , 4(2): 2–7. [ Paul 1981 available online ]
  • –––, 1984, “Critical Thinking: Fundamental to Education for a Free Society”, Educational Leadership , 42(1): 4–14.
  • –––, 1985, “McPeck’s Mistakes”, Informal Logic , 7(1): 35–43. [ Paul 1985 available online ]
  • Paul, Richard W. and Linda Elder, 2006, The Miniature Guide to Critical Thinking: Concepts and Tools , Dillon Beach, CA: Foundation for Critical Thinking, 4th edition.
  • Payette, Patricia, and Edna Ross, 2016, “Making a Campus-Wide Commitment to Critical Thinking: Insights and Promising Practices Utilizing the Paul-Elder Approach at the University of Louisville”, Inquiry: Critical Thinking Across the Disciplines , 31(1): 98–110. doi:10.5840/inquiryct20163118
  • Possin, Kevin, 2008, “A Field Guide to Critical-Thinking Assessment”, Teaching Philosophy , 31(3): 201–228. doi:10.5840/teachphil200831324
  • –––, 2013a, “Some Problems with the Halpern Critical Thinking Assessment (HCTA) Test”, Inquiry: Critical Thinking across the Disciplines , 28(3): 4–12. doi:10.5840/inquiryct201328313
  • –––, 2013b, “A Serious Flaw in the Collegiate Learning Assessment (CLA) Test”, Informal Logic , 33(3): 390–405. [ Possin 2013b available online ]
  • –––, 2013c, “A Fatal Flaw in the Collegiate Learning Assessment Test”, Assessment Update , 25 (1): 8–12.
  • –––, 2014, “Critique of the Watson-Glaser Critical Thinking Appraisal Test: The More You Know, the Lower Your Score”, Informal Logic , 34(4): 393–416. [ Possin 2014 available online ]
  • –––, 2020, “CAT Scan: A Critical Review of the Critical-Thinking Assessment Test”, Informal Logic , 40 (3): 489–508. [Available online at https://informallogic.ca/index.php/informal_logic/article/view/6243]
  • Rawls, John, 1971, A Theory of Justice , Cambridge, MA: Harvard University Press.
  • Rear, David, 2019, “One Size Fits All? The Limitations of Standardised Assessment in Critical Thinking”, Assessment & Evaluation in Higher Education , 44(5): 664–675. doi: 10.1080/02602938.2018.1526255
  • Rousseau, Jean-Jacques, 1762, Émile , Amsterdam: Jean Néaulme.
  • Scheffler, Israel, 1960, The Language of Education , Springfield, IL: Charles C. Thomas.
  • Scriven, Michael, and Richard W. Paul, 1987, Defining Critical Thinking , Draft statement written for the National Council for Excellence in Critical Thinking Instruction. Available at http://www.criticalthinking.org/pages/defining-critical-thinking/766 ; last accessed 2022 07 16.
  • Sheffield, Clarence Burton Jr., 2018, “Promoting Critical Thinking in Higher Education: My Experiences as the Inaugural Eugene H. Fram Chair in Applied Critical Thinking at Rochester Institute of Technology”, Topoi , 37(1): 155–163. doi:10.1007/s11245-016-9392-1
  • Siegel, Harvey, 1985, “McPeck, Informal Logic and the Nature of Critical Thinking”, in David Nyberg (ed.), Philosophy of Education 1985: Proceedings of the Forty-First Annual Meeting of the Philosophy of Education Society , Normal, IL: Philosophy of Education Society, pp. 61–72.
  • –––, 1988, Educating Reason: Rationality, Critical Thinking, and Education , New York: Routledge.
  • –––, 1999, “What (Good) Are Thinking Dispositions?”, Educational Theory , 49(2): 207–221. doi:10.1111/j.1741-5446.1999.00207.x
  • Simon, Herbert A., 1956, “Rational Choice and the Structure of the Environment”, Psychological Review , 63(2): 129–138. doi: 10.1037/h0042769
  • Simpson, Elizabeth, 1966–67, “The Classification of Educational Objectives: Psychomotor Domain”, Illinois Teacher of Home Economics , 10(4): 110–144, ERIC document ED0103613. [ Simpson 1966–67 available online ]
  • Skolverket, 2018, Curriculum for the Compulsory School, Preschool Class and School-age Educare , Stockholm: Skolverket, revised 2018. Available at https://www.skolverket.se/download/18.31c292d516e7445866a218f/1576654682907/pdf3984.pdf; last accessed 2022 07 15.
  • Smith, B. Othanel, 1953, “The Improvement of Critical Thinking”, Progressive Education , 30(5): 129–134.
  • Smith, Eugene Randolph, Ralph Winfred Tyler, and the Evaluation Staff, 1942, Appraising and Recording Student Progress , Volume III of Adventure in American Education , New York and London: Harper & Brothers.
  • Splitter, Laurance J., 1987, “Educational Reform through Philosophy for Children”, Thinking: The Journal of Philosophy for Children , 7(2): 32–39. doi:10.5840/thinking1987729
  • Stanovich, Keith E., and Paula J. Stanovich, 2010, “A Framework for Critical Thinking, Rational Thinking, and Intelligence”, in David D. Preiss and Robert J. Sternberg (eds), Innovations in Educational Psychology: Perspectives on Learning, Teaching and Human Development, New York: Springer Publishing, pp. 195–237.
  • Stanovich, Keith E., Richard F. West, and Maggie E. Toplak, 2011, “Intelligence and Rationality”, in Robert J. Sternberg and Scott Barry Kaufman (eds.), Cambridge Handbook of Intelligence, Cambridge: Cambridge University Press, 3rd edition, pp. 784–826. doi:10.1017/CBO9780511977244.040
  • Tankersley, Karen, 2005, Literacy Strategies for Grades 4–12: Reinforcing the Threads of Reading , Alexandria, VA: Association for Supervision and Curriculum Development.
  • Thayer-Bacon, Barbara J., 1992, “Is Modern Critical Thinking Theory Sexist?”, Inquiry: Critical Thinking Across the Disciplines , 10(1): 3–7. doi:10.5840/inquiryctnews199210123
  • –––, 1993, “Caring and Its Relationship to Critical Thinking”, Educational Theory , 43(3): 323–340. doi:10.1111/j.1741-5446.1993.00323.x
  • –––, 1995a, “Constructive Thinking: Personal Voice”, Journal of Thought , 30(1): 55–70.
  • –––, 1995b, “Doubting and Believing: Both are Important for Critical Thinking”, Inquiry: Critical Thinking across the Disciplines , 15(2): 59–66. doi:10.5840/inquiryctnews199515226
  • –––, 2000, Transforming Critical Thinking: Thinking Constructively , New York: Teachers College Press.
  • Toulmin, Stephen Edelston, 1958, The Uses of Argument , Cambridge: Cambridge University Press.
  • Turri, John, Mark Alfano, and John Greco, 2017, “Virtue Epistemology”, in Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Winter 2017 Edition). URL = < https://plato.stanford.edu/archives/win2017/entries/epistemology-virtue/ >
  • Vincent-Lancrin, Stéphan, Carlos González-Sancho, Mathias Bouckaert, Federico de Luca, Meritxell Fernández-Barrerra, Gwénaël Jacotin, Joaquin Urgel, and Quentin Vidal, 2019, Fostering Students’ Creativity and Critical Thinking: What It Means in School. Educational Research and Innovation , Paris: OECD Publishing.
  • Warren, Karen J., 1988, “Critical Thinking and Feminism”, Informal Logic, 10(1): 31–44. [ Warren 1988 available online ]
  • Watson, Goodwin, and Edward M. Glaser, 1980a, Watson-Glaser Critical Thinking Appraisal, Form A , San Antonio, TX: Psychological Corporation.
  • –––, 1980b, Watson-Glaser Critical Thinking Appraisal: Forms A and B; Manual, San Antonio, TX: Psychological Corporation.
  • –––, 1994, Watson-Glaser Critical Thinking Appraisal, Form B , San Antonio, TX: Psychological Corporation.
  • Weinstein, Mark, 1990, “Towards a Research Agenda for Informal Logic and Critical Thinking”, Informal Logic , 12(3): 121–143. [ Weinstein 1990 available online ]
  • –––, 2013, Logic, Truth and Inquiry , London: College Publications.
  • Willingham, Daniel T., 2019, “How to Teach Critical Thinking”, Education: Future Frontiers , 1: 1–17. [Available online at https://prod65.education.nsw.gov.au/content/dam/main-education/teaching-and-learning/education-for-a-changing-world/media/documents/How-to-teach-critical-thinking-Willingham.pdf.]
  • Zagzebski, Linda Trinkaus, 1996, Virtues of the Mind: An Inquiry into the Nature of Virtue and the Ethical Foundations of Knowledge , Cambridge: Cambridge University Press. doi:10.1017/CBO9781139174763


What Is Cognitive Bias? 7 Examples & Resources (Incl. Codex)


For example, we might:

  • Trust someone more if they’re an authority figure than if they’re not
  • Assume someone’s gender based on their profession
  • Make poor decisions based on the information that we’re given

Our poor decision making is often a consequence of heuristics and biases. In general, heuristics and biases describe a set of decision-making strategies and the ways we weigh certain types of information. The existing literature on cognitive biases and heuristics is extensive; this post is a user-friendly summary.

This post focuses on how cognitive heuristics and biases influence our decision making, and on how we can overcome them.


This Article Contains:

  • What Are Cognitive Biases?
  • List and Types of Biases: The Codex
  • 4 Examples of Cognitive Biases
  • Examples in Business and Everyday Life
  • Role of Biases in Decision Making
  • 2 Popular Experiments
  • 4 Ways to Overcome Your Biases
  • Bias Modification Exercises and Activities
  • A Look at Cognitive Bias Modification Apps
  • 5 Relevant Books
  • Our Favorite TED Talks on the Topic
  • Resources From PositivePsychology.com
  • A Take-Home Message

When considering the term ‘ cognitive biases ,’ it’s important to note that there is overlap between cognitive biases and heuristics . Sometimes these two terms are used interchangeably, as though they are synonyms; however, their relationship is nuanced.

In his book Thinking, Fast and Slow, Professor Daniel Kahneman (2011, p. 98) defines a heuristic as

“a simple procedure that helps find adequate, though often imperfect, answers to difficult questions.”

Tversky and Kahneman (1974, p. 1130) define the relationship between biases and heuristics as follows:

“… cognitive biases that stem from the reliance on judgmental heuristics.”

Gonzalez (2017, p. 251) also described the difference between the two terms:

“Heuristics are the ‘shortcuts’ that humans use to reduce task complexity in judgment and choice, and biases are the resulting gaps between normative behavior and the heuristically determined behavior.”

Lists and Types of Biases: The Codex

cognitive bias codex

Created by John Manoogian III and Buster Benson, this codex is a useful tool for visually representing all of the known biases that exist to date.

The biases are arranged in a circle and can be divided into four quadrants. Each quadrant is dedicated to a specific group of cognitive biases:

  • What should we remember? Biases that affect our memory for people, events, and information
  • Too much information: Biases that affect how we perceive certain events and people
  • Not enough meaning: Biases that we use when we have too little information and need to fill in the gaps
  • Need to act fast: Biases that affect how we make decisions

The Cognitive Bias Codex is a handy visual tool that organizes biases in a meaningful way; however, it is worth pointing out that the codex lists heuristics and biases both as ‘biases.’

If you decide to rely on the Cognitive Bias Codex, then keep in mind the distinction between heuristics and biases mentioned above.

4 Examples of Cognitive Biases

1. Confirmation bias

This bias is based on looking for or overvaluing information that confirms our beliefs or expectations (Edgar & Edgar, 2016; Nickerson, 1998). For example, a police officer who is looking for physical signs of lying might mistakenly classify other behaviors as evidence of lying.

2. Gambler’s fallacy

This false belief describes our tendency to believe that something will happen because it hasn’t happened yet (Ayton & Fischer, 2004; Clotfelter & Cook, 1993).

For example, when betting at a roulette table, if previous spins have landed on red, we might mistakenly assume that the next spin is due to land on black; however, each spin is independent of the others (i.e., the outcome of one spin does not affect the probability of the next). The short simulation below illustrates this independence.
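
To make the independence concrete, here is a minimal Monte Carlo sketch in Python (not from the original article). It assumes a simplified European-style wheel with 18 red, 18 black, and 1 green pocket, and compares the chance of red immediately after three reds in a row with the unconditional chance of red.

```python
import random

# A minimal sketch of the gambler's fallacy (illustrative only).
# Assumed wheel: 18 red, 18 black, 1 green pocket, so P(red) = 18/37 ~ 0.486.

random.seed(42)

def spin() -> str:
    """Return the colour of a single independent roulette spin."""
    return random.choices(["red", "black", "green"], weights=[18, 18, 1])[0]

trials = 200_000
history = [spin(), spin(), spin()]
streaks = 0            # number of times we saw three reds in a row
reds_after_streak = 0  # how often the *next* spin was also red

for _ in range(trials):
    outcome = spin()
    if history == ["red", "red", "red"]:
        streaks += 1
        if outcome == "red":
            reds_after_streak += 1
    history = history[1:] + [outcome]

print(f"Unconditional P(red):         {18 / 37:.3f}")
print(f"P(red | three reds in a row): {reds_after_streak / streaks:.3f}")
# Both numbers come out essentially the same: past spins do not change the odds.
```

Because each spin is generated independently, the conditional estimate matches the unconditional one; the streak carries no information about the next outcome.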

3. Gender bias

Gender bias describes our tendency to assign specific behavior and characteristics to a particular gender without supporting evidence (Garb, 1997).

For example, complaints of pain are taken more seriously when made by male, rather than female, patients (Gawande, 2014); women are perceived as better caregivers than men (Anthony, 2004); specific clinical syndromes are more readily diagnosed in women than in men (Garb, 1997); and students often rate female lecturers lower than male lecturers (MacNell, Driscoll, & Hunt, 2014; Mitchell & Martin, 2018).

4. Group attribution error

This error describes our tendency to overgeneralize how a group of people will behave based on an interaction with only one person from that group (Pettigrew, 1979).

For example, a negative experience with someone from a different group (e.g., a different culture, gender, religion, political party, etc.) might make us say that all members of that group share the same negative characteristics. Group attribution error forms part of the explanation for prejudice in social psychology.

Examples in Business and Everyday Life

Gender bias in the workplace is a well-documented and researched area of cognitive bias. Women are often underrepresented in top senior positions: in 2010, for example, only 15.2% of top positions in US Fortune 500 companies were held by women (Soares, 2010). Women also tend to earn less than their male counterparts, and women’s salaries differ according to their marital status.

For example, consider these statistics reported by Güngör and Biernat (2009, p. 232):

“ [In 2005]  … 68.1% of married and 79.8% of single mothers in the U.S. participate in the workforce, but while non-mothers earn 90 cents to a man’s dollar, mothers earn 73 cents, and single mothers earn about 60 cents.”

The social desirability bias, the tendency to answer questions in a way that others will view favorably, is a concern for anyone who uses self-report data. Companies that run internal surveys on topics that may cast an employee in a poor light must be aware of how the social desirability bias will affect the validity of their data.

Knowing that people adjust their answers to appear more socially desirable, investigators (such as researchers and clinicians) can try to reframe their questions to be less direct, use formal tests, or anonymize responses.

Another sphere of our lives where biases can have devastating effects is in personal finance. According to Hershey, Jacobs-Lawson, and Austin (2012), there are at least 40 cognitive biases that negatively affect our ability to make sound financial decisions, thus hindering our ability to plan for retirement properly. Some of these biases include:

  • Halo effect (just because that real estate agent was nice doesn’t mean it’s a good deal)
  • Optimistic overconfidence (“I’ll be fine in the future, so I don’t need to save that much now.”)
  • Confirmation bias (looking for information to confirm or validate unwise financial decisions)

Role of Biases in Decision Making

The following examples offer revealing insight into how biases affect our decision making.

The Monty Hall problem


Assume that there are three doors.

  • Behind one door is a fantastic prize: a car.
  • Behind the other two doors are mediocre prizes: $1,000.

You initially choose Door 1. Before revealing what’s behind your chosen door, the presenter, who knows where the car is, opens a different door, Door 2, to reveal a mediocre prize. The presenter then gives you the option to keep your initial choice or to change it, knowing what’s behind Door 2. What should you do: stay with Door 1 or switch to Door 3?

The correct answer is that you have the best chances of winning the car if you change your choice. This is called the Monty Hall problem. Here’s why you should switch:

  • When you made your initial choice, there was only a 1-in-3 chance that the car was behind your door.
  • The host, who knows where the car is, then removes a losing door from the remaining two, so the remaining 2-in-3 probability is concentrated on the one unopened door you did not pick.
  • Based on this information, you are more likely to find the car if you change your chosen door: your odds improve from 1 in 3 (staying) to 2 in 3 (switching).

Despite the statistics being in favor of switching, most people are hesitant to abandon their first choice and don’t accept the offer to change it.
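
If the 2-in-3 figure feels counterintuitive, a quick simulation can make it tangible. The following is a minimal Python sketch (not part of the original article) that plays the game many times under the standard assumptions: the car is placed at random, and the host always opens a losing door that the player did not choose.

```python
import random

# A minimal Monte Carlo sketch of the Monty Hall problem (illustrative only).
# Assumptions: three doors, one car placed uniformly at random, and a host who
# always opens a losing door that the player did not choose.

random.seed(0)

def play(switch: bool) -> bool:
    """Play one round; return True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    first_pick = random.choice(doors)

    # Host opens a door that is neither the player's pick nor the car.
    host_opens = random.choice([d for d in doors if d != first_pick and d != car])

    if switch:
        # Switch to the single remaining unopened door.
        final_pick = next(d for d in doors if d not in (first_pick, host_opens))
    else:
        final_pick = first_pick

    return final_pick == car

trials = 100_000
stay_wins = sum(play(switch=False) for _ in range(trials))
switch_wins = sum(play(switch=True) for _ in range(trials))

print(f"Win rate when staying:   {stay_wins / trials:.3f}")   # ~0.333
print(f"Win rate when switching: {switch_wins / trials:.3f}")  # ~0.667
```

Running the sketch reproduces the textbook result: staying wins about a third of the time, switching about two-thirds, which is exactly the gap our intuition tends to resist.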

Other cognitive biases

The Monty Hall problem is an excellent example of how our intuitions and heuristics lead us to make poor decisions. However, there are lots of other cognitive biases and heuristics that also affect our decision making.

Kahneman, Slovic, and Tversky (1982) list 13 biases that arise from the following three heuristics:

  • Representativeness: One bias that can result from this heuristic is ignoring the base rate of events when making decisions. For example, I am afraid of flying, even though I am far more likely to be involved in a car crash than in a plane crash. Despite this, I still hate flying but am indifferent to hopping into my car. (A rough numerical sketch of base-rate reasoning follows this list.)
  • Availability: For example, when a violent crime occurs in a neighborhood, residents will give a higher estimate of the frequency of such crimes than the reported statistics support. The memory of the violent crime is easy to retrieve, which makes it seem as though violent crime happens more often than it actually does.
  • Anchoring and adjustment: For example, assume that I offer to sell you a car and ask for $250. You counter with $200. You might think this is a good deal because you bought the car for less than the asking price; however, your counteroffer was heavily influenced by my asking price, and you are unlikely to deviate far from it.
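
To see why ignoring base rates distorts judgment, here is a small, purely illustrative Python calculation; the screening-test numbers are hypothetical and are not statistics from the article. It applies Bayes’ rule to show that when a condition is rare, even an accurate test produces mostly false alarms.

```python
# A hypothetical base-rate illustration (the numbers are made up for the example).
# Suppose a condition affects 1 in 1,000 people, and a test detects it 99% of
# the time but also gives a false positive 5% of the time.

base_rate = 0.001           # P(condition)
sensitivity = 0.99          # P(positive | condition)
false_positive_rate = 0.05  # P(positive | no condition)

# Bayes' rule: P(condition | positive)
#   = P(positive | condition) * P(condition) / P(positive)
p_positive = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
p_condition_given_positive = sensitivity * base_rate / p_positive

print(f"P(condition | positive test) = {p_condition_given_positive:.3f}")
# ~0.019: fewer than 2% of positive results are true positives, because the
# base rate is so low. Ignoring the base rate makes the test feel far more
# conclusive than it really is.
```

The same logic applies to the flying example above: what matters is not how vivid a plane crash is, but how rarely it occurs relative to the alternative.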

To further illustrate the effect of cognitive bias, below are two popular experiments.

1. Anchoring and adjustment

Tversky and Kahneman (1974) found that our estimates are heavily influenced by the first number given to us. For example, participants were asked to estimate the percentage of African countries in the United Nations.

Before giving their answer, each participant had to spin a ‘Wheel of Fortune,’ which landed on a random, meaningless number. Despite this, participants’ estimates of the percentage of African UN member countries were pulled toward whatever number the wheel landed on: higher anchors produced systematically higher estimates.

2. The attractiveness halo effect

Male students were asked to rate essays written by female authors (Landy & Sigall, 1974). The quality of the essays varied: some were poorly written, and others were well written.

Additionally, some of the essays were accompanied by a photograph of the author (who was either attractive or unattractive), and others were not. The male students rated the quality of the essay and the talent of its author higher when the essay was attributed to an attractive author, and this effect was evident only when the essay was of poor quality.

In this study, the male students demonstrated the halo effect, applying the perceived attractiveness of the female author to the quality of the paper.

4 Ways to Overcome Your Biases

1. Reflect on past decisions

If you’ve been in a similar situation before, you can reflect on the outcomes of those previous decisions to learn how to overcome your biases.

An example of this is budgeting. We tend to underestimate how much money we need to budget for certain areas of our life. However, you can learn how much money to budget by tracking your expenditure for the last few months. Using this information from the past, you can better predict how much money you’ll need for different financial categories in the future.

2. Include external viewpoints

There is some evidence that we make better decisions and negotiate better outcomes when we consult people who are objective, such as mediators and facilitators (Caputo, 2016).

Therefore, before making a decision, talk to other people to consider different viewpoints and have your own views challenged. Importantly, others may spot cognitive biases that you cannot see in yourself.

3. Challenge your viewpoints

When making a decision, try to see the weaknesses in your thinking regardless of how small, unlikely, or inconsequential these weaknesses might seem. You can be more confident in your decision if it withstands serious, critical scrutiny.

4. Do not make decisions under pressure

A final way to protect yourself from relying on your cognitive biases is to avoid making any decisions under time pressure. Although it might not feel like it, there are very few instances when you need to make a decision immediately. Here are some tips for making a decision that can have substantial consequences:

  • Take the necessary time to ruminate.
  • List the pros and cons.
  • Talk to friends or family members for advice (but remember that they may have their own biases).
  • Try to poke holes in your reasoning.

Bias Modification Exercises and Activities

In the last decade, research has looked at cognitive bias modification (CBM) since cognitive biases are associated with the severity of anxiety and depression. The relationship between cognitive biases and anxiety and depression is assumed to be causal; that is, cognitive biases cause an increase in the severity of symptoms.

CBM exercises are designed with this causal relationship in mind. If the cognitive bias is removed or reduced, then the severity of the symptoms should also lessen.

There are two categories of CBM exercises:

  • Changing attentional bias: In this type of exercise, participants are trained to pay more attention to positive stimuli instead of negative stimuli.
  • Changing interpretation bias: Participants are primed with positive information before completing an emotionally ambiguous task.

At least six meta-analyses report conflicting findings (Beard, Sawyer, & Hofmann, 2012; Cristea, Kok, & Cuijpers, 2015; Hakamata et al., 2010; Hallion & Ruscio, 2011; Heeren, Mogoașe, Philippot, & McNally, 2015; Mogoaşe, David, & Koster, 2014).

There are many reasons for these differences; for example, the types of studies included, the moderators included, the definition of the interventions, the outcome variable used, the clinical condition studied, and so forth. Therefore, the jury is still out on whether CBM affects symptom severity reliably.

A Look at Cognitive Bias Modification Apps

There are many cognitive bias modification apps available for download. Before purchasing an app, research whether its creator followed sound research principles or did any research when developing it (Zhang, Ying, Song, Fung, & Smith, 2018).

Most of the bias modification apps aim to change the attentional bias. For example, the following apps aim to train users to respond quicker to happy faces than to sad or angry faces. All hypothesize that repeated use will result in more positive moods.

  • Bias Modification
  • Upbeat Mind: Positivity Trainer

The Cognitive Bias Cheatsheet is a useful way to remind oneself of the different cognitive biases that exist.

5 Relevant Books

Here is a list of books relevant to anyone interested in cognitive biases.

Firstly, any list about biases would be remiss without Thinking, Fast and Slow by Daniel Kahneman (2011). In this book, Kahneman unpacks some of the most common biases that we experience when making decisions. (Available on Amazon )

In the same vein is The Drunkard’s Walk: How Randomness Rules Our Lives by Leonard Mlodinow (2009). This book addresses how humans misjudge the effect that randomness has on our decision making. (Available on Amazon )

Predictably Irrational by Dan Ariely (2008) is an excellent and very accessible book about how our behavior is often governed by seemingly random and illogical thought processes. The opening chapter is jaw dropping. (Available on Amazon )

Nassim Nicholas Taleb published a series of books – five, in fact – and I include two of them on this list: Fooled by Randomness (2005) and The Black Swan (2007). The entire series discusses various aspects of uncertainty. (Available on Amazon )

Our Favorite TED Talks on the Topic

We’ve put together a list of our favorite TED Talks on cognitive biases.

If you want to learn more about cognitive biases, then these talks are a great jumping-off point:

Are We in Control of Our Own Decisions? – Dan Ariely

Confirmation Bias – Nassor Al Hilal

Confirmation Bias in 5 Minutes – Julia Galef

If you want to learn how to overcome your biases, then we can recommend the following:

How to Outsmart Your Own Unconscious Bias – Valerie Alexander

How to design gender bias out of your workplace – Sara Sanford

Unpacking the biases that shape our beliefs – Mike Hartmann

Resources From PositivePsychology.com

We have useful resources that you can use when tackling cognitive biases.

First, increasing awareness of Unhelpful Thinking Styles can change the way you think about yourself and your environment. Ultimately, users will increase their awareness of their cognitive biases, and through this awareness, be able to change their behavior.

Our Neutralizing Judgmental Thoughts worksheet is also useful for combating negative thoughts and biases. This exercise helps users apply the CLEAR acronym to adopt a less critical outlook when dealing with others.

The Core Beliefs Worksheet  is a useful tool for reflecting on the origin and validity of our core beliefs. This technique might help us ‘step away’ from our biases.

An approach that is always beneficial is to understand and find ways to apply positive psychology to your everyday life; this selection of positive psychology TED Talks is a good starting point.

If you’re looking for more science-based ways to help others through CBT, this collection contains 17 validated positive CBT tools for practitioners. Use them to help others overcome unhelpful thoughts and feelings and develop more positive behaviors.

A Take-Home Message

We often rely on cognitive heuristics and biases when making decisions.

Heuristics can be useful in certain circumstances; however, heuristics and biases can result in poor decision making and reinforce unhealthy behavior.

There are many different types of cognitive biases, and all of us are victim to one or more.

However, being aware of our biases and how they affect our behavior is the first step toward resisting them.


  • Anthony, A. S. (2004). Gender bias and discrimination in nursing education: Can we change it? Nurse Educator, 29 (3), 121–125.
  • Ariely, D. (2008). Predictably irrational. Harper Perennial.
  • Ayton, P., & Fischer, I. (2004). The hot hand fallacy and the gambler’s fallacy: Two faces of subjective randomness? Memory & Cognition, 32 (8), 1369–1378.
  • Beard, C., Sawyer, A. T., & Hofmann, S. G. (2012). Efficacy of attention bias modification using threat and appetitive stimuli: A meta-analytic review. Behavior Therapy, 43 (4), 724–740.
  • Caputo, A. (2016). Overcoming judgmental biases in negotiations: A scenario-based survey analysis on third party direct intervention. Journal of Business Research, 69 (10), 4304–4312.
  • Clotfelter, C. T., & Cook, P. J. (1993). The “gambler’s fallacy” in lottery play. Management Science, 39(12), 1521–1525.
  • Cristea, I. A., Kok, R. N., & Cuijpers, P. (2015). Efficacy of cognitive bias modification interventions in anxiety and depression: Meta-analysis. The British Journal of Psychiatry, 206 (1), 7–16.
  • Edgar, G., & Edgar, H. (2016). Perception and attention: Errors and accidents. In D. Groome & M. W. Eysenck (Eds.), An introduction to applied cognitive psychology (2nd ed., pp. 9–38). Routledge.
  • Garb, H. N. (1997). Race bias, social class bias, and gender bias in clinical judgment. Clinical Psychology: Science and Practice, 4 (2), 99–120.
  • Gawande, A. (2014). Being mortal: Medicine and what matters in the end. Metropolitan Books.
  • Gonzalez, C. (2017). Decision-making: A cognitive science perspective. In S. Chipman (Ed.), The Oxford handbook of cognitive science (pp. 249–264). Oxford University Press. Accessed on July 9, 2020 from https://www.cmu.edu/dietrich/sds/ddmlab/papers/oxfordhb-9780199842193-e-6.pdf
  • Güngör, G., & Biernat, M. (2009). Gender bias or motherhood disadvantage? Judgments of blue-collar mothers and fathers in the workplace. Sex Roles, 60 (3–4), 232–246.
  • Hakamata, Y., Lissek, S., Bar-Haim, Y., Britton, J. C., Fox, N. A., Leibenluft, E., … & Pine, D. S. (2010). Attention bias modification treatment: A meta-analysis toward the establishment of novel treatment for anxiety. Biological Psychiatry, 68 (11), 982–990.
  • Hallion, L. S., & Ruscio, A. M. (2011). A meta-analysis of the effect of cognitive bias modification on anxiety and depression. Psychological Bulletin, 137 (6), 940.
  • Heeren, A., Mogoașe, C., Philippot, P., & McNally, R. J. (2015). Attention bias modification for social anxiety: A systematic review and meta-analysis. Clinical Psychology Review, 40 , 76–90.
  • Hershey, D. A., Jacobs-Lawson, J. M., & Austin, J. T. (2012). Effective financial planning for retirement. In M. Wang (Ed.), Oxford handbook of retirement (pp. 402–430). Oxford University Press.
  • Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus, and Giroux.
  • Kahneman, D., Slovic, P., & Tversky, A. (Eds.). (1982). Judgment under uncertainty: Heuristics and biases. Cambridge University Press.
  • Landy, D., & Sigall, H. (1974). Beauty is talent: Task evaluation as a function of the performer’s physical attractiveness. Journal of Personality and Social Psychology, 29 (3), 299.
  • MacNell, L., Driscoll, A., & Hunt, A. N. (2014). What’s in a name: Exposing gender bias in student ratings of teaching. Innovative Higher Education, 40 (4), 291–303.
  • Mitchell, K. M., & Martin, J. (2018). Gender bias in student evaluations. PS: Political Science & Politics, 51 (3), 648–652.
  • Mlodinow, L. (2009). The drunkard’s walk: How randomness rules our lives. Vintage.
  • Mogoaşe, C., David, D., & Koster, E. H. (2014). Clinical efficacy of attentional bias modification procedures: An updated meta‐analysis. Journal of Clinical Psychology, 70 (12), 1133–1157.
  • Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2 (2), 175–220.
  • Pettigrew, T. F. (1979). The ultimate attribution error: Extending Allport’s cognitive analysis of prejudice. Personality and Social Psychology Bulletin, 5 (4), 461–476.
  • Soares, R. (2010). 2010 Catalyst census: Fortune 500 women board directors . Catalyst.
  • Taleb, N. N. (2005). Fooled by randomness: The hidden role of chance in life and in the markets (vol. 1). Random House.
  • Taleb, N. N. (2007). The black swan: The impact of the highly improbable (vol. 2). Random House.
  • Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185 , 1124–1131.
  • Zhang, M., Ying, J., Song, G., Fung, D. S., & Smith, H. (2018). Attention and cognitive bias modification apps: Review of the literature and of commercially available apps. JMIR mHealth and uHealth , 6 (5).



Recognizing Bias: A Problem Solving and Critical Thinking Skills Guide

In today's world, it is becoming increasingly important to recognize bias and how it can affect our decision-making. Bias can cloud our judgement, lead us to make decisions that are not in our best interests, and limit our ability to solve problems effectively. In this guide, we will explore the concept of recognizing bias and how it can be used as a tool for developing critical thinking and problem-solving skills. We will discuss the various types of biases, why recognizing them is important, and how to identify and counteract them.

Bias can lead to unfair judgments or decisions. Common types of bias include cultural bias, the tendency to favor one’s own culture or group, and political bias, the tendency to favor one’s own political party or beliefs. In order to identify and address bias in oneself and others, it is important to be aware of potential sources of bias, including personal opinions, values, and preconceived notions. Being mindful of these potential sources can help us become more aware of our own biases and recognize them in others.

It is also important to be open-minded and willing to consider alternative perspectives, and it helps to challenge our own assumptions and beliefs by questioning them and seeking out evidence that supports or refutes them.

Implications of Not Recognizing or Addressing Bias

The potential implications of not recognizing or addressing bias are significant. If left unchecked, biases can lead to unfair decisions or judgments, as well as inaccurate conclusions. This can have serious consequences for individuals and organizations alike.

Strategies for Identifying and Addressing Bias

Recognizing bias in oneself and others is an important part of making informed decisions. There are several strategies that can be used to identify and address bias. One of the most effective strategies is to take a step back and look at the situation objectively. This involves examining the facts and assumptions that are being used to make decisions.

It can also involve assessing the potential impact of decisions on multiple stakeholders. By removing personal biases from the equation, it is possible to make more informed decisions. Another important strategy for identifying and addressing bias is to question the sources of information. It is important to consider the credibility of sources, as well as any potential biases that may be present.

Fact-checking sources and considering multiple perspectives can help identify any potential biases in the information being used. In addition, it is important to remain aware of our own biases. We all have preconceived notions about certain topics that can affect our decision-making process. By being mindful of our biases, we can avoid making decisions that are influenced by them. Finally, it is important to be open to other perspectives and willing to engage in meaningful dialogue with others.

What Is Bias?

Bias is a preference or tendency, often unconscious, that influences decision making and can lead to adverse outcomes. It is important to recognize bias because it can have a negative impact on our ability to make sound decisions and to engage in problem solving and critical thinking. Bias can manifest itself in various ways, from subtle mental shortcuts to overt prejudices.

Types of Bias

Types of bias include confirmation bias, where we seek out information that confirms our existing beliefs; availability bias, where we base decisions on the information that is most readily available; and representativeness bias, where we assume that two events or objects are related because they share similar characteristics. Other forms of bias include the halo effect, where a single positive quality or trait influences the perception of an entire person, and stereotyping, the tendency to make judgments about individuals based on their perceived membership in a certain group. Recognizing bias in ourselves and others allows us to make informed decisions and engage in problem solving and critical thinking.

Sources of Bias

Bias can have a profound effect on decisions, leading to outcomes that are not based on facts or evidence. Personal opinions and values can lead to biased decision-making. They can be shaped by past experiences, cultural background, and other personal factors. For example, someone's opinion about a certain topic may be based on what they have previously heard or read. Similarly, preconceived notions can also lead to biased conclusions. Cultural norms can also play a role in creating bias.

For instance, people may be more likely to believe information from a source they trust or respect, even if it is not based on fact. Similarly, people may be more likely to make decisions that conform to the expectations of their culture or society. In addition, people can also be influenced by their own prejudices or stereotypes. This type of bias can lead to unfair treatment of certain individuals or groups of people. Finally, it is important to be aware of the potential for confirmation bias, where people will seek out information that confirms their existing beliefs and disregard any contradictory evidence. By recognizing and understanding these sources of bias, people can make more informed decisions and engage in more effective problem solving and critical thinking.

In conclusion, recognizing and addressing bias is an essential part of problem solving and critical thinking. Bias can come from many sources, including our own beliefs, cultural norms, and past experiences. Knowing the types of bias and strategies for identifying and addressing them can help us make informed decisions and better engage in critical thinking. Taking time to reflect on our own biases is also important for making unbiased decisions.

Ultimately, recognizing and addressing bias will improve our problem-solving and critical thinking skills.

Signs of Different Types of Biases and How to Overcome Each of Them

These biases can unknowingly impact your thoughts and behaviors.


This article covers the following types of bias:

  • Confirmation Bias
  • Attribution Bias
  • Conformity Bias
  • Beauty Bias
  • Gender Bias
  • Ageism
  • The Contrast Effect

Bias refers to a tendency or preference towards a certain group, idea, or concept that influences our judgments and decisions.

Our experiences, culture, social norms, and personal beliefs often shape these beliefs. The way we act on these biases can be either conscious or unconscious and can lead to prejudiced or discriminatory behaviors.

“Bias can play a significant role in day-to-day interactions and relationships, often influencing our thoughts, attitudes, and behaviors toward others,” says David Yadush, LPC, a licensed professional counselor at BetterHelp . “This can result in misinterpreting or overlooking facts and can change how we perceive people or events in our lives.”

Along with affecting our everyday interactions, being unaware of biases—or falling prey to them even when we know they exist—can hinder personal growth .

In this article, we’re outlining common types of biases, and discussing the signs of each type and ways to overcome them.

Why It's Important to Assess Your Biases

In order to recognize and work through bias, it’s important for us to challenge our assumptions and the subconscious stereotypes we make on a daily basis. This can be done by seeking out diverse perspectives, enjoying new experiences, and advocating for equal opportunity and treatment for everyone.

Of course, it also helps to understand the types of biases we’re apt to fall prey to so we can recognize and correct them in real-time.

Confirmation Bias

Confirmation bias is the tendency to seek out information that reaffirms our existing beliefs. In doing so, we tend to ignore information that contradicts our beliefs , which can lead us toward untruths.

Signs of confirmation bias may include:

  • Seeking information that confirms our beliefs
  • Ignoring or dismissing information that contradicts our beliefs
  • Failing to consider alternative opinions

“This bias can be harmful as it may prevent individuals from considering alternative viewpoints and may lead to closed-mindedness,” warns Yadush.

How to Overcome Confirmation Bias

“To recognize and work through confirmation bias, individuals should actively seek out diverse perspectives and information, consider alternative viewpoints, and engage in critical thinking and self-reflection,” says Yadush.

Attribution Bias

Attribution bias is a cognitive distortion in which we view the behavior of others as driven by internal factors, such as morals and character, while attributing our own behavior to external factors, such as circumstances and environment.

Signs of attribution bias may include:

  • Consistently blaming others for problems or failures
  • Being overly critical of others
  • Excusing our own mistakes without reflection

“Simply speaking, one tends to give themselves a break for their own mistakes or shortcomings as unavoidable but will blame others for similar mistakes or shortcomings as intentional,” explains Karri Francisco, LMFT, director of family programming at APN .

She says that this is intellectually dangerous because it leads to unfair judgments of others. It can also make it harder to learn from our own mistakes since this bias prevents us from taking responsibility for our actions.

How to Overcome Attribution Bias

Francisco says that practicing empathy and perspective-taking can help you move away from falling prey to attribution bias.

Conformity Bias

Conformity bias is when we simply agree, or conform, with the opinions and behaviors of others in a group setting, even when doing so goes against our own beliefs or knowledge.

Signs of conformity bias may include:

  • Vocally agreeing with others even when you inwardly disagree
  • Not sharing your own thoughts and feelings out of fear of being “ousted” or judged in a group setting
  • Going along with a group that’s acting irresponsibly or cruelly when you know inwardly the behavior is wrong

Yadush says, “This is typically an unconscious process that we go through in an attempt to avoid social rejection or gain status. This bias can be harmful as it may prevent individuals from expressing their true thoughts and opinions and may lead to groupthink, where the desire for consensus overrides critical thinking.”

How to Overcome Conformity Bias

To recognize and work through conformity bias, focus on reflecting on your own beliefs and values. At the same time, you can engage in critical thinking and seek diverse perspectives and opinions from others.

If you’re in a leadership position , you can also reduce conformity bias by encouraging and rewarding diverse opinions. 

Beauty Bias

Beauty bias is a subconscious or conscious tendency to treat conventionally attractive people better, or worse, than those who are considered less attractive.

Signs of beauty bias include:

  • You judge others on their appearance
  • You make assumptions about a conventionally attractive person’s capabilities
  • You treat others better or worse based on their appearance

The Halo Effect

The halo effect describes the phenomenon in which people assume that because a person has one favorable attribute, other favorable things must be true of them as a whole. For example, if you think someone is attractive, you may treat them more favorably or assume they are nicer, smarter, funnier, or more interesting than someone you deem less attractive; studies show that people tend to do this without even thinking. Conversely, you might treat a less attractive person unfavorably or make harsh judgments about them without getting to know them.

How to Overcome Beauty Bias

Francisco says, “The potential harm can lead to discrimination against those who do not present within conventional beauty standards. Are you making assumptions about a person's abilities or character based on their physical appearance, such as assuming that someone attractive is also intelligent or competent?”

She adds that in order to recognize and work through any bias, we must become aware of our own and challenge them as they occur.

One approach to challenging beauty bias is consciously focusing on a person's qualities and abilities when evaluating them.

Gender Bias

Gender bias refers to the tendency to hold stereotypical or discriminatory attitudes toward people based solely on their gender. This not only affects our ability to socialize in meaningful ways, but it can also lead to unequal opportunities and treatment.

Signs of gender bias may include:

  • Making assumptions or judgments based on gender
  • Using gender-specific language
  • Treating individuals differently based on their gender

How to Overcome Gender Bias

According to Yadush, "To recognize and work through gender bias, individuals should challenge their assumptions and stereotypes and use gender-neutral language.”

Yadush adds that it’s also important to listen to and believe individuals about their experiences around gender bias and discrimination.

Ageism

Similarly, ageism is the tendency to make judgments or assumptions about another person simply because of their age.

This tends to negatively impact people who are either young or old, as we subconsciously hold stereotypes about their capabilities or the “known characteristics” of their generation.

Signs of ageism may include:

  • Judging an individual's ability or intelligence based on age
  • Not interacting with someone because they’re a different age
  • Being rude or dismissive of others due to their age

Ageism and Its Impact on Mental Health

Yadush says that ageism has been shown to have serious effects on the mental health , physical health, and overall quality of life in the older adult population. It can hinder their ability to socialize, find employment, or make meaningful friendships.

For young people, it can also impact their ability to be taken seriously in professional settings. This is also referred to as "youngism."

How to Overcome Ageism

“To help combat ageism, seek out mentorship from individuals of all ages and be willing to learn from those with different lived experiences,” Yadush suggests. “When you do recognize ageism in the workplace or community, speak out and be an advocate as others may not have the opportunity or support to do so.”

The Contrast Effect

The contrast effect tends to sneak up on us. It is a cognitive bias in which the comparison of two things influences your perception of both.

Signs of the contrast effect include:

  • Comparing one person to another
  • Failing to focus on objective criteria when making decisions
  • Not considering the context of your evaluations


How the Contrast Effect May Play Out in Everyday Life

Here are some examples of what the contrast effect may look like in the real world:

  • If you see someone casually dressed standing next to someone who looks unkempt, the casual attire may appear more professional by comparison. This may seem trivial, but it shows how easily context skews a judgment.
  • Similarly, if a candidate is interviewed for a job immediately after a particularly impressive candidate, they may be judged more harshly than they would have been had they been interviewed on their own. This creates space for perceptions to be distorted.

“The contrast effect highlights how our perceptions are not solely based on objective measurements but can be influenced by the context in which we experience them,” explains Francisco. “This can lead to inaccurate perceptions and judgments of individuals being evaluated in comparison to another.”

How to Overcome the Contrast Effect

When making decisions, try to be as objective as possible. If you do have to make any comparisons, it can help to take breaks between comparisons and evaluations in order to clear your mind of influences, and to focus on objective criteria rather than subjective impressions.

We're all prone to cognitive distortions. Sometimes we're on the receiving end, while other times we're the ones making quick judgments. Reflecting on where these biases may exist in your daily life is the first step in understanding and overcoming them.

Merriam-Webster Dictionary. Halo effect.

Batres C, Shiramizu V. Examining the “attractiveness halo effect” across cultures. Curr Psychol. Published online August 25, 2022. doi:10.1007/s12144-022-03575-0

Francioli SP, North MS. Youngism: The content, causes, and consequences of prejudices toward younger adults. J Exp Psychol Gen. 2021;150(12):2591-2612. doi:10.1037/xge0001064

By Wendy Rose Gould, a lifestyle reporter with over a decade of experience covering health and wellness topics.

A List of Common Cognitive Biases (With Examples)

Clearer Thinking Team | Mar 30, 2023 (updated Jun 13, 2023)

Cognitive biases are patterns of thinking that distort or skew information processing, often leading to errors. These biases often occur when we make a quick decision using intuition or heuristics, which are simple rules or shortcuts that we use to make decisions and solve problems quickly without necessarily considering all available information.

While human intuition is extremely useful for many things, and should not simply be ignored, there are also plenty of known situations in which using our intuition or "going with our gut" systematically leads us to inaccurate conclusions and unhelpful behaviors.

In the early 1970s, cognitive psychologists Amos Tversky and Daniel Kahneman introduced the term 'cognitive bias' after studying the systematic errors that arise when people rely on judgmental heuristics. Since then, cognitive psychology has demonstrated that cognitive biases occur systematically and universally and are involuntary: no one is totally immune to them.


List of the most common cognitive biases

Here, we list many of the most common cognitive biases. We strongly recommend reading the second part of this article, where we answer popular questions and clarify common misunderstandings about the topic.

Ambiguity Effect

The Ambiguity Effect is a cognitive bias whereby people who are faced with a decision tend to pick an option for which they know the probability of a good outcome, rather than an option for which the probability of a good outcome is unknown or ambiguous. This may occur even if the known probability is low and picking it isn't the best strategy.

Anchoring Bias

Anchoring Bias occurs when a person's expectation about one thing is affected by something mostly or entirely irrelevant they saw, heard, or thought before, such as an irrelevant number. In other words, it occurs when a person's beliefs or behaviors are influenced by a specific piece of information far more than they should be given how much evidence that information actually provides.

Attention Bias

Attention Bias occurs when some information or evidence holds a disproportionate amount of a person's attention because of that person's environment or history, or because of people's natural instincts.


Availability Bias

The Availability Bias occurs when someone's prediction about an event's frequency or probability is unduly influenced by how easily they can recall examples of that event. We have a whole mini-course about combating availability bias .

Bias Blind Spot

A Bias Blind Spot is a tendency to see oneself as being less biased or less susceptible to biases (such as those listed in this article) than others in the population.

: "In fact, viewing yourself as rational can backfire. The more objective you think you are, the more you trust your own intuitions and opinions as accurate representations of reality, and the less inclined you are to question them. 'I'm an objective person, so my views on gun control must be correct, unlike the views of all those irrational people who disagree with me,' we think." -

Choice-Supportive Bias

Choice-Supportive Bias is a cognitive bias whereby someone who has chosen between different options later remembers the option that they chose as having more positive attributes than it did at the time (while they remember options they did not choose as having more negative attributes than they'd had at the time).

Confirmation Bias

Confirmation Bias refers to a tendency for people to seek out, favor, or give more weight to information that confirms their preconceptions or hypotheses (even if the information isn't true) than information that contradicts their prior beliefs.

Denomination Effect

The Denomination Effect is a cognitive bias whereby people tend to be more likely to spend a given amount of money if it is composed of smaller individual sums than if it is composed of larger individual sums.

Hindsight Bias

Hindsight Bias refers to a tendency to perceive past events as being more predictable than they were before they took place.

Optimism Bias

Optimism Bias is the tendency to be unduly optimistic about the probability of future good and bad events, overestimating the probability of positive ones while underestimating the probability of negative ones.

Motivated Reasoning

Motivated reasoning occurs when you are disposed to interpret new evidence in ways that support your existing beliefs, or that lead to the outcome you wish were true, even when that evidence doesn't truly support your beliefs.

Frequently Asked Questions (FAQ) about cognitive biases

What are the types of bias?

There are three main types of bias.

1. Explicit biases are prejudiced beliefs regarding a group of people or ways of living. Racism, sexism, religious intolerance, and LGBTQ-phobias are examples of explicit biases. If you think that all people of group X are inferior, then you have an explicit bias against people of group X.

2. Implicit biases are unconscious beliefs that lead people to form opinions or judgments, often without being fully aware they hold the unconscious beliefs. If you subtly distrust people of group X without even realizing you're doing it, then you have an implicit bias against people of group X.

3. Cognitive biases differ from explicit and implicit biases: they are a group of systematic patterns in how our beliefs, judgments, and actions differ from what they would be if we were completely rational. If most people systematically misjudge a certain type of information in a way that leads them to false conclusions, then people have a cognitive bias related to that type of information.

How many cognitive biases are there?

There is no consensus among academics regarding how many cognitive biases exist. Some have found ~40, others find >100, and Wikipedia lists over 180.

What are the common causes of cognitive bias?

As we’ve seen above, cognitive biases often appear when one is faced with a decision and has limited resources (such as time, understanding, and cognitive capacity).

For instance, when buying a banana, you can't consider every single possible other use of that money to determine whether a banana is truly the single best use. You are limited in both how much time you have to think and how much total cognitive capacity you have.

Using fast heuristics or relying on our intuition is often an effective way of coming to conclusions in these situations because such approaches require fewer resources than careful thinking. While our intuition is often reliable, there are certain cases where our intuitions systematically produce inaccurate beliefs and unhelpful behaviors - these are what we refer to as "cognitive biases".

Even when we have plenty of time to think and aren't hitting a limit on our cognitive resources, we can still be prone to cognitive biases. For instance, there are certain automatic rules of thumb that our minds evolved to use since they worked quite well for the survival of our ancestors. Unfortunately, these rules of thumb can sometimes lead us to false conclusions and unhelpful behaviors in the modern world.

Is cognitive bias a good or bad thing?

Cognitive biases are not good or bad in themselves. They are an unavoidable effect of not having infinite intelligence and infinite time to think, and hence the need to rely on heuristics and intuition. We call a tendency a cognitive bias when it leads to systematic inaccuracies in our beliefs or unhelpful behaviors. In that sense, by definition, cognitive biases cause systematic problems.

However, cognitive biases do not always lead to negative outcomes in every instance. For instance, overconfidence may cause a person to try something very difficult that they ultimately succeed at. On the other hand, for every person who succeeds because of overconfidence, there may be several others who attempt something unrealistic because of overconfidence and end up failing.

How do you identify cognitive biases?

Just knowing that cognitive biases exist is a great first step toward identifying them in yourself, but knowledge alone is often not sufficient. It also helps to get to know the most common cognitive biases (such as the ones presented above) so that you can look out for them in your own thinking.

Can you avoid cognitive bias?

Yes and no. It is possible to reduce the influence of cognitive biases on your thinking (and this can be very beneficial!). So you may be able to avoid a cognitive bias in many particular instances. But it's not possible to completely remove all of your cognitive biases.

How do you overcome cognitive biases?

Unfortunately, it’s impossible to overcome all of your cognitive biases completely. However, that doesn’t mean you can’t do anything. A good first step on the path to getting your cognitive biases under control is familiarizing yourself with them.

Here are a few of our interactive tools that might help:

The Planning Fallacy

The Sunk Cost Fallacy

Improve Your Frequency Predictions

Political Bias Test

Rhetorical Fallacies

Are You Overconfident?

Calibrate Your Judgement

How Rational Are You, Really?

Mental Traps

However, just knowing about your cognitive biases isn’t enough. You need to take action! Here are some practical steps we recommend:

Biases such as overconfidence, confirmation bias, and the illusion of control can be reduced or avoided by having multiple points of view. Surrounding yourself with, and listening to, people with diverse experiences, systems of belief, and expertise reduces the chances of falling into one of these biases. The same goes for sources of information: you are less likely to fall into a cognitive bias if you seek out additional data sources, including ones that conflict with each other.

Actively seeking evidence against your current point of view (on important decisions) can be a helpful way to combat biases like overconfidence, confirmation bias, and motivated reasoning.

Another strategy, recommended by researchers who studied cognitive biases in physicians, is to consciously consider the options you dismissed at first, so you can reach a more considered answer.

What is a cognitive vs. an emotional bias?

Emotional biases can be considered a subcategory of cognitive biases. What separates them from other cognitive biases is that they are based on emotions such as anger, disgust, fear, happiness, sadness, and surprise. When we're experiencing emotions, we may act in a biased way that is concordant with that emotion. For instance, anxiety may cause us to overestimate the chance of something being dangerous.

Emotional biases are linked to emotional dispositions (commonly known as ‘temperament’). Different emotional dispositions may even lead to different emotional reactions to the same occurrence of events.

Emotional biases may help us explain optimism and pessimism biases.

How do cognitive biases affect critical thinking?

Cognitive biases interfere with impartiality, and they can negatively impact critical thinking in a myriad of different ways. Here are several:

Motivated reasoning leads us to underestimate the arguments for conclusions we don’t believe in and overestimate the arguments for conclusions we want to believe;

Availability bias messes with our critical thinking because it leads us to assess risk by how readily examples come to mind, rather than considering all of the relevant examples;

We are also prone to blind spot bias, meaning that we are less likely to identify biases in our own judgment than in other people's.

How do cognitive biases affect decision-making?

Cognitive biases affect decision-making in at least two ways: they help decision-making by speeding it up and cutting necessary corners when we have limited time or cognitive power, but they also hinder decision-making by causing us to come to false conclusions or take unhelpful actions in certain cases.

Is gender a factor for cognitive biases?

Research has shown some correlation between gender or sex and specific biases. For instance, researchers found that male investors tend to show greater overconfidence and optimism biases, while female investors tend to exhibit more anchoring and hindsight biases. The research makes no claims about what causes such gendered differences - e.g., socialization or biology or a mix of both.

Are gender stereotypes cognitive bias?

Gender stereotypes are explicit biases, which means they are not cognitive biases. However, there are many cognitive biases that involve gender stereotypes. For example, masculine bias is the tendency to assume a person is a male based on stereotypes after hearing gender-neutral information about them, and the tendency to use gender as a description only when describing women.

Gender stereotypes are also a sign of binary thinking .

Do cognitive biases cause depression?

Research has shown some cognitive biases are correlated with depression . This has been found to be the case for negative interpretation bias (the tendency to interpret ambiguous scenarios as negative) and pessimistic biases, which lead people to predict future situations as unrealistically negative.

Cognitive behavioral therapy is based on the assumption that individuals with depression have distorted negative beliefs about themselves or the world (known in CBT as "cognitive distortions").

Are cognitive biases scientific (is their existence scientifically proven)?

Yes. They have been studied since the early 1970s by cognitive psychologists, sociologists, and behavioral economists.

Do scientists exhibit cognitive biases?

Just like every other human being, scientists can exhibit cognitive biases. They may exhibit overconfidence bias or fall prey to selection biases, for example. This has been researched as it relates to the replication crisis social psychology faces today.

There is even research on the presence of cognitive biases in scientific contexts and within academic publications. Nobody, not even scientists, is immune to cognitive biases!

Are cognitive biases learned? Or are we born with cognitive biases?

Both. We are born with a tendency for some cognitive biases, but we can also learn specific aspects of these biases. Our brains have evolved to be prone to all sorts of cognitive biases because those biases have been helpful in the survival of our ancestors in the environment (and under the constraints) in which they lived.

But the details of some specific cognitive biases are learned as we move through the world. For example, humans have evolved a tendency to engage in motivated reasoning, but which conclusions motivate your reasoning is not something you are born with; it is shaped by your experiences and learning.

Keep learning by trying our mini-course on Mental Traps

Want to understand cognitive biases on a deeper level? Learn about a few of the mind's mistakes with our interactive introduction to cognitive biases!


Defining Critical Thinking


Everyone thinks; it is our nature to do so. But much of our thinking, left to itself, is biased, distorted, partial, uninformed or downright prejudiced. Yet the quality of our life and that of what we produce, make, or build depends precisely on the quality of our thought. Shoddy thinking is costly, both in money and in quality of life. Excellence in thought, however, must be systematically cultivated.


Critical thinking is that mode of thinking - about any subject, content, or problem - in which the thinker improves the quality of his or her thinking by skillfully taking charge of the structures inherent in thinking and imposing intellectual standards upon them.



(Foundation for Critical Thinking Press, 2008)





What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan. Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment .

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources .

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

Table of contents

  • Why is critical thinking important?
  • Critical thinking examples
  • How to think critically
  • Frequently asked questions about critical thinking

Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process . The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing , critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.


Critical thinking can help you to identify reliable sources of information that you can cite in your research paper . It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context
You’re researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words “sponsored content” appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context
You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test, these questions focus on the currency, relevance, authority, accuracy, and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion? Do they address alternative arguments?

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?


Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy , it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test, focusing on the currency, relevance, authority, accuracy, and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test  and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search for, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. We tend to recall information best when it amplifies what we already believe. Relatedly, we tend to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to differences between study participants in their ability to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.


Critical Thinking: A Model of Intelligence for Solving Real-World Problems

Diane F. Halpern

1 Department of Psychology, Claremont McKenna College, Emerita, Altadena, CA 91001, USA

Dana S. Dunn

2 Department of Psychology, Moravian College, Bethlehem, PA 18018, USA; dunn@moravian.edu

Most theories of intelligence do not directly address the question of whether people with high intelligence can successfully solve real world problems. A high IQ is correlated with many important outcomes (e.g., academic prominence, reduced crime), but it does not protect against cognitive biases, partisan thinking, reactance, or confirmation bias, among others. There are several newer theories that directly address the question about solving real-world problems. Prominent among them is Sternberg’s adaptive intelligence with “adaptation to the environment” as the central premise, a construct that does not exist on standardized IQ tests. Similarly, some scholars argue that standardized tests of intelligence are not measures of rational thought—the sort of skill/ability that would be needed to address complex real-world problems. Other investigators advocate for critical thinking as a model of intelligence specifically designed for addressing real-world problems. Yes, intelligence (i.e., critical thinking) can be enhanced and used for solving a real-world problem such as COVID-19, which we use as an example of contemporary problems that need a new approach.

1. Introduction

The editors of this Special Issue asked authors to respond to a deceptively simple statement: “How Intelligence Can Be a Solution to Consequential World Problems.” This statement holds many complexities, including how intelligence is defined and which theories are designed to address real-world problems.

2. The Problem with Using Standardized IQ Measures for Real-World Problems

For the most part, we identify high intelligence as having a high score on a standardized test of intelligence. Like any test score, IQ can only reflect what is on the given test. Most contemporary standardized measures of intelligence include vocabulary, working memory, spatial skills, analogies, processing speed, and puzzle-like elements (e.g., Wechsler Adult Intelligence Scale Fourth Edition; see ( Drozdick et al. 2012 )). Measures of IQ correlate with many important outcomes, including academic performance ( Kretzschmar et al. 2016 ), job-related skills ( Hunter and Schmidt 1996 ), reduced likelihood of criminal behavior ( Burhan et al. 2014 ), and for those with exceptionally high IQs, obtaining a doctorate and publishing scholarly articles ( McCabe et al. 2020 ). Gottfredson ( 1997, p. 81 ) summarized these effects when she said the “predictive validity of g is ubiquitous.” More recent research using longitudinal data found that general mental abilities and specific abilities are good predictors of several work variables, including job prestige and income ( Lang and Kell 2020 ). Although assessments of IQ are useful in many contexts, having a high IQ does not protect against falling for common cognitive fallacies (e.g., blind spot bias, reactance, anecdotal reasoning), relying on biased and blatantly one-sided information sources, failing to consider information that does not conform to one’s preferred view of reality (confirmation bias), resisting pressure to think and act in a certain way, among others. This point was clearly articulated by Stanovich ( 2009, p. 3 ) when he stated that “IQ tests measure only a small set of the thinking abilities that people need.”

3. Which Theories of Intelligence Are Relevant to the Question?

Most theories of intelligence do not directly address the question of whether people with high intelligence can successfully solve real world problems. For example, Grossmann et al. ( 2013 ) cite many studies in which IQ scores have not predicted well-being, including life satisfaction and longevity. Using a stratified random sample of Americans, these investigators found that wise reasoning is associated with life satisfaction, and that “there was no association between intelligence and well-being” (p. 944). (critical thinking [CT] is often referred to as “wise reasoning” or “rational thinking,”). Similar results were reported by Wirthwein and Rost ( 2011 ) who compared life satisfaction in several domains for gifted adults and adults of average intelligence. There were no differences in any of the measures of subjective well-being, except for leisure, which was significantly lower for the gifted adults. Additional research in a series of experiments by Stanovich and West ( 2008 ) found that participants with high cognitive ability were as likely as others to endorse positions that are consistent with their biases, and they were equally likely to prefer one-sided arguments over those that provided a balanced argument. There are several newer theories that directly address the question about solving real-world problems. Prominent among them is Sternberg’s adaptive intelligence with “adaptation to the environment” as the central premise, a construct that does not exist on standardized IQ tests (e.g., Sternberg 2019 ). Similarly, Stanovich and West ( 2014 ) argue that standardized tests of intelligence are not measures of rational thought—the sort of skill/ability that would be needed to address complex real-world problems. Halpern and Butler ( 2020 ) advocate for CT as a useful model of intelligence for addressing real-world problems because it was designed for this purpose. Although there is much overlap among these more recent theories, often using different terms for similar concepts, we use Halpern and Butler’s conceptualization to make our point: Yes, intelligence (i.e., CT) can be enhanced and used for solving a real-world problem like COVID-19.

4. Critical Thinking as an Applied Model for Intelligence

One definition of intelligence that directly addresses the question about intelligence and real-world problem solving comes from Nickerson ( 2020, p. 205 ): “the ability to learn, to reason well, to solve novel problems, and to deal effectively with novel problems—often unpredictable—that confront one in daily life.” Using this definition, the question of whether intelligent thinking can solve a world problem like the novel coronavirus is a resounding “yes” because solutions to real-world novel problems are part of his definition. This is a popular idea in the general public. For example, over 1000 business managers and hiring executives said that they want employees who can think critically based on the belief that CT skills will help them solve work-related problems ( Hart Research Associates 2018 ).

We define CT as the use of those cognitive skills or strategies that increase the probability of a desirable outcome. It is used to describe thinking that is purposeful, reasoned, and goal directed--the kind of thinking involved in solving problems, formulating inferences, calculating likelihoods, and making decisions, when the thinker is using skills that are thoughtful and effective for the particular context and type of thinking task. International surveys conducted by the OECD ( 2019, p. 16 ) established “key information-processing competencies” that are “highly transferable, in that they are relevant to many social contexts and work situations; and ‘learnable’ and therefore subject to the influence of policy.” One of these skills is problem solving, which is one subset of CT skills.

The CT model of intelligence is comprised of two components: (1) understanding information at a deep, meaningful level and (2) appropriate use of CT skills. The underlying idea is that CT skills can be identified, taught, and learned, and when they are recognized and applied in novel settings, the individual is demonstrating intelligent thought. CT skills include judging the credibility of an information source, making cost–benefit calculations, recognizing regression to the mean, understanding the limits of extrapolation, muting reactance responses, using analogical reasoning, rating the strength of reasons that support and fail to support a conclusion, and recognizing hindsight bias or confirmation bias, among others. Critical thinkers use these skills appropriately, without prompting, and usually with conscious intent in a variety of settings.

One of the key concepts in this model is that CT skills transfer in appropriate situations. Thus, assessments using situational judgments are needed to assess whether particular skills have transferred to a novel situation where it is appropriate. In an assessment created by the first author ( Halpern 2018 ), short paragraphs provide information about 20 different everyday scenarios (e.g., A speaker at the meeting of your local school board reported that when drug use rises, grades decline; so schools need to enforce a “war on drugs” to improve student grades); participants provide two response formats for every scenario: (a) constructed responses where they respond with short written responses, followed by (b) forced choice responses (e.g., multiple choice, rating or ranking of alternatives) for the same situations.

There is a large and growing empirical literature to support the assertion that CT skills can be learned and will transfer (when taught for transfer). See, for example, Holmes et al. ( 2015 ), who wrote in the prestigious Proceedings of the National Academy of Sciences that there was “significant and sustained improvement in students’ critical thinking behavior” (p. 11,199) for students who received CT instruction. Abrami et al. ( 2015, para. 1 ) concluded from a meta-analysis that “there are effective strategies for teaching CT skills, both generic and content specific, and CT dispositions, at all educational levels and across all disciplinary areas.” Abrami et al. ( 2008, para. 1 ) included 341 effect sizes in a meta-analysis. They wrote: “findings make it clear that improvement in students’ CT skills and dispositions cannot be a matter of implicit expectation.” A strong test of whether CT skills can be used for real-world problems comes from research by Butler et al. ( 2017 ). Community adults and college students (N = 244) completed several scales including an assessment of CT, an intelligence test, and an inventory of real-life events. Both CT scores and intelligence scores predicted individual outcomes on the inventory of real-life events, but CT was a stronger predictor.

Heijltjes et al. ( 2015, p. 487 ) randomly assigned participants to either a CT instruction group or one of six other control conditions. They found that “only participants assigned to CT instruction improved their reasoning skills.” Similarly, when Halpern et al. ( 2012 ) used random assignment of participants to either a learning group where they were taught scientific reasoning skills using a game format or a control condition (which also used computerized learning and was similar in length), participants in the scientific skills learning group showed higher proportional learning gains than students who did not play the game. As the body of additional supportive research is too large to report here, interested readers can find additional lists of CT skills and support for the assertion that these skills can be learned and will transfer in Halpern and Dunn ( Forthcoming ). There is a clear need for more high-quality research on the application and transfer of CT and its relationship to IQ.

5. Pandemics: COVID-19 as a Consequential Real-World Problem

A pandemic occurs when a disease runs rampant over an entire country or even the world. Pandemics have occurred throughout history. At the time of writing this article, COVID-19 is a world-wide pandemic whose actual death toll is unknown, with projections of several million deaths over the course of 2021 and beyond ( Mega 2020 ). Although vaccines are available, it will take some time to inoculate most or much of the world’s population. Since March 2020, national and international health agencies have created a list of actions that can slow and hopefully stop the spread of COVID (e.g., wearing face masks, practicing social distancing, avoiding group gatherings), yet many people in the United States and other countries have resisted their advice.

Could instruction in CT encourage more people to accept and comply with simple life-saving measures? There are many possible reasons to believe that by increasing citizens’ CT abilities, this problematic trend can be reversed for, at least, some unknown percentage of the population. We recognize the long history of social and cognitive research showing that changing attitudes and behaviors is difficult, and it would be unrealistic to expect that individuals with extreme beliefs supported by their social group and consistent with their political ideologies are likely to change. For example, an Iranian cleric and an orthodox rabbi both claimed (separately) that the COVID-19 vaccine can make people gay ( Marr 2021 ). These unfounded opinions are based on deeply held prejudicial beliefs that we expect to be resistant to CT. We are targeting those individuals whose beliefs are less extreme and may be based on reasonable reservations, such as concern about the hasty development of the vaccine and the lack of long-term data on its effects. There should be some unknown proportion of individuals who can change their COVID-19-related beliefs and actions with appropriate instruction in CT. CT can be a (partial) antidote for the chaos of the modern world with armies of bots creating content on social media, political and other forces deliberately attempting to confuse issues, and almost all media labeled “fake news” by social influencers (i.e., people with followers that sometimes run to millions on various social media). Here are some CT skills that could be helpful in getting more people to think more critically about pandemic-related issues.

Reasoning by Analogy and Judging the Credibility of the Source of Information

Early communications from national health agencies about the ability of masks to prevent the spread of COVID were not consistent. In many regions of the world, the benefits of wearing masks incited prolonged and acrimonious debates ( Tang 2020 ). However, after the initial confusion, virtually all of the global and national health organizations (e.g., WHO, National Health Service in the U. K., U. S. Centers for Disease Control and Prevention) endorse masks as a way to slow the spread of COVID ( Cheng et al. 2020 ; Chu et al. 2020 ). Yet, as we know, some people do not trust governmental agencies and often cite the conflicting information that was originally given as a reason for not wearing a mask. There are varied reasons for refusing to wear a mask, but the one most often cited is that it is against civil liberties ( Smith 2020 ). Reasoning by analogy is an appropriate CT skill for evaluating this belief (and a key skill in legal thinking). It might be useful to cite some of the many laws that already regulate our behavior, such as requiring health inspections for restaurants, setting speed limits, mandating seat belts when riding in a car, and establishing the age at which someone can consume alcohol. Individuals would be asked to consider how the mandate to wear a mask compares to these and other regulatory laws.

Another reason why some people resist the measures suggested by virtually every health agency concerns questions about whom to believe. Could training in CT change the beliefs and actions of even a small percentage of those opposed to wearing masks? Such training would include considering the following questions with practice across a wide domain of knowledge: (a) Does the source have sufficient expertise? (b) Is the expertise recent and relevant? (c) Is there a potential for gain by the information source, such as financial gain? (d) What would the ideal information source be and how close is the current source to the ideal? (e) Does the information source offer evidence that what they are recommending is likely to be correct? (f) Have you traced URLs to determine if the information in front of you really came from the alleged source?, etc. Of course, not everyone will respond in the same way to each question, so there is little likelihood that we would all think alike, but these questions provide a framework for evaluating credibility. Donovan et al. ( 2015 ) were successful using a similar approach to improve dynamic decision-making by asking participants to reflect on questions that relate to the decision. Imagine the effect of rigorous large-scale education in CT from elementary through secondary schools, as well as at the university-level. As stated above, empirical evidence has shown that people can become better thinkers with appropriate instruction in CT. With training, could we encourage some portion of the population to become more astute at judging the credibility of a source of information? It is an experiment worth trying.

6. Making Cost–Benefit Assessments for Actions That Would Slow the Spread of COVID-19

Historical records show that refusal to wear a mask during a pandemic is not a new reaction. The epidemic of 1918 also included mandates to wear masks, which drew public backlash. Then, as now, many people refused, even when they were told that it was a symbol of “wartime patriotism” because the 1918 pandemic occurred during World War I ( Lovelace 2020 ). CT instruction would include instruction in why and how to compute cost–benefit analyses. Estimates of “lives saved” by wearing a mask can be made meaningful with graphical displays that allow more people to understand large numbers. Gigerenzer ( 2020 ) found that people can understand risk ratios in medicine when the numbers are presented as frequencies instead of probabilities. If this information were used when presenting the likelihood of illness and death from COVID-19, could we increase the numbers of people who understand the severity of this disease? Small scale studies by Gigerenzer have shown that it is possible.
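To make this concrete, here is a minimal sketch in Python. The prevalence and test-accuracy figures are made-up placeholders (not data from Gigerenzer's studies); the point is only to show how the same risk information can be restated as natural frequencies, the count-based format that Gigerenzer found people understand more readily than conditional probabilities.

```python
# Illustrative only: these figures are hypothetical placeholders,
# not data from any study cited in this article.
prevalence = 0.01            # P(condition)
sensitivity = 0.90           # P(positive test | condition)
false_positive_rate = 0.09   # P(positive test | no condition)

# Probability format (how risk is typically reported):
print(f"P(condition) = {prevalence:.0%}, "
      f"P(+ | condition) = {sensitivity:.0%}, "
      f"P(+ | no condition) = {false_positive_rate:.0%}")

# Natural-frequency format: restate the same numbers as counts
# in a concrete reference population of 1000 people.
population = 1000
with_condition = round(population * prevalence)                                # 10 people
true_positives = round(with_condition * sensitivity)                           # 9 people
false_positives = round((population - with_condition) * false_positive_rate)   # 89 people

print(f"Out of {population} people, {with_condition} have the condition, "
      f"and {true_positives} of them test positive. Of the other "
      f"{population - with_condition} people, {false_positives} also test positive.")

# The counts make the key question easy to answer without any formula:
# of those who test positive, how many actually have the condition?
share = true_positives / (true_positives + false_positives)
print(f"{true_positives} of {true_positives + false_positives} positive tests "
      f"are true positives (about {share:.0%}).")
```

Presented as counts, the answer to "how many positive tests reflect a real case?" can be read off directly, which is the kind of transparency Gigerenzer argues for when communicating risks to the public.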

Analyzing Arguments to Determine Degree of Support for a Conclusion

The process of analyzing arguments requires that individuals rate the strength of support for and against a conclusion. By engaging in this practice, they must consider evidence and reasoning that may run counter to a preferred outcome. Kozyreva et al. ( 2020 ) call the deliberate failure to consider both supporting and conflicting data “deliberate ignorance”—avoiding or failing to consider information that could be useful in decision-making because it may collide with an existing belief. When applied to COVID-19, people would have to weigh the evidence for and against wearing a face mask as a reasonable way to stop the spread of this disease, and, if they conclude that it is not, ask what the costs and benefits are of not wearing masks at a time when governmental health organizations are making them mandatory in public spaces. Again, we wonder if rigorous and systematic instruction in argument analysis would result in more positive attitudes and behaviors that relate to wearing a mask or other real-world problems. We believe that it is an experiment worth doing.

7. Conclusions

We believe that teaching CT is a worthwhile approach for educating the general public in order to improve reasoning and motivate actions to address, avert, or ameliorate real-world problems like the COVID-19 pandemic. Evidence suggests that CT can guide intelligent responses to societal and global problems. We are NOT claiming that CT skills will be a universal solution for the many real-world problems that we confront in contemporary society, or that everyone will substitute CT for other decision-making practices, but we do believe that systematic education in CT can help many people become better thinkers, and we believe that this is an important step toward creating a society that values and practices routine CT. The challenges are great, but the tools to tackle them are available, if we are willing to use them.

Author Contributions

Conceptualization, D.F.H. and D.S.D.; resources, D.F.H.; data curation, writing—original draft preparation, D.F.H.; writing—review and editing, D.F.H. and D.S.D. All authors have read and agreed to the published version of the manuscript.

This research received no external funding.

Institutional Review Board Statement

No IRB Review.

Informed Consent Statement

No Informed Consent.

Conflicts of Interest

The authors declare no conflict of interest.

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

  • Abrami Philip C., Bernard Robert M., Borokhovski Evgueni, Wade C. Anne, Surkes Michael A., Tamim Rana, Zhang Dai. Instructional interventions affecting critical thinking skills and dispositions: A Stage 1 meta-analysis. Review of Educational Research. 2008; 78 :1102–34. doi: 10.3102/0034654308326084. [ CrossRef ] [ Google Scholar ]
  • Abrami Philip C., Bernard Robert M., Borokhovski Evgueni, Waddington David I., Wade C. Anne. Strategies for teaching students to think critically: A meta-analysis. Review of Educational Research. 2015; 85 :275–341. doi: 10.3102/0034654314551063. [ CrossRef ] [ Google Scholar ]
  • Burhan Nik Ahmad Sufian, Kurniawan Yohan, Sidek Abdul Halim, Mohamad Mohd Rosli. Crimes and the Bell curve: The role of people with high, average, and low intelligence. Intelligence. 2014; 47 :12–22. doi: 10.1016/j.intell.2014.08.005. [ CrossRef ] [ Google Scholar ]
  • Butler Heather A., Pentoney Christopher, Bong Maebelle P. Predicting real-world outcomes: Critical thinking ability is a better predictor of life decisions than intelligence. Thinking Skills and Creativity. 2017; 25 :38–46. doi: 10.1016/j.tsc.2017.06.005. [ CrossRef ] [ Google Scholar ]
  • Cheng Vincent Chi-Chung, Wong Shuk-Ching, Chuang Vivien Wai-Man, So Simon Yung-Chun, Chen Jonathan Hon-Kwan, Sridhar Sidharth, To Kelvin Kai-Wwang, Chan Jasper Fuk-Wu, Hung Ivan Fan-Ngai, Ho Pak-Leung, et al. The role of community-wide wearing of face mask for control of coronavirus disease 2019 (COVID-19) epidemic due to SARS-CoV-2. Journal of Infectious Disease. 2020; 81 :107–14. doi: 10.1016/j.jinf.2020.04.024. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Chu Derek K., Aki Elie A., Duda Stephanie, Solo Karla, Yaacoub Sally, Schunemann Holger J. Physical distancing, face masks, and eye protection to prevent person-to-person transmission of SARS-CoV-2 and COVID-19: A systematic review and meta-analysis. Lancet. 2020; 395 :1973–87. doi: 10.1016/S0140-6736(20)31142-9. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Donovan Sarah J., Guss C. Dominick, Naslund Dag. Improving dynamic decision-making through training and self-reflection. Judgment and Decision Making. 2015; 10 :284–95. [ Google Scholar ]
  • Drozdick Lisa Whipple, Wahlstrom Dustin, Zhu Jianjun, Weiss Lawrence G. The Wechsler Adult Intelligence Scale—Fourth Edition and the Wechsler Memory Scale—Fourth Edition. In: Flanagan Dawn P., Harrison Patti L., editors. Contemporary Intellectual Assessment: Theories, Tests, and Issues. The Guilford Press; New York: 2012. pp. 197–223. [ Google Scholar ]
  • Gigerenzer Gerd. When all is just a click away: Is critical thinking obsolete in the digital age? In: Sternberg Robert J., Halpern Diane F., editors. Critical Thinking in Psychology. 2nd ed. Cambridge University Press; Cambridge: 2020. pp. 197–223. [ Google Scholar ]
  • Gottfredson Linda S. Why g matters: The complexity of everyday life. Intelligence. 1997; 24 :79–132. doi: 10.1016/S0160-2896(97)90014-3. [ CrossRef ] [ Google Scholar ]
  • Grossmann Igor, Varnum Michael E. W., Na Jinkyung, Kitayama Shinobu, Nisbett Richard E. A route to well-being: Intelligence versus wise reasoning. Journal of Experimental Psychology: General. 2013; 142 :944–53. doi: 10.1037/a0029560. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Halpern Diane F. Halpern Critical Thinking Assessment. Schuhfried Test Publishers; Modling: 2018. [(accessed on 30 March 2021)]. Available online: www.schuhfried.com [ Google Scholar ]
  • Halpern Diane F., Butler Heather A. Is critical thinking a better model of intelligence? In: Sternberg Robert J., editor. The nature of Intelligence. 2nd ed. Cambridge University Press; Cambridge: 2020. pp. 183–96. [ Google Scholar ]
  • Halpern Diane F., Dunn Dana S. Thought and Knowledge: An Introduction to Critical Thinking. 6th ed. Taylor & Francis; New York: Forthcoming. in press. [ Google Scholar ]
  • Halpern Diane F., Millis Keith, Graesser Arthur, Butler Heather, Forsyth Carol, Cai Zhiqiang. Operation ARA: A computerized learning game that teaches critical thinking and scientific reasoning. Thinking Skills and Creativity. 2012; 7 :93–100. doi: 10.1016/j.tsc.2012.03.006. [ CrossRef ] [ Google Scholar ]
  • Hart Research Associates [(accessed on 30 March 2021)]; Employers Express Confidence in Colleges and Universities: See College as Worth the Investment, New Research Finds. 2018 Aug 29; Available online: https://hartresearch.com/employers-express-confidence-in-colleges-and-universities-see-college-as-worth-the-investment-new-research-finds/
  • Heijltjes Anita, Gog Tamara van, Lippink Jimmie, Paas Fred. Unraveling the effects of critical thinking instructions, practice, and self-explanation on students’ reasoning performance. Instructional Science. 2015; 43 :487–506. doi: 10.1007/s11251-015-9347-8. [ CrossRef ] [ Google Scholar ]
  • Holmes Natasha G., Wieman Carl E., Bonn Doug A. Teaching critical thinking. Proceedings of the National Academy of Sciences. 2015; 112 :11199–204. doi: 10.1073/pnas.1505329112. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Hunter John E., Schmidt Frank L. Intelligence and job performance: Economic and social implications. Psychology, Public Policy, and Law. 1996; 2 :447–72. doi: 10.1037/1076-8971.2.3-4.447. [ CrossRef ] [ Google Scholar ]
  • Kozyreva Anastasia, Lewandowsky Stephan, Hertwig Ralph. Citizens versus the internet: Confronting digital challenges with cognitive tools. [(accessed on 30 March 2021)]; Psychological Science in the Public Interest. 2020 21 doi: 10.1177/1529100620946707. Available online: https://www.psychologicalscience.org/publications/confronting-digital-challenges-with-cognitive-tools.html [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Kretzschmar Andre, Neubert Jonas C., Wusternberg Sascha, Greiff Samuel. Construct validity of complex problem-solving: A comprehensive view on different facets of intelligence and school grades. Intelligence. 2016; 54 :55–69. doi: 10.1016/j.intell.2015.11.004. [ CrossRef ] [ Google Scholar ]
  • Lang Jonas W.B., Kell Harrison J. General mental ability and specific abilities: Their relative importance for extrinsic career success. Journal of Applied Psychology. 2020; 105 :1047–61. doi: 10.1037/apl0000472. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Lovelace Berkeley., Jr. Medical Historians Compare the Coronavirus to the 1918 Flu Pandemic: Both Were Highly Political. [(accessed on 30 March 2021)]; CNBC. 2020 Available online: https://www.cnbc.com/2020/09/28/comparing-1918-flu-vs-coronavirus.html?fbclid=IwAR1RAVRUOIdN9qqvNnMPimf5Q4XfV-pn_qdC3DwcfnPu9kavwumDI2zq9Xs
  • Marr Rhuaridh. Iranian Cleric Claims COVID-19 Vaccine Can Make People Gay. [(accessed on 30 March 2021)]; Metro Weekly. 2021 Available online: https://www.metroweekly.com/2021/02/iranian-cleric-claims-covid-19-vaccine-can-make-people-gay/
  • McCabe Kira O., Lubinski David, Benbow Camilla P. Who shines most among the brightest?: A 25-year longitudinal study of elite STEM graduate students. Journal of Personality and Social Psychology. 2020; 119 :390–416. doi: 10.1037/pspp0000239. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Mega Emiliano R. COVID Has Killed more than One Million People. How Many more will Die? [(accessed on 30 March 2021)]; Nature. 2020 Available online: https://www.nature.com/articles/d41586-020-02762-y [ PubMed ]
  • Nickerson Raymond S. Developing intelligence through instruction. In: Sternberg Robert J., editor. The Cambridge Handbook of Intelligence. 2nd ed. Cambridge University Press; Cambridge: 2020. pp. 205–37. [ Google Scholar ]
  • OECD . The Survey of Adult Skills: Reader’s Companion. 3rd ed. OECD Publishing; Paris: 2019. OECD Skills Studies. [ CrossRef ] [ Google Scholar ]
  • Smith Matthew. Why won’t Britons Wear Face Masks? [(accessed on 30 March 2021)]; YouGov. 2020 Available online: https://yougov.co.uk/topics/health/articles-reports/2020/07/15/why-wont-britons-wear-face-masks
  • Stanovich Keith E. What Intelligence Tests Miss: The Psychology of Rational Thought. Yale University Press; New Haven: 2009. [ Google Scholar ]
  • Stanovich Keith E., West Richard F. On the failure of cognitive ability to predict my-side bias and one-sided thinking biases. Thinking & Reasoning. 2008; 14 :129–67. doi: 10.1080/13546780701679764. [ CrossRef ] [ Google Scholar ]
  • Stanovich Keith E., West Richard F. What intelligence tests miss. The Psychologist. 2014; 27 :80–83. doi: 10.5840/inquiryctnews201126216. [ CrossRef ] [ Google Scholar ]
  • Sternberg Robert J. A theory of adaptive intelligence and its relation to general intelligence. Journal of Intelligence. 2019; 7 :23. doi: 10.3390/jintelligence7040023. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Tang Julian W. COVID-19: Interpreting scientific evidence—Uncertainty, confusion, and delays. BMC Infectious Diseases. 2020; 20 :653. doi: 10.1186/s12879-020-05387-8. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Wirthwein Linda, Rost Detlef H. Giftedness and subjective well-being: A study with adults. Learning and Individual Differences. 2011; 21 :182–86. doi: 10.1016/j.lindif.2011.01.001. [ CrossRef ] [ Google Scholar ]

Implicit Bias (Unconscious Bias): Definition & Examples

Charlotte Ruhl

Research Assistant & Psychology Graduate

BA (Hons) Psychology, Harvard University

Charlotte Ruhl, a psychology graduate from Harvard College, boasts over six years of research experience in clinical and social psychology. During her tenure at Harvard, she contributed to the Decision Science Lab, administering numerous studies in behavioral economics and social psychology.


Saul McLeod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul McLeod, PhD., is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.


Implicit bias refers to the beliefs and attitudes that affect our understanding, actions and decisions in an unconscious way.

Take-home Messages

  • Implicit biases are unconscious attitudes and stereotypes that can manifest in the criminal justice system, workplace, school setting, and in the healthcare system.
  • Implicit bias is also known as unconscious bias or implicit social cognition.
  • There are many different examples of implicit biases, spanning categories such as race, gender, and sexuality.
  • These biases often arise from trying to find patterns and navigate the overwhelming stimuli in this complicated world. Culture, media, and upbringing can also contribute to the development of such biases.
  • Removing these biases is a challenge, especially because we often don’t even know they exist, but research reveals potential interventions and provides hope that levels of implicit biases in the United States are decreasing.


The term implicit bias was first coined in 1995 by psychologists Mahzarin Banaji and Anthony Greenwald, who argued that social behavior is largely influenced by unconscious associations and judgments (Greenwald & Banaji, 1995).

So, what is implicit bias?

Specifically, implicit bias refers to attitudes or stereotypes that affect our understanding, actions, and decisions in an unconscious way, making them difficult to control.

Since the mid-90s, psychologists have extensively researched implicit biases, revealing that, without even knowing it, we all possess our own implicit biases.

System 1 and System 2 Thinking

Kahneman (2011) distinguishes between two types of thinking: System 1 and System 2.
  • System 1 is the brain’s fast, emotional, unconscious thinking mode. This type of thinking requires little effort, but it is often error-prone. Most everyday activities (like driving, talking, cleaning, etc.) rely heavily on System 1.
  • System 2 is slow, logical, effortful, conscious thought, where reason dominates.


Implicit Bias vs. Explicit Bias

Implicit Bias | Explicit Bias
Unconscious attitudes or stereotypes that affect our understanding, actions, and decisions. | Conscious beliefs and attitudes about a person or group.
Can influence decisions and behavior subconsciously. | Usually apparent in a person’s language and behavior.
Example: a hiring manager unknowingly favors candidates who went to the same university as them. | Example: a person making a conscious decision not to hire someone based on their ethnicity.
Can lead to unintentional discrimination in many areas, such as hiring, law enforcement, and healthcare. | Can lead to deliberate discrimination, such as refusing to hire someone based on ethnicity.
Measured using implicit association tests and other indirect methods. | Can be assessed directly through surveys, interviews, etc.
Very common, as everyone holds unconscious biases to some degree. | Less common, as societal norms have shifted to view explicit bias as unacceptable.
Reduced by improving self-awareness, undergoing bias training, and diversifying your experiences and interactions. | Reduced through education, awareness, and the promotion of inclusivity and diversity.
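The table above notes that implicit bias is usually measured indirectly, for example with implicit association tests (IATs). As a rough illustration of the idea behind IAT scoring, and not the full algorithm used in published research, the sketch below computes a simplified D-style score: the difference in mean response time between "incompatible" and "compatible" pairing blocks, divided by the pooled standard deviation. The reaction times are hypothetical, and real scoring procedures add steps such as filtering extreme latencies and penalizing errors.

```python
import statistics

def simplified_iat_d(compatible_ms, incompatible_ms):
    """Simplified D-style IAT score.

    Slower average responses in the incompatible block, relative to the
    pooled variability of all responses, are read as evidence of a stronger
    implicit association. This is a sketch only; published scoring
    algorithms also filter very fast/slow trials and apply error penalties.
    """
    pooled = list(compatible_ms) + list(incompatible_ms)
    pooled_sd = statistics.stdev(pooled)
    mean_difference = statistics.mean(incompatible_ms) - statistics.mean(compatible_ms)
    return mean_difference / pooled_sd

# Hypothetical reaction times in milliseconds (not real data).
compatible_block = [650, 700, 720, 680, 690, 710]
incompatible_block = [820, 860, 900, 840, 870, 880]

print(f"Simplified D score: {simplified_iat_d(compatible_block, incompatible_block):.2f}")
```

A score near zero would indicate little difference between the two pairings, while larger positive scores indicate slower responses in the incompatible block, which IAT research interprets as a stronger implicit association.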

What is meant by implicit bias?

Implicit bias (unconscious bias) refers to attitudes and beliefs outside our conscious awareness and control. Implicit biases are an example of System 1 thinking, so we are unaware they exist (Greenwald & Krieger, 2006).

An implicit bias may counter a person’s conscious beliefs without realizing it. For example, it is possible to express explicit liking of a certain social group or approval of a certain action while simultaneously being biased against that group or action on an unconscious level.

Therefore, implicit and explicit biases might differ for the same person.

It is important to understand that implicit biases can become explicit biases. This occurs when you become consciously aware of your prejudices and beliefs. They surface in your mind, leading you to choose whether to act on or against them.

What is meant by explicit bias?

Explicit biases are biases we are aware of on a conscious level (for example, feeling threatened by another group and delivering hate speech as a result). They are an example of System 2 thinking.

It is also possible that your implicit and explicit biases differ from those of your neighbor, friend, or family member. Many factors shape how such biases develop.

What Are the Implications of Unconscious Bias?

Implicit biases become evident in many different domains of society. On an interpersonal level, they can manifest in simple daily interactions.

This occurs when certain actions (or microaggressions) make others feel uncomfortable or aware of the specific prejudices you may hold against them.

Implicit Prejudice

Implicit prejudice is the automatic, unconscious attitudes or stereotypes that influence our understanding, actions, and decisions. Unlike explicit prejudice, which is consciously controlled, implicit prejudice can occur even in individuals who consciously reject prejudice and strive for impartiality.

Unconscious racial stereotypes are a major example of implicit prejudice. In other words, a person may hold an automatic preference for one race over another without being aware of this bias.

This bias can manifest in small interpersonal interactions and has broader implications in society’s legal system and many other important sectors.

Examples may include holding an implicit stereotype that associates Black individuals with violence. As a result, you may cross the street at night when you see a Black man walking in your direction without even realizing why you are crossing the street.

The action taken here is an example of a microaggression. A microaggression is a subtle, automatic, and often nonverbal exchange that communicates hostile, derogatory, or negative prejudicial slights and insults toward a group (Pierce, 1970). Crossing the street communicates an implicit prejudice, even if you are not aware of it.

Another example of an implicit racial bias is a teacher complimenting a Latino student for speaking perfect English when he is, in fact, a native English speaker. Here, the teacher assumed that English would not be his first language simply because he is Latino.

Gender Stereotypes

Gender biases are another common form of implicit bias. Gender biases are the ways in which we judge men and women based on traditionally assigned feminine and masculine traits.

For example, a greater assignment of fame to male than female names (Banaji & Greenwald, 1995) reveals a subconscious bias that holds men at a higher level than their female counterparts. Whether you voice the opinion that men are more famous than women is independent of this implicit gender bias.

Another common implicit gender bias regards women in STEM (science, technology, engineering, and mathematics).

In school, girls are more likely to be associated with language over math. In contrast, males are more likely to be associated with math over language (Steffens & Jelenec, 2011), revealing clear gender-related implicit biases that can ultimately go so far as to dictate future career paths.

Even if you outwardly say men and women are equally good at math, it is possible you subconsciously associate math more strongly with men without even being aware of this association.

Health Care

Healthcare is another setting where implicit biases are very present. Racial and ethnic minorities and women are subject to less accurate diagnoses, curtailed treatment options, less pain management, and worse clinical outcomes (Chapman, Kaatz, & Carnes, 2013).

Additionally, Black children are often not treated as children or given the same compassion or level of care provided for White children (Johnson et al., 2017).

It becomes evident that implicit biases infiltrate the most common sectors of society, making it all the more important to question how we can remove these biases.

LGBTQ+ Community Bias

Similar to implicit racial and gender biases, individuals may hold implicit biases against members of the LGBTQ+ community. Again, that does not necessarily mean that these opinions are voiced outwardly or even consciously recognized by the beholder, for that matter.

Rather, these biases are unconscious. A simple example could be asking a female friend if she has a boyfriend, thereby assuming her sexuality and treating heterosexuality as the norm or default.

In this situation, you could instead ask your friend if she is seeing someone. Several other forms of implicit bias, spanning categories from weight to ethnicity to ability, also come into play in our everyday lives.

Legal System

Both law enforcement and the legal system shed light on implicit biases. An example of implicit bias functioning in law enforcement is the shooter bias – the tendency among the police to shoot Black civilians more often than White civilians, even when they are unarmed (Mekawi & Bresin, 2015).

This bias has been repeatedly tested in laboratory settings, revealing an implicit bias against Black individuals. Black people are also disproportionately arrested and given harsher sentences, and Black juveniles are tried as adults more often than their White peers.

Black boys are also seen as less childlike, less innocent, more culpable, more responsible for their actions, and as more appropriate targets for police violence (Goff et al., 2014).

Together, these unconscious stereotypes, which are not rooted in truth, form an array of implicit biases that are extremely dangerous and utterly unjust.

Implicit biases are also visible in the workplace. One experiment that tracked the success of White and Black job applicants found that resumes with stereotypically White names received 50% more callbacks than those with stereotypically Black names, regardless of the industry or occupation (Bertrand & Mullainathan, 2004).

This reveals another form of implicit bias: the hiring bias – Anglicized‐named applicants receiving more favorable pre‐interview impressions than other ethnic‐named applicants (Watson, Appiah, & Thornton, 2011).

We’re susceptible to bias because of these tendencies:

We tend to seek out patterns

A key reason we develop such biases is that our brains have a natural tendency to look for patterns and associations to make sense of a very complicated world.

Research shows that even before kindergarten, children already use their group membership (e.g., racial group, gender group, age group, etc.) to guide inferences about psychological and behavioral traits.

At such a young age, they have already begun seeking patterns and recognizing what distinguishes them from other groups (Baron, Dunham, Banaji, & Carey, 2014).

And not only do children recognize what sets them apart from other groups, they believe “what is similar to me is good, and what is different from me is bad” (Cameron, Alvarez, Ruble, & Fuligni, 2001).

Children aren’t just noticing how similar or dissimilar they are to others; dissimilar people are actively disliked (Aboud, 1988).

Recognizing what sets you apart from others and then forming negative opinions about those outgroups (a social group with which an individual does not identify) contributes to the development of implicit biases.

We like to take shortcuts

Another explanation is that the development of these biases is a result of the brain’s tendency to try to simplify the world.

Mental shortcuts make it faster and easier for the brain to sort through all of the overwhelming data and stimuli we are met with every second of the day. And we take mental shortcuts all the time. Rules of thumb, educated guesses, and using “common sense” are all forms of mental shortcuts.

Implicit bias is a result of taking one of these cognitive shortcuts inaccurately (Rynders, 2019). As a result, we incorrectly rely on these unconscious stereotypes to provide guidance in a very complex world.

And especially when we are under high levels of stress, we are more likely to rely on these biases than to examine all of the relevant, surrounding information (Wigboldus, Sherman, Franzese, & Knippenberg, 2004).

Social and cultural influences

Influences from media, culture, and your individual upbringing can also contribute to the rise of implicit associations that people form about the members of social outgroups. Media has become increasingly accessible, and while that has many benefits, it can also lead to implicit biases.

The way TV portrays individuals or the language journal articles use can ingrain specific biases in our minds.

For example, they can lead us to associate Black people with criminality, or women with roles such as nursing or teaching. The way you are raised can also play a huge role. One research study found that parental racial attitudes can influence children's implicit prejudice (Sinclair, Dunn, & Lowery, 2005).

And parents are not the only figures who can influence such attitudes. Siblings, the school setting, and the culture in which you grow up can also shape your explicit beliefs and implicit biases.

Implicit Association Test (IAT)

What sets implicit biases apart from other forms is that they are subconscious – we don’t know if we have them.

However, researchers have developed the Implicit Association Test (IAT) tool to help reveal such biases.

The Implicit Association Test (IAT) is a psychological assessment used to measure an individual's unconscious biases and associations. The test measures how quickly a person associates concepts or groups (such as race or gender) with positive or negative attributes, revealing biases that may not be consciously acknowledged.

The IAT requires participants to categorize negative and positive words together with either images or words (Greenwald, McGhee, & Schwartz, 1998).

Tests are taken online and must be performed as quickly as possible; the faster you pair the words or faces of a category with an attribute, the stronger the association, and hence the stronger the bias you are inferred to hold about that category.

For example, the Race IAT requires participants to categorize White faces and Black faces alongside negative and positive words. The relative speed with which Black faces are associated with negative words is used as an indication of the level of anti-Black bias.
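
To make the measurement logic concrete, here is a minimal sketch of a simplified IAT-style score. It is not the official Project Implicit scoring procedure; the response times, block labels, and the simple_iat_score helper are all invented for illustration. The point is only that the latency gap between the two pairing conditions, scaled by response variability, is what indexes the strength of the association.

```python
# A minimal, simplified sketch of IAT-style scoring (NOT the official
# Project Implicit scoring algorithm). The idea: if a participant is slower
# when the pairing runs against their implicit association (the
# "incompatible" block), the latency gap, scaled by response variability,
# indexes the strength of that association. All response times below are
# invented purely for illustration.

from statistics import mean, stdev

# Hypothetical response times (in seconds) for one participant.
compatible_rts = [0.62, 0.58, 0.71, 0.65, 0.60]    # block pairing that matches the association
incompatible_rts = [0.81, 0.77, 0.92, 0.85, 0.79]  # block pairing that runs against it


def simple_iat_score(compatible, incompatible):
    """Mean latency difference scaled by the pooled standard deviation.

    Larger positive values indicate a stronger implicit association with
    the 'compatible' pairing (the participant was relatively slower when
    the pairing was reversed).
    """
    pooled_sd = stdev(compatible + incompatible)
    return (mean(incompatible) - mean(compatible)) / pooled_sd


print(f"Simplified IAT score: {simple_iat_score(compatible_rts, incompatible_rts):.2f}")
```

Published scoring procedures similarly scale a latency difference by the participant's own response variability, which is why the sketch divides by a pooled standard deviation rather than reporting a raw difference in milliseconds.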


Professor Brian Nosek and colleagues tested more than 700,000 subjects. They found that more than 70% of White subjects more easily associated White faces with positive words and Black faces with negative words, concluding that this was evidence of implicit racial bias (Nosek, Greenwald, & Banaji, 2007).

Outside of lab testing, it is very difficult to know if we do, in fact, possess these biases. The fact that they are so hard to detect is in the very nature of this form of bias, making them very dangerous in various real-world settings.

How to Reduce Implicit Bias

Because of the harmful nature of implicit biases, it is critical to examine how we can begin to remove them.

Practicing mindfulness is one potential way, as it reduces the stress and cognitive load that otherwise leads to relying on such biases.

A 2016 study found that brief mindfulness meditation decreased unconscious bias against Black people and elderly people (Lueke & Gibson, 2016), providing initial insight into the usefulness of this approach and paving the way for future research on this intervention.

Adjust your perspective

Another method is perspective-taking – looking beyond your own point of view so that you can consider how someone else may think or feel about something.

Researcher Belinda Gutierrez implemented a videogame called “Fair Play,” in which players assume the role of a Black graduate student named Jamal Davis.

As Jamal, players experience subtle race bias while completing “quests” to obtain a science degree.

Gutierrez hypothesized that participants who were randomly assigned to play the game would have greater empathy for Jamal and lower implicit race bias than participants randomized to read a narrative text (not perspective-taking) describing Jamal's experience (Gutierrez et al., 2014). Her hypothesis was supported, illustrating the benefits of perspective-taking in increasing empathy toward outgroup members.

Specific implicit bias training has also been incorporated in different educational and law enforcement settings. Research has found, for example, that diversity training improved men's attitudes toward women in STEM (Jackson, Hillard, & Schneider, 2014).

Training programs designed to target and help overcome implicit biases may also be beneficial for police officers (Plant & Peruche, 2005), but there is not enough conclusive evidence to completely support this claim. One pitfall of such training is a potential rebound effect.

Actively trying to inhibit stereotyping can actually cause the bias to rebound, eventually becoming stronger than if it had never been suppressed in the first place (Macrae, Bodenhausen, Milne, & Jetten, 1994). This is very similar to the "white bear" problem discussed in many psychology curricula.

This concept refers to the psychological process whereby deliberate attempts to suppress certain thoughts make them more likely to surface (Wegner & Schneider, 2003).

Education is crucial. Understanding what implicit biases are, how they arise, and how to recognize them in yourself and others is incredibly important in working towards overcoming such biases.

Learning about other cultures or outgroups and what language and behaviors may come off as offensive is critical as well. Education is a powerful tool that can extend beyond the classroom through books, media, and conversations.

On the bright side, implicit biases in the United States have been improving.

From 2007 to 2016, implicit biases have changed towards neutrality for sexual orientation, race, and skin-tone attitudes (Charlesworth & Banaji, 2019), demonstrating that it is possible to overcome these biases.

Books for further reading

As mentioned, education is extremely important. Here are a few places to get started in learning more about implicit biases:

  • Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do by Jennifer Eberhardt
  • Blindspot by Anthony Greenwald and Mahzarin Banaji
  • Implicit Racial Bias Across the Law by Justin Levinson and Robert Smith

Keywords and Terminology

To find materials on implicit bias and related topics, search databases and other tools using the following keywords:

  • "implicit bias"
  • "implicit gender bias"
  • "unconscious bias"
  • "implicit prejudices"
  • "hidden bias"
  • "implicit racial bias"
  • "cognitive bias"
  • "Implicit Association Test" or IAT
  • "implicit association"
  • "implicit social cognition"
  • bias
  • prejudices
  • "prejudice psychological aspects"
  • stereotypes

Is unconscious bias the same as implicit bias?

Yes, unconscious bias is the same as implicit bias. Both terms refer to the biases we carry without awareness or conscious control, which can affect our attitudes and actions toward others.

In what ways can implicit bias impact our interactions with others?

Implicit bias can impact our interactions with others by unconsciously influencing our attitudes, behaviors, and decisions. This can lead to stereotyping, prejudice, and discrimination, even when we consciously believe in equality and fairness.

It can affect various domains of life, including workplace dynamics, healthcare provision, law enforcement, and everyday social interactions.

What are some implicit bias examples?

Some examples of implicit biases include assuming a woman is less competent than a man in a leadership role, associating certain ethnicities with criminal behavior, or believing that older people are not technologically savvy.

Other examples include perceiving individuals with disabilities as less capable or assuming that someone who is overweight is lazy or unmotivated.

Aboud, F. E. (1988). Children and prejudice . B. Blackwell.

Banaji, M. R., & Greenwald, A. G. (1995). Implicit gender stereotyping in judgments of fame. Journal of Personality and Social Psychology , 68 (2), 181.

Baron, A. S., Dunham, Y., Banaji, M., & Carey, S. (2014). Constraints on the acquisition of social category concepts. Journal of Cognition and Development , 15 (2), 238-268.

Bertrand, M., & Mullainathan, S. (2004). Are Emily and Greg more employable than Lakisha and Jamal? A field experiment on labor market discrimination. American economic review , 94 (4), 991-1013.

Cameron, J. A., Alvarez, J. M., Ruble, D. N., & Fuligni, A. J. (2001). Children’s lay theories about ingroups and outgroups: Reconceptualizing research on prejudice. Personality and Social Psychology Review , 5 (2), 118-128.

Chapman, E. N., Kaatz, A., & Carnes, M. (2013). Physicians and implicit bias: how doctors may unwittingly perpetuate health care disparities. Journal of general internal medicine , 28 (11), 1504-1510.

Charlesworth, T. E., & Banaji, M. R. (2019). Patterns of implicit and explicit attitudes: I. Long-term change and stability from 2007 to 2016. Psychological science , 30(2), 174-192.

Goff, P. A., Jackson, M. C., Di Leone, B. A. L., Culotta, C. M., & DiTomasso, N. A. (2014). The essence of innocence: consequences of dehumanizing Black children. Journal of personality and social psychology, 106(4), 526.

Greenwald, A. G., & Banaji, M. R. (1995). Implicit social cognition: attitudes, self-esteem, and stereotypes. Psychological review, 102(1), 4.

Greenwald, A. G., McGhee, D. E., & Schwartz, J. L. (1998). Measuring individual differences in implicit cognition: the implicit association test. Journal of personality and social psychology , 74(6), 1464.

Greenwald, A. G., & Krieger, L. H. (2006). Implicit bias: Scientific foundations. California Law Review , 94 (4), 945-967.

Gutierrez, B., Kaatz, A., Chu, S., Ramirez, D., Samson-Samuel, C., & Carnes, M. (2014). “Fair Play”: a videogame designed to address implicit race bias through active perspective taking. Games for health journal , 3 (6), 371-378.

Jackson, S. M., Hillard, A. L., & Schneider, T. R. (2014). Using implicit bias training to improve attitudes toward women in STEM. Social Psychology of Education , 17 (3), 419-438.

Johnson, T. J., Winger, D. G., Hickey, R. W., Switzer, G. E., Miller, E., Nguyen, M. B., … & Hausmann, L. R. (2017). Comparison of physician implicit racial bias toward adults versus children. Academic pediatrics , 17 (2), 120-126.

Kahneman, D. (2011). Thinking, fast and slow . Macmillan.

Lueke, A., & Gibson, B. (2016). Brief mindfulness meditation reduces discrimination. Psychology of Consciousness: Theory, Research, and Practice , 3 (1), 34.

Macrae, C. N., Bodenhausen, G. V., Milne, A. B., & Jetten, J. (1994). Out of mind but back in sight: Stereotypes on the rebound. Journal of personality and social psychology , 67 (5), 808.

Mekawi, Y., & Bresin, K. (2015). Is the evidence from racial bias shooting task studies a smoking gun? Results from a meta-analysis. Journal of Experimental Social Psychology , 61 , 120-130.

Nosek, B. A., Greenwald, A. G., & Banaji, M. R. (2007). The Implicit Association Test at age 7: A methodological and conceptual review. Automatic processes in social thinking and behavior , 4 , 265-292.

Pierce, C. (1970). Offensive mechanisms. The black seventies , 265-282.

Plant, E. A., & Peruche, B. M. (2005). The consequences of race for police officers’ responses to criminal suspects. Psychological Science , 16 (3), 180-183.

Rynders, D. (2019). Battling Implicit Bias in the IDEA to Advocate for African American Students with Disabilities. Touro L. Rev. , 35 , 461.

Sinclair, S., Dunn, E., & Lowery, B. (2005). The relationship between parental racial attitudes and children’s implicit prejudice. Journal of Experimental Social Psychology , 41 (3), 283-289.

Steffens, M. C., & Jelenec, P. (2011). Separating implicit gender stereotypes regarding math and language: Implicit ability stereotypes are self-serving for boys and men, but not for girls and women. Sex Roles , 64(5-6), 324-335.

Watson, S., Appiah, O., & Thornton, C. G. (2011). The effect of name on pre‐interview impressions and occupational stereotypes: the case of black sales job applicants. Journal of Applied Social Psychology , 41 (10), 2405-2420.

Wegner, D. M., & Schneider, D. J. (2003). The white bear story. Psychological Inquiry , 14 (3-4), 326-329.

Wigboldus, D. H., Sherman, J. W., Franzese, H. L., & Knippenberg, A. V. (2004). Capacity and comprehension: Spontaneous stereotyping under cognitive load. Social Cognition , 22 (3), 292-309.

Further Information

Test yourself for bias.

  • Project Implicit (IAT Test), from Harvard University
  • Implicit Association Test, from the Social Psychology Network
  • Test Yourself for Hidden Bias, from Teaching Tolerance
  • How the Concept of Implicit Bias Came Into Being, with Dr. Mahzarin Banaji, Harvard University, author of Blindspot: Hidden Biases of Good People (5:28 minutes; includes a transcript)
  • Understanding Your Racial Biases, with John Dovidio, PhD, Yale University, from the American Psychological Association (11:09 minutes; includes a transcript)
  • Talking Implicit Bias in Policing, with Jack Glaser, Goldman School of Public Policy, University of California Berkeley (21:59 minutes)
  • Implicit Bias: A Factor in Health Communication, with Dr. Winston Wong, Kaiser Permanente (19:58 minutes)
  • Bias, Black Lives and Academic Medicine, with Dr. David Ansell on Your Health Radio, August 1, 2015 (21:42 minutes)
  • Uncovering Hidden Biases, a Google talk with Dr. Mahzarin Banaji, Harvard University
  • Impact of Implicit Bias on the Justice System (9:14 minutes)
  • Students Speak Up: What Bias Means to Them (2:17 minutes)
  • Weight Bias in Health Care, from Yale University (16:56 minutes)
  • Gender and Racial Bias in Facial Recognition Technology (4:43 minutes)

Journal Articles

  • Mitchell, G. (2018). An implicit bias primer. Virginia Journal of Social Policy & the Law, 25, 27–59.
  • Nosek, B. A., Greenwald, A. G., & Banaji, M. R. (2007). The Implicit Association Test at age 7: A methodological and conceptual review. Automatic Processes in Social Thinking and Behavior, 4, 265–292.
  • Hall, W. J., Chapman, M. V., Lee, K. M., Merino, Y. M., Thomas, T. W., Payne, B. K., … & Coyne-Beasley, T. (2015). Implicit racial/ethnic bias among health care professionals and its influence on health care outcomes: A systematic review. American Journal of Public Health, 105(12), e60–e76.
  • Burgess, D., Van Ryn, M., Dovidio, J., & Saha, S. (2007). Reducing racial bias among health care providers: Lessons from social-cognitive psychology. Journal of General Internal Medicine, 22(6), 882–887.
  • Boysen, G. A. (2010). Integrating implicit bias into counselor education. Counselor Education & Supervision, 49(4), 210–227.
  • Christian, S. (2013). Cognitive biases and errors as cause—and journalistic best practices as effect. Journal of Mass Media Ethics, 28(3), 160–174.
  • Whitford, D. K., & Emerson, A. M. (2019). Empathy intervention to reduce implicit bias in pre-service teachers. Psychological Reports, 122(2), 670–688.

An infographic titled '6 ways to combat implicit bias' with elaborations on each point and an associated image for each.


What is the difference between a bias and a heuristic explained in layman terms?

Can anyone explain the difference in a way that someone who does not have a clue about psychology can understand? Maybe examples could help.


  • @faustus Not all cognitive biases are heuristics, and not all cognitive biases are systemic errors. – Arnon Weinberg, Dec 4, 2017
  • @ArnonWeinberg I agree with both claims. Regarding the latter, on systematic errors, you're correct. However, I would also suggest that the way you're describing a cognitive bias is used inconsistently in the literature, i.e., I can cite references that describe it the way I have, just as I am sure you can for your way. – faustus, Dec 4, 2017
  • @faustus It would be good to add references for both versions to your answer so we can have a balanced view. I look forward to the edited answer. – Arnon Weinberg, Dec 4, 2017
  • The difference between a heuristic and a bias is: a heuristic can have a number of biases within it, but a bias cannot have a number of heuristics within it. – Darren, Feb 6, 2020
A heuristic is an approach to problem solving; a bias is a prejudice. So in what way do these terms confuse you?

I respectfully disagree. I have noticed that the terms bias and heuristic are used interchangeably in the literature, which can lead to confusion. The difference between them is subtle.


Heuristics, biases and algorithms are all related terms. The simplest way to describe them is as follows:

A heuristic is a rule, strategy, or similar mental shortcut that one can use to derive a solution to a problem. A heuristic that works all of the time is known as an algorithm. Consider the following scenario: you get lost in a maze; what can you do to escape? One solution is to place your hand on the outer wall of the maze and keep walking, without lifting your hand, until you find the exit. Heuristics can help us make sense of the world in a reliable way and reduce our mental load. However, they aren't always perfect, and there is also a tendency to use them inappropriately to form beliefs. A systematic error that results from the use of a heuristic is called a cognitive bias.
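
The "hand on the wall" trick above is itself a classic heuristic (often called the wall-follower or right-hand rule), so it can be written down as a short procedure. The sketch below is only an illustration: the grid maze, the cell symbols, and the wall_follow helper are all made up, and the rule is guaranteed to work only when the walls it follows are connected to the maze's outer boundary.

```python
# A minimal sketch of the "hand on the wall" maze heuristic (the right-hand
# rule): keep turning right whenever possible, otherwise go straight, then
# left, then back. The maze layout, symbols, and the wall_follow helper are
# invented for illustration; the rule is a heuristic, not a general
# algorithm, because it can fail when the exit is not reachable along walls
# connected to the outer boundary.

MAZE = [
    "#########",
    "#S..#...#",
    "##.##.#.#",
    "#..#..#.#",
    "#.##.##.E",
    "#.......#",
    "#########",
]


def wall_follow(maze):
    grid = [list(row) for row in maze]
    rows, cols = len(grid), len(grid[0])
    find = lambda ch: next((r, c) for r in range(rows) for c in range(cols) if grid[r][c] == ch)
    pos, goal = find("S"), find("E")
    dirs = [(-1, 0), (0, 1), (1, 0), (0, -1)]  # up, right, down, left (clockwise)
    facing = 1  # start facing right, with a wall on the right-hand side
    path = [pos]
    for _ in range(10 * rows * cols):  # safety bound: give up eventually
        if pos == goal:
            return path
        for turn in (1, 0, -1, 2):  # prefer right, then straight, then left, then back
            d = (facing + turn) % 4
            r, c = pos[0] + dirs[d][0], pos[1] + dirs[d][1]
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] != "#":
                facing, pos = d, (r, c)
                path.append(pos)
                break
    return None  # the heuristic did not find the exit within the step budget


route = wall_follow(MAZE)
if route:
    print(f"Exit reached in {len(route) - 1} steps")
else:
    print("Heuristic failed to find the exit")
```

Run as written, the walker traces along the walls and reaches the exit in a handful of steps; in a maze whose exit sits beside free-standing interior walls, the same rule can loop forever, which is exactly the gap between a heuristic and a guaranteed algorithm described above.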

Per suggestions, the following should be noted:

While the use of heuristics can lead to cognitive biases, not all cognitive biases are the result of heuristics. Generally, however, when an error in judgement or belief is identified empirically or anecdotally, there will be an attempt to account for it using heuristics.

Additionally, and more broadly, there is just as much ambiguity regarding the definition of cognitive bias in the literature.

For example, in this account of attribution biases, the following definition is provided:

Cognitive biases refer to systematic mistakes that derive from limits that are inherent in our capacity to process information. Because we are not capable of perceiving everything in our environment, our focus is automatically drawn to the most prominent or “eye-catching”—that is, perceptually salient—stimuli. This can lead us to formulate biased and inaccurate causal attributions (Taylor & Fiske, 1975). Specifically, we are prone to equate the most perceptually salient stimuli with the most causally influential stimuli.
(Shiraev, E. B., & Levy, D. A. (2016). Cross-cultural psychology: Critical thinking and contemporary applications. Taylor & Francis.)

When the correct answer is described, they see for themselves how their own judgment is biased in systematic ways. Such personal demonstrations make Tversky and Kahneman's points in a salient, simple, and powerful way. What was innovative about their work was that it drew a connection between heuristics and biases: A heuristic is a rule of thumb used as such in different contexts. A cognitive bias is a systematic error in our thinking.
(Morvan, C., & Jenkins, W. J. (2017). Judgment Under Uncertainty: Heuristics and Biases. CRC Press.)

Heuristics are the "shortcuts" that humans use to reduce task complexity in judgment and choice, and biases are the resulting gaps between normative behavior and the heuristically determined behavior (Kahneman et al., 1982).
(Chipman, S. E. (Ed.). (2016). The Oxford Handbook of Cognitive Science. Oxford University Press.)

Cognitive biases are subconscious deviations in judgement leading to perceptual distortion, inaccurate judgement and illogical interpretation. From an evolutionary point of view, they have developed because often speed was more important than accuracy. Biases occur due to information processing shortcuts (or heuristics – see Box 5.1), the brain's limited capacity to process information, social influence, and emotional and moral motivations.
(Cooper, N., & Frain, J. (Eds.). (2016). ABC of clinical reasoning. John Wiley & Sons.)

Although this isn't a psychology reference, the definition is a tad weird:

Architects rely on precedent knowledge to design their next building, believing the past buildings worked well and the knowledge can be transferred to their new design, promising similar performances. Cognitive biases (illusions) and potential errors can occur when using precedent knowledge for analogical, pre-parametric and qualitative design thinking.
(Zarzar, K. M., & Guney, A. (Eds.). (2008). Understanding Meaningful Environments: Architectural Precedents and the Question of Identity in Creative Design (Vol. 4). IOS Press.)

Although I am speculating, I would suggest the following explanation: strictly speaking, an error in normative reasoning need only occur once for it to potentially be the result of a cognitive bias. However, a cognitive bias would need to occur in a systematic manner in a study sample before it could be identified in empirical research and reported in the literature.

And that, my friends, concludes my demonstration on overkill.


  • I tried googling it, obviously, but as has been pointed out, the terms are used interchangeably in many places, which confused me at first. – Ayush Garg, Dec 5, 2017

