What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan. Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment.

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources.

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

Table of contents

  • Why is critical thinking important?
  • Critical thinking examples
  • How to think critically
  • Other interesting articles
  • Frequently asked questions about critical thinking

Why is critical thinking important?

Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process. The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing, critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.


Critical thinking examples

Critical thinking can help you to identify reliable sources of information that you can cite in your research paper. It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context
You’re researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words “sponsored content” appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context
You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

How to think critically

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test, these questions focus on the currency, relevance, authority, accuracy, and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion? Do they address alternative arguments?

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?

Other interesting articles

If you want to know more about ChatGPT, AI tools, citation, and plagiarism, make sure to check out some of our other articles with explanations and examples.

  • ChatGPT vs human editor
  • ChatGPT citations
  • Is ChatGPT trustworthy?
  • Using ChatGPT for your studies
  • What is ChatGPT?
  • Chicago style
  • Paraphrasing

Plagiarism

  • Types of plagiarism
  • Self-plagiarism
  • Avoiding plagiarism
  • Academic integrity
  • Consequences of plagiarism
  • Common knowledge


Frequently asked questions about critical thinking

Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy, it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test, focusing on the currency, relevance, authority, accuracy, and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search for, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. We tend to recall information best when it amplifies what we already believe and, relatedly, to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation or click the “Cite this Scribbr article” button to automatically add the citation to our free Citation Generator.

Ryan, E. (2023, May 31). What Is Critical Thinking? | Definition & Examples. Scribbr. Retrieved June 24, 2024, from https://www.scribbr.com/working-with-sources/critical-thinking/


Encyclopedia Britannica

[Image: statue of Aristotle (384–322 BC), Ancient Greek philosopher and scientist who established the foundations for the modern scientific method of enquiry]

critical thinking


critical thinking, in educational theory, mode of cognition using deliberative reasoning and impartial scrutiny of information to arrive at a possible solution to a problem. From the perspective of educators, critical thinking encompasses both a set of logical skills that can be taught and a disposition toward reflective open inquiry that can be cultivated. The term critical thinking was coined by American philosopher and educator John Dewey in the book How We Think (1910) and was adopted by the progressive education movement as a core instructional goal that offered a dynamic modern alternative to traditional educational methods such as rote memorization.

Critical thinking is characterized by a broad set of related skills usually including the abilities to


  • break down a problem into its constituent parts to reveal its underlying logic and assumptions
  • recognize and account for one’s own biases in judgment and experience
  • collect and assess relevant evidence, whether from personal observation and experimentation or from external sources
  • adjust and reevaluate one’s own thinking in response to what one has learned
  • form a reasoned assessment in order to propose a solution to a problem or a more accurate understanding of the topic at hand

Theorists have noted that such skills are only valuable insofar as a person is inclined to use them. Consequently, they emphasize that certain habits of mind are necessary components of critical thinking. This disposition may include curiosity, open-mindedness, self-awareness, empathy, and persistence.

Although there is a generally accepted set of qualities that are associated with critical thinking, scholarly writing about the term has highlighted disagreements over its exact definition and whether and how it differs from related concepts such as problem solving. In addition, some theorists have insisted that critical thinking be regarded and valued as a process and not as a goal-oriented skill set to be used to solve problems. Critical-thinking theory has also been accused of reflecting patriarchal assumptions about knowledge and ways of knowing that are inherently biased against women.

Dewey, who also used the term reflective thinking, connected critical thinking to a tradition of rational inquiry associated with modern science. From the turn of the 20th century, he and others working in the overlapping fields of psychology, philosophy, and educational theory sought to rigorously apply the scientific method to understand and define the process of thinking. They conceived critical thinking to be related to the scientific method but more open, flexible, and self-correcting; instead of a recipe or a series of steps, critical thinking would be a wider set of skills, patterns, and strategies that allow someone to reason through an intellectual topic, constantly reassessing assumptions and potential explanations in order to arrive at a sound judgment and understanding.

In the progressive education movement in the United States, critical thinking was seen as a crucial component of raising citizens in a democratic society. Instead of imparting a particular series of lessons or teaching only canonical subject matter, theorists thought that teachers should train students in how to think. As critical thinkers, such students would be equipped to be productive and engaged citizens who could cooperate and rationally overcome differences inherent in a pluralistic society.

Beginning in the 1970s and ’80s, critical thinking as a key outcome of school and university curriculum leapt to the forefront of U.S. education policy. In an atmosphere of renewed Cold War competition and amid reports of declining U.S. test scores, there were growing fears that the quality of education in the United States was falling and that students were unprepared. In response, a concerted effort was made to systematically define curriculum goals and implement standardized testing regimens, and critical-thinking skills were frequently included as a crucially important outcome of a successful education. A notable event in this movement was the release of the 1980 report of the Rockefeller Commission on the Humanities that called for the U.S. Department of Education to include critical thinking on its list of “basic skills.” Three years later the California State University system implemented a policy that required every undergraduate student to complete a course in critical thinking.

Critical thinking continued to be put forward as a central goal of education in the early 21st century. Its ubiquity in the language of education policy and in such guidelines as the Common Core State Standards in the United States generated some criticism that the concept itself was both overused and ill-defined. In addition, an argument was made by teachers, theorists, and others that educators were not being adequately trained to teach critical thinking.

GCFGlobal

Critical Thinking and Decision-Making: What is Critical Thinking?

Lesson 1: What Is Critical Thinking?

Critical thinking is a term that gets thrown around a lot. You've probably heard it used often throughout the years whether it was in school, at work, or in everyday conversation. But when you stop to think about it, what exactly is critical thinking and how do you do it ?


Simply put, critical thinking is the act of deliberately analyzing information so that you can make better judgments and decisions. It involves using things like logic, reasoning, and creativity to draw conclusions and generally understand things better.


This may sound like a pretty broad definition, and that's because critical thinking is a broad skill that can be applied to so many different situations. You can use it to prepare for a job interview, manage your time better, make decisions about purchasing things, and so much more.

The process


As humans, we are constantly thinking. It's something we can't turn off. But not all of it is critical thinking. No one thinks critically 100% of the time... that would be pretty exhausting! Instead, it's an intentional process, something that we consciously use when we're presented with difficult problems or important decisions.

Improving your critical thinking


In order to become a better critical thinker, it's important to ask questions when you're presented with a problem or decision, before jumping to any conclusions. You can start with simple ones like What do I currently know? and How do I know this? These can help to give you a better idea of what you're working with and, in some cases, simplify more complex issues.  

Real-world applications

[Illustration: a smartphone displaying an article headlined “Study: Cats are better than dogs”]

Let's take a look at how we can use critical thinking to evaluate online information. Say a friend of yours posts a news article on social media and you're drawn to its headline. If you were to use your everyday automatic thinking, you might accept it as fact and move on. But if you were thinking critically, you would first analyze the available information and ask some questions:

  • What's the source of this article?
  • Is the headline potentially misleading?
  • What are my friend's general beliefs?
  • Do their beliefs inform why they might have shared this?

[Illustration: “Super Cat Blog” and “According to survey of cat owners” highlighted in the article on the smartphone]

After analyzing all of this information, you can draw a conclusion about whether or not you think the article is trustworthy.

Critical thinking has a wide range of real-world applications. It can help you to make better decisions, become more hireable, and generally better understand the world around you.



Stanford Encyclopedia of Philosophy


Supplement to Critical Thinking

Educational Methods

Experiments have shown that educational interventions can improve critical thinking abilities and dispositions, as measured by standardized tests. Glaser (1941) developed teaching materials suitable for senior primary school, high school and college students. To test their effectiveness, he developed with his sponsor Goodwin Watson the Watson-Glaser Tests of Critical Thinking, whose descendants are in widespread global use under the name “Watson-Glaser Critical Thinking Appraisal” (Watson & Glaser 1980a, 1980b, 1994). He found that senior secondary school students receiving 10 weeks of instruction using these materials improved their scores on these tests more than other such students receiving the standard English curriculum during the 10 weeks, to a degree that was statistically significant (i.e., probably not due to chance).

More recently, Abrami et al. (2015) summarized in a meta-analysis the best available evidence on the effectiveness of various strategies for teaching students to think critically. The meta-analysis used as a measure of effectiveness a modified version of a statistical measure known as “Cohen’s d”: the ratio of a difference in mean score to the standard deviation (SD) of the scores in a reference group. A difference of 0.2 SD is a small effect, a difference of 0.5 SD is a moderate effect, and a difference of 0.8 SD is a large effect (Cohen 1988: 25–27). Abrami et al. (2015) found a weighted mean effect size of 0.30 among 341 effect sizes, with effect sizes ranging from −1 to +2. This methodologically careful meta-analysis provides strong statistical evidence that explicit instruction for critical thinking can improve critical thinking abilities and dispositions, as measured by standardized tests.
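The effect-size arithmetic described above can be sketched in a few lines of Python. The numbers below are purely hypothetical, for illustration; they are not drawn from Glaser (1941), Abrami et al. (2015), or any other study cited here:

```python
def cohens_d(mean_intervention, mean_control, sd_reference):
    """Cohen's d: the ratio of a difference in mean score to the
    standard deviation (SD) of the scores in a reference group."""
    return (mean_intervention - mean_control) / sd_reference

def weighted_mean(effect_sizes, weights):
    """Combine per-study effect sizes into a single weighted mean,
    as a meta-analysis does."""
    return sum(d * w for d, w in zip(effect_sizes, weights)) / sum(weights)

# Hypothetical test scores: an intervention group averaging 5 points
# above controls, with a reference-group SD of 10 points.
d = cohens_d(75.0, 70.0, 10.0)
print(d)  # 0.5, a moderate effect by Cohen's benchmarks
```

By those benchmarks, the weighted mean effect size of 0.30 reported by Abrami et al. (2015) falls between a small and a moderate effect.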

Although contemporary meta-analysis provides a more justified verdict on claims of causal effectiveness than other methods of investigation, it does not give the reader an intuitive grasp of what difference a particular intervention makes to the lives of those who receive it. To get an appreciation of this difference, it helps to read the testimony of the teachers and students in the Laboratory School of Chicago where Dewey’s ideas obtained concreteness. The history of the school, written by two of its former teachers in collaboration with Dewey, makes the following claim for the effects of its approach:

As a result of this guarding and direction of their freedom, the children retained the power of initiative naturally present in young children through their inquisitive interests. This spirit of inquiry was given plenty of opportunity and developed with most of the children into the habit of trying a thing out for themselves. Thus, they gradually became familiar with, and to varying degrees skilled in, the use of the experimental method to solve problems in all areas of their experience. (Mayhew & Edwards 1936: 402–403)

A science teacher in the school wrote:

I think the children did get the scientific attitude of mind. They found out things for themselves. They worked out the simplest problems that may have involved a most commonplace and everyday fact in the manner that a really scientific investigator goes to work. (Mayhew & Edwards 1936: 403)

An alumna of the school summed up the character of its former students as follows:

It is difficult for me to be restrained about the character building results of the Dewey School. As the years have passed and as I have watched the lives of many Dewey School children, I have always been astonished at the ease which fits them into all sorts and conditions of emergencies. They do not vacillate and flounder under unstable emotions; they go ahead and work out the problem in hand, guided by their positively formed working habits. Discouragement to them is non-existent, almost ad absurdum. For that very fact, accomplishment in daily living is inevitable. Whoever has been given the working pattern of tackling problems has a courage born of self-confidence and achieves. (Mayhew & Edwards 1936: 406–407)

In the absence of control groups, of standardized tests, and of statistical methods of controlling for confounding variables, such testimonies are weak evidence of the effectiveness of educational interventions in developing the abilities and dispositions of a critical thinker—in Dewey’s conception, a scientific attitude. But they give a vivid impression of what might be accomplished in an educational system that takes the development of critical thinking as a goal.

Dewey established the Laboratory School explicitly as an experiment to test his theory of knowledge, which

emphasized the part in the development of thought of problems which originated in active situations and also the necessity of testing thought by action if thought was to pass over into knowledge. (Dewey 1936: 464)

Hence the curriculum of the school started from situations familiar to children from their home life (such as preparing food and making clothing) and posed problems that the children were to solve by doing things and noting the consequences. This curriculum was adjusted in the light of its observed results in the classroom.

The school’s continued experimentation with the subject matter of the elementary curriculum proved that classroom results were best when activities were in accord with the child’s changing interests, his growing consciousness of the relation of means and ends, and his increasing willingness to perfect means and to postpone satisfactions in order to arrive at better ends…. The important question for those guiding this process of growth, and of promoting the alignment and cooperation of interest and effort, is this. What specific subject-matter or mode of skill has such a vital connection with the child’s interest, existing powers, and capabilities as will extend the one [the interest–DH] and stimulate, exercise, and carry forward the others [the powers and capabilities–DH] in a progressive course of action? (Mayhew & Edwards 1936: 420–421)

In an appendix to the history of the Laboratory School, Dewey (1936: 468–469) acknowledges that the school did not solve the problem of finding things in the child’s present experience out of which would grow more elaborate, technical and organized knowledge. Passmore (1980: 91) notes one difficulty of starting from children’s out-of-school experiences: they differ a lot from one child to another. More fundamentally, the everyday out-of-school experiences of a child provide few links to the systematic knowledge of nature and of human history that humanity has developed and that schools should pass on to the next generation. If children are to acquire such knowledge through investigation of problems, teachers must first provide information as a basis for formulating problems that interest them (Passmore 1980: 93–94).

More than a century has passed since Dewey’s experiment. In the interim, researchers have refined the methodology of experimenting with human subjects, in educational research and elsewhere. They have also developed the methodology of meta-analysis for combining the results of various experiments to form a comprehensive picture of what has been discovered. Abrami et al. (2015) report the results of such a meta-analysis of all the experimental and quasi-experimental studies published or archived before 2010 that used as outcome variables standardized measures of critical thinking abilities or dispositions of the sort enumerated in Facione 1990a and described in sections 8 and 9 of the main entry. By an experimental study, they mean one in which participants are divided randomly into two groups, one of which receives the educational intervention designed to improve critical thinking and the other of which serves as a control; they found few such experiments, because of the difficulty of achieving randomization in the classrooms where the studies were conducted. By a quasi-experiment, they mean a study with an intervention group that receives an educational intervention designed to improve critical thinking and a control group, but without random allocation to the two groups. Initially, they included also what they called “pre-experiments”, with single-group pretest-posttest designs, but decided at the analysis stage not to include these studies. By a standardized measure, they mean a test with norms derived from previous administration of the test, as set out in the test’s manual, such as the Watson-Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), the Cornell Critical Thinking Tests (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985; 2005), the California Critical Thinking Skills Test (Facione 1990b, 1992) and the California Critical Thinking Dispositions Inventory (Facione & Facione 1992; Facione, Facione, & Giancarlo 2001). 
They included all such studies in which the educational intervention lasted at least three hours and the participants were at least six years old.

In these studies they found 341 effect sizes. They rated each educational intervention according to the degree to which it involved dialogue, anchored instruction, and mentoring. They found that each of these factors increased the effectiveness of the educational intervention, and that they were most effective when combined. They explained the three factors as follows.

Dialogue: In critical dialogue, which historically goes back to Socrates, individuals discuss a problem together. The dialogue can be oral or written, and cooperative or adversarial. It can take the form of asking questions, discussion, or debate. Some curricula designed to promote critical thinking establish “communities of inquiry” among the students. Such communities were a prominent feature of Dewey’s Laboratory School, incorporated as a means of promoting the primary moral objective of fostering a spirit of social cooperation among the children.

An important aspect of this conditioning process by means of the school’s daily practices was to aid each child in forming a habit of thinking before doing in all of his various enterprises. The daily classroom procedure began with a face-to-face discussion of the work of the day and its relation to that of the previous period. The new problem was then faced, analyzed, and possible plans and resources for its solution suggested by members of the group. The children soon grew to like this method. It gave both individual and group a sense of power to be intelligent, to know what they wanted to do before they did it, and to realize the reasons why one plan was preferred to another. It also enlisted their best effort to prove the validity of their judgment by testing the plan in action. Each member of the group thus acquired a habit of observing, criticizing, and integrating values in thought, in order that they should guide the action that would integrate them in fact. The value of thus previsioning consequences of action before they became fixed as fact was emphasized in the school’s philosophy. The social implication is evident. The conscious direction of his actions toward considered social ends became an unfailing index of the child’s progress toward maturity. (Mayhew & Edwards 1936: 423–424)

Communities of inquiry are also a feature of the Montessori method described by Thayer-Bacon (2000) and of the Philosophy for Children program developed by Matthew Lipman (Splitter 1987). Lipman (2003) examines theoretically what is involved in creating communities of inquiry. Hitchcock (2021) argues that the most obvious way for schools to develop critical thinking is to foster development of communities of inquiry.

Anchored instruction : In anchored instruction, whose advocacy goes back to Rousseau (1762) and Dewey (1910), there is an effort to present students with problems that make sense to them, engage them, and stimulate them to inquire. Simulations, role-playing and presentation of ethical or medical dilemmas are methods of anchoring.

Mentoring : Mentoring is a one-on-one relationship in which someone with more relevant expertise (the mentor) interacts with someone with less (the mentee). The mentor acts as a model and as a critic correcting errors by the mentee. Examples of mentoring are an advisor talking to a student, a physician modeling a procedure for a medical student, and an employee correcting an intern. Abrami et al. (2015) identified three kinds of mentoring in the studies that they analyzed: one-on-one teacher-student interaction, peer-led dyads, and internships.

Abrami et al. (2015) also compared educational interventions with respect to whether they were part of subject-matter instruction. For this purpose, they used a distinction among four types of intervention articulated by Ennis (1989). A general approach tries to teach critical thinking separately from subject-matter instruction. An infusion approach combines deep subject-matter instruction in which students are encouraged to think critically with explicit reference to critical thinking principles. An immersion approach provides deep subject-matter instruction with encouragement to think critically, but without explicit reference to critical thinking principles. A mixed approach combines the general approach with either the infusion or the immersion approach; students combine a separate thread or course aimed at teaching general critical thinking principles with deep subject-matter instruction in which they are encouraged to think critically about the subject-matter. Although the average effect size in the studies using a mixed intervention (+0.38) was greater than the average effect sizes in the studies using general (+0.26), infusion (+0.29) and immersion (+0.23) interventions, the difference was not statistically significant; in other words, it might have been due to chance.

Cleghorn (2021), Makaiau (2021), and Hiner (2021) make specific suggestions for fostering critical thinking respectively in elementary, secondary and post-secondary education. Vincent-Lancrin et al. (2019) report the results of a project of the Organization for Economic Cooperation and Development to develop with teachers and schools in 11 countries resources for fostering creativity and critical thinking in elementary and secondary schools.

Ennis (2013, 2018) has made a detailed proposal for a mixed approach to teaching critical thinking across the curriculum of undergraduate education. Attempts at implementing such an approach have faced difficulties. Weinstein (2013: 209–213) describes the attempt at Montclair State University in Montclair, New Jersey, from 1987 through the 1990s. He reports that the university’s requirement to include critical thinking in all general education courses led to the use of the concept in identifying topics and tasks in course syllabi, but without a unifying theoretical basis. The committee that approved courses as satisfying a general education requirement ignored the relation of curricular outcomes to critical thinking, and focused instead on work requirements with a prima facie relation to reflective thought: term papers, projects, group work, and dialogue. Sheffield (2018) reports similar difficulties encountered in his position from 2012 to 2015 as the inaugural Eugene H. Fram Chair in Applied Critical Thinking at Rochester Institute of Technology (RIT) in Rochester, New York. A cross-disciplinary faculty advisory group was not ready to accept RIT’s approved definition of critical thinking, but never reached a consensus on an alternative. Payette and Ross (2016), on the other hand, report widespread acceptance of the Paul-Elder framework, which involves elements of thought, intellectual standards, and intellectual virtues (Paul & Elder 2006). Sheffield (2018) reports that many colleges and universities in the United States have received funding for so-called “Quality Enhancement Plans” (QEPs) devoted to critical thinking, many of them written by Paul and Elder or developed in consultation with them. He faults the plans for having a typical time frame of five years, which he argues is probably too short for meaningful results, since lasting institutional change is often extremely slow.

Copyright © 2022 by David Hitchcock < hitchckd @ mcmaster . ca >


The Stanford Encyclopedia of Philosophy is copyright © 2024 by The Metaphysics Research Lab , Department of Philosophy, Stanford University

Library of Congress Catalog Data: ISSN 1095-5054




Critical Thinking: Where to Begin


If you are new to critical thinking or wish to deepen your conception of it, we recommend you review the content below and bookmark this page for future reference.

Our Conception of Critical Thinking...


"Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action. In its exemplary form, it is based on universal intellectual values that transcend subject matter divisions: clarity, accuracy, precision, consistency, relevance, sound evidence, good reasons, depth, breadth, and fairness..."

"Critical thinking is self-guided, self-disciplined thinking which attempts to reason at the highest level of quality in a fairminded way. People who think critically attempt, with consistent and conscious effort, to live rationally, reasonably, and empathically. They are keenly aware of the inherently flawed nature of human thinking when left unchecked. They strive to diminish the power of their egocentric and sociocentric tendencies. They use the intellectual tools that critical thinking offers – concepts and principles that enable them to analyze, assess, and improve thinking. They work diligently to develop the intellectual virtues of intellectual integrity, intellectual humility, intellectual civility, intellectual empathy, intellectual sense of justice and confidence in reason. They realize that no matter how skilled they are as thinkers, they can always improve their reasoning abilities and they will at times fall prey to mistakes in reasoning, human irrationality, prejudices, biases, distortions, uncritically accepted social rules and taboos, self-interest, and vested interest.

They strive to improve the world in whatever ways they can and contribute to a more rational, civilized society. At the same time, they recognize the complexities often inherent in doing so. They strive never to think simplistically about complicated issues and always to consider the rights and needs of relevant others. They recognize the complexities in developing as thinkers, and commit themselves to life-long practice toward self-improvement. They embody the Socratic principle: The unexamined life is not worth living , because they realize that many unexamined lives together result in an uncritical, unjust, dangerous world."

Why Critical Thinking?


The Problem:

Everyone thinks; it is our nature to do so. But much of our thinking, left to itself, is biased, distorted, partial, uninformed, or downright prejudiced. Yet the quality of our lives and that of what we produce, make, or build depends precisely on the quality of our thought. Shoddy thinking is costly, both in money and in quality of life. Excellence in thought, however, must be systematically cultivated.

A Brief Definition:

Critical thinking is the art of analyzing and evaluating thinking with a view to improving it.

The Result:

  A well-cultivated critical thinker:

  • raises vital questions and problems, formulating them clearly and precisely;
  • gathers and assesses relevant information, using abstract ideas to interpret it effectively;
  • comes to well-reasoned conclusions and solutions, testing them against relevant criteria and standards;
  • thinks openmindedly within alternative systems of thought, recognizing and assessing, as need be, their assumptions, implications, and practical consequences; and
  • communicates effectively with others in figuring out solutions to complex problems.

Critical thinking is, in short, self-directed, self-disciplined, self-monitored, and self-corrective thinking. It requires rigorous standards of excellence and mindful command of their use. It entails effective communication and problem-solving abilities, and a commitment to overcoming our native egocentrism and sociocentrism. Read more about our concept of critical thinking .

The Essential Dimensions of Critical Thinking


Our conception of critical thinking is based on the substantive approach developed by Dr. Richard Paul and his colleagues at the Center and Foundation for Critical Thinking over multiple decades. It is relevant to every subject, discipline, and profession, and to reasoning through the problems of everyday life. It entails five essential dimensions of critical thinking:

The first three dimensions can be summarized as follows. The elements or structures of thought enable us to "take our thinking apart" and analyze it. The intellectual standards are used to assess and evaluate the elements. The intellectual traits are dispositions of mind embodied by the fairminded critical thinker. To cultivate the mind, we need command of these essential dimensions, and we need to consistently apply them as we think through the many problems and issues in our lives.

The Elements of Reasoning and Intellectual Standards


To learn more about the elements of thought and how to apply the intellectual standards, check out our interactive model. Simply click on the link below, scroll to the bottom of the page, and explore the model with your mouse.

Why the Analysis of Thinking Is Important

If you want to think well, you must understand at least the rudiments of thought, the most basic structures out of which all thinking is made. You must learn how to take thinking apart.

Analyzing the Logic of a Subject

When we understand the elements of reasoning, we realize that all subjects, all disciplines, have a fundamental logic defined by the structures of thought embedded within them. Therefore, to lay bare a subject’s most fundamental logic, we should begin with these questions.


Going Deeper...


The Critical Thinking Bookstore  

Our online bookstore houses numerous books and teacher's manuals, Thinker's Guides, videos, and other educational materials.

Learn From Our Fellows and Scholars

Watch our Event Calendar, which provides an overview of all upcoming conferences and academies hosted by the Foundation for Critical Thinking. Clicking an entry on the Event Calendar will bring up that event's details and the option to register. For those interested in online learning, the Foundation offers accredited online courses in critical thinking for both educators and the general public, as well as an online test for evaluating basic comprehension of critical thinking concepts. We are in the process of developing more online learning tools and tests to offer the community.

Utilizing this Website

This website contains a large amount of research and an online library of articles, both of which are freely available to the public. We also invite you to become a member of the Critical Thinking Community, where you will gain access to more tools and materials. If you cannot locate a resource on a specific topic or concept, try searching for it using our Search Tool, which appears at the upper right of every page on the website.

SkillsYouNeed



What is Critical Thinking?

Critical thinking is the ability to think clearly and rationally, understanding the logical connection between ideas. Critical thinking has been the subject of much debate and thought since the time of early Greek philosophers such as Plato and Socrates, and it remains a subject of discussion in the modern age, for example in relation to the ability to recognise fake news.

Critical thinking might be described as the ability to engage in reflective and independent thinking.

In essence, critical thinking requires you to use your ability to reason. It is about being an active learner rather than a passive recipient of information.

Critical thinkers rigorously question ideas and assumptions rather than accepting them at face value. They will always seek to determine whether the ideas, arguments and findings represent the entire picture and are open to finding that they do not.

Critical thinkers will identify, analyse and solve problems systematically rather than by intuition or instinct.

Someone with critical thinking skills can:

  • Understand the links between ideas.
  • Determine the importance and relevance of arguments and ideas.
  • Recognise, build and appraise arguments.
  • Identify inconsistencies and errors in reasoning.
  • Approach problems in a consistent and systematic way.
  • Reflect on the justification of their own assumptions, beliefs and values.

Critical thinking is thinking about things in certain ways so as to arrive at the best possible solution in the circumstances that the thinker is aware of. In more everyday language, it is a way of thinking about whatever is presently occupying your mind so that you come to the best possible conclusion.

Critical Thinking is:

A way of thinking about particular things at a particular time; it is not the accumulation of facts and knowledge or something that you can learn once and then use in that form forever, such as the nine times table you learn and use in school.

The Skills We Need for Critical Thinking

The skills that we need in order to be able to think critically are varied and include observation, analysis, interpretation, reflection, evaluation, inference, explanation, problem solving, and decision making.

Specifically we need to be able to:

  • Think about a topic or issue in an objective and critical way.
  • Identify the different arguments there are in relation to a particular issue.
  • Evaluate a point of view to determine how strong or valid it is.
  • Recognise any weaknesses or negative points that there are in the evidence or argument.
  • Notice what implications there might be behind a statement or argument.
  • Provide structured reasoning and support for an argument that we wish to make.

The Critical Thinking Process

You should be aware that none of us think critically all the time.

Sometimes we think in almost any way but critically, for example when our self-control is affected by anger, grief or joy or when we are feeling just plain ‘bloody minded’.

On the other hand, the good news is that, since our critical thinking ability varies according to our current mindset, most of the time we can learn to improve our critical thinking ability by developing certain routine activities and applying them to all problems that present themselves.

Once you understand the theory of critical thinking, improving your critical thinking skills takes persistence and practice.

Try this simple exercise to help you to start thinking critically.

Think of something that someone has recently told you. Then ask yourself the following questions:

Who said it?

Someone you know? Someone in a position of authority or power? Does it matter who told you this?

What did they say?

Did they give facts or opinions? Did they provide all the facts? Did they leave anything out?

Where did they say it?

Was it in public or in private? Did other people have a chance to respond and provide an alternative account?

When did they say it?

Was it before, during or after an important event? Is timing important?

Why did they say it?

Did they explain the reasoning behind their opinion? Were they trying to make someone look good or bad?

How did they say it?

Were they happy or sad, angry or indifferent? Did they write it or say it? Could you understand what was said?

What are you Aiming to Achieve?

One of the most important aspects of critical thinking is to decide what you are aiming to achieve and then make a decision based on a range of possibilities.

Once you have clarified that aim for yourself you should use it as the starting point in all future situations requiring thought and, possibly, further decision making. Where needed, make your workmates, family or those around you aware of your intention to pursue this goal. You must then discipline yourself to keep on track until changing circumstances mean you have to revisit the start of the decision making process.

However, there are things that get in the way of simple decision making. We all carry with us a range of likes and dislikes, learnt behaviours and personal preferences developed throughout our lives; they are the hallmarks of being human. A major contribution to ensuring we think critically is to be aware of these personal characteristics, preferences and biases and make allowance for them when considering possible next steps, whether they are at the pre-action consideration stage or as part of a rethink caused by unexpected or unforeseen impediments to continued progress.

The more clearly we are aware of ourselves, our strengths and weaknesses, the more likely our critical thinking will be productive.

The Benefit of Foresight

Perhaps the most important element of thinking critically is foresight.

Almost all decisions we make and implement don’t prove disastrous if we find reasons to abandon them. However, our decision making will be infinitely better and more likely to lead to success if, when we reach a tentative conclusion, we pause and consider the impact on the people and activities around us.

The elements needing consideration are generally numerous and varied. In many cases, consideration of one element from a different perspective will reveal potential dangers in pursuing our decision.

For instance, moving a business activity to a new location may improve potential output considerably but it may also lead to the loss of skilled workers if the distance moved is too great. Which of these is the more important consideration? Is there some way of lessening the conflict?

These are the sort of problems that may arise from incomplete critical thinking, a demonstration perhaps of the critical importance of good critical thinking.

Further Reading from Skills You Need

The Skills You Need Guide for Students

The Skills You Need Guide for Students

Skills You Need

Develop the skills you need to make the most of your time as a student.

Our eBooks are ideal for students at all stages of education, school, college and university. They are full of easy-to-follow practical information that will help you to learn more effectively and get better grades.

In Summary:

  • Critical thinking is aimed at achieving the best possible outcomes in any situation. In order to achieve this it must involve gathering and evaluating information from as many different sources as possible.
  • Critical thinking requires a clear, often uncomfortable, assessment of your personal strengths, weaknesses and preferences and their possible impact on decisions you may make.
  • Critical thinking requires the development and use of foresight as far as this is possible. As Doris Day sang, “the future’s not ours to see”.
  • Implementing the decisions made arising from critical thinking must take into account an assessment of possible outcomes and ways of avoiding potentially negative outcomes, or at least lessening their impact.
  • Critical thinking involves reviewing the results of the application of decisions made and implementing change where possible.

It might be thought that we are overextending our demands on critical thinking in expecting that it can help us to construct focused meaning, rather than simply examining the information given and the knowledge we have acquired to see whether we can, if necessary, construct a meaning that will be acceptable and useful.

After all, almost no information we have available to us, either externally or internally, carries any guarantee of its longevity or appropriateness. Neat step-by-step instructions may provide some sort of trellis on which our basic understanding of critical thinking can blossom, but they do not and cannot provide any assurance of certainty, utility or longevity.

Critical Thinking Definition, Skills, and Examples

Critical thinking refers to the ability to analyze information objectively and make a reasoned judgment. It involves the evaluation of sources, such as data, facts, observable phenomena, and research findings.

Good critical thinkers can draw reasonable conclusions from a set of information, and discriminate between useful and less useful details to solve problems or make decisions. These skills are especially helpful at school and in the workplace, where employers prioritize the ability to think critically. Find out why and see how you can demonstrate that you have this ability.

Examples of Critical Thinking

The circumstances that demand critical thinking vary from industry to industry. Some examples include:

  • A triage nurse analyzes the cases at hand and decides the order by which the patients should be treated.
  • A plumber evaluates the materials that would best suit a particular job.
  • An attorney reviews the evidence and devises a strategy to win a case or to decide whether to settle out of court.
  • A manager analyzes customer feedback forms and uses this information to develop a customer service training session for employees.

Why Do Employers Value Critical Thinking Skills?

Employers want job candidates who can evaluate a situation using logical thought and offer the best solution.

Someone with critical thinking skills can be trusted to make decisions independently and will not need constant handholding or micromanaging.

Critical thinking abilities are among the most sought-after skills in almost every industry and workplace. You can demonstrate critical thinking by using related keywords in your resume and cover letter, and during your interview.

How to Demonstrate Critical Thinking in a Job Search

If critical thinking is a key phrase in the job listings you are applying for, be sure to emphasize your critical thinking skills throughout your job search.

Add Keywords to Your Resume

You can use critical thinking keywords (analytical, problem solving, creativity, etc.) in your resume. When describing your work history, include top critical thinking skills that accurately describe you. You can also include them in your resume summary, if you have one.

For example, your summary might read, “Marketing Associate with five years of experience in project management. Skilled in conducting thorough market research and competitor analysis to assess market trends and client needs, and to develop appropriate acquisition tactics.”

Mention Skills in Your Cover Letter

Include these critical thinking skills in your cover letter. In the body of your letter, mention one or two of these skills, and give specific examples of times when you have demonstrated them at work. Think about times when you had to analyze or evaluate materials to solve a problem.

Show the Interviewer Your Skills

You can use these skill words in an interview. Discuss a time when you were faced with a particular problem or challenge at work and explain how you applied critical thinking to solve it.

Some interviewers will give you a hypothetical scenario or problem and ask you to use critical thinking skills to solve it. In this case, explain your thought process thoroughly: interviewers are typically more interested in how you arrive at your solution than in the solution itself, and they want to see you analyze and evaluate (key parts of critical thinking) the scenario or problem at hand.

Of course, each job will require different skills and experiences, so make sure you read the job description carefully and focus on the skills listed by the employer.

Top Critical Thinking Skills

Keep these in-demand skills in mind as you refine your critical thinking practice, whether for work or school.

Analytical

Part of critical thinking is the ability to carefully examine something, whether it is a problem, a set of data, or a text. People with analytical skills can examine information, understand what it means, and properly explain to others the implications of that information.

  • Asking Thoughtful Questions
  • Data Analysis
  • Interpretation
  • Questioning Evidence
  • Recognizing Patterns

Communication

Often, you will need to share your conclusions with your employers or with a group of classmates or colleagues. You need to be able to communicate with others to share your ideas effectively. You might also need to engage in critical thinking in a group. In this case, you will need to work with others and communicate effectively to figure out solutions to complex problems.

  • Active Listening
  • Collaboration
  • Explanation
  • Interpersonal
  • Presentation
  • Verbal Communication
  • Written Communication

Creativity

Critical thinking often involves creativity and innovation. You might need to spot patterns in the information you are looking at or come up with a solution that no one else has thought of before. All of this involves a creative eye and the ability to take a different approach from everyone else's.

  • Flexibility
  • Conceptualization
  • Imagination
  • Drawing Connections
  • Synthesizing

Open-Mindedness

To think critically, you need to be able to put aside any assumptions or judgments and merely analyze the information you receive. You need to be objective, evaluating ideas without bias.

  • Objectivity
  • Observation

Problem-Solving

Problem-solving is another critical thinking skill that involves analyzing a problem, generating and implementing a solution, and assessing the success of the plan. Employers don't simply want employees who can think about information critically; they also want employees who can come up with practical solutions.

  • Attention to Detail
  • Clarification
  • Decision Making
  • Groundedness
  • Identifying Patterns

More Critical Thinking Skills

  • Inductive Reasoning
  • Deductive Reasoning
  • Noticing Outliers
  • Adaptability
  • Emotional Intelligence
  • Brainstorming
  • Optimization
  • Restructuring
  • Integration
  • Strategic Planning
  • Project Management
  • Ongoing Improvement
  • Causal Relationships
  • Case Analysis
  • Diagnostics
  • SWOT Analysis
  • Business Intelligence
  • Quantitative Data Management
  • Qualitative Data Management
  • Risk Management
  • Scientific Method
  • Consumer Behavior

Key Takeaways

  • Demonstrate you have critical thinking skills by adding relevant keywords to your resume.
  • Mention pertinent critical thinking skills in your cover letter, too, and include an example of a time when you demonstrated them at work.
  • Finally, highlight critical thinking skills during your interview. For instance, you might discuss a time when you were faced with a challenge at work and explain how you applied critical thinking skills to solve it.

Introduction to Logic and Critical Thinking

(10 reviews)

Matthew Van Cleave, Lansing Community College

Copyright Year: 2016

Publisher: Matthew J. Van Cleave

Language: English

Conditions of use: Attribution

Reviewed by "yusef" Alexander Hayes, Professor, North Shore Community College on 6/9/21

Comprehensiveness rating: 5

Formal and informal reasoning, argument structure, and fallacies are covered comprehensively, meeting the author's goal of both depth and succinctness.

Content Accuracy rating: 5

The book is accurate.

Relevance/Longevity rating: 5

While many modern examples are used, and they are helpful, they are not strictly necessary. Logical principles and skills have long proven their usefulness, and this text presents them clearly with many examples.

Clarity rating: 5

It is obvious that the author cares about their subject, audience, and students. The text is comprehensible and interesting.

Consistency rating: 5

The format is easy to understand and is consistent in framing.

Modularity rating: 5

This text would be easy to adapt.

Organization/Structure/Flow rating: 5

The organization is excellent; my one suggestion would be a concluding chapter.

Interface rating: 5

I accessed the PDF version and it would be easy to work with.

Grammatical Errors rating: 5

The writing is excellent.

Cultural Relevance rating: 5

This is not an offensive text.

Reviewed by Susan Rottmann, Part-time Lecturer, University of Southern Maine on 3/2/21

Comprehensiveness rating: 4

I reviewed this book for a course titled "Creative and Critical Inquiry into Modern Life." It won't meet all my needs for that course, but I haven't yet found a book that would. I wanted to review this one because it states in the preface that it fits better for a general critical thinking course than for a true logic course. I'm not sure that I'd agree. I have been using Browne and Keeley's "Asking the Right Questions: A Guide to Critical Thinking," and I think that book is a better introduction to critical thinking for non-philosophy majors. However, the latter is not open source, so I will figure out how to get by without it in the future. Overall, the book seems comprehensive if the subject is logic. The index is on the short side, but fine. However, one issue for me is that there are no page numbers on the table of contents, which is pretty annoying if you want to locate particular sections.

Content Accuracy rating: 4

I didn't find any errors. In general the book uses great examples. However, they are very much based in the American context, not for an international student audience. Some effort to broaden the chosen examples would make the book more widely applicable.

Relevance/Longevity rating: 4

I think the book will remain relevant because of the nature of the material that it addresses; however, there will be a need to modify the examples in future editions as the social and political context changes.

Clarity rating: 3

The text is lucid, but I think it would be difficult for introductory-level students who are not philosophy majors. For example, in Browne and Keeley's "Asking the Right Questions: A Guide to Critical Thinking," the sub-headings are very accessible, such as "Experts cannot rescue us, despite what they say" or "wishful thinking: perhaps the biggest single speed bump on the road to critical thinking." By contrast, Van Cleave's "Introduction to Logic and Critical Thinking" has more subheadings like this: "Using your own paraphrases of premises and conclusions to reconstruct arguments in standard form" or "Propositional logic and the four basic truth functional connectives." If students are prepared very well for the subject, it would work fine, but for students who are newly being introduced to critical thinking, it is rather technical.

It seems to be very consistent in terms of its terminology and framework.

Modularity rating: 4

The book is divided into 4 chapters, each having many sub-chapters. In that sense, it is readily divisible and modular. However, as noted above, there are no page numbers on the table of contents, which would make assigning certain parts rather frustrating. Also, I'm not sure why the book is only four chapters long yet has so many subheadings (for instance, 17 in Chapter 2) and a length of 242 pages. Wouldn't it make more sense to break up the book into shorter chapters? I think this would make it easier to read and to assign in specific blocks to students.

Organization/Structure/Flow rating: 4

The organization of the book is fine overall, although I think adding page numbers to the table of contents and breaking it up into more separate chapters would help it to be more easily navigable.

Interface rating: 4

The book is very simply presented. In my opinion it is actually too simple. There are few boxes or diagrams that highlight and explain important points.

The text seems fine grammatically. I didn't notice any errors.

The book is written with an American audience in mind, but I did not notice culturally insensitive or offensive parts.

Overall, this book is not for my course, but I think it could work well in a philosophy course.

Reviewed by Daniel Lee, Assistant Professor of Economics and Leadership, Sweet Briar College on 11/11/19

Comprehensiveness rating: 3

This textbook is not particularly comprehensive (4 chapters long), but I view that as a benefit. In fact, I recommend it for use outside of traditional logic classes, but rather interdisciplinary classes that evaluate argument

To the best of my ability, I regard this content as accurate, error-free, and unbiased.

The book is broadly relevant and up-to-date, with a few stray temporal references (the Sydney Olympics, particular presidencies). I don't view these time-dated examples as problematic, since the logical underpinnings are still there and easily assessed.

Clarity rating: 4

My only pushback on clarity is I didn't find the distinction between argument and explanation particularly helpful/useful/easy to follow. However, this experience may have been unique to my class.

To the best of my ability, I regard this content as internally consistent.

I found this text quite modular, and I was easily able to integrate other texts into my lessons and disregard certain chapters or sub-sections.

The book had a logical and consistent structure, but to the extent that there are only 4 chapters, there isn't much scope for alternative approaches here.

No problems with the book's interface

The text is grammatically sound

Cultural Relevance rating: 4

Perhaps the text could have been more universal in its approach. While I didn't find the book insensitive per se, logic can be tricky here because the point is to evaluate meaningful (non-trivial) arguments, but any argument with that sense of gravity can also be traumatic to students (abortion, the death penalty, etc.).

No additional comments

Reviewed by Lisa N. Thomas-Smith, Graduate Part-time Instructor, CU Boulder on 7/1/19

The text covers all the relevant technical aspects of introductory logic and critical thinking, and covers them well. A separate glossary would be quite helpful to students. However, the terms are clearly and thoroughly explained within the text, and the index is very thorough.

The content is excellent. The text is thorough and accurate with no errors that I could discern. The terminology and exercises cover the material nicely and without bias.

The text should easily stand the test of time. The exercises are excellent and would be very helpful for students to internalize correct critical thinking practices. Because of the logical arrangement of the text and the many sub-sections, additional material should be very easy to add.

The text is extremely clearly and simply written. I anticipate that a diligent student could learn all of the material in the text with little additional instruction. The examples are relevant and easy to follow.

The text did not confuse terms or use inconsistent terminology, which is very important in a logic text. The discipline often uses multiple terms for the same concept, but this text avoids that trap nicely.

The text is fairly easily divisible. Since there are only four chapters, those chapters include large blocks of information. However, the chapters themselves are very well delineated and could be easily broken up so that parts could be left out or covered in a different order from the text.

The flow of the text is excellent. All of the information is handled solidly in an order that allows the student to build on the information previously covered.

The PDF Table of Contents does not include links or page numbers which would be very helpful for navigation. Other than that, the text was very easy to navigate. All the images, charts, and graphs were very clear

I found no grammatical errors in the text.

Cultural Relevance rating: 3

The text including examples and exercises did not seem to be offensive or insensitive in any specific way. However, the examples included references to black and white people, but few others. Also, the text is very American specific with many examples from and for an American audience. More diversity, especially in the examples, would be appropriate and appreciated.

Reviewed by Leslie Aarons, Associate Professor of Philosophy, CUNY LaGuardia Community College on 5/16/19

This is an excellent introductory (first-year) Logic and Critical Thinking textbook. The book covers the important elementary information, clearly discussing such things as the purpose and basic structure of an argument; the difference between an argument and an explanation; validity; soundness; and the distinctions between an inductive and a deductive argument in accessible terms in the first chapter. It also does a good job introducing and discussing informal fallacies (Chapter 4). The incorporation of opportunities to evaluate real-world arguments is also very effective. Chapter 2 also covers a number of formal methods of evaluating arguments, such as Venn diagrams, propositional logic, and the four basic truth-functional connectives, but to my mind, it is much more thorough in its treatment of informal logic and critical thinking skills than it is of formal logic. I also appreciated that Van Cleave's book includes exercises with answers and an index, though there is no glossary, which I personally do not find detracts from the book's comprehensiveness.

Overall, Van Cleave's book is error-free and unbiased. The language used is accessible and engaging. There were no glaring inaccuracies that I was able to detect.

Van Cleave's textbook uses relevant, contemporary content that will stand the test of time, at least for the next few years. Although some examples use certain subjects like former President Obama, it does so in a useful manner that inspires the use of critical thinking skills. There is an abundance of examples that inspire students to look at issues from many different political viewpoints, challenging students to practice evaluating arguments and identifying fallacies. Many of these exercises encourage students to critique issues, recognize their own inherent reader biases, and challenge their own beliefs--hallmarks of critical thinking.

As mentioned previously, the author has an accessible style that makes the content relatively easy to read and engaging. He also does a suitable job explaining jargon/technical language that is introduced in the textbook.

Van Cleave uses terminology consistently and the chapters flow well. The textbook orients the reader by offering effective introductions to new material, step-by-step explanations of the material, as well as offering clear summaries of each lesson.

This textbook's modularity is really quite good. Its language and structure are not overly convoluted or too-lengthy, making it convenient for individual instructors to adapt the materials to suit their methodological preferences.

The topics in the textbook are presented in a logical and clear fashion. The structure of the chapters are such that it is not necessary to have to follow the chapters in their sequential order, and coverage of material can be adapted to individual instructor's preferences.

The textbook is free of any problematic interface issues. Topics, sections and specific content are accessible and easy to navigate. Overall it is user-friendly.

I did not find any significant grammatical issues with the textbook.

The textbook is not culturally insensitive, making use of a diversity of inclusive examples. Materials are especially effective for first-year critical thinking/logic students.

I intend to adopt Van Cleave's textbook for a Critical Thinking class I am teaching at the Community College level. I believe that it will help me facilitate student-learning, and will be a good resource to build additional classroom activities from the materials it provides.

Reviewed by Jennie Harrop, Chair, Department of Professional Studies, George Fox University on 3/27/18

While the book is admirably comprehensive, its extensive details within a few short chapters may feel overwhelming to students. The author tackles an impressive breadth of concepts in Chapters 1, 2, 3, and 4, which leads to 50-plus-page chapters that are dense with statistical analyses and critical vocabulary. These topics are likely better broached in manageable snippets rather than hefty single chapters.

The ideas addressed in Introduction to Logic and Critical Thinking are accurate but at times notably political. While politics are effectively used to exemplify key concepts, some students may be distracted by distinct political leanings.

The terms and definitions included are relevant, but the examples are specific to the current political, cultural, and social climates, which could make the materials seem dated in a few years without intentional and consistent updates.

While the reasoning is accurate, the author tends to complicate rather than simplify -- perhaps in an effort to cover a spectrum of related concepts. Beginning readers are likely to be overwhelmed and under-encouraged by his approach.

Consistency rating: 3

The four chapters are somewhat consistent in their play of definition, explanation, and example, but the structure of each chapter varies according to the concepts covered. In the third chapter, for example, key ideas are divided into sub-topics numbering from 3.1 to 3.10. In the fourth chapter, the sub-divisions are further divided into sub-sections numbered 4.1.1-4.1.5, 4.2.1-4.2.2, and 4.3.1-4.3.6. Readers who are working quickly to master new concepts may find themselves mired in similarly numbered subheadings, longing for grounded concepts on which to hinge other key principles.

Modularity rating: 3

The book's four chapters make it mostly self-referential. The author would do well to break this text down into additional subsections, easing readers' accessibility.

The content of the book flows logically and well, but the information needs to be better sub-divided within each larger chapter, easing the student experience.

The book's interface is effective, allowing readers to move from one section to the next with a single click. Additional sub-sections would ease this interplay even further.

Grammatical Errors rating: 4

Some minor errors throughout.

For the most part, the book is culturally neutral, avoiding direct cultural references in an effort to remain relevant.

Reviewed by Yoichi Ishida, Assistant Professor of Philosophy, Ohio University on 2/1/18

This textbook covers enough topics for a first-year course on logic and critical thinking. Chapter 1 covers the basics as in any standard textbook in this area. Chapter 2 covers propositional logic and categorical logic. In propositional logic, this textbook does not cover suppositional arguments, such as conditional proof and reductio ad absurdum. But other standard argument forms are covered. Chapter 3 covers inductive logic, and here this textbook introduces probability and its relationship with cognitive biases, which are rarely discussed in other textbooks. Chapter 4 introduces common informal fallacies. The answers to all the exercises are given at the end. However, the last set of exercises is in Chapter 3, Section 5; there are no exercises in the rest of that chapter, and Chapter 4 has no exercises either. There is an index, but no glossary.

The textbook is accurate.

The content of this textbook will not become obsolete soon.

The textbook is written clearly.

The textbook is internally consistent.

The textbook is fairly modular. For example, Chapter 3, together with a few sections from Chapter 1, can be used as a short introduction to inductive logic.

The textbook is well-organized.

There are no interface issues.

I did not find any grammatical errors.

This textbook is relevant to a first semester logic or critical thinking course.

Reviewed by Payal Doctor, Associate Professor, LaGuardia Community College on 2/1/18

This text is a beginner textbook for arguments and propositional logic. It covers the basics of identifying arguments, building arguments, and using basic logic to construct propositions and arguments. It is quite comprehensive for a beginner book, but seems to be a good text for a course that needs a foundation for arguments. There are exercises on creating truth tables and proofs, so it could work as a logic primer in short sessions or with the addition of other course content.

The book is accurate in the information it presents. It does not contain errors and is unbiased. It covers the essential vocabulary clearly and gives ample examples and exercises to ensure the student understands the concepts.

The content of the book is up to date and can be easily updated. Some examples are very current for analyzing the argument structure in a speech, but for this sort of text understandable examples are important and the author uses good examples.

The book is clear and easy to read. In particular, this is a good text for community college students who often have difficulty with reading comprehension. The language is straightforward and concepts are well explained.

The book is consistent in terminology, formatting, and examples. It flows well from one topic to the next, but it is also possible to jump around the text without losing the voice of the text.

The book is broken down into subunits that make it easy to assign short blocks of content at a time. Later in the text, it does refer to a few concepts that appear early in the text, but these are all basic concepts that must be used to create a clear and understandable text. No sections are too long, and each section stays on topic and relates the topic to those that have come before when necessary.

The flow of the text is logical and clear. It begins with the basic building blocks of arguments, and practice identifying more and more complex arguments is offered. Each chapter builds up from the previous chapter in introducing propositional logic, truth tables, and logical arguments. A select number of fallacies are presented at the end of the text, but these are related to topics that were presented before, so it makes sense to have these last.

The text is free of interface issues. I used the PDF and it worked fine on various devices without losing formatting.

The book contains no grammatical errors.

The text is culturally sensitive, but examples used are a bit odd and may be objectionable to some students. For instance, President Obama's speech on Syria is used to evaluate an extended argument. This is an excellent example and it is explained well, but some who disagree with Obama's policies may have trouble moving beyond their own politics. However, other examples look at issues from all political viewpoints and ask students to evaluate the argument, fallacy, etc. and work towards looking past their own beliefs. Overall this book does use a variety of examples that most students can understand and evaluate.

My favorite part of this book is that it seems to be written for community college students. My students have trouble understanding readings in the New York Times, so it is nice to see a logic and critical thinking text use real language that students can understand and follow without the constant need of a dictionary.

Reviewed by Rebecca Owen, Adjunct Professor, Writing, Chemeketa Community College on 6/20/17

This textbook is quite thorough--there are conversational explanations of argument structure and logic. I think students will be happy with the conversational style this author employs. Also, there are many examples and exercises using current events, funny scenarios, or other interesting ways to evaluate argument structure and validity. The third section, which deals with logical fallacies, is very clear and comprehensive. My only critique of the material included in the book is that the middle section may be a bit dense and math-oriented for learners who appreciate the more informal, informative style of the first and third section. Also, the book ends rather abruptly--it moves from a description of a logical fallacy to the answers for the exercises earlier in the text.

The content is very reader-friendly, and the author writes with authority and clarity throughout the text. There are a few surface-level typos (Starbuck's instead of Starbucks, etc.). None of these small errors detract from the quality of the content, though.

One thing I really liked about this text was the author's wide variety of examples. To demonstrate different facets of logic, he used examples from current media, movies, literature, and many other concepts that students would recognize from their daily lives. The exercises in this text also included these types of pop-culture references, and I think students will enjoy the familiarity--as well as being able to see the logical structures behind these types of references. I don't think the text will need to be updated to reflect new instances and occurrences; the author did a fine job at picking examples that are relatively timeless. As far as the subject matter itself, I don't think it will become obsolete any time soon.

The author writes in a very conversational, easy-to-read manner. The examples used are quite helpful. The third section on logical fallacies is quite easy to read, follow, and understand. A student in an argument writing class could benefit from this section of the book. The middle section is less clear, though. A student learning about the basics of logic might have a hard time digesting all of the information contained in chapter two. This material might be better in two separate chapters. I think the author loses the balance of a conversational, helpful tone and focuses too heavily on equations.

Consistency rating: 4

Terminology in this book is quite consistent--the key words are highlighted in bold. Chapters 1 and 3 follow a similar organizational pattern, but chapter 2 is where the material becomes more dense and equation-heavy. I also would have liked a closing passage--something to indicate to the reader that we've reached the end of the chapter as well as the book.

I liked the overall structure of this book. If I'm teaching an argumentative writing class, I could easily point the students to the chapters where they can identify and practice identifying fallacies, for instance. The opening chapter is clear in defining the necessary terms, and it gives the students an understanding of the toolbox available to them in assessing and evaluating arguments. Even though I found the middle section to be dense, smaller portions could be assigned.

The author does a fine job connecting each defined term to the next. He provides examples of how each defined term works in a sentence or in an argument, and then he provides practice activities for students to try. The answers for each question are listed in the final pages of the book. The middle section feels like the heaviest part of the whole book--it would take the longest time for a student to digest if assigned the whole chapter. Even though this middle section is a bit heavy, it does fit the overall structure and flow of the book. New material builds on previous chapters and sub-chapters. It ends abruptly--I didn't realize that it had ended, and all of a sudden I found myself in the answer section for those earlier exercises.

The simple layout is quite helpful! There is nothing distracting, image-wise, in this text. The table of contents is clearly arranged, and each topic is easy to find.

Tiny edits could be made (Starbuck's/Starbucks, for one). Otherwise, it is free of distracting grammatical errors.

This text is quite culturally relevant. For instance, there is one example that mentions the rumors of Barack Obama's birthplace as somewhere other than the United States. This example is used to explain how to analyze an argument for validity. The more "sensational" examples (like the Obama one above) are helpful in showing argument structure, and they can also help students see how rumors like this might gain traction--as well as help to show students how to debunk them with their newfound understanding of argument and logic.

The writing style is excellent for the subject matter, especially in the third section explaining logical fallacies. Thank you for the opportunity to read and review this text!

Reviewed by Laurel Panser, Instructor, Riverland Community College on 6/20/17

This is a review of Introduction to Logic and Critical Thinking, an open source book version 1.4 by Matthew Van Cleave. The comparison book used was Patrick J. Hurley’s A Concise Introduction to Logic 12th Edition published by Cengage as well as the 13th edition with the same title. Lori Watson is the second author on the 13th edition.

Competing with Hurley is difficult with respect to comprehensiveness. For example, Van Cleave’s book is comprehensive to the extent that it probably covers at least two-thirds or more of what is dealt with in most introductory, one-semester logic courses. Van Cleave’s chapter 1 provides an overview of argumentation including discerning non-arguments from arguments, premises versus conclusions, deductive from inductive arguments, validity, soundness and more. Much of Van Cleave’s chapter 1 parallels Hurley’s chapter 1. Hurley’s chapter 3 regarding informal fallacies is comprehensive while Van Cleave’s chapter 4 on this topic is less extensive. Categorical propositions are a topic in Van Cleave’s chapter 2; Hurley’s chapters 4 and 5 provide more instruction on this, however. Propositional logic is another topic in Van Cleave’s chapter 2; Hurley’s chapters 6 and 7 provide more information on this, though. Van Cleave did discuss messy issues of language meaning briefly in his chapter 1; that is the topic of Hurley’s chapter 2.

Van Cleave’s book includes exercises with answers and an index. A glossary was not included.

Reviews of open source textbooks typically include criteria besides comprehensiveness. These include comments on accuracy of the information, whether the book will become obsolete soon, jargon-free clarity to the extent that is possible, organization, navigation ease, freedom from grammar errors and cultural relevance; Van Cleave’s book is fine in all of these areas. Further criteria for open source books include modularity and consistency of terminology. Modularity is defined as including blocks of learning material that are easy to assign to students. Hurley’s book has a greater degree of modularity than Van Cleave’s textbook. The prose Van Cleave used is consistent.

Van Cleave’s book will not become obsolete soon.

Van Cleave’s book has accessible prose.

Van Cleave used terminology consistently.

Van Cleave’s book has a reasonable degree of modularity.

Van Cleave’s book is organized. The structure and flow of his book is fine.

Problems with navigation are not present.

Grammar problems were not present.

Van Cleave’s book is culturally relevant.

Van Cleave’s book is appropriate for some first semester logic courses.

Table of Contents

Chapter 1: Reconstructing and analyzing arguments

  • 1.1 What is an argument?
  • 1.2 Identifying arguments
  • 1.3 Arguments vs. explanations
  • 1.4 More complex argument structures
  • 1.5 Using your own paraphrases of premises and conclusions to reconstruct arguments in standard form
  • 1.6 Validity
  • 1.7 Soundness
  • 1.8 Deductive vs. inductive arguments
  • 1.9 Arguments with missing premises
  • 1.10 Assuring, guarding, and discounting
  • 1.11 Evaluative language
  • 1.12 Evaluating a real-life argument

Chapter 2: Formal methods of evaluating arguments

  • 2.1 What is a formal method of evaluation and why do we need them?
  • 2.2 Propositional logic and the four basic truth functional connectives
  • 2.3 Negation and disjunction
  • 2.4 Using parentheses to translate complex sentences
  • 2.5 “Not both” and “neither nor”
  • 2.6 The truth table test of validity
  • 2.7 Conditionals
  • 2.8 “Unless”
  • 2.9 Material equivalence
  • 2.10 Tautologies, contradictions, and contingent statements
  • 2.11 Proofs and the 8 valid forms of inference
  • 2.12 How to construct proofs
  • 2.13 Short review of propositional logic
  • 2.14 Categorical logic
  • 2.15 The Venn test of validity for immediate categorical inferences
  • 2.16 Universal statements and existential commitment
  • 2.17 Venn validity for categorical syllogisms

Chapter 3: Evaluating inductive arguments and probabilistic and statistical fallacies

  • 3.1 Inductive arguments and statistical generalizations
  • 3.2 Inference to the best explanation and the seven explanatory virtues
  • 3.3 Analogical arguments
  • 3.4 Causal arguments
  • 3.5 Probability
  • 3.6 The conjunction fallacy
  • 3.7 The base rate fallacy
  • 3.8 The small numbers fallacy
  • 3.9 Regression to the mean fallacy
  • 3.10 Gambler's fallacy

Chapter 4: Informal fallacies

  • 4.1 Formal vs. informal fallacies
  • 4.1.1 Composition fallacy
  • 4.1.2 Division fallacy
  • 4.1.3 Begging the question fallacy
  • 4.1.4 False dichotomy
  • 4.1.5 Equivocation
  • 4.2 Slippery slope fallacies
  • 4.2.1 Conceptual slippery slope
  • 4.2.2 Causal slippery slope
  • 4.3 Fallacies of relevance
  • 4.3.1 Ad hominem
  • 4.3.2 Straw man
  • 4.3.3 Tu quoque
  • 4.3.4 Genetic
  • 4.3.5 Appeal to consequences
  • 4.3.6 Appeal to authority

Answers to exercises

Glossary/Index

Ancillary Material

About the Book

This is an introductory textbook in logic and critical thinking. The goal of the textbook is to provide the reader with a set of tools and skills that will enable them to identify and evaluate arguments. The book is intended for an introductory course that covers both formal and informal logic. As such, it is not a formal logic textbook, but is closer to what one would find marketed as a “critical thinking textbook.”

About the Contributors

Matthew Van Cleave ,   PhD, Philosophy, University of Cincinnati, 2007.  VAP at Concordia College (Moorhead), 2008-2012.  Assistant Professor at Lansing Community College, 2012-2016. Professor at Lansing Community College, 2016-


Critical Thinking and Problem-Solving

What Is Critical Thinking?


When examining the vast literature on critical thinking, various definitions of critical thinking emerge. Here are some samples:

  • "Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action" (Scriven, 1996).
  • "Most formal definitions characterize critical thinking as the intentional application of rational, higher order thinking skills, such as analysis, synthesis, problem recognition and problem solving, inference, and evaluation" (Angelo, 1995, p. 6).
  • "Critical thinking is thinking that assesses itself" (Center for Critical Thinking, 1996b).
  • "Critical thinking is the ability to think about one's thinking in such a way as 1. To recognize its strengths and weaknesses and, as a result, 2. To recast the thinking in improved form" (Center for Critical Thinking, 1996c).

Perhaps the simplest definition is offered by Beyer (1995): "Critical thinking... means making reasoned judgments" (p. 8). Basically, Beyer sees critical thinking as using criteria to judge the quality of something, from cooking to a conclusion of a research paper. In essence, critical thinking is a disciplined manner of thought that a person uses to assess the validity of something (statements, news stories, arguments, research, etc.).

Characteristics of Critical Thinking

Wade (1995) identifies eight characteristics of critical thinking. Critical thinking involves asking questions, defining a problem, examining evidence, analyzing assumptions and biases, avoiding emotional reasoning, avoiding oversimplification, considering other interpretations, and tolerating ambiguity. Dealing with ambiguity is also seen by Strohm & Baukus (1995) as an essential part of critical thinking: "Ambiguity and doubt serve a critical-thinking function and are a necessary and even a productive part of the process" (p. 56).

Another characteristic of critical thinking identified by many sources is metacognition. Metacognition is thinking about one's own thinking. More specifically, "metacognition is being aware of one's thinking as one performs specific tasks and then using this awareness to control what one is doing" (Jones & Ratcliff, 1993, p. 10).

In his book Critical Thinking, Beyer elaborates on what he sees as the essential aspects of critical thinking. These are:

  • Dispositions: Critical thinkers are skeptical, open-minded, value fair-mindedness, respect evidence and reasoning, respect clarity and precision, look at different points of view, and will change positions when reason leads them to do so.
  • Criteria: To think critically, one must apply criteria, that is, conditions that must be met for something to be judged as believable. Although the argument can be made that each subject area has different criteria, some standards apply to all subjects. "... an assertion must... be based on relevant, accurate facts; based on credible sources; precise; unbiased; free from logical fallacies; logically consistent; and strongly reasoned" (p. 12).
  • Argument: An argument is a statement or proposition with supporting evidence. Critical thinking involves identifying, evaluating, and constructing arguments.
  • Reasoning: The ability to infer a conclusion from one or multiple premises. To do so requires examining logical relationships among statements or data.
  • Point of View: The way one views the world, which shapes one's construction of meaning. In a search for understanding, critical thinkers view phenomena from many different points of view.
  • Procedures for Applying Criteria: Other types of thinking use a general procedure. Critical thinking makes use of many procedures. These procedures include asking questions, making judgments, and identifying assumptions.

Why Teach Critical Thinking?

Oliver & Utermohlen (1995) see students as too often being passive receptors of information. Through technology, the amount of information available today is massive. This information explosion is likely to continue in the future. Students need a guide to weed through the information and not just passively accept it. Students need to "develop and effectively apply critical thinking skills to their academic studies, to the complex problems that they will face, and to the critical choices they will be forced to make as a result of the information explosion and other rapid technological changes" (Oliver & Utermohlen, p. 1).

As mentioned in the section, Characteristics of Critical Thinking, critical thinking involves questioning. It is important to teach students how to ask good questions, to think critically, in order to continue the advancement of the very fields we are teaching. "Every field stays alive only to the extent that fresh questions are generated and taken seriously" (Center for Critical Thinking, 1996a).

Beyer sees the teaching of critical thinking as important to the very state of our nation. He argues that to live successfully in a democracy, people must be able to think critically in order to make sound decisions about personal and civic affairs. If students learn to think critically, then they can use good thinking as the guide by which they live their lives.

Teaching Strategies to Help Promote Critical Thinking

Volume 22, issue 1 (1995) of the journal Teaching of Psychology is devoted to the teaching of critical thinking. Most of the strategies included in this section come from the various articles that compose this issue.

  • CATS (Classroom Assessment Techniques): Angelo stresses the use of ongoing classroom assessment as a way to monitor and facilitate students' critical thinking. An example of a CAT is to ask students to write a "Minute Paper" responding to questions such as "What was the most important thing you learned in today's class? What question related to this session remains uppermost in your mind?" The teacher selects some of the papers and prepares responses for the next class meeting.
  • Cooperative Learning Strategies: Cooper (1995) argues that putting students in group learning situations is the best way to foster critical thinking. "In properly structured cooperative learning environments, students perform more of the active, critical thinking with continuous support and feedback from other students and the teacher" (p. 8).
  • Case Study /Discussion Method: McDade (1995) describes this method as the teacher presenting a case (or story) to the class without a conclusion. Using prepared questions, the teacher then leads students through a discussion, allowing students to construct a conclusion for the case.
  • Using Questions: King (1995) identifies ways of using questions in the classroom:
  • Reciprocal Peer Questioning: Following lecture, the teacher displays a list of question stems (such as, "What are the strengths and weaknesses of..."). Students must write questions about the lecture material. In small groups, the students ask each other the questions. Then, the whole class discusses some of the questions from each small group.
  • Reader's Questions: Require students to write questions on assigned reading and turn them in at the beginning of class. Select a few of the questions as the impetus for class discussion.
  • Conference Style Learning: The teacher does not "teach" the class in the sense of lecturing. The teacher is a facilitator of a conference. Students must thoroughly read all required material before class. Assigned readings should be in the zone of proximal development; that is, readings should be understandable to students, yet challenging. The class consists of the students asking questions of each other and discussing these questions. The teacher does not remain passive, but rather, helps "direct and mold discussions by posing strategic questions and helping students build on each others' ideas" (Underwood & Wald, 1995, p. 18).
  • Use Writing Assignments: Wade sees the use of writing as fundamental to developing critical thinking skills. "With written assignments, an instructor can encourage the development of dialectic reasoning by requiring students to argue both [or more] sides of an issue" (p. 24).
  • Written dialogues: Give students written dialogues to analyze. In small groups, students must identify the different viewpoints of each participant in the dialogue. They must look for biases, presence or exclusion of important evidence, alternative interpretations, misstatement of facts, and errors in reasoning. Each group must decide which view is the most reasonable. After coming to a conclusion, each group acts out their dialogue and explains their analysis of it.
  • Spontaneous Group Dialogue: One group of students is assigned roles to play in a discussion (such as leader, information giver, opinion seeker, and disagreer). Four observer groups are formed with the functions of determining what roles are being played by whom, identifying biases and errors in thinking, evaluating reasoning skills, and examining ethical implications of the content.
  • Ambiguity: Strohm & Baukus advocate introducing ambiguity into the classroom. Don't give students clear-cut material; give them conflicting information that they must think their way through.
References and Resources

  • Angelo, T. A. (1995). Beginning the dialogue: Thoughts on promoting critical thinking: Classroom assessment for critical thinking. Teaching of Psychology, 22(1), 6-7.
  • Beyer, B. K. (1995). Critical thinking. Bloomington, IN: Phi Delta Kappa Educational Foundation.
  • Center for Critical Thinking (1996a). The role of questions in thinking, teaching, and learning. [On-line]. Available HTTP: http://www.criticalthinking.org/University/univlibrary/library.nclk
  • Center for Critical Thinking (1996b). Structures for student self-assessment. [On-line]. Available HTTP: http://www.criticalthinking.org/University/univclass/trc.nclk
  • Center for Critical Thinking (1996c). Three definitions of critical thinking [On-line]. Available HTTP: http://www.criticalthinking.org/University/univlibrary/library.nclk
  • Cooper, J. L. (1995). Cooperative learning and critical thinking. Teaching of Psychology, 22(1), 7-8.
  • Jones, E. A. & Ratcliff, G. (1993). Critical thinking skills for college students. National Center on Postsecondary Teaching, Learning, and Assessment, University Park, PA. (Eric Document Reproduction Services No. ED 358 772)
  • King, A. (1995). Designing the instructional process to enhance critical thinking across the curriculum: Inquiring minds really do want to know: Using questioning to teach critical thinking. Teaching of Psychology, 22(1), 13-17.
  • McDade, S. A. (1995). Case study pedagogy to advance critical thinking. Teaching of Psychology, 22(1), 9-10.
  • Oliver, H. & Utermohlen, R. (1995). An innovative teaching strategy: Using critical thinking to give students a guide to the future. (Eric Document Reproduction Services No. 389 702)
  • Robertson, J. F. & Rane-Szostak, D. (1996). Using dialogues to develop critical thinking skills: A practical approach. Journal of Adolescent & Adult Literacy, 39(7), 552-556.
  • Scriven, M. & Paul, R. (1996). Defining critical thinking: A draft statement for the National Council for Excellence in Critical Thinking. [On-line]. Available HTTP: http://www.criticalthinking.org/University/univlibrary/library.nclk
  • Strohm, S. M., & Baukus, R. A. (1995). Strategies for fostering critical thinking skills. Journalism and Mass Communication Educator, 50 (1), 55-62.
  • Underwood, M. K., & Wald, R. L. (1995). Conference-style learning: A method for fostering critical thinking with heart. Teaching of Psychology, 22(1), 17-21.
  • Wade, C. (1995). Using writing to develop and assess critical thinking. Teaching of Psychology, 22(1), 24-28.

Other Reading

  • Bean, J. C. (1996). Engaging ideas: The professor's guide to integrating writing, critical thinking, & active learning in the classroom. Jossey-Bass.
  • Bernstein, D. A. (1995). A negotiation model for teaching critical thinking. Teaching of Psychology, 22(1), 22-24.
  • Carlson, E. R. (1995). Evaluating the credibility of sources. A missing link in the teaching of critical thinking. Teaching of Psychology, 22(1), 39-41.
  • Facione, P. A., Sanchez, C. A., Facione, N. C., & Gainen, J. (1995). The disposition toward critical thinking. The Journal of General Education, 44(1), 1-25.
  • Halpern, D. F., & Nummedal, S. G. (1995). Closing thoughts about helping students improve how they think. Teaching of Psychology, 22(1), 82-83.
  • Isbell, D. (1995). Teaching writing and research as inseparable: A faculty-librarian teaching team. Reference Services Review, 23(4), 51-62.
  • Jones, J. M. & Safrit, R. D. (1994). Developing critical thinking skills in adult learners through innovative distance learning. Paper presented at the International Conference on the practice of adult education and social development. Jinan, China. (Eric Document Reproduction Services No. ED 373 159)
  • Sanchez, M. A. (1995). Using critical-thinking principles as a guide to college-level instruction. Teaching of Psychology, 22(1), 72-74.
  • Spicer, K. L. & Hanks, W. E. (1995). Multiple measures of critical thinking skills and predisposition in assessment of critical thinking. Paper presented at the annual meeting of the Speech Communication Association, San Antonio, TX. (Eric Document Reproduction Services No. ED 391 185)
  • Terenzini, P. T., Springer, L., Pascarella, E. T., & Nora, A. (1995). Influences affecting the development of students' critical thinking skills. Research in Higher Education, 36(1), 23-39.

On the Internet

  • Carr, K. S. (1990). How can we teach critical thinking. Eric Digest. [On-line]. Available HTTP: http://ericps.ed.uiuc.edu/eece/pubs/digests/1990/carr90.html
  • The Center for Critical Thinking (1996). Home Page. Available HTTP: http://www.criticalthinking.org/University/
  • Ennis, Bob (No date). Critical thinking. [On-line], April 4, 1997. Available HTTP: http://www.cof.orst.edu/cof/teach/for442/ct.htm
  • Montclair State University (1995). Curriculum resource center. Critical thinking resources: An annotated bibliography. [On-line]. Available HTTP: http://www.montclair.edu/Pages/CRC/Bibliographies/CriticalThinking.html
  • No author, No date. Critical Thinking is ... [On-line], April 4, 1997. Available HTTP: http://library.usask.ca/ustudy/critical/
  • Sheridan, Marcia (No date). Internet education topics hotlink page. [On-line], April 4, 1997. Available HTTP: http://sun1.iusb.edu/~msherida/topics/critical.html

Walker Center for Teaching and Learning

  • 433 Library
  • Dept 4354
  • 615 McCallie Ave
  •   423-425-4188

The Classroom | Empowering Students in Their College Journey

The Relationship Between Scientific Method & Critical Thinking

Scott Neuffer

Critical thinking, the mind’s ability to analyze claims about the world, is the intellectual basis of the scientific method. The scientific method can be viewed as an extensive, structured mode of critical thinking that involves hypothesis, experimentation, and conclusion.

Critical Thinking

Broadly speaking, critical thinking is any analytical thought aimed at determining the validity of a specific claim. It can be as simple as a nine-year-old questioning a parent’s claim that Santa Claus exists, or as complex as physicists questioning the relativity of space and time. Critical thinking is the point when the mind turns in opposition to an accepted truth and begins analyzing its underlying premises. As American philosopher John Dewey said, it is the “active, persistent and careful consideration of a belief or supposed form of knowledge in light of the grounds that support it, and the further conclusions to which it tends.”

Hypothesis

Critical thinking initiates the act of hypothesis. In the scientific method, the hypothesis is the initial supposition, or theoretical claim about the world, based on questions and observations. If critical thinking asks the question, then the hypothesis is the best attempt at the time to answer the question using observable phenomena. For example, an astrophysicist may question existing theories of black holes based on his own observation. He may posit a contrary hypothesis, arguing that black holes actually produce white light. It is not a final conclusion, however, as the scientific method requires specific forms of verification.

Experimentation

The scientific method uses formal experimentation to analyze any hypothesis. The rigorous and specific methodology of experimentation is designed to gather unbiased empirical evidence that either supports or contradicts a given claim. Controlled variables are used to provide an objective basis of comparison. For example, researchers studying the effects of a certain drug may provide half the test population with a placebo pill and the other half with the real drug. The effects of the real drug can then be assessed relative to the control group.
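The control-group logic described above can be sketched numerically. The following is a minimal, illustrative Python example; the outcome scores are invented purely for demonstration, and a real study would also apply a formal significance test:

```python
# Illustrative only: invented outcome scores for a placebo group and a drug group.
placebo = [5.1, 4.8, 5.3, 5.0, 4.9, 5.2, 4.7, 5.1]
drug    = [6.2, 5.9, 6.4, 6.1, 5.8, 6.3, 6.0, 6.2]

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    # Sample variance (divide by n - 1).
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# The drug's effect is assessed *relative to the control group*.
effect = mean(drug) - mean(placebo)

# Cohen's d: the effect expressed in units of pooled standard deviation.
pooled_sd = ((variance(placebo) + variance(drug)) / 2) ** 0.5
cohens_d = effect / pooled_sd

print(f"placebo mean: {mean(placebo):.2f}")
print(f"drug mean:    {mean(drug):.2f}")
print(f"difference:   {effect:.2f} (Cohen's d = {cohens_d:.1f})")
```

The point of the sketch is that the raw drug-group mean is meaningless on its own; only the comparison against the placebo baseline isolates the drug's contribution from placebo and other confounding effects.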

Conclusion

In the scientific method, conclusions are drawn only after tested, verifiable evidence supports them. Even then, conclusions are subject to peer review and are often retested before general consensus is reached. Thus, what begins as an act of critical thinking becomes, in the scientific method, a complex process of testing the validity of a claim. English philosopher Francis Bacon put it this way: “If a man will begin with certainties, he shall end in doubts; but if he will be content to begin with doubts, he shall end in certainties.”


  • How We Think: John Dewey
  • The Advancement of Learning: Francis Bacon

Scott Neuffer is an award-winning journalist and writer who lives in Nevada. He holds a bachelor's degree in English and spent five years as an education and business reporter for Sierra Nevada Media Group. His first collection of short stories, "Scars of the New Order," was published in 2014.

Critical thinking definition

Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement.

The critical-thinking process requires the active and skillful conceptualization, analysis, synthesis, and evaluation of information gathered from observation, experience, reflection, reasoning, or communication, used as a guide to belief and action, which is why it is so often emphasized in education and academia.

Some may even view it as the backbone of modern thought.

However, it's a skill, and skills must be trained and practiced to reach their full potential.

People turn to various approaches to improve their critical thinking, such as:

  • Developing technical and problem-solving skills
  • Engaging in more active listening
  • Actively questioning their assumptions and beliefs
  • Seeking out more diversity of thought
  • Cultivating their intellectual curiosity

Is critical thinking useful in writing?

Critical thinking can help in planning your paper and making it more concise, but its role isn't obvious at first. We have pinpointed some of the questions you should ask yourself to bring more critical thinking into your writing:

  • What information should be included?
  • Which information resources should the author look to?
  • What degree of technical knowledge should the report assume its audience has?
  • What is the most effective way to show information?
  • How should the report be organized?
  • How should it be designed?
  • What tone and level of language difficulty should the document have?

Using critical thinking comes down not only to the outline of your paper; it also raises the question: how can we use critical thinking to solve problems within our writing's topic?

Let's say you have a PowerPoint presentation on how critical thinking can reduce poverty in the United States. You'll first have to define critical thinking for the viewers, and then use critical-thinking questions and related terms to familiarize them with your methods and the thinking process behind them.

Are there any services that can help me use more critical thinking?

We understand that it's difficult to learn how to use critical thinking more effectively in just one article, but our service is here to help.

We are a team specializing in writing essays and other assignments for college students and all other types of customers who need a helping hand with their writing. We cover a great range of topics, offer high-quality work, always deliver on time, and aim to leave our customers completely satisfied with what they ordered.

The ordering process is fully online, and it goes as follows:

  • Select the topic and the deadline of your essay.
  • Provide us with any details, requirements, statements that should be emphasized, or particular parts of the essay writing process you struggle with.
  • Leave the email address where your completed order will be sent.
  • Select your preferred payment type, sit back and relax!

With years of experience on the market, professionally degreed essay writers, 24/7 online customer support and incredibly low prices, you won't find a service offering a better deal than ours.

CBE Life Sci Educ, v.17(1), Spring 2018

Understanding the Complex Relationship between Critical Thinking and Science Reasoning among Undergraduate Thesis Writers

Jason E. Dowd

† Department of Biology, Duke University, Durham, NC 27708

Robert J. Thompson, Jr.

‡ Department of Psychology and Neuroscience, Duke University, Durham, NC 27708

Leslie A. Schiff

§ Department of Microbiology and Immunology, University of Minnesota, Minneapolis, MN 55455

Julie A. Reynolds


This study empirically examines the relationship between students’ critical-thinking skills and scientific reasoning as reflected in undergraduate thesis writing in biology. Writing offers a unique window into studying this relationship, and the findings raise potential implications for instruction.

Developing critical-thinking and scientific reasoning skills are core learning objectives of science education, but little empirical evidence exists regarding the interrelationships between these constructs. Writing effectively fosters students’ development of these constructs, and it offers a unique window into studying how they relate. In this study of undergraduate thesis writing in biology at two universities, we examine how scientific reasoning exhibited in writing (assessed using the Biology Thesis Assessment Protocol) relates to general and specific critical-thinking skills (assessed using the California Critical Thinking Skills Test), and we consider implications for instruction. We find that scientific reasoning in writing is strongly related to inference , while other aspects of science reasoning that emerge in writing (epistemological considerations, writing conventions, etc.) are not significantly related to critical-thinking skills. Science reasoning in writing is not merely a proxy for critical thinking. In linking features of students’ writing to their critical-thinking skills, this study 1) provides a bridge to prior work suggesting that engagement in science writing enhances critical thinking and 2) serves as a foundational step for subsequently determining whether instruction focused explicitly on developing critical-thinking skills (particularly inference ) can actually improve students’ scientific reasoning in their writing.

INTRODUCTION

Critical-thinking and scientific reasoning skills are core learning objectives of science education for all students, regardless of whether or not they intend to pursue a career in science or engineering. Consistent with the view of learning as construction of understanding and meaning ( National Research Council, 2000 ), the pedagogical practice of writing has been found to be effective not only in fostering the development of students’ conceptual and procedural knowledge ( Gerdeman et al. , 2007 ) and communication skills ( Clase et al. , 2010 ), but also scientific reasoning ( Reynolds et al. , 2012 ) and critical-thinking skills ( Quitadamo and Kurtz, 2007 ).

Critical thinking and scientific reasoning are similar but different constructs that include various types of higher-order cognitive processes, metacognitive strategies, and dispositions involved in making meaning of information. Critical thinking is generally understood as the broader construct ( Holyoak and Morrison, 2005 ), comprising an array of cognitive processes and dispositions that are drawn upon differentially in everyday life and across domains of inquiry such as the natural sciences, social sciences, and humanities. Scientific reasoning, then, may be interpreted as the subset of critical-thinking skills (cognitive and metacognitive processes and dispositions) that 1) are involved in making meaning of information in scientific domains and 2) support the epistemological commitment to scientific methodology and paradigm(s).

Although there has been an enduring focus in higher education on promoting critical thinking and reasoning as general or “transferable” skills, research evidence provides increasing support for the view that reasoning and critical thinking are also situational or domain specific ( Beyer et al. , 2013 ). Some researchers, such as Lawson (2010) , present frameworks in which science reasoning is characterized explicitly in terms of critical-thinking skills. There are, however, limited coherent frameworks and empirical evidence regarding either the general or domain-specific interrelationships of scientific reasoning, as it is most broadly defined, and critical-thinking skills.

The Vision and Change in Undergraduate Biology Education Initiative provides a framework for thinking about these constructs and their interrelationship in the context of the core competencies and disciplinary practice they describe ( American Association for the Advancement of Science, 2011 ). These learning objectives aim for undergraduates to “understand the process of science, the interdisciplinary nature of the new biology and how science is closely integrated within society; be competent in communication and collaboration; have quantitative competency and a basic ability to interpret data; and have some experience with modeling, simulation and computational and systems level approaches as well as with using large databases” ( Woodin et al. , 2010 , pp. 71–72). This framework makes clear that science reasoning and critical-thinking skills play key roles in major learning outcomes; for example, “understanding the process of science” requires students to engage in (and be metacognitive about) scientific reasoning, and having the “ability to interpret data” requires critical-thinking skills. To help students better achieve these core competencies, we must better understand the interrelationships of their composite parts. Thus, the next step is to determine which specific critical-thinking skills are drawn upon when students engage in science reasoning in general and with regard to the particular scientific domain being studied. Such a determination could be applied to improve science education for both majors and nonmajors through pedagogical approaches that foster critical-thinking skills that are most relevant to science reasoning.

Writing affords one of the most effective means for making thinking visible ( Reynolds et al. , 2012 ) and learning how to “think like” and “write like” disciplinary experts ( Meizlish et al. , 2013 ). As a result, student writing affords the opportunities to both foster and examine the interrelationship of scientific reasoning and critical-thinking skills within and across disciplinary contexts. The purpose of this study was to better understand the relationship between students’ critical-thinking skills and scientific reasoning skills as reflected in the genre of undergraduate thesis writing in biology departments at two research universities, the University of Minnesota and Duke University.

In the following subsections, we discuss in greater detail the constructs of scientific reasoning and critical thinking, as well as the assessment of scientific reasoning in students’ thesis writing. In subsequent sections, we discuss our study design, findings, and the implications for enhancing educational practices.

Critical Thinking

The advances in cognitive science in the 21st century have increased our understanding of the mental processes involved in thinking and reasoning, as well as memory, learning, and problem solving. Critical thinking is understood to include both a cognitive dimension and a disposition dimension (e.g., reflective thinking) and is defined as “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based” ( Facione, 1990, p. 3 ). Although various other definitions of critical thinking have been proposed, researchers have generally coalesced around this consensus expert view ( Blattner and Frazier, 2002 ; Condon and Kelly-Riley, 2004 ; Bissell and Lemons, 2006 ; Quitadamo and Kurtz, 2007 ) and the corresponding measures of critical-thinking skills ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ).

Both the cognitive skills and dispositional components of critical thinking have been recognized as important to science education ( Quitadamo and Kurtz, 2007 ). Empirical research demonstrates that specific pedagogical practices in science courses are effective in fostering students’ critical-thinking skills. Quitadamo and Kurtz (2007) found that students who engaged in a laboratory writing component in the context of a general education biology course significantly improved their overall critical-thinking skills (and their analytical and inference skills, in particular), whereas students engaged in a traditional quiz-based laboratory did not improve their critical-thinking skills. In related work, Quitadamo et al. (2008) found that a community-based inquiry experience, involving inquiry, writing, research, and analysis, was associated with improved critical thinking in a biology course for nonmajors, compared with traditionally taught sections. In both studies, students who exhibited stronger presemester critical-thinking skills exhibited stronger gains, suggesting that “students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills” ( Quitadamo and Kurtz, 2007 , p. 151).

Recently, Stephenson and Sadler-McKnight (2016) found that first-year general chemistry students who engaged in a science writing heuristic laboratory, which is an inquiry-based, writing-to-learn approach to instruction ( Hand and Keys, 1999 ), had significantly greater gains in total critical-thinking scores than students who received traditional laboratory instruction. Each of the four components—inquiry, writing, collaboration, and reflection—has been linked to critical thinking ( Stephenson and Sadler-McKnight, 2016 ). Like the other studies, this work highlights the value of targeting critical-thinking skills and the effectiveness of an inquiry-based, writing-to-learn approach to enhance critical thinking. Across studies, authors advocate adopting critical thinking as the course framework ( Pukkila, 2004 ) and developing explicit examples of how critical thinking relates to the scientific method ( Miri et al. , 2007 ).

In these examples, the important connection between writing and critical thinking is highlighted by the fact that each intervention involves the incorporation of writing into science, technology, engineering, and mathematics education (either alone or in combination with other pedagogical practices). However, critical-thinking skills are not always the primary learning outcome; in some contexts, scientific reasoning is the primary outcome that is assessed.

Scientific Reasoning

Scientific reasoning is a complex process that is broadly defined as “the skills involved in inquiry, experimentation, evidence evaluation, and inference that are done in the service of conceptual change or scientific understanding” ( Zimmerman, 2007 , p. 172). Scientific reasoning is understood to include both conceptual knowledge and the cognitive processes involved with generation of hypotheses (i.e., inductive processes involved in the generation of hypotheses and the deductive processes used in the testing of hypotheses), experimentation strategies, and evidence evaluation strategies. These dimensions are interrelated, in that “experimentation and inference strategies are selected based on prior conceptual knowledge of the domain” ( Zimmerman, 2000 , p. 139). Furthermore, conceptual and procedural knowledge and cognitive process dimensions can be general and domain specific (or discipline specific).

With regard to conceptual knowledge, attention has been focused on the acquisition of core methodological concepts fundamental to scientists’ causal reasoning and metacognitive distancing (or decontextualized thinking), which is the ability to reason independently of prior knowledge or beliefs ( Greenhoot et al. , 2004 ). The latter involves what Kuhn and Dean (2004) refer to as the coordination of theory and evidence, which requires that one question existing theories (i.e., prior knowledge and beliefs), seek contradictory evidence, eliminate alternative explanations, and revise one’s prior beliefs in the face of contradictory evidence. Kuhn and colleagues (2008) further elaborate that scientific thinking requires “a mature understanding of the epistemological foundations of science, recognizing scientific knowledge as constructed by humans rather than simply discovered in the world,” and “the ability to engage in skilled argumentation in the scientific domain, with an appreciation of argumentation as entailing the coordination of theory and evidence” ( Kuhn et al. , 2008 , p. 435). “This approach to scientific reasoning not only highlights the skills of generating and evaluating evidence-based inferences, but also encompasses epistemological appreciation of the functions of evidence and theory” ( Ding et al. , 2016 , p. 616). Evaluating evidence-based inferences involves epistemic cognition, which Moshman (2015) defines as the subset of metacognition that is concerned with justification, truth, and associated forms of reasoning. Epistemic cognition is both general and domain specific (or discipline specific; Moshman, 2015 ).

There is empirical support for the contributions of both prior knowledge and an understanding of the epistemological foundations of science to scientific reasoning. In a study of undergraduate science students, advanced scientific reasoning was most often accompanied by accurate prior knowledge as well as sophisticated epistemological commitments; additionally, for students who had comparable levels of prior knowledge, skillful reasoning was associated with a strong epistemological commitment to the consistency of theory with evidence ( Zeineddin and Abd-El-Khalick, 2010 ). These findings highlight the importance of the need for instructional activities that intentionally help learners develop sophisticated epistemological commitments focused on the nature of knowledge and the role of evidence in supporting knowledge claims ( Zeineddin and Abd-El-Khalick, 2010 ).

Scientific Reasoning in Students’ Thesis Writing

Pedagogical approaches that incorporate writing have also focused on enhancing scientific reasoning. Many rubrics have been developed to assess aspects of scientific reasoning in written artifacts. For example, Timmerman and colleagues (2011) , in the course of describing their own rubric for assessing scientific reasoning, highlight several examples of scientific reasoning assessment criteria ( Haaga, 1993 ; Tariq et al. , 1998 ; Topping et al. , 2000 ; Kelly and Takao, 2002 ; Halonen et al. , 2003 ; Willison and O’Regan, 2007 ).

At both the University of Minnesota and Duke University, we have focused on the genre of the undergraduate honors thesis as the rhetorical context in which to study and improve students’ scientific reasoning and writing. We view the process of writing an undergraduate honors thesis as a form of professional development in the sciences (i.e., a way of engaging students in the practices of a community of discourse). We have found that structured courses designed to scaffold the thesis-writing process and promote metacognition can improve writing and reasoning skills in biology, chemistry, and economics ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In the context of this prior work, we have defined scientific reasoning in writing as the emergent, underlying construct measured across distinct aspects of students’ written discussion of independent research in their undergraduate theses.

The Biology Thesis Assessment Protocol (BioTAP) was developed at Duke University as a tool for systematically guiding students and faculty through a “draft–feedback–revision” writing process, modeled after professional scientific peer-review processes ( Reynolds et al. , 2009 ). BioTAP includes activities and worksheets that allow students to engage in critical peer review and provides detailed descriptions, presented as rubrics, of the questions (i.e., dimensions, shown in Table 1 ) upon which such review should focus. Nine rubric dimensions focus on communication to the broader scientific community, and four rubric dimensions focus on the accuracy and appropriateness of the research. These rubric dimensions provide criteria by which the thesis is assessed, and therefore allow BioTAP to be used as an assessment tool as well as a teaching resource ( Reynolds et al. , 2009 ). Full details are available at www.science-writing.org/biotap.html .

Theses assessment protocol dimensions

In previous work, we have used BioTAP to quantitatively assess students’ undergraduate honors theses and explore the relationship between thesis-writing courses (or specific interventions within the courses) and the strength of students’ science reasoning in writing across different science disciplines: biology ( Reynolds and Thompson, 2011 ); chemistry ( Dowd et al. , 2015b ); and economics ( Dowd et al. , 2015a ). We have focused exclusively on the nine dimensions related to reasoning and writing (questions 1–9), as the other four dimensions (questions 10–13) require topic-specific expertise and are intended to be used by the student’s thesis supervisor.

Beyond considering individual dimensions, we have investigated whether meaningful constructs underlie students’ thesis scores. We conducted exploratory factor analysis of students’ theses in biology, economics, and chemistry and found one dominant underlying factor in each discipline; we termed the factor “scientific reasoning in writing” ( Dowd et al. , 2015a , b , 2016 ). That is, each of the nine dimensions could be understood as reflecting, in different ways and to different degrees, the construct of scientific reasoning in writing. The findings indicated evidence of both general and discipline-specific components to scientific reasoning in writing that relate to epistemic beliefs and paradigms, in keeping with broader ideas about science reasoning discussed earlier. Specifically, scientific reasoning in writing is more strongly associated with formulating a compelling argument for the significance of the research in the context of current literature in biology, making meaning regarding the implications of the findings in chemistry, and providing an organizational framework for interpreting the thesis in economics. We suggested that instruction, whether occurring in writing studios or in writing courses to facilitate thesis preparation, should attend to both components.

Research Question and Study Design

The genre of thesis writing combines the pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). However, there is no empirical evidence regarding the general or domain-specific interrelationships of scientific reasoning and critical-thinking skills, particularly in the rhetorical context of the undergraduate thesis. The BioTAP studies discussed earlier indicate that the rubric-based assessment produces evidence of scientific reasoning in the undergraduate thesis, but it was not designed to foster or measure critical thinking. The current study was undertaken to address the research question: How are students’ critical-thinking skills related to scientific reasoning as reflected in the genre of undergraduate thesis writing in biology? Determining these interrelationships could guide efforts to enhance students’ scientific reasoning and writing skills through focusing instruction on specific critical-thinking skills as well as disciplinary conventions.

To address this research question, we focused on undergraduate thesis writers in biology courses at two institutions, Duke University and the University of Minnesota, and examined the extent to which students’ scientific reasoning in writing, assessed in the undergraduate thesis using BioTAP, corresponds to students’ critical-thinking skills, assessed using the California Critical Thinking Skills Test (CCTST; August, 2016 ).

Study Sample

The study sample was composed of students enrolled in courses designed to scaffold the thesis-writing process in the Department of Biology at Duke University and the College of Biological Sciences at the University of Minnesota. Both courses complement students’ individual work with research advisors. The course is required for thesis writers at the University of Minnesota and optional for writers at Duke University. Not all students are required to complete a thesis, though it is required for students to graduate with honors; at the University of Minnesota, such students are enrolled in an honors program within the college. In total, 28 students were enrolled in the course at Duke University and 44 students were enrolled in the course at the University of Minnesota. Of those students, two students did not consent to participate in the study; additionally, five students did not validly complete the CCTST (i.e., attempted fewer than 60% of items or completed the test in less than 15 minutes). Thus, our overall rate of valid participation is 90%, with 27 students from Duke University and 38 students from the University of Minnesota. We found no statistically significant differences in thesis assessment between students with valid CCTST scores and invalid CCTST scores. Therefore, we focus on the 65 students who consented to participate and for whom we have complete and valid data in most of this study. Additionally, in asking students for their consent to participate, we allowed them to choose whether to provide or decline access to academic and demographic background data. Of the 65 students who consented to participate, 52 students granted access to such data. Therefore, for additional analyses involving academic and background data, we focus on the 52 students who consented. 
We note that the 13 students who participated but declined to share additional data performed slightly lower on the CCTST than the 52 others (perhaps suggesting that they differ by other measures, but we cannot determine this with certainty). Among the 52 students, 60% identified as female and 10% identified as being from underrepresented ethnicities.

In both courses, students completed the CCTST online, either in class or on their own, late in the Spring 2016 semester. This is the same assessment that was used in prior studies of critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). It is “an objective measure of the core reasoning skills needed for reflective decision making concerning what to believe or what to do” ( Insight Assessment, 2016a ). In the test, students are asked to read and consider information as they answer multiple-choice questions. The questions are intended to be appropriate for all users, so there is no expectation of prior disciplinary knowledge in biology (or any other subject). Although actual test items are protected, sample items are available on the Insight Assessment website ( Insight Assessment, 2016b ). We have included one sample item in the Supplemental Material.

The CCTST is based on a consensus definition of critical thinking, measures cognitive and metacognitive skills associated with critical thinking, and has been evaluated for validity and reliability at the college level ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ). In addition to providing an overall critical-thinking score, the CCTST assesses seven dimensions of critical thinking: analysis, interpretation, inference, evaluation, explanation, induction, and deduction. Scores on each dimension are calculated based on students’ performance on items related to that dimension. Analysis focuses on identifying assumptions, reasons, and claims and examining how they interact to form arguments. Interpretation, related to analysis, focuses on determining the precise meaning and significance of information. Inference focuses on drawing conclusions from reasons and evidence. Evaluation focuses on assessing the credibility of sources of information and claims they make. Explanation, related to evaluation, focuses on describing the evidence, assumptions, or rationale for beliefs and conclusions. Induction focuses on drawing inferences about what is probably true based on evidence. Deduction focuses on drawing conclusions about what must be true when the context completely determines the outcome. These are not independent dimensions; the fact that they are related supports their collective interpretation as critical thinking. Together, the CCTST dimensions provide a basis for evaluating students’ overall strength in using reasoning to form reflective judgments about what to believe or what to do ( August, 2016 ). Each of the seven dimensions and the overall CCTST score are measured on a scale of 0–100, where higher scores indicate superior performance. Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and below) skills.
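The score bands quoted above form a simple threshold mapping. As a minimal sketch (the function name is hypothetical; the cutoffs are taken from the bands stated in the text), the banding could be encoded as:

```python
def cctst_band(score: float) -> str:
    """Map a 0-100 CCTST score to the skill band described in the text."""
    if score >= 86:
        return "superior"
    if score >= 79:
        return "strong"
    if score >= 70:
        return "moderate"
    if score >= 63:
        return "weak"
    return "not manifested"

print(cctst_band(88))  # superior
print(cctst_band(65))  # weak
```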

Scientific Reasoning in Writing

At the end of the semester, students’ final, submitted undergraduate theses were assessed using BioTAP, which consists of nine rubric dimensions that focus on communication to the broader scientific community and four additional dimensions that focus on the exhibition of topic-specific expertise ( Reynolds et al. , 2009 ). These dimensions, framed as questions, are displayed in Table 1 .

Student theses were assessed on questions 1–9 of BioTAP using the same procedures described in previous studies ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In this study, six raters were trained in the valid, reliable use of BioTAP rubrics. Each dimension was rated on a five-point scale: 1 indicates the dimension is missing, incomplete, or below acceptable standards; 3 indicates that the dimension is adequate but not exhibiting mastery; and 5 indicates that the dimension is excellent and exhibits mastery (intermediate ratings of 2 and 4 are appropriate when different parts of the thesis make a single category challenging). After training, two raters independently assessed each thesis and then discussed their independent ratings with one another to form a consensus rating. The consensus score is not an average score, but rather an agreed-upon, discussion-based score. On a five-point scale, raters independently assessed dimensions to be within 1 point of each other 82.4% of the time before discussion and formed consensus ratings 100% of the time after discussion.
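The within-1-point agreement statistic reported above (82.4% before discussion) amounts to the fraction of paired dimension ratings in which the two raters differ by at most one point. A sketch, using hypothetical ratings rather than the study's data:

```python
def within_one_agreement(ratings_a, ratings_b):
    """Fraction of paired dimension ratings differing by at most 1 point."""
    pairs = list(zip(ratings_a, ratings_b))
    close = sum(1 for a, b in pairs if abs(a - b) <= 1)
    return close / len(pairs)

# Hypothetical independent ratings from two raters on five dimensions
rater1 = [4, 5, 3, 4, 2]
rater2 = [5, 5, 2, 2, 3]
print(within_one_agreement(rater1, rater2))  # 0.8
```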

In this study, we consider both categorical (mastery/nonmastery, where a score of 5 corresponds to mastery) and numerical treatments of individual BioTAP scores to better relate the manifestation of critical thinking in BioTAP assessment to all of the prior studies. For comprehensive/cumulative measures of BioTAP, we focus on the partial sum of questions 1–5, as these questions relate to higher-order scientific reasoning (whereas questions 6–9 relate to mid- and lower-order writing mechanics [ Reynolds et al. , 2009 ]), and the factor scores (i.e., numerical representations of the extent to which each student exhibits the underlying factor), which are calculated from the factor loadings published by Dowd et al. (2016) . We do not focus on questions 6–9 individually in statistical analyses, because we do not expect critical-thinking skills to relate to mid- and lower-order writing skills.
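A factor score of the kind described above is, in essence, a loading-weighted combination of a student's standardized dimension ratings. The sketch below uses hypothetical ratings and hypothetical loadings; the study itself uses the loadings published by Dowd et al. (2016), which are not reproduced here:

```python
import numpy as np

# Hypothetical ratings (rows = students, columns = the nine BioTAP dimensions)
theses = np.array([
    [5, 4, 5, 4, 3, 4, 5, 4, 4],
    [3, 3, 2, 3, 3, 4, 3, 3, 2],
    [4, 5, 4, 4, 4, 3, 4, 5, 4],
], dtype=float)

# Hypothetical factor loadings for the single underlying factor
loadings = np.array([0.7, 0.6, 0.8, 0.5, 0.6, 0.4, 0.3, 0.4, 0.5])

# Standardize each dimension across students, then take the loading-weighted sum
z = (theses - theses.mean(axis=0)) / theses.std(axis=0)
factor_scores = z @ loadings
print(factor_scores)
```

Because each column of z is mean-centered, the factor scores sum to (approximately) zero across the sample; they rank students on the underlying construct rather than on an absolute scale.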

The final, submitted thesis reflects the student’s writing, the student’s scientific reasoning, the quality of feedback provided to the student by peers and mentors, and the student’s ability to incorporate that feedback into his or her work. Therefore, our assessment is not the same as an assessment of unpolished, unrevised samples of students’ written work. While one might imagine that such an unpolished sample may be more strongly correlated with critical-thinking skills measured by the CCTST, we argue that the complete, submitted thesis, assessed using BioTAP, is ultimately a more appropriate reflection of how students exhibit science reasoning in the scientific community.

Statistical Analyses

We took several steps to analyze the collected data. First, to provide context for subsequent interpretations, we generated descriptive statistics for the CCTST scores of the participants based on the norms for undergraduate CCTST test takers. To determine the strength of relationships among CCTST dimensions (including overall score) and the BioTAP dimensions, partial-sum score (questions 1–5), and factor score, we calculated Pearson’s correlations for each pair of measures. To examine whether falling on one side of the nonmastery/mastery threshold (as opposed to a linear scale of performance) was related to critical thinking, we grouped BioTAP dimensions into categories (mastery/nonmastery) and conducted Student’s t tests to compare the means scores of the two groups on each of the seven dimensions and overall score of the CCTST. Finally, for the strongest relationship that emerged, we included additional academic and background variables as covariates in multiple linear-regression analysis to explore questions about how much observed relationships between critical-thinking skills and science reasoning in writing might be explained by variation in these other factors.
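The pipeline described above (Pearson's correlation between measures, then a mastery/nonmastery group comparison via Student's t test) can be sketched with simulated data. All numbers below are hypothetical stand-ins, not the study's data, and the median split is an illustrative grouping only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the study's measures
cctst = rng.normal(85, 6, 65)                  # overall CCTST scores
biotap = 0.1 * cctst + rng.normal(18, 2, 65)   # BioTAP partial-sum (questions 1-5)

# Pearson's correlation between the two measures
r = np.corrcoef(cctst, biotap)[0, 1]
print(f"r = {r:.2f}")

# Illustrative mastery/nonmastery grouping, then Student's t test (pooled variance)
mastery = biotap >= np.median(biotap)
a, b = cctst[mastery], cctst[~mastery]
sp2 = ((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1)) / (len(a) + len(b) - 2)
t = (a.mean() - b.mean()) / np.sqrt(sp2 * (1 / len(a) + 1 / len(b)))
print(f"t = {t:.2f}")
```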

Although BioTAP scores represent discrete, ordinal bins, the five-point scale is intended to capture an underlying continuous construct (from inadequate to exhibiting mastery). It has been argued that five categories is an appropriate cutoff for treating ordinal variables as pseudo-continuous ( Rhemtulla et al. , 2012 )—and therefore using continuous-variable statistical methods (e.g., Pearson’s correlations)—as long as the underlying assumption that ordinal scores are linearly distributed is valid. Although we have no way to statistically test this assumption, we interpret adequate scores to be approximately halfway between inadequate and mastery scores, resulting in a linear scale. In part because this assumption is subject to disagreement, we also consider and interpret a categorical (mastery/nonmastery) treatment of BioTAP variables.

We corrected for multiple comparisons using the Holm-Bonferroni method ( Holm, 1979 ). At the most general level, where we consider the single, comprehensive measures for BioTAP (partial-sum and factor score) and the CCTST (overall score), there is no need to correct for multiple comparisons, because the multiple, individual dimensions are collapsed into single dimensions. When we considered individual CCTST dimensions in relation to comprehensive measures for BioTAP, we accounted for seven comparisons; similarly, when we considered individual dimensions of BioTAP in relation to overall CCTST score, we accounted for five comparisons. When all seven CCTST and five BioTAP dimensions were examined individually and without prior knowledge, we accounted for 35 comparisons; such a rigorous threshold is likely to reject weak and moderate relationships, but it is appropriate if there are no specific pre-existing hypotheses. All p values are presented in tables for complete transparency, and we carefully consider the implications of our interpretation of these data in the Discussion section.
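The Holm-Bonferroni step-down procedure compares the smallest p value against alpha/m, the next smallest against alpha/(m-1), and so on, stopping at the first failure. A minimal sketch (the seven p values are hypothetical):

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Significance decision for each p value via the Holm-Bonferroni
    step-down procedure."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    significant = [False] * m
    for rank, i in enumerate(order):
        if p_values[i] <= alpha / (m - rank):
            significant[i] = True
        else:
            break  # once one test fails, all larger p values fail too
    return significant

# Seven hypothetical p values, e.g., one per CCTST dimension
ps = [0.001, 0.04, 0.03, 0.20, 0.008, 0.60, 0.012]
print(holm_bonferroni(ps))  # [True, False, False, False, True, False, False]
```

Note how 0.03 and 0.04, nominally below 0.05, do not survive the correction; this mirrors the paper's remark that some relationships with p < 0.05 are not significant after correcting for multiple comparisons.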

CCTST scores for students in this sample ranged from the 39th to 99th percentile of the general population of undergraduate CCTST test takers (mean percentile = 84.3, median = 85th percentile; Table 2 ); these percentiles reflect overall scores that range from moderate to superior. Scores on individual dimensions and overall scores were sufficiently normal and far enough from the ceiling of the scale to justify subsequent statistical analyses.

Descriptive statistics of CCTST dimensions a

Dimension         Minimum    Mean    Median    Maximum
Analysis            70       88.6      90        100
Interpretation      74       89.7      87        100
Inference           78       87.9      89        100
Evaluation          63       83.6      84        100
Explanation         61       84.4      87        100
Induction           74       87.4      87         97
Deduction           71       86.4      87         97
Overall             73       86        85         97

a Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and lower) skills.

The Pearson’s correlations between students’ cumulative scores on BioTAP (the factor score based on loadings published by Dowd et al. , 2016 , and the partial sum of scores on questions 1–5) and students’ overall scores on the CCTST are presented in Table 3 . We found that the partial-sum measure of BioTAP was significantly related to the overall measure of critical thinking ( r = 0.27, p = 0.03), while the BioTAP factor score was marginally related to overall CCTST ( r = 0.24, p = 0.05). When we looked at relationships between comprehensive BioTAP measures and scores for individual dimensions of the CCTST ( Table 3 ), we found significant positive correlations between both the BioTAP partial-sum and factor scores and CCTST inference ( r = 0.45, p < 0.001, and r = 0.41, p < 0.001, respectively). Although some other relationships have p values below 0.05 (e.g., the correlations between BioTAP partial-sum scores and CCTST induction and interpretation scores), they are not significant when we correct for multiple comparisons.

Correlations between dimensions of CCTST and dimensions of BioTAP a

a In each cell, the top number is the correlation, and the bottom, italicized number is the associated p value. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.

b This is the partial sum of BioTAP scores on questions 1–5.

c This is the factor score calculated from factor loadings published by Dowd et al. (2016) .

When we expanded comparisons to include all 35 potential correlations among individual BioTAP and CCTST dimensions—and, accordingly, corrected for 35 comparisons—we did not find any additional statistically significant relationships. The Pearson’s correlations between students’ scores on each dimension of BioTAP and students’ scores on each dimension of the CCTST range from −0.11 to 0.35 ( Table 3 ); although the relationship between discussion of implications (BioTAP question 5) and inference appears to be relatively large ( r = 0.35), it is not significant ( p = 0.005; the Holm-Bonferroni cutoff is 0.00143). We found no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions (unpublished data), regardless of whether we correct for multiple comparisons.

The results of Student’s t tests comparing scores on each dimension of the CCTST of students who exhibit mastery with those of students who do not exhibit mastery on each dimension of BioTAP are presented in Table 4 . Focusing first on the overall CCTST scores, we found that the difference between those who exhibit mastery and those who do not in discussing implications of results (BioTAP question 5) is statistically significant ( t = 2.73, p = 0.008, d = 0.71). When we expanded the t tests to include all 35 comparisons—and, as above, corrected for 35 comparisons—we found a significant difference in inference scores between students who exhibit mastery on question 5 and students who do not ( t = 3.41, p = 0.0012, d = 0.88), as well as a marginally significant difference in these students’ induction scores ( t = 3.26, p = 0.0018, d = 0.84; the Holm-Bonferroni cutoff is p = 0.00147). Cohen’s d effect sizes, which convey the strength of the statistically significant differences, range from 0.71 to 0.88.
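The t statistics and pooled-SD Cohen's d values reported above can be computed as follows (the two groups of scores are hypothetical, chosen only to exercise the calculation):

```python
# Student's t test plus Cohen's d with a pooled standard deviation.
# The two groups of CCTST scores below are hypothetical.
import math
from scipy import stats

mastery = [88, 90, 86, 92, 89, 91]     # students exhibiting mastery on a BioTAP dimension
nonmastery = [82, 85, 80, 84, 83, 86]  # students not exhibiting mastery

t, p = stats.ttest_ind(mastery, nonmastery, equal_var=True)  # Student's (pooled) t test

def cohens_d(a, b):
    """Effect size: mean difference divided by the pooled standard deviation."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

d = cohens_d(mastery, nonmastery)
```

These synthetic groups overlap little, so d comes out well above the 0.71–0.88 range reported in the study.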

The t statistics and effect sizes of differences in dimensions of CCTST across dimensions of BioTAP a

a In each cell, the top number is the t statistic for each comparison, and the middle, italicized number is the associated p value. The bottom number is the effect size. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.

Finally, we more closely examined the strongest relationship that we observed, which was between the CCTST dimension of inference and the BioTAP partial-sum composite score (shown in Table 3 ), using multiple regression analysis ( Table 5 ). Focusing on the 52 students for whom we have background information, we looked at the simple relationship between BioTAP and inference (model 1), a robust background model including multiple covariates that one might expect to explain some part of the variation in BioTAP (model 2), and a combined model including all variables (model 3). As model 3 shows, the covariates explain very little variation in BioTAP scores, and the relationship between inference and BioTAP persists even in the presence of all of the covariates.
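Model 1 here is an ordinary least-squares regression of the BioTAP partial sum on CCTST inference alone; the following numpy-only sketch runs that step on synthetic data (the coefficients, noise level, and seed are arbitrary assumptions — only the procedure mirrors the text):

```python
# OLS sketch of "model 1": BioTAP partial sum regressed on CCTST inference.
# Data are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(42)
n = 52                                          # students with background information
inference = rng.normal(85, 7, n)                # CCTST inference scores (synthetic)
biotap = 1.0 + 0.1 * inference + rng.normal(0, 0.4, n)  # built to correlate

X = np.column_stack([np.ones(n), inference])    # design matrix: intercept + predictor
beta, *_ = np.linalg.lstsq(X, biotap, rcond=None)

# R-squared and adjusted R-squared (the "Adjusted R2" row of the table)
resid = biotap - X @ beta
ss_res = resid @ resid
ss_tot = ((biotap - biotap.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - X.shape[1])
```

Models 2 and 3 would extend `X` with additional covariate columns (GPA, course counts, institution, and demographic indicators).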

Partial sum (questions 1–5) of BioTAP scores ( n = 52)

Variable                     Model 1     Model 2     Model 3
CCTST inference              0.536***                0.491**
Grade point average                      0.176       0.092
Independent study courses                −0.087      0.001
Writing-intensive courses                0.131       0.021
Institution                              0.329       0.115
Male                                     0.085       0.041
Underrepresented group                   −0.114      −0.060
Adjusted R²                  0.273       −0.022      0.195

** p < 0.01.

*** p < 0.001.

The aim of this study was to examine the extent to which the various components of scientific reasoning—manifested in writing in the genre of the undergraduate thesis and assessed using BioTAP—draw on general and specific critical-thinking skills (assessed using the CCTST) and to consider the implications for educational practices. Although science reasoning involves critical-thinking skills, it also relates to conceptual knowledge and the epistemological foundations of science disciplines ( Kuhn et al. , 2008 ). Moreover, science reasoning in writing, captured in students’ undergraduate theses, reflects habits, conventions, and the incorporation of feedback that may alter evidence of individuals’ critical-thinking skills. Our findings, however, provide empirical evidence that cumulative measures of science reasoning in writing are nonetheless related to students’ overall critical-thinking skills ( Table 3 ). The particularly significant roles of inference skills ( Table 3 ) and the discussion of implications of results (BioTAP question 5; Table 4 ) provide a basis for more specific ideas about how these constructs relate to one another and about which educational interventions may have the most success in fostering these skills.

Our results build on previous findings. The genre of thesis writing combines pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). Quitadamo and Kurtz (2007) reported that students who engaged in a laboratory writing component in a general education biology course significantly improved their inference and analysis skills, and Quitadamo and colleagues (2008) found that participation in a community-based inquiry biology course (that included a writing component) was associated with significant gains in students’ inference and evaluation skills. The shared focus on inference is noteworthy, because these prior studies differ from the current study in design; the former considered critical-thinking skills as the primary learning outcome of writing-focused interventions, whereas the latter focused on emergent links between two learning outcomes (science reasoning in writing and critical thinking). In other words, inference skills are impacted by writing as well as manifested in writing.

Inference focuses on drawing conclusions from argument and evidence. According to the consensus definition of critical thinking, the specific skill of inference includes several processes: querying evidence, conjecturing alternatives, and drawing conclusions. All of these activities are central to the independent research at the core of writing an undergraduate thesis. Indeed, a critical part of what we call “science reasoning in writing” might be characterized as a measure of students’ ability to infer and make meaning of information and findings. Because the cumulative BioTAP measures distill underlying similarities and, to an extent, suppress unique aspects of individual dimensions, we argue that it is appropriate to relate inference to scientific reasoning in writing. Even when we control for other potentially relevant background characteristics, the relationship is strong ( Table 5 ).

In taking the complementary view and focusing on BioTAP, when we compared students who exhibit mastery with those who do not, we found that the specific dimension of “discussing the implications of results” (question 5) differentiates students’ performance on several critical-thinking skills. To achieve mastery on this dimension, students must make connections between their results and other published studies and discuss the future directions of the research; in short, they must demonstrate an understanding of the bigger picture. The specific relationship between question 5 and inference is the strongest observed among all individual comparisons. Altogether, perhaps more than any other BioTAP dimension, this aspect of students’ writing provides a clear view of the role of students’ critical-thinking skills (particularly inference and, marginally, induction) in science reasoning.

While inference and discussion of implications emerge as particularly strongly related dimensions in this work, we note that the strongest contribution to “science reasoning in writing in biology,” as determined through exploratory factor analysis, is “argument for the significance of research” (BioTAP question 2, not question 5; Dowd et al. , 2016 ). Question 2 is not clearly related to critical-thinking skills. These findings are not contradictory, but rather suggest that the epistemological and disciplinary-specific aspects of science reasoning that emerge in writing through BioTAP are not completely aligned with aspects related to critical thinking. In other words, science reasoning in writing is not simply a proxy for those critical-thinking skills that play a role in science reasoning.

In a similar vein, the content-related, epistemological aspects of science reasoning, as well as the conventions associated with writing the undergraduate thesis (including feedback from peers and revision), may explain the lack of significant relationships between some science reasoning dimensions and some critical-thinking skills that might otherwise seem counterintuitive (e.g., BioTAP question 2, which relates to making an argument, and the critical-thinking skill of argument). It is possible that an individual’s critical-thinking skills may explain some variation in a particular BioTAP dimension, but other aspects of science reasoning and practice exert much stronger influence. Although these relationships do not emerge in our analyses, the lack of significant correlation does not mean that there is definitively no correlation. Correcting for multiple comparisons suppresses type 1 error at the expense of exacerbating type 2 error, which, combined with the limited sample size, constrains statistical power and makes weak relationships more difficult to detect. Ultimately, though, the relationships that do emerge highlight places where individuals’ distinct critical-thinking skills emerge most coherently in thesis assessment, which is why we are particularly interested in unpacking those relationships.

We recognize that, because only honors students submit theses at these institutions, this study sample is composed of a selective subset of the larger population of biology majors. Although this is an inherent limitation of focusing on thesis writing, links between our findings and the results of other studies (with different populations) suggest that the observed relationships may occur more broadly. The goal of improved science reasoning and critical thinking is shared among all biology majors, particularly those engaged in capstone research experiences. So although the implications of this work apply most directly to honors thesis writers, we provisionally suggest that further study of these relationships could benefit all students.

There are several important implications of this study for science education practices. Students’ inference skills relate to the understanding and effective application of scientific content. The fact that we find no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions suggests that such mid- to lower-order elements of BioTAP ( Reynolds et al. , 2009 ), which tend to be more structural in nature, do not focus on aspects of the finished thesis that draw strongly on critical thinking. In keeping with prior analyses ( Reynolds and Thompson, 2011 ; Dowd et al. , 2016 ), these findings further reinforce the notion that disciplinary instructors, who are most capable of teaching and assessing scientific reasoning and perhaps least interested in the more mechanical aspects of writing, may nonetheless be best suited to effectively model and assess students’ writing.

The goal of the thesis writing course at both Duke University and the University of Minnesota is not merely to improve thesis scores but to move students’ writing into the category of mastery across BioTAP dimensions. Recognizing that students with differing critical-thinking skills (particularly inference) are more or less likely to achieve mastery in the undergraduate thesis (particularly in discussing implications [question 5]) is important for developing and testing targeted pedagogical interventions to improve learning outcomes for all students.

The competencies characterized by the Vision and Change in Undergraduate Biology Education Initiative provide a general framework for recognizing that science reasoning and critical-thinking skills play key roles in major learning outcomes of science education. Our findings highlight places where science reasoning–related competencies (like “understanding the process of science”) connect to critical-thinking skills and places where critical thinking–related competencies might be manifested in scientific products (such as the ability to discuss implications in scientific writing). We encourage broader efforts to build empirical connections between competencies and pedagogical practices to further improve science education.

One specific implication of this work for science education is to focus on providing opportunities for students to develop their critical-thinking skills (particularly inference). Of course, as this correlational study is not designed to test causality, we do not claim that enhancing students’ inference skills will improve science reasoning in writing. However, as prior work shows that science writing activities influence students’ inference skills ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ), there is reason to test such a hypothesis. Nevertheless, the focus must extend beyond inference as an isolated skill; rather, it is important to relate inference to the foundations of the scientific method ( Miri et al. , 2007 ) in terms of the epistemological appreciation of the functions and coordination of evidence ( Kuhn and Dean, 2004 ; Zeineddin and Abd-El-Khalick, 2010 ; Ding et al. , 2016 ) and disciplinary paradigms of truth and justification ( Moshman, 2015 ).

Although this study is limited to the domain of biology at two institutions with a relatively small number of students, the findings represent a foundational step in the direction of achieving success with more integrated learning outcomes. Hopefully, it will spur greater interest in empirically grounding discussions of the constructs of scientific reasoning and critical-thinking skills.

This study contributes to the efforts to improve science education, for both majors and nonmajors, through an empirically driven analysis of the relationships between scientific reasoning reflected in the genre of thesis writing and critical-thinking skills. This work is rooted in the usefulness of BioTAP as a method 1) to facilitate communication and learning and 2) to assess disciplinary-specific and general dimensions of science reasoning. The findings support the important role of the critical-thinking skill of inference in scientific reasoning in writing, while also highlighting ways in which other aspects of science reasoning (epistemological considerations, writing conventions, etc.) are not significantly related to critical thinking. Future research into the impact of interventions focused on specific critical-thinking skills (i.e., inference) for improved science reasoning in writing will build on this work and its implications for science education.

Supplementary Material

Acknowledgments.

We acknowledge the contributions of Kelaine Haas and Alexander Motten to the implementation and collection of data. We also thank Mine Çetinkaya-Rundel for her insights regarding our statistical analyses. This research was funded by National Science Foundation award DUE-1525602.

  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. Retrieved September 26, 2017, from https://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf
  • August D. (2016). California Critical Thinking Skills Test user manual and resource guide. San Jose: Insight Assessment/California Academic Press.
  • Beyer C. H., Taylor E., Gillmore G. M. (2013). Inside the undergraduate teaching experience: The University of Washington’s growth in faculty teaching study. Albany, NY: SUNY Press.
  • Bissell A. N., Lemons P. P. (2006). A new method for assessing critical thinking in the classroom. BioScience, (1), 66–72. https://doi.org/10.1641/0006-3568(2006)056[0066:ANMFAC]2.0.CO;2
  • Blattner N. H., Frazier C. L. (2002). Developing a performance-based assessment of students’ critical thinking skills. Assessing Writing, (1), 47–64.
  • Clase K. L., Gundlach E., Pelaez N. J. (2010). Calibrated peer review for computer-assisted learning of biological research competencies. Biochemistry and Molecular Biology Education, (5), 290–295.
  • Condon W., Kelly-Riley D. (2004). Assessing and teaching what we value: The relationship between college-level writing and critical thinking abilities. Assessing Writing, (1), 56–75. https://doi.org/10.1016/j.asw.2004.01.003
  • Ding L., Wei X., Liu X. (2016). Variations in university students’ scientific reasoning skills across majors, years, and types of institutions. Research in Science Education, (5), 613–632. https://doi.org/10.1007/s11165-015-9473-y
  • Dowd J. E., Connolly M. P., Thompson R. J., Jr., Reynolds J. A. (2015a). Improved reasoning in undergraduate writing through structured workshops. Journal of Economic Education, (1), 14–27. https://doi.org/10.1080/00220485.2014.978924
  • Dowd J. E., Roy C. P., Thompson R. J., Jr., Reynolds J. A. (2015b). “On course” for supporting expanded participation and improving scientific reasoning in undergraduate thesis writing. Journal of Chemical Education, (1), 39–45. https://doi.org/10.1021/ed500298r
  • Dowd J. E., Thompson R. J., Jr., Reynolds J. A. (2016). Quantitative genre analysis of undergraduate theses: Uncovering different ways of writing and thinking in science disciplines. WAC Journal, 36–51.
  • Facione P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Newark, DE: American Philosophical Association. Retrieved September 26, 2017, from https://philpapers.org/archive/FACCTA.pdf
  • Gerdeman R. D., Russell A. A., Worden K. J. (2007). Web-based student writing and reviewing in a large biology lecture course. Journal of College Science Teaching, (5), 46–52.
  • Greenhoot A. F., Semb G., Colombo J., Schreiber T. (2004). Prior beliefs and methodological concepts in scientific reasoning. Applied Cognitive Psychology, (2), 203–221. https://doi.org/10.1002/acp.959
  • Haaga D. A. F. (1993). Peer review of term papers in graduate psychology courses. Teaching of Psychology, (1), 28–32. https://doi.org/10.1207/s15328023top2001_5
  • Halonen J. S., Bosack T., Clay S., McCarthy M., Dunn D. S., Hill G. W., Whitlock K. (2003). A rubric for learning, teaching, and assessing scientific inquiry in psychology. Teaching of Psychology, (3), 196–208. https://doi.org/10.1207/S15328023TOP3003_01
  • Hand B., Keys C. W. (1999). Inquiry investigation. Science Teacher, (4), 27–29.
  • Holm S. (1979). A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, (2), 65–70.
  • Holyoak K. J., Morrison R. G. (2005). The Cambridge handbook of thinking and reasoning. New York: Cambridge University Press.
  • Insight Assessment. (2016a). California Critical Thinking Skills Test (CCTST). Retrieved September 26, 2017, from www.insightassessment.com/Products/Products-Summary/Critical-Thinking-Skills-Tests/California-Critical-Thinking-Skills-Test-CCTST
  • Insight Assessment. (2016b). Sample thinking skills questions. Retrieved September 26, 2017, from www.insightassessment.com/Resources/Teaching-Training-and-Learning-Tools/node_1487
  • Kelly G. J., Takao A. (2002). Epistemic levels in argument: An analysis of university oceanography students’ use of evidence in writing. Science Education, (3), 314–342. https://doi.org/10.1002/sce.10024
  • Kuhn D., Dean D., Jr. (2004). Connecting scientific reasoning and causal inference. Journal of Cognition and Development, (2), 261–288. https://doi.org/10.1207/s15327647jcd0502_5
  • Kuhn D., Iordanou K., Pease M., Wirkala C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, (4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006
  • Lawson A. E. (2010). Basic inferences of scientific reasoning, argumentation, and discovery. Science Education, (2), 336–364. https://doi.org/10.1002/sce.20357
  • Meizlish D., LaVaque-Manty D., Silver N., Kaplan M. (2013). Think like/write like: Metacognitive strategies to foster students’ development as disciplinary thinkers and writers. In Thompson R. J. (Ed.), Changing the conversation about higher education (pp. 53–73). Lanham, MD: Rowman & Littlefield.
  • Miri B., David B.-C., Uri Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, (4), 353–369. https://doi.org/10.1007/s11165-006-9029-2
  • Moshman D. (2015). Epistemic cognition and development: The psychology of justification and truth. New York: Psychology Press.
  • National Research Council. (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). Washington, DC: National Academies Press.
  • Pukkila P. J. (2004). Introducing student inquiry in large introductory genetics classes. Genetics, (1), 11–18. https://doi.org/10.1534/genetics.166.1.11
  • Quitadamo I. J., Faiola C. L., Johnson J. E., Kurtz M. J. (2008). Community-based inquiry improves critical thinking in general education biology. CBE—Life Sciences Education, (3), 327–337. https://doi.org/10.1187/cbe.07-11-0097
  • Quitadamo I. J., Kurtz M. J. (2007). Learning to improve: Using writing to increase critical thinking performance in general education biology. CBE—Life Sciences Education, (2), 140–154. https://doi.org/10.1187/cbe.06-11-0203
  • Reynolds J. A., Smith R., Moskovitz C., Sayle A. (2009). BioTAP: A systematic approach to teaching scientific writing and evaluating undergraduate theses. BioScience, (10), 896–903. https://doi.org/10.1525/bio.2009.59.10.11
  • Reynolds J. A., Thaiss C., Katkin W., Thompson R. J. (2012). Writing-to-learn in undergraduate science education: A community-based, conceptually driven approach. CBE—Life Sciences Education, (1), 17–25. https://doi.org/10.1187/cbe.11-08-0064
  • Reynolds J. A., Thompson R. J. (2011). Want to improve undergraduate thesis writing? Engage students and their faculty readers in scientific peer review. CBE—Life Sciences Education, (2), 209–215. https://doi.org/10.1187/cbe.10-10-0127
  • Rhemtulla M., Brosseau-Liard P. E., Savalei V. (2012). When can categorical variables be treated as continuous? A comparison of robust continuous and categorical SEM estimation methods under suboptimal conditions. Psychological Methods, (3), 354–373. https://doi.org/10.1037/a0029315
  • Stephenson N. S., Sadler-McKnight N. P. (2016). Developing critical thinking skills using the science writing heuristic in the chemistry laboratory. Chemistry Education Research and Practice, (1), 72–79. https://doi.org/10.1039/C5RP00102A
  • Tariq V. N., Stefani L. A. J., Butcher A. C., Heylings D. J. A. (1998). Developing a new approach to the assessment of project work. Assessment and Evaluation in Higher Education, (3), 221–240. https://doi.org/10.1080/0260293980230301
  • Timmerman B. E. C., Strickland D. C., Johnson R. L., Payne J. R. (2011). Development of a “universal” rubric for assessing undergraduates’ scientific reasoning skills using scientific writing. Assessment and Evaluation in Higher Education, (5), 509–547. https://doi.org/10.1080/02602930903540991
  • Topping K. J., Smith E. F., Swanson I., Elliot A. (2000). Formative peer assessment of academic writing between postgraduate students. Assessment and Evaluation in Higher Education, (2), 149–169. https://doi.org/10.1080/713611428
  • Willison J., O’Regan K. (2007). Commonly known, commonly not known, totally unknown: A framework for students becoming researchers. Higher Education Research and Development, (4), 393–409. https://doi.org/10.1080/07294360701658609
  • Woodin T., Carter V. C., Fletcher L. (2010). Vision and Change in Biology Undergraduate Education: A Call for Action—Initial responses. CBE—Life Sciences Education, (2), 71–73. https://doi.org/10.1187/cbe.10-03-0044
  • Zeineddin A., Abd-El-Khalick F. (2010). Scientific reasoning and epistemological commitments: Coordination of theory and evidence among college science students. Journal of Research in Science Teaching, (9), 1064–1093. https://doi.org/10.1002/tea.20368
  • Zimmerman C. (2000). The development of scientific reasoning skills. Developmental Review, (1), 99–149. https://doi.org/10.1006/drev.1999.0497
  • Zimmerman C. (2007). The development of scientific thinking skills in elementary and middle school. Developmental Review, (2), 172–223. https://doi.org/10.1016/j.dr.2006.12.001
  • Open access
  • Published: 28 June 2024

Perceived efficacy of case analysis as an assessment method for clinical competencies in nursing education: a mixed methods study

  • Basma Mohammed Al Yazeedi   ORCID: orcid.org/0000-0003-2327-6918 1 ,
  • Lina Mohamed Wali Shakman 1 ,
  • Sheeba Elizabeth John Sunderraj   ORCID: orcid.org/0000-0002-9171-7239 1 ,
  • Harshita Prabhakaran   ORCID: orcid.org/0000-0002-5470-7066 1 ,
  • Judie Arulappan 1 ,
  • Erna Judith Roach   ORCID: orcid.org/0000-0002-5817-8886 1 ,
  • Aysha Al Hashmi 1 , 2 &
  • Zeinab Al Azri   ORCID: orcid.org/0000-0002-3376-9380 1  

BMC Nursing, volume 23, Article number: 441 (2024)


Background

Case analysis is a dynamic and interactive teaching and learning strategy that improves critical thinking and problem-solving skills. However, there is limited evidence about its efficacy as an assessment strategy in nursing education.

This study aimed to explore nursing students’ perceived efficacy of case analysis as an assessment method for clinical competencies in nursing education.

Methods

This study used a mixed methods design. Students completed a 13-item study-devised questionnaire, and qualitative data were collected from four focus groups. The study setting was the College of Nursing at Sultan Qaboos University, Oman. Descriptive statistics and independent t-tests were used for the quantitative data, and the framework analysis method was used for the qualitative data.

Results

The descriptive analysis of 67 participants showed that the mean perceived efficacy of case analysis as an assessment method was 3.20 (SD = 0.53), corresponding to an 80% agreement rate. Further analysis indicated that 78.5% of the students concurred with the acceptability of case analysis as an assessment method (mean = 3.14, SD = 0.58), and 80.3% assented to its association with clinical competencies as reflected by knowledge and cognitive skills (mean = 3.21, SD = 0.60). No significant difference in perceived efficacy was identified between students with lower and higher GPAs (t(61) = 0.05, p > 0.05). Three qualitative findings were discerned: case analysis is a preferred assessment method for students when compared with MCQs, case analysis assesses students’ knowledge, and case analysis assesses students’ cognitive skills.
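The agreement percentages quoted here appear to be item means rescaled from the 4-point response scale to a percentage (mean / 4 × 100); that rescaling is our reading of the numbers, not something the authors state. A quick check of the assumption:

```python
# Assumed conversion: mean on a 4-point Likert scale → percent agreement.
def agreement_rate(mean_score, scale_max=4):
    return round(mean_score / scale_max * 100, 1)

print(agreement_rate(3.20))  # → 80.0 (the reported overall agreement rate)
print(agreement_rate(3.14))  # → 78.5 (the reported acceptability rate)
```

A mean of 3.21 rescales to 80.25%, consistent with the reported 80.3% if the unrounded mean was slightly above 3.21.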

Conclusions

This study adds a potential for the case analysis to be acceptable and relevant to the clinical competencies when used as an assessment method. Future research is needed to validate the effectiveness of case analysis exams in other nursing clinical courses and examine their effects on academic and clinical performance.

Peer Review reports

Introduction

Nurses play a critical role in preserving human health by upholding core competencies [ 1 ]. Clinical competence in nursing involves a constant process of acquiring knowledge, values, attitudes, and abilities to deliver safe and high-quality care [ 2 , 3 ]. Nurses possessing such competencies can analyze and judge complicated problems, including those involving crucial patient care, ethical decision-making, and nurse-patient disputes, meeting constantly changing health needs [ 4 , 5 ]. To optimize the readiness of new graduates for the demands of a challenging clinical work environment, nurse leaders call for integrating clinical competencies into the nursing curriculum [ 6 , 7 ]. In 2021, the American Association of Colleges of Nursing (AACN) released updated core competencies for professional nursing education [ 8 ]. These competencies were classified into ten fundamental essentials, including knowledge of nursing practice and person-centered care (e.g., integrating assessment skills in practice, diagnosing actual or potential health problems and needs, and developing a plan of care), representing clinical core competencies.

Nursing programs emphasize clinical competencies through innovative and effective teaching strategies, including case-based teaching (CBT) [ 9 ]. CBT is a dynamic teaching method that enhances the focus on learning goals and increases the chances of the instructor and students actively participating in teaching and learning [ 10 , 11 ]. Additionally, it improves the students’ critical thinking and problem-solving skills and enriches their capacity for independent study, cooperation, and communication [ 12 , 13 , 14 , 15 ]. It also broadens students’ perspectives and helps develop greater creativity in fusing theory and practice [ 16 , 17 , 18 , 19 , 20 ]. As the learning environment significantly impacts student satisfaction, case analysis fosters a supportive learning atmosphere and encourages active participation in learning, ultimately improving satisfaction [ 21 , 22 ].

In addition to proper teaching strategies for clinical competencies, programs are expected to evaluate students’ attainment of such competencies through effective evaluation strategies [ 23 ]. However, deploying objective assessment methods for these competencies remains challenging for most educators [ 24 ]. The standard assessment methods used in clinical nursing courses include clinical evaluations (direct observation), skills checklists, the Objective Structured Clinical Examination (OSCE), and multiple-choice question (MCQ) written exams [ 25 ]. MCQs tend to test the recall of factual information rather than the application of knowledge and cognitive skills, potentially leading to assessment inaccuracies [ 26 ].

Given the aforementioned outcomes of CBT, the deployment of case analysis as a clinical written exam is more closely aligned with the course’s expected competencies. A mixed methods study conducted among forty nursing students at the University of Southern Taiwan concluded that unfolding case studies create a safe setting where nursing students can learn and apply their knowledge to safe patient care [ 6 ]. In a case analysis, the patient’s illness emerges in stages, including the signs and symptoms of the disease, urgent care to stabilize the patient, and bedside care to enhance recovery. Thus, unfolding the case with several scenarios helps educators track students’ attained competencies [ 27 ]. However, case analysis as an assessment method is sparsely researched [ 28 ]. A literature review over the past five years yielded no studies investigating case analysis as an assessment method, necessitating new evidence. There remains uncertainty regarding its efficacy as an assessment method, particularly from the students’ perspectives [ 29 ]. In this study, we explored undergraduate nursing students’ perceived efficacy of case analysis as an assessment method for clinical competencies. Results from this study will elucidate the position of case analysis as an assessment method in nursing education. The potential benefits are improved standardization of clinical assessment and the ability to efficiently evaluate a broad range of competencies.

Research design

Mixed-method research with a convergent parallel design was adopted in the study. This approach converges two data types (quantitative and qualitative) at the interpretation stage to ensure an inclusive analysis of the research problem [ 30 ]. The quantitative aspect of the study was implemented through a cross-sectional survey. The survey captured the perceived efficacy of using case analysis as an assessment method in clinical nursing education. The qualitative part of the study was carried out through a descriptive qualitative method using focus groups to provide an in-depth understanding of the perceived strengths experienced by the students.

Study setting

Data were collected in the College of Nursing at Sultan Qaboos University (SQU), Oman, during the Spring and Fall semesters of 2023. At the end of each clinical course, the students have a clinical written exam and a clinical practical exam, which constitute their final exam. Most clinical courses use multiple-choice questions (MCQs) in their written exam. However, the child health clinical course team initiated the case analysis as an assessment method in the clinical written exam, replacing the MCQs format.

Participants

For this study, the investigators invited undergraduate students enrolled in the child health nursing clinical course in the Spring and Fall semesters of 2023. Currently, the only course that uses case analysis is child health. Other courses use MCQs. A total enumeration sampling technique was adopted. All the students enrolled in child health nursing clinical courses in the Spring and Fall 2023 semesters were invited to participate in the study. In the Spring, 36 students registered for the course, while 55 students were enrolled in the Fall. We included students who completed the case analysis as a final clinical written exam on the scheduled exam time. Students who did not show up for the exam during the scheduled time and students not enrolled in the course during the Spring and Fall of 2023 were excluded. Although different cases were used each semester, both had the same structure and level of complexity. Further, both cases were peer-reviewed.

Case analysis format

The format presents open-ended questions related to a clinical case scenario. It comprises three main sections: Knowledge, Emergency Room, and Ward. The questions in the sections varied in difficulty based on Bloom’s cognitive taxonomy levels, as presented in Table  1 . An answer key was generated to ensure consistency among course team members when correcting the exam. Three experts in child health nursing peer-reviewed both the case analysis exam paper and the answer key paper. The students were allocated two hours to complete the exam.

Study instruments

Quantitative stage.

The researchers developed a study questionnaire to meet the study objectives. It included two parts. The first was about the demographic data, including age, gender, type of residence, year in the program, and cumulative grade point average (GPA). The second part comprised a 13-item questionnaire assessing the perceived efficacy of case analysis as an assessment method. The perceived efficacy was represented by the acceptability of case analysis as an assessment method (Items 1–5 and 13) and the association with clinical competencies (Items 6 to 12). Acceptability involved format organization and clarity, time adequacy, alignment with course objectives, appropriateness to students’ level, and recommendation for implementation in other clinical nursing courses. Clinical competencies-related items were relevant to knowledge (motivation to prepare well for the exam, active learning, interest in topics, collaboration while studying) and cognitive skills (critical thinking, decision-making, and problem-solving skills) (The questionnaire is attached as a supplementary document).

The questionnaire was answered on a 4-point Likert scale: 1 = strongly disagree, 2 = disagree, 3 = agree, 4 = strongly agree. Higher scores indicated better perceived efficacy. The tool underwent content validity testing with five experts in nursing clinical education, resulting in an item-content validity index ranging from 0.7 to 1. The Cronbach alpha was 0.83 for acceptability and 0.90 for clinical competencies.
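The Cronbach's alpha values reported above follow the standard internal-consistency formula. As a minimal, standard-library Python sketch (the Likert responses below are hypothetical, not the study's data, and this is not the authors' actual analysis code):

```python
import statistics as st

def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

    `items` is a list of per-item score lists, one inner list per
    questionnaire item, with respondents in the same order in each list.
    """
    k = len(items)
    # Each respondent's total score across all items
    totals = [sum(scores) for scores in zip(*items)]
    sum_item_var = sum(st.variance(item) for item in items)
    return k / (k - 1) * (1 - sum_item_var / st.variance(totals))

# Hypothetical 4-point Likert responses: three items, five respondents
items = [
    [3, 4, 2, 3, 4],
    [3, 4, 2, 4, 4],
    [2, 4, 3, 3, 4],
]
alpha = cronbach_alpha(items)
```

Values near 1 indicate that the items move together across respondents; perfectly correlated items yield an alpha of exactly 1.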

Qualitative stage

For the focus group interviews, the investigators created a semi-structured interview guide to obtain an in-depth understanding of the students’ perceived strengths of case analysis as an assessment method. See Table  2 .

Data collection

Data were collected from the students after they gave written informed consent. Students were invited to fill out the study questionnaire after completing the case analysis as a clinical written exam.

All students in the child health course were invited to participate in focus group discussions. Students who approached the PI to participate were offered four different time slots and chose according to their preferences. Four focus groups were conducted in private rooms at the College of Nursing. Two trained, bilingual interviewers attended the focus groups, one as a moderator while the other took notes on the group dynamics and non-verbal communication. The discussion duration ranged between 30 and 60 min. After each discussion, the moderator transcribed the audio recording. The transcriptions were rechecked against the audio recording for accuracy. Later, the transcriptions were translated into English for analysis by bilingual researchers fluent in Arabic and English.

Rigor and trustworthiness

The rigor and trustworthiness of the qualitative method were enhanced using multiple techniques. First, quantitative data, literature reviews, and focus groups were triangulated. Second, participants validated the summary after each discussion through member checking to ensure the moderator’s understanding was accurate. Third, the principal investigator (PI) reflected weekly on her assumptions, experiences, expectations, and feelings. In addition, the PI maintained a detailed audit trail of study details and progress. The study was conducted by nursing faculty with experience in qualitative research and nursing education. This report was prepared following the Standards for Reporting Qualitative Research (SRQR) [ 31 ].

Data analysis

Quantitative data were entered into SPSS version 24 and analyzed descriptively using means, standard deviations, and percentages. After computing the mean of each questionnaire item, the means were averaged to identify the perceived efficacy rate. A similar technique was used to calculate the acceptability and clinical competencies rates. The percentage was calculated from the mean as gained score / total score × 100. In addition, the investigators carried out an independent t-test to compare perceived efficacy between students with lower and higher GPAs.

The qualitative data were analyzed using the framework analysis method. In our analysis, we followed the seven interconnected stages of framework analysis: (1) transcription, (2) familiarization with the interview, (3) coding, (4) developing a working analytical framework, (5) applying the analytical framework, (6) charting data into the framework matrix, and (7) interpreting the data [ 32 ]. Two members of the team analyzed the transcriptions separately. They then discussed the coding, and discrepancies were resolved through discussion.

Mixed method integration

In our study, the quantitative and qualitative data were analyzed separately, and integration occurred at the interpretation level by merging the data [ 33 ]. As a measure of integration between qualitative and quantitative data, findings were assessed through confirmation, expansion, and discordance. If both data sets confirmed each other’s findings, it was considered confirmation, and if they expanded each other’s insight, it was considered expansion. Discordance was determined if the findings were contradictory.

Ethical considerations

Ethical approval was obtained from the Research and Ethics Committee of the College of Nursing, SQU (CON/NF/2023/18). Informed consent was collected, and no identifiable information was reported. For the focus group interviews, students were reassured that their grades were finalized, and their participation would not affect their grades. Also, the interviewers were instructed to maintain a non-judgmental and non-biased position during the interview. Data were saved in a locked cabinet inside a locked office room. The electronic data were saved in a password-protected computer.

Results

The results section presents findings from the study’s quantitative and qualitative components. The integration of the two data types is described after each qualitative finding.

Quantitative findings

We analyzed the data of 67 participants, representing a 73.6% response rate. The mean age was 21.0 years old (SD 0.73) and 36.4% were male students. See Table  3 for more details.

The descriptive analysis showed that the mean value of the perceived efficacy of case analysis as an assessment method was 3.20 (SD = 0.53), demonstrating an 80% agreement rate. Further analysis indicated that 78.5% of the students concurred with the acceptability of case analysis as an assessment method (mean = 3.14, SD = 0.58), and 80.3% (m = 3.21, SD = 0.60) assented to its association with clinical competencies.

For the items representing acceptability, 81.8% of the students agreed that the case analysis was written clearly, and 80.3% reported that it was well organized. Regarding the questions, 81% described them as appropriate to their level, and 79.8% agreed on their alignment with the course objectives. Moreover, the time allocated was adequate for 74.5% of the students, and 73.5% recommended using case analysis as an evaluation strategy for other clinical written examinations.

Regarding the clinical competencies, 77.3% of students agreed that the case analysis motivated them to prepare well for the exam, 81.3% reported that it encouraged them to be active in learning, and 81.0% indicated that it stimulated their interest in the topics discussed in the course. Additionally, 76.5% of the students agreed that the case analysis encouraged them to collaborate with other students when studying for the exam. Among the students, 82.5% reported that the case analysis as an assessment method enhanced their critical thinking skills, 81.0% agreed that it helped them practice decision-making skills, and 81.8% indicated that it improved their problem-solving abilities. See Table  4 .

The independent t-test analysis revealed no significant difference in the perceived efficacy between students with lower and higher GPAs (t [61] = 0.05, p  > 0.05). Further analysis showed that the means of acceptability and clinical competencies were not significantly different between the lower GPA group and higher GPA group, t [62] = 0.72, p  > 0.05 and t [63] = -0.83, p  > 0.05, respectively (Table  5 ).

Qualitative findings

A total of 22 students participated in four focus groups of 5–6 students each. The qualitative framework analysis revealed three main findings: case analysis is a preferred assessment method for students when compared to MCQs, case analysis assesses students’ knowledge, and case analysis assesses students’ cognitive skills.

Qualitative Finding 1: case analysis is a preferred assessment method for students when compared to MCQs

Most of the students’ statements about the case analysis as an assessment method were positive. One student stated, “Previously, we have MCQs in clinical exams, but they look as if they are theory exams. This exam makes me deal with cases like a patient, which is good for clinical courses.” . At the same time, many students conveyed optimism about obtaining better grades with this exam format. A student stated, “Our grades, with case analysis format, will be better, … may be because we can write more in open-ended questions, so we can get some marks, in contrast to MCQs where we may get it right or wrong” . On the other hand, a few students suggested adding multiple-choice questions, deleting the emergency department section, and lessening the number of care plans in the ward section to secure better grades.

Although the case analysis was generally acceptable to students, they repeatedly expressed a need for more time for this type of exam. A student stated, “The limited time with the type of questions was a problem, …” . When further discussion was prompted to understand this challenge, we found that students are not used to handwriting, which exhausted them during the exam. An example is “writing is time-consuming and energy consuming in contrast to MCQs …” . These statements elucidate that the students do not necessarily mind writing but recommend more practice, as one student stated, “More experience of this type of examination is required, more examples during clinical practice are needed.” Some even recommended adopting this format for other clinical course exams, saying, “It’s better to start this method from the first year for the new cohort and to apply it in all other courses.”

Mixed Methods Inference 1: Confirmation and Expansion

The abovementioned qualitative impressions support the high acceptability rate in the quantitative analysis. In fact, there is general agreement that the case analysis format surpasses MCQs as an evaluation strategy for clinical courses. The qualitative data revealed further details, such as limited opportunities to practice handwriting, which negatively impacted the perceived adequacy of exam time.

Qualitative Finding 2: case analysis assesses students’ knowledge

Students reported that they read more about disease pathophysiology, lab values, and nursing care plans than they usually did with traditional means of examination. Examples of statements include “… before we were not paying attention to the normal lab results but …in this exam, we went back and studied them which was good for our knowledge” and “we cared about the care plan. In previous exams, we were not bothered by these care plans”. Regarding the burden that could be perceived with this type of preparation, the students expressed that it helped them prepare for the theory course exam; as one student said, “We also focus on theory lectures to prepare for this exam …. this was very helpful to prepare us for the theory final exam as well.” However, others highlighted the risk of limiting the exam’s content to a single case analysis, arguing that some students may not have studied the case completely or been adequately exposed to it in the clinical setting. To mitigate this risk, the students themselves advocated for frequent case group discussions in the clinical setting, as one student stated: “There could be some differences in the cases that we see during our clinical posting, for that I recommend that instructors allocate some time to gather all the students and discuss different cases.” The participants also advocated for more paper-based case analysis exercises, which they found helpful in preparing for exams and enhancing their knowledge and skills.

Mixed Methods Inferences 2: Confirmation and Expansion

The qualitative finding supports the quantitative data relevant to items 6, 7, and 8. Students’ expressions revealed more insights, including the acquisition of deeper knowledge, practicing concept mapping, and readiness for other course-related exams. At the same time, students recommended that faculty ensure all students’ exposure to common cases in the clinical setting for fair exam preparation.

Qualitative Finding 3: case analysis assesses students’ cognitive skills

Several statements conveyed how the case analysis format helped the students use their critical thinking and analysis skills. One student stated, “It, the case analysis format, enhanced our critical thinking skills as there is a case with given data and we analyze the case….” . Therefore, the case analysis format as an exam is potentially a valid means to assess the student’s critical thinking skills. Students also conveyed that the case analysis format helped them link theory to practice and provided them with the platform to think like real nurses and be professional. Examples of statements are: “…we connect our knowledge gained from theory with the clinical experience to get the answers…” and “The questions were about managing a case, which is what actual nurses are doing daily.” Another interesting cognitive benefit to case analysis described by the students was holistic thinking. For example, one student said, “Case analysis format helped us to see the case as a whole and not only from one perspective.”

Mixed Methods Inferences 3: Confirmation

The quantitative data indicated mutual agreement among the students that the case analysis enhanced their critical thinking, decision-making, and problem-solving skills. The students’ statements from the interviews, including critical thinking, linking theory to practice, and holistic thinking, further supported these presumptions.

Discussion

This research presents the findings from a mixed methods study that explored undergraduate nursing students’ perceived efficacy of using case analysis as an assessment method. The perceived efficacy was reflected through acceptability and association with two core competencies: knowledge and cognitive skills. The study findings showed a high rate of perceived efficacy of case analysis as an assessment method among nursing students. Additionally, three findings were extracted from the qualitative data that further confirmed the perceived efficacy: (1) case analysis is a preferred assessment method for students compared to MCQs, (2) case analysis assesses students’ knowledge, and (3) case analysis assesses students’ cognitive skills. Moreover, the qualitative findings revealed details that expanded the understanding of the perceived efficacy among nursing students.

Previous literature reported students’ preference for case analysis as a teaching method. A randomized controlled study investigated students’ satisfaction with case-based teaching and compared outcomes between a traditional teaching group and a case-based teaching group. Most students favored case-based teaching, and the case-based group also had significantly better OSCE scores [ 34 ]. As noted, this favorable teaching method ultimately resulted in better learning outcomes and academic performance. Although it may be challenging since no answer options are provided, students appreciate the use of the case analysis format in their exams because it aligns better with the course objectives and expected clinical competencies. The reason behind students’ preference for case analysis is that it allows them to interact with the teaching content and visualize the problem, leading to a better understanding. When case analysis is used as an assessment method, students can connect the case scenario presented in the exam to their clinical training, making it more relevant.

In this study, students recognized the incorporation of nursing knowledge in the case analysis exam. They also acknowledged improved knowledge and learning abilities similar to those observed in case-based teaching. Bonney (2015) reported that students perceived increased learning gains and a better ability to identify links between different concepts and other aspects of life through case-based teaching [ 35 ]. Additionally, case analysis as an exam promotes students’ in-depth acquisition of knowledge through the type of preparation it entails. Literature suggests that case-based teaching promotes self-directed learning with high autonomous learning ability [ 34 , 36 ]. Thus, better achievement in the case analysis exam could be linked with a higher level of knowledge, making it a suitable assessment method for knowledge integration in nursing care.

The findings of this study suggest that case analysis can be a useful tool for evaluating students’ cognitive skills, such as critical thinking, decision-making, and problem-solving. A randomized controlled study reported better problem-solving abilities among students in the case-based learning group compared to those in the traditional teaching methods group [ 12 ]. Moreover, students in our study conveyed that the case analysis exam was an opportunity for them to think like real nurses. Similar to our findings, a qualitative study on undergraduate nutrition students found that case-based learning helped students develop professional competencies for their future practice, in addition to higher-level cognitive skills [ 37 ]. Therefore, testing students through case analysis allows educators to assess students’ readiness for entry-level professional competencies, including the thinking process. It also allows them to evaluate students’ higher-level cognitive skills according to Bloom’s taxonomy (analysis, synthesis, and evaluation), which educators often find challenging.

Case analysis as an assessment method for clinical courses is partially integrated in case presentation or OSCE evaluation methods. However, the written format is considered to be more beneficial for both assessment and learning processes. A qualitative study was conducted to examine the impact of paper-based case learning versus video-based case learning on clinical decision-making skills among midwifery students. The study revealed that students paid more attention and were able to focus better on the details when the case was presented in a paper format [ 38 ]. Concurrently, the students in our study recommended more paper-based exercises, which they believed would improve their academic performance.

This study has possible limitations. The sample size was small because of the program’s limited experience with case analysis as a clinical written exam. Future studies with larger sample sizes and diverse nursing courses are needed for better generalizability.

Implications

Little evidence relates to the efficacy of case analysis as an evaluation method, suggesting the novelty of this study. Despite the scarcity of case-based assessment studies, this study’s findings point to the potential efficacy of case analysis as an assessment method in nursing education. Future research is warranted to validate the effectiveness of case-analysis assessment methods and investigate the effects of case-analysis exams on academic and clinical performance.

Overall, our findings are consistent with evidence of students’ perceived efficacy of case analysis as a teaching method. This study suggests that case analysis, when used as an assessment method, can be acceptable to students and relevant to clinical competencies. Future research is needed to validate the effectiveness of case analysis exams in other nursing clinical courses and examine their effects on academic and clinical performance.

Data availability

The datasets used and/or analyzed during the current study are available from the Principal Investigator (BAY) upon reasonable request.

Iriarte-Roteta A, Lopez‐Dicastillo O, Mujika A, Ruiz‐Zaldibar C, Hernantes N, Bermejo‐Martins E, Pumar‐Méndez MJ. Nurses’ role in health promotion and prevention: a critical interpretive synthesis. J Clin Nurs. 2020;29(21–22):3937–49. https://doi.org/10.1111/jocn.15441

Fukada M. Nursing competency: definition, structure and development. Yonago Acta Med. 2018;61(1):001–7. https://doi.org/10.33160/yam.2018.03.001

Nabizadeh-Gharghozar Z, Alavi NM, Ajorpaz NM. Clinical competence in nursing: a hybrid concept analysis. Nurse Educ Today. 2021;97:104728. https://doi.org/10.1016/j.nedt.2020.104728

Allande-Cussó R, Fernández-García E, Porcel-Gálvez AM. Defining and characterising the nurse–patient relationship: a concept analysis. Nurs Ethics. 2021;29(2):462–84. https://doi.org/10.1177/09697330211046651

Butts JB, Rich KL. Nursing ethics: across the curriculum and into practice. Jones & Bartlett Learning; 2019.

Chen W, Shah UV, Brechtelsbauer C. A framework for hands-on learning in chemical engineering education—training students with the end goal in mind. Educ Chem Eng. 2019;28:25–9.

Willman A, Bjuresäter K, Nilsson J. Newly graduated registered nurses’ self-assessed clinical competence and their need for further training. Nurs Open. 2020;7(3):720–30. https://doi.org/10.1002/nop2.443

American Association of Colleges of Nursing. The essentials: core competencies for professional nursing education. 2021. https://www.aacnnursing.org/Portals/0/PDFs/Publications/Essentials-2021.pdf

Kaur G, Rehncy J, Kahal KS, Singh J, Sharma V, Matreja PS, Grewal H. Case-based learning as an effective tool in teaching pharmacology to undergraduate medical students in a large group setting. J Med Educ Curric Dev. 2020;7:2382120520920640.

Patiraki E, Katsaragakis S, Dreliozi A, Prezerakos P. Nursing care plans based on NANDA, nursing interventions classification, and nursing outcomes classification: the investigation of the effectiveness of an educational intervention in Greece. Int J Nurs Knowl. 2017;28:88–93.

Cui C, Li Y, Geng D, Zhang H, Jin C. The effectiveness of evidence-based nursing on development of nursing students ‘critical thinking: a meta-analysis. Nurse Educ Today. 2018;65:46–53.

Bi M, Zhao Z, Yang J, Wang Y. Comparison of case-based learning and traditional method in teaching postgraduate students of medical oncology. Med Teach. 2019;41(10):1124–8.

Seshan V, Matua GA, Raghavan D, Arulappan J, Al Hashmi I, Roach EJ, Prince EJ. Case study analysis as an effective teaching strategy: perceptions of undergraduate nursing students from a Middle Eastern Country. SAGE Open Nurs. 2021;7:23779608211059265.

Slieman TA, Camarata T. Case-based group learning using concept maps to achieve multiple educational objectives and behavioral outcomes. J Med Educ Curric Dev. 2019;6:2382120519872510.

Yu Z, Hu R, Ling S, Zhuang J, Chen Y, Chen M, Lin Y. Effects of blended versus offline case-centered learning on the academic performance and critical thinking ability of undergraduate nursing students: a cluster randomized controlled trial. Nurse Educ Pract. 2021;53:103080.

Chan AW, Chair SY, Sit JW, Wong EM, Lee DT, Fung OW. Case-based web learning versus face-to-face learning: a mixed-method study on university nursing students. J Nurs Res. 2016;24(1):31–40.

Hong S, Yu P. Comparison of the effectiveness of two styles of case-based learning implemented in lectures for developing nursing students’ critical thinking ability: a randomized controlled trial. Int J Nurs Stud. 2017;68:16–24.

Shohani M, Bastami M, Gheshlaghi LA, Nasrollahi A. Nursing student’s satisfaction with two methods of CBL and lecture-based learning. BMC Med Educ. 2023;23(1):1–5.

Tan KW. Using Teaching Cases for Achieving Bloom’s High-Order Cognitive Levels: An Application in Technically-Oriented Information Systems Course (2017). 2017 Proceedings. 1. http://aisel.aisnet.org/siged2017/1

Farashahi M, Tajeddin M. Effectiveness of teaching methods in business education: a comparison study on the learning outcomes of lectures, case studies and simulations. Int J Manage Educ. 2018;16(1):131–42.

Farha RJA, Zein MH, Al Kawas S. Introducing integrated case-based learning to clinical nutrition training and evaluating students’ learning performance. J Taibah Univ Med Sci. 2021;16(4):558–64.

Picciano AG. Theories and frameworks for Online Education: seeking an Integrated Model. Online Learn. 2017;213:166–90.

Bezanilla MJ, Fernández-Nogueira D, Poblete M, Galindo-Domínguez H. Methodologies for teaching-learning critical thinking in higher education: the teacher’s view. Think Skills Creativity. 2019;33:100584.

Immonen K, Oikarainen A, Tomietto M, Kääriäinen M, Tuomikoski A-M, Kaučič BM, Perez-Canaveras RM. Assessment of nursing students’ competence in clinical practice: a systematic review of reviews. Int J Nurs Stud. 2019;100:103414.

Oermann MH, Gaberson KB, De Gagne JC, NPD-BC C. Evaluation and testing in nursing education. Springer Publishing Company; 2024.

McCarty T. (2020). How to Build Assessments for Clinical Learners. Roberts Academic Medicine Handbook: A Guide to Achievement and Fulfillment for Academic Faculty, 83–90.

Gholami M, Changaee F, Karami K, Shahsavaripour Z, Veiskaramian A, Birjandi M. Effects of multiepisode case-based learning (CBL) on problem-solving ability and learning motivation of nursing students in an emergency care course. J Prof Nurs. 2021;37(3):612–9.

King N. (2016, April). Case-based exams for learning and assessment: Experiences in an information systems course [ Confeence presentation]. In 2016 IEEE Global Engineering Education Conference (EDUCON) , Abu Dhabi, UAE.

Pereira D, Flores MA, Niklasson L. Assessment revisited: a review of research in Assessment and evaluation in Higher Education. Assess Evaluation High Educ. 2016;41(7):1008–32.

Creswell JW, Poth CN. Qualitative inquiry and research design: choosing among five approaches. SAGE; 2016.

O’Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research. Acad Med. 2014;89(9):1245–51.

Gale NK, Heath G, Cameron E, Rashid S, Redwood S. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med Res Methodol. 2013;13(1). https://doi.org/10.1186/1471-2288-13-117

Fetters MD, Curry LA, Creswell JW. Achieving integration in mixed methods designs—principles and practices. Health Serv Res. 2013;48(6pt2):2134–56. https://doi.org/10.1111/1475-6773.12117

Liu L, Li M, Zheng Q, Jiang H. The effects of case-based teaching in nursing skill education: cases do matter. INQUIRY. J Health Care Organ Provis Financing. 2020;57:004695802096442.

Bonney KM. Case study teaching methods improve student performance and perceptions of learning gains. J Microbiol Biology Educ. 2015;16(1):21–8.

Rezaee R, Mosalanejad L. The effects of case-based team learning on students’ learning, self-regulation, and self-direction. Global J Health Sci. 2015;7(4):295.

Harman T, Bertrand B, Greer A, Pettus A, Jennings J, Wall-Bassett E, Babatunde OT. Case-based learning facilitates critical thinking in undergraduate nutrition education: students describe the big picture. J Acad Nutr Dietetics. 2015;115(3):378–88.

Nunohara K, Imafuku R, Saiki T, Bridges SM, Kawakami C, Tsunekawa K, Niwa M, Fujisaki K, Suzuki Y. (2020). How does video case-based learning influence clinical decision-making by midwifery students? An exploratory study. BMC Med Educ, 20 (1).

Download references

Acknowledgements

The authors wish to thank the nursing students at SQU who voluntarily participated in this study.

Funding

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Author information

Authors and Affiliations

Sultan Qaboos University, Al Khodh 66, Muscat, 123, Oman

Basma Mohammed Al Yazeedi, Lina Mohamed Wali Shakman, Sheeba Elizabeth John Sunderraj, Harshita Prabhakaran, Judie Arulappan, Erna Judith Roach, Aysha Al Hashmi & Zeinab Al Azri

Oman College of Health Sciences, North Sharqia Branch, Ibra 66, Ibra, 124, Oman

Aysha Al Hashmi


Contributions

Dr. Basma Mohammed Al Yazeedi contributed to conceptualization, methods, data collection, data analysis, writing the draft, and reviewing the final draft. Ms. Lina Mohamed Wali Shakman contributed to conceptualization, data collection, data analysis, writing the draft, and reviewing the final draft. Ms. Sheeba Elizabeth John Sunderraj contributed to conceptualization, methods, data collection, writing the draft, and reviewing the final draft. Ms. Harshita Prabhakaran contributed to conceptualization, data collection, writing the draft, and reviewing the final draft. Dr. Judie Arulappan contributed to conceptualization and reviewing the final draft. Dr. Erna Roach contributed to conceptualization, writing the draft, and reviewing the final draft. Ms. Aysha Al Hashmi contributed to conceptualization and reviewing the final draft. Dr. Zeinab Al Azri contributed to data collection, data analysis, writing the draft, and reviewing the final draft. All authors reviewed and approved the final version of the manuscript.

Corresponding author

Correspondence to Zeinab Al Azri .

Ethics declarations

Ethics Approval and Consent to Participate

The study was conducted in accordance with the Declaration of Helsinki. Ethical approval was obtained from the Research and Ethics Committee of the College of Nursing, Sultan Qaboos University (SQU) (CON/NF/2023/18). All data were held and stored in accordance with the SQU data retention policy. Informed consent to participate was obtained from all of the participants in the study.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1:

The questionnaire used in this study is attached as a supplementary document.

Supplementary Material 2

Rights and Permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Yazeedi, B.M.A., Shakman, L.M.W., Sunderraj, S.E.J. et al. Perceived efficacy of case analysis as an assessment method for clinical competencies in nursing education: a mixed methods study. BMC Nurs 23 , 441 (2024). https://doi.org/10.1186/s12912-024-02102-9


Received : 07 April 2024

Accepted : 17 June 2024

Published : 28 June 2024

DOI : https://doi.org/10.1186/s12912-024-02102-9


Keywords

  • Case-analysis
  • Clinical competency
  • Nursing education



IPC White Paper Emphasizes the Critical Importance of ‘Design for Excellence’ Throughout the Full Ecosystem of Electronics Design

A new white paper from IPC’s Chief Technologist Council (CTC) and Design Leadership Council (DLC), Better Electronics by Design: Next Generation Design Needs, explores the elements of the “Design for Excellence” methodology, rethinking how it needs to be further defined and applied in the full ecosystem of electronics design.

In addition, the white paper provides a high-level exploration of the full “silicon-to-systems” ecosystem, examines the justification and implications of an Authoritative Source of Truth (ASOT), discusses the need for synergy between the building blocks of electronic systems, and takes a deep dive into design rules and “design for manufacturability.”

Key messages within the white paper include:

  • Electronic systems are increasingly complex and heterogeneous.
  • Facilitation of an Authoritative Source of Truth (ASOT) is mandatory to ensure consistency, efficiency, and traceability.
  • The industry needs a more robust and interoperable toolset to support greater silicon-to-systems collaboration.
  • Diverse electronic and mechanical CAD systems must gain interoperability by including proper model-based definitions (MBD) and collaborative software to cover both disciplines.
  • A shift towards true digital collaboration and automation and an early consideration of manufacturing capabilities across the development process is essential to master increasing time-to-market and complexity challenges.

“Creating better electronics by design is a common, collective goal of the electronics industry,” said Peter Tranitz, IPC senior director, technology solutions, and leader of IPC’s design initiative. “To achieve this, an ASOT should be established and protocols, standardized. To leverage the associated benefits, a culture of real digital collaboration, transparency, and accountability needs to be established. Tools need to support bi-directional, incremental exchange of information. And design rules need to be broken down to the relevant stages of the design workflow and checks need to be performed after every stage of the design process to drive the concept of ‘shift left.’ For effective Design for Manufacturability execution, manufacturers need to provide clear guidance on manufacturing capabilities and constraints to designers.”

Download the report:   https://go.ipc.org/next-gen-design .




Early Biblical Fundamentalism’s Xenophobic Rejection of the Subject in European Philosophy: How Rejecting the Knowing Subject Formed Fundamentalism’s Way of Thinking


1. Introduction

2. Anti-German Xenophobia

3. “German Fancies”

4. Nineteenth-Century Anglo-American Philosophical Tradition

5. Reid and Baconism

If there are certain principles, as I think there are, which the constitution of our nature leads us to believe, and which we are under a necessity to take for granted in the common concerns of life, without being able to give a reason for them; these are what we call the principles of common sense; and what is manifestly contrary to them, is what we call absurd. ( Reid 1852, p. 108 )
I like Biblical theology that does not start with the superficial Aristotelian method of reason, that does not begin with an hypothesis, and then warp the facts and the philosophy to fit the crook of our dogma, but a Baconian system, which first gathers the teachings of the word of God, and then seeks to deduce some general law upon which the facts can be arranged.

6. Baconian Common Sense

7. Rejection of Key Enlightenment Values

8. Rejection of Subjectivity

9. The Outcome: Fundamentalism’s Common Sense Realism

10. Conclusions

Institutional Review Board Statement

Informed Consent Statement

Data Availability Statement

Conflicts of Interest

  • Adams, Ray H. 1933. Preachers Present Arms. New York: Round Table Press.
  • Bacon, Francis. 1973. The Advancement of Learning. Edited by George William Kitchin. Introduction by Arthur Johnston. London: J.M. Dent and Sons.
  • Barr, James. 1980. The Fundamentalist Understanding of Scripture. Concilium 138: 70–74.
  • Calvin, John. 1989. Institutes of the Christian Religion. Translated by Henry Beveridge. Grand Rapids: Wm. B. Eerdmans.
  • Darwin, Charles. 1958. The Autobiography of Charles Darwin 1809–1882, with Original Omissions Restored. Edited with Appendix and Notes by His Grand-Daughter Nora Barlow. New York: W.W. Norton & Company.
  • Dobson, Ed, Edward E. Hindson, and Jerry Falwell. 1986. The Fundamentalist Phenomenon, 2nd ed. Grand Rapids: Baker Book House.
  • Ferrell, Robert H. 1985. Woodrow Wilson and World War I, 1917–1921. New York: Harper and Row.
  • Gaussen, Samuel R. L. 1842. Plenary Inspiration of the Holy Scriptures. Translated by Edward Norris Kirk. New York: John S. Taylor & Co.
  • Harris, Harriet A. 1998. Fundamentalism and Evangelicalism. Oxford: Oxford University Press.
  • Hofstadter, Richard. 1964. Anti-Intellectualism in American Life. London: Jonathan Cape. First published 1963. New York: Knopf.
  • Huff, Peter A. 2000. The Challenge of Fundamentalism for Interreligious Dialogue. CrossCurrents 50: 94–102. Available online: http://www.jstor.org/stable/24461235 (accessed on 12 February 2024).
  • Lightner, Robert. 1995. Handbook of Evangelical Theology. Grand Rapids: Kregel Publications. First published 1986, as Evangelical Theology.
  • Lonergan, Bernard. 1988. Collection, Collected Works of Bernard Lonergan, Volume 4, 2nd ed. revised and augmented. Edited by Frederick E. Crowe and Robert M. Doran. Toronto: University of Toronto Press.
  • Lonergan, Bernard. 1992. Insight: A Study of Human Understanding. Collected Works of Bernard Lonergan, Volume 3, 5th ed. rev. and augmented. Edited by Frederick E. Crowe and Robert M. Doran. Toronto: University of Toronto Press.
  • Lonergan, Bernard. 2004. Philosophical and Theological Papers 1965–1980. Collected Works of Bernard Lonergan, Volume 17. Edited by Robert C. Croken and Robert M. Doran. Toronto: University of Toronto Press.
  • Lonergan, Bernard. 2016. A Second Collection. Collected Works of Bernard Lonergan, Volume 13. Edited by William F. J. Ryan and Bernard J. Tyrell. Toronto: University of Toronto Press.
  • Marsden, George. 1991. Understanding Fundamentalism and Evangelicalism. Grand Rapids: Wm. B. Eerdmans Publishing.
  • Marsden, George M. 2022. Fundamentalism and American Culture, 3rd ed. New York: Oxford University Press.
  • Marty, Martin E., and Robert Scott Appleby, eds. 1991. Fundamentalisms Observed (The Fundamentalism Project, Volume 1). Chicago: University of Chicago Press.
  • Marty, Martin E., and Robert Scott Appleby, eds. 1995. Fundamentalisms Comprehended (The Fundamentalism Project, Volume 5). Chicago: University of Chicago Press.
  • McBrien, Richard P. 1994. Catholicism. Revised and updated edition. North Blackburn: Collins Dove.
  • Mencken, Henry Louis. 1926. Prejudices: Fifth Series. New York: A. A. Knopf. Available online: https://hdl.handle.net/2027/mdp.39015004722354 (accessed on 15 August 2023).
  • Meyer, Ben F. 1989. Critical Realism and the New Testament. Allison Park: Pickwick Publications.
  • Meynell, Hugo A. 1991. On being an Aristotelian. Heythrop Journal 32: 233–48.
  • Morris, Henry M. 1971. The Bible Has the Answer. Grand Rapids: Baker Book House. Reprinted 1994.
  • Nichols, Ryan. 2014. “Thomas Reid,” Stanford Encyclopedia of Philosophy. (First published 2000). Available online: https://plato.stanford.edu/entries/reid/ (accessed on 15 December 2023).
  • Ogilvie, Matthew C. 2001. Faith Seeking Understanding: The Functional Specialty ‘Systematics’ in Bernard Lonergan’s Method in Theology. Milwaukee: Marquette University Press.
  • Ogilvie, Matthew C., ed. 2019. “The Truth Will Set You Free” Part 2: What is the Truth that Sets us Free? In Australian Lonergan Workshop III. Sydney: Sydney Lonergan Centre.
  • Packer, James Innell. 1992. ‘Fundamentalism’ and the Word of God: Some Evangelical Principles. Grand Rapids: Wm. B. Eerdmans. First published 1958.
  • Packer, James Innell. 1993. God Has Spoken: Revelation and the Bible. Revised and Enlarged Edition. London: Hodder and Stoughton. First edition, 1965.
  • Peterková, Eva. 2015. Thomas Reid and Notion of Common Sense. Ostium 11. Available online: https://ostium.sk/language/sk/thomas-reid-and-notion-of-common-sense/ (accessed on 16 March 2024).
  • Pontifical Biblical Commission. 1993. The Interpretation of the Bible in the Church. Available online: https://www.bc.edu/content/dam/files/research_sites/cjl/texts/cjrelations/resources/documents/catholic/pbcinterpretation.htm (accessed on 15 February 2023).
  • Reid, Thomas. 1852. The Works of Thomas Reid, D.D.; Now Fully Collected, with Selections from His Unpublished Letters. Preface by Sir William Hamilton. Edinburgh: Maclachlan and Stewart.
  • Shea, William M., ed. 1993. The Struggle Over the Past: Fundamentalism in the Modern World. Lanham: College Theology Society and University Press of America.
  • Torrey, Reuben A., and Amzi C. Dixon, eds. 1917. The Fundamentals: A Testimony to Truth. (Four Volume Set). Los Angeles: Bible Institute of Los Angeles. Reprinted 1996 by Baker Books, Grand Rapids, MI.
  • Warfield, Benjamin B. 1948. The Inspiration and Authority of the Bible. Edited by Samuel G. Craig. Phillipsburg: Presbyterian and Reformed Publishing Company.
  • Waugh, Earle H. 1997. Review of Fundamentalism: Harbinger of Academic Revisionism? Journal of the American Academy of Religion 65: 161–68. Available online: http://www.jstor.org/stable/1465823 (accessed on 13 March 2023).
  • Wood, Simon A., and David Harrington Watt, eds. 2014. Fundamentalism: Perspectives on a Contested History. Columbia: University of South Carolina Press.
The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

Ogilvie, M.C. Early Biblical Fundamentalism’s Xenophobic Rejection of the Subject in European Philosophy: How Rejecting the Knowing Subject Formed Fundamentalism’s Way of Thinking. Religions 2024 , 15 , 790. https://doi.org/10.3390/rel15070790

Ogilvie MC. Early Biblical Fundamentalism’s Xenophobic Rejection of the Subject in European Philosophy: How Rejecting the Knowing Subject Formed Fundamentalism’s Way of Thinking. Religions . 2024; 15(7):790. https://doi.org/10.3390/rel15070790

Ogilvie, Matthew C. 2024. "Early Biblical Fundamentalism’s Xenophobic Rejection of the Subject in European Philosophy: How Rejecting the Knowing Subject Formed Fundamentalism’s Way of Thinking" Religions 15, no. 7: 790. https://doi.org/10.3390/rel15070790


COMMENTS

  1. What is critical thinking?

    Critical thinking is a kind of thinking in which you question, analyse, interpret, evaluate and make a judgement about what you read, hear, say, or write. The term critical comes from the Greek word kritikos meaning "able to judge or discern". Good critical thinking is about making reliable judgements based on reliable information.

  2. What Is Critical Thinking?

    Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions. Critical thinking is important in all disciplines and throughout all stages of the research ...

  3. Critical thinking

    Critical thinking is the analysis of available facts, evidence, observations, and arguments in order to form a judgement by the application of rational, skeptical, and unbiased analyses and evaluation. The application of critical thinking includes self-directed, self-disciplined, self-monitored, and self-corrective habits of the mind; thus, a critical thinker is a person who practices the ...

  4. Defining Critical Thinking

    Critical thinking is, in short, self-directed, self-disciplined, self-monitored, and self-corrective thinking. It presupposes assent to rigorous standards of excellence and mindful command of their use. It entails effective communication and problem solving abilities and a commitment to overcome our native egocentrism and sociocentrism.

  5. Critical Thinking

    Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking ...

  6. Critical Thinking

    Critical Thinking. Critical Thinking is the process of using and assessing reasons to evaluate statements, assumptions, and arguments in ordinary situations. ... Critical theorists may use critical thinking methodology, but their subject matter is distinct, and they also may offer critical analyses of critical thinking itself. Table of Contents.

  7. Critical Thinking

    Critical thinking is the discipline of rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. You'll need to actively question every step of your thinking process to do it well. Collecting, analyzing and evaluating information is an important skill in life, and a highly ...

  8. Critical thinking

    From the turn of the 20th century, he and others working in the overlapping fields of psychology, philosophy, and educational theory sought to rigorously apply the scientific method to understand and define the process of thinking. They conceived critical thinking to be related to the scientific method but more open, flexible, and self ...

  9. Critical Thinking and Decision-Making

    Simply put, critical thinking is the act of deliberately analyzing information so that you can make better judgements and decisions. It involves using things like logic, reasoning, and creativity, to draw conclusions and generally understand things better. This may sound like a pretty broad definition, and that's because critical thinking is a ...

  10. Critical Thinking > Educational Methods (Stanford Encyclopedia of

    Educational Methods. Experiments have shown that educational interventions can improve critical thinking abilities and dispositions, as measured by standardized tests. Glaser (1941) developed teaching materials suitable for senior primary school, high school and college students. To test their effectiveness, he developed with his sponsor ...

  11. Critical Thinking: Where to Begin

    A Brief Definition: Critical thinking is the art of analyzing and evaluating thinking with a view to improving it. A well-cultivated critical thinker: communicates effectively with others in figuring out solutions to complex problems. Critical thinking is, in short, self-directed, self-disciplined, self-monitored, and self-corrective thinking.

  12. What Are Critical Thinking Skills and Why Are They Important?

    It makes you a well-rounded individual, one who has looked at all of their options and possible solutions before making a choice. According to the University of the People in California, having critical thinking skills is important because they are [ 1 ]: Universal. Crucial for the economy. Essential for improving language and presentation skills.

  13. Critical Thinking

    Approach problems in a consistent and systematic way. Reflect on the justification of their own assumptions, beliefs and values. ... Critical thinking requires a clear, often uncomfortable, assessment of your personal strengths, weaknesses and preferences and their possible impact on decisions you may make.

  14. Critical Thinking Definition, Skills, and Examples

    Critical thinking refers to the ability to analyze information objectively and make a reasoned judgment. It involves the evaluation of sources, such as data, facts, observable phenomena, and research findings. Good critical thinkers can draw reasonable conclusions from a set of information, and discriminate between useful and less useful ...

  15. Introduction to Logic and Critical Thinking

    This is an introductory textbook in logic and critical thinking. The goal of the textbook is to provide the reader with a set of tools and skills that will enable them to identify and evaluate arguments. The book is intended for an introductory course that covers both formal and informal logic. As such, it is not a formal logic textbook, but is closer to what one would find marketed as a ...

  16. a guide to creative and critical thinking

    The open step goes on to outline some of the critical thinking processes that tie into the definitions we've seen. These critical thinking skills include: Analysing and weighing up arguments. Evaluating evidence that has been presented. Distinguishing between fact and opinion.

  17. Critical Thinking and Problem-Solving

    Critical thinking involves asking questions, defining a problem, examining evidence, analyzing assumptions and biases, avoiding emotional reasoning, avoiding oversimplification, considering other interpretations, and tolerating ambiguity. Dealing with ambiguity is also seen by Strohm & Baukus (1995) as an essential part of critical thinking ...

  18. The Relationship Between Scientific Method & Critical Thinking

    Critical thinking initiates the act of hypothesis. In the scientific method, the hypothesis is the initial supposition, or theoretical claim about the world, based on questions and observations. If critical thinking asks the question, then the hypothesis is the best attempt at the time to answer the question using observable phenomenon.

  19. PDF A Systematic Process for Critical Thinking

    Critical thinking can sometimes be a convoluted and mysterious process; this resource provides a systematic, critical thinking method that makes it a lot less puzzling. We know that critical thinking is necessary and crucial for our work, but how do we get to the "Oh! I know!" or "Hmmm, what if we…" moments?

  20. Methodologies for teaching-learning critical thinking in higher

    Marín-Calderón (2014) sets out the need to define a teaching approach including different types of activities to develop critical thinking. In this sense, it would be key to design a strategy or sequence of activities that ensure a meaningful learning of critical thinking, rather than talking about methodologies and activities of teaching ...

  21. Using Critical Thinking in Essays and other Assignments

    Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement. Active and skillful approach, evaluation, assessment, synthesis, and/or evaluation of information obtained from, or made by, observation, knowledge, reflection, acumen or conversation, as a guide to belief and ...

  22. Guide to Critical Thinking: Learn to Use Critical Thinking Skills

    Level Up Your Team. See why leading organizations rely on MasterClass for learning & development. Many decision-making and problem-solving tasks require critical thinking skills, which entail the ability to analyze information to reach a rational conclusion.

  23. Understanding the Complex Relationship between Critical Thinking and

    Like the other studies, this work highlights the value of targeting critical-thinking skills and the effectiveness of an inquiry-based, writing-to-learn approach to enhance critical thinking. Across studies, authors advocate adopting critical thinking as the course framework ( Pukkila, 2004 ) and developing explicit examples of how critical ...

  24. Perceived efficacy of case analysis as an assessment method for

    Case analysis is a dynamic and interactive teaching and learning strategy that improves critical thinking and problem-solving skills. However, there is limited evidence about its efficacy as an assessment strategy in nursing education. This study aimed to explore nursing students' perceived efficacy of case analysis as an assessment method for clinical competencies in nursing education.

  25. IPC White Paper Emphasizes the Critical Importance of 'Design for

    A new white paper from IPC's Chief Technologist (CTC) and Design Leadership (DLC) Councils, Better Electronics by Design: Next Generation Design Needs explores the elements of the "Design for Excellence" methodology, re-thinking how it needs to be further defined and applied in the full ecosystem of electronics design. In addition, the whitepaper provides a high-level exploration of the ...

  26. Religions

    This article thus shows how Fundamentalism radically opposes God and human beings, and faith and philosophy, with the resulting way of thinking that can be characterised as "naïve realism", an approach to thinking that excludes the active thinking subject and does not allow for critical judgement or personal understanding.