Critical Thinking Skills Toolbox


CTS Tools for Faculty and Student Assessment


A number of critical thinking skills inventories and measures have been developed:

  • Watson-Glaser Critical Thinking Appraisal (WGCTA)
  • Cornell Critical Thinking Test
  • California Critical Thinking Disposition Inventory (CCTDI)
  • California Critical Thinking Skills Test (CCTST)
  • Health Science Reasoning Test (HSRT)
  • Professional Judgment Rating Form (PJRF)
  • Teaching for Thinking Student Course Evaluation Form
  • Holistic Critical Thinking Scoring Rubric
  • Peer Evaluation of Group Presentation Form

With the exception of the Watson-Glaser Critical Thinking Appraisal and the Cornell Critical Thinking Test, the instruments listed above were developed by Facione and Facione. It is important to point out, however, that all of these measures are of questionable utility for dental educators because their content is general rather than specific to dental education. (See Critical Thinking and Assessment.)

Table 7. Purposes of Critical Thinking Skills Instruments

Watson-Glaser Critical Thinking Appraisal-FS (WGCTA-FS): Assesses participants' skills in five subscales: inference, recognition of assumptions, deduction, interpretation, and evaluation of arguments.
Cornell Critical Thinking Test (CCTT): Measures test takers' skills in induction, credibility, prediction and experimental planning, fallacies, and deduction.
California Critical Thinking Disposition Inventory (CCTDI): Assesses test takers' consistent internal motivations to engage in critical thinking.
California Critical Thinking Skills Test (CCTST): Provides objective measures of participants' skills in six subscales (analysis, inference, explanation, interpretation, self-regulation, and evaluation) and an overall score for critical thinking.
Health Science Reasoning Test (HSRT): Assesses the critical thinking skills of health science professionals and students; measures analysis, evaluation, inference, and inductive and deductive reasoning.
Professional Judgment Rating Form (PJRF): Measures the extent to which novices approach problems with CTS; can be used to assess the effectiveness of training programs for individual or group evaluation.
Teaching for Thinking Student Course Evaluation Form: Used by students to rate the perceived critical thinking skills content of secondary and postsecondary classroom experiences.
Holistic Critical Thinking Scoring Rubric: Used by professors and students to rate learning outcomes or presentations on critical thinking skills and dispositions; the rubric can capture the types of target behaviors, qualities, or products that professors are interested in evaluating.
Peer Evaluation of Group Presentation Form: A common set of criteria used by peers and the instructor to evaluate student-led group presentations.

Reliability and Validity

Reliability means that individual scores from an instrument should be the same, or nearly the same, from one administration of the instrument to another; the instrument can then be assumed to be free of bias and measurement error (68). Alpha coefficients are often used to report an estimate of internal consistency. Coefficients of .70 or higher indicate that the instrument is sufficiently reliable when the stakes are moderate; coefficients of .80 or higher are appropriate when the stakes are high.
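For illustration, internal consistency can be estimated from an item-response matrix with a few lines of Python. The following is a minimal sketch of Cronbach's alpha; the data are hypothetical:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical data: five respondents answering a four-item scale
scores = np.array([
    [4, 5, 4, 5],
    [2, 3, 2, 2],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [1, 2, 1, 2],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")  # .70+ for moderate stakes, .80+ for high stakes
```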

Validity means that individual scores from a particular instrument are meaningful, make sense, and allow researchers to generalize from the sample to the population being studied (69). Researchers often refer to "content" or "face" validity: the extent to which the questions on an instrument are representative of the full set of questions a researcher could ask about that particular content or skill.

Watson-Glaser Critical Thinking Appraisal-FS (WGCTA-FS)

The WGCTA-FS is a 40-item inventory created to replace Forms A and B of the original test, which participants reported were too long (70). The inventory assesses test takers' skills in:

(a) Inference: whether an individual can judge the probable truth or falsity of inferences drawn from given data
(b) Recognition of assumptions: whether an individual recognizes whether assumptions are clearly stated
(c) Deduction: whether an individual can decide if certain conclusions follow from the information provided
(d) Interpretation: whether an individual weighs the evidence provided and determines whether generalizations from the data are warranted
(e) Evaluation of arguments: whether an individual distinguishes strong and relevant arguments from weak and irrelevant ones

Researchers investigated the reliability and validity of the WGCTA-FS for subjects in academic fields. Participants included 586 university students. Internal consistencies for the total WGCTA-FS among undergraduate and graduate students majoring in psychology, educational psychology, and special education ranged from .74 to .92. Correlations between course grades and total WGCTA-FS scores for all groups ranged from .24 to .62 and were significant at p < .05 or p < .01. In addition, internal consistency and test-retest reliability for the WGCTA-FS have been measured at .81. The WGCTA-FS was found to be a reliable and valid instrument for measuring critical thinking (71).
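Correlational evidence of this kind can be computed with scipy's pearsonr; a minimal sketch, where the scores and grades are hypothetical:

```python
from scipy.stats import pearsonr

# Hypothetical data: WGCTA-FS total scores and course grades for ten students
wgcta_fs = [28, 31, 25, 35, 30, 22, 33, 27, 36, 29]
grades   = [78, 85, 70, 92, 80, 65, 88, 74, 95, 79]

r, p = pearsonr(wgcta_fs, grades)
print(f"r = {r:.2f}, p = {p:.4f}")  # conventionally significant if p < .05
```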

Cornell Critical Thinking Test (CCTT)

There are two forms of the CCTT: X and Z. Form X is for students in grades 4-14. Form Z is for advanced and gifted high school students, undergraduate and graduate students, and adults. Reliability estimates for Form Z range from .49 to .87 across the 42 groups that have been tested. Measures of validity were computed under standard conditions, roughly defined as conditions that do not adversely affect test performance. Correlations between Level Z and other measures of critical thinking are about .50 (72). The CCTT is reportedly as predictive of graduate school grades as the Graduate Record Examination (GRE), a measure of aptitude, and the Miller Analogies Test, and tends to correlate with them between .2 and .4 (73).

California Critical Thinking Disposition Inventory (CCTDI)

Facione and Facione have reported significant relationships between the CCTDI and the CCTST. When faculty focus on critical thinking in planning curriculum development, modest cross-sectional and longitudinal gains have been demonstrated in students' CTS (74). The CCTDI consists of seven subscales and an overall score. The recommended cut-off score for each scale is 40, the suggested target score is 50, and the maximum score is 60. Scores below 40 on a given scale indicate weakness in that CT disposition, and scores above 50 indicate strength in that dispositional aspect. An overall score below 280 shows serious deficiency in disposition toward CT, while an overall score above 350 (while rare) shows across-the-board strength. The seven subscales are analyticity, self-confidence, inquisitiveness, maturity, open-mindedness, systematicity, and truth seeking (75).
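These published cut-offs translate directly into a simple interpretation rule. The sketch below is illustrative only; the function names are not part of the CCTDI:

```python
def interpret_cctdi_scale(score: int) -> str:
    """Interpret one CCTDI subscale score (published cut-off 40, target 50, max 60)."""
    if score < 40:
        return "weak in this disposition"
    if score > 50:
        return "strong in this disposition"
    return "positively disposed, short of the target of 50"

def interpret_cctdi_overall(total: int) -> str:
    """Interpret the overall CCTDI score (sum of the seven subscales, max 420)."""
    if total < 280:
        return "serious deficiency in disposition toward critical thinking"
    if total > 350:
        return "across-the-board strength (rare)"
    return "generally positive disposition"

print(interpret_cctdi_scale(44))
print(interpret_cctdi_overall(300))
```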

In a study of instructional strategies and their influence on the development of critical thinking among undergraduate nursing students, Tiwari, Lai, and Yuen found that, compared with lecture students, PBL students showed significantly greater improvement in the overall CCTDI (p = .0048) and the Truth seeking (p = .0008), Analyticity (p = .0368), and Critical Thinking Self-confidence (p = .0342) subscales from the first to the second time points; in the overall CCTDI (p = .0083) and the Truth seeking (p = .0090) and Analyticity (p = .0354) subscales from the second to the third time points; and in the Truth seeking (p = .0173) and Systematicity (p = .0440) subscale scores from the first to the fourth time points (76).

California Critical Thinking Skills Test (CCTST)

Studies have shown that the California Critical Thinking Skills Test captures gain scores in students' critical thinking over one quarter or one semester. Multiple health science programs have demonstrated significant gains in students' critical thinking using site-specific curricula. Studies conducted to control for re-test bias showed no testing effect from pre- to post-test means using two independent groups of CT students. Since behavioral science measures can be affected by social-desirability bias (the participant's desire to answer in ways that would please the researcher), researchers are urged to have participants take the Marlowe-Crowne Social Desirability Scale at the same time when measuring pre- to post-test changes in critical thinking skills. The CCTST is a 34-item instrument. It has been correlated with the CCTDI in a sample of 1,557 nursing education students: r = .201, with the relationship between the CCTST and the CCTDI significant at p < .001. Significant relationships between the CCTST and other measures, including the GRE total, GRE-Analytic, GRE-Verbal, GRE-Quantitative, the WGCTA, and the SAT Math and Verbal, have also been reported. The two forms of the CCTST, A and B, are considered statistically equivalent. Depending on the testing context, KR-20 alphas range from .70 to .75. The newest version is CCTST Form 2000; depending on the testing context, its KR-20 alphas range from .78 to .84 (77).

The Health Science Reasoning Test (HSRT)

Items within this inventory cover the domain of CT cognitive skills identified by a Delphi group of experts whose work resulted in the development of the CCTDI and CCTST. This test measures health science undergraduate and graduate students' CTS. Although test items are set in health sciences and clinical practice contexts, test takers are not required to have discipline-specific health sciences knowledge. For this reason, the test may have limited utility in dental education (78).

Preliminary estimates of internal consistency show overall KR-20 coefficients ranging from .77 to .83 (79). The instrument has moderate reliability on the analysis and inference subscales, although the factor loadings appear adequate. The low KR-20 coefficients may be a result of small sample size, variance in item response, or both (see Table 8).

Table 8. Estimates of Internal Consistency and Factor Loading by Subscale for HSRT

Subscale | KR-20 | Factor Loadings
Inductive | .76 | .332-.769
Deductive | .71 | .366-.579
Analysis | .54 | .369-.599
Inference | .52 | .300-.664
Evaluation | .77 | .359-.758
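The KR-20 statistic cited above is the dichotomous-item analogue of Cronbach's alpha. A minimal sketch with hypothetical right/wrong data:

```python
import numpy as np

def kr20(responses: np.ndarray) -> float:
    """Kuder-Richardson 20 for an (n_test_takers, n_items) matrix of 0/1 scores."""
    k = responses.shape[1]
    p = responses.mean(axis=0)                      # proportion correct per item
    q = 1 - p
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1 - (p * q).sum() / total_var)

# Hypothetical data: six test takers on a five-item multiple-choice test (1 = correct)
answers = np.array([
    [1, 1, 1, 0, 1],
    [0, 1, 0, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 0],
    [0, 0, 0, 0, 1],
])
print(f"KR-20 = {kr20(answers):.2f}")
```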

Professional Judgment Rating Form (PJRF)

The scale consists of two sets of descriptors. The first set relates primarily to the attitudinal (habits of mind) dimension of CT. The second set relates primarily to CTS.

A single rater should know the student well enough to respond to at least 17 of the 20 descriptors with confidence; if not, the validity of the ratings may be questionable. If a single rater is used and ratings over time show some consistency, comparisons between ratings may be used to assess changes. If more than one rater is used, then inter-rater reliability must be established among the raters to yield meaningful results. While the PJRF can be used to assess the effectiveness of training programs for individuals or groups, participants' actual skills are best measured by an objective tool such as the California Critical Thinking Skills Test.
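One common way to establish inter-rater reliability is Cohen's kappa, which corrects raw agreement for chance. A minimal sketch with hypothetical ratings (the 1-5 scale here is an assumption, not taken from the PJRF manual):

```python
import numpy as np

def cohen_kappa(r1, r2) -> float:
    """Chance-corrected agreement between two raters' category assignments."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    observed = np.mean(r1 == r2)  # raw proportion of agreement
    expected = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in np.union1d(r1, r2))
    return (observed - expected) / (1 - expected)

# Hypothetical data: two raters scoring ten students on a 1-5 scale
rater_a = [4, 3, 5, 2, 4, 3, 5, 4, 2, 3]
rater_b = [4, 3, 4, 2, 4, 3, 5, 4, 3, 3]
print(f"kappa = {cohen_kappa(rater_a, rater_b):.2f}")  # 1 = perfect, 0 = chance level
```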

Teaching for Thinking Student Course Evaluation Form

Course evaluations typically ask for responses of "agree" or "disagree" to items focusing on teacher behavior; the questions do not usually solicit information about student learning. Because contemporary thinking about curriculum is centered on student learning, this form was developed to address the differences in pedagogy, subject matter, learning outcomes, student demographics, and course level that are characteristic of education today. The form also grew out of a recognition of the limitations of the "one size fits all" approach to teaching evaluations. It offers information about how a particular course enhances student knowledge, sensitivities, and dispositions, and it gives students an opportunity to provide feedback that can be used to improve instruction.

Holistic Critical Thinking Scoring Rubric

This assessment tool uses a four-point classification schema that lists opposing reasoning skills for selected criteria. One advantage of a rubric is that it offers clearly delineated components and scales for evaluating outcomes. This rubric explains how students' CTS will be evaluated, and it provides a consistent framework for the professor as evaluator. Users can add or delete statements to reflect their institution's effort to measure CT. Like most rubrics, this form is likely to have high face validity, since the items tend to be relevant to or descriptive of the target concept. The rubric can be used to rate student work or to assess learning outcomes. Experienced evaluators should engage in a process leading to consensus regarding what kinds of things should be classified and in what ways (80). If the rubric is used improperly or by inexperienced evaluators, it may yield unreliable results.

Peer Evaluation of Group Presentation Form

This form offers a common set of criteria used by peers and the instructor to evaluate student-led group presentations with respect to concepts, analysis of arguments or positions, and conclusions (81). Users rate the degree to which each component was demonstrated, and open-ended questions give them an opportunity to cite examples of how concepts, the analysis of arguments or positions, and conclusions were demonstrated.

Table 9. Proposed Universal Criteria for Evaluating Students' Critical Thinking Skills

  • Accuracy
  • Adequacy
  • Clarity
  • Completeness
  • Consistency
  • Depth
  • Fairness
  • Logic
  • Precision
  • Realism
  • Relevance
  • Significance
  • Specificity

In addition to the assessment tools described above, Dexter et al. recommended that all schools develop universal criteria for evaluating students' development of critical thinking skills (82).

Their rationale for the proposed criteria is that if faculty give feedback using these criteria, graduates will internalize them and use them to monitor their own thinking and practice (see Table 9).


Critical Thinking Self Assessment

Critical thinking self-assessment is an evaluation of one's ability to think critically and analyze a situation. It seeks to understand how someone reasons and makes decisions, as well as their ability to think objectively and logically. It usually involves a series of questions or activities designed to measure the individual's skills in areas such as problem-solving, decision-making, creativity, and analytical ability.

2 minutes to complete

Eligibility

Eligibility to complete a Critical Thinking Self Assessment includes being at least 18 years of age, having a basic understanding of logical reasoning and critical thinking concepts, and having access to a computer or other device with internet access.


Questions for Critical Thinking Self Assessment

  • I look for evidence before believing claims.
  • I consider issues from different perspectives.
  • I feel confident presenting my own arguments even when they challenge the views of others.
  • I actively seek evidence that might counter what I already know.
  • My opinions are influenced by evidence rather than just personal experience and emotion.
  • If I am not sure about something, I will research to find out more.
  • I know how to search for reliable information to develop my knowledge of a topic.
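The page does not state a scoring rule. Assuming each statement is rated on a 1-5 agreement scale, a minimal scoring sketch might look like this:

```python
# Hypothetical scoring: each of the seven statements rated 1 (never) to 5 (always)
statements = [
    "I look for evidence before believing claims",
    "I consider issues from different perspectives",
    "I feel confident presenting my own arguments",
    "I actively seek evidence that might counter what I already know",
    "My opinions are influenced by evidence rather than emotion",
    "If I am not sure about something, I will research to find out more",
    "I know how to search for reliable information",
]
ratings = [4, 5, 3, 2, 4, 5, 4]  # one respondent's hypothetical answers

total, maximum = sum(ratings), 5 * len(statements)
print(f"Self-assessment score: {total}/{maximum} ({100 * total / maximum:.0f}%)")
```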

Assessments Similar to Critical Thinking Self Assessment

  • Critical Thinking Assessment Tool
  • Critical Thinking Skills Assessment
  • Critical Thinking Evaluation Form
  • Critical Thinking Skills Survey
  • Critical Thinking Ability Test
  • Critical Thinking Competency Test

Here are some FAQs and additional information on the Critical Thinking Self Assessment.

What is critical thinking?

Critical thinking is the ability to think clearly and rationally, understanding the logical connections between ideas. It involves the evaluation of sources such as data, facts, observable phenomena, and research findings. Critical thinking also involves analyzing and synthesizing information from various sources in order to make informed decisions and come to sound conclusions.

How can I assess my critical thinking skills?

There are a variety of self-assessment tools available to help you assess your critical thinking skills. These tools typically involve answering questions about your approach to problem-solving and decision-making.

How can I improve my critical thinking skills?

Improving your critical thinking skills requires actively engaging in activities that challenge you to think critically. Examples include reading, discussing, and debating topics with others; taking time to reflect on your thoughts and ideas; and questioning assumptions and biases.



Supplement to Critical Thinking

How can one assess, for purposes of instruction or research, the degree to which a person possesses the dispositions, skills and knowledge of a critical thinker?

In psychometrics, assessment instruments are judged according to their validity and reliability.

Roughly speaking, an instrument is valid if it measures accurately what it purports to measure, given standard conditions. More precisely, the degree of validity is “the degree to which evidence and theory support the interpretations of test scores for proposed uses of tests” (American Educational Research Association 2014: 11). In other words, a test is not valid or invalid in itself. Rather, validity is a property of an interpretation of a given score on a given test for a specified use. Determining the degree of validity of such an interpretation requires collection and integration of the relevant evidence, which may be based on test content, test takers’ response processes, a test’s internal structure, relationship of test scores to other variables, and consequences of the interpretation (American Educational Research Association 2014: 13–21). Criterion-related evidence consists of correlations between scores on the test and performance on another test of the same construct; its weight depends on how well supported is the assumption that the other test can be used as a criterion. Content-related evidence is evidence that the test covers the full range of abilities that it claims to test. Construct-related evidence is evidence that a correct answer reflects good performance of the kind being measured and an incorrect answer reflects poor performance.

An instrument is reliable if it consistently produces the same result, whether across different forms of the same test (parallel-forms reliability), across different items (internal consistency), across different administrations to the same person (test-retest reliability), or across ratings of the same answer by different people (inter-rater reliability). Internal consistency should be expected only if the instrument purports to measure a single undifferentiated construct, and thus should not be expected of a test that measures a suite of critical thinking dispositions or critical thinking abilities, assuming that some people are better in some of the respects measured than in others (for example, very willing to inquire but rather closed-minded). Otherwise, reliability is a necessary but not a sufficient condition of validity; a standard example of a reliable instrument that is not valid is a bathroom scale that consistently under-reports a person’s weight.
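Of the reliability types just listed, test-retest reliability is the most straightforward to compute: correlate scores from two administrations of the same instrument. A minimal sketch with hypothetical scores:

```python
import numpy as np

# Hypothetical data: the same eight people taking the same test two weeks apart
first  = [42, 37, 50, 29, 45, 33, 48, 40]
second = [44, 35, 49, 31, 46, 34, 47, 38]

# Test-retest reliability is typically reported as the correlation between administrations
r = np.corrcoef(first, second)[0, 1]
print(f"test-retest r = {r:.2f}")
```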

Assessing dispositions is difficult if one uses a multiple-choice format with known adverse consequences of a low score. It is pretty easy to tell what answer to the question “How open-minded are you?” will get the highest score and to give that answer, even if one knows that the answer is incorrect. If an item probes less directly for a critical thinking disposition, for example by asking how often the test taker pays close attention to views with which the test taker disagrees, the answer may differ from reality because of self-deception or simple lack of awareness of one’s personal thinking style, and its interpretation is problematic, even if factor analysis enables one to identify a distinct factor measured by a group of questions that includes this one (Ennis 1996). Nevertheless, Facione, Sánchez, and Facione (1994) used this approach to develop the California Critical Thinking Dispositions Inventory (CCTDI). They began with 225 statements expressive of a disposition towards or away from critical thinking (using the long list of dispositions in Facione 1990a), validated the statements with talk-aloud and conversational strategies in focus groups to determine whether people in the target population understood the items in the way intended, administered a pilot version of the test with 150 items, and eliminated items that failed to discriminate among test takers or were inversely correlated with overall results or added little refinement to overall scores (Facione 2000). They used item analysis and factor analysis to group the measured dispositions into seven broad constructs: open-mindedness, analyticity, cognitive maturity, truth-seeking, systematicity, inquisitiveness, and self-confidence (Facione, Sánchez, and Facione 1994). The resulting test consists of 75 agree-disagree statements and takes 20 minutes to administer. A repeated disturbing finding is that North American students taking the test tend to score low on the truth-seeking sub-scale (on which a low score results from agreeing to such statements as the following: “To get people to agree with me I would give any reason that worked”. “Everyone always argues from their own self-interest, including me”. “If there are four reasons in favor and one against, I’ll go with the four”.) Development of the CCTDI made it possible to test whether good critical thinking abilities and good critical thinking dispositions go together, in which case it might be enough to teach one without the other. Facione (2000) reports that administration of the CCTDI and the California Critical Thinking Skills Test (CCTST) to almost 8,000 post-secondary students in the United States revealed a statistically significant but weak correlation between total scores on the two tests, and also between paired sub-scores from the two tests. The implication is that both abilities and dispositions need to be taught, that one cannot expect improvement in one to bring with it improvement in the other.
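The item-winnowing step described above (dropping items that fail to discriminate or that correlate inversely with overall results) is conventionally operationalized as the corrected item-total correlation. A minimal sketch with hypothetical pilot data; the function is illustrative, not the Faciones' actual procedure:

```python
import numpy as np

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Correlation of each item with the sum of the remaining items."""
    totals = items.sum(axis=1)
    return np.array([
        np.corrcoef(items[:, j], totals - items[:, j])[0, 1]
        for j in range(items.shape[1])
    ])

# Hypothetical pilot data: six respondents, four agree-disagree items (1-6 scale)
pilot = np.array([
    [5, 4, 2, 5],
    [2, 2, 5, 1],
    [4, 5, 3, 4],
    [6, 5, 1, 6],
    [3, 3, 4, 2],
    [1, 2, 6, 1],
])
print(corrected_item_total(pilot).round(2))  # items near zero or negative would be dropped
```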

A more direct way of assessing critical thinking dispositions would be to see what people do when put in a situation where the dispositions would reveal themselves. Ennis (1996) reports promising initial work with guided open-ended opportunities to give evidence of dispositions, but no standardized test seems to have emerged from this work. There are however standardized aspect-specific tests of critical thinking dispositions. The Critical Problem Solving Scale (Berman et al. 2001: 518) takes as a measure of the disposition to suspend judgment the number of distinct good aspects attributed to an option judged to be the worst among those generated by the test taker. Stanovich, West and Toplak (2011: 800–810) list tests developed by cognitive psychologists of the following dispositions: resistance to miserly information processing, resistance to myside thinking, absence of irrelevant context effects in decision-making, actively open-minded thinking, valuing reason and truth, tendency to seek information, objective reasoning style, tendency to seek consistency, sense of self-efficacy, prudent discounting of the future, self-control skills, and emotional regulation.

It is easier to measure critical thinking skills or abilities than to measure dispositions. The following eight currently available standardized tests purport to measure them: the Watson-Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), the Cornell Critical Thinking Tests Level X and Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), the Ennis-Weir Critical Thinking Essay Test (Ennis & Weir 1985), the California Critical Thinking Skills Test (Facione 1990b, 1992), the Halpern Critical Thinking Assessment (Halpern 2016), the Critical Thinking Assessment Test (Center for Assessment & Improvement of Learning 2017), the Collegiate Learning Assessment (Council for Aid to Education 2017), the HEIghten Critical Thinking Assessment (https://territorium.com/heighten/), and a suite of critical thinking assessments for different groups and purposes offered by Insight Assessment (https://www.insightassessment.com/products). The Critical Thinking Assessment Test (CAT) is unique among them in being designed for use by college faculty to help them improve their development of students’ critical thinking skills (Haynes et al. 2015; Haynes & Stein 2021). Also, for some years the United Kingdom body OCR (Oxford Cambridge and RSA Examinations) awarded AS and A Level certificates in critical thinking on the basis of an examination (OCR 2011). Many of these standardized tests have received scholarly evaluations at the hands of, among others, Ennis (1958), McPeck (1981), Norris and Ennis (1989), Fisher and Scriven (1997), Possin (2008, 2013a, 2013b, 2013c, 2014, 2020) and Hatcher and Possin (2021). Their evaluations provide a useful set of criteria that such tests ideally should meet, as does the description by Ennis (1984) of problems in testing for competence in critical thinking: the soundness of multiple-choice items, the clarity and soundness of instructions to test takers, the information and mental processing used in selecting an answer to a multiple-choice item, the role of background beliefs and ideological commitments in selecting an answer to a multiple-choice item, the tenability of a test’s underlying conception of critical thinking and its component abilities, the set of abilities that the test manual claims are covered by the test, the extent to which the test actually covers these abilities, the appropriateness of the weighting given to various abilities in the scoring system, the accuracy and intellectual honesty of the test manual, the interest of the test to the target population of test takers, the scope for guessing, the scope for choosing a keyed answer by being test-wise, precautions against cheating in the administration of the test, clarity and soundness of materials for training essay graders, inter-rater reliability in grading essays, and clarity and soundness of advance guidance to test takers on what is required in an essay. Rear (2019) has challenged the use of standardized tests of critical thinking as a way to measure educational outcomes, on the grounds that  they (1) fail to take into account disputes about conceptions of critical thinking, (2) are not completely valid or reliable, and (3) fail to evaluate skills used in real academic tasks. He proposes instead assessments based on discipline-specific content.

There are also aspect-specific standardized tests of critical thinking abilities. Stanovich, West and Toplak (2011: 800–810) list tests of probabilistic reasoning, insights into qualitative decision theory, knowledge of scientific reasoning, knowledge of rules of logical consistency and validity, and economic thinking. They also list instruments that probe for irrational thinking, such as superstitious thinking, belief in the superiority of intuition, over-reliance on folk wisdom and folk psychology, belief in “special” expertise, financial misconceptions, overestimation of one’s introspective powers, dysfunctional beliefs, and a notion of self that encourages egocentric processing. They regard these tests along with the previously mentioned tests of critical thinking dispositions as the building blocks for a comprehensive test of rationality, whose development (they write) may be logistically difficult and would require millions of dollars.

A superb example of assessment of an aspect of critical thinking ability is the Test on Appraising Observations (Norris & King 1983, 1985, 1990a, 1990b), which was designed for classroom administration to senior high school students. The test focuses entirely on the ability to appraise observation statements and in particular on the ability to determine in a specified context which of two statements there is more reason to believe. According to the test manual (Norris & King 1985, 1990b), a person’s score on the multiple-choice version of the test, which is the number of items that are answered correctly, can justifiably be given either a criterion-referenced or a norm-referenced interpretation.

On a criterion-referenced interpretation, those who do well on the test have a firm grasp of the principles for appraising observation statements, and those who do poorly have a weak grasp of them. This interpretation can be justified by the content of the test and the way it was developed, which incorporated a method of controlling for background beliefs articulated and defended by Norris (1985). Norris and King synthesized from judicial practice, psychological research and common-sense psychology 31 principles for appraising observation statements, in the form of empirical generalizations about tendencies, such as the principle that observation statements tend to be more believable than inferences based on them (Norris & King 1984). They constructed items in which exactly one of the 31 principles determined which of two statements was more believable. Using a carefully constructed protocol, they interviewed about 100 students who responded to these items in order to determine the thinking that led them to choose the answers they did (Norris & King 1984). In several iterations of the test, they adjusted items so that selection of the correct answer generally reflected good thinking and selection of an incorrect answer reflected poor thinking. Thus they have good evidence that good performance on the test is due to good thinking about observation statements and that poor performance is due to poor thinking about observation statements. Collectively, the 50 items on the final version of the test require application of 29 of the 31 principles for appraising observation statements, with 13 principles tested by one item, 12 by two items, three by three items, and one by four items. Thus there is comprehensive coverage of the principles for appraising observation statements. Fisher and Scriven (1997: 135–136) judge the items to be well worked and sound, with one exception. The test is clearly written at a grade 6 reading level, meaning that poor performance cannot be attributed to difficulties in reading comprehension by the intended adolescent test takers. The stories that frame the items are realistic, and are engaging enough to stimulate test takers’ interest. Thus the most plausible explanation of a given score on the test is that it reflects roughly the degree to which the test taker can apply principles for appraising observations in real situations. In other words, there is good justification of the proposed interpretation that those who do well on the test have a firm grasp of the principles for appraising observation statements and those who do poorly have a weak grasp of them.

To get norms for performance on the test, Norris and King arranged for seven groups of high school students in different types of communities and with different levels of academic ability to take the test. The test manual includes percentiles, means, and standard deviations for each of these seven groups. These norms allow teachers to compare the performance of their class on the test to that of a similar group of students.

Copyright © 2022 by David Hitchcock <hitchckd@mcmaster.ca>



INSIGHT BASECAMP: The new comprehensive and innovative platform for developing your critical thinking.

Insight Assessment’s high-quality, expertly designed, and interactive critical thinking self-development tools are now available for the first time directly to individuals through INSIGHT BASECAMP. Adults, teens, and children can gain the reasoning skills and habits of mind that will last them a lifetime. With new short courses, quizzes, and surveys designed by experts in the training of critical thinking, everyone can build their critical thinking skills and fortify their positive thinking habits of mind in as little as one hour.

What you will find at INSIGHT BASECAMP

Insight Basecamp offers a wide range of courses and activities to help individuals hone their critical thinking skills and mindset. These courses, quizzes, and surveys are designed so that individuals can identify their own strengths and weaknesses, develop strategies for improving their critical thinking skills, and gain the confident mindset necessary to make sound decisions in their everyday and professional lives.

critical thinking self assessment tool

Navigating Misinformation: Critical Thinking in the Harris vs. Trump Election As Vice President Kamala Harris enters the presidential race and…

' src=

In today’s rapidly evolving educational landscape, the question of whether college is “worth it” has become increasingly common. For educators…

Is AI Data-Driven HR Right for Your Business?

Oracle HR technology has immense potential in Human Resources, bringing the same evidence-based approach that has succeeded in fields like…

' src=

  • Learning Environments
  • Teaching, learning and assessment

Self-assessment

Self-assessment is a formative assessment approach, which moves the responsibility for learning from the educator to the student.

What is it?

Self-assessment is a learning activity that gives students a structure to generate feedback on their own outputs or skills. It is a great way to prompt students to think critically about their work and make them aware of their own learning. Regular self-assessment engages students in metacognition and supports them in becoming self-regulated learners. In a task-specific context, students can assess their draft or a component of a larger task. This will help students to improve their understanding of the task at hand and set themselves up well for the upcoming summative assessment. Assessment rubrics can provide a structure to a self-assessment task and prompt students to generate self-feedback.

Why is it useful?

Benefits for students

  • Gives students a chance to practise in a no-stakes environment and with little to no social pressure
  • Promotes active engagement with assessment criteria, thereby developing students' understanding of the intended learning outcomes
  • Encourages self-reflection and metacognition and, in turn, develops students’ capacity for independent learning.

Benefits for educators

  • Allows you to clarify assessment instructions by giving students a chance to have a go at the task and ask questions based on their first attempt.

How do I implement it?

To implement self-assessment in your teaching, try these strategies:

  • Engage students in self-assessment by highlighting its value to their learning. It is important to introduce self-assessment as an opportunity for students to improve their understanding of the assessment task and the quality of their work.
  • Provide opportunities for students to self-assess at various stages of the assessment task. For example, when working on a research essay, students can be first prompted to self-assess their argument and/or plan before assessing the full draft. This will make self-assessment a regular practice and provide the necessary scaffolding for students to gradually get used to the self-assessment protocols.
  • Prepare students to self-assess by stepping them through the process. A task-specific assessment rubric will help structure self-assessment and guide students through it. Having a rubric for self-assessment allows students to generate self-feedback and translate it into actionable steps to work on in preparation for the summative assessment. For example, if during self-assessment a student identifies flaws in their essay argument, a well-designed analytic rubric will guide them towards actionable steps to improve their argument.
  • Use assessment exemplars to model self-assessment. Exemplars can be a great way to introduce students to the expectations for self-assessment. Before performing a rubric-referenced self-assessment, students can be asked to assess the exemplar based on the same rubric they will be later using to assess their own work.

Supporting technologies

  • FeedbackFruits is a user-friendly tool that can facilitate both self-assessment of work and self-assessment of skill. It makes it easy to structure a purposeful self-assessment activity where students grade themselves on a set of pre-determined criteria. Additionally, FeedbackFruits provides scaffolded assessment options where self-assessment can be added as a pre and/or post reflection step.
  • PebblePad offers opportunities for self-assessment as part of a larger task, iterative task, such as a portfolio, a blog, a placement, or a workbook. For example, rubrics, checklists, and Likert scales can be integrated into a workbook or a placement activity for a student to check their progress or measure the attainment of skill.
  • The University’s own Assessment Literacy Tool is an online platform for students to engage with sample assignments and compare their marking decisions with the evaluations provided by the teaching team. This can serve as a perfect preparation for self-assessment. The tool allows you to upload a rubric for students to assess a sample assignment against. Importantly, the tool also gets students to justify their selection. Upon providing their justifications and rubric selections, students can compare their responses to the teaching team’s assessment. Submit a ServiceNow request to add the tool to your subject’s LMS.



Development and Validation of a Critical Thinking Assessment-Scale Short Form


1. Introduction

2. Materials and Methods
2.1. Shortening of the CTSAS
2.2. Participants
2.3. Instruments and Procedures
2.3.1. Translation of the CTSAS Short Form into Different Languages
2.3.2. Data Collection
2.4. Statistical Analysis
3.1. Descriptive Analysis of Items
3.2. Confirmatory Factor Analysis (CFA) and Reliability

  • Model 1: One-factor model. This model tests the existence of one global factor on critical thinking skills, which explains the variances of the 60 variables.
  • Model 2: Six-factor (non-correlated) model. This model tests the existence of six non-correlated factors that explain the variance of the set of items.
  • Model 3: Six-factor (correlated) model. This model tests the existence of six correlated latent factors, each one explaining the variance of a set of items.
  • Model 4: Second-order factor model. This model represents the original model proposed by Nair [36], in which a global critical-thinking-skills construct explains the variance of the six latent skills, each of which, in turn, explains a set of items.
  • Model 5: Bi-factor model. This model tests the possibility that the variances of the 60 scale items are explained by a global critical-thinking-skills construct and by the six latent skills independently.
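For readers who wish to see how such competing factor structures are specified, they can be written in lavaan-style model syntax, which tools such as R's lavaan or Python's semopy accept. The sketch below is illustrative only: the item names i1-i60 and the ten-items-per-factor assignment are placeholders, not the actual CTSAS item mapping.

```python
# lavaan-style model syntax, as accepted by R's lavaan or Python's semopy.
# Item names i1..i60 and the ten-items-per-factor split are placeholders.
factors = ["Interpretation", "Analysis", "Evaluation",
           "Inference", "Explanation", "SelfRegulation"]
items = {f: [f"i{k}" for k in range(j * 10 + 1, j * 10 + 11)]
         for j, f in enumerate(factors)}

# Model 1: one global factor explaining all 60 items.
model_1 = "CT =~ " + " + ".join(f"i{k}" for k in range(1, 61))

# Model 3: six correlated factors (this syntax correlates factors by default;
# Model 2 would additionally fix all between-factor covariances to zero).
model_3 = "\n".join(f"{f} =~ " + " + ".join(items[f]) for f in factors)

# Model 4: second-order model, a global factor explaining the six skills.
model_4 = model_3 + "\nCT =~ " + " + ".join(factors)

# Model 5 (bi-factor) would load every item on CT and on its own skill factor,
# with CT held orthogonal to the six skill factors.
print(model_4)
```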

3.3. Multigroup Invariance

4. Discussion
Author Contributions
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest

Item | Mean | SD | Skew. | Kurt. | K-S Test | p
1. I try to figure out the content of the problem. | 5.04 | 0.958 | −0.744 | −0.232 | 0.152 | 1.000
2. I classify data using a framework. | 3.89 | 1.319 | −0.452 | −0.140 | 0.994 | 0.276
3. I break the complex ideas into manageable sub-ideas. | 3.96 | 1.357 | −0.467 | −0.049 | 0.718 | 0.682
4. I observe the facial expression people use in a given situation. | 4.63 | 1.380 | −1.071 | 0.715 | 0.914 | 0.374
5. I examine the values rooted in the information presented. | 4.12 | 1.284 | −0.532 | −0.172 | 0.754 | 0.620
6. I restate another person’s statements to clarify the meaning. | 3.63 | 1.515 | −0.359 | −0.545 | 0.762 | 0.607
7. I figure out an example which explains the concept/opinion. | 4.53 | 1.097 | −0.785 | 0.550 | 0.601 | 0.863
8. I clarify my thoughts by explaining to someone else. | 4.29 | 1.348 | −0.803 | 0.203 | 0.864 | 0.445
9. I seek clarification of the meanings of another’s opinion or points of view. | 4.23 | 1.185 | −0.483 | −0.196 | 0.718 | 0.682
10. I examine the similarities and differences among the opinions posed for a given problem. | 4.23 | 1.166 | −0.742 | 0.765 | 0.518 | 0.951
11. I examine the interrelationships among concepts or opinions posed. | 3.84 | 1.222 | −0.364 | 0.101 | 0.629 | 0.823
12. I look for supporting reasons when examining opinions. | 4.44 | 1.174 | −0.692 | 0.436 | 0.640 | 0.808
13. I look for relevant information to answer the question at issue. | 4.62 | 1.147 | −0.855 | 0.657 | 0.651 | 0.790
14. I examine the proposals for solving a given problem. | 4.65 | 1.089 | −0.626 | −0.100 | 0.260 | 1.000
15. I ask questions in order to seek evidence to support or refute the author’s claim. | 4.09 | 1.341 | −0.566 | −0.084 | 1.041 | 0.229
16. I figure out if author’s arguments include both for and against the claim. | 3.97 | 1.316 | −0.433 | −0.229 | 1.044 | 0.226
17. I figure out unstated assumptions in one’s reasoning for a claim. | 3.63 | 1.289 | −0.287 | −0.190 | 0.723 | 0.673
18. I look for the overall structure of the argument. | 3.99 | 1.332 | −0.580 | 0.136 | 0.864 | 0.444
19. I figure out the process of reasoning for an argument. | 4.02 | 1.306 | −0.578 | 0.253 | 0.381 | 0.999
20. I figure out the assumptions implicit in the author’s reasoning. | 3.73 | 1.275 | −0.436 | −0.032 | 0.828 | 0.500
21. I assess the contextual relevance of an opinion or claim posed. | 4.00 | 1.192 | −0.493 | 0.387 | 0.810 | 0.528
22. I seek the accuracy of the evidence supporting a given judgment. | 4.18 | 1.283 | −0.693 | 0.306 | 0.858 | 0.453
23. I assess the chances of success or failure in using a premise to conclude an argument. | 4.08 | 1.344 | −0.599 | −0.007 | 1.120 | 0.163
24. I examine the logical strength of the underlying reason in an argument. | 4.06 | 1.295 | −0.464 | −0.030 | 0.919 | 0.367
25. I search for new data to confirm or refute a given claim. | 4.15 | 1.288 | −0.644 | 0.142 | 0.708 | 0.698
26. I search for additional information that might support or weaken an argument. | 4.34 | 1.195 | −0.520 | −0.206 | 0.435 | 0.992
27. I examine the logical reasoning of an objection to a claim. | 4.17 | 1.310 | −0.552 | 0.025 | 0.883 | 0.417
28. I seek useful information to refute an argument when supported by unsure reasons. | 4.37 | 1.186 | −0.655 | 0.478 | 0.314 | 1.000
29. I collect evidence supporting the availability of information to back up opinions. | 4.21 | 1.317 | −0.771 | 0.585 | 0.794 | 0.554
30. I seek for evidence/information before accepting a solution. | 4.49 | 1.241 | −0.729 | 0.176 | 0.355 | 1.000
31. I figure out alternate hypotheses/questions, when I need to solve a problem. | 4.21 | 1.311 | −0.645 | 0.166 | 1.042 | 0.228
32. Given a problem to solve, I develop a set of options for solving the problem. | 4.33 | 1.255 | −0.685 | 0.234 | 0.683 | 0.739
33. I systematically analyse the problem using multiple sources of information to draw inferences. | 4.11 | 1.381 | −0.596 | −0.103 | 0.325 | 1.000
34. I figure out the merits and demerits of a solution while prioritizing from alternatives for making decisions. | 4.01 | 1.320 | −0.455 | −0.130 | 0.812 | 0.525
35. I identify the consequences of various options to solving a problem. | 4.36 | 1.208 | −0.558 | −0.009 | 0.625 | 0.830
36. I arrive at conclusions that are supported with strong evidence. | 4.30 | 1.164 | −0.328 | −0.484 | 0.490 | 0.970
37. I use both deductive and inductive reasoning to interpret information. | 4.00 | 1.330 | −0.419 | −0.259 | 0.766 | 0.600
38. I analyse my thinking before jumping to conclusions. | 4.39 | 1.335 | −0.710 | 0.065 | 0.437 | 0.991
39. I confidently reject an alternative solution when it lacks evidence. | 3.89 | 1.417 | −0.312 | −0.587 | 0.541 | 0.932
40. I figure out the pros and cons of a solution before accepting it. | 4.64 | 1.175 | −0.721 | 0.216 | 0.710 | 0.695
41. I can describe the results of a problem using inferential evidence. | 3.78 | 1.206 | −0.269 | 0.068 | 0.701 | 0.709
42. I can logically present results to address a given problem. | 4.18 | 1.138 | −0.425 | 0.111 | 1.533 | 0.018
43. I state my choice of using a particular method to solve the problem. | 4.03 | 1.277 | −0.530 | 0.164 | 0.305 | 1.000
44. I can explain a key concept to clarify my thinking. | 4.10 | 1.246 | −0.408 | −0.141 | 0.585 | 0.883
45. I write essays with adequate arguments supported with reasons for a given policy or situation. | 3.13 | 1.734 | −0.208 | −0.966 | 0.833 | 0.492
46. I anticipate reasonable criticisms one might raise against one’s viewpoints. | 3.92 | 1.319 | −0.438 | −0.340 | 0.730 | 0.661
47. I respond to reasonable criticisms one might raise against one’s viewpoints. | 3.82 | 1.292 | −0.456 | −0.055 | 1.772 | 0.004
48. I clearly articulate evidence for my own viewpoints. | 4.22 | 1.159 | −0.353 | −0.283 | 0.195 | 1.000
49. I present more evidence or counter evidence for another’s points of view. | 3.61 | 1.338 | −0.258 | −0.540 | 0.664 | 0.770
50. I provide reasons for rejecting another’s claim. | 4.04 | 1.400 | −0.535 | −0.309 | 1.255 | 0.086
51. I reflect on my opinions and reasons to ensure my premises are correct. | 4.43 | 1.136 | −0.442 | −0.421 | 0.540 | 0.932
52. I review sources of information to ensure important information is not overlooked. | 4.26 | 1.317 | −0.628 | −0.074 | 1.009 | 0.260
53. I examine and consider ideas and viewpoints even when others do not agree. | 4.20 | 1.156 | −0.380 | −0.235 | 0.174 | 1.000
54. I examine my values, thoughts/beliefs based on reasons and evidence. | 4.41 | 1.159 | −0.455 | −0.151 | 0.143 | 1.000
55. I continuously assess my targets and work towards achieving them. | 4.46 | 1.182 | −0.472 | −0.367 | 0.354 | 1.000
56. I review my reasons and reasoning process in coming to a given conclusion. | 4.18 | 1.187 | −0.349 | −0.236 | 0.415 | 0.995
57. I analyze areas of consistencies and inconsistencies in my thinking. | 4.01 | 1.294 | −0.448 | −0.192 | 0.926 | 0.358
58. I willingly revise my work to correct my opinions and beliefs. | 4.27 | 1.263 | −0.457 | −0.172 | 0.663 | 0.772
59. I continually revise and rethink strategies to improve my thinking. | 4.34 | 1.280 | −0.601 | −0.073 | 0.683 | 0.739
60. I reflect on my thinking to improve the quality of my judgment. | 4.53 | 1.187 | −0.805 | 0.752 | 0.235 | 1.000
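Item-level statistics of the kind tabulated above can be reproduced with scipy. The sketch below uses hypothetical responses and assumes the Kolmogorov-Smirnov test is run against a normal distribution, which the paper may have operationalized differently:

```python
import numpy as np
from scipy import stats

# Hypothetical responses to one item on a 1-6 agreement scale
x = np.array([5, 4, 6, 5, 3, 5, 4, 6, 2, 5, 5, 4])

mean, sd = x.mean(), x.std(ddof=1)
skew = stats.skew(x, bias=False)
kurt = stats.kurtosis(x, bias=False)            # excess kurtosis
ks = stats.kstest((x - mean) / sd, "norm")      # K-S test against a standard normal
print(f"mean={mean:.2f} sd={sd:.3f} skew={skew:.3f} kurt={kurt:.3f} "
      f"KS={ks.statistic:.3f} p={ks.pvalue:.3f}")
```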
Item | Factor loading

Interpretation
1. I try to figure out the content of the problem. | 0.662
2. I classify data using a framework. | 0.661
3. I break the complex ideas into manageable sub-ideas. | 0.633
4. I observe the facial expressions people use in a given situation. | 0.386
5. I examine the values rooted in the information presented. | 0.654
6. I restate another person’s statements to clarify the meaning. | 0.499
7. I figure out an example which explains the concept/opinion. | 0.594
8. I clarify my thoughts by explaining to someone else. | 0.422
9. I seek clarification of the meanings of another’s opinion or points of view. | 0.536

Analysis
10. I examine the similarities and differences among the opinions posed for a given problem. | 0.614
11. I examine the interrelationships among concepts or opinions posed. | 0.734
12. I look for supporting reasons when examining opinions. | 0.671
13. I look for relevant information to answer the question at issue. | 0.650
14. I examine the proposals for solving a given problem. | 0.701
15. I ask questions in order to seek evidence to support or refute the author’s claim. | 0.666
16. I figure out if the author’s arguments include both for and against the claim. | 0.670
17. I figure out unstated assumptions in one’s reasoning for a claim. | 0.619
18. I look for the overall structure of the argument. | 0.707
19. I figure out the process of reasoning for an argument. | 0.772
20. I figure out the assumptions implicit in the author’s reasoning. | 0.745

Evaluation
21. I assess the contextual relevance of an opinion or claim posed. | 0.723
22. I seek the accuracy of the evidence supporting a given judgment. | 0.735
23. I assess the chances of success or failure in using a premise to conclude an argument. | 0.702
24. I examine the logical strength of the underlying reason in an argument. | 0.725
25. I search for new data to confirm or refute a given claim. | 0.674
26. I search for additional information that might support or weaken an argument. | 0.732
27. I examine the logical reasoning of an objection to a claim. | 0.761

Inference
28. I seek useful information to refute an argument when supported by unsure reasons. | 0.717
29. I collect evidence supporting the availability of information to back up opinions. | 0.740
30. I seek for evidence/information before accepting a solution. | 0.691
31. I figure out alternate hypotheses/questions, when I need to solve a problem. | 0.734
32. Given a problem to solve, I develop a set of options for solving the problem. | 0.710
33. I systematically analyse the problem using multiple sources of information to draw inferences. | 0.738
34. I figure out the merits and demerits of a solution while prioritizing from alternatives for making decisions. | 0.742
35. I identify the consequences of various options to solving a problem. | 0.704
36. I arrive at conclusions that are supported with strong evidence. | 0.756
37. I use both deductive and inductive reasoning to interpret information. | 0.696
38. I analyse my thinking before jumping to conclusions. | 0.636
39. I confidently reject an alternative solution when it lacks evidence. | 0.470
40. I figure out the pros and cons of a solution before accepting it. | 0.656

Explanation
41. I can describe the results of a problem using inferential evidence. | 0.745
42. I can logically present results to address a given problem. | 0.749
43. I state my choice of using a particular method to solve the problem. | 0.672
44. I can explain a key concept to clarify my thinking. | 0.740
45. I write essays with adequate arguments supported with reasons for a given policy or situation. | 0.511
46. I anticipate reasonable criticisms one might raise against one’s viewpoints. | 0.606
47. I respond to reasonable criticisms one might raise against one’s viewpoints. | 0.650
48. I clearly articulate evidence for my own viewpoints. | 0.720
49. I present more evidence or counter evidence for another’s points of view. | 0.573
50. I provide reasons for rejecting another’s claim. | 0.536

Self-Regulation
51. I reflect on my opinions and reasons to ensure my premises are correct. | 0.719
52. I review sources of information to ensure important information is not overlooked. | 0.785
53. I examine and consider ideas and viewpoints even when others do not agree. | 0.705
54. I examine my values, thoughts/beliefs based on reasons and evidence. | 0.756
55. I continuously assess my targets and work towards achieving them. | 0.673
56. I review my reasons and reasoning process in coming to a given conclusion. | 0.728
57. I analyze areas of consistencies and inconsistencies in my thinking. | 0.737
58. I willingly revise my work to correct my opinions and beliefs. | 0.750
59. I continually revise and rethink strategies to improve my thinking. | 0.786
60. I reflect on my thinking to improve the quality of my judgment. | 0.763
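The loadings above come from the authors' factor analyses. As a rough illustration of how such a solution can be explored on one's own data, the snippet below fits an exploratory six-factor model with the factor_analyzer package; this is only a sketch under assumed inputs (hypothetical file name, oblique rotation), not the confirmatory procedure reported in the article:

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical respondent-by-item file with one column per item.
items = pd.read_csv("ctsas_item_scores.csv")

# Six factors with an oblique rotation, since the six skills are
# expected to correlate (compare the correlated six-factor CFA model).
fa = FactorAnalyzer(n_factors=6, rotation="oblimin")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns,
                        columns=[f"F{i}" for i in range(1, 7)])
print(loadings.round(3))  # factor labels are assigned by inspection
```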
Skills | Cronbach's Alpha | Sub-Skills | Std. Cronbach's Alpha
Interpretation | 0.772 | Categorization | 0.670
 | | Clarifying meaning | 0.673
 | | Decoding significance | 0.473
Analysis | 0.888 | Detecting arguments | 0.632
 | | Analyzing arguments | 0.812
 | | Examining ideas | 0.799
Evaluation | 0.858 | Assessing claims | 0.723
 | | Assessing arguments | 0.821
Inference | 0.905 | Drawing conclusions | 0.743
 | | Conjecturing alternatives | 0.843
 | | Querying evidence | 0.752
Explanation | 0.853 | Stating results | 0.688
 | | Justifying procedures | 0.681
 | | Presenting arguments | 0.778
Self-regulation | 0.905 | Self-examining | 0.860
 | | Self-correction | 0.834
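The Cronbach's alpha values in this table follow a single formula: with k items, α = k/(k − 1) × (1 − Σσᵢ²/σ_total²), where σᵢ² are the item variances and σ_total² is the variance of the summed scale. A self-contained sketch with toy data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k_items) array of item scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Toy data: 5 respondents x 3 items (illustration only).
demo = np.array([
    [4, 5, 4],
    [3, 3, 2],
    [5, 5, 5],
    [2, 3, 3],
    [4, 4, 5],
])
print(round(cronbach_alpha(demo), 3))  # ~0.917 for this toy matrix
```

The "Std." column appears to report the standardized variant, which corresponds to applying the same formula to z-scored items.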
CTSAS Dimensions (Skills – Sub-Skills) | Items in the Original CTSAS | Eliminated Items | Items in the CTSAS Short-Form
Interpretation – Categorization | 1–9 | 2, 4, 6–8 | 1–3
Interpretation – Clarifying meaning | 15–21 | 18–20 | 6–9
Interpretation – Decoding significance | 10–14 | 10, 12, 14 | 4, 5
Analysis – Detecting arguments | 28–33 | 32, 33 | 15, 16
Analysis – Analyzing arguments | 34–49 | 34, 39 | 17–20
Analysis – Examining ideas | 22–27 | 27–29 | 10–14
Evaluation – Assessing claims | 40–44 | 40–42 | 21, 22
Evaluation – Assessing arguments | 45–52 | 46, 50, 52 | 23–27
Inference – Drawing conclusions | 67–74 | 67, 68, 73 | 36–40
Inference – Conjecturing alternatives | 60–66 | 62, 65 | 31–35
Inference – Querying evidence | 53–59 | 53, 54, 58, 59 | 28–30
Explanation – Stating results | 75–79 | 76, 77, 79 | 41, 42
Explanation – Justifying procedures | 80–88 | 81, 83–88 | 43, 44
Explanation – Presenting arguments | 89–96 | 95, 96 | 45–50
Self-regulation – Self-examination | 97–105 | 98, 104 | 51–57
Self-regulation – Self-correction | 106–115 | 107, 109–111, 113–115 | 58–60
Model | χ² (df) | p | RMSEA [90% CI] | CFI | TLI
Model 1: 1-factor model | 5159.412 (1710) | <0.0001 | 0.061 [0.059–0.063] | 0.893 | 0.890
Model 2: 6-factor model (non-correlated) | 29,275.338 (1710) | <0.0001 | 0.174 [0.172–0.176] | 0.148 | 0.118
Model 3: 6-factor model (correlated) | 3871.243 (1695) | <0.0001 | 0.049 [0.047–0.051] | 0.933 | 0.930
Model 4: second-order factor model | 3975.885 (1704) | <0.0001 | 0.051 [0.049–0.053] | 0.927 | 0.924
Model 5: bi-factor model | 18,656.904 (1657) | <0.0001 | 0.139 [0.137–0.141] | 0.474 | 0.439
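The RMSEA, CFI, and TLI columns are all functions of the model and baseline chi-square statistics. The snippet below shows the standard formulas; the sample size and the baseline (independence) model's chi-square are not reported in this excerpt, so the values used here are placeholders:

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root mean square error of approximation (point estimate)."""
    return math.sqrt(max(chi2 / df - 1.0, 0.0) / (n - 1))

def cfi(chi2_m, df_m, chi2_b, df_b):
    """Comparative fit index: model m against baseline b."""
    d_m = max(chi2_m - df_m, 0.0)
    d_b = max(chi2_b - df_b, d_m)
    return 1.0 - d_m / d_b

def tli(chi2_m, df_m, chi2_b, df_b):
    """Tucker-Lewis index (non-normed fit index)."""
    return ((chi2_b / df_b) - (chi2_m / df_m)) / ((chi2_b / df_b) - 1.0)

# Model 3 (correlated six-factor) from the table; N and the baseline
# chi-square are ASSUMED values for illustration only.
N, CHI2_B, DF_B = 500, 34000.0, 1770
print(round(rmsea(3871.243, 1695, N), 3))
print(round(cfi(3871.243, 1695, CHI2_B, DF_B), 3))
print(round(tli(3871.243, 1695, CHI2_B, DF_B), 3))
```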
Skills | α | CrT-Skills | 1 | 2 | 3 | 4 | 5
1. Interpretation | 0.772 | 0.881 | | | | |
2. Analysis | 0.888 | 0.925 | 0.905 | | | |
3. Evaluation | 0.858 | 0.965 | 0.810 | 0.934 | | |
4. Inference | 0.905 | 0.956 | 0.806 | 0.858 | 0.937 | |
5. Explanation | 0.853 | 0.907 | 0.765 | 0.825 | 0.864 | 0.868 |
6. Self-regulation | 0.905 | 0.851 | 0.750 | 0.750 | 0.781 | 0.841 | 0.805
Group | χ² (df) | p | RMSEA [90% CI] | CFI | TLI
Female | 3488.157 (1704) | <0.0001 | 0.052 [0.049–0.054] | 0.929 | 0.926
Male | 2314.349 (1704) | <0.0001 | 0.050 [0.045–0.055] | 0.948 | 0.946
Invariance model | χ² (df) | p | RMSEA [90% CI] | CFI | TLI
Configural invariance | 5521.460 (3390) | <0.0001 | 0.049 [0.046–0.051] | 0.939 | 0.936
Metric invariance | 5490.717 (3444) | <0.0001 | 0.047 [0.045–0.050] | 0.941 | 0.940
Scalar invariance | 5613.987 (3732) | <0.0001 | 0.044 [0.041–0.046] | 0.946 | 0.949
Comparison | Δχ² (Δdf) | p | ΔRMSEA | ΔCFI
Metric vs. Configural | 45.988 (54) | 0.773 | 0.002 | 0.002
Scalar vs. Configural | 370.658 (342) | 0.137 | 0.005 | 0.007
Scalar vs. Metric | 328.786 (288) | 0.049 | 0.003 | 0.005
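A common way to read these comparisons is Cheung and Rensvold's rule of thumb that a CFI drop smaller than 0.01 between nested models supports invariance, often paired with Chen's ΔRMSEA cutoff of 0.015. A small sketch applying those cutoffs to the table values:

```python
# (delta_chi2, delta_df, p, delta_rmsea, delta_cfi) from the table above
comparisons = {
    "Metric vs. Configural": (45.988, 54, 0.773, 0.002, 0.002),
    "Scalar vs. Configural": (370.658, 342, 0.137, 0.005, 0.007),
    "Scalar vs. Metric":     (328.786, 288, 0.049, 0.003, 0.005),
}

for name, (dchi2, ddf, p, drmsea, dcfi) in comparisons.items():
    # Common cutoffs: |dCFI| < 0.01 (Cheung & Rensvold) and
    # |dRMSEA| < 0.015 (Chen) support measurement invariance.
    ok = abs(dcfi) < 0.01 and abs(drmsea) < 0.015
    print(f"{name}: invariance {'supported' if ok else 'questionable'}")
```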
Skills | Interpretation (F / M) | Analysis (F / M) | Evaluation (F / M) | Inference (F / M) | Explanation (F / M)
Analysis | 0.888 / 0.941 | | | |
Evaluation | 0.760 / 0.900 | 0.922 / 0.955 | | |
Inference | 0.759 / 0.890 | 0.838 / 0.902 | 0.924 / 0.956 | |
Explanation | 0.739 / 0.849 | 0.816 / 0.877 | 0.850 / 0.907 | 0.856 / 0.925 |
Self-regulation | 0.720 / 0.808 | 0.738 / 0.780 | 0.759 / 0.825 | 0.805 / 0.907 | 0.782 / 0.885
Skills | ΔMeans | SE | Est./SE | p
Interpretation | −0.014 | 0.106 | −0.129 | 0.897
Analysis | 0.023 | 0.096 | 0.244 | 0.807
Evaluation | 0.071 | 0.096 | 0.736 | 0.462
Inference | −0.051 | 0.099 | −0.512 | 0.608
Explanation | 0.177 | 0.097 | 1.832 | 0.067
Self-regulation | −0.005 | 0.098 | −0.046 | 0.963
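The Est./SE column is simply a z statistic (the mean difference divided by its standard error), and the p column is its two-sided normal probability, so each row can be checked directly. For example, for the Explanation row (small differences from the table come from rounding of ΔMeans and SE):

```python
from scipy.stats import norm

est, se = 0.177, 0.097   # Explanation row from the table
z = est / se             # ~1.825 (table reports 1.832 from unrounded inputs)
p = 2 * norm.sf(abs(z))  # two-sided p-value, ~0.068 (table: 0.067)
print(f"z = {z:.3f}, p = {p:.3f}")
```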

Share and Cite

Payan-Carreira, R.; Sacau-Fontenla, A.; Rebelo, H.; Sebastião, L.; Pnevmatikos, D. Development and Validation of a Critical Thinking Assessment-Scale Short Form. Educ. Sci. 2022 , 12 , 938. https://doi.org/10.3390/educsci12120938



Assessing Student Learning: 6 Types of Assessment and How to Use Them


Assessing student learning is a critical component of effective teaching and plays a significant role in fostering academic success. We will explore six different types of assessment and evaluation strategies that can help K-12 educators, school administrators, and educational organizations enhance both student learning experiences and teacher well-being.

We will provide practical guidance on how to implement and utilize various assessment methods, such as formative and summative assessments, diagnostic assessments, performance-based assessments, self-assessments, and peer assessments.

Additionally, we will also discuss the importance of implementing standard-based assessments and offer tips for choosing the right assessment strategy for your specific needs.

Importance of Assessing Student Learning

Assessment plays a crucial role in education, as it allows educators to measure students’ understanding, track their progress, and identify areas where intervention may be necessary. Assessing student learning not only helps educators make informed decisions about instruction but also contributes to student success and teacher well-being.

Assessments provide insight into student knowledge, skills, and progress while also highlighting necessary adjustments in instruction. Effective assessment practices ultimately contribute to better educational outcomes and promote a culture of continuous improvement within schools and classrooms.

1. Formative assessment


Formative assessment is a type of assessment that focuses on monitoring student learning during the instructional process. Its primary purpose is to provide ongoing feedback to both teachers and students, helping them identify areas of strength and areas in need of improvement. This type of assessment is typically low-stakes and does not contribute to a student’s final grade.

Some common examples of formative assessments include quizzes, class discussions, exit tickets, and think-pair-share activities. This type of assessment allows educators to track student understanding throughout the instructional period and identify gaps in learning and intervention opportunities.

To effectively use formative assessments in the classroom, teachers should implement them regularly and provide timely feedback to students.

This feedback should be specific and actionable, helping students understand what they need to do to improve their performance. Teachers should use the information gathered from formative assessments to refine their instructional strategies and address any misconceptions or gaps in understanding. Formative assessments play a crucial role in supporting student learning and helping educators make informed decisions about their instructional practices.


2. Summative assessment


Summative assessment evaluates student learning, knowledge, and proficiency at the end of an instructional period, such as a unit or semester, and typically counts toward a student's final grade. Examples of summative assessments include final exams, end-of-unit tests, standardized tests, and research papers. To effectively use summative assessments in the classroom, it's important to ensure that they are aligned with the learning objectives and content covered during instruction.

This will help to provide an accurate representation of a student’s understanding and mastery of the material. Providing students with clear expectations and guidelines for the assessment can help reduce anxiety and promote optimal performance.

Summative assessments should be used in conjunction with other assessment types, such as formative assessments, to provide a comprehensive evaluation of student learning and growth.

3. Diagnostic assessment

Diagnostic assessment, often used at the beginning of a new unit or term, helps educators identify students’ prior knowledge, skills, and understanding of a particular topic.

This type of assessment enables teachers to tailor their instruction to meet the specific needs and learning gaps of their students. Examples of diagnostic assessments include pre-tests, entry tickets, and concept maps.

To effectively use diagnostic assessments in the classroom, teachers should analyze the results to identify patterns and trends in student understanding.

This information can be used to create differentiated instruction plans and targeted interventions for students struggling with the upcoming material. Sharing the results with students can help them understand their strengths and areas for improvement, fostering a growth mindset and encouraging active engagement in their learning.

4. Performance-based assessment

Performance-based assessment is a type of evaluation that requires students to demonstrate their knowledge, skills, and abilities through the completion of real-world tasks or activities.

The main purpose of this assessment is to assess students’ ability to apply their learning in authentic, meaningful situations that closely resemble real-life challenges. Examples of performance-based assessments include projects, presentations, portfolios, and hands-on experiments.

These assessments allow students to showcase their understanding and application of concepts in a more active and engaging manner compared to traditional paper-and-pencil tests.

To effectively use performance-based assessments in the classroom, educators should clearly define the task requirements and assessment criteria, providing students with guidelines and expectations for their work. Teachers should also offer support and feedback throughout the process, allowing students to revise and improve their performance.

Incorporating opportunities for peer feedback and self-reflection can further enhance the learning process and help students develop essential skills such as collaboration, communication, and critical thinking.

5. Self-assessment

Self-assessment is a valuable tool for encouraging students to engage in reflection and take ownership of their learning. This type of assessment requires students to evaluate their own progress, skills, and understanding of the subject matter. By promoting self-awareness and critical thinking, self-assessment can contribute to the development of lifelong learning habits and foster a growth mindset.

Examples of self-assessment activities include reflective journaling, goal setting, self-rating scales, or checklists. These tools provide students with opportunities to assess their strengths, weaknesses, and areas for improvement. When implementing self-assessment in the classroom, it is important to create a supportive environment where students feel comfortable and encouraged to be honest about their performance.

Teachers can guide students by providing clear criteria and expectations for self-assessment, as well as offering constructive feedback to help them set realistic goals for future learning.

Incorporating self-assessment as part of a broader assessment strategy can reinforce learning objectives and empower students to take an active role in their education.

Reflecting on their performance and understanding the assessment criteria can help them recognize both short-term successes and long-term goals. This ongoing process of self-evaluation can help students develop a deeper understanding of the material, as well as cultivate valuable skills such as self-regulation, goal setting, and critical thinking.

6. Peer assessment

Peer assessment, also known as peer evaluation, is a strategy where students evaluate and provide feedback on their classmates’ work. This type of assessment allows students to gain a better understanding of their own work, as well as that of their peers.

Examples of peer assessment activities include group projects, presentations, written assignments, or online discussion boards.

In these settings, students can provide constructive feedback on their peers’ work, identify strengths and areas for improvement, and suggest specific strategies for enhancing performance.

Constructive peer feedback can help students gain a deeper understanding of the material and develop valuable skills such as working in groups, communicating effectively, and giving constructive criticism.

To successfully integrate peer assessment in the classroom, consider incorporating a variety of activities that allow students to practice evaluating their peers’ work, while also receiving feedback on their own performance.

Encourage students to focus on both strengths and areas for improvement, and emphasize the importance of respectful, constructive feedback. Provide opportunities for students to reflect on the feedback they receive and incorporate it into their learning process. Monitor the peer assessment process to ensure fairness, consistency, and alignment with learning objectives.

Implementing Standards-Based Assessments


Standards-based assessments are designed to measure students' performance relative to established learning standards, such as those generated by the Common Core State Standards Initiative or individual state education guidelines.

By implementing these types of assessments, educators can ensure that students meet the necessary benchmarks for their grade level and subject area, providing a clearer picture of student progress and learning outcomes.

To successfully implement standards-based assessments, it is essential to align assessment tasks with the relevant learning standards.

This involves creating assessments that directly measure students’ knowledge and skills in relation to the standards rather than relying solely on traditional testing methods.

As a result, educators can obtain a more accurate understanding of student performance and identify areas that may require additional support or instruction. Grading formative and summative assessments within a standards-based framework requires a shift in focus from assigning letter grades or percentages to evaluating students' mastery of specific learning objectives.

This approach encourages educators to provide targeted feedback that addresses individual student needs and promotes growth and improvement. By utilizing rubrics or other assessment tools, teachers can offer clear, objective criteria for evaluating student work, ensuring consistency and fairness in the grading process.

Tips For Choosing the Right Assessment Strategy

When selecting an assessment strategy, it’s crucial to consider its purpose. Ask yourself what you want to accomplish with the assessment and how it will contribute to student learning. This will help you determine the most appropriate assessment type for your specific situation.

Aligning assessments with learning objectives is another critical factor. Ensure that the assessment methods you choose accurately measure whether students have met the desired learning outcomes. This alignment will provide valuable feedback to both you and your students on their progress. Diversifying assessment methods is essential for a comprehensive evaluation of student learning.

By using a variety of assessment types, you can gain a more accurate understanding of students’ strengths and weaknesses. This approach also helps support different learning styles and reduces the risk of overemphasis on a single assessment method.

Incorporating multiple forms of assessment, such as formative, summative, diagnostic, performance-based, self-assessment, and peer assessment, can provide a well-rounded understanding of student learning. By doing so, educators can make informed decisions about instruction, support, and intervention strategies to enhance student success and overall classroom experience.

Challenges and Solutions in Assessment Implementation

Implementing various assessment strategies can present several challenges for educators. One common challenge is the limited time and resources available for creating and administering assessments. To address this issue, teachers can collaborate with colleagues to share resources, divide the workload, and discuss best practices.

Utilizing technology and online platforms can also streamline the assessment process and save time. Another challenge is ensuring that assessments are unbiased and inclusive.

To overcome this, educators should carefully review assessment materials for potential biases and design assessments that are accessible to all students, regardless of their cultural backgrounds or learning abilities.

Offering flexible assessment options for the varying needs of learners can create a more equitable and inclusive learning environment. It is essential to continually improve assessment practices and seek professional development opportunities.

Seeking support from colleagues, attending workshops and conferences related to assessment practices, or enrolling in online courses can help educators stay up-to-date on best practices while also providing opportunities for networking with other professionals.

Ultimately, these efforts will contribute to an improved understanding of the assessments used as well as their relevance in overall student learning.

Assessing student learning is a crucial component of effective teaching and should not be overlooked. By understanding and implementing the various types of assessments discussed in this article, you can create a more comprehensive and effective approach to evaluating student learning in your classroom.

Remember to consider the purpose of each assessment, align them with your learning objectives, and diversify your methods for a well-rounded evaluation of student progress.

If you’re looking to further enhance your assessment practices and overall professional development, Strobel Education offers workshops, courses, keynotes, and coaching services tailored for K-12 educators. With a focus on fostering a positive school climate and enhancing student learning, Strobel Education can support your journey toward improved assessment implementation and greater teacher well-being.


What is the Critical Thinking Test?


Updated November 16, 2023

Edward Melett

The Critical Thinking Test is a comprehensive evaluation designed to assess individuals' cognitive capacities and analytical prowess.

This formal examination, often referred to as the critical thinking assessment, is a benchmark for those aiming to demonstrate their proficiency in discernment and problem-solving.

This evaluative tool gauges a range of skills, including logical reasoning, analytical thinking, and the ability to evaluate and synthesize information.

This article explores the Critical Thinking Test, explaining what it involves and why it matters, and examines the essential skills it measures.

We will also look at examples of critical thinking questions, illustrating the kinds of challenging scenarios candidates encounter and the careful reasoning they demand.

Before taking the critical thinking test, let's look at preparation. A practice test serves as a crucible for honing the skills assessed in the actual examination, offering candidates a chance to refine their analytical skills before facing the real challenge. The practice test exercises the following skills:

  • Logical Reasoning: deducing conclusions from given information, assessing the validity of arguments, and recognizing patterns in logic.
  • Analytical Thinking: dissecting complex scenarios, identifying key components, and synthesizing information to draw insightful conclusions.
  • Problem-Solving Proficiency: navigating intricate problems that mirror real-world challenges and deriving effective solutions systematically.

What to Expect

The Critical Thinking Practice Test is crafted to mirror the format and complexity of the actual examination. Expect a series of scenarios, each accompanied by a set of questions that demand thoughtful analysis and logical deduction. These scenarios span diverse fields, from business and science to everyday situations, ensuring a comprehensive evaluation of your critical thinking skills.

Examples of Critical Thinking Questions

Scenario: In a business context, analyze the potential impacts of a proposed strategy on both short-term profitability and long-term sustainability.
Question: What factors would you consider in determining the viability of the proposed strategy, and how might it affect the company's overall success?

Scenario: Evaluate conflicting scientific studies on a pressing environmental issue.
Question: Identify the key methodologies and data points in each study. How would you reconcile the disparities to form an informed, unbiased conclusion?

Why Practice Matters

Engaging in the Critical Thinking Practice Test familiarizes you with the test format and cultivates a mindset geared towards agile and astute reasoning. This preparatory phase allows you to refine your cognitive toolkit, ensuring you approach the assessment with confidence and finesse.

We'll navigate through specific examples as we proceed, offering insights into effective strategies for tackling critical thinking questions. Prepare to embark on a journey of intellectual sharpening, where each practice question refines your analytical prowess for the challenges ahead.

This is a practice critical thinking test.

The test consists of three questions.

After you have answered all the questions, you will be shown the correct answers and given full explanations.

Make sure you read and fully understand each question before answering. Work quickly, but don't rush. You cannot afford to make mistakes on a real test.

If you get a question wrong, make sure you find out why and learn how to answer this type of question in the future. 

Six friends are seated at a rectangular table in a restaurant, with three chairs on each side. Adam and Dorky do not have anyone sitting to their right, and Clyde and Benjamin do not have anyone sitting to their left. Adam and Benjamin are not sitting on the same side of the table.

If Ethan is not sitting next to Dorky, who is seated immediately to the left of Felix?
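You can work this puzzle out by hand, but seating puzzles like this are also a natural fit for brute-force search: enumerate every arrangement and keep those satisfying all constraints. The sketch below assumes "next to" means adjacent chairs on the same side of the table; note that running it reveals the answer.

```python
from itertools import permutations

PEOPLE = ["Adam", "Benjamin", "Clyde", "Dorky", "Ethan", "Felix"]
# Seats 0-2 form one side and 3-5 the other, each ordered left-to-right
# from the sitter's own point of view.
SIDES = [(0, 1, 2), (3, 4, 5)]

def side_of(i):
    return SIDES[0] if i < 3 else SIDES[1]

def right_of(seats, name):
    i = seats.index(name)
    side = side_of(i)
    pos = side.index(i)
    return seats[side[pos + 1]] if pos < 2 else None

def left_of(seats, name):
    i = seats.index(name)
    side = side_of(i)
    pos = side.index(i)
    return seats[side[pos - 1]] if pos > 0 else None

def same_side(seats, a, b):
    return (seats.index(a) < 3) == (seats.index(b) < 3)

answers = set()
for seats in permutations(PEOPLE):
    if right_of(seats, "Adam") or right_of(seats, "Dorky"):
        continue  # nobody may sit to Adam's or Dorky's right
    if left_of(seats, "Clyde") or left_of(seats, "Benjamin"):
        continue  # nobody may sit to Clyde's or Benjamin's left
    if same_side(seats, "Adam", "Benjamin"):
        continue  # Adam and Benjamin sit on different sides
    # Assumption: "next to" = adjacent chairs on the same side.
    if right_of(seats, "Ethan") == "Dorky" or left_of(seats, "Ethan") == "Dorky":
        continue
    answers.add(left_of(seats, "Felix"))

print(answers)
```

Under that adjacency assumption, every arrangement that survives the constraints puts the same person immediately to Felix's left, so the puzzle has a unique answer.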



Reflection Toolkit

List of tools for reflection

A quick overview of all the reflective tools included in the Reflectors' Toolkit and a description of the examples provided.

Models for reflecting on experience

The 5R framework for reflection

This framework takes you through Reporting, Responding, Relating, Reasoning, and Reconstructing.

Example
Reflecting on a meeting with a supervisor

The reflector had a vague objective going into the meeting and did not manage to get what they needed from it, leaving them feeling that they wasted time.

This example does not reference theoretical literature to support the reflection.

5R framework  (within Reflectors' Toolkit)

The CARL framework of reflection

This framework takes you through Context, Action, Results, and Learning.

Examples

The two examples complement each other. They look at the same experience of speaking in front of a group of high school students who are interested in studying the speaker's degree. However, one example reflects at an overarching process level, while the other reflects at the level of a specific experience.

These examples do not reference theoretical literature to support the reflection.

CARL framework (within Reflectors’ Toolkit)

The four F’s of active reviewing

This framework takes you through Facts, Feelings, Findings, and Future.

Examples

The first example shows how reflection can easily be applied in situations that go well: a retail worker deals with a challenging customer.

The second example is a waiter reflecting on their experience of forgetting an order.

These examples do not reference theoretical literature to support the reflection.

Four F’s of active reviewing (within Reflectors’ Toolkit)

Gibbs’ reflective cycle

This model takes you through Description, Feelings, Evaluation, Analysis, Conclusion, and Action plan.

Examples

The two examples mirror each other. Both use the same scenario of a group that divided the work between its members; when the group combined their work, they found that much more work was needed.

The example reflections show how reflections can be either brief or lengthy – both have their pros and cons.

These examples reference theoretical literature to support the reflection.

Gibbs’ reflective cycle (within Reflectors’ Toolkit)

The integrated reflective cycle

This model takes you through The Experience, Reflecting on Action, Theory, and Preparation.

Example
Consultation with a patient

The reflector follows the instructions of a theoretical model too rigidly in a patient consultation, and updates their understanding.

While this example is specific to a medical field, there are elements of nervousness and performance anxiety, which should be quite general.

This example references theoretical literature to support the reflection.

Integrated reflective cycle (within Reflectors’ Toolkit)

What? So what? Now What?

This model takes you through three core questions: What? So what? Now What?

Examples

The first example shows a reflection on getting a mark lower than what the reflector was hoping for.

The second example is a reflection on finding it challenging to participate in workshops.

These examples do not reference theoretical literature to support the reflection.

What? So what? Now What? (within Reflectors’ Toolkit)

Self-awareness activities

Goal setting

Setting goals can be an extremely powerful activity. On this page a series of reflective questions is provided to ensure that the goal setting process is reflective.

Goal setting (within the Reflectors’ Toolkit)

Items for self-awareness

A fun activity that uses items as a foundation for the reflective process. The goal of the activity is to choose an item that mirrors a certain quality. When done correctly and reflectively it can be a helpful activity to get a new perspective. Types of questions could be: ‘Choose an item from your room that shows how you are as a learner’.

Items for self-awareness (within the Reflectors’ Toolkit)

Strengths and weaknesses

Being able to identify what your strengths and weaknesses are is valuable both for personal knowledge and to be able to communicate them to others. On this page two reflective activities are provided to help you identify your strengths and weaknesses.

Strengths and weaknesses (within the Reflectors’ Toolkit)

Values

Knowing what your personal values are can be extremely beneficial in helping you prioritise and make decisions. On this page two reflective approaches are provided for identifying your values.

Values (within the Reflectors’ Toolkit)

Writing letters to your future and past selves

Looking into the future and examining the past can help inform how we should act in the present. On this page directions for two reflective letter-writing activities are provided; one for your past self and one for your future self.

Writing letters to your future and past selves (within the Reflectors’ Toolkit)

Assessment of Critical Thinking

  • First Online: 10 December 2023


  • Dirk Jahn
  • Michael Cursio


The term “to assess” has various meanings, such as to judge, evaluate, estimate, gauge, or determine. Assessment is therefore a diagnostic inventory of certain characteristics of a section of observable reality on the basis of defined criteria. In a pedagogical context, assessments aim to make learners’ knowledge, skills, or attitudes observable in certain application situations and to assess them on the basis of observation criteria.


To give an example: Holistic Critical Thinking Rubric from East Georgia College; Available at https://studylib.net/doc/7608742/east-georgia-college-holistic-critical-thinking-rubric-cr… (04/03/2020).



About this chapter

Jahn, D., Cursio, M. (2023). Assessment of Critical Thinking. In: Critical Thinking. Springer VS, Wiesbaden. https://doi.org/10.1007/978-3-658-41543-3_8



Critical Thinking test

By 123test team . Updated May 12, 2023


This Critical Thinking test measures your ability to think critically and draw logical conclusions based on written information. Critical Thinking tests are often used in job assessments in the legal sector to assess a candidate's analytical critical thinking skills. A well-known example of a critical thinking test is the Watson-Glaser Critical Thinking Appraisal.


The test comprises the following five sections, with a total of 10 questions:

  • Analysing Arguments
  • Assumptions
  • Interpreting Information

Instructions for the Critical Thinking test

Each question presents one or more paragraphs of text and a question about the information in the text. It's your job to figure out which of the options is the correct answer.

Below is a statement that is followed by an argument. You should consider this argument to be true. It is then up to you to determine whether the argument is strong or weak. Do not let your personal opinion about the statement play a role in your evaluation of the argument.

Statement: It would be good if people ate vegetarian food more often. Argument: No, because dairy also requires keeping animals that will later have to be eaten.

Is this a strong or weak argument?

Strong argument / Weak argument

Statement: Germany should no longer use the euro as its currency. Argument: No, because that would mean that the 10 billion Deutschmarks that the introduction of the euro cost was money thrown away.

Overfishing is the phenomenon in which too many fish are caught in a certain area, leading to the disappearance of fish species from that area. This trend can only be reversed by means of catch-reduction measures. These must therefore be introduced and enforced.

Assumption: The disappearance of fish species in areas of the oceans is undesirable.

Is the assumption made from the text?

Assumption is made / Assumption is not made

As a company, we strive for satisfied customers. That's why from now on we're going to keep track of how quickly our help desk employees pick up the phone. Our goal is for that phone to ring for a maximum of 20 seconds.

Assumption: The company has tools or ways to measure how quickly help desk employees pick up the phone.

  • All reptiles lay eggs
  • All reptiles are vertebrates
  • All snakes are reptiles
  • All vertebrates have brains
  • Some reptiles hatch their eggs themselves
  • Most reptiles have two lungs
  • Many snakes only have one lung
  • Cobras are poisonous snakes
  • All reptiles are animals

Conclusion: Some snakes hatch their eggs themselves.

Does the conclusion follow the statements?

Conclusion follows / Conclusion does not follow

(Continue with the statements from question 5.)

Conclusion: Some animals that lay eggs only have one lung.

In the famous 1971 Stanford experiment, 24 normal, healthy male students were randomly assigned as 'guards' (12) or 'prisoners' (12). The guards were given a uniform and instructed to keep order, but not to use force. The prisoners were given prison uniforms. Soon after the start of the experiment, the guards devised all kinds of punishments for the prisoners. Rebelling prisoners were subdued with a fire extinguisher, and public undressing and solitary confinement were also used as punishments. The aggression of the guards became stronger as the experiment progressed. At one point, the abuses took place at night, because the guards thought that the researchers were not watching. It turned out that some guards also took pleasure in treating the prisoners very cruelly. For example, prisoners got a bag over their heads and were chained by their ankles. Originally, the experiment was to last 14 days; however, it was stopped after six days.

The students who took part in the research did not expect to react the way they did in such a situation.

To what extent is this conclusion true, based on the given text?

True / Probably true / More information required / Probably false / False

(Continue with the text from 'Stanford experiment' in question 7.)

The results of the experiment support the claim that every young man (or at least some young men) is capable of turning into a sadist fairly quickly.

  • A flag is a tribute to the nation and should therefore not be hung outside at night. The flag is therefore hoisted at sunrise and brought down at sunset. Only when a country's flag is illuminated by spotlights on both sides may it remain hanging after sunset. There is a simple rule of thumb for the time of bringing down the flag: it is the moment when there is no longer any visible difference between the individual colors of the flag.
  • A flag may not touch the ground.
  • No decorations or other additions should be made to the Dutch flag unless one is entitled to do so. The use of a flag purely for decoration should also be avoided. However, flag cloth may be used for decoration – for example, in the form of drapes.
  • The orange pennant is only used on birthdays of members of the Royal House and on King's Day. The orange pennant should be as long or slightly longer than the diagonal of the flag.

Conclusion: One can assume that no Dutch flag will fly at government buildings at night, unless it is illuminated by spotlights on both sides.

Does the conclusion follow, based on the given text?

(Continue with the text from 'Dutch flag protocol' in question 9.)

Conclusion: If the protocol is followed, the orange pennant will always be longer than the horizontal bands/stripes of the flag.



A conceptual framework for developing a critical thinking self-assessment scale

Affiliation

  • South Dakota State University, Brookings, SD, USA
  • PMID: 23402245
  • DOI: 10.3928/01484834-20120215-01

Nurses must be talented critical thinkers to cope with the challenges related to the ever-changing health care system, population trends, and extended role expectations. Several countries now recognize critical thinking skills (CTS) as an expected outcome of nursing education programs. Critical thinking has been defined in multiple ways by philosophers, critical thinking experts, and educators. Nursing experts conceptualize critical thinking as a process involving cognitive and affective domains of reasoning. Nurse educators are often challenged with teaching and measuring CTS because of their latent nature and the lack of a uniform definition of the concept. In this review of the critical thinking literature, we examine various definitions, identify a set of constructs that define critical thinking, and suggest a conceptual framework on which to base a self-assessment scale for measuring CTS.

Copyright 2013, SLACK Incorporated.


Are You Good at Critical Thinking? [Self-Assessment Test]

Critical thinking is a key skill to possess if you want to succeed in today’s dynamic and complex work environment.

Critical thinking is defined as the process of analyzing information, facts, and situations objectively, making well-reasoned judgments and decisions, and solving problems.

Critical thinking usually involves asking questions, evaluating evidence, understanding context and circumstances, and integrating various perspectives to come up with sound conclusions.

The complexity of many modern jobs usually demands that employees be competent critical thinkers. Critical thinking allows you to make better and more informed decisions, find creative solutions to problems, and evaluate risk effectively. It also enables you to identify assumptions, biases, and fallacies in your own thinking and in that of others, which keeps your thinking on track and objective.

For instance, let’s say you work in marketing, and you have been tasked with identifying the most effective social media platform to launch a new product.

Using critical thinking skills, you would first evaluate the available platforms objectively, research each one’s demographics, features, and target audience, and then make an informed decision that gives the product maximum exposure and visibility.

By now, you may be wondering if you possess a strong ability to think critically.

This is where the self-assessment comes in. Our self-assessment will enable you to identify your critical thinking strengths and weaknesses and provide you with recommendations to enhance your thinking skills.

It’s time to take the self-assessment test and begin your journey to becoming a more effective critical thinker.

Self-Assessment Test

To conduct the self-assessment, simply answer all of the questions and click the calculate results button at the end.

Rate each statement on a five-point scale: Strongly agree, Agree, Neither agree nor disagree, Disagree, Strongly disagree.

  1. I seek out and evaluate different perspectives and ideas before arriving at a conclusion.
  2. I am able to identify and analyze a problem to develop creative solutions.
  3. I can recognize and evaluate arguments made by others and can construct strong arguments of my own.
  4. I use evidence and reasoning to support my ideas and decisions.
  5. I am open-minded and consider alternatives before making decisions.
  6. I am able to identify and question my own assumptions and biases.
  7. I ask questions to clarify information and to challenge assumptions or conclusions.
  8. I can effectively communicate my ideas and reasoning to others.
  9. I am able to think creatively and generate new ideas.
  10. I am willing to change my mind based on new information or evidence.
  11. I am able to identify the strengths and weaknesses of my own thinking and the thinking of others.
  12. I am able to analyze complex information and identify connections and patterns.
  13. I am able to anticipate potential consequences of a decision or action.
  14. I am able to evaluate risks and benefits when making a decision.
  15. I am able to identify and evaluate the validity and reliability of information sources.

Interpreting Your Results

0-20 Points

If you scored 0-20 points, you might want to work on developing your critical thinking skills further. Critical thinking involves looking at issues objectively, analyzing them logically, and coming up with thoughtful, well-reasoned solutions. Consider seeking out resources to improve your critical thinking skills, such as books, online courses, or workshops. With practice, you can develop better critical thinking skills and become more self-aware.

21-40 Points

If you scored 21-40 points, you have some critical thinking skills, but there is room for improvement. Continue to develop your ability to analyze issues objectively, think logically, and evaluate evidence. Seek out opportunities to practice your critical thinking skills in your personal and professional life. By continuing to hone your skills, you will become a more effective problem-solver and decision-maker.

41-60 Points

If you scored 41-60 points, congratulations! You have strong critical thinking skills. You are able to look at complex problems and analyze them logically and objectively to come up with solutions. You are also able to evaluate evidence and make informed decisions. Continue to use and refine your critical thinking skills, and you will be an asset in many areas of your life, including work, relationships, and personal growth.
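The page does not say how responses are converted into points, but with a common Likert coding (Strongly agree = 4 down to Strongly disagree = 0), the 15 statements above span exactly the 0-60 range used by these interpretation bands. Below is a minimal scoring sketch under that assumption; the function names and band wording are illustrative, not part of the original test.

```python
# Minimal scoring sketch for the 15-item self-assessment above.
# ASSUMPTION: the page never states the point mapping; a common Likert
# coding (Strongly agree = 4 ... Strongly disagree = 0) makes 15 items
# span 0-60, which matches the interpretation bands.

LIKERT_POINTS = {
    "strongly agree": 4,
    "agree": 3,
    "neither agree nor disagree": 2,
    "disagree": 1,
    "strongly disagree": 0,
}

# (low, high, summary) -- the summaries paraphrase the bands above.
BANDS = [
    (0, 20, "0-20: keep developing your critical thinking skills"),
    (21, 40, "21-40: some skills in place, with room for improvement"),
    (41, 60, "41-60: strong critical thinking skills"),
]

def score(responses: list[str]) -> int:
    """Sum the points for a list of 15 Likert responses."""
    if len(responses) != 15:
        raise ValueError("the self-assessment has exactly 15 statements")
    return sum(LIKERT_POINTS[r.strip().lower()] for r in responses)

def interpret(total: int) -> str:
    """Map a 0-60 total to its interpretation band."""
    for low, high, summary in BANDS:
        if low <= total <= high:
            return summary
    raise ValueError(f"total {total} is outside the 0-60 range")

if __name__ == "__main__":
    answers = ["Agree"] * 10 + ["Strongly agree"] * 5
    total = score(answers)                # 10*3 + 5*4 = 50
    print(total, "->", interpret(total))  # 50 -> 41-60: strong ...
```

Running the example with ten “Agree” and five “Strongly agree” responses yields a total of 50, which lands in the 41-60 band.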

5 Quick Tips to Become Better at Critical Thinking

Critical thinking is a valuable skill that helps you make informed decisions, solve problems, and evaluate arguments. If you want to improve your critical thinking skills, here are five quick tips you can follow:

1. Clarify Your Thinking

Before you can start evaluating arguments or solving problems, you need to clarify your own thinking. This means being clear about what you believe, what you don’t know, and what assumptions you’re making. Start by asking yourself questions like “What do I know?”, “What do I need to know?” and “What am I assuming?” By clarifying your thinking, you can avoid jumping to conclusions and improve your ability to evaluate arguments.

2. Practice Active Listening

Critical thinking involves listening carefully to other people’s arguments and ideas. To become better at critical thinking, you need to practice active listening. This means paying full attention to what the other person is saying, asking questions to clarify their points, and considering their perspective. Active listening can help you identify assumptions, biases, and logical fallacies in other people’s arguments.

3. Ask Questions

Asking questions is a key part of critical thinking. When you encounter a new idea or argument, ask questions to help you understand it better. Some good questions to ask include “What evidence supports this claim?” “What is the source of this information?” and “What are the assumptions underlying this argument?” By asking questions, you can evaluate arguments more effectively and avoid being misled by faulty reasoning.

4. Evaluate the Evidence

To become a good critical thinker, you need to be able to evaluate evidence objectively. This means looking for evidence that supports or contradicts an argument, considering the quality of the evidence, and evaluating the sources of the evidence. When evaluating evidence, be aware of your own biases and assumptions and try to avoid cherry-picking evidence to support your own position.

5. Practice Problem-Solving

Critical thinking involves solving problems and making decisions based on evidence and logical reasoning. To become better at critical thinking, practice problem-solving. Identify problems in your daily life and brainstorm solutions, considering the advantages and disadvantages of each. By practicing problem-solving, you can develop your critical thinking skills and improve your ability to analyze complex problems.


IMAGES

  1. (PDF) CRITICAL THINKING ASSESSMENT SCALE (CTAS)

  2. BC Core Competencies Critical Thinking Self Assessment for Intermediate

  3. Tools Of Critical Thinking

  4. [PDF] Self-Assessment of Critical Thinking Skills in EFL Writing

  5. Preliminary psychometric characteristics of the critical thinking self

  6. Self-Assessment: Critical/Creative Thinking Core Competency BC Curriculum

VIDEO

  1. The Critical Thinker 002: Self-Defense

  2. How to COMPARE and CONTRAST using a SOLO Taxonomy Map

  3. How to build critical thinking in students?

  4. Critical Thinking 🤔 #education #criticalthinkingmastery #LearningStyles #TeacherTips

  5. Organizational Self-Assessment: A Strategy to Build Capacity

  6. Learning oriented assessment, critical thinking and English language speaking skills with Dr Mansoor

COMMENTS

  1. Critical Thinking Testing and Assessment

    The purpose of assessing instruction for critical thinking is improving the teaching of discipline-based thinking (historical, biological, sociological, mathematical, etc.). It is to improve students' abilities to think their way through content using disciplined skill in reasoning. The more particular we can be about what we want students to ...

  2. Critical Thinking: Where to Begin

    A Brief Definition: Critical thinking is the art of analyzing and evaluating thinking with a view to improving it. A well-cultivated critical thinker communicates effectively with others in figuring out solutions to complex problems. Critical thinking is, in short, self-directed, self-disciplined, self-monitored, and self-corrective thinking.

  3. Structures for Student Self-Assessment

    Structures for Student Self-Assessment. Critical thinking is thinking that assesses itself. To the extent that our students need us to tell them how well they are doing, they are not thinking critically. Didactic instruction makes students overly dependent on the teacher. In such instruction, students rarely develop any perceptible intellectual ...

  4. CTS Tools for Faculty and Student Assessment

    Cornell Critical Thinking Test (CCTT) There are two forms of the CCTT, X and Z. Form X is for students in grades 4-14. Form Z is for advanced and gifted high school students, undergraduate and graduate students, and adults. Reliability estimates for Form Z range from .49 to .87 across the 42 groups who have been tested.

  5. Fostering and assessing student critical thinking: From theory to

    Vardi highlighted three dispositions involved in critical thinking: (1) self-regulation (self-discipline and self-management); (2) having an open, fair and reasonable mind, a preparedness to identify and face one's own biases, and preparedness to reconsider one's own views where warranted; (3) being committed to ongoing self-improvement and ...

  6. PDF Student Self-Assessment Critical Thinking Questionnaire

    The Student Self-Assessment Critical Thinking Questionnaire is a tool which has been designed to help students to assess their performance as critical thinkers. It is used after an activity or a project and can serve as a self-reflection tool or as a starting point for class discussion. The questions follow the student's critical thinking ...

  7. Thinking Critically: A Self Evaluation

    Critical Thinking Self Assessment. Critical thinking self-assessment is an evaluation of one's ability to think critically and analyze a situation. It seeks to understand how someone reasons and makes decisions, as well as their ability to think objectively and logically. It usually involves a series of questions or activities designed to ...

  8. Critical Thinking > Assessment (Stanford Encyclopedia of Philosophy)

    The Critical Thinking Assessment Test (CAT) is unique among them in being designed for use by college faculty to help them improve their development of students' critical thinking skills (Haynes et al. 2015; Haynes & Stein 2021). Also, for some years the United Kingdom body OCR (Oxford Cambridge and RSA Examinations) awarded AS and A Level ...

  9. INSIGHT BASECAMP: The new comprehensive and ...

    Insight Assessment's high-quality, expertly designed, and interactive critical thinking self-development tools are now available for the first time directly to individuals through INSIGHT BASECAMP. Adults, teens, and children can gain the reasoning skills and habits of mind that will last them a lifetime. With new short courses, quizzes, and ...

  10. Self-assessment

    Before performing a rubric-referenced self-assessment, students can be asked to assess the exemplar based on the same rubric they will be later using to assess their own work. Supporting technologies. FeedbackFruits is a user-friendly tool that can facilitate both self-assessment of work and self-assessment of skill. It makes it easy to ...

  11. Development and Validation of a Critical Thinking Assessment-Scale

    This study presents and validates the psychometric characteristics of a short form of the Critical Thinking Self-assessment Scale (CTSAS). The original CTSAS was composed of six subscales representing the six components of Facione's conceptualisation of critical thinking. The CTSAS short form kept the same structure and reduced the number of items from 115 in the original version to 60 ...

  12. Assessing Student Learning: 6 Types of Assessment and How to Use Them

    Incorporating opportunities for peer feedback and self-reflection can further enhance the learning process and help students develop essential skills such as collaboration, communication, and critical thinking. 5. Self-assessment. Self-assessment is a valuable tool for encouraging students to engage in reflection and take ownership of their ...

  13. Critical Thinking Test: Free Practice Questions

    This formal examination, often referred to as the critical thinking assessment, is a benchmark for those aiming to demonstrate their proficiency in discernment and problem-solving. In addition, this evaluative tool meticulously gauges a range of skills, including logical reasoning, analytical thinking, and the ability to evaluate and synthesize ...

  14. Critical Thinking

    Critical thinking involves rigorously and skilfully using information, experience, observation, and reasoning to guide your decisions, actions and beliefs. It's a useful skill in the workplace and in life. You'll need to be curious and creative to explore alternative possibilities, but rational to apply logic, and self-aware to identify when ...

  15. List of tools for reflection

    Gibbs' reflective cycle. This model takes you through Description, Feelings, Evaluation, Analysis, Conclusion, and Action plan. Examples. Reflecting on a group work experience in about 1000 words. Reflection on a group work experience in about 300 words. Comments: The examples reflect each other. Both examples use the same scenario of a group ...

  16. PDF The Miniature Guide to Critical Thinking: Concepts & Tools

    Concepts and Tools. By Dr. Richard Paul and Dr. Linda Elder. The Foundation for Critical Thinking. www.criticalthinking.org 707-878-9100 [email protected]. Why A Critical Thinking Mini-Guide? This miniature guide focuses on the essence of critical thinking concepts and tools distilled into pocket size.

  17. Assessment of Critical Thinking

    Through self-developed tasks and the associated observation or assessment tools, the assessment can be adapted quite specifically to one's own teaching and one's own concept of CT. ... D. R. (1992). Critical thinking and self-directed learning in adult education: an analysis of responsibility and control issues. Adult Education Quarterly ...

  18. Critical Thinking test

    Instructions Critical Thinking test. Each question presents one or more paragraphs of text and a question about the information in the text. It's your job to figure out which of the options is the correct answer. 1. Analysing arguments. Below is a statement that is followed by an argument. You should consider this argument to be true.

  19. PDF Critical Thinking Learning Program Self-Assessment

    Critical Thinking. This course is a 29-minute interactive, media-rich case study focused on content analysis, problem solving and decision making. Critical thinking is useful for examining an issue or problem logically. This Challenge Series product explores applying the critical thinking process. TMS ID# 1349516 Skillsoft .5 hour on-line SCORE 2-3

  20. The Safe Care Framework™: A practical tool for critical thinking

    The definition of critical thinking used for the development of the S.C.F.™ is "critical thinking is purposeful, self-regulatory judgement resulting in interpretation, analysis, evaluation, and inference including evidential, conceptual, methodological, contextual considerations upon which judgement is placed" (Facione, 1990).

  21. Critical thinking skills in midwifery practice: Development of a self

    Utilising a self-assessment tool can cultivate essential professional skills including critical awareness, decision making, critical thinking and enhance learning. This pilot study indicates that the CACTiM (Student version) is a reliable and valid measure of critical thinking skills in midwifery practice.

  22. A conceptual framework for developing a critical thinking self

    Nursing experts conceptualize critical thinking as a process involving cognitive and affective domains of reasoning. Nurse educators are often challenged with teaching and measuring CTS because of their latent nature and the lack of a uniform definition of the concept. In this review of the critical thinking literature, we examine various ...

  23. Are You Good at Critical Thinking? [Self-Assessment Test]

    41-60 Points. If you scored 41-60 points, congratulations! You have strong critical thinking skills. You are able to look at complex problems and analyze them logically and objectively to come up with solutions. You are also able to evaluate evidence and make informed decisions.