
Organizing Your Social Sciences Research Paper

Applying Critical Thinking

Critical thinking refers to deliberately scrutinizing and evaluating theories, concepts, or ideas using reasoned reflection and analysis. The act of thinking critically involves moving beyond simply understanding information by questioning its source, its production, and its presentation in order to expose potential bias or researcher subjectivity [i.e., evidence of being influenced by personal opinions and feelings rather than by external determinants]. Applying critical thinking to investigating a research problem involves actively challenging basic assumptions and questioning the choices and potential motives underpinning how a study was designed and executed and how the author arrived at particular conclusions or recommended courses of action. Applying critical thinking to writing involves effectively synthesizing information and generating compelling arguments.


Thinking Critically

Applying Critical Thinking to Research and Writing

Professors like to use the term critical thinking; in fact, the idea of being a critical thinker permeates much of higher education writ large. In the classroom, the idea of thinking critically is often mentioned by professors when students ask how they should approach writing a research paper [other approaches your professor might mention include interdisciplinarity, compare and contrast, gendered perspective, global, etc.]. However, critical thinking is more than just an approach to research and writing. It is an acquired skill associated with becoming a complex learner, one capable of discerning important relationships among the elements of a research problem and of integrating multiple ways of understanding it. Critical thinking is a lens through which you holistically interrogate a topic.

Given this, critical thinking encompasses a variety of interrelated connotations applied to writing a college-level research paper:

  • Integrated and Multi-Dimensional. Critical thinking is not focused on any one element of research but is instead applied holistically throughout the process of identifying the research problem, reviewing the literature, applying methods of analysis, describing the results, discussing their implications, and, if appropriate, offering recommendations for further research. It permeates the entire research endeavor, from contemplating what to write about to proofreading the final product.
  • Humanizes the Research. Thinking critically can help humanize what is being studied by extending the scope of your analysis beyond the traditional boundaries of prior research. Prior research, for example, may have sampled only homogeneous populations, considered only certain factors related to the investigation of a phenomenon, or been limited by the way the study was framed or contextualized. Critical thinking supports opportunities to think about incorporating the experiences of traditionally marginalized groups into the research, leading to a more inclusive and representative examination of the topic.
  • Non-Linear. This refers to analyzing a research problem in ways that do not rely on sequential decision-making or rational forms of reasoning. Creative thinking relies on intuitive judgement, flexibility, and unconventional approaches to investigating complex phenomena in order to discover new insights, connections, and potential solutions. Thinking critically involves going back and modifying your thinking as new evidence emerges, perhaps multiple times throughout the research process, and then drawing conclusions from multiple perspectives as a result of questioning initial impressions about the topic.
  • Normative. This refers to the idea that critical thinking can be used to challenge prior assumptions in ways that advocate for social justice, equity, and resilience, leading to research having a more transformative and expansive impact. In this respect, critical thinking can be viewed as a method for breaking away from dominant culture norms so as to produce research outcomes that illuminate previously hidden aspects of exploitation and injustice.
  • Power Dynamics. Research in the social sciences often includes examining aspects of power and influence, focusing on how power operates, how it can be acquired, and how it is maintained, thereby shaping social relations, organizations, institutions, and the production and maintenance of knowledge. Thinking critically can reveal how societal structures and forces perpetuate power in ways that marginalize and oppress specific groups or communities within the contexts of history, politics, economics, culture, and other factors.
  • Reflection. A key component of critical thinking is practicing reflexivity: the act of turning ideas and concepts back onto yourself in order to reveal and clarify your own beliefs, assumptions, and perspectives. Being critically reflexive is important because it can reveal hidden biases you may have that could unintentionally influence how you interpret and validate information. The more reflexive you are, the more able and comfortable you become in opening yourself up to new modes of understanding.
  • Rigorous Questioning. Thinking critically is guided by asking questions that lead to addressing complex principles, theories, concepts, or problems more effectively, and in so doing, help distinguish what is known from what is not known [or what may be hidden]. Critical thinking involves deliberately framing inquiries not only as hypotheses or axioms, but as a way to apply systematic, disciplined, in-depth forms of questioning about the research problem and in relation to your positionality as a researcher.
  • Social Change. An overarching goal of critical thinking applied to research and writing is to identify and challenge the forces of inequality, exploitation, oppression, and marginalization that contribute to maintaining the status quo within institutions of society. These can include, for example, schools, the court system, businesses, government agencies, or religious organizations that have been created and maintained through certain ways of thinking within the dominant culture. Thinking critically fosters a sense of awareness and empathy about where social change is needed within the overall research process.

Critical thinking permeates the entire research and writing process. However, it applies in particular to the literature review and discussion sections of your paper. These two sections of a research paper most clearly reflect the external/internal duality of thinking critically.

In reviewing the literature, it is important to reflect upon specific aspects of a study, such as: 1) determining if the research design effectively establishes cause-and-effect relationships or provides insight into explaining why certain phenomena do or do not occur; 2) assessing whether the method of gathering data or information supports the objectives of your study; and 3) evaluating if the assumptions used to arrive at a specific conclusion are evidence-based and relevant to addressing the topic. Critical thinking applies to these elements of reviewing prior research by assessing how each source might perpetuate inequalities or hide the voices of others, thereby limiting its applicability for understanding the scope of the problem and its impact throughout society.

Critical thinking applies to the discussion section of your paper because this is where you contemplate the results of your study and explain their significance in relation to addressing the research problem. Discussion involves more than just summarizing findings and describing outcomes. It includes deliberately considering the importance of the findings and providing reasoned explanations of why your paper helps to fill a gap in the literature or expand knowledge and understanding in ways that inform practice. Critical thinking uses reflection to examine your own beliefs concerning the significance of the results in ways that avoid biased judgment and decision making.

Using Questions to Enable Critical Thinking

At its most fundamental level, critical thinking is thinking about thinking in ways that improve the effectiveness of your ability to reason, analyze, synthesize, evaluate, and report information and, as a result, it advances deeper explorations of the topic*. From a practical standpoint, critical thinking is an act of introspective self-examination that involves formulating open-ended questions that inspire higher levels of reasoning about a research problem. The purpose of asking questions during the research process is to apply a framework of inquiry that challenges conventional assumptions, scrutinizes the evidence presented, determines how effectively arguments have been supported by that evidence, discerns patterns or trends in the findings, and helps imagine alternative outcomes if new or different factors were introduced.

Below are examples of questions that can stimulate critical thinking:

  • Why is this a problem?
  • Why does this research problem matter?
  • Does the problem matter to everyone or just certain groups?
  • How might your perspective change if you were on the other side of the argument?
  • What patterns or connections can you see in the results?
  • What key factors could have altered the outcomes described in the results?
  • What evidence would be needed to support any alternative outcomes?
  • Should there be any additional or alternative interpretations of the research outcomes?
  • What is the explanation for the cause of an event or phenomenon?
  • Why has a particular situation or condition arisen?
  • Who will be impacted by the recommendations posed by the author?
  • Who might be excluded from the author’s recommendations?
  • When and how will you know that the recommendations have worked?
  • In what ways can you apply knowledge from this study to new situations?
  • What is another way to look at how the study was designed?
  • How does the study contradict or confirm your understanding of the research problem?
  • Do the outcomes of the study inform your own lived experiences?
  • What do you think is the significance of this study and why?
  • What are the overall strengths and weaknesses of this study?

NOTE: Being a critical thinker doesn't just happen. Casting a critical eye on how effectively others have studied a research problem requires developing self-confidence in your own abilities to actively engage with information, to consistently ask how and why questions about the research, and to deliberately analyze arguments and recommendations set forth by the author. Critically examining your own beliefs and feelings about your writing involves a willingness to be comfortable questioning yourself in a way that promotes a strong sense of self-awareness and introspection. Together, these outward- and inward-looking habits can help improve your critical thinking skills and inform how to effectively research and write a college-level research paper.

* Kharbach, Med. “Examples of Critical Thinking Questions for Students.” Educational Technology and Mobile Learning Blog, last updated November 10, 2023.




Critical Thinking in Academic Research - Second Edition

(4 reviews)


Cindy Gruwell, University of West Florida

Robin Ewing, St. Cloud State University

Copyright Year: 2022

Last Update: 2023

Publisher: Minnesota State Colleges and Universities

Language: English

Conditions of Use: Attribution-ShareAlike

Reviewed by Julie Jaszkowiak, Community Faculty, Metropolitan State University on 12/22/23

Comprehensiveness rating: 5

Organized in 11 parts, this textbook includes introductory information about critical thinking and details about the academic research process. The basics of critical thinking as they relate to doing academic research are covered in Parts I and II. Parts III through XI provide specifics on various steps in doing academic research, including details on finding and citing source material. There is a linked table of contents so the reader is able to jump to a specific section as needed. There is also a works cited page with information and links to works used for this textbook.

Content Accuracy rating: 5

The content of this textbook is accurate and error free. It contains examples that demonstrate concepts from a variety of disciplines, such as “hard science” or “popular culture,” that assist in eliminating bias. The authors are librarians, and it is clear that their experience as such leads to clear and unbiased content.

Relevance/Longevity rating: 5

General concepts about critical thinking and academic research methodology are well defined and should not become obsolete. Specific content regarding the use of citation tools and attribution structure may change, but the links to various research sites allow for simple updates.

Clarity rating: 5

This textbook is written in a conversational manner that allows for a more personal interaction with the textbook. It is like the reader is having a conversation with a librarian. Each part has an introduction section that fully defines concepts and terms used for that part.

Consistency rating: 5

In addition to the written content, this textbook contains links to short quizzes at the end of each section. This is consistent throughout each part. Embedded links to additional information are included as necessary.

Modularity rating: 4

This textbook is arranged in 11 modular parts with each part having multiple sections. All of these are linked so a reader can go to a distinct part or section to find specific information. There are some links that refer back to previous sections in the document. It can be challenging to return to where you were once you have jumped to a different section.

Organization/Structure/Flow rating: 5

There is clear definition as to what information is contained within each of the parts and subsequent sections. The textbook follows the logical flow of the process of researching and writing a research paper.

Interface rating: 4

The pictures have alternative text that appears when you hover over them. One picture on page 102 links to the source the downloaded picture came from. The pictures are clear and support the text for visual learners. All the links work and go either to the correct area of the textbook or to a valid website. If you use the embedded links to jump to other sections of the textbook, you need to keep track of where you are, as it can sometimes get confusing after clicking several links.

Grammatical Errors rating: 4

This is not really a grammatical error, but I did notice on some of the quizzes that if you misspelled a word in a fill-in-the-blank question, it was marked incorrect. It was also sometimes challenging to come up with the correct word for the fill-in-the-blank questions.

Cultural Relevance rating: 5

There are no examples or text that are culturally insensitive or offensive. The examples are general and would be applicable to a variety of students studying many different academic subjects. There are references to many research tools, from traditional ones, such as checking out books and articles from the library, to more current ones, such as blogs and other electronic sources. This information appeals to a wide range of student populations.

I really enjoyed the quizzes at the end of each section. It is very beneficial to test your knowledge and comprehension of what you just read. Often I had to return and reread the content more critically based on my quiz results! They are just the right length to not disrupt the overall reading of the textbook and cover the important content and learning objectives.

Reviewed by Sara Stigberg, Adjunct Reference Librarian, Truman College, City Colleges of Chicago on 3/15/23

Critical Thinking in Academic Research thoroughly covers the basics of academic research for undergraduates, including well-guided deeper dives into relevant areas. The authors root their introduction to academic research principles and practices in the Western philosophical tradition, focused on developing students' critical thinking skills and habits around inquiry, rationales, and frameworks for research.

This text conforms to the principles and frames of the Framework for Information Literacy for Higher Education, published by the Association of College and Research Libraries. It includes excellent, clear, step-by-step guides to help students understand rationales and techniques for academic research.

Essential for our current information climate, the authors present relevant information for students who may be new to academic research, in ways and with content that is not too broad or too narrow, or likely to change drastically in the near future.

The authors use clear and well-considered language and explanations of ideas and terms, contextualizing the scholarly research process and tools in a relatable manner. As mentioned earlier, this text includes excellent step-by-step guides, as well as illustrations, visualizations, and videos to instruct students in conducting academic research.

(4.75) The terminology and framework of this text are consistent. Early discussions of critical thinking skills are tied in to content in later chapters, with regard to selecting different types of sources and search tools, as well as rationales for choosing various formats of source references. Consciously making the theme of critical thinking as applied to the stages of academic research more explicit and frequent within the text would further strengthen it, however.

Modularity rating: 5

Chapters are divided in a logical, progressive manner throughout the text. The use of embedded links to further readings and some other relevant sections of the text are an excellent way of providing references and further online information, without overwhelming or side-tracking the reader.

Topics in the text are organized in logical, progressive order, transitioning cleanly from one focus to the next. Each chapter begins with a helpful outline of topics that will be covered within it.

There are no technical issues with the interface for this text. Interactive learning tools, such as the many self-checks and short quizzes included throughout the text, are a great bonus for reinforcing student learning, and the easily accessible table of contents was very helpful. There are some slight inconsistencies across chapters, however, in the formatting of images, text, and spacing, and an image was missing in the section on Narrowing a Topic. Justifying the copy rather than aligning it left would prevent hyphenation, making the text more streamlined.

Grammatical Errors rating: 5

(4.75) A few minor punctuation errors are present.

The authors of this text use culturally-relevant examples and inclusive language. The chapter on Barriers to Critical Thinking works directly to break down bias and preconceived notions.

Overall, Critical Thinking in Academic Research is an excellent general textbook for teaching the whys and hows of academic research to undergraduates. A discussion of annotated bibliographies would be a great addition for future editions of the text. (As an aside for the authors, I am curious whether the anonymous data from the self-checks and quizzes is being collected and analyzed for assessment purposes. I'm sure it would be interesting!)

Reviewed by Ann Bell-Pfeifer, Program Director/ Instructor, Minnesota State Community and Technical College on 2/15/23

Comprehensiveness rating: 4

The book has in-depth coverage of academic research. A formal glossary and index were not included.

The book appears error free and factual.

The content is current and would support students who are pursuing writing academic research papers.

Excellent explanations for specific terms were included throughout the text.

The text is easy to follow with a standardized format and structure.

The text contains headings and topics in each section.

It is easy to follow the format and review each section.

Interface rating: 5

The associated links were useful and not distracting.

No evidence of grammatical errors was found in the book.

The book is inclusive.

The book was informative, easy to follow, and sequential, allowing the reader to digest each section before moving on to another.

Reviewed by Jenny Inker, Assistant Professor, Virginia Commonwealth University on 8/23/22

This book provides a comprehensive yet easily comprehensible introduction to critical thinking in academic research. The author lays a foundation with an introduction to the concepts of critical thinking and analyzing and making arguments, and then moves into the details of developing research questions and identifying and appropriately using research sources. There are many wonderful links to other open access publications for those who wish to read more or go deeper.

The content of the book appears to be accurate and free of bias.

The examples used throughout the book are relevant and up-to-date, making it easy to see how to apply the concepts in real life.

The text is very accessibly written and the content is presented in a simple, yet powerful way that helps the reader grasp the concepts easily. There are many short, interactive exercises scattered throughout each chapter of the book so that the reader can test their own knowledge as they go along. It would be even better if the author had provided some simple feedback explaining why quiz answers are correct or incorrect in order to bolster learning, but this is a very minor point and the interactive exercises still work well without this.

The book appears consistent throughout with regard to use of terminology and tone of writing. The basic concepts introduced in the early chapters are used consistently throughout the later chapters.

This book has been wonderfully designed into bite sized chunks that do not overwhelm the reader. This is perhaps its best feature, as this encourages the reader to take in a bit of information, digest it, check their understanding of it, and then move on to the next concept. I loved this!

The book is organized in a manner that introduces the basic architecture of critical thinking first, and then moves on to apply it to the subject of academic research. While the entire book would be helpful for college students (undergraduates particularly), the earlier chapters on critical thinking and argumentation also stand well on their own and would be of great utility to students in general.

This book was extremely easy to navigate with a clear, drop down list of chapters and subheadings on the left side of the screen. When the reader clicks on links to additional material, these open up in a new tab which keeps things clear and organized. Images and charts were clear and the overall organization is very easy to follow.

I came across no grammatical errors in the text.

Cultural Relevance rating: 4

This is perhaps an area where the book could do a little more. I did not come across anything that seemed culturally insensitive or offensive but on the other hand, the book might have taken more opportunities to represent a greater diversity of races, ethnicities, and backgrounds.

This book seems tailor-made for undergraduate college students and I would highly recommend it. I think it has some use for graduate students as well, although some of the examples are perhaps a little basic for this purpose. As well as using this book to guide students in doing academic research, I think it could also be used as a very helpful introduction to the concept of critical thinking by focusing solely on chapters 1-4.

Table of Contents

  • Introduction
  • Part I. What is Critical Thinking?
  • Part II. Barriers to Critical Thinking
  • Part III. Analyzing Arguments
  • Part IV. Making an Argument
  • Part V. Research Questions
  • Part VI. Sources and Information Needs
  • Part VII. Types of Sources
  • Part VIII. Precision Searching
  • Part IX. Evaluating Sources
  • Part X. Ethical Use and Citing Sources
  • Part XI. Copyright Basics
  • Works Cited
  • About the Authors

About the Book

Critical Thinking in Academic Research - 2nd Edition provides examples and easy-to-understand explanations to equip students with the skills to develop research questions, evaluate and choose the right sources, search for information, and understand arguments. This 2nd Edition includes new content based on student feedback as well as additional interactive elements throughout the text.

About the Contributors

Cindy Gruwell is an Assistant Librarian/Coordinator of Scholarly Communication at the University of West Florida. She is the library liaison to the Department of Biology and the College of Health, which has extensive programs in nursing, public health, health administration, movement, and medical laboratory sciences. In addition to supporting health sciences faculty, she oversees the Argo IRCommons (institutional repository) and provides scholarly communication services to faculty across campus. Cindy graduated with her BA (History) and MLS from the University of California, Los Angeles, and has a Master's in Education from Bemidji State University. Cindy's research interests include academic research support, publishing, and teaching.

Robin Ewing is a Professor/Collections Librarian at St. Cloud State University. Robin is the liaison to the College of Education and Learning Design. She oversees content selection for the Library's collections. Robin graduated with her BBA (Management) and MLIS from the University of Oklahoma. She also has a Master of Arts in Teaching from Bemidji State University. Robin's research interests include collection analysis, assessment, and online teaching.



11.5 Critical Thinking and Research Applications

Learning Objectives

  • Analyze source materials to determine how they support or refute the working thesis.
  • Identify connections between source materials and eliminate redundant or irrelevant source materials.
  • Identify instances when it is appropriate to use human sources, such as interviews or eyewitness testimony.
  • Select information from sources to begin answering the research questions.
  • Determine an appropriate organizational structure for the research paper that uses critical analysis to connect the writer’s ideas and information taken from sources.

At this point in your project, you are preparing to move from the research phase to the writing phase. You have gathered much of the information you will use, and soon you will be ready to begin writing your draft. This section helps you transition smoothly from one phase to the next.

Beginning writers sometimes attempt to transform a pile of note cards into a formal research paper without any intermediary step. This approach presents problems. The writer’s original question and thesis may be buried in a flood of disconnected details taken from research sources. The first draft may present redundant or contradictory information. Worst of all, the writer’s ideas and voice may be lost.

An effective research paper focuses on the writer’s ideas—from the question that sparked the research process to how the writer answers that question based on the research findings. Before beginning a draft, or even an outline, good writers pause and reflect. They ask themselves questions such as the following:

  • How has my thinking changed based on my research? What have I learned?
  • Was my working thesis on target? Do I need to rework my thesis based on what I have learned?
  • How does the information in my sources mesh with my research questions and help me answer those questions? Have any additional important questions or subtopics come up that I will need to address in my paper?
  • How do my sources complement each other? What ideas or facts recur in multiple sources?
  • Where do my sources disagree with each other, and why?

In this section, you will reflect on your research and review the information you have gathered. You will determine what you now think about your topic. You will synthesize, or put together, different pieces of information that help you answer your research questions. Finally, you will determine the organizational structure that works best for your paper and begin planning your outline.

Review the research questions and working thesis you developed in Chapter 11 “Writing from Research: What Will I Learn?”, Section 11.2 “Steps in Developing a Research Proposal”. Set a timer for ten minutes and write about your topic, using your questions and thesis to guide your writing. Complete this exercise without looking over your notes or sources. Base your writing on the overall impressions and concepts you have absorbed while conducting research. If additional, related questions come to mind, jot them down.

Selecting Useful Information

At this point in the research process, you have gathered information from a wide variety of sources. Now it is time to think about how you will use this information as a writer.

When you conduct research, you keep an open mind and seek out many promising sources. You take notes on any information that looks like it might help you answer your research questions. Often, new ideas and terms come up in your reading, and these, too, find their way into your notes. You may record facts or quotations that catch your attention even if they did not seem immediately relevant to your research question. By now, you have probably amassed an impressively detailed collection of notes.

You will not use all of your notes in your paper.

Good researchers are thorough. They look at multiple perspectives, facts, and ideas related to their topic, and they gather a great deal of information. Effective writers, however, are selective. They determine which information is most relevant and appropriate for their purpose. They include details that develop or explain their ideas—and they leave out details that do not. The writer, not the pile of notes, is the controlling force. The writer shapes the content of the research paper.

While working through Chapter 11 “Writing from Research: What Will I Learn?”, Section 11.4 “Strategies for Gathering Reliable Information”, you used strategies to filter out unreliable or irrelevant sources and details. Now you will apply your critical-thinking skills to the information you recorded—analyzing how it is relevant, determining how it meshes with your ideas, and finding how it forms connections and patterns.

Writing at Work

When you create workplace documents based on research, selectivity remains important. A project team may spend months conducting market surveys to prepare for rolling out a new product, but few managers have time to read the research in its entirety. Most employees want the research distilled into a few well-supported points. Focused, concise writing is highly valued in the workplace.

Identify Information That Supports Your Thesis

In Note 11.81 “Exercise 1”, you revisited your research questions and working thesis. The process of writing informally helped you see how you might begin to pull together what you have learned from your research. Do not feel anxious, however, if you still have trouble seeing the big picture. Systematically looking through your notes will help you.

Begin by identifying the notes that clearly support your thesis. Mark or group these, either physically or using the cut-and-paste function in your word-processing program. As you identify the crucial details that support your thesis, make sure you analyze them critically. Ask the following questions to focus your thinking:

  • Is this detail from a reliable, high-quality source? Is it appropriate for me to cite this source in an academic paper? The bulk of the support for your thesis should come from reliable, reputable sources. If most of the details that support your thesis are from less-reliable sources, you may need to do additional research or modify your thesis.
  • Is the link between this information and my thesis obvious—or will I need to explain it to my readers? Remember, you have spent more time thinking and reading about this topic than your audience. Some connections might be obvious to both you and your readers. More often, however, you will need to provide the analysis or explanation that shows how the information supports your thesis. As you read through your notes, jot down ideas you have for making those connections clear.
  • What personal biases or experiences might affect the way I interpret this information? No researcher is 100 percent objective. We all have personal opinions and experiences that influence our reactions to what we read and learn. Good researchers are aware of this human tendency. They keep an open mind when they read opinions or facts that contradict their beliefs.

It can be tempting to ignore information that does not support your thesis or that contradicts it outright. However, such information is important. At the very least, it gives you a sense of what has been written about the issue. More importantly, it can help you question and refine your own thinking so that writing your research paper is a true learning process.

Find Connections between Your Sources

As you find connections between your ideas and information in your sources, also look for information that connects your sources. Do most sources seem to agree on a particular idea? Are some facts mentioned repeatedly in many different sources? What key terms or major concepts come up in most of your sources regardless of whether the sources agree on the finer points? Identifying these connections will help you identify important ideas to discuss in your paper.

Look for subtler ways your sources complement one another, too. Does one author refer to another’s book or article? How do sources that are more recent build upon the ideas developed in earlier sources?

Be aware of any redundancies in your sources. If you have amassed solid support from a reputable source, such as a scholarly journal, there is no need to cite the same facts from an online encyclopedia article that is many steps removed from any primary research. If a given source adds nothing new to your discussion and you can cite a stronger source for the same information, use the stronger source.

Determine how you will address any contradictions found among different sources. For instance, if one source cites a startling fact that you cannot confirm anywhere else, it is safe to dismiss the information as unreliable. However, if you find significant disagreements among reliable sources, you will need to review them and evaluate each source. Which source presents a sounder argument or more solid evidence? It is up to you to determine which source is the most credible and why.

Finally, do not ignore any information simply because it does not support your thesis. Carefully consider how that information fits into the big picture of your research. You may decide that the source is unreliable or the information is not relevant, or you may decide that it is an important point you need to bring up. What matters is that you give it careful consideration.

As Jorge reviewed his research, he realized that some of the information was not especially useful for his purpose. His notes included several statements about the relationship between soft drinks that are high in sugar and childhood obesity—a subtopic that was too far outside of the main focus of the paper. Jorge decided to cut this material.

Reevaluate Your Working Thesis

A careful analysis of your notes will help you reevaluate your working thesis and determine whether you need to revise it. Remember that your working thesis was the starting point—not necessarily the end point—of your research. You should revise your working thesis if your ideas changed based on what you read. Even if your sources generally confirmed your preliminary thinking on the topic, it is still a good idea to tweak the wording of your thesis to incorporate the specific details you learned from research.

Jorge realized that his working thesis oversimplified the issues. He still believed that the media was exaggerating the benefits of low-carb diets. However, his research led him to conclude that these diets did have some advantages. Read Jorge’s revised thesis.

Although following a low-carbohydrate diet can benefit some people, these diets are not necessarily the best option for everyone who wants to lose weight or improve their health.

Synthesizing and Organizing Information

By now your thinking on your topic is taking shape. You have a sense of what major ideas to address in your paper, what points you can easily support, and what questions or subtopics might need a little more thought. In short, you have begun the process of synthesizing information—that is, of putting the pieces together into a coherent whole.

It is normal to find this part of the process a little difficult. Some questions or concepts may still be unclear to you. You may not yet know how you will tie all of your research together. Synthesizing information is a complex, demanding mental task, and even experienced researchers struggle with it at times. A little uncertainty is often a good sign! It means you are challenging yourself to work thoughtfully with your topic instead of simply restating the same information.

Use Your Research Questions to Synthesize Information

You have already considered how your notes fit with your working thesis. Now, take your synthesis a step further. Analyze how your notes relate to your major research question and the subquestions you identified in Chapter 11 “Writing from Research: What Will I Learn?”, Section 11.2 “Steps in Developing a Research Proposal”. Organize your notes with headings that correspond to those questions. As you proceed, you might identify some important subtopics that were not part of your original plan, or you might decide that some questions are not relevant to your paper.

Categorize information carefully and continue to think critically about the material. Ask yourself whether the sources are reliable and whether the connections between ideas are clear.

Remember, your ideas and conclusions will shape the paper. They are the glue that holds the rest of the content together. As you work, begin jotting down the big ideas you will use to connect the dots for your reader. (If you are not sure where to begin, try answering your major research question and subquestions. Add and answer new questions as appropriate.) You might record these big ideas on sticky notes or type and highlight them within an electronic document.

Jorge looked back on the list of research questions that he had written down earlier. He changed a few to match his new thesis, and he began a rough outline for his paper.

[Image: Jorge's rough outline]

Review your research questions and working thesis again. This time, keep them nearby as you review your research notes.

  • Identify information that supports your working thesis.
  • Identify details that call your thesis into question. Determine whether you need to modify your thesis.
  • Use your research questions to identify key ideas in your paper. Begin categorizing your notes according to which topics are addressed. (You may find yourself adding important topics or deleting unimportant ones as you proceed.)
  • Write out your revised thesis and at least two or three big ideas.

You may be wondering how your ideas are supposed to shape the paper, especially since you are writing a research paper based on your research. Integrating your ideas and your information from research is a complex process, and sometimes it can be difficult to separate the two.

Some paragraphs in your paper will consist mostly of details from your research. That is fine, as long as you explain what those details mean or how they are linked. You should also include sentences and transitions that show the relationship between different facts from your research by grouping related ideas or pointing out connections or contrasts. The result is that you are not simply presenting information; you are synthesizing, analyzing, and interpreting it.

Plan How to Organize Your Paper

The final step to complete before beginning your draft is to choose an organizational structure. For some assignments, this may be determined by the instructor’s requirements. For instance, if you are asked to explore the impact of a new communications device, a cause-and-effect structure is obviously appropriate. In other cases, you will need to determine the structure based on what suits your topic and purpose. For more information about the structures used in writing, see Chapter 10 “Rhetorical Modes”.

The purpose of Jorge’s paper was primarily to persuade. With that in mind, he planned the following outline.

[Image: An outline for Jorge's paper]

Review the organizational structures discussed in this section and Chapter 10 “Rhetorical Modes”. Working with the notes you organized earlier, follow these steps to begin planning how to organize your paper.

  1. Create an outline that includes your thesis, major subtopics, and supporting points.
  2. The major headings in your outline will become sections or paragraphs in your paper. Remember that your ideas should form the backbone of the paper. For each major section of your outline, write out a topic sentence stating the main point you will make in that section.
  3. As you complete step 2, you may find that some points are too complex to explain in a sentence. Consider whether any major sections of your outline need to be broken up, and jot down additional topic sentences as needed.
  4. Review your notes and determine how the different pieces of information fit into your outline as supporting points.

Collaboration

Please share the outline you created with a classmate. Examine your classmate’s outline and see if any questions come to mind or if you see any area that would benefit from an additional point or clarification. Return the outlines to each other and compare observations.

The structures described in this section and Chapter 10 “Rhetorical Modes” can also help you organize information in different types of workplace documents. For instance, medical incident reports and police reports follow a chronological structure. If the company must choose between two vendors to provide a service, you might write an e-mail to your supervisor comparing and contrasting the choices. Understanding when and how to use each organizational structure can help you write workplace documents efficiently and effectively.

Key Takeaways

  • An effective research paper focuses on presenting the writer’s ideas using information from research as support.
  • Effective writers spend time reviewing, synthesizing, and organizing their research notes before they begin drafting a research paper.
  • It is important for writers to revisit their research questions and working thesis as they transition from the research phase to the writing phase of a project. Usually, the working thesis will need at least minor adjustments.
  • To organize a research paper, writers choose a structure that is appropriate for the topic and purpose. Longer papers may make use of more than one structure.

Writing for Success Copyright © 2015 by University of Minnesota is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.

  • Open access
  • Published: 11 September 2019

Inquiry and critical thinking skills for the next generation: from artificial intelligence back to human intelligence

  • Jonathan Michael Spector (ORCID: orcid.org/0000-0002-6270-3073)
  • Shanshan Ma

Smart Learning Environments, volume 6, Article number: 8 (2019)


Along with the increasing attention to artificial intelligence (AI), renewed emphasis or reflection on human intelligence (HI) is appearing in many places and at multiple levels. One of the foci is critical thinking. Critical thinking is one of four key 21st century skills – communication, collaboration, critical thinking and creativity. Though most people are aware of the value of critical thinking, it lacks emphasis in curricula. In this paper, we present a comprehensive definition of critical thinking that ranges from observation and inquiry to argumentation and reflection. Given a broad conception of critical thinking, a developmental approach beginning with children is suggested as a way to help develop critical thinking habits of mind. The conclusion of this analysis is that more emphasis should be placed on developing human intelligence, especially in young children and with the support of artificial intelligence. While much funding and support goes to the development of artificial intelligence, this should not happen at the expense of human intelligence. Overall, the purpose of this paper is to argue for more attention to the development of human intelligence with an emphasis on critical thinking.

Introduction

In recent decades, advancements in Artificial Intelligence (AI) have developed at an incredible rate. AI has penetrated people’s daily life on a variety of levels, such as smart homes, personalized healthcare, security systems, self-service stores, and online shopping. One notable AI achievement came in 2016, when AlphaGo, a computer program, defeated the World Go Champion Lee Sedol. The previous year, AlphaGo had won a competition against a professional Go player (Silver et al. 2016). As Go is one of the most challenging games, these wins by AI indicated a breakthrough. Public attention has been further drawn to AI since then, and AlphaGo has continued to improve. In 2017, a new version of AlphaGo beat Ke Jie, then the world’s No. 1 ranked Go player. Clearly AI can manage high levels of complexity.

Given many changes and multiple lines of development and implementation, it is somewhat difficult to define AI in a way that includes all of the changes since the 1980s (Luckin et al. 2016). Many definitions take two dimensions as a starting point: (a) human-like thinking and (b) rational action (Russell and Norvig 2009). Basically, AI is a term used to label machines (computers) that imitate human cognitive functions, such as learning and problem solving, or that manage complexity as well as human experts do.

AlphaGo’s wins against human players were seen as a comparison between artificial and human intelligence. One concern is that AI has already surpassed HI; other concerns are that AI will replace humans in some settings or that AI will become uncontrollable (Epstein 2016; Fang et al. 2018). Scholars worry that AI technology in the future might trigger the singularity (Good 1966), a hypothesized point at which the development of technology becomes uncontrollable and irreversible, resulting in unfathomable changes to human civilization (Vinge 1993).

The famous theoretical physicist Stephen Hawking warned that AI might end mankind, yet the technology he used to communicate involved a basic form of AI (Cellan-Jones 2014). This example highlights one of the basic dilemmas of AI, namely: what are the overall benefits of AI versus its potential drawbacks, and how should we move forward given its rapid development? Obviously, basic or controllable AI technologies are not what people are afraid of. Spector et al. (1993) distinguished strong AI and weak AI. Strong AI involves an application that is intended to replace an activity performed previously by a competent human, while weak AI involves an application that aims to enable a less experienced human to perform at a much higher level. Other researchers categorize AI into three levels: (a) artificial narrow intelligence (Narrow AI), (b) artificial general intelligence (General AI), and (c) artificial super intelligence (Super AI) (Siau and Yang 2017; Zhang and Xie 2018). Narrow AI, sometimes called weak AI, refers to a computer that focuses on a narrow task, such as AlphaZero or a self-driving car. General AI, sometimes referred to as strong AI, is the simulation of human-level intelligence and can perform a broad range of cognitive tasks about as well as most humans. Super AI is defined by Bostrom (1998) as “an intellect that is much smarter than the best human brains in practically every field, including scientific creativity, general wisdom and social skills” (p. 1).

Although the consequences of the singularity and its potential benefits or harm to the human race have been intensely debated, an undeniable fact is that AI is capable of undertaking recursive self-improvement. With the increasing improvement of this capability, more intelligent generations of AI will appear rapidly. HI, on the other hand, has its own limits, and its development requires continuous effort and investment from generation to generation. Education is the main approach humans use to develop and improve HI. Given the extraordinary gap in growth rates between AI and HI, AI may eventually surpass HI. However, that is no reason to neglect the development and improvement of HI. In addition, in contrast to the slow development rate of HI, funding support for AI has been growing rapidly, as the following comparison of support for artificial and human intelligence shows.

Funding support for artificial and human intelligence

There are challenges in comparing artificial and human intelligence by identifying funding for both. Both terms are somewhat vague and can include a variety of aspects. Some analyses include big data and data analytics within the sphere of artificial intelligence, while others treat them separately. Some include early childhood developmental research within the sphere of support for HI, and others treat it separately. Education is a major way for human beings to develop and improve HI. Investments in education reflect the effort put into the development of HI, and they pale in comparison with investments in AI.

Sources also vary, from governmental funding of research and development to business and industry investments in related research and development. Nonetheless, there are strong indications of increased funding support for AI in North America, Europe and Asia, especially in China. The growth in funding for AI around the world is explosive. According to ZDNet, AI funding more than doubled from 2016 to 2017 and more than tripled from 2016 to 2018. The growth in funding for AI in the last 10 years has been exponential. According to Venture Scanner, approximately 2500 companies have raised $60 billion in funding from 3400 investors in 72 different countries (see https://www.slideshare.net/venturescanner/artificial-intelligence-q1-2019-report-highlights). Areas included in the Venture Scanner analysis include virtual assistants, recommendation engines, video recognition, context-aware computing, speech recognition, natural language processing, machine learning, and more.

The above data on AI funding focus primarily on companies making products. There is no direct counterpart in the area of HI, where the emphasis is on learning and education. What can be seen, however, are trends within each area. The above data suggest exponential growth in support for AI. In contrast, according to the Urban Institute, per-student funding in the USA has been relatively flat for nearly two decades, with a few states showing modest increases and others showing none (see http://apps.urban.org/features/education-funding-trends/). Funding for education is complicated due to its various sources. In the USA, there are local, state and federal sources to consider. While that mixture of funding sources is complex, it is clear that federal and state spending for education in the USA increased after World War II. However, since the 1980s, federal spending for education has steadily declined, and state spending on education in most states has declined since 2010, according to a government report (see https://www.usgovernmentspending.com/education_spending). This decline in funding reflects a decreasing emphasis on the development of HI, which is a dangerous signal.

Decreased support for education funding in the USA is not typical of what is happening in other countries, according to The Hechinger Report (see https://hechingerreport.org/rest-world-invests-education-u-s-spends-less/ ). For example, between 2010 and 2014, American spending on elementary and high school education declined by 3%, whereas education spending across the 35 OECD countries rose by 5% over the same period, with some countries experiencing very significant increases (e.g., 76% in Turkey).

Such data can be questioned in terms of how effectively funds are being spent or how poorly a country was doing prior to experiencing a significant increase. Nonetheless, the relative lack of funding support in the USA roughly tracks the mediocre performance of American students on the Program for International Student Assessment (PISA) tests (see https://nces.ed.gov/surveys/pisa/pisa2015/index.asp ). Research by Darling-Hammond ( 2014 ) indicated that systematic government investments in high-need schools would be more effective at improving learning and reducing the achievement gap if they focused on capacity building: improving the knowledge and skills of educators and the quality of curriculum opportunities.

Though HI cannot be reduced to performance on PISA tests, improving HI likewise requires systematic effort and funding support in high-need areas. The following section therefore presents a reflection on HI.

Reflection on human intelligence

Though there is a variety of definitions of HI, from the perspective of psychology, intelligence can be seen as a form of developing expertise (Sternberg 1999 ): to move from novice (or less experienced) to expert (or more experienced), a student must pass through multiple learning (implicit and explicit) and thinking (critical and creative) processes. In this paper, we adopt that view and reflect on HI in the following sections by discussing learning and critical thinking.

What is learning?

We begin with Gagné’s ( 1985 ) definition of learning as stable and persistent changes in what a person knows or can do. How do humans learn? Do you recall how to prove that the square root of 2 is not a rational number, something you might have learned years ago? The method is intriguing: an indirect proof, or reduction to absurdity. Assume that the square root of 2 is a rational number, then apply truth-preserving rules until you arrive at a contradiction, showing that the square root of 2 cannot be rational. We recommend this as an exercise for readers who have never encountered that method of learning and proof (see https://artofproblemsolving.com/wiki/index.php/Proof_by_contradiction ). Another interesting method of learning is the process of elimination, sometimes attributed to Arthur Conan Doyle ( 1926 ) in The Adventure of the Blanched Soldier, in which Sherlock Holmes tells Dr. Watson that the process of elimination "starts upon the supposition that when you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth" (see https://www.dfw-sherlock.org/uploads/3/7/3/8/37380505/1926_november_the_adventure_of_the_blanched_soldier.pdf ).
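
For readers who have not seen it, a minimal rendering of that indirect proof runs as follows (our sketch of the standard argument, not a formulation taken from the sources above):

```latex
% Sketch: the square root of 2 is irrational.
% Assume, for contradiction, $\sqrt{2} = p/q$ in lowest terms ($p, q$ integers).
\begin{align*}
\sqrt{2} &= \frac{p}{q} \;\Rightarrow\; 2q^2 = p^2
  && \text{so } p^2 \text{ is even, hence } p \text{ is even: write } p = 2k \\
2q^2 &= (2k)^2 = 4k^2 \;\Rightarrow\; q^2 = 2k^2
  && \text{so } q \text{ is even as well.}
\end{align*}
% Both $p$ and $q$ are divisible by 2, contradicting "lowest terms";
% therefore $\sqrt{2}$ cannot be rational.
```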

We mention Sherlock Holmes early in this paper to emphasize the role that observation plays in learning. The character was famous for the observational skills that fed his so-called method of deductive reasoning (a process of elimination), which logicians would actually classify as inductive reasoning, since its conclusions are primarily probabilistic rather than certain, unlike the proof of the irrationality of the square root of 2 mentioned previously.
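
To make the logic of elimination concrete, here is a minimal sketch in Python (our illustration only; the suspects, observations, and `consistent` test are invented for the example):

```python
# Reasoning by elimination: start with all candidate explanations, discard
# any candidate inconsistent with an observation, and whatever remains,
# however improbable, is the conclusion.

def eliminate(candidates, observations, consistent):
    """Return the candidates that survive every observation.

    `consistent(candidate, observation)` is a caller-supplied test; a
    candidate is dropped as soon as one observation rules it out.
    """
    survivors = set(candidates)
    for obs in observations:
        survivors = {c for c in survivors if consistent(c, obs)}
    return survivors

# Toy usage: which suspect fits all the evidence?
suspects = {"gardener", "butler", "visitor"}
observations = ["muddy boots", "knows the safe code"]
facts = {
    ("gardener", "muddy boots"): True,
    ("gardener", "knows the safe code"): False,
    ("butler", "muddy boots"): True,
    ("butler", "knows the safe code"): True,
    ("visitor", "muddy boots"): False,
    ("visitor", "knows the safe code"): True,
}
print(eliminate(suspects, observations, lambda c, o: facts[(c, o)]))
# {'butler'} -- the only candidate not eliminated
```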

In dealing with uncertainty, one must make observations and gather evidence that can lead to a likely conclusion. Is that not what reasonable people and accomplished detectives do? It is certainly what card counters do at gambling houses: they observe the high- and low-value cards that have already been played in order to estimate the likelihood of the next card being high or low. Observation is a critical process in dealing with uncertainty.
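
As a toy illustration of what such observation buys (our example, using a simplified high/low split of a single 52-card deck rather than any real counting system):

```python
# Assume one 52-card deck where "high" means 10, J, Q, K, or A (20 cards)
# and "low" means 2 through 9 (32 cards). Observation updates the estimate.

def chance_next_is_high(seen_high, seen_low):
    """Estimate P(next card is high) from the cards already observed."""
    remaining_high = 20 - seen_high
    remaining_low = 32 - seen_low
    return remaining_high / (remaining_high + remaining_low)

# After observing 12 high and 8 low cards played:
print(round(chance_next_is_high(12, 8), 3))  # 0.25 (8 high left out of 32)
```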

Moreover, humans typically encounter many uncertain situations in the course of life; few people encounter situations that require resolution using a mathematical proof such as the one with which this article began. Jonassen ( 2000 , 2011 ) argued that problem solving is one of the most important and frequent activities in which people engage, and many of the more challenging problems are ill-structured in the sense that (a) there is incomplete information pertaining to the situation, (b) the ideal resolution of the problem is unknown, or (c) how to transform a problematic situation into an acceptable situation is unclear. In short, people are confronted with uncertainty nearly every day and in many different ways. The so-called key 21st century skills of communication, collaboration, critical thinking and creativity (the 4 Cs; see http://www.battelleforkids.org/networks/p21 ) are important because uncertainty is a natural and inescapable aspect of the human condition. The 4 Cs have been presented by Spector ( 2018 ) as interrelated capabilities involving logic and epistemology in the form of the new 3Rs: re-examining, reasoning, and reflecting. Re-examining is directly linked to observation as a beginning point for inquiry. The method of elimination is one form of reasoning in which a person engages to solve challenging problems. Reflecting on how well one is doing in the life-long enterprise of solving challenging problems is a higher-order metacognitive activity in which accomplished problem-solvers engage (Ericsson et al. 1993 ; Flavell 1979 ).

Based on these initial comments, a comprehensive definition of critical thinking is presented next in the form of a framework.

A framework of critical thinking

Though there is a variety of definitions of critical thinking, a concise definition remains elusive. To give readers such as parents and school teachers a direct understanding of critical thinking, we present in this paper a comprehensive definition in the form of a framework that incorporates many of the definitions offered by others. Critical thinking, as treated broadly herein, is a multi-dimensional and multifaceted human capability. It has been interpreted from three perspectives: education, psychology, and epistemology, all of which are represented in the framework that follows.

In a developmental approach to critical thinking, Spector ( 2019 ) argues that critical thinking involves a series of cumulative and related abilities, dispositions and other variables (e.g., motivation, criteria, context, knowledge). This approach proceeds from experience (e.g., observing something unusual) and then to various forms of inquiry, investigation, examination of evidence, exploration of alternatives, argumentation, testing conclusions, rethinking assumptions, and reflecting on the entire process.

Experience and engagement are ongoing throughout the process which proceeds from relatively simple experiences (e.g., direct and immediate observation) to more complex interactions (e.g., manipulation of an actual or virtual artifact and observing effects).

The developmental approach involves a variety of mental processes and non-cognitive states that help make a person’s decision making purposeful and goal directed. The associated critical thinking skills make individuals more likely to achieve a desired outcome in a challenging situation.

In the process of critical thinking, beyond experience, two additional cognitive capabilities are essential: metacognition and self-regulation . Many researchers (e.g., Schraw et al. 2006 ) hold that metacognition has two components: (a) awareness and understanding of one’s own thoughts, and (b) the ability to regulate one’s own cognitive processes. Other researchers put more emphasis on the latter component; for example, Davies ( 2015 ) described metacognition as the capacity to monitor the quality of one’s thinking process and then to make appropriate changes. The American Psychological Association (APA), in turn, defines metacognition as awareness and understanding of one’s own thought together with the ability to control related cognitive processes (see https://psycnet.apa.org/record/2008-15725-005 ).

Although the definition and elaboration of these two concepts deserve further exploration, they are often used interchangeably (Hofer and Sinatra 2010 ; Schunk 2008 ). Many psychologists see metacognition and self-regulation as closely related capabilities, two sides of the same coin, so to speak. Metacognition involves and emphasizes awareness, whereas self-regulation involves and emphasizes appropriate control. Taken together, the two enable a person to create a self-regulatory mechanism that monitors and regulates the corresponding skills (e.g., observation, inquiry, interpretation, explanation, reasoning, analysis, evaluation, synthesis, reflection, and judgement).

As for critical thinking skills, there is much discussion about their generalizability and domain specificity, just as there is about problem-solving skills in general (Chi et al. 1982 ; Chiesi et al. 1979 ; Ennis 1989 ; Fischer 1980 ). The research supports the notion that achieving high levels of expertise and performance requires high levels of domain knowledge. As a consequence, becoming a highly effective critical thinker in a particular domain of inquiry requires significant domain knowledge: one may reach such levels in a domain in which one has significant knowledge and experience but not in a different domain in which one has little. The processes involved in developing high levels of critical thinking, however, are somewhat generic. It is therefore possible to develop critical thinking in nearly any domain when the two additional capabilities of metacognition and self-regulation are coupled with motivation, engagement, and supportive emotional states (Ericsson et al. 1993 ).

Consequently, the framework presented here (see Fig. 1 ) is built around three main perspectives on critical thinking (educational, psychological, and epistemological) and relevant learning theories. It provides a visual presentation of critical thinking along four dimensions: abilities (educational perspective), dispositions (psychological perspective), levels (epistemological perspective), and time. Time is included to emphasize the dynamic nature of critical thinking in terms of a specific context and a developmental approach.

[Fig. 1: A framework of critical thinking, represented along four dimensions: abilities, dispositions, levels, and time]

Critical thinking often begins with simple experiences, such as observing a difference, encountering a puzzling question or problem, or questioning someone’s statement. In some instances this leads to an inquiry, and then to more complex experiences such as interactions and the application of higher-order thinking skills (e.g., logical reasoning, questioning assumptions, considering and evaluating alternative explanations).

If the individual is not interested in what was observed, an inquiry typically does not begin: inquiry and critical thinking require motivation along with an inquisitive disposition. The process of critical thinking also requires the support of corresponding internal dispositions such as open-mindedness and truth-seeking. Consequently, both a disposition to initiate an inquiry (e.g., curiosity) and internal dispositions (e.g., mental habits linked to something motivating to the individual) are required (Hitchcock 2018 ). Initiating dispositions are those that contribute to the start of inquiry and critical thinking; internal dispositions are those that initiate and support the corresponding critical thinking skills during the process. Critical thinking dispositions thus consist of initiating dispositions and internal dispositions. Beyond these factors, critical thinking also involves motivation, and motivation and dispositions are not mutually exclusive; curiosity, for example, is both a disposition and a motivation.

Critical thinking abilities and dispositions are the two main components of critical thinking, and they involve such interrelated cognitive constructs as interpretation, explanation, reasoning, evaluation, synthesis, reflection, judgement, metacognition, and self-regulation (Dwyer et al. 2014 ; Davies 2015 ; Ennis 2018 ; Facione 1990 ; Hitchcock 2018 ; Paul and Elder 2006 ). There are also other abilities, such as communication, collaboration, and creativity, that are now essential in contemporary society (see https://en.wikipedia.org/wiki/21st_century_skills ). Those abilities together with critical thinking are called the 4Cs; each is monitored and regulated through metacognitive and self-regulation processes.

Bloom’s taxonomy categorizes the abilities involved in critical thinking into higher-order skills (e.g., analyzing and synthesizing) and lower-order skills (e.g., remembering and applying) (Anderson and Krathwohl 2001 ; Bloom et al. 1956 ).

The thinking process can be depicted as a spiral through both lower- and higher-order thinking skills. It encompasses several reasoning loops, some of which may iterate until a desired outcome is achieved. Each loop can mix higher-order and lower-order thinking skills, and each is subject to the self-regulatory mechanism of metacognition and self-regulation.

However, due to the complexity of human thinking, a specific spiral with reasoning loops is difficult to represent. Therefore, instead of a visualized spiral with an indefinite number of reasoning loops, the diagram (Fig. 1 ) presents the developmental stages of critical thinking.

Moreover, most definitions of critical thinking are based on an idealized picture of critical thinkers, such as the consensus generated in the Delphi report (Facione 1990 ). According to Dreyfus and Dreyfus ( 1980 ), however, in the course of developing expertise students pass through five stages: “absolute beginner”, “advanced beginner”, “competent performer”, “proficient performer,” and “intuitive expert performer”. Dreyfus and Dreyfus ( 1980 ) described these five stages as the result of successive transformations of four mental functions: recollection, recognition, decision making, and awareness.

In the course of developing critical thinking and expertise, individuals pass through similar stages, accompanied by increasing practice and accumulating experience. At the novice stage, tasks are decomposed into context-free features that students can recognize without experience of particular situations. To improve further, students need to monitor their awareness; with considerable experience, they begin to note recurrent, meaningful component patterns in context. Gradually, increased practice exposes students to a variety of whole situations, enabling them to recognize tasks more holistically, as a professional does. At the same time, with accumulating experience, individuals depend less on abstract principles; decisions become intuitive and highly situational as well as analytical, and students may apply rules, principles, or abilities unconsciously. A high level of awareness becomes absorbed into practice. At this stage, critical thinking has turned into habits of mind and, in some cases, expertise. This describes the development of critical thinking from novice to expert, with critical thinking eventually becoming a habit of mind.

We mention the five-stage model of Dreyfus and Dreyfus ( 1980 ) to categorize levels of critical thinking and to emphasize the developmental nature of becoming a critical thinker. Correspondingly, critical thinking is categorized into five levels: absolute beginner (novice), advanced beginner (beginner), competent performer (competent), proficient performer (proficient), and intuitive expert (expert).

Ability level and critical thinking level together constitute one of the four dimensions represented in Fig. 1 .

It is also noteworthy that two further elements of critical thinking are the context and the knowledge on which an inquiry is based. As previously argued, contextual and domain knowledge must be taken into account with regard to critical thinking. Moreover, as Hitchcock ( 2018 ) argued, effective critical thinking also requires knowledge of, and experience applying, critical thinking concepts and principles.

Critical thinking is considered valuable across disciplines, yet apart from a few courses such as philosophy, it is reportedly absent from most school education. Many researchers and educators therefore advocate integrating critical thinking across the curriculum (Hatcher 2013 ); Ennis ( 2018 ), for example, offered a vision for incorporating critical thinking across the higher education curriculum. Though people are aware of the value of critical thinking, few practice it. Between 2012 and 2015 in Australia, demand for critical thinking as an enterprise skill in early-career job postings increased by 125% (Statista Research Department 2016). In a survey of 1000 adults by The Reboot Foundation ( 2018 ), more than 80% of respondents believed that critical thinking skills are lacking in today’s youth, and respondents were deeply concerned that schools do not teach critical thinking. The survey also found that respondents were clearly split over when and how critical thinking should be taught.

In the preceding analysis of critical thinking, we presented the mechanism of critical thinking rather than a concise definition. Given the various perspectives from which critical thinking is interpreted, a unitary definition is not easy to produce; what is essential is that the public understand how critical thinking works, the elements it involves, and the relationships among them, so that they can achieve an explicit understanding.

In the framework, critical thinking starts from simple experience, such as observing a difference, and then enters the stage of inquiry. Inquiry does not necessarily turn the thinking process into critical thinking unless the student enters higher-level thinking processes or reasoning loops such as re-examining, reasoning, and reflecting (the 3Rs). Becoming an ideal critical thinker (or expert) requires effort and time.

According to the framework, simple abilities such as observation and inquiry are indispensable precursors of critical thinking, which suggests that attending to these simple skills early in childhood can be an entry point to critical thinking. Following Piaget’s ( 1964 ) theory of child development, a developmental approach spanning multiple years can be employed to help children develop critical thinking at each corresponding developmental stage until it becomes a habit of mind.

Although we have emphasized critical thinking in this paper, creative thinking and critical thinking, while separable, are both essential to the improvement of intelligence: both develop expertise and ultimately drive the improvement of HI at the level of the human race.

As previously argued, students who think critically in different domains exhibit a similar pattern, but students from different domains may perform differently in creativity because of different thinking styles (Haller and Courvoisier 2010 ). Students also have different learning styles and preferences. Personalized learning is arguably the most appropriate approach to addressing those differences, though the means of realizing it vary with the development of technologies. Generally, personalized learning aims to customize learning to accommodate diverse students based on their strengths, needs, interests, preferences, and abilities.

Meanwhile, advancing technology, including AI, is revolutionizing education; students’ learning environments are shifting from technology-enhanced learning environments to smart learning environments. Although much of their potential remains unrealized (Spector 2016 ), so-called smart learning environments rely heavily on AI technologies such as neural networks, learning analytics, and natural language processing, and personalized learning is better supported and realized within them. In short, in the current era, personalized learning means using AI to help learners perform at a higher level by making adjustments based on learner differences. This is the notion with which we conclude: the future lies in using AI to improve HI while accommodating individual differences.
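
As a minimal sketch of that idea (ours, not a description of any particular system; the running-accuracy mastery estimate and difficulty thresholds are invented for illustration):

```python
# A toy adaptive loop: observe each learner outcome, update a 0..1 mastery
# estimate, and choose the next task's difficulty from that estimate.

def update_mastery(mastery, correct, rate=0.2):
    """Nudge the mastery estimate toward the latest observed outcome."""
    target = 1.0 if correct else 0.0
    return mastery + rate * (target - mastery)

def next_difficulty(mastery):
    """Pick a task level matched to the learner's current estimate."""
    if mastery < 0.4:
        return "easy"
    if mastery < 0.75:
        return "medium"
    return "hard"

mastery = 0.5  # start with a neutral prior
for outcome in [True, True, False, True, True]:
    mastery = update_mastery(mastery, outcome)
    print(f"mastery={mastery:.2f} -> next task: {next_difficulty(mastery)}")
```

Real smart learning environments use far richer learner models, but the loop of observe, update, and adjust is the core of the personalization the paragraph above describes.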

The application of AI in education has been studied for decades, and there are efforts heading in this direction even where personalized learning is not technically involved: for example, using AI technology to stimulate critical thinking (Zhu 2015 ), applying a virtual environment to build and assess higher-order inquiry skills (Ketelhut et al. 2010 ), and developing computational thinking through robotics (Angeli and Valanides 2019 ), another promising application of AI in support of HI development.

However, almost all of those efforts are limited to laboratory experiments. To accelerate the development of HI, we argue that more emphasis should be given to developing HI at scale with the support of AI, especially critical and creative thinking in young children.

In this paper, we argue that more emphasis should be given to HI development. Rather than decreasing funding for AI, the analysis of progress in artificial and human intelligence suggests placing increased emphasis on using various AI techniques and technologies to improve HI on a large and sustainable scale. Most researchers might agree that AI techniques, or the situation more broadly, are not yet mature enough to support such large-scale development, but it would be dangerous to overlook HI development. Based on research and theory drawn from psychology as well as epistemology, the framework is intended to provide a practical guide to the progressive development of inquiry and critical thinking skills in young children, as children represent the future of our fragile planet. We have also suggested a sustainable development approach for developing inquiry and critical thinking (see Spector 2019 ); such an approach could be realized through AI and infused into HI development. In addition, a project is underway in collaboration with NetDragon to develop gamified applications that foster the relevant skills and habits of mind, and a game-based assessment methodology appropriate for middle school children is being developed and tested at East China Normal University. The intention of these efforts is to refocus some attention on the development of HI in young children.

Availability of data and materials

Not applicable.

Abbreviations

AI: Artificial Intelligence

HI: Human Intelligence

L.W. Anderson, D.R. Krathwohl, A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives (Allyn & Bacon, Boston, 2001)


C. Angeli, N. Valanides, Developing young children’s computational thinking with educational robotics: An interaction effect between gender and scaffolding strategy. Comput. Hum. Behav. (2019). https://doi.org/10.1016/j.chb.2019.03.018

B.S. Bloom, M.D. Engelhart, E.J. Furst, W.H. Hill, D.R. Krathwohl, Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive Domain (David McKay Company, New York, 1956)

N. Bostrom, How long before superintelligence? (1998). Retrieved from https://nickbostrom.com/superintelligence.html

R. Cellan-Jones, Stephen Hawking warns artificial intelligence could end mankind. BBC News, 2 December 2014 (2014)

M.T.H. Chi, R. Glaser, E. Rees, Expertise in problem solving, in Advances in the Psychology of Human Intelligence , ed. by R.J. Sternberg (Erlbaum, Hillsdale, 1982), pp. 7–77

H.L. Chiesi, G.J. Spilich, J.F. Voss, Acquisition of domain-related information in relation to high and low domain knowledge. J. Verbal Learn. Verbal Behav. 18 , 257–273 (1979)


L. Darling-Hammond, What can PISA tell us about U.S. education policy? N. Engl. J. Publ. Policy 26 (1), 4 (2014)

M. Davies, A model of critical thinking in higher education, in Higher Education: Handbook of Theory and Research (Springer, Cham, 2015), pp. 41–92


A.C. Doyle, The adventure of the blanched soldier. The Strand Magazine (1926). Retrieved from https://www.dfw-sherlock.org/uploads/3/7/3/8/37380505/1926_november_the_adventure_of_the_blanched_soldier.pdf

S.E. Dreyfus, H.L. Dreyfus, A five-stage model of the mental activities involved in directed skill acquisition (no. ORC-80-2) (University of California-Berkeley Operations Research Center, Berkeley, 1980)


C.P. Dwyer, M.J. Hogan, I. Stewart, An integrated critical thinking framework for the 21st century. Think. Skills Creat. 12 , 43–52 (2014)

R.H. Ennis, Critical thinking and subject specificity: Clarification and needed research. Educ. Res. 18 , 4–10 (1989)

R.H. Ennis, Critical thinking across the curriculum: A vision. Topoi. 37 (1), 165–184 (2018)

Z. Epstein, Has artificial intelligence already surpassed the human brain? (2016). Retrieved from https://bgr.com/2016/03/10/alphago-beats-lee-sedol-again/

K.A. Ericsson, R.T. Krampe, C. Tesch-Römer, The role of deliberate practice in the acquisition of expert performance. Psychol. Rev. 100 (3), 363–406 (1993)

P.A. Facione, Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction [The Delphi Report, prepared for the American Philosophical Association] (1990). Retrieved from https://files.eric.ed.gov/fulltext/ED315423.pdf

J. Fang, H. Su, Y. Xiao, Will Artificial Intelligence Surpass Human Intelligence? (2018). https://doi.org/10.2139/ssrn.3173876

K.W. Fischer, A theory of cognitive development: The control and construction of hierarchies of skills. Psychol. Rev. 87 , 477–531 (1980)

J.H. Flavell, Metacognition and cognitive monitoring: A new area of cognitive development inquiry. Am. Psychol. 34 (10), 906–911 (1979)

R.M. Gagné, The conditions of learning and theory of instruction , 4th edn. (Holt, Rinehart, & Winston, New York, 1985)

I.J. Good, Speculations concerning the first ultraintelligent machine. Adv. Comput. 6 , 31–88 (1966)

C.S. Haller, D.S. Courvoisier, Personality and thinking style in different creative domains. Psychol. Aesthet. Creat. Arts. 4 (3), 149 (2010)

D.L. Hatcher, Is critical thinking across the curriculum a plausible goal? OSSA. 69 (2013) Retrieved from https://scholar.uwindsor.ca/ossaarchive/OSSA10/papersandcommentaries/69

D. Hitchcock, Critical thinking, in The Stanford Encyclopedia of Philosophy (2018). Retrieved from https://plato.stanford.edu/entries/critical-thinking/

B.K. Hofer, G.M. Sinatra, Epistemology, metacognition, and self-regulation: Musings on an emerging field. Metacogn. Learn. 5 (1), 113–120 (2010)

D.H. Jonassen, Toward a design theory of problem solving. Educ. Technol. Res. Dev. 48 (4), 63–85 (2000)

D.H. Jonassen, Learning to Solve Problems: A Handbook for Designing Problem-Solving Learning Environments (Routledge, New York, 2011)

D.J. Ketelhut, B.C. Nelson, J. Clarke, C. Dede, A multi-user virtual environment for building and assessing higher order inquiry skills in science. Br. J. Educ. Technol. 41 (1), 56–68 (2010)

R. Luckin, W. Holmes, M. Griffiths, L.B. Forcier, Intelligence Unleashed: An Argument for AI in Education (Pearson Education, London, 2016) Retrieved from http://oro.open.ac.uk/50104/1/Luckin%20et%20al.%20-%202016%20-%20Intelligence%20Unleashed.%20An%20argument%20for%20AI%20in%20Educ.pdf

R. Paul, L. Elder, The miniature guide to critical thinking: Concepts and tools , 4th edn. (2006) Retrieved from https://www.criticalthinking.org/files/Concepts_Tools.pdf

J. Piaget, Part I: Cognitive development in children: Piaget development and learning. J. Res. Sci. Teach. 2 (3), 176–186 (1964)

S.J. Russell, P. Norvig, Artificial Intelligence: A Modern Approach , 3rd edn. (Prentice Hall, Upper Saddle River, 2009) ISBN 978-0-136042594

G. Schraw, K.J. Crippen, K. Hartley, Promoting self-regulation in science education: Metacognition as part of a broader perspective on learning. Res. Sci. Educ. 36 (1–2), 111–139 (2006)

D.H. Schunk, Metacognition, self-regulation, and self-regulated learning: Research recommendations. Educ. Psychol. Rev. 20 (4), 463–467 (2008)

K. Siau, Y. Yang, Impact of artificial intelligence, robotics, and machine learning on sales and marketing, in Twelfth Annual Midwest Association for Information Systems Conference (MWAIS 2017) (2017), pp. 18–19

D. Silver, A. Huang, C.J. Maddison, A. Guez, L. Sifre, G. Van Den Driessche, et al., Mastering the game of Go with deep neural networks and tree search. Nature 529 (7587), 484–489 (2016)

J. M. Spector, M. C. Polson, D. J. Muraida (eds.), Automating Instructional Design: Concepts and Issues (Educational Technology Publications, Englewood Cliffs, 1993)

J.M. Spector, Smart learning environments: Concepts and issues, in Proceedings of Society for Information Technology & Teacher Education International Conference , ed. by G. Chamblee, L. Langub (AACE, Savannah, 2016), pp. 2728–2737. Retrieved June 4, 2019 from https://www.learntechlib.org/primary/p/172078/

J. M. Spector, Thinking and learning in the anthropocene: The new 3 Rs . Discussion paper presented at the International Big History Association Conference, Philadelphia, PA (2018). Retrieved from http://learndev.org/dl/HLAIBHA2018/Spector%2C%20J.%20M.%20(2018).%20Thinking%20and%20Learning%20in%20the%20Anthropocene.pdf .

J.M. Spector, Complexity, inquiry, critical thinking, and technology: A holistic and developmental approach, in Mind, Brain and Technology (Springer, Cham, 2019), pp. 17–25

R.J. Sternberg, Intelligence as developing expertise. Contemp. Educ. Psychol. 24 (4), 359–375 (1999)

The Reboot Foundation, The state of critical thinking: A new look at reasoning at home, school, and work (2018). Retrieved from https://reboot-foundation.org/wp-content/uploads/_docs/REBOOT_FOUNDATION_WHITE_PAPER.pdf

V. Vinge, The coming technological singularity: How to survive in the post-human era. Resource document, NASA Technical Reports Server. Retrieved from https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19940022856.pdf . Accessed 20 June 2019

D. Zhang, M. Xie, Artificial Intelligence’s Digestion and Reconstruction for Humanistic Feelings . In 2018 International Seminar on Education Research and Social Science (ISERSS 2018) (Atlantis Press, Paris, 2018)

X. Zhu, Machine teaching: An inverse problem to machine learning and an approach toward optimal education, in Twenty-Ninth AAAI Conference on Artificial Intelligence (2015)


Acknowledgements

We wish to acknowledge the generous support of NetDragon and the Digital Research Centre at the University of North Texas.

Initial work is being funded through the NetDragon Digital Research Centre at the University of North Texas with Author as the Principal Investigator.

Author information

Authors and Affiliations

Department of Learning Technologies, University of North Texas, Denton, TX 76207, USA

Jonathan Michael Spector & Shanshan Ma


Contributions

The authors contributed equally to the effort. Both authors read and approved the final manuscript.

Corresponding author

Correspondence to Jonathan Michael Spector .

Ethics declarations

Competing interests.

The authors declare that they have no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Reprints and permissions

Cite this article

Spector, J.M., Ma, S. Inquiry and critical thinking skills for the next generation: from artificial intelligence back to human intelligence. Smart Learn. Environ. 6 , 8 (2019). https://doi.org/10.1186/s40561-019-0088-z

Received : 06 June 2019

Accepted : 27 August 2019

Published : 11 September 2019

DOI : https://doi.org/10.1186/s40561-019-0088-z


Keywords

  • Artificial intelligence
  • Critical thinking
  • Developmental model
  • Human intelligence
  • Inquiry learning

  • Open access
  • Published: 09 March 2020

Rubrics to assess critical thinking and information processing in undergraduate STEM courses

  • Gil Reynders,
  • Juliette Lantz,
  • Suzanne M. Ruder,
  • Courtney L. Stanford &
  • Renée S. Cole (ORCID: orcid.org/0000-0002-2807-1500)

International Journal of STEM Education , volume 7 , Article number: 9 (2020)


Process skills such as critical thinking and information processing are commonly stated outcomes for STEM undergraduate degree programs, but instructors often do not explicitly assess these skills in their courses. Students are more likely to develop these crucial skills if there is constructive alignment between an instructor’s intended learning outcomes, the tasks that the instructor and students perform, and the assessment tools that the instructor uses. Rubrics for each process skill can enhance this alignment by creating a shared understanding of process skills between instructors and students. Rubrics can also enable instructors to reflect on their teaching practices with regard to developing their students’ process skills and facilitating feedback to students to identify areas for improvement.

Here, we provide rubrics that can be used to assess critical thinking and information processing in STEM undergraduate classrooms and to provide students with formative feedback. As part of the Enhancing Learning by Improving Process Skills in STEM (ELIPSS) Project, rubrics were developed to assess these two skills in STEM undergraduate students’ written work. The rubrics were implemented in multiple STEM disciplines, class sizes, course levels, and institution types to ensure they were practical for everyday classroom use. Instructors reported via surveys that the rubrics supported assessment of students’ written work in multiple STEM learning environments. Graduate teaching assistants also indicated that they could effectively use the rubrics to assess student work and that the rubrics clarified the instructor’s expectations for how they should assess students. Students reported that they understood the content of the rubrics and could use the feedback provided by the rubric to change their future performance.

The ELIPSS rubrics allowed instructors to explicitly assess the critical thinking and information processing skills that they wanted their students to develop in their courses. The instructors were able to clarify their expectations for both their teaching assistants and students and provide consistent feedback to students about their performance. Supporting the adoption of active-learning pedagogies should also include changes to assessment strategies to measure the skills that are developed as students engage in more meaningful learning experiences. Tools such as the ELIPSS rubrics provide a resource for instructors to better align assessments with intended learning outcomes.

Introduction

Why assess process skills.

Process skills, also known as professional skills (ABET Engineering Accreditation Commission, 2012), transferable skills (Danczak et al., 2017), or cognitive competencies (National Research Council, 2012), are commonly cited as critical for students to develop during their undergraduate education (ABET Engineering Accreditation Commission, 2012; American Chemical Society Committee on Professional Training, 2015; National Research Council, 2012; Singer et al., 2012; The Royal Society, 2014). Process skills such as problem-solving, critical thinking, information processing, and communication are widely applicable to many academic disciplines and careers, and they are receiving increased attention in undergraduate curricula (ABET Engineering Accreditation Commission, 2012; American Chemical Society Committee on Professional Training, 2015) and workplace hiring decisions (Gray & Koncz, 2018; Pearl et al., 2019). Recent reports from multiple countries (Brewer & Smith, 2011; National Research Council, 2012; Singer et al., 2012; The Royal Society, 2014) indicate that these skills are emphasized in multiple undergraduate academic disciplines, and annual polls of about 200 hiring managers indicate that employers may place more importance on these skills than on applicants’ content knowledge when making hiring decisions (Deloitte Access Economics, 2014; Gray & Koncz, 2018). The assessment of process skills can provide a benchmark for achievement at the end of an undergraduate program and act as an indicator of student readiness to enter the workforce. Assessing these skills may also enable instructors and researchers to more fully understand the impact of active learning pedagogies on students.

A recent meta-analysis of 225 studies by Freeman et al. ( 2014 ) showed that students in active learning environments may achieve higher content learning gains than students in traditional lectures in multiple STEM fields when comparing scores on equivalent examinations. Active learning environments can have many different attributes, but they are commonly characterized by students “physically manipulating objects, producing new ideas, and discussing ideas with others” (Rau et al., 2017 ) in contrast to students sitting and listening to a lecture. Examples of active learning pedagogies include POGIL (Process Oriented Guided Inquiry Learning) (Moog & Spencer, 2008 ; Simonson, 2019 ) and PLTL (Peer-led Team Learning) (Gafney & Varma-Nelson, 2008 ; Gosser et al., 2001 ) in which students work in groups to complete activities with varying levels of guidance from an instructor. Despite the clear content learning gains that students can achieve from active learning environments (Freeman et al., 2014 ), the non-content-gains (including improvements in process skills) in these learning environments have not been explored to a significant degree. Active learning pedagogies such as POGIL and PLTL place an emphasis on students developing non-content skills in addition to content learning gains, but typically only the content learning is assessed on quizzes and exams, and process skills are not often explicitly assessed (National Research Council, 2012 ). In order to fully understand the effects of active learning pedagogies on all aspects of an undergraduate course, evidence-based tools must be used to assess students’ process skill development. The goal of this work was to develop resources that could enable instructors to explicitly assess process skills in STEM undergraduate classrooms in order to provide feedback to themselves and their students about the students’ process skills development.

Theoretical frameworks

The incorporation of these rubrics and other currently available tools for use in STEM undergraduate classrooms can be viewed through the lenses of constructive alignment (Biggs, 1996 ) and self-regulated learning (Zimmerman, 2002 ). The theory of constructivism posits that students learn by constructing their own understanding of knowledge rather than acquiring the meaning from their instructor (Bodner, 1986 ), and constructive alignment extends the constructivist model to consider how the alignment between a course’s intended learning outcomes, tasks, and assessments affects the knowledge and skills that students develop (Biggs, 2003 ). Students are more likely to develop the intended knowledge and skills if there is alignment between the instructor’s intended learning outcomes that are stated at the beginning of a course, the tasks that the instructor and students perform, and the assessment strategies that the instructor uses (Biggs, 1996 , 2003 , 2014 ). The nature of the tasks and assessments indicates what the instructor values and where students should focus their effort when studying. According to Biggs ( 2003 ) and Ramsden ( 1997 ), students see assessments as defining what they should learn, and a misalignment between the outcomes, tasks, and assessments may hinder students from achieving the intended learning outcomes. In the case of this work, the intended outcomes are improved process skills. In addition to aligning the components of a course, it is also critical that students receive feedback on their performance in order to improve their skills. Zimmerman’s theory of self-regulated learning (Zimmerman, 2002 ) provides a rationale for tailoring assessments to provide feedback to both students and instructors.

Zimmerman’s theory of self-regulated learning defines three phases of learning: forethought/planning, performance, and self-reflection. According to Zimmerman, individuals ideally should progress through these three phases in a cycle: they plan a task, perform the task, and reflect on their performance, then they restart the cycle on a new task. If a student is unable to adequately progress through the phases of self-regulated learning on their own, then feedback provided by an instructor may enable the students to do so (Butler & Winne, 1995 ). Thus, one of our criteria when creating rubrics to assess process skills was to make the rubrics suitable for faculty members to use to provide feedback to their students. Additionally, instructors can use the results from assessments to give themselves feedback regarding their students’ learning in order to regulate their teaching. This theory is called self-regulated learning because the goal is for learners to ultimately reflect on their actions to find ways to improve. We assert that, ideally, both students and instructors should be “learners” and use assessment data to reflect on their actions, although with different aims. Students need consistent feedback from an instructor and/or self-assessment throughout a course to provide a benchmark for their current performance and identify what they can do to improve their process skills (Black & Wiliam, 1998 ; Butler & Winne, 1995 ; Hattie & Gan, 2011 ; Nicol & Macfarlane-Dick, 2006 ). Instructors need feedback on the extent to which their efforts are achieving their intended goals in order to improve their instruction and better facilitate the development of process skills through course experiences.

In accordance with the aforementioned theoretical frameworks, tools used to assess undergraduate STEM student process skills should be tailored to fit the outcomes that are expected for undergraduate students and be able to provide formative assessment and feedback to both students and faculty about the students’ skills. These tools should also be designed for everyday classroom use to enable students to regularly self-assess and faculty to provide consistent feedback throughout a semester. Additionally, it is desirable for assessment tools to be broadly generalizable to measure process skills in multiple STEM disciplines and institutions in order to increase the rubrics’ impact on student learning. Current tools exist to assess these process skills, but they each lack at least one of the desired characteristics for providing regular feedback to STEM students.

Current tools to assess process skills

Current tests available to assess critical thinking include the Critical Thinking Assessment Test (CAT) (Stein & Haynes, 2011), California Critical Thinking Skills Test (Facione, 1990a, 1990b), and Watson Glaser Critical Thinking Appraisal (Watson & Glaser, 1964). These commercially available, multiple-choice tests are not designed to provide regular, formative feedback throughout a course and have not been implemented for this purpose. Instead, they are designed to provide summative feedback with a focus on assessing this skill at a programmatic or university level rather than for use in the classroom to provide formative feedback to students. Rather than using tests to assess process skills, rubrics could be used instead. Rubrics are effective assessment tools because they can be quick and easy to use, they provide feedback to both students and instructors, and they can evaluate individual aspects of a skill to give more specific feedback (Brookhart & Chen, 2014; Smit & Birri, 2014). Rubrics for assessing critical thinking are available, but they have not been used to provide feedback to undergraduate STEM students nor were they designed to do so (Association of American Colleges and Universities, 2019; Saxton et al., 2012). The Critical Thinking Analytic Rubric is designed specifically to assess K-12 students to enhance college readiness and has not been broadly tested in collegiate STEM courses (Saxton et al., 2012). The critical thinking rubric developed by the Association of American Colleges and Universities (AAC&U) as part of its Valid Assessment of Learning in Undergraduate Education (VALUE) Institute and Liberal Education and America’s Promise (LEAP) initiative (Association of American Colleges and Universities, 2019) is intended for programmatic assessment rather than specifically giving feedback to students throughout a course. As with tests for assessing critical thinking, current rubrics to assess critical thinking are not designed to act as formative assessments and give feedback to STEM faculty and undergraduates at the course or task level. Another issue with the assessment of critical thinking is the degree to which the construct is measurable. A National Research Council report (National Research Council, 2011) has suggested that there is little evidence of a consistent, measurable definition for critical thinking and that it may not be different from one’s general cognitive ability. Despite this issue, we have found that critical thinking is consistently listed as a programmatic outcome in STEM disciplines (American Chemical Society Committee on Professional Training, 2015; The Royal Society, 2014), so we argue that it is necessary to support instructors as they attempt to assess this skill.

Current methods for evaluating students’ information processing include discipline-specific tools such as a rubric to assess physics students’ use of graphs and equations to solve work-energy problems (Nguyen et al., 2010 ) and assessments of organic chemistry students’ ability to “[manipulate] and [translate] between various representational forms” including 2D and 3D representations of chemical structures (Kumi et al., 2013 ). Although these assessment tools can be effectively used for their intended context, they were not designed for use in a wide range of STEM disciplines or for a variety of tasks.

Despite the many tools that exist to measure process skills, none has been designed and tested to facilitate frequent, formative feedback to STEM undergraduate students and faculty throughout a semester. The rubrics described here have been designed by the Enhancing Learning by Improving Process Skills in STEM (ELIPSS) Project (Cole et al., 2016 ) to assess undergraduate STEM students’ process skills and to facilitate feedback at the classroom level with the potential to track growth throughout a semester or degree program. The rubrics described here are designed to assess critical thinking and information processing in student written work. Rubrics were chosen as the format for our process skill assessment tools because the highest level of each category in rubrics can serve as an explicit learning outcome that the student is expected to achieve (Panadero & Jonsson, 2013 ). Rubrics that are generalizable to multiple disciplines and institutions can enable the assessment of student learning outcomes and active learning pedagogies throughout a program of study and provide useful tools for a greater number of potential users.

Research questions

This work sought to answer the following research questions for each rubric:

Does the rubric adequately measure relevant aspects of the skill?

How well can the rubrics provide feedback to instructors and students?

Can multiple raters use the rubrics to give consistent scores?

This work received Institutional Review Board approval prior to any data collection involving human subjects. The sources of data used to construct the process skill rubrics and answer these research questions were (1) peer-reviewed literature on how each skill is defined, (2) feedback from content experts in multiple STEM disciplines via surveys and in-person, group discussions regarding the appropriateness of the rubrics for each discipline, (3) interviews with students whose work was scored with the rubrics and teaching assistants who scored the student work, and (4) results of applying the rubrics to samples of student work.

Defining the scope of the rubrics

The rubrics described here and the other rubrics in development by the ELIPSS Project are intended to measure process skills, which are desired learning outcomes identified by the STEM community in recent reports (National Research Council, 2012 ; Singer et al., 2012 ). In order to measure these skills in multiple STEM disciplines, operationalized definitions of each skill were needed. These definitions specify which aspects of student work (operations) would be considered evidence for the student using that skill and establish a shared understanding of each skill by members of each STEM discipline. The starting point for this work was the process skill definitions developed as part of the POGIL project (Cole et al., 2019a ). The POGIL community includes instructors from a variety of disciplines and institutions and represented the intended audience for the rubrics: faculty who value process skills and want to more explicitly assess them. The process skills discussed in this work were defined as follows:

Critical thinking is analyzing, evaluating, or synthesizing relevant information to form an argument or reach a conclusion supported with evidence.

Information processing is evaluating, interpreting, and manipulating or transforming information.

Examples of critical thinking include the tasks that students are asked to perform in a laboratory course. When students are asked to analyze the data they collected, combine data from different sources, and generate arguments or conclusions about their data, we see this as critical thinking. However, when students simply follow the so-called “cookbook” laboratory instructions that require them to confirm pre-determined conclusions, we do not think students are engaging in critical thinking. One example of information processing is when organic chemistry students are required to re-draw molecules in different formats. The students must evaluate and interpret various pieces of one representation, and then they recreate the molecule in another representation. However, if students are asked to simply memorize facts or algorithms to solve problems, we do not see this as information processing.

Iterative rubric development

The development process was the same for the information processing rubric and the critical thinking rubric. After defining the scope of the rubric, an initial version was drafted based upon the definition of the target process skill and how each aspect of the skill is defined in the literature. A more detailed discussion of the literature that informed each rubric category is included in the “Results and Discussion” section. This initial version then underwent iterative testing in which the rubric was reviewed by researchers, practitioners, and students. The rubric was first evaluated by the authors and a group of eight faculty from multiple STEM disciplines who made up the ELIPSS Project’s primary collaborative team (PCT). The PCT was a group of faculty members with experience in discipline-based education research who employ active-learning pedagogies in their classrooms. This initial round of evaluation was intended to ensure that the rubric measured relevant aspects of the skill and was appropriate for each PCT member’s discipline. This evaluation determined how well the rubrics were aligned with each instructor’s understanding of the process skill including both in-person and email discussions that continued until the group came to consensus that each rubric category could be applied to student work in courses within their disciplines. There has been an ongoing debate regarding the role of disciplinary knowledge in critical thinking and the extent to which critical thinking is subject-specific (Davies, 2013 ; Ennis, 1990 ). This work focuses on the creation of rubrics to measure process skills in different domains, but we have not performed cross-discipline comparisons. This initial round of review was also intended to ensure that the rubrics were ready for classroom testing by instructors in each discipline. Next, each rubric was tested over three semesters in multiple classroom environments, illustrated in Table 1 . The rubrics were applied to student work chosen by each PCT member. The PCT members chose the student work based on their views of how the assignments required students to engage in process skills and show evidence of those skills. The information processing and critical thinking rubrics shown in this work were each tested in at least three disciplines, course levels, and institutions.

After each semester, the feedback was collected from the faculty testing the rubric, and further changes to the rubric were made. Feedback was collected in the form of survey responses along with in-person group discussions at annual project meetings. After the first iteration of completing the survey, the PCT members met with the authors to discuss how they were interpreting each survey question. This meeting helped ensure that the surveys were gathering valid data regarding how well the rubrics were measuring the desired process skill. Questions in the survey such as “What aspects of the student work provided evidence for the indicated process skill?” and “Are there edits to the rubric/descriptors that would improve your ability to assess the process skill?” allowed the authors to determine how well the rubric scores were matching the student work and identify necessary changes to the rubric. Further questions asked about the nature and timing of the feedback given to students in order to address the question of how well the rubrics provide feedback to instructors and students. The survey questions are included in the Supporting Information . The survey responses were analyzed qualitatively to determine themes related to each research question.

In addition to the surveys given to faculty rubric testers, twelve students were interviewed in fall 2016 and fall 2017. In the United States of America, the fall semester typically runs from August to December and is the first semester of the academic year. Each student participated in one interview which lasted about 30 min. These interviews were intended to gather further data to answer questions about how well the rubrics were measuring the identified process skills that students were using when they completed their assignments and to ensure that the information provided by the rubrics made sense to students. The protocol for these interviews is included in the Supporting Information . In fall 2016, the students interviewed were enrolled in an organic chemistry laboratory course for non-majors at a large, research-intensive university in the United States. Thirty students agreed to have their work analyzed by the research team, and nine students were interviewed. However, the rubrics were not a component of the laboratory course grading. Instead, the first author assessed the students’ reports for critical thinking and information processing, and then the students were provided electronic copies of their laboratory reports and scored rubrics in advance of the interview. The first author had recently been a graduate teaching assistant for the course and was familiar with the instructor’s expectations for the laboratory reports. During the interview, the students were given time to review their reports and the completed rubrics, and then they were asked about how well they understood the content of the rubrics and how accurately each category score represented their work.

In fall 2017, students enrolled in a physical chemistry thermodynamics course for majors were interviewed. The physical chemistry course took place at the same university as the organic laboratory course, but there was no overlap between participants. Three students and two graduate teaching assistants (GTAs) were interviewed. The course included daily group work, and process skill assessment was an explicit part of the instructor’s curriculum. At the end of each class period, students assessed their groups using portions of ELIPSS rubrics, including the two process skill rubrics included in this paper. About every 2 weeks, the GTAs assessed the student groups with a complete ELIPSS rubric for a particular skill, then gave the groups their scored rubrics with written comments. The students’ individual homework problem sets were assessed once with rubrics for three skills: critical thinking, information processing, and problem-solving. The students received the scored rubric with written comments when the graded problem set was returned to them. In the last third of the semester, the students and GTAs were interviewed about how rubrics were implemented in the course, how well the rubric scores reflected the students’ written work, and how the use of rubrics affected the teaching assistants’ ability to assess the student skills. The protocols for these interviews are included in the Supporting Information .

Gathering evidence for utility, validity, and reliability

The utility, validity, and reliability of the rubrics were examined throughout the development process. Utility is the degree to which the rubrics are perceived as practical by experts and practitioners in the field. Through multiple meetings, the PCT faculty, who represented multiple STEM disciplines, determined that early drafts of the rubrics seemed appropriate for use in their classrooms. Rubric utility was re-examined multiple times throughout the development process to ensure that the rubrics would remain practical for classroom use. Validity can be defined in multiple ways. For example, the Standards for Educational and Psychological Testing (Joint Committee on Standards for Educational Psychological Testing, 2014) defines validity as “the degree to which all the accumulated evidence supports the intended interpretation of test scores for the proposed use.” For the purposes of this work, we drew on two distinct types of validity examined in the rubric literature: content validity and construct validity. Content validity is the degree to which the rubrics cover relevant aspects of each process skill (Moskal & Leydens, 2000). In this case, the process skill definition and a review of the literature determined which categories were included in each rubric; the literature review was considered complete once it reached saturation, that is, when no new aspects of the skill emerged. Construct validity is the degree to which the levels of each rubric category accurately reflect the process that students performed (Moskal & Leydens, 2000). Evidence of construct validity was gathered via the faculty surveys, teaching assistant interviews, and student interviews. In the student interviews, students were given one of their completed assignments and asked to explain how they completed the task. Students were then asked to explain how well each category applied to their work and whether any changes to the rubric were needed to more accurately reflect their process. Due to logistical challenges, we were not able to obtain evidence for convergent validity; this is discussed further in the “Limitations” section.

Adjacent agreement, also known as “interrater agreement within one,” was chosen as the measure of interrater reliability due to its common use in rubric development projects (Jonsson & Svingby, 2007). Adjacent agreement is the percentage of cases in which two raters either agree exactly on a rating or differ by one level (i.e., they give adjacent ratings to the same work). Jonsson and Svingby (2007) found that most of the rubrics they reviewed had adjacent agreement scores of 90% or greater. However, they noted that the agreement threshold varied with the number of possible performance levels per rubric category, three and four levels being the most common. Since the rubrics discussed in this report have six levels (scores of zero through five) and are intended for low-stakes assessment and feedback, a goal of 80% adjacent agreement was selected. To calculate agreement for the critical thinking and information processing rubrics, two researchers discussed the scoring criteria for each rubric and then independently assessed the organic chemistry laboratory reports.
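To make the calculation concrete, the following is a minimal sketch (our illustration, not the authors’ analysis code) of computing exact and adjacent agreement for two raters scoring the same artifacts on a 0–5 rubric category; the score lists below are hypothetical.

# Python sketch: exact and adjacent ("within one") interrater agreement.
def agreement(rater_a, rater_b, tolerance=1):
    """Return (exact %, adjacent %) agreement for two equal-length score lists."""
    assert len(rater_a) == len(rater_b) and rater_a, "need paired, non-empty scores"
    n = len(rater_a)
    # Exact agreement counts identical scores; adjacent agreement also
    # counts scores that differ by at most `tolerance` levels.
    exact = sum(a == b for a, b in zip(rater_a, rater_b))
    adjacent = sum(abs(a - b) <= tolerance for a, b in zip(rater_a, rater_b))
    return 100 * exact / n, 100 * adjacent / n

# Hypothetical scores for ten laboratory reports on one rubric category.
scores_a = [5, 4, 3, 5, 2, 4, 4, 3, 5, 1]
scores_b = [5, 3, 3, 4, 2, 4, 5, 3, 5, 3]
exact, adjacent = agreement(scores_a, scores_b)
print(f"exact: {exact:.0f}%, adjacent: {adjacent:.0f}%")  # exact: 60%, adjacent: 90%

With six rubric levels, two raters can disagree slightly on many artifacts and still clear the 80% adjacent-agreement goal, which is why exact agreement is reported separately later in this article.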

Results and discussion

The process skill rubrics to assess critical thinking and information processing in student written work were completed after multiple rounds of revision based on feedback from various sources. These sources include feedback from instructors who tested the rubrics in their classrooms, TAs who scored student work with the rubrics, and students who were assessed with the rubrics. The categories for each rubric will be discussed in terms of the evidence that the rubrics measure the relevant aspects of the skill and how they can be used to assess STEM undergraduate student work. Each category discussion will begin with a general explanation of the category followed by more specific examples from the organic chemistry laboratory course and physical chemistry lecture course to demonstrate how the rubrics can be used to assess student work.

Information processing rubric

The definition of information processing and the focus of the rubric presented here (Fig. 1 ) are distinct from cognitive information processing as defined by the educational psychology literature (Driscoll, 2005 ). The rubric shown here is more aligned with the STEM education construct of representational competency (Daniel et al., 2018 ).

[Figure 1. Rubric for assessing information processing]

Evaluating

When solving a problem or completing a task, students must evaluate the provided information for relevance or importance to the task (Hanson, 2008; Swanson et al., 1990). Not all of the information provided in a prompt (e.g., a homework or exam question) is necessarily relevant to every part of the prompt. Students should ideally show evidence of their evaluation process by identifying what information is present in the prompt/model, indicating what information is or is not relevant, and indicating why information is relevant. Responses with these characteristics would earn high rubric scores for this category. Although students may not explicitly state what information is necessary to address a task, the information they do use can act as indirect evidence of the degree to which they have evaluated all of the available information in the prompt. Evidence of inaccurate evaluation includes the inclusion of irrelevant information or the omission of relevant information in an analysis or in completing a task. When evaluating the organic chemistry laboratory reports, the focus for the evaluating category was the information students presented when identifying the chemical structure of their products. For students who received a high score, this information included their measured value for the product’s melting point, the literature (expected) value for the melting point, and the peaks in a nuclear magnetic resonance (NMR) spectrum. (NMR spectroscopy is a technique commonly used in chemistry to obtain structural information about a compound.) Lower scores were given if students omitted any of the necessary information or included unnecessary information. For example, a student who discussed their reaction yield when discussing the identity of their product would receive a low evaluating score, because the yield does not help determine the product’s identity; the yield, in this case, is unnecessary information. In the physical chemistry course, students often did not show evidence that they had determined which information was relevant to answer the homework questions and thus earned low evaluating scores. These omissions are addressed further in the “Interpreting” section.

Interpreting

In addition to evaluating, students must often interpret information using their prior knowledge in order to explain the meaning of something, make inferences, match data to predictions, and extract patterns from data (Hanson, 2008; Nakhleh, 1992; Schmidt et al., 1989; Swanson et al., 1990). Students earn high scores for this category if they assign correct meaning to labeled information (e.g., text, tables, graphs, diagrams), extract specific details from information, explain information in their own words, and determine patterns in information. For the organic chemistry laboratory reports, students received high scores if they accurately interpreted their measured values and NMR peaks. Almost every student obtained melting point values that differed from the expected values due to measurement error or impurities in their products, so they needed to describe what types of impurities could cause such discrepancies. Also, each NMR spectrum contained one peak corresponding to the solvent used to dissolve the student’s product, so students needed to use their prior knowledge of NMR spectroscopy to recognize that this peak did not correspond to part of their product.

In physical chemistry, the graduate teaching assistant often gave students low scores for inaccurately explaining changes to chemical systems such as changes in pressure or entropy. The graduate teaching assistant who assessed the student work used the rubric to identify both the evaluating and interpreting categories as weaknesses in many of the students’ homework submissions. However, the students often earned high scores for the manipulating and transforming categories, so the GTA was able to give students specific feedback on their areas for improvement while also highlighting their strengths.

Manipulating and transforming (extent and accuracy)

In addition to evaluating and interpreting information, students may be asked to manipulate and transform information from one form to another. These transformations should be complete and accurate (Kumi et al., 2013; Nguyen et al., 2010). Students may be required to construct a figure based on written information or, conversely, to transform information in a figure into words or mathematical expressions. Two categories for manipulating and transforming (i.e., extent and accuracy) were included to allow instructors to give more specific feedback. We often found that students would either transform a small amount of information accurately or transform a large amount of information inaccurately; the two categories allowed differentiated feedback to be provided in such cases. As stated above, the organic chemistry students were expected to transform their NMR spectral data into a table and provide a labeled structure of their final product. Students were given high scores if they converted all of the relevant peaks from their spectrum into the table format and correctly matched the peaks to the hydrogen atoms in their products. Students received lower scores if they were only able to convert the information for a few peaks or if they incorrectly matched the peaks to the hydrogen atoms.

Critical thinking rubric

Critical thinking can be broadly defined in different contexts, but we found that the categories included in the rubric (Fig. 2 ) represented commonly accepted aspects of critical thinking (Danczak et al., 2017 ) and suited the needs of the faculty collaborators who tested the rubric in their classrooms.

[Figure 2. Rubric for assessing critical thinking]

Evaluating

When completing a task, students must evaluate the relevance of the information that they will ultimately use to support a claim or conclusion (Miri et al., 2007; Zohar et al., 1994). An evaluating category is included in both the critical thinking and information processing rubrics because evaluation is a key aspect of both skills. From our previous work developing a problem-solving rubric (manuscript in preparation) and our review of the literature for this work (Danczak et al., 2017; Lewis & Smith, 1993), we observed overlap between information processing, critical thinking, and problem-solving. However, while the evaluating category in the information processing rubric assesses a student’s ability to determine the importance of information for completing a task, the evaluating category in the critical thinking rubric places a heavier emphasis on using the information to support a conclusion or argument.

When scoring student work with the evaluating category, students receive high scores if they indicate what information is likely to be most relevant to the argument they need to make, determine the reliability of the source of their information, and determine the quality and accuracy of the information itself. As with the evaluating category in the information processing rubric, the evidence used to assess this category can be indirect. In the organic chemistry laboratory reports, students needed to make an argument about whether they had successfully produced the desired product, so they needed to discuss which information was relevant to their claims about the product’s identity and purity. Students received high scores for the evaluating category when they accurately determined that the melting point and all peaks in the NMR spectrum except the solvent peak indicated the identity of their product. Students received lower scores when they left out relevant information, because this was seen as evidence that they had inaccurately evaluated the information’s relevance in supporting their conclusion. They also received lower scores when they incorrectly stated that a high yield indicated a pure product. Students had the opportunity to demonstrate their ability to evaluate the quality of information when discussing their melting point: students sometimes struggled to obtain reliable melting point data due to their inexperience in the laboratory, so the rubric provided a way to assess their ability to critique their own data.

Analyzing

In tandem with evaluating information, students also need to analyze that same information to extract meaningful evidence to support their conclusions (Bailin, 2002; Lai, 2011; Miri et al., 2007). The analyzing category assesses a student’s ability to discuss information and explore its possible meaning, extract patterns from data/information that could serve as evidence for their claims, and summarize information that could be used as evidence. For example, in the organic chemistry laboratory reports, students needed to compare the information they obtained to the expected values for a product. Students received high scores for the analyzing category if they could extract meaningful structural information from the NMR spectrum and from the two melting points (observed and expected) for each reaction step.

Synthesizing

Often, students are asked to synthesize or connect multiple pieces of information in order to draw a conclusion or make a claim (Huitt, 1998 ; Lai, 2011 ). Synthesizing involves identifying the relationships between different pieces of information or concepts, identifying ways that different pieces of information or concepts can be combined, and explaining how the newly synthesized information can be used to reach a conclusion and/or support an argument. While performing the organic chemistry laboratory experiments, students obtained multiple types of information such as the melting point and NMR spectrum in addition to other spectroscopic data such as an infrared (IR) spectrum. Students received high scores for this category when they accurately synthesized these multiple data types by showing how the NMR and IR spectra could each reveal different parts of a molecule in order to determine the molecule’s entire structure.

Forming arguments (structure and validity)

The final key aspect of critical thinking is forming a well-structured and valid argument (Facione, 1984; Glassner & Schwarz, 2007; Lai, 2011; Lewis & Smith, 1993). We observed that students can earn high scores for evaluating, analyzing, and synthesizing yet still struggle to form arguments; this was particularly common when assessing problem sets in the physical chemistry course.

As with the manipulating and transforming categories in the information processing rubric, two forming arguments categories were included to allow instructors to give more specific feedback. Some students may be able to include all of the expected structural elements of their arguments but use faulty information or reasoning. Conversely, some students may be able to make scientifically valid claims but not necessarily support them with evidence. The two forming arguments categories are intended to accurately assess both of these scenarios. For the forming arguments (structure) category, students earn high scores if they explicitly state their claim or conclusion, list the evidence used to support the argument, and provide reasoning to link the evidence to their claim/conclusion. Students who do not make a claim or who provide little evidence or reasoning receive lower scores.

For the forming arguments (validity) category, students earn high scores if their claim is accurate and their reasoning is logical and clearly supports the claim with the provided evidence. Organic chemistry students earned high scores for the two forming arguments categories if they made explicit claims about the identity and purity of their product and provided complete and accurate evidence for their claim(s), such as the melting point values and the positions of NMR peaks corresponding to their product. Additionally, the students provided evidence for the purity of their products by pointing to the presence or absence of peaks in their NMR spectrum that would match other potential side products. They also needed to provide logical reasoning for why the peaks indicated the presence or absence of a compound. As previously mentioned, the physical chemistry students received lower scores for the forming arguments categories than for the other aspects of critical thinking. These students were asked to make claims about the relationships between entropy and heat and then provide relevant evidence to justify those claims. Often, the students made clearly articulated claims but provided little evidence to support them. As with the information processing rubric, the critical thinking rubric allowed the GTAs to assess aspects of these skills independently and to identify specific areas for student improvement.

Validity and reliability

The goal of this work was to create rubrics that can accurately assess student work (validity) and be consistently implemented by instructors or researchers within multiple STEM fields (reliability). The evidence for validity includes the alignment of the rubrics with literature-based descriptions of each skill, review of the rubrics by content experts from multiple STEM disciplines, interviews with undergraduate students whose work was scored using the rubrics, and interviews of the GTAs who scored the student work.

The definitions for each skill, along with multiple iterations of the rubrics, underwent review by STEM content experts. As noted earlier, the instructors who were testing the rubrics were given a survey at the end of each semester and were invited to offer suggested changes to the rubric to better help them assess their students. After multiple rubric revisions, survey responses from the instructors indicated that the rubrics accurately represented the breadth of each process skill as seen in each expert’s content area and that each category could be used to measure multiple levels of student work. By the end of the rubrics’ development, instructors were writing responses such as “N/A” or “no suggestions” to indicate that the rubrics did not need further changes.

Feedback from the faculty also indicated that the rubrics were measuring the intended constructs, as shown by their responses to the survey item “What aspects of the student work provided evidence for the indicated process skill?” For example, one instructor noted that for information processing, she saw evidence of the manipulating and transforming categories when “students had to transform their written/mathematical relationships into an energy diagram.” Another instructor elicited evidence of information processing during an in-class group quiz: “A question on the group quiz was written to illicit [sic] IP [information processing]. Students had to transform a structure into three new structures and then interpret/manipulate the structures to compare the pKa values [acidity] of the new structures.” For this instructor, the structures written by the students revealed evidence of their information processing by showing what information they omitted from the new structures or transformed inaccurately. For critical thinking, one instructor assessed short research reports with the critical thinking rubric and “looked for [the students’] ability to use evidence to support their conclusions, to evaluate the literature studies, and to develop their own judgements by synthesizing the information.” Another instructor used the critical thinking rubric to assess students’ abilities to choose an instrument to perform a chemical analysis. According to the instructor, the students provided evidence of their critical thinking because “in their papers, they needed to justify their choice of instrument. This justification required them to evaluate information and synthesize a new understanding for this specific chemical analysis.”

Analysis of student work indicates multiple levels of achievement for each rubric category (illustrated in Fig. 3 ), although there may have been a ceiling effect for the evaluating and the manipulating and transforming (extent) categories in information processing for organic chemistry laboratory reports because many students earned the highest possible score (five) for those categories. However, other implementations of the ELIPSS rubrics (Reynders et al., 2019 ) have shown more variation in student scores for the two process skills.

[Figure 3. Student rubric scores from an organic chemistry laboratory course. The two rubrics were used to evaluate different laboratory reports; thirty students were assessed for information processing and 28 for critical thinking.]

To provide further evidence that the rubrics were measuring the intended skills, students in the physical chemistry course were interviewed about their thought processes and how well the rubric scores reflected the work they performed. During these interviews, students described how they used various aspects of information processing and critical thinking skills. The students first described how they used information processing during a problem set in which they had to answer questions about a diagram of systolic and diastolic blood pressure. Students described how they evaluated and interpreted the graph to make statements such as “diastolic [pressure] is our y-intercept” and “volume is the independent variable.” The students then demonstrated their ability to transform information from one form to another, from a graph to a mathematical equation, by recognizing “it’s a linear relationship so I used Y equals MX plus B” and “integrated it cause it’s the change, the change in V [volume].” For critical thinking, students described their process on a different problem set, in which they had to explain why the change in Helmholtz energy and the change in Gibbs free energy were equivalent under a certain given condition. Students first demonstrated how they evaluated the relevant information and analyzed what would and would not change in their system. One student said, “So to calculate the final pressure, I think I just immediately went to the ideal gas law because we know the final volume and the number of moles won’t change and neither will the temperature in this case. Well, I assume that it wouldn’t.” Another student showed evidence of their evaluation by writing out all the necessary information in one place and stating, “Whenever I do these types of problems, I always write what I start with which is why I always have this line of information I’m given.” After evaluating and analyzing, students had to form an argument by claiming that the two energy values were equal and then defending that claim. Students explained that they were not always as clear as they could be when justifying their claims. For instance, one student said, “Usually I just write out equations and then hope people understand what I’m doing mathematically” but they “probably could have explained it a little more.”
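As a worked sketch of the graph-to-equation transformation the students described (our notation and assumptions, not the students’ actual work), suppose pressure varies linearly with volume, with the diastolic pressure as the y-intercept:

$$P(V) = mV + P_{\mathrm{dia}}, \qquad \int_{V_1}^{V_2} P\,\mathrm{d}V = \int_{V_1}^{V_2} \left(mV + P_{\mathrm{dia}}\right)\mathrm{d}V = \frac{m}{2}\left(V_2^{2} - V_1^{2}\right) + P_{\mathrm{dia}}\left(V_2 - V_1\right)$$

Reading the slope m and the intercept off the graph is the “Y equals MX plus B” step, and evaluating the integral over the change in volume is the “integrated it” step.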

Student feedback throughout the organic chemistry course and near the end of the physical chemistry course indicated that the rubric scores were accurate representations of the students’ work, with a few exceptions. For example, some students felt that they should have received a lower or higher score for certain categories, but they did say that the categories themselves applied well to their work. Most notably, one student reported that the forming arguments categories in the critical thinking rubric did not apply to her work because she “wasn’t making an argument” when she was demonstrating that the Helmholtz and Gibbs energy values were equal in her thermodynamics assignment. We see this as an instance where students and instructors may define “argument” in different ways. The process skill definitions and the rubric categories are meant to articulate intended learning outcomes from faculty members to their students, so if a student defines the skills or categories differently than the faculty member does, the rubrics can serve to promote a shared understanding of the skill.

As previously mentioned, reliability was measured by two researchers assessing ten laboratory reports independently to ensure that multiple raters could use the rubrics consistently. The average adjacent agreement scores were 92% for critical thinking and 93% for information processing. The exact agreement scores were 86% for critical thinking and 88% for information processing. Additionally, two different raters assessed a statistics assignment that was given to sixteen first-year undergraduates. The average pairwise adjacent agreement scores were 89% for critical thinking and 92% for information processing for this assignment. However, the exact agreement scores were much lower: 34% for critical thinking and 36% for information processing. In this case, neither rater was an expert in the content area. While the exact agreement scores for the statistics assignment are much lower than desirable, the adjacent agreement scores do meet the threshold for reliability as seen in other rubrics (Jonsson & Svingby, 2007 ) despite the disparity in expertise. Based on these results, it may be difficult for multiple raters to give exactly the same scores to the same work if they have varying levels of content knowledge, but it is important to note that the rubrics are primarily intended for formative assessment that can facilitate discussions between instructors and students about the ways for students to improve. The high level of adjacent agreement scores indicates that multiple raters can identify the same areas to improve in examples of student work.

Instructor and teaching assistant reflections

Survey responses from faculty members provided evidence of the rubrics’ utility. Faculty members reported that when they used the rubrics to define their expectations and be more specific about their assessment criteria, the students seemed better able to articulate the areas in which they needed improvement. As one instructor put it, “having the rubrics helped open conversations and discussions” that were not happening before the rubrics were implemented. We see this as evidence of the clear intended learning outcomes that are an integral aspect of achieving constructive alignment within a course. The instructors’ specific feedback to the students, and the students’ increased awareness of their areas for improvement, may enable the students to better regulate their learning throughout a course. Additionally, the survey responses indicated that the faculty members were changing their teaching practices and becoming more cognizant of how assignments did or did not elicit the process skill evidence that they desired. After using the rubrics, one instructor said, “I realize I need to revise many of my activities to more thoughtfully induce process skill development.” We see this as evidence that the faculty members were using the rubrics to regulate their teaching by reflecting on the outcomes of their practices and then planning for future teaching. These activities represent the reflection and forethought/planning aspects of self-regulated learning on the part of the instructors. Graduate teaching assistants in the physical chemistry course indicated that the rubrics gave them a way to clarify the instructor’s expectations when they were interacting with the students. As one GTA said, “It’s giving [the students] feedback on direct work that they have instead of just right or wrong. It helps them to understand like ‘Okay how can I improve? What areas am I lacking in?’” A more detailed account of how the instructors and teaching assistants implemented the rubrics has been reported elsewhere (Cole et al., 2019a).

Student reflections

Students in both the organic and physical chemistry courses reported that they could use the rubrics to engage in the three phases of self-regulated learning: forethought/planning, performing, and reflecting. In an organic chemistry interview, one student was discussing how they could improve their low score for the synthesizing category of critical thinking by saying “I could use the data together instead of trying to use them separately,” thus demonstrating forethought/planning for their later work. Another student described how they could use the rubric while performing a task: “I could go through [the rubric] as I’m writing a report…and self-grade.” Finally, one student demonstrated how they could use the rubrics to reflect on their areas for improvement by saying that “When you have the five column [earn a score of five], I can understand that I’m doing something right” but “I really need to work on revising my reports.” We see this as evidence that students can use the rubrics to regulate their own learning, although classroom facilitation can have an effect on the ways in which students use the rubric feedback (Cole et al., 2019b ).

Limitations

The process skill definitions presented here represent a consensus understanding among members of the POGIL community and the instructors who participated in this study, but these skills are often defined in multiple ways by various STEM instructors, employers, and students (Danczak et al., 2017 ). One issue with critical thinking, in particular, is the broadness of how the skill is defined in the literature. Through this work, we have evidence via expert review to indicate that our definitions represent common understandings among a set of STEM faculty. Nonetheless, we cannot claim that all STEM instructors or researchers will share the skill definitions presented here.

There is currently a debate in the STEM literature (National Research Council, 2011) about whether the critical thinking construct is domain-general or domain-specific, that is, whether one’s critical thinking ability in one discipline can be applied to another discipline. We cannot make claims about the generality of the construct based on the data presented here because the same students were not tested across multiple disciplines or courses. Additionally, we did not gather evidence for convergent validity, which is “the degree to which an operationalized construct is similar to other operationalized constructs that it theoretically should be similar to” (National Research Council, 2011). In other words, evidence for convergent validity would be a comparison of multiple measures of information processing or critical thinking. However, none of the instructors who used the ELIPSS rubrics also used a secondary measure of the constructs. Although the rubrics were examined by a multidisciplinary group of collaborators, this group was primarily chemists and included eight faculty members from other disciplines, so the content validity of the rubrics may be somewhat limited.

Finally, the generalizability of the rubrics is limited by the relatively small number of students who were interviewed about their work. During their interviews, the students in the organic and physical chemistry courses each said that they could use the rubric scores as feedback to improve their skills. Additionally, as discussed in the “Validity and Reliability” section, the processes described by the students aligned with the content of the rubric and provided evidence of the rubric scores’ validity. However, the data gathered from the student interviews only represents the views of a subset of students in the courses, and further study is needed to determine the most appropriate contexts in which the rubrics can be implemented.

Conclusions and implications

Two rubrics were developed to assess and provide feedback on undergraduate STEM students’ critical thinking and information processing. Faculty survey responses indicated that the rubrics measured the relevant aspects of each process skill in the disciplines that were examined. Faculty survey responses, TA interviews, and student interviews over multiple semesters indicated that the rubric scores accurately reflected the evidence of process skills that the instructors wanted to see and the processes that the students performed when they were completing their assignments. The rubrics showed high inter-rater agreement scores, indicating that multiple raters could identify the same areas for improvement in student work.

In terms of constructive alignment, courses should ideally have alignment between their intended learning outcomes, student and instructor activities, and assessments. By using the ELIPSS rubrics, instructors were able to explicitly articulate the intended learning outcomes of their courses to their students. The instructors were then able to assess and provide feedback to students on different aspects of their process skills. Future efforts will be focused on modifying student assignments to enable instructors to better elicit evidence of these skills. In terms of self-regulated learning, students indicated in the interviews that the rubric scores were accurate representations of their work (performances), could help them reflect on their previous work (self-reflection), and the feedback they received could be used to inform their future work (forethought). Not only did the students indicate that the rubrics could help them regulate their learning, but the faculty members indicated that the rubrics had helped them regulate their teaching. With the individual categories on each rubric, the faculty members were better able to observe their students’ strengths and areas for improvement and then tailor their instruction to meet those needs. Our results indicated that the rubrics helped instructors in multiple STEM disciplines and at multiple institutions reflect on their teaching and then make changes to better align their teaching with their desired outcomes.

Overall, the rubrics can be used in a number of different ways to modify courses or for programmatic assessment. As previously stated, instructors can use the rubrics to define expectations for their students and provide them with feedback on desired skills throughout a course. The rubric categories can be used to give feedback on individual aspects of student process skills to provide specific feedback to each student. If an instructor or department wants to change from didactic lecture-based courses to active learning ones, the rubrics can be used to measure non-content learning gains that stem from the adoption of such pedagogies. Although the examples provided here for each rubric were situated in chemistry contexts, the rubrics were tested in multiple disciplines and institution types. The rubrics have the potential for wide applicability to assess not only laboratory reports but also homework assignments, quizzes, and exams. Assessing these tasks provides a way for instructors to achieve constructive alignment between their intended outcomes and their assessments, and the rubrics are intended to enhance this alignment to improve student process skills that are valued in the classroom and beyond.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

AAC&U: American Association of Colleges and Universities

CAT: Critical Thinking Assessment Test

CU: Comprehensive University

ELIPSS: Enhancing Learning by Improving Process Skills in STEM

LEAP: Liberal Education and America’s Promise

NMR: Nuclear Magnetic Resonance

PCT: Primary Collaborative Team

PLTL: Peer-led Team Learning

POGIL: Process Oriented Guided Inquiry Learning

PUI: Primarily Undergraduate Institution

RU: Research University

STEM: Science, Technology, Engineering, and Mathematics

VALUE: Valid Assessment of Learning in Undergraduate Education

ABET Engineering Accreditation Commission. (2012). Criteria for Accrediting Engineering Programs . Retrieved from http://www.abet.org/accreditation/accreditation-criteria/criteria-for-accrediting-engineering-programs-2016-2017/ .

American Chemical Society Committee on Professional Training. (2015). Undergraduate Professional Education in Chemistry: ACS Guidelines and Evaluation Procedures for Bachelor's Degree Programs . Retrieved from https://www.acs.org/content/dam/acsorg/about/governance/committees/training/2015-acs-guidelines-for-bachelors-degree-programs.pdf

Association of American Colleges and Universities. (2019). VALUE Rubric Development Project. Retrieved from https://www.aacu.org/value/rubrics .

Bailin, S. (2002). Critical Thinking and Science Education. Science and Education, 11 , 361–375.


Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32 (3), 347–364.

Biggs, J. (2003). Aligning teaching and assessing to course objectives. Teaching and learning in higher education: New trends and innovations, 2 , 13–17.


Biggs, J. (2014). Constructive alignment in university teaching. HERDSA Review of higher education, 1 (1), 5–22.

Black, P., & Wiliam, D. (1998). Assessment and Classroom Learning. Assessment in Education: Principles, Policy & Practice, 5 (1), 7–74.

Bodner, G. M. (1986). Constructivism: A theory of knowledge. Journal of Chemical Education, 63 (10), 873–878.

Brewer, C. A., & Smith, D. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC: American Association for the Advancement of Science.

Brookhart, S. M., & Chen, F. (2014). The quality and effectiveness of descriptive rubrics. Educational Review , 1–26.

Butler, D. L., & Winne, P. H. (1995). Feedback and Self-Regulated Learning: A Theoretical Synthesis. Review of Educational Research, 65 (3), 245–281.

Cole, R., Lantz, J., & Ruder, S. (2016). Enhancing Learning by Improving Process Skills in STEM. Retrieved from http://www.elipss.com .

Cole, R., Lantz, J., & Ruder, S. (2019a). PO: The Process. In S. R. Simonson (Ed.), POGIL: An Introduction to Process Oriented Guided Inquiry Learning for Those Who Wish to Empower Learners (pp. 42–68). Sterling, VA: Stylus Publishing.

Cole, R., Reynders, G., Ruder, S., Stanford, C., & Lantz, J. (2019b). Constructive Alignment Beyond Content: Assessing Professional Skills in Student Group Interactions and Written Work. In M. Schultz, S. Schmid, & G. A. Lawrie (Eds.), Research and Practice in Chemistry Education: Advances from the 25 th IUPAC International Conference on Chemistry Education 2018 (pp. 203–222). Singapore: Springer.


Danczak, S., Thompson, C., & Overton, T. (2017). ‘What does the term Critical Thinking mean to you?’A qualitative analysis of chemistry undergraduate, teaching staff and employers' views of critical thinking. Chemistry Education Research and Practice, 18 , 420–434.

Daniel, K. L., Bucklin, C. J., Leone, E. A., & Idema, J. (2018). Towards a Definition of Representational Competence. In Towards a Framework for Representational Competence in Science Education (pp. 3–11). Switzerland: Springer.

Davies, M. (2013). Critical thinking and the disciplines reconsidered. Higher Education Research & Development, 32 (4), 529–544.

Deloitte Access Economics. (2014). Australia's STEM Workforce: a survey of employers. Retrieved from https://www2.deloitte.com/au/en/pages/economics/articles/australias-stem-workforce-survey.html .

Driscoll, M. P. (2005). Psychology of learning for instruction . Boston, MA: Pearson Education.

Ennis, R. H. (1990). The extent to which critical thinking is subject-specific: Further clarification. Educational researcher, 19 (4), 13–16.

Facione, P. A. (1984). Toward a theory of critical thinking. Liberal Education, 70 (3), 253–261.

Facione, P. A. (1990a). The California Critical Thinking Skills Test—College Level. Technical Report #1: Experimental Validation and Content Validity.

Facione, P. A. (1990b). The California Critical Thinking Skills Test—College Level. Technical Report #2: Factors Predictive of CT Skills.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111 (23), 8410–8415.

Gafney, L., & Varma-Nelson, P. (2008). Peer-led team learning: Evaluation, dissemination, and institutionalization of a college level initiative (Vol. 16). Netherlands: Springer Science & Business Media.

Glassner, A., & Schwarz, B. B. (2007). What stands and develops between creative and critical thinking? Argumentation? Thinking Skills and Creativity, 2 (1), 10–18.

Gosser, D. K., Cracolice, M. S., Kampmeier, J. A., Roth, V., Strozak, V. S., & Varma-Nelson, P. (2001). Peer-led team learning: A guidebook. Upper Saddle River, NJ: Prentice Hall.

Gray, K., & Koncz, A. (2018). The key attributes employers seek on students' resumes. Retrieved from http://www.naceweb.org/about-us/press/2017/the-key-attributes-employers-seek-on-students-resumes/ .

Hanson, D. M. (2008). A cognitive model for learning chemistry and solving problems: implications for curriculum design and classroom instruction. In R. S. Moog & J. N. Spencer (Eds.), Process-Oriented Guided Inquiry Learning (pp. 15–19). Washington, DC: American Chemical Society.

Hattie, J., & Gan, M. (2011). Instruction based on feedback. Handbook of research on learning and instruction , 249-271.

Huitt, W. (1998). Critical thinking: an overview. In Educational psychology interactive Retrieved from http://www.edpsycinteractive.org/topics/cogsys/critthnk.html .

Joint Committee on Standards for Educational Psychological Testing. (2014). Standards for Educational and Psychological Testing. American Educational Research Association.

Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2 (2), 130–144.

Kumi, B. C., Olimpo, J. T., Bartlett, F., & Dixon, B. L. (2013). Evaluating the effectiveness of organic chemistry textbooks in promoting representational fluency and understanding of 2D-3D diagrammatic relationships. Chemistry Education Research and Practice, 14 , 177–187.

Lai, E. R. (2011). Critical thinking: a literature review. Pearson's Research Reports, 6 , 40–41.

Lewis, A., & Smith, D. (1993). Defining higher order thinking. Theory into Practice, 32 , 131–137.

Miri, B., David, B., & Uri, Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: a case of critical thinking. Research in Science Education, 37 , 353–369.

Moog, R. S., & Spencer, J. N. (Eds.). (2008). Process oriented guided inquiry learning (POGIL) . Washington, DC: American Chemical Society.

Moskal, B. M., & Leydens, J. A. (2000). Scoring rubric development: validity and reliability. Practical Assessment, Research and Evaluation, 7 , 1–11.

Nakhleh, M. B. (1992). Why some students don't learn chemistry: Chemical misconceptions. Journal of Chemical Education, 69 (3), 191.

National Research Council. (2011). Assessing 21st Century Skills: Summary of a Workshop . Washington, DC: The National Academies Press.

National Research Council. (2012). Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century . Washington, DC: The National Academies Press.

Nguyen, D. H., Gire, E., & Rebello, N. S. (2010). Facilitating Strategies for Solving Work-Energy Problems in Graphical and Equational Representations. 2010 Physics Education Research Conference, 1289 , 241–244.

Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31 (2), 199–218.

Panadero, E., & Jonsson, A. (2013). The use of scoring rubrics for formative assessment purposes revisited: a review. Educational Research Review, 9 , 129–144.

Pearl, A. O., Rayner, G., Larson, I., & Orlando, L. (2019). Thinking about critical thinking: An industry perspective. Industry & Higher Education, 33 (2), 116–126.

Ramsden, P. (1997). The context of learning in academic departments. The experience of learning, 2 , 198–216.

Rau, M. A., Kennedy, K., Oxtoby, L., Bollom, M., & Moore, J. W. (2017). Unpacking “Active Learning”: A Combination of Flipped Classroom and Collaboration Support Is More Effective but Collaboration Support Alone Is Not. Journal of Chemical Education, 94 (10), 1406–1414.

Reynders, G., Suh, E., Cole, R. S., & Sansom, R. L. (2019). Developing student process skills in a general chemistry laboratory. Journal of Chemical Education , 96 (10), 2109–2119.

Saxton, E., Belanger, S., & Becker, W. (2012). The Critical Thinking Analytic Rubric (CTAR): Investigating intra-rater and inter-rater reliability of a scoring mechanism for critical thinking performance assessments. Assessing Writing, 17 , 251–270.

Schmidt, H. G., De Volder, M. L., De Grave, W. S., Moust, J. H. C., & Patel, V. L. (1989). Explanatory Models in the Processing of Science Text: The Role of Prior Knowledge Activation Through Small-Group Discussion. J. Educ. Psychol., 81 , 610–619.

Simonson, S. R. (Ed.). (2019). POGIL: An Introduction to Process Oriented Guided Inquiry Learning for Those Who Wish to Empower Learners . Sterling, VA: Stylus Publishing, LLC.

Singer, S. R., Nielsen, N. R., & Schweingruber, H. A. (Eds.). (2012). Discipline-Based education research: understanding and improving learning in undergraduate science and engineering . Washington D.C.: The National Academies Press.

Smit, R., & Birri, T. (2014). Assuring the quality of standards-oriented classroom assessment with rubrics for complex competencies. Studies in Educational Evaluation, 43 , 5–13.

Stein, B., & Haynes, A. (2011). Engaging Faculty in the Assessment and Improvement of Students' Critical Thinking Using the Critical Thinking Assessment Test. Change: The Magazine of Higher Learning, 43 , 44–49.

Swanson, H. L., Oconnor, J. E., & Cooney, J. B. (1990). An Information-Processing Analysis of Expert and Novice Teachers Problem-Solving. American Educational Research Journal, 27 (3), 533–556.

The Royal Society. (2014). Vision for science and mathematics education. London, England: The Royal Society Science Policy Centre.

Watson, G., & Glaser, E. M. (1964). Watson-Glaser Critical Thinking Appraisal Manual . New York, NY: Harcourt, Brace, and World.

Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory into Practice, 41 (2), 64–70.

Zohar, A., Weinberger, Y., & Tamir, P. (1994). The Effect of the Biology Critical Thinking Project on the Development of Critical Thinking. Journal of Research in Science Teaching, 31 , 183–196.


Acknowledgements

We thank members of our Primary Collaboration Team and Implementation Cohorts for collecting and sharing data. We also thank all the students who have allowed us to examine their work and provided feedback.

Supporting information

• Product rubric survey

• Initial implementation survey

• Continuing implementation survey

This work was supported in part by the National Science Foundation under collaborative grants #1524399, #1524936, and #1524965. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Author information

Authors and affiliations.

Department of Chemistry, University of Iowa, W331 Chemistry Building, Iowa City, Iowa, 52242, USA

Gil Reynders & Renée S. Cole

Department of Chemistry, Virginia Commonwealth University, Richmond, Virginia, 23284, USA

Gil Reynders & Suzanne M. Ruder

Department of Chemistry, Drew University, Madison, New Jersey, 07940, USA

Juliette Lantz

Department of Chemistry, Ball State University, Muncie, Indiana, 47306, USA

Courtney L. Stanford


Contributions

RC, JL, and SR performed an initial literature review that was expanded by GR. All authors designed the survey instruments. GR collected and analyzed the survey and interview data with guidance from RC. GR revised the rubrics with extensive input from all other authors. All authors contributed to reliability measurements. GR drafted all manuscript sections. RC provided extensive comments during manuscript revisions; JL, SR, and CS also offered comments. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Renée S. Cole .

Ethics declarations

Competing interests.

The authors declare that they have no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Supporting Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Reynders, G., Lantz, J., Ruder, S.M. et al. Rubrics to assess critical thinking and information processing in undergraduate STEM courses. IJ STEM Ed 7 , 9 (2020). https://doi.org/10.1186/s40594-020-00208-5


Received : 01 October 2019

Accepted : 20 February 2020

Published : 09 March 2020

DOI : https://doi.org/10.1186/s40594-020-00208-5


Keywords

  • Constructive alignment
  • Self-regulated learning
  • Process skills
  • Professional skills
  • Critical thinking
  • Information processing

J Adv Med Educ Prof. 2014 Jul; 2(3).

The role of critical thinking skills and learning styles of university students in their academic performance

Zohre Ghazivakili

1 Emergency Medical Services Department, Paramedical School, Alborz University of Medical Sciences, Karaj, Iran;

Roohangiz Norouzi Nia

2 Educational Development Center, Alborz University of Medical Sciences, Karaj, Iran;

Faride Panahi

3 Nursing and Midwifery School, Shahid Beheshti University of Medical Sciences, Tehran, Iran;

Mehrdad Karimi

4 Department of Epidemiology and Biostatistics, Public Health School, Tehran, Iran;

Hayede Gholsorkhi

5 Medical School, Alborz University of Medical Sciences, Karaj, Iran;

Zarrin Ahmadi

6 Amirkabir University of Technology (Polytechnic), Tehran, Iran

Introduction: Today’s world needs people with a wide range of abilities, such as understanding and applying different ways of thinking, research, problem solving, critical thinking skills, and creativity. In addition to critical thinking, learning style is another key factor that plays an essential role in the process of problem solving. This study aimed to determine the relationship between the learning styles and critical thinking of students and their academic performance at Alborz University of Medical Sciences.

Methods: This cross-sectional correlational study was performed in 2012 on 216 students of Alborz University who were selected by stratified random sampling. The data were obtained via a three-part questionnaire comprising demographic data, the standardized Kolb learning style questionnaire, and the standardized California critical thinking questionnaire. The students’ academic performance was extracted from school records. The validity of the instruments was determined in terms of content validity, and reliability was established through internal consistency methods; Cronbach’s alpha coefficient was found to be 0.78 for the California critical thinking questionnaire. The chi-square test, independent t-test, one-way ANOVA, and Pearson correlation test were used to determine relationships between variables. Data were analyzed with SPSS 14 at a significance level of p<0.05.
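For reference, Cronbach’s alpha, the internal-consistency statistic reported here, is conventionally computed as follows (a standard formula, not reproduced from this paper), where k is the number of items, \sigma_i^2 is the variance of item i, and \sigma_t^2 is the variance of the total scores:

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^{2}}{\sigma_t^{2}}\right)$$

A value of 0.78, as reported for the California critical thinking questionnaire, is commonly read as acceptable internal consistency for a research instrument.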

Results: Our findings indicated a significant difference in mean performance scores across the four learning styles, suggesting that university students with a convergent learning style perform better than the other groups. Learning style was also related to age, gender, field of study, semester, and employment. Regarding the students’ critical thinking, the mean scores for deductive reasoning and evaluation skills were higher than those for the other skills, while analytical skills had the lowest mean, and there was a positive significant relationship between the students’ performance and both inferential skill and the total score of critical thinking skills (p<0.05). Furthermore, evaluation skills and deductive reasoning were significantly related. The mean total score of critical thinking also differed significantly across learning styles.

Conclusion: The results of this study showed that learning styles, critical thinking, and academic performance are significantly associated with one another. Considering the growing importance of critical thinking in enhancing professional competence, we recommend using teaching methods consistent with students’ learning styles, as these are likely to be more effective in this context.

Introduction

Today’s world needs people with many capabilities, such as understanding and using different ways of thinking, research, problem solving, critical thinking, and creativity. Critical thinking is one aspect of thinking that has been accepted as a way to overcome difficulties and to facilitate access to information in life ( 1 ).

To Watson and Glaser, critical thinking is a combination of the knowledge, attitudes, and performance of each individual. They also identify several critical thinking skills, such as perception, assumption recognition, deduction, interpretation, and evaluation of logical reasoning. They argue that the ability to think critically, and to process and evaluate previous information together with new information, results from the inductive and deductive reasoning involved in solving problems. Watson and Glaser's definition of critical thinking has been the basis of the critical thinking tests widely used to measure critical thinking today ( 2 ).

The World Federation for Medical Education considers critical thinking one of the standards of medical training, and in accredited colleges this subject is a key point. In fact, one of the criteria for accrediting a learning institute is the measurement of critical thinking in its students ( 3 ).

In addition to critical thinking, the learning style of learners, i.e., their method of processing information, is an important factor that plays a major role in problem solving. According to David Kolb's theory, learning is a four-step process comprising concrete experience, reflective observation, abstract conceptualization, and active experimentation. These steps form two dimensions: concrete experience versus abstract conceptualization, and reflective observation versus active experimentation. The dimensions yield four learning styles: divergent, convergent, assimilating, and accommodating. According to Kolb and Fry, the learner needs four different abilities to function effectively. Learning styles interact with several variables, such as the learner's academic performance, progress in higher education, critical thinking, and problem solving ( 4 ).

Because of the importance of learning styles and critical thinking in students' academic performance, a large volume of educational research in different countries has been devoted to these issues. Demirhan, Besoluk and Onder (2011), in their study of critical thinking and students' academic performance from the first semester to two years later, found that, contrary to expectations, the students' critical thinking level decreased while the total mean of their scores increased, probably because students adapt to the university environment and experience less stress as their education progresses ( 1 ).

In another study, of 330 students in Turkey, students with the divergent learning style had lower critical thinking scores than students with the accommodating learning style ( 5 ).

Mahmoud likewise examined the relationship between the critical thinking and learning styles of bachelor's students and their academic performance in 2012, studying all the university's nursing students in their fourth, sixth, and eighth semesters. The results did not show any significant relationship between the nursing students' critical thinking or learning styles and their academic performance ( 6 ).

Another study, by Nasrabadi in 2012, showed a positive relationship between critical thinking attitudes and students' academic achievement. The results showed a significant difference between the critical thinking levels of the assimilating and converging styles; the converging, diverging, assimilating, and accommodating styles had the highest levels of critical thinking, in that order ( 4 ). Among other studies, we can cite Sharma's 2011 study, whose results suggested a relationship between academic performance and learning styles ( 7 ).

Today, university students should not only think but think in different ways, and should not merely memorize knowledge but seek out the learning style that suits them best. Therefore, the question of how students think and how they learn has received great emphasis in recent years. Given the importance of the subject, we conducted research in this area to determine the relationship between critical thinking and learning styles and the academic performance of students at Alborz University of Medical Sciences.

This descriptive-analytic, cross-sectional study investigated the relationship between critical thinking and learning styles and the academic performance of students of Alborz University of Medical Sciences in 2012. After approval and permission from the university's authorities, and in coordination with the faculties, the critical thinking and learning styles questionnaires were given to undergraduate students in associate, bachelor's, and medical degree programs (second semester onward). The total number of participants was 216 students from different majors, including medicine, nursing and midwifery, health, and medical emergency.

The data collection tool was a two-part instrument comprising Kolb's learning styles questionnaire and the California Critical Thinking Skills Test (form B). Kolb's questionnaire has two parts: the first asks for demographic information, and the second includes 12 multiple-choice questions. Participants respond to each question according to how they learn, ranking the choices from 1 to 4, where 4 is most consistent with their learning style, 3 somewhat consistent, 2 poorly consistent, and 1 not consistent. To find a participant's learning style, the scores for the first choice of all 12 questions were added together, and this was repeated for the other choices. Four total scores were thus obtained: the first for concrete experience, the second for reflective observation, the third for abstract conceptualization, and the fourth for active experimentation. The highest score determined the participant's learning style.

The California Critical Thinking Skills Test (form B) includes 34 multiple-choice questions, each with one correct answer, in five areas of critical thinking: evaluation, inference, analysis, inductive reasoning, and deductive reasoning. The answering time was 45 minutes. The maximum total score is 34; the maximum is 14 points in the evaluation section, 9 in analysis, 11 in inference, 16 in inductive reasoning, and 14 in deductive reasoning. Each participant therefore received six scores: a critical thinking total score and five skill scores. Dehghani, Jafari Sani, Pakmehr and Malekzadeh found the reliability of the questionnaire to be 78%; in the study of Khalili et al., the confidence coefficient was 62%, and the construct validity of all subscales showed positive, high correlations of 60%-65%. The test was therefore considered reliable for this research.

Data were collected in two stages. In the first stage, the questionnaires were given to the students and the objectives and importance of the research were explained. In the second stage, the students' academic performance was reviewed. After collection, the data were coded and analyzed using SPSS 14 (SPSS Inc., Chicago, IL, USA). Descriptive statistics, such as the mean and standard deviation for continuous variables and frequencies for qualitative variables, were used to describe the data. The chi-square test, independent t-test, one-way ANOVA, and Pearson correlation test were used to examine relationships between variables at a significance level of p<0.05.
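
The Kolb scoring rule described above is a small algorithm, so a minimal sketch may make it concrete. It follows the procedure exactly as this paper describes it (four choice-position totals, highest total wins); note that standard Kolb scoring instead derives the style from the AC-CE and AE-RO difference scores. The respondent data below are invented.

```python
# Scoring rule as described above; all response data are hypothetical.
MODES = ("concrete experience", "reflective observation",
         "abstract conceptualization", "active experimentation")

def kolb_mode_scores(responses):
    """Sum the 1-4 rank given to each choice position across the 12 items."""
    totals = [0, 0, 0, 0]
    for ranks in responses:        # ranks = (a, b, c, d), a permutation of 1..4
        assert sorted(ranks) == [1, 2, 3, 4], "each item must rank all four choices"
        for position, rank in enumerate(ranks):
            totals[position] += rank
    return dict(zip(MODES, totals))

def learning_style(responses):
    """The paper assigns the style whose total score is highest."""
    scores = kolb_mode_scores(responses)
    return max(scores, key=scores.get)

# One hypothetical respondent: 12 items, each a ranking of the four choices.
respondent = [(4, 3, 2, 1)] * 7 + [(1, 2, 3, 4)] * 5
print(kolb_mode_scores(respondent))
print("dominant:", learning_style(respondent))   # -> concrete experience
```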

Research hypotheses

  • There is a relationship between Alborz University of Medical Sciences students’ learning styles and their demographic information. 
  • There is a relationship between Alborz University of Medical Sciences students’ critical thinking and their demographic information. 
  • There is a relationship between Alborz University of Medical Sciences students’ academic performance and their demographic information. 
  • There is a relationship between Alborz University of Medical Sciences students’ learning styles and their academic performance. 
  • There is a relationship between Alborz University of Medical Sciences students’ learning styles and their critical thinking. 

Of the 225 questionnaires distributed, 216 were completed (a 96% response rate). Participants' ages ranged from 16 to 45, with a mean of 22.44±3.7. Of the participants, 52.8% (n=114) were female, 83.3% (n=180) were single, 30.1% (n=65) were majoring in pediatric anesthesiology of OR, 35.2% (n=76) were in their fourth semester, 74.5% (n=161) were unemployed, and 48.6% (n=105) were of Persian ethnicity.

Participants' grade point averages, which were taken as their academic performance, ranged from 12.51 to 19.07 with a mean of 16.75±1.3. According to Kolb's model, 42.7% (n=85) had the convergent learning style (the largest group), followed by 33.2% (n=66) with the assimilating style; only 9.5% (n=19) had the accommodating style (the smallest group).

Among the five critical thinking skills, the highest mean score belonged to the deductive reasoning skill (3.38±1.58) and the lowest to the analysis skill (1.67±1.08).

Table 1 shows the frequency distribution of demographic variables and the students' academic performance. According to the chi-square (χ²) p-value, there was a significant relationship between gender and learning style (p=0.032): nearly 50 percent of the males had the assimilating learning style, while nearly 52 percent of the females had the convergent learning style.

The relationship of demographic variables and students' academic performance with learning styles

Employment, major, and semester of study were also significantly related to learning style (p-values of 0.049, 0.006, 0.009, and 0.001). The mean and standard deviation of age and of students' academic performance in the four learning styles are reported in Table 1 .
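
The associations above (e.g., gender and learning style, p=0.032) come from chi-square tests of independence on contingency tables. A minimal sketch with invented counts follows; the study itself used SPSS.

```python
# Chi-square test of independence between gender and learning style,
# on an invented 2x4 contingency table (all counts are illustrative).
from scipy import stats

#            convergent  assimilating  divergent  accommodating
observed = [[53,         25,           14,        10],   # female
            [32,         41,           20,         9]]   # male

chi2, p, dof, expected = stats.chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```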

Using one-way analysis of variance (ANOVA) to compare the mean age of the four groups, we found a significant relationship between age and academic performance and learning style (p=0.049).
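
To show what this comparison looks like in code, here is a hedged sketch of a one-way ANOVA across the four style groups; the grade values are invented, and the original analysis was done in SPSS rather than Python.

```python
# One-way ANOVA comparing mean academic performance (GPA on a 0-20 scale)
# across the four Kolb learning-style groups; all values are invented.
from scipy import stats

gpa_by_style = {
    "convergent":    [17.9, 17.2, 18.1, 16.8, 17.5],
    "assimilating":  [15.9, 16.1, 15.4, 16.6, 15.8],
    "divergent":     [16.4, 16.9, 16.2, 17.0, 16.5],
    "accommodating": [16.7, 16.1, 16.9, 16.3, 16.6],
}

f_stat, p_value = stats.f_oneway(*gpa_by_style.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:   # the study's significance threshold
    print("mean performance differs significantly across learning styles")
```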

Students with the convergent learning style had better academic performance than those with the other learning styles, while the performance of those with the assimilating learning style was the weakest.

Table 2 shows the relationship between the total critical thinking score and each of the demographic variables and academic performance. The results of the t-test and one-way ANOVA used to investigate the relationship between each variable and the skills are reported as mean ± standard deviation.

Relationships between California Critical Thinking (CCT) skills and demographic variables, using the t-test and ANOVA; Pearson correlation coefficients of age and students' performance with the CCT skills are also reported

* Significant at the 0.05 level

** Significant at the 0.01 level

Based on the p-values of the t-test and ANOVA, the mean total critical thinking score was significantly related only to the students' major (p=0.020). Significant relationships were also found between students' major and gender and the inference skill, between semester of study and the deductive reasoning skill, and between ethnicity and the two skills of inference and deductive reasoning (p<0.05).

Regarding the relationships of age and academic performance with each of the critical thinking skills, the Pearson correlation results indicated a significant negative relationship between age and the analysis skill, i.e., as age increased, the analysis score decreased (p<0.05). Students' academic performance had a significant direct relationship with the total critical thinking score and the inference skill: the higher these scores, the better the students' academic performance (p<0.05).
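
The correlations just described are ordinary Pearson coefficients. The sketch below reproduces the reported pattern (negative for age versus analysis, positive for performance versus total score) on invented numbers.

```python
# Pearson correlations of the kind reported above, on invented data.
from scipy import stats

age            = [19, 21, 22, 24, 27, 30, 33, 38]
analysis_score = [4, 4, 3, 3, 2, 2, 1, 1]            # falls as age rises
r, p = stats.pearsonr(age, analysis_score)
print(f"age vs. analysis skill: r = {r:.2f}, p = {p:.4f}")    # r < 0

gpa      = [14.2, 15.1, 15.8, 16.4, 17.0, 17.6, 18.1, 18.9]
ct_total = [5, 6, 6, 7, 8, 8, 9, 10]                 # rises with GPA
r, p = stats.pearsonr(gpa, ct_total)
print(f"GPA vs. total CT score: r = {r:.2f}, p = {p:.4f}")    # r > 0
```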

Table 3 shows the mean and standard deviation of the critical thinking scores in the four learning-style groups. Using one-way ANOVA, the relationship between learning style and the critical thinking skills, comparing the mean score for each skill across the four styles, is reported in the last column of Table 3 .

The relationship between critical thinking skills and learning styles

Based on the ANOVA p-values, the mean scores of the evaluation and inductive reasoning skills differed significantly across styles, so the relationship between these two skills and learning style was significant (p<0.05). The mean total critical thinking score also differed significantly among the four groups, so its relationship with learning style was significant as well (p<0.05).

[Figure: The mean and confidence interval of university students' performance in the four learning styles]

[Figure: The mean and confidence interval of critical thinking skills]

The study findings showed that the most popular learning style among the students was the convergent style, followed by the assimilating style, which is consistent with Kolb's theory that medical science students usually have this learning style ( 8 ). This result was consistent with the results of other studies ( 9 , 10 ). In Yenice's 2012 study, in which student teachers were the target group, the most frequent learning styles were the divergent and assimilating styles; the difference presumably originates from the different target group ( 11 ).

This study showed a significant relationship between learning style and gender, age, semester, and employment. Meyari et al. did not find any significant relationship between learning style and the age and gender of freshmen, but for fifth-semester students a significant relationship with age and gender was found ( 10 ). In Yenice's study, no relationship between learning style and gender, semester, or age was found.

Furthermore, in that study the divergent style accounted for the highest percentage in the first semester, the assimilating style in the second, and the divergent style again in the third and fourth semesters. In the 17-20 age group the assimilating style was dominant, and in the 21-24 age group the divergent style ( 11 ).

In the present study, we found a significant positive relationship between the convergent learning style and academic performance. In the study of Pooladi et al., the majority of the students likewise had the convergent style, and a significant relationship was found between learning style, total mean score, and the mean score of practical courses ( 12 ). Nasrabadi et al. found that the students with the highest achievement were those with the convergent style, with a significant difference from those with the divergent style ( 4 ). These results are, however, inconsistent with Meyari et al.'s ( 10 ).

In this study, the mean score obtained on the critical thinking questionnaire was 7.15±2.41, which we compared with the study of Khalili and Hoseinzadeh that validated the California Critical Thinking Skills Test (form B) in Iranian nursing students; our mean total score fell at about the 11th percentile of that study ( 13 ).

In other words, the participating students' computed critical thinking score was lower than 11, the score at the 50th percentile, and thus below the normal range.

Hariri and Bagherinezhad showed that the scores of bachelor's and master's students in a health faculty in Iran were also lower than the norm ( 14 ). Mayer and Dayer reached a similar conclusion about the critical thinking skills of agricultural students in Florida in 2006 ( 15 ).

In Gharib et al.'s study, however, the total critical thinking score of freshman and senior health-care management students was in the normal range ( 16 ). Wangensteen et al. found that the critical thinking skills of newly graduated nursing students in Sweden in 2010 were relatively high ( 17 ).

In this study, students of all levels (associate, bachelor's, and PhD) in various fields of study participated, whereas other studies have been limited to particular graduate courses, which may explain the differences in critical thinking skill scores. We also found a significant relationship between the total critical thinking score and students' major, a result consistent with Serin et al. ( 18 ).

Significant relationships were found between participants' major and gender and the inference skill, between semester and the deductive reasoning skill, and between ethnicity and both the inference and deductive reasoning skills.

In Yenice's study, significant relationships between critical thinking and age group, gender, and semester were seen ( 11 ). In Wangensteen et al.'s study ( 17 ), the critical thinking score increased in the older age group. In Serin et al.'s study ( 18 ), girls' communication skills were better than boys', and a significant relationship was also found between critical thinking and academic semester; in Mayer and Dayer's study, however, no significant relationship between critical thinking level and gender was found ( 4 , 15 ).

The results also showed a significant relationship between students' total critical thinking score and analytical skills and their performance. Nasrabadi et al.'s study likewise showed a positive relationship between a reflective critical thinking attitude and academic achievement ( 4 ). This contradicts what Demirhan, Besoluk and Onder found ( 6 , 15 ).

The results on the relationship between learning style and critical thinking indicated that the relationships of the evaluation and inductive reasoning skills with learning style were significant (p<0.05), as was the relationship of the total critical thinking score with learning style (p<0.05). The total critical thinking score of those with the conforming style was higher than that of the other styles, but in the inference skill subgroup, those with the convergent style had a higher mean than the other styles.

Yenice found a negative relationship between critical thinking score and the divergent learning style, and a positive relationship between critical thinking score and the accommodating style ( 11 ).

Siriopoulos and Pomonis compared the learning styles and critical thinking skills of students in two phases, at the beginning and at the end of their education, and concluded that the students' learning styles changed in the second phase.

For example, the divergent, convergent, and accommodating styles weakened, while the assimilating style (a combination of abstract conceptualization and reflective observation) was noticeably strengthened. Nevertheless, those with the converging learning style had higher levels of critical thinking.

The students' critical thinking level was below international standards in all styles, perhaps because of the teacher-centered teaching methods (lectures) widely used at that university ( 19 ).

The study of Nasrabadi et al. showed a significant difference between the critical thinking levels of learners with the divergent and assimilating styles ( 4 ).

Those with the converging, diverging, assimilating, and accommodating styles had the highest levels of critical thinking, in that order.

There was also a significant positive relationship between the reflective observation mode and critical thinking, and a significant negative relationship between the abstract conceptualization mode and critical thinking ( 4 ). However, in another study, done in 2012, Mahmoud did not find any significant relationship between learning style, critical thinking, and students' performance ( 6 ).

The results of this study showed that the critical thinking skills of the students of this university are not at an acceptable level, and that learning styles, critical thinking, and academic performance are significantly related to one another. Given the important role of critical thinking in enhancing professional competence, we recommend using teaching methods that are consistent with students' learning styles.

Acknowledgment

This study is based on a research project approved by the Research Deputy of Alborz University of Medical Sciences. We sincerely thank everyone in the Research Deputy who supported us financially and morally, and all the students and colleagues who participated in this study.

Conflict of Interest: None declared.

Free Harvard Referencing Generator

Generate accurate Harvard reference lists quickly and for FREE, with MyBib!

🤔 What is a Harvard Referencing Generator?

A Harvard Referencing Generator is a tool that automatically generates formatted academic references in the Harvard style.

It takes in relevant details about a source -- usually critical information like author names, article titles, publish dates, and URLs -- and adds the correct punctuation and formatting required by the Harvard referencing style.

The generated references can be copied into a reference list or bibliography, and then collectively appended to the end of an academic assignment. This is the standard way to give credit to sources used in the main body of an assignment.
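
As a toy illustration of what such a generator does under the hood, the snippet below slots source details into one common Harvard book pattern. This is a simplified sketch, not MyBib's actual implementation; the author names and book are invented, and real Harvard variants differ in punctuation and field order.

```python
# A toy Harvard-style book reference formatter (illustrative only).

def harvard_book(authors, year, title, place, publisher):
    """Format a book reference as: Last, F. (Year) Title. Place: Publisher."""
    names = " and ".join(f"{last}, {first[0]}." for last, first in authors)
    return f"{names} ({year}) {title}. {place}: {publisher}."

# An invented example source:
print(harvard_book([("Smith", "Jane"), ("Rowe", "Tom")], 2021,
                   "Thinking About Sources", "London", "Example Press"))
# -> Smith, J. and Rowe, T. (2021) Thinking About Sources. London: Example Press.
```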

👩‍🎓 Who uses a Harvard Referencing Generator?

Harvard is the main referencing style at colleges and universities in the United Kingdom and Australia. It is also very popular in other English-speaking countries such as South Africa, Hong Kong, and New Zealand. University-level students in these countries are most likely to use a Harvard generator to aid them with their undergraduate assignments (and often post-graduate too).

🙌 Why should I use a Harvard Referencing Generator?

A Harvard Referencing Generator solves two problems:

  • It provides a way to organise and keep track of the sources referenced in the content of an academic paper.
  • It ensures that references are formatted correctly -- in line with the Harvard referencing style -- and it does so considerably faster than writing them out manually.

A well-formatted and broad bibliography can account for up to 20% of the total grade for an undergraduate-level project, and using a generator tool can contribute significantly towards earning those marks.

⚙️ How do I use MyBib's Harvard Referencing Generator?

Here's how to use our reference generator:

  • If citing a book, website, journal, or video: enter the URL or title into the search bar at the top of the page and press the search button.
  • Choose the most relevant results from the list of search results.
  • Our generator will automatically locate the source details and format them in the correct Harvard format. You can make further changes if required.
  • Then either copy the formatted reference directly into your reference list by clicking the 'copy' button, or save it to your MyBib account for later.

🍏 What other versions of Harvard referencing exist?

There isn't "one true way" to do Harvard referencing, and many universities have their own slightly different guidelines for the style. Our generator can adapt to handle the following list of different Harvard styles:

  • Cite Them Right
  • Manchester Metropolitan University (MMU)
  • University of the West of England (UWE)

Daniel is a qualified librarian, former teacher, and citation expert. He has been contributing to MyBib since 2018.

Critical Thinking: A Simple Guide and Why It’s Important

Critical Thinking: A Simple Guide and Why It’s Important was originally published on Ivy Exec .

Strong critical thinking skills are crucial for career success, regardless of educational background. They embody the capacity for astute and effective decision-making, adding invaluable dimensions to professional growth.

At its essence, critical thinking is the ability to analyze, evaluate, and synthesize information in a logical and reasoned manner. It’s not merely about accumulating knowledge but harnessing it effectively to make informed decisions and solve complex problems. In the dynamic landscape of modern careers, honing this skill is paramount.

The Impact of Critical Thinking on Your Career

☑ Problem-Solving Mastery

Visualize critical thinking as the Sherlock Holmes of your career journey. It facilitates swift problem resolution akin to a detective unraveling a mystery. By methodically analyzing situations and deconstructing complexities, critical thinkers emerge as adept problem solvers, rendering them invaluable assets in the workplace.

☑ Refined Decision-Making

Navigating dilemmas in your career path resembles traversing uncertain terrain. Critical thinking acts as a dependable GPS, steering you toward informed decisions. It involves weighing options, evaluating potential outcomes, and confidently choosing the most favorable path forward.

☑ Enhanced Teamwork Dynamics

Within collaborative settings, critical thinkers stand out as proactive contributors. They engage in scrutinizing ideas, proposing enhancements, and fostering meaningful contributions. Consequently, the team evolves into a dynamic hub of ideas, with the critical thinker recognized as the architect behind its success.

☑ Communication Prowess

Effective communication is the cornerstone of professional interactions. Critical thinking enriches communication skills, enabling the clear and logical articulation of ideas. Whether in emails, presentations, or casual conversations, individuals adept in critical thinking exude clarity, earning appreciation for their ability to convey thoughts seamlessly.

☑ Adaptability and Resilience

Perceptive individuals adept in critical thinking display resilience in the face of unforeseen challenges. Instead of succumbing to panic, they assess situations, recalibrate their approaches, and persist in moving forward despite adversity.

☑ Fostering Innovation

Innovation is the lifeblood of progressive organizations, and critical thinking serves as its catalyst. Proficient critical thinkers possess the ability to identify overlooked opportunities, propose inventive solutions, and streamline processes, thereby positioning their organizations at the forefront of innovation.

☑ Confidence Amplification

Critical thinkers exude confidence derived from honing their analytical skills. This self-assurance radiates during job interviews, presentations, and daily interactions, catching the attention of superiors and propelling career advancement.

So, how can one cultivate and harness this invaluable skill?

✅ Developing Curiosity and Inquisitiveness:

Embrace a curious mindset by questioning the status quo and exploring topics beyond your immediate scope. Cultivate an inquisitive approach to everyday situations. Encourage a habit of asking “why” and “how” to deepen understanding. Curiosity fuels the desire to seek information and alternative perspectives.

✅ Practice Reflection and Self-Awareness:

Engage in reflective thinking by assessing your thoughts, actions, and decisions. Regularly introspect to understand your biases, assumptions, and cognitive processes. Cultivate self-awareness to recognize personal prejudices or cognitive biases that might influence your thinking. This allows for a more objective analysis of situations.

✅ Strengthening Analytical Skills:

Practice breaking down complex problems into manageable components. Analyze each part systematically to understand the whole picture. Develop skills in data analysis, statistics, and logical reasoning. This includes understanding correlation versus causation, interpreting graphs, and evaluating statistical significance.

✅ Engaging in Active Listening and Observation:

Actively listen to diverse viewpoints without immediately forming judgments. Allow others to express their ideas fully before responding. Observe situations attentively, noticing details that others might overlook. This habit enhances your ability to analyze problems more comprehensively.

✅ Encouraging Intellectual Humility and Open-Mindedness:

Foster intellectual humility by acknowledging that you don’t know everything. Be open to learning from others, regardless of their position or expertise. Cultivate open-mindedness by actively seeking out perspectives different from your own. Engage in discussions with people holding diverse opinions to broaden your understanding.

✅ Practicing Problem-Solving and Decision-Making:

Engage in regular problem-solving exercises that challenge you to think creatively and analytically. This can include puzzles, riddles, or real-world scenarios. When making decisions, consciously evaluate available information, consider various alternatives, and anticipate potential outcomes before reaching a conclusion.

✅ Continuous Learning and Exposure to Varied Content:

Read extensively across diverse subjects and formats, exposing yourself to different viewpoints, cultures, and ways of thinking. Engage in courses, workshops, or seminars that stimulate critical thinking skills. Seek out opportunities for learning that challenge your existing beliefs.

✅ Engage in Constructive Disagreement and Debate:

Encourage healthy debates and discussions where differing opinions are respectfully debated.

This practice fosters the ability to defend your viewpoints logically while also being open to changing your perspective based on valid arguments. Embrace disagreement as an opportunity to learn rather than a conflict to win. Engaging in constructive debate sharpens your ability to evaluate and counter arguments effectively.

✅ Utilize Problem-Based Learning and Real-World Applications:

Engage in problem-based learning activities that simulate real-world challenges. Work on projects or scenarios that require critical thinking skills to develop practical problem-solving approaches. Apply critical thinking in real-life situations whenever possible.

This could involve analyzing news articles, evaluating product reviews, or dissecting marketing strategies to understand their underlying rationale.

In conclusion, critical thinking is the linchpin of a successful career journey. It empowers individuals to navigate complexities, make informed decisions, and innovate in their respective domains. Embracing and honing this skill isn’t just an advantage; it’s a necessity in a world where adaptability and sound judgment reign supreme.

So, as you traverse your career path, remember that the ability to think critically is not just an asset but the differentiator that propels you toward excellence.
