
Writing to Think: Critical Thinking and the Writing Process

“Writing is thinking on paper.” (Zinsser, 1976, p. vii)

Google the term “critical thinking.” How many hits are there? On the day this tutorial was completed, Google found about 65,100,000 results in 0.56 seconds. That’s an impressive number, and it grows larger every day, because the nation’s educators, business leaders, and political representatives worry about the level of critical thinking skills among today’s students and workers.

What is Critical Thinking?

Simply put, critical thinking is sound thinking. Critical thinkers work to delve beneath the surface of sweeping generalizations, biases, clichés, and other quick observations that characterize ineffective thinking. They are willing to consider points of view different from their own, seek and study evidence and examples, root out sloppy and illogical argument, discern fact from opinion, embrace reason over emotion or preference, and change their minds when confronted with compelling reasons to do so. In sum, critical thinkers are flexible thinkers equipped to become active and effective spouses, parents, friends, consumers, employees, citizens, and leaders. Every area of life, in other words, can be positively affected by strong critical thinking.

Released in January 2011, an important four-year study of college students concluded that by graduation “large numbers [of American undergraduates] didn’t learn the critical thinking, complex reasoning and written communication skills that are widely assumed to be at the core of a college education” (Rimer, 2011, para. 1). The University designs curriculum, creates support programs, and hires faculty to help ensure you won’t be one of the students “[showing] no significant gains in . . . ‘higher order’ thinking skills” (Rimer, 2011, para. 4). One way the University works to help you build those skills is through writing projects.

Writing and Critical Thinking

Say the word “writing” and most people think of a completed publication. But say the word “writing” to writers, and they will likely think of the process of composing. Most writers would agree with novelist E. M. Forster, who wrote, “How can I know what I think until I see what I say?” (Forster, 1927, p. 99). Experienced writers know that the act of writing stimulates thinking.

Inexperienced and experienced writers have very different understandings of composition. Novice writers often make the mistake of believing they have to know what they’re going to write before they can begin writing. They often compose a thesis statement before asking questions or conducting research. In the course of their reading, they might even disregard material that counters their pre-formed ideas. This is not writing; it is recording.

In contrast, experienced writers begin with questions and work to discover many different answers before settling on those that are most convincing. They know that the act of putting words on paper or a computer screen helps them invent thought and content. Rather than trying to express what they already think, they express what the act of writing leads them to think as they put down words. More often than not, in other words, experienced writers write their way into ideas, which they then develop, revise, and refine as they go.

What has this notion of writing to do with critical thinking? Everything.

Consider the steps of the writing process: prewriting, outlining, drafting, revising, editing, seeking feedback, and publishing. These steps are not followed in a fixed, strict order; instead, the effective writer knows that as they write, it may be necessary to return to an earlier step. In the process of revision, for example, a writer may realize that the order of ideas is unclear. A new outline may help that writer re-order details. As they write, the writer considers and reconsiders the effectiveness of the work.

The writing process, then, is not just a mirror image of the thinking process: it is the thinking process. Confronted with a topic, an effective critical thinker/writer

  • asks questions
  • seeks answers
  • evaluates evidence
  • questions assumptions
  • tests hypotheses
  • makes inferences
  • employs logic
  • draws conclusions
  • predicts readers’ responses
  • creates order
  • drafts content
  • seeks others’ responses
  • weighs feedback
  • criticizes their own work
  • revises content and structure
  • seeks clarity and coherence

Example of Composition as Critical Thinking

“Good writing is fueled by unanswerable questions” (Lane, 1993, p. 15).

Imagine that you have been asked to write about a hero or heroine from history. You must explain what challenges that individual faced and how they conquered them. Now imagine that you decide to write about Rosa Parks and her role in the modern Civil Rights movement. Take a moment and survey what you already know. She refused to get up out of her seat on a bus so a White man could sit in it. She was arrested. As a result, Black residents of Montgomery protested, launching the Montgomery Bus Boycott. Martin Luther King, Jr. took up leadership of the cause, and ultimately a movement was born.

Is that really all there is to Rosa Parks’s story? What questions might a thoughtful writer ask? Here are a few:

  • Why did Rosa Parks refuse to get up on that particular day?
  • Was hers a spontaneous or planned act of defiance?
  • Did she work? Where? Doing what?
  • Had any other Black person refused to get up for a White person?
  • What happened to that individual or those individuals?
  • Why hadn’t that person or those persons received the publicity Parks did?
  • Was Parks active in Civil Rights before that day?
  • How did she learn about civil disobedience?

Even just these few questions could lead to potentially rich information.

Factual information would not be enough, however, to satisfy an assignment that asks for an interpretation of that information. The writer’s job is to convince the reader that Parks was a heroine; to do so, the writer must make an argument and support it. The writer must establish standards of heroic behavior. More questions arise:

  • What is heroic action?
  • What are the characteristics of someone who is heroic?
  • What do heroes value and believe?
  • What are the consequences of a hero’s actions?
  • Why do they matter?

Now the writer has even more research and more thinking to do.

By the time they have raised questions and answered them, raised more questions and answered them, and so on, they are ready to begin writing. But even then, new ideas will arise in the course of planning and drafting, inevitably leading the writer to more research and thought, to more composition and refinement.

Ultimately, at every step of composing a project, the writer is engaged in critical thinking, because the effective writer examines the work as they develop it.

Why Writing to Think Matters

Writing practice builds critical thinking, which empowers people to “take charge of [their] own minds” so they “can take charge of [their] own lives . . . and improve them, bringing them under [their] self command and direction” (Foundation for Critical Thinking, 2020, para. 12). Writing is a way of coming to know and understand the self and the changing world, enabling individuals to make decisions that benefit themselves, others, and society at large. Your knowledge alone – of law, medicine, business, or education, for example – will not be enough to meet future challenges. You will be tested by new, unexpected circumstances, and when they arise, the open-mindedness, flexibility, reasoning, discipline, and discernment you have learned through writing practice will help you meet those challenges successfully.

References

Forster, E. M. (1927). Aspects of the novel. Harcourt, Brace & Company.

Foundation for Critical Thinking. (2020, June 17). Our concept and definition of critical thinking. https://www.criticalthinking.org/pages/our-concept-of-critical-thinking/411

Lane, B. (1993). After the end: Teaching and learning creative revision. Heinemann.

Rimer, S. (2011, January 18). Study: Many college students not learning to think critically. The Hechinger Report. https://www.mcclatchydc.com/news/nation-world/national/article24608056.html

Zinsser, W. (1976). On writing well: The classic guide to writing nonfiction. HarperCollins.


Academic Writing: Critical Thinking & Writing


Critical Thinking

One of the most important features of studying at university is the expectation that you will engage in thinking critically about your subject area. 

Critical thinking involves asking meaningful questions concerning the information, ideas, beliefs, and arguments that you will encounter. It requires you to approach your studies with a curious, open mind, discard preconceptions, and interrogate received knowledge and established practices.

Critical thinking is key to successfully expressing your individuality as an independent learner and thinker in an academic context. It is also a valuable life skill. 

Critical thinking enables you to:

  • Evaluate information, its validity and significance in a particular context.
  • Analyse and interpret evidence and data in response to a line of enquiry.
  • Weigh up alternative explanations and arguments.
  • Develop your own evidence-based and well-reasoned arguments.
  • Develop well-informed viewpoints.
  • Formulate your own independent, justifiable ideas.
  • Actively engage with the wider scholarship of your academic community.

Writing Critically

Being able to demonstrate and communicate critical thinking in your written assignments through critical writing is key to achieving academic success. 

Critical writing can be distinguished from descriptive writing, which is concerned with conveying information rather than interrogating it. Understanding the difference between these two styles of academic writing, and when to use each, is important.

The balance between descriptive writing and critical writing will vary depending on the nature of the assignment and the level of your studies. Some level of descriptive writing is generally necessary to support critical writing, but higher levels of study generally require more sophisticated criticality and less descriptive content. You will continue to develop your critical writing skills as you progress through your course.

Descriptive Writing and Critical Writing


Descriptive writing demonstrates the knowledge you have of a subject, and your knowledge of what other people say about that subject. Descriptive writing often responds to questions framed as ‘what’, ‘where’, ‘who’ and ‘when’.

Descriptive writing might include the following:

  • Description of what something is or what it is about (an account, facts, observable features, details): a topic, problem, situation, or context of the subject under discussion.
  • Description of where it takes place (setting and context), who is involved and when it occurs. 
  • Re-statement or summary of what others say about the topic.
  • Background facts and information for a discussion.

Description usually comes before critical content so that the reader can understand the topic you are critically engaging with.

Critical writing requires you to apply interpretation, analysis, and evaluation to the descriptions you have provided. Critical writing often responds to questions framed as ‘how’ or ‘why’. Often, critical writing will require you to build an argument which is supported by evidence.

Some indicators of critical writing are:

  • Investigation of positive and negative perspectives on ideas
  • Supporting ideas and arguments with evidence, which might include authoritative sources, data, statistics, research, theories, and quotations
  • Balanced, unbiased appraisal of arguments and counterarguments/alternative viewpoints
  • Honest recognition of the limitations of an argument and supporting evidence
  • Plausible, rational, convincing, and well-reasoned conclusions 

Critical writing might include the following:

  • Applying an idea or theory to different situations, or relating theory to practice. Does the idea work/not work in practice? Is there a factor that makes it work/not work? For example: 'Smith's (2008) theory on teamwork is effective in the workplace because it allows a diverse group of people with different skills to work effectively'.
  • Justifying why a process or policy exists. For example: 'It was necessary for the nurse to check the patient's handover notes because...'
  • Proposing an alternative approach to view and act on situations. For example: 'By adopting a Freirian approach, we could view the student as a collaborator in our teaching and learning'. Or: 'If we had followed the NMC guidelines we could have made the patient feel calm and relaxed during the consultation'.
  • Discussion of the strengths and weaknesses of an idea/theory/policy. Why does this idea/theory/policy work? Or why does this idea not work? For example: 'Although Smith's (2008) theory on teamwork is useful for large teams, there are challenges in applying this theory to teams who work remotely'. 
  • Discussion of how the idea links to other ideas in the field (synthesis). For example: 'the user experience of parks can be greatly enhanced by examining Donnelly's (2009) customer service model used in retail'.
  • Discussion of how the idea compares and contrasts with other ideas/theories. For example: ‘The approach advocated by the NMC differs in comparison because of factor A and factor C’.
  • Discussion of the “up-to-dateness” and relevance of an idea/theory/policy (its currency). For example: 'although this approach was successful in supporting the local community, Smith's model does not accommodate the needs of a modern global economy'.
  • Evaluating an idea/theory/policy by providing evidence-informed judgment. For example: 'Therefore, May's delivery model should be discontinued as it has created significant issues for both customers and staff (Ransom, 2018)'.
  • Creating new perspectives or arguments based on knowledge. For example: 'to create strong and efficient buildings, we will look to the designs provided by nature. The designs of the Sydney Opera House are based on the segments of an orange (Cook, 2019)'. 



Module 1: Success Skills

Critical Thinking

Learning Objectives

  • define critical thinking
  • identify the role that logic plays in critical thinking
  • apply critical thinking skills to problem-solving scenarios
  • apply critical thinking skills to evaluation of information


Consider these thoughts about the critical thinking process and how it applies not just to our school lives but also to our personal and professional lives.

“Thinking Critically and Creatively”

Critical thinking skills are perhaps the most fundamental skills involved in making judgments and solving problems. You use them every day, and you can continue improving them.

The ability to think critically about a matter—to analyze a question, situation, or problem down to its most basic parts—is what helps us evaluate the accuracy and truthfulness of statements, claims, and information we read and hear. It is the sharp knife that, when honed, separates fact from fiction, honesty from lies, and the accurate from the misleading. We all use this skill to one degree or another almost every day. For example, we use critical thinking every day as we consider the latest consumer products and why one particular product is the best among its peers. Is it a quality product because a celebrity endorses it? Because a lot of other people may have used it? Because it is made by one company versus another? Or perhaps because it is made in one country or another? These are questions representative of critical thinking.

The academic setting demands more of us in terms of critical thinking than everyday life. It demands that we evaluate information and analyze myriad issues. It is the environment where our critical thinking skills can be the difference between success and failure. In this environment we must consider information in an analytical, critical manner. We must ask questions—What is the source of this information? Is this source an expert one and what makes it so? Are there multiple perspectives to consider on an issue? Do multiple sources agree or disagree on an issue? Does quality research substantiate information or opinion? Do I have any personal biases that may affect my consideration of this information?

It is only through purposeful, frequent, intentional questioning such as this that we can sharpen our critical thinking skills and improve as students, learners and researchers.

—Dr. Andrew Robert Baker,  Foundations of Academic Success: Words of Wisdom

Defining Critical Thinking

Thinking comes naturally. You don’t have to make it happen—it just does. But you can make it happen in different ways. For example, you can think positively or negatively. You can think with “heart” and you can think with rational judgment. You can also think strategically and analytically, and mathematically and scientifically. These are a few of multiple ways in which the mind can process thought.

What are some forms of thinking you use? When do you use them, and why?

As a college student, you are tasked with engaging and expanding your thinking skills. One of the most important of these skills is critical thinking. Critical thinking is important because it relates to nearly all tasks, situations, topics, careers, environments, challenges, and opportunities. It’s not restricted to a particular subject area.


Critical thinking is clear, reasonable, reflective thinking focused on deciding what to believe or do. It means asking probing questions like, “How do we know?” or “Is this true in every case or just in this instance?” It involves being skeptical and challenging assumptions, rather than simply memorizing facts or blindly accepting what you hear or read.

Imagine, for example, that you’re reading a history textbook. You wonder who wrote it and why, because you detect certain assumptions in the writing. You find that the author has a limited scope of research focused only on a particular group within a population. In this case, your critical thinking reveals that there are “other sides to the story.”

Who are critical thinkers, and what characteristics do they have in common? Critical thinkers are usually curious and reflective people. They like to explore and probe new areas and seek knowledge, clarification, and new solutions. They ask pertinent questions, evaluate statements and arguments, and distinguish between fact and opinion. They are also willing to examine their own beliefs, possessing a manner of humility that allows them to admit lack of knowledge or understanding when needed. They are open to changing their mind. Perhaps most of all, they actively enjoy learning, and seeking new knowledge is a lifelong pursuit.

This may well be you!

No matter where you are on the road to being a critical thinker, you can always more fully develop your skills. Doing so will help you develop more balanced arguments, express yourself clearly, read critically, and absorb important information efficiently. Critical thinking skills will help you in any profession or any circumstance of life, from science to art to business to teaching.

Critical Thinking in Action

The following video, from Lawrence Bland, presents the major concepts and benefits of critical thinking.

Critical Thinking and Logic

Critical thinking is fundamentally a process of questioning information and data. You may question the information you read in a textbook, or you may question what a politician or a professor or a classmate says. You can also question a commonly-held belief or a new idea. With critical thinking, anything and everything is subject to question and examination.

Logic’s Relationship to Critical Thinking

The word logic comes from the Ancient Greek logike , referring to the science or art of reasoning. Using logic, a person evaluates arguments and strives to distinguish between good and bad reasoning, or between truth and falsehood. Using logic, you can evaluate ideas or claims people make, make good decisions, and form sound beliefs about the world. [1]

Questions of Logic in Critical Thinking

Let’s use a simple example of applying logic to a critical-thinking situation. In this hypothetical scenario, a man has a PhD in political science, and he works as a professor at a local college. His wife works at the college, too. They have three young children in the local school system, and their family is well known in the community.

The man is now running for political office. Are his credentials and experience sufficient for entering public office? Will he be effective in the political office? Some voters might believe that his personal life and current job, on the surface, suggest he will do well in the position, and they will vote for him.

In truth, the characteristics described don’t guarantee that the man will do a good job. The information is somewhat irrelevant. What else might you want to know? How about whether the man had already held a political office and done a good job? In this case, we want to ask: how much information is adequate to make a decision based on logic instead of assumptions?

The following questions are ones you may apply to formulating a logical, reasoned perspective in the above scenario or any other situation:

  • What’s happening? Gather the basic information and begin to think of questions.
  • Why is it important? Ask yourself why it’s significant and whether or not you agree.
  • What don’t I see? Is there anything important missing?
  • How do I know? Ask yourself where the information came from and how it was constructed.
  • Who is saying it? What’s the position of the speaker and what is influencing them?
  • What else? What if? What other ideas exist and are there other possibilities?


Problem-Solving With Critical Thinking

For most people, a typical day is filled with critical thinking and problem-solving challenges. In fact, critical thinking and problem-solving go hand-in-hand. They both refer to using knowledge, facts, and data to solve problems effectively. But with problem-solving, you are specifically identifying, selecting, and defending your solution. Below are some examples of using critical thinking to problem-solve:

  • Your roommate was upset and said some unkind words to you, which put a crimp in your relationship. You try to see through the angry behaviors to determine how you might best support your roommate and help bring your relationship back to a comfortable spot.


  • Your final art class project challenges you to conceptualize form in new ways. On the last day of class when students present their projects, you describe the techniques you used to fulfill the assignment. You explain why and how you selected that approach.
  • Your math teacher sees that the class is not quite grasping a concept. She uses clever questioning to dispel anxiety and guide you to new understanding of the concept.
  • You have a job interview for a position that you feel you are only partially qualified for, although you really want the job and you are excited about the prospects. You analyze how you will explain your skills and experiences in a way to show that you are a good match for the prospective employer.
  • You are doing well in college, and most of your college and living expenses are covered. But there are some gaps between what you want and what you feel you can afford. You analyze your income, savings, and budget to better calculate what you will need to stay in college and maintain your desired level of spending.

Problem-Solving Action Checklist

Problem-solving can be an efficient and rewarding process, especially if you are organized and mindful of critical steps and strategies. Remember, too, to assume the attributes of a good critical thinker. If you are curious, reflective, knowledge-seeking, open to change, probing, organized, and ethical, your challenge or problem will be less of a hurdle, and you’ll be in a good position to find intelligent solutions.

Evaluating Information With Critical Thinking

Evaluating information can be one of the most complex tasks you will be faced with in college. But if you utilize the following four strategies, you will be well on your way to success:

  • Read for understanding by using text coding
  • Examine arguments
  • Clarify thinking
  • Cultivate “habits of mind”


1. Read for Understanding Using Text Coding

When you read and take notes, use the text coding strategy. Text coding is a way of tracking your thinking while reading. It entails marking the text and recording what you are thinking either in the margins or perhaps on Post-it notes. As you make connections and ask questions in response to what you read, you monitor your comprehension and enhance your long-term understanding of the material.

With text coding, mark important arguments and key facts. Indicate where you agree and disagree or have further questions. You don’t necessarily need to read every word, but make sure you understand the concepts or the intentions behind what is written. Feel free to develop your own shorthand style when reading or taking notes; for example codes, see the text coding resources from PBWorks and the Collaborative for Teaching and Learning.

2. Examine Arguments

When you examine arguments or claims that an author, speaker, or other source is making, your goal is to identify and examine the hard facts. You can use the spectrum of authority strategy for this purpose. The spectrum of authority strategy assists you in identifying the “hot” end of an argument—feelings, beliefs, cultural influences, and societal influences—and the “cold” end of an argument—scientific influences. The following video explains this strategy.

3. Clarify Thinking

When you use critical thinking to evaluate information, you need to clarify your thinking to yourself and likely to others. Doing this well is mainly a process of asking and answering probing questions, such as the logic questions discussed earlier. Design your questions to fit your needs, but be sure to cover adequate ground. What is the purpose? What question are we trying to answer? What point of view is being expressed? What assumptions are we or others making? What are the facts and data we know, and how do we know them? What are the concepts we’re working with? What are the conclusions, and do they make sense? What are the implications?

4. Cultivate “Habits of Mind”

“Habits of mind” are the personal commitments, values, and standards you have about the principle of good thinking. Consider your intellectual commitments, values, and standards. Do you approach problems with an open mind, a respect for truth, and an inquiring attitude? Some good habits to have when thinking critically are being receptive to having your opinions changed, having respect for others, being independent and not accepting something as true until you’ve had time to examine the available evidence, being fair-minded, having respect for reason, having an inquiring mind, not making assumptions, and always, especially, questioning your own conclusions—in other words, developing an intellectual work ethic. Try to work these qualities into your daily life.

  • "logic." Wordnik . n.d. Web. 16 Feb 2016 . ↵
  • "Student Success-Thinking Critically In Class and Online."  Critical Thinking Gateway . St Petersburg College, n.d. Web. 16 Feb 2016. ↵
  • Outcome: Critical Thinking. Provided by : Lumen Learning. License : CC BY: Attribution
  • Self Check: Critical Thinking. Provided by : Lumen Learning. License : CC BY: Attribution
  • Foundations of Academic Success. Authored by : Thomas C. Priester, editor. Provided by : Open SUNY Textbooks. Located at : http://textbooks.opensuny.org/foundations-of-academic-success/ . License : CC BY-NC-SA: Attribution-NonCommercial-ShareAlike
  • Image of woman thinking. Authored by : Moyan Brenn. Located at : https://flic.kr/p/8YV4K5 . License : CC BY: Attribution
  • Critical Thinking. Provided by : Critical and Creative Thinking Program. Located at : http://cct.wikispaces.umb.edu/Critical+Thinking . License : CC BY: Attribution
  • Critical Thinking Skills. Authored by : Linda Bruce. Provided by : Lumen Learning. Project : https://courses.lumenlearning.com/lumencollegesuccess/chapter/critical-thinking-skills/. License : CC BY: Attribution
  • Image of critical thinking poster. Authored by : Melissa Robison. Located at : https://flic.kr/p/bwAzyD . License : CC BY: Attribution
  • Thinking Critically. Authored by : UBC Learning Commons. Provided by : The University of British Columbia, Vancouver Campus. Located at : http://www.oercommons.org/courses/learning-toolkit-critical-thinking/view . License : CC BY: Attribution
  • Critical Thinking 101: Spectrum of Authority. Authored by : UBC Leap. Located at : https://youtu.be/9G5xooMN2_c . License : CC BY: Attribution
  • Image of students putting post-its on wall. Authored by : Hector Alejandro. Located at : https://flic.kr/p/7b2Ax2 . License : CC BY: Attribution
  • Image of man thinking. Authored by : Chad Santos. Located at : https://flic.kr/p/phLKY . License : CC BY: Attribution
  • Critical Thinking.wmv. Authored by : Lawrence Bland. Located at : https://youtu.be/WiSklIGUblo . License : All Rights Reserved . License Terms : Standard YouTube License

University of York Library


Being critical: a practical guide


This guide contains key resources to introduce you to the features of critical writing.

For more in-depth advice and guidance on critical writing, visit our specialist academic writing guides.


What is critical writing?

Academic writing requires criticality: it's not enough to just describe or summarise evidence; you also need to analyse and evaluate information and use it to build your own arguments. This is where you show your own thoughts based on the evidence available, so critical writing is really important for higher grades.

Explore the key features of critical writing and see it in practice in some examples:

Introduction to critical writing [Google Slides]

While we need criticality in our writing, it's definitely possible to go further than needed. We’re aiming for that Goldilocks ‘just right’ point between not critical enough and too critical.




Quoting, paraphrasing and synthesising

Quoting, paraphrasing and synthesising are different ways that you can use evidence from sources in your writing. As you move from one method to the next, you integrate the evidence further into your argument, showing increasing critical analysis.

Here's a quick introduction to the three methods and how to use them:

Quoting, paraphrasing and synthesising: an introduction [YouTube video] | Quoting, paraphrasing and synthesising [Google Doc]

Want to know more? The guide links to further resources with more examples of paraphrasing and using notes to synthesise information.

Using evidence to build critical arguments

Academic writing integrates evidence from sources to create your own critical arguments.

We're not looking for a list of summaries of individual sources; ideally, the important evidence should be integrated into a cohesive whole. What does the evidence mean altogether?  Of course, a critical argument also needs some critical analysis of this evidence. What does it all mean in terms of your argument?

These resources will help you explore ways to integrate evidence and build critical arguments:

Building a critical argument [YouTube] | Building a critical argument [Google Doc]



Critical Thinking and Writing: Critical Writing


Common feedback from lecturers is that students' writing does not show enough criticality: "too descriptive", "not supported by enough evidence", "unbalanced", "not enough critical analysis". This guide provides the foundations of critical writing along with some useful techniques to assist you in strengthening this skill.

Key features of critical writing

Key features in critical writing include:

  • Presenting strong supporting evidence and a clear argument that leads to a reasonable conclusion. 
  • Presenting a balanced argument that indicates an unbiased view by evaluating both the evidence that supports your argument as well as the counter-arguments that may show an alternative perspective on the subject.
  • Refusing to simply accept and agree with other writers - you should show criticality towards others' work and evaluate their arguments, questioning whether their supporting evidence holds up, whether they show any biases, whether they have considered alternative perspectives, and how their arguments fit into the wider dialogue/debate taking place in their field.
  • Recognizing the limitations of your evidence, argument and conclusion and therefore indicating where further research is needed.

Structuring Your Writing to Express Criticality

In order to be considered critical, academic writing must go beyond being merely descriptive. Whilst you may have some descriptive writing in your assignments to clarify terms or provide background information, it is important for the majority of your assignment to provide analysis and evaluation. 

Description:

Define clearly what you are talking about; introduce a topic.

Analysis:

Analysis literally means breaking down an issue into smaller components to better understand the structure of the problem. However, there is much more to analysis: you may at times need to examine and explain how parts fit into a whole, give reasons, compare and contrast different elements, and show your understanding of relationships. Analysis is to a large extent context- and subject-specific.

Here are some possible analytical questions:

  • What are the constituent elements of something?
  • How do the elements interact?
  • What can be grouped together? What does grouping reveal?
  • How does this compare and contrast with something else?
  • What are the causes (factors) of something?
  • What are the implications of something?
  • How is this influenced by different external areas, such as the economy, society etc (e.g. SWOT, PESTEL analysis)?
  • Does it happen all the time? When? Where?
  • What other factors play a role? What is absent/missing?
  • What other perspectives should we consider?
  • What if? What are the alternatives?
  • With analysis you challenge the “received knowledge” and your own assumptions.

Analysis is different within different disciplines:

  • Data analysis (filter, cluster…)
  • Compound analysis (chemistry)
  • Financial statements analysis
  • Market analysis (SWOT analysis)
  • Program analysis (computer science) - the process of automatically analysing the behaviour of computer programs
  • Policy Analysis (public policy) – The use of statistical data to predict the effects of policy decisions made by governments and agencies
  • Content analysis (linguistics, literature)
  • Psychoanalysis – study of the unconscious mind.

Evaluation:

  • Identify strengths and weaknesses. 
  • Assess the evidence, methodology, argument etc. presented in a source. 
  • Judge the success or failure of something, its implications and/or value.
  • Draw conclusions from your material, make judgments about it, and relate it to the question asked. 
  • Express "mini-arguments" on the issues your raise and analyse throughout your work. (See box Your Argument.)
  • Express an overarching argument on the topic of your research. (See Your Argument .)

Tip: Try to include a bit of description, analysis and evaluation in every paragraph. Writing strong paragraphs can help, as it reminds you to end each paragraph by drawing a conclusion. However, you may also intersperse the analysis with evaluation within the development of the paragraph.

Your Argument

What is an argument?

Essentially, the aim of an essay (and other forms of academic writing, including dissertations) is to present and defend, with reasons and evidence, an argument relating to a given topic. In the academic context, argument means something specific. It is the main claim/view/position/conclusion on a matter, which can be the answer to the essay (or research) question. The development of an argument is closely related to criticality, as in your academic writing you are not supposed to merely describe things; you also need to analyse and draw conclusions.

Tips on devising an argument

  • Try to think of a clear statement. It may be as simple as trying to prove that a statement in the essay title is right or wrong. 
  • Identify rigorous evidence and logical reasons to back up your argument. 
  • Consider different perspectives and viewpoints, but show why your argument prevails. 
  • Structure your writing in light of your argument: the argument will shape the whole text, which will present a logical and well-structured account of background information, evidence, reasons and discussion to support your argument.
  • Link and signpost to your argument throughout your work. 

Argument or arguments?

Both! Ideally, in your essay you will have an overarching argument (claim) and several mini-arguments, which make points and take positions on the issues you discuss within the paragraphs. 


  • Academic Argumentation: a help-sheet highlighting the differences between everyday and academic argumentation.
  • Argument: a useful guide developed by The Writing Center, University of North Carolina at Chapel Hill.

Useful resources

Learning Development, University of Plymouth (2010). Critical Thinking. University of Plymouth. Available from https://www.plymouth.ac.uk/uploads/production/document/path/1/1710/Critical_Thinking.pdf [Accessed 16 January 2020].

Student Learning Development, University of Leicester (no date). Questions to ask about your level of critical writing. University of Leicester. Available from https://www2.le.ac.uk/offices/ld/resources/writing/questions-to-ask/questions-to-ask-about-your-level-of-critical-writing [Accessed 16 January 2020].

Workshop recording

  • Critical thinking and writing online workshop Recording of a 45-minute online workshop on critical thinking and writing, delivered by one of our Learning Advisers, Dr Laura Niada.

Workshop Slides

  • Critical Thinking and Writing

The Importance of Critical Thinking in Writing (and How to Apply It)


Developing unique ideas for writing and writing a story worth reading can be challenging. Even when the ideas for writing are already in your head, writing requires research, organization, and a great deal of creativity. But, you already knew that, right?

What many people don’t know or don't realize, however, is that all those processes for effective writing depend on how well you’ve developed your critical thinking skills.

According to the Texas A&M University Writing Center, critical thinking is "the ability to view any object of study from multiple perspectives, to recognize the cultural, ideological, and cognitive frames (or schemata) we bring to understanding."

You can learn everything about story structure and all the rules that come with it, such as formatting, language and grammar rules, but applying your ideas effectively in an actual piece of writing requires critical thinking.

Critical thinking is what glues all of the writing processes together and defines your writing style.

Critical Thinking Informs All Good Writing


The best writers are those who think critically and may even have undergone some form of critical thinking training. The value of critical thinking is clear thought processing, which results in well-developed plots and writing. When you need to write a story that reads well and avoids plot holes and inconsistencies, honing your critical thinking is necessary.

You can perform the research necessary for a story and plan to finish with a strong conclusion. But, when you don’t apply critical thinking in your story, your ideas risk coming across as ambiguous or not well thought out. This is because you can’t really plan out your arguments or provide the story’s premises effectively without critical thinking.

Critical thinking in writing is related to research in the way you deliberately search for, analyze, and evaluate the ideas that you'll put on paper. However, critical thinking goes further, filtering information and ideas so that you pick and use only the most appropriate, concise words and paragraphs that deliver your message powerfully and with great impact on readers.

Researchers have also come to understand that critical thinking is itself a habit and a skill, something you can practice, polish, and develop.

Hone Your Critical Thinking Skills

To consciously direct and hone your critical thinking skills, you’ll need to answer some basic questions before writing your story: 

  • How good is my argument or story idea?
  • Is my argument or idea defensible and valid?
  • Am I using a rational, reasonable position on the idea or issue?
  • What should I use to best present this idea and deal with its complexity?
  • Should I go deep into the topic or only touch upon the key issues lightly?
  • Should I address any other points of view, and which ones?
  • What are my goals with the story?
  • What sources of information should I consult?
  • What's the best way to present the information?

When asking (and answering) these questions, your analytical skills and quality of answers will depend greatly on the clarity of your thoughts, sources, and intentions. Once that's done sufficiently, you can apply it all to your writing.

8 Ways to Apply Critical Thinking in Your Writing


To make sure you write your story based on sound critical thinking, use these handy tips:

1. Research by questioning everything

Not all of the sources you will be using for your story, research, and critical analysis will be accurate or even relevant. Thinking critically means that you should question all your sources and be careful about the acquisition of data you’ll use in your story.

To write critically, you must examine every little piece of information before using it; validate and parse it as part of your research. Basically, you need an active, critical, and detailed approach throughout the accumulation of information.

2. Scrutinize your method of gathering information

Before you use any of the evidence or information you have found during the research for your story, look at the method used to gather it.

Think of sources you plan to use and places where you can find them. But, most importantly, think of the sources’ credibility and whether or not you can ascertain this.

Only use information that is reliable in your stories.

3. Stay true to the evidence

Before you jump to any conclusions, examine the evidence and the direction it points, free of bias.

Carefully examining the evidence for your ideas will help you confirm which information is valid and catch any important information you might have missed in an argument.

To avoid turning your story into a poorly written one, stay true to the evidence you’ve collected. Also consider the evidence itself in detail.

Is the evidence too broad? Does it have too many details? Are there any other explanations you can provide for it? Do you have enough evidence to support your arguments? Use only the most appropriate and accurate evidence.

4. Eliminate truisms and tautologies

A truism is a truth that is self-evident, while a tautology is a statement that repeats the same thing in different words. Both create redundancy that, in most cases, doesn't add anything to your story.

Even though truisms and tautologies, used masterfully, can give a story a certain artistic quality, you should generally try to avoid them in your writing.

Critically look for statements in your writing that repeat themselves or are self-evident. These are unnecessary features of your writing that should be removed to improve precision and clarity in your story.

5. Avoid oversimplification

There is a fine line between improving clarity and oversimplification. Try to achieve the former, while eradicating the latter as much as possible.

We are talking about using short, concise, simple, easy-to-understand explanations, and avoiding dumbed-down explanations that insult the intelligence of the reader and demonstrate a lack of breadth and depth.

That certainly calls for a high level of critical thinking and judgment when writing or crafting a story.

6. Plan ahead

When selecting a topic for your story, brainstorm ideas for it beforehand. Make sure the topic you choose is right for your specific purpose. Think of your objectives and goals, and also what you represent.

By brainstorming and planning ahead, you’ll be better equipped to write a story that is concise, relevant, and properly organized.

One key factor in planning is organization. To plan ahead and do it well, you need to prioritize and organize your concepts, ideas, and arguments.

In other words, you need a chronology of ideas and arguments. Use careful discretion and judgment to create a plan that makes sense and demonstrates your critical thinking abilities.

7. Define your approaches

In writing, you need arguments and ideas. But, you cannot just toss them around anyhow and expect them to make sense.

Instead, you’ll not only need good organization and planning skills, but also a strategy or an approach for presenting them in the most effective way possible.

As soon as you have all the evidence and material ready for use in your story, analyze the strengths and weaknesses of your sources and the arguments they raise. This will help you define the best possible approach for using the evidence and material in your story.

While you take care of this part, remember that each and every argument and piece of evidence used in your story should be as reasonable as it is valid.

8. Break down your arguments

To better present the relationships between arguments in your story, and to find the best writing approach, break down arguments into smaller, easy to understand parts. For this purpose, you can use priority ranking, comparison and contrast, cause and effect, making inferences, and drawing conclusions.

Cons of Not Using Critical Thinking in Your Writing

If you think applying critical thinking in writing is too much of a hassle, understand that not incorporating critical thinking leads to poor writing.

And it’s easy to detect the effects of not using critical thinking in writing. Some of the obvious signs of not applying critical thinking in a piece of writing include:

  • Relationships between concepts aren’t clearly described, but only summarized or alluded to.
  • The arguments or thesis are repetitive and don’t relate to the rest of the story.
  • Poor or no order whatsoever in the presentation of arguments, summaries, and evidence.
  • No chronology or sequence in sentences, arguments, and/or paragraphs.
  • Weak summaries or summaries with no order.
  • Relationships between arguments aren’t fully developed.
  • Heavy use of truisms, tautologies, and/or abstractions.

If you want to write powerfully and ensure your stories (be they blogs, essays, or reports) yield results and impact readers, you have to improve clarity and add informational value. The only way to do this is by employing critical thinking in your writing.

Critical thinking is an essential skill and practice not just for good writing, but also for effective storytelling within your writings.

Alexandra Reay is a journalist, writer, and editor. She is also a professional content writer who enjoys researching and writing on the topics of self-improvement, technology innovations, and global education development. Follow her on Twitter.


Thinking through Writing: A Guide to Becoming a Better Writer and Thinker

  • John Kaag and Jonathan van Belle

A concise and practical manual on developing reading, writing, and critical thinking skills in tandem

  • Skills for Scholars


For college students learning how to write on scholarly subjects, writing and critical thinking go hand in hand. And yet most books on these topics are categorized separately: writing guides and critical thinking handbooks. This book is different, offering a manual for developing reading, writing, and thinking skills in tandem. With short, practical chapters, Thinking through Writing helps readers learn to think critically about themselves and the world at large, read carefully and get the necessary literary support, write clearly and persuasively, stay on point, and finish their work as cleanly and compellingly as possible. Drawing on years of teaching critical thinking and writing, including almost a decade of teaching Harvard’s freshman expository writing course, the authors invite readers to consider the intimate relationship between thinking and the creative, critical, self-actualizing act of writing.

  • Interviews with some of the most interesting and brilliant writers working today
  • Advice on how to structure an argument, write for an audience, work through writer’s block and anxiety, and much more
  • Tips on how to make your writing unique and personal
  • Exercises and templates to help novice writers reach their full potential in practice


“One of the great strengths of the book is its knowledgeable-but-cheeky tone. It lends the authors enormous credibility when offering advice and solace and will help keep student readers on their toes rather than checking out of reading a dry textbook.” —Lisa Horton, University of Minnesota Duluth

“With a unique, lively, and contemporary approach, the authors of Thinking through Writing meet students where they are at both intellectually and personally, addressing them as partners and allies in the writing process.”—Douglas Dowland, Ohio Northern University

“Good writing involves ‘thinking against oneself’—questioning oneself—and this jargon-free, concise, witty book not only does an admirable job of showing the student-writer how to perform this magic trick but also vividly executes it page by page, line by line.”—David Shields, New York Times bestselling author of The Thing about Life Is That One Day You'll Be Dead

“Writing is thinking, not just a record of your thoughts—and this book exemplifies that truth on every page. With bell-ringing clarity, Thinking through Writing shows how to sort your thoughts out into arguments, and how to support those arguments firmly and concisely. It's a true writer's companion, and a must-read for students who want to move from idea to page.”—Leonard Cassuto, author of Academic Writing as if Readers Matter


Learning to Improve: Using Writing to Increase Critical Thinking Performance in General Education Biology

  • Ian J. Quitadamo
  • Martha J. Kurtz

Department of Biological Sciences, Central Washington University, Ellensburg, WA 98926-7537

Department of Chemistry, Central Washington University, Ellensburg, WA 98926-7539

Increasingly, national stakeholders express concern that U.S. college graduates cannot adequately solve problems and think critically. As a set of cognitive abilities, critical thinking skills provide students with tangible academic, personal, and professional benefits that may ultimately address these concerns. As an instructional method, writing has long been perceived as a way to improve critical thinking. In the current study, the researchers compared critical thinking performance of students who experienced a laboratory writing treatment with those who experienced traditional quiz-based laboratory in a general education biology course. The effects of writing were determined within the context of multiple covariables. Results indicated that the writing group significantly improved critical thinking skills whereas the nonwriting group did not. Specifically, analysis and inference skills increased significantly in the writing group but not the nonwriting group. Writing students also showed greater gains in evaluation skills; however, these were not significant. In addition to writing, prior critical thinking skill and instructor significantly affected critical thinking performance, whereas other covariables such as gender, ethnicity, and age were not significant. With improved critical thinking skill, general education biology students will be better prepared to solve problems as engaged and productive citizens.

INTRODUCTION

A national call to improve critical thinking in science.

In the past several years, an increasing number of national reports indicate a growing concern over the effectiveness of higher education teaching practices and the decreased science (and math) performance of U.S. students relative to other industrialized countries ( Project Kaleidoscope, 2006 ). A variety of national stakeholders, including business and educational leaders, politicians, parents, and public agencies, have called for long-term transformation of the K–20 educational system to produce graduates who are well trained in science, can engage intelligently in global issues that require local action, and in general are better able to solve problems and think critically. Specifically, business leaders are calling for graduates who possess advanced analysis and communication skills, for instructional methods that improve lifelong learning, and ultimately for an educational system that builds a nation of innovative and effective thinkers ( Business-Higher Education Forum and American Council on Education, 2003 ). Education leaders are similarly calling for institutions of higher education to produce graduates who think critically, communicate effectively, and who employ lifelong learning skills to address important scientific and civic issues ( Association of American Colleges and Universities, [AACU] 2005 ).

Many college faculty consider critical thinking to be one of the most important indicators of student learning quality. In its 2005 national report, the AACU indicated that 93% of higher education faculty perceived analytical and critical thinking to be an essential learning outcome (AACU, 2005) whereas 87% of undergraduate students indicated that college experiences contributed to their ability to think analytically and creatively. This same AACU report showed that only 6% of undergraduate seniors demonstrated critical thinking proficiency based on Educational Testing Services standardized assessments from 2003 to 2004. During the same time frame, data from the ACT Collegiate Assessment of Academic Proficiency test showed a similar trend, with undergraduates improving their critical thinking less than 1 SD from freshman to senior year. Thus, it appears a discrepancy exists between faculty expectations of critical thinking and students' ability to perceive and demonstrate critical thinking proficiency using standardized assessments (AACU, 2005).

Teaching that supports the development of critical thinking skills has become a cornerstone of nearly every major educational objective since the Department of Education released its six goals for the nation's schools in 1990. In particular, goal three of the National Goals for Education stated that more students should be able to reason, solve problems, and apply knowledge. Goal six specifically stated that college graduates must be able to think critically ( Office of Educational Research and Improvement, 1991 ). Since 1990, American education has tried—with some success—to make a fundamental shift from traditional teacher-focused instruction to more student-centered constructivist learning that encourages discovery, reflection, and in general is thought to improve student critical thinking skill. National science organizations have supported this trend with recommendations to improve the advanced thinking skills that support scientific literacy ( American Association for Higher Education, 1989 ; National Research Council, 1995 ; National Science Foundation, 1996 ).

More recent reports describe the need for improved biological literacy as well as international competitiveness ( Bybee and Fuchs, 2006 ; Klymkowsky, 2006 ). Despite the collective call for enhanced problem solving and critical thinking, educators, researchers, and policymakers are discovering a lack of evidence in existing literature for methods that measurably improve critical thinking skills ( Tsui, 1998 , 2002 ). As more reports call for improved K–20 student performance, it is essential that research-supported teaching and learning practices be used to better help students develop the cognitive skills that underlie effective science learning ( Malcom et al., 2005 ; Bybee and Fuchs, 2006 ).

Critical Thinking

Although they are not always transparent to many college students, the academic and personal benefits of critical thinking are well established; students who can think critically tend to get better grades, are often better able to use reasoning in daily decisions ( U.S. Department of Education, 1990 ), and are generally more employable ( Carnevale and American Society for Training and Development, 1990 ; Holmes and Clizbe, 1997 ; National Academy of Sciences, 2005 ). By focusing on instructional efforts that develop critical thinking skills, it may be possible to increase student performance while satisfying national stakeholder calls for educational improvement and increased ability to solve problems as engaged and productive citizens.

Although academics and business professionals consider critical thinking skill to be a crucial outcome of higher education, many would have difficulty defining exactly what critical thinking is. Historically, there has been little agreement on how to conceptualize critical thinking. Of the literally dozens of definitions that exist, one of the most organized efforts to define (and measure) critical thinking emerged from research done by Peter Facione and others in the early 1990s. Their consensus work, referred to as the Delphi report, was accomplished by a group of 46 leading theorists, teachers, and critical thinking assessment specialists from a variety of academic and business disciplines ( Facione and American Philosophical Association, 1990 ). Initial results from the Delphi report were later confirmed in a national survey and replication study ( Jones et al., 1995 ). In short, the Delphi panel expert consensus describes critical thinking as a “process of purposeful self-regulatory judgment that drives problem-solving and decision-making” ( Facione and American Philosophical Association, 1990 ). This definition implies that critical thinking is an intentional, self-regulated process that provides a mechanism for solving problems and making decisions based on reasoning and logic, which is particularly useful when dealing with issues of national and global significance.

The Delphi conceptualization of critical thinking encompasses several cognitive skills that include: 1) analysis (the ability to break a concept or idea into component pieces in order to understand its structure and inherent relationships), 2) inference (the skills used to arrive at a conclusion by reconciling what is known with what is unknown), and 3) evaluation (the ability to weigh and consider evidence and make reasoned judgments within a given context). Other critical thinking skills that are similarly relevant to science include interpretation, explanation, and self-regulation ( Facione and American Philosophical Association, 1990 ). The concept of critical thinking includes behavioral tendencies or dispositions as well as cognitive skills ( Ennis, 1985 ); these include the tendency to seek truth, to be open-minded, to be analytical, to be orderly and systematic, and to be inquisitive ( Facione and American Philosophical Association, 1990 ). These behavioral tendencies also align closely with behaviors considered to be important in science. Thus, an increased focus on teaching critical thinking may directly benefit students who are engaged in science.

Prior research on critical thinking indicates that students' behavioral dispositions do not change in the short term ( Giancarlo and Facione, 2001 ), but cognitive skills can be developed over a relatively short period of time (Quitadamo, Brahler, and Crouch, unpublished results). In their longitudinal study of behavioral disposition toward critical thinking, Giancarlo and Facione (2001) discovered that undergraduate critical thinking disposition changed significantly after two years. Specifically, significant changes in student tendency to seek truth and confidence in thinking critically occurred during the junior and senior years. Also, females tended to be more open-minded and have more mature judgment than males ( Giancarlo and Facione, 2001 ). Although additional studies are necessary to confirm results from the Giancarlo study, existing research seems to indicate that changes in undergraduate critical thinking disposition are measured in years, not weeks.

In contrast to behavioral disposition, prior research indicates that critical thinking skills can be measurably changed in weeks. In their study of undergraduate critical thinking skill in university science and math courses, Quitadamo, Brahler, and Crouch (unpublished results) showed that critical thinking skills changed within 15 wk in response to Peer Led Team Learning (a national best practice for small group learning). This preliminary study provided some evidence that undergraduate critical thinking skills could be measurably improved within an academic semester, but provided no information about whether critical thinking skills could be changed during a shorter academic quarter. It was also unclear whether the development of critical thinking skills was a function of chronological time or whether it was related to instructional time.

Numerous studies provide anecdotal evidence for pedagogies that improve critical thinking, but much of existing research relies on student self-report, which limits the scope of interpretation. From the literature it is clear that, although critical thinking skills are some of the most valued outcomes of a quality education, additional research investigating the effects of instructional factors on critical thinking performance is necessary ( Tsui, 1998 , 2002 ).

Writing and Critical Thinking

Writing has been widely used as a tool for communicating ideas, but less is known about how writing can improve the thinking process itself ( Rivard, 1994 ; Klein, 2004 ). Writing is thought to be a vehicle for improving student learning ( Champagne and Kouba, 1999 ; Kelly and Chen, 1999 ; Keys, 1999 ; Hand and Prain, 2002 ), but too often is used as a means to regurgitate content knowledge and derive prescribed outcomes ( Keys, 1999 ; Keys et al., 1999 ). Historically, writing is thought to contribute to the development of critical thinking skills ( Kurfiss, and Association for the Study of Higher Education, 1988 ). Applebee (1984) suggested that writing improves thinking because it requires an individual to make his or her ideas explicit and to evaluate and choose among tools necessary for effective discourse. Resnick (1987) stressed that writing should provide an opportunity to think through arguments and that, if used in such a way, could serve as a “cultivator and an enabler of higher order thinking.” Marzano (1991) suggested that writing used as a means to restructure knowledge improves higher-order thinking. In this context, writing may provide opportunity for students to think through arguments and use higher-order thinking skills to respond to complex problems ( Marzano, 1991 ).

Writing has also been used as a strategy to improve conceptual learning. Initial work focused on how the recursive and reflective nature of the writing process contributes to student learning ( Applebee, 1984 ; Langer and Applebee, 1985 , 1987 ; Ackerman, 1993 ). However, conclusions from early writing to learn studies were limited by confounding research designs and mismatches between writing activities and measures of student learning ( Ackerman, 1993 ). Subsequent work has focused on how writing within disciplines helps students to learn content and how to think. Specifically, writing within disciplines is thought to require deeper analytical thinking ( Langer and Applebee, 1987 ), which is closely aligned with critical thinking.

The influence of writing on critical thinking is less defined in science. Researchers have repeatedly called for more empirical investigations of writing in science; however, few provide such evidence ( Rivard, 1994 ; Tsui, 1998 ; Daempfle, 2002 ; Klein, 2004 ). In his extensive review of writing research, Rivard (1994) indicated that gaps in writing research limit its inferential scope, particularly within the sciences. Specifically, Rivard and others indicate that, despite the volume of writing students are asked to produce during their education, they are not learning to use writing to improve their awareness of thinking processes ( Resnick, 1987 ; Howard, 1990 ). Existing studies are limited because writing has been used either in isolation or outside authentic classroom contexts. Factors like gender, ethnicity, and academic ability that are not directly associated with writing but may nonetheless influence its effectiveness have also not been sufficiently accounted for in previous work ( Rivard, 1994 ).

A more recent review by Daempfle (2002) similarly indicates the need for additional research to clarify relationships between writing and critical thinking in science. In his review, Daempfle identified nine empirical studies that generally support the hypothesis that students who experience writing (and other nontraditional teaching methods) have higher reasoning skills than students who experience traditional science instruction. Of the relatively few noninstructional variables identified in those studies, gender and major did not affect critical thinking performance; however, the amount of time spent on and the explicitness of instruction to teach reasoning skills did affect overall critical thinking performance. Furthermore, the use of writing and other nontraditional teaching methods did not appear to negatively affect content knowledge acquisition ( Daempfle, 2002 ). Daempfle justified his conclusions by systematically describing the methodological inconsistencies for each study. Specifically, incomplete sample descriptions, the use of instruments with insufficient validity and reliability, the absence of suitable comparison groups, and the lack of statistical covariate analyses limit the scope and generalizability of existing studies of writing and critical thinking ( Daempfle, 2002 ).

Writing in the Biological Sciences

The conceptual nature and reliance on the scientific method as a means of understanding make the field of biology a natural place to teach critical thinking through writing. Some work has been done in this area, with literature describing various approaches to writing in the biological sciences that include linked biology and English courses, writing across the biology curriculum, and the directed use of writing to improve reasoning in biology courses ( Ebert-May et al., 1997 ; Holyoak, 1998 ; Taylor and Sobota, 1998 ; Steglich, 2000 ; Lawson, 2001 ; Kokkala and Gessell, 2003 ; Tessier, 2006 ). In their work on integrated biology and English, Taylor and Sobota (1998) discussed several problem areas that affected both biology and English students, including anxiety and frustration associated with writing, difficulty expressing thoughts clearly and succinctly, and a tendency to have strong negative responses to writing critique. Although the authors delineate the usefulness of several composition strategies for writing in biology ( Taylor and Sobota, 1998 ), it was unclear whether student data were used to support their recommendations. Kokkala and Gessell (2003) used English students to evaluate articles written by biology students. Biology students first reflected on initial editorial comments made by English students, and then resubmitted their work for an improved grade. In turn, English students had to justify their editorial comments with written work of their own. Qualitative results generated from a list of reflective questions at the end of the writing experience seemed to indicate that both groups of students improved editorial skills and writing logic. However, no formal measures of student editorial skill were collected before biology-English student collaboration, so no definitive conclusions on the usefulness of this strategy could be made.

Taking a slightly different tack, Steglich (2000) informally assessed student attitudes in nonmajors biology courses, and noted that writing produced positive changes in student attitudes toward biology. However, the author acknowledged that this work was not a research study. Finally, Tessier (2006) showed that students enrolled in a nonmajors ecology course significantly improved writing technical skills and committed fewer errors of fact regarding environmental issues in response to a writing treatment. Attitudes toward environmental issues also improved ( Tessier, 2006 ). Although this study surveyed students at the beginning and the end of the academic term and also tracked student progress during the quarter, instrument validity and reliability were not provided. The generalizability of results was further limited because of an overreliance on student self-reports and small sample size.

Each of the studies described above peripherally supports a relationship between writing and critical thinking. Although not explicitly an investigation of critical thinking, results from a relatively recent study support a stronger connection between writing and reasoning ability ( Daempfle, 2002 ). Ebert-May et al. (1997) used a modified learning cycle instructional method and small group collaboration to increase reasoning ability in general education biology students. A quasi-experimental pretest/posttest control group design was used on a comparatively large sample of students, and considerable thought was given to controlling extraneous variables across the treatment and comparison groups. A multifaceted assessment strategy based on writing, standardized tests, and student interviews was used to quantitatively and qualitatively evaluate student content knowledge and thinking skill. Results indicated that students in the treatment group significantly outperformed control group students on reasoning and process skills as indicated by the National Association of Biology Teachers (NABT) content exam. Coincidentally, student content knowledge did not differ significantly between the treatment and control sections, indicating that development of thinking skill did not occur at the expense of content knowledge ( Ebert-May et al., 1997 ). Interview data indicated that students experiencing the writing and collaboration-based instruction changed how they perceived the construction of biological knowledge and how they applied their reasoning skills. Although the Ebert-May study is one of the more complete investigations of writing and critical thinking to date, several questions remain. Supporting validity and reliability data for the NABT test was not included in the study, making interpretation of results somewhat less certain. In addition, the NABT exam is designed to assess high school biology performance, not college performance ( Daempfle, 2002 ). Perhaps more importantly, the NABT exam does not explicitly measure critical thinking skills.

Collectively, it appears that additional research is necessary to establish a more defined relationship between writing and critical thinking in science ( Rivard, 1994 ; Tsui, 1998 , 2002 ; Daempfle, 2002 ). The current study addresses some of the gaps in previous work by evaluating the effects of writing on critical thinking performance using relatively large numbers of students, suitable comparison groups, valid and reliable instruments, a sizable cadre of covariables, and statistical analyses of covariance. This study uses an experimental design similar to that of the Ebert-May et al. (1997) study but incorporates valid and reliable test measures of critical thinking that can be used both within and across different science disciplines.

Purpose of the Study

Currently there is much national discussion about increasing the numbers of students majoring in various science fields ( National Research Council, 2003 ; National Academy of Sciences, 2005 ). Although this is a necessary and worthwhile goal, attention should also be focused on improving student performance in general education science because these students will far outnumber science majors for the foreseeable future. If college instructors want general education students to think critically about science, they will need to use teaching methods that improve student critical thinking performance. In many traditional general education biology courses, students are not expected to work collaboratively, to think about concepts as much as memorize facts, or to develop and support a written thesis or argument. This presents a large problem when one considers the societal role that general education students will play as voters, community members, and global citizens. By improving their critical thinking skills in science, general education students will be better able to deal with the broad scientific, economic, social, and political issues they will face in the future.

The study addressed the following research questions:

  • Does writing in laboratory affect critical thinking performance in general education biology?
  • Does the development of analysis, inference, and evaluation skills differ between students who experience writing and those who experience traditional laboratory instruction?
  • What measurable effect do factors like gender, ethnicity, and prior thinking skill have on changes in critical thinking in general education biology?
  • If critical thinking skills change during an academic quarter, when does that take place?

MATERIALS AND METHODS

Study Context

The study took place at a state-funded regional comprehensive university in the Pacific Northwest. All participants were nonmajor undergraduates who were taking biology to satisfy their general education science requirement. Ten total sections of general education biology offered over three academic quarters (one academic year) were included in the study. Four of the 10 sections implemented a writing component during weekly laboratory meetings (N = 158); six traditional quiz-based laboratory sections served as a nonwriting control group (N = 152). Only scores from students who had completed both the initial (pretest) and end-of-quarter (posttest) critical thinking assessments were included in the data analysis. A breakdown of participant demographics for the writing and nonwriting groups is provided in Table 1 .

Table 1. Demographic profile for the study sample; n values in parentheses.

a Other includes the “choose not to answer” response.
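
The inclusion rule described above, keeping only students who completed both the CCTST pretest and posttest, amounts to a simple filter over a score table. The sketch below is illustrative only; the column names and values are invented, not the study's data.

```python
import pandas as pd

# Hypothetical CCTST records; column names and values are invented.
scores = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "group":      ["writing", "writing", "nonwriting", "nonwriting"],
    "pretest":    [17, None, 15, 18],   # raw CCTST scores (0-34 scale)
    "posttest":   [19, 21, None, 17],
})

# Keep only students who completed both the pretest and the posttest,
# mirroring the inclusion criterion described above.
complete = scores.dropna(subset=["pretest", "posttest"]).copy()
complete["change"] = complete["posttest"] - complete["pretest"]

print(complete.groupby("group")["change"].describe())
```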

Each course section included a lecture component offered four times per week for 50 min and a laboratory component that met once a week for 2 h. Course lecture sections were limited to a maximum enrollment of 48 students, with two concurrent lab sections of 24 students. Two different instructors taught five writing sections and five other instructors taught 11 traditional sections over three consecutive quarters. Each course instructor materially participated in teaching laboratory with the help of one graduate assistant per lab section (two graduate students per course section). None of the instructors from treatment sections had implemented writing in the laboratory before the start of this study. Writing instructors were chosen on the basis of personal dissatisfaction with traditional laboratory teaching methods and willingness to try something new.

Strong efforts were made to establish equivalency between writing and nonwriting course sections a priori. Course elements that were highly similar included common lecture rooms, the use of similar (in most cases identical) textbooks, and a lab facility coordinated by a single faculty member. More specifically, three similarly appointed lecture rooms outfitted with contemporary instructional technology including dry erase boards, media cabinets, a networked computer, and digital projection were used to teach the nonmajors biology courses. The same nonmajors biology textbook was used across the writing and most of the nonwriting sections. All laboratory sections used a common lab facility and were taught on the same day of the week. Although the order in which specific labs were taught differed among sections, a common laboratory manual containing prescriptive exercises covering the main themes of biology (scientific method, cellular biology and genetics, natural selection and evolution, kingdoms of life, and a mammalian dissection) was used across all writing and nonwriting lab sections.

Primary course differences included the presence of a writing component in the laboratory and the amount of time devoted to laboratory activities. Those sections that experienced the writing treatment completed the prescriptive lab exercises in the first hour and engaged in writing during the second hour of the lab. Nonwriting sections allocated 2 h for the prescriptive lab exercises and included a traditional laboratory quiz rather than a writing assignment. The degree to which the writing and nonwriting sections included small group collaboration in laboratory varied, and all course sections differed with regard to individual instructor teaching style. Although all course sections used traditional lecture exams during the quarter to assess content knowledge, the degree to which rote memorization-based exam questions were used to evaluate student learning varied.

Description of the Writing Treatment

On the first day of lecture, students in the writing treatment group were told that their laboratory performance would be evaluated using collaborative essays instead of traditional quizzes. A brief overview of the writing assignments was included in associated course syllabi. During the first laboratory session of the quarter, students were grouped into teams of three or four individuals, and the criteria for completing weekly writing assignments were further explained.

The decision to use collaborative groups to support writing in the laboratory was partly based on existing literature ( Collier, 1980 ; Bruffee, 1984 ; Tobin et al., 1994 ; Jones and Carter, 1998 ; Springer et al., 1999 ) and prior research by Quitadamo, Brahler, and Crouch (unpublished results), who showed that Peer Led Team Learning (one form of collaborative learning) helped to measurably improve undergraduate critical thinking skills. Small group learning was also used in the nonwriting treatment groups to a greater or lesser extent depending on individual instructor preference.

Baseline critical thinking performance was established in the academic quarters preceding the writing experiment to more specifically attribute changes in critical thinking to the writing treatment. Concurrent nonwriting course sections were also used as comparison groups. The historical baseline provided a way to determine what student performance had been before experiencing the writing treatment, whereas the concurrent nonwriting groups allowed for a direct comparison of critical thinking performance during the writing treatment. Pretest scores indicating prior critical thinking skill were also used to further establish comparability between the writing and nonwriting groups.

Laboratory activities were coordinated for all sections by a single faculty member who taught in the nonwriting group. All faculty and graduate assistants met regularly to discuss course progress and laboratory procedure and to coordinate resources. Nonwriting faculty drafted quizzes that addressed laboratory content knowledge. Writing faculty collaboratively crafted a consensus essay, or thought question, designed to elicit student critical thinking and ability to apply content knowledge. Each thought question was designed so that students had to apply lecture concepts and build on their conceptual understanding by integrating actual laboratory experiences (see Supplemental Appendix 1, available online, for thought question examples). Weekly thought questions became progressively more difficult as the term progressed. Initial planning meetings took place just before the beginning of the academic quarter and included graduate assistant training to help them learn to consistently evaluate student writing using a modified thesis-based essay rubric (see Supplemental Appendix 2; Beers et al., 1994). A range of sample essays from poor to high quality was used to calibrate graduate assistant scoring and ensure consistency between assistants from different laboratory sections within the writing group. All graduate assistants and course instructors applied the thesis-based rubric to sample essays and worked toward consensus. Initial training ended when all graduate assistants scored within 0.5 points of each other on at least two sample essays.
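
The calibration stopping rule described above (training ended once every grader scored within 0.5 points of every other grader on at least two sample essays) can be checked with a few lines of code. This is a hypothetical sketch; the helper function and the scores are invented for illustration.

```python
def essays_in_agreement(scores_by_essay, tolerance=0.5):
    """Count sample essays on which all graders' rubric scores fall within `tolerance`."""
    return sum(
        1 for grader_scores in scores_by_essay
        if max(grader_scores) - min(grader_scores) <= tolerance
    )

# Rows are sample essays; values are rubric scores from each graduate assistant (invented).
sample_rounds = [
    [3.0, 3.5, 3.0],   # spread of 0.5 -> agreement
    [4.0, 4.5, 5.0],   # spread of 1.0 -> no agreement
    [2.5, 2.5, 3.0],   # spread of 0.5 -> agreement
]

# Training ends once at least two sample essays meet the agreement criterion.
print("Calibration complete:", essays_in_agreement(sample_rounds) >= 2)
```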

Students were given weekly thought questions before beginning laboratory to help them frame their efforts during laboratory exercises. Students completed the prescriptive lab activities during the first hour, and then each student group relocated to an assigned computer lab in the same building and worked around a common computer terminal to draft a collective response to the weekly thought question. Students were allowed to use any suitable information or materials (laboratory observations, laboratory manuals, lecture notes, textbooks, the Internet, etc.) to help them address their thought question. Internal group discussions allowed students to argue individual viewpoints as they worked toward group agreement on each thought question. Essay responses to thought questions were answered using a standard five-paragraph format. Each essay included an introduction with a group-generated thesis statement, two to three body paragraphs that provided sufficient detail to support the thesis statement, and a summary paragraph that concluded the essay. Students were not allowed to work on essays outside of the laboratory environment.

Initial essay drafts were composed in Microsoft Word and submitted to the graduate assistant by the end of the laboratory period using the campus e-mail system. Graduate assistants evaluated each group's essay (typically six per lab section) and assigned an initial grade based on the thesis-based essay rubric. Graduate assistants made comments and suggestions electronically using Microsoft Word revising and track changes tools. Evaluated essays were e-mailed back to each student group, which addressed comments and suggestions during the subsequent week's laboratory writing time. Each student group submitted a final draft that was re-evaluated and assigned a final grade. During the second week, students both revised their essay from the previous week and then generated an initial draft for the current week's thought question, all within the lab writing hour. This was done to help students become more proficient writers within a short period of time. Overall, students in the writing group completed eight essays that, along with lab book scores, constituted 25% of their overall course grade. An identical percentage was used to calculate traditional quiz and lab book scores in all nonwriting course sections.

At the end of the quarter, each writing group member completed a peer evaluation for all group members, including themselves (see Supplemental Appendix 3 ). This was done to help students reflect on and evaluate their own performance, maximize individual accountability within the group, and make sure students received credit proportional to their contributions. The average peer evaluation score for each student was included as 5% of the final course grade.

Collectively, this approach to writing and evaluation was used to 1) help students reflect on and discuss deficiencies in their collective and written work, 2) provide an opportunity for students to explicitly address deficiencies in thesis development and general writing skill, 3) provide a suitable reward for student efforts to revise their work relative to established performance benchmarks, 4) improve individual accountability within each group, and 5) help students develop more efficient and effective writing skills that collectively might lead to improved critical thinking skill.

Assessment of Critical Thinking

Using critical thinking to indicate student learning performance is particularly useful because it can be measured within and across disciplines. Various instruments are available to assess critical thinking ( Watson and Glaser, 1980 ; Ennis and Weir, 1985 ; Facione, 1990b ; Center for Critical Thinking and Moral Critique, 1996 ); however, only the CCTST measures cognitive and meta-cognitive skills associated with critical thinking, is based on a consensus definition of critical thinking, and has been evaluated for validity and reliability for measuring critical thinking at the college level ( Facione, 1990a ; Facione et al., 1992 , 2004 ). The CCTST measures cognitive skills of analysis, inference, evaluation, induction, and deduction, with results expressed as raw scores or national percentile equivalents based on a national norming sample of students from 4-yr colleges and universities. Construct validity for the CCTST is high as indicated by greater than 95% consensus of the Delphi panel experts on the component skills of critical thinking. Test reliability (calculated using the KR–20 internal consistency method) is 0.78–0.84 for the form used in this study, a value considered to be within the recommended range for tests that measure a wide range of critical thinking skills ( Facione, 1991 ). The CCTST norming sample for 4-yr colleges and universities is based on a stratified sample of 2000 students from various disciplines, with approximately 30% of the norming sample comprised of science and math students. Approximately 20,000 college students complete the CCTST each year ( Insight Assessment and Blohm, 2005 ).
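
The KR–20 statistic cited above is a standard internal-consistency index for dichotomously scored tests. The sketch below applies the textbook KR–20 formula to an invented item-response matrix; it does not use CCTST data, and the toy numbers exist only to show the calculation.

```python
import numpy as np

def kr20(item_matrix):
    """Kuder-Richardson Formula 20 for dichotomously scored (0/1) items.

    item_matrix: rows = examinees, columns = items.
    """
    items = np.asarray(item_matrix, dtype=float)
    k = items.shape[1]                         # number of items (34 on the CCTST)
    p = items.mean(axis=0)                     # proportion correct per item
    q = 1.0 - p
    total_var = items.sum(axis=1).var(ddof=1)  # variance of examinees' total scores
    return (k / (k - 1.0)) * (1.0 - (p * q).sum() / total_var)

# Toy data: 5 examinees x 6 items, invented purely for illustration.
toy = [[1, 1, 0, 1, 0, 1],
       [1, 0, 0, 1, 0, 0],
       [1, 1, 1, 1, 1, 1],
       [0, 0, 0, 1, 0, 0],
       [1, 1, 1, 1, 0, 1]]
print(round(kr20(toy), 2))
```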

The CCTST contains 34 questions and is a 45-min timed assessment of critical thinking. An online version of the CCTST was administered in this study, which allowed the researchers to collect student demographics data including gender, ethnicity, age, and several others at the same time critical thinking skill was measured. Total critical thinking skill as well as analysis, inference, and evaluation component critical thinking skills ( Facione, 1990c ) were determined for each CCTST administration and compared across the writing and nonwriting groups.

Research Design

A quasi-experimental pretest/posttest control group design was used for this study to determine whether critical thinking performance in the writing group differed significantly from the nonwriting group. This design was chosen in order to compare critical thinking performance between intact groups, and because it was not feasible to randomly assign students from one course section to another within the sample. Frequency distributions of pretest/posttest changes in total critical thinking skill and analysis, inference, and evaluation component critical thinking skills were constructed to provide some indication of sample randomness and to inform assumptions for subsequent statistical analyses of covariance (see Figure 1 , A–D).

The pretest/posttest control group design was also used in order to minimize internal validity threats that could potentially compete with the effects of the writing treatment on student critical thinking performance. This design is widely used in educational research, and generally controls for most threats to internal validity ( Campbell and Stanley, 1963 ). Internal threats that remain a concern include history, maturation, pretest sensitization, selection, and statistical regression toward the mean. In the current study, history and maturation threats were minimized to the extent that the CCTST pretest and posttest were administered only 9 wk apart, and class standing and age covariables that indicate maturation were included in the statistical analysis. Pretest sensitization and selection are larger concerns for this design. Pretest sensitization was minimized in several ways: 1) prior critical thinking skill indicated by the CCTST pretest was used as a covariable in statistical analyses, 2) pretest/posttest to posttest only comparison studies conducted by Insight Assessment indicate CCTST pretest sensitization is minimized ( Facione, 1990a ), and 3) neither the students, instructors, nor the test administrators have access to the correct answers on the CCTST, so repeat performance on the posttest is less likely. Selection threats were also reduced by using CCTST pretest scores in the statistical analyses, thereby making it more difficult to detect statistically significant differences in critical thinking performance between the writing and nonwriting groups. Statistical regression toward the mean, which was observed to some extent in this study, was minimized because this study used a valid and reliable instrument to assess critical thinking ( Facione, 1990a ). Regression threats were also minimized to the extent that students with higher initial scores regressed much less than students with lower initial scores.

The generalizability of study results is limited because all data were collected at a single university. Specific threats to external validity include selection-treatment interaction and treatment diffusion. These threats were minimized because writing was mandatory for all treatment group participants, thereby minimizing volunteer effects. Because the writing also took considerable student effort, it is less likely that treatment diffusion occurred. In summary, the pretest/posttest control group design was used to minimize internal and external validity threats and maximize the ability to determine the effects of writing on student critical thinking performance.

Study Variables and Data Analysis

Effect of Writing on Critical Thinking Performance.

General education biology students were divided into writing and nonwriting groups (independent variable). Changes in CCTST pretest/posttest scores (dependent variable) were determined to discover whether writing influenced student critical thinking performance. Two CCTST outcome measures were used to statistically test for writing effect: 1) raw scores for total critical thinking skill, and 2) raw scores for analysis, inference, and evaluation component skills. Results were reported using raw scores and corresponding national percentile rank so that critical thinking performance outcomes would be more meaningful and intuitive. Conversion of CCTST raw scores to national percentile ranking was done using SPSS (SPSS, Inc., Chicago, IL) statistical software and a linear estimation conversion script based on an equivalency scale from Insight Assessment (Millbrae, CA).
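
The raw-score-to-percentile conversion described above amounts to linear interpolation against an equivalency scale. The study performed this step in SPSS with Insight Assessment's scale; because that scale is not reproduced here, the anchor points below are invented placeholders, and the snippet is only a sketch of the linear estimation idea.

```python
import numpy as np

# Hypothetical equivalency scale (CCTST raw score -> national percentile).
# These anchor points are NOT the Insight Assessment values; they are invented.
raw_anchors        = np.array([5, 10, 15, 20, 25, 30, 34])
percentile_anchors = np.array([4, 18, 40, 62, 82, 94, 99])

def raw_to_percentile(raw_scores):
    """Linearly interpolate national percentile rank between scale anchor points."""
    return np.interp(raw_scores, raw_anchors, percentile_anchors)

print(raw_to_percentile([14, 17, 22]))   # e.g., a few students' raw scores
```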

Several covariables were included in the analysis to increase statistical accuracy and precision, and to more specifically isolate the effects of writing on critical thinking performance. CCTST pretest scores were used to indicate initial critical thinking skill. Gender and ethnicity helped to account for male/female or race-specific changes in critical thinking performance and were also used to identify potential sources of performance bias. Academic term and time of day were used to account for critical thinking differences due to the time of year each course was offered and the time of day each student took the course, respectively. Class standing and age were used to indicate maturation related to time in college and chronological age, respectively. Finally, the instructor covariable was used to account for performance differences due to individual teaching styles.

Statistical Analysis of Effect of Writing.

Several statistical analyses were conducted to determine the effects of writing on critical thinking performance in general education biology. An analysis of covariance (ANCOVA) test provided insight regarding differences in overall critical thinking performance between the writing and nonwriting groups. Changes in CCTST total raw scores and national percentile ranking were used as composite measures of critical thinking ( Facione, 1990c ) in this initial analysis. Second, changes in particular component critical thinking skills (analysis, inference, and evaluation) were evaluated using a multivariate analysis of covariance (MANCOVA) test because three dependent variables were involved. The ANCOVA and MANCOVA tests also provided some insight into the effect the covariables had on critical thinking performance in general education biology. Collectively, these statistical tests allowed for a more accurate and precise analysis because variance associated with the covariables could be more specifically isolated from the writing treatment. Mean, SE, and effect size were also compared between the writing and nonwriting groups. Effect size, represented in standard units, was used to compare the magnitude of writing effect in the study.
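
For readers who want to run this kind of analysis outside SPSS, the sketch below shows how an ANCOVA and a MANCOVA with the same factor and covariables could be specified in Python with statsmodels. The file name and column names are assumptions, not the study's; treat it as a template rather than the authors' code.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from statsmodels.multivariate.manova import MANOVA

# Assumed layout: one row per student, with invented column names for the
# change scores, the writing/nonwriting factor, and the covariables.
df = pd.read_csv("cctst_scores.csv")  # placeholder file name

# ANCOVA: change in total critical thinking by group, adjusting for covariables.
ancova = smf.ols(
    "change ~ C(group) + pretest + C(gender) + C(ethnicity) + C(standing)"
    " + age + C(term) + C(time_of_day) + C(instructor)",
    data=df,
).fit()
print(sm.stats.anova_lm(ancova, typ=2))

# MANCOVA: analysis, inference, and evaluation changes as joint dependent variables.
mancova = MANOVA.from_formula(
    "analysis_chg + inference_chg + evaluation_chg ~ C(group) + pretest"
    " + C(gender) + C(ethnicity) + C(standing) + age + C(term)"
    " + C(time_of_day) + C(instructor)",
    data=df,
)
print(mancova.mv_test())  # reports Wilks' lambda among other multivariate statistics
```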

Analysis of Thought Question Performance.

Performance on weekly thought questions was analyzed to discover specifically when and how much student critical thinking skills changed during the academic term. This analysis also provided context for CCTST critical thinking performance measures. Specifically, average scores from a representative sample of writing course sections (approximately 100 students) were used to compare initial essay drafts across the weeks of the term to discover when students began to show changes in their first attempt at each essay. Weekly performance on final revised essays was also compared to determine how student final submissions changed over time. Finally, the weekly difference between each initial essay and each final essay was compared to determine how much the revision process changed during the term. These calculations collectively helped to provide a profile of critical thinking performance over time.
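
A minimal sketch of the weekly profile calculation described above: average the initial and final essay scores for each week and take the week-by-week revision gain (final minus initial). The data frame uses invented scores and column names purely to show the bookkeeping.

```python
import pandas as pd

# Hypothetical rubric scores (as percentages): one row per group essay per week.
essays = pd.DataFrame({
    "week":    [1, 1, 2, 2, 7, 7],
    "section": ["A", "B", "A", "B", "A", "B"],
    "initial": [26.0, 29.8, 41.0, 44.5, 80.0, 82.4],
    "final":   [52.0, 56.2, 65.0, 70.1, 85.9, 87.3],
})

# Average initial and final essay scores per week across writing sections,
# plus the weekly revision gain, as described above.
profile = essays.groupby("week")[["initial", "final"]].mean()
profile["revision_gain"] = profile["final"] - profile["initial"]
print(profile.round(1))
```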

Participant Demographics

Student demographics provided in Table 1 indicated an overall distribution of approximately 49% freshmen, 31% sophomores, 11% juniors, and 9% seniors. Approximately 74% of the writing group students were freshmen and sophomores, whereas 82% of the nonwriting group was underclassmen. Overall, 61% of the sample was female and 39% male, with near identical gender distribution across the writing and nonwriting groups. The predominant ethnicity in the sample was Caucasian (>83%), with Asian American (5%), Latino/Hispanic (3%), African American (2%), and Native American (1%) students comprising the remainder of the sample. About 6% of the sample classified themselves as having some other ethnicity or chose not to identify their ethnic heritage.

Statistical Assumptions

Analysis of covariance and multivariate analysis of covariance tests were used to compare critical thinking performance between the writing and nonwriting groups. The evaluated assumptions for the ANCOVA and MANCOVA tests were homogeneity of slopes, homogeneity of covariances, and normality. An analysis evaluating the homogeneity of slopes assumption indicated that the relationship between the covariables and the critical thinking performance dependent variable did not differ significantly by the writing/nonwriting independent variable for the ANCOVA test, F(2, 307) = 1.642, p = 0.195, power = 0.346, partial η² = 0.011, or the MANCOVA test, F(6, 610) = 1.685, p = 0.122, power = 0.645, partial η² = 0.016. These results confirmed that both analyses of covariance met the homogeneity of slopes assumption. The homogeneity of covariance assumption was tested using Levene's and Box's tests. Levene's test results for the ANCOVA indicated that error variances were not equal across writing and nonwriting groups, F(1, 308) = 7.139, p = 0.008. Similarly, Box's test results indicated that covariance was not equal for the writing and nonwriting groups, F(6, 684,530) = 4.628, p < 0.001. These results indicated that the ANCOVA/MANCOVA tests did not meet the homogeneity of covariance assumption. To evaluate this assumption more fully, distributions of total and component critical thinking skill were constructed (see Figure 1, A–D). Because the writing and nonwriting groups were highly similar in size, which makes analysis of covariance reasonably robust to this violation, and because no post hoc tests were conducted, it was determined that the ANCOVA and MANCOVA tests were the best statistical measures to answer the research questions. Finally, the normality assumption was evaluated using the previously constructed frequency distributions for total change in critical thinking (Figure 1A) as well as change in analysis (Figure 1B), inference (Figure 1C), and evaluation (Figure 1D) critical thinking skills. Frequency distributions of total and component critical thinking dependent variables indicated that each approximated a standard normal curve.
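
The assumption checks reported above (homogeneity of slopes, Levene's test for equal error variances, and a normality check) could be reproduced along the following lines. This is a sketch under assumptions: the file and column names are invented, and a Shapiro-Wilk test stands in for the frequency-distribution inspection the study used.

```python
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

df = pd.read_csv("cctst_scores.csv")  # placeholder; column names are invented

# Homogeneity of slopes: the group x pretest interaction should be non-significant.
slopes = smf.ols("change ~ C(group) * pretest", data=df).fit()
print(slopes.summary().tables[1])

# Homogeneity of variance across the writing and nonwriting groups (Levene's test).
writing    = df.loc[df["group"] == "writing", "change"]
nonwriting = df.loc[df["group"] == "nonwriting", "change"]
print(stats.levene(writing, nonwriting))

# Normality of the change-score distribution (complements the frequency plots).
print(stats.shapiro(df["change"]))
```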


Figure 1. (A–D) Frequency distribution of change in critical thinking skills. Distribution of change in critical thinking skill for the experimental sample. Changes are indicated using raw scores from CCTST pre- and posttests for total critical thinking skill (A) as well as analysis (B), inference (C), and evaluation (D) component critical thinking skills.

Effect of Writing on Total Critical Thinking Performance

The ANCOVA test of total critical thinking performance showed that writing and nonwriting groups differed significantly, F(1, 300) = 19.357, p < 0.0001, power = 0.992, partial η² = 0.061 (see Table 2). The strength of the relationship between the writing/nonwriting groups and critical thinking performance was modest but significant, accounting for more than 6% of the variance in critical thinking performance.

Table 2. Analysis of covariance for the writing and nonwriting groups. Tested covariables included gender, ethnicity, class standing, age, prior critical thinking skill (CCTST pretest), academic term, time of day, and instructor.

a Significance tested at the 0.05 level.
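
Partial eta squared values like those in Table 2 can be recovered directly from an F statistic and its degrees of freedom using the standard identity partial η² = (F × df_effect) / (F × df_effect + df_error). The short check below uses the F values reported for the writing/nonwriting factor and the CCTST pretest covariable; the helper function is illustrative, not part of the study.

```python
def partial_eta_squared(f_value, df_effect, df_error):
    """Recover partial eta squared from an F statistic and its degrees of freedom."""
    return (f_value * df_effect) / (f_value * df_effect + df_error)

# Writing/nonwriting factor: F(1, 300) = 19.357 -> reported partial eta squared of 0.061.
print(round(partial_eta_squared(19.357, 1, 300), 3))   # 0.061

# CCTST pretest covariable: F(1, 300) = 19.713 -> reported partial eta squared of 0.062.
print(round(partial_eta_squared(19.713, 1, 300), 3))   # 0.062
```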

Descriptive statistics of total critical thinking performance in the writing and nonwriting groups were also calculated (see Table 3 ). The writing group showed an average CCTST raw score change of 1.18 compared with the nonwriting group, which showed an average raw score change of −0.51. These critical thinking raw scores equated to gains in national percentile rank of 7.47 (45th to 53rd percentile) for the writing group and −2.09 (42nd to 40th percentile) for the nonwriting group. Critical thinking improvement in the writing group was approximately nine times greater than the nonwriting group (see Figure 2 ).

Table 3. Comparison of writing and nonwriting group performance based on CCTST raw scores. CCTST raw score range was 0–34; n values in parentheses.


Figure 2. Effect of writing on total critical thinking national percentile rank. Comparison of total critical thinking national percentile gains between writing and nonwriting groups. Percentile ranking was computed using CCTST raw scores, an equivalency scale from Insight Assessment, and a linear conversion script in SPSS.

The ANCOVA test of total critical thinking skill indicated that gender, ethnicity, age, class standing, and academic term did not significantly affect critical thinking performance (see Table 2). Covariables that significantly affected total critical thinking performance included 1) CCTST pretest score, F(1, 300) = 19.713, p < 0.0001, power = 0.993, partial η² = 0.062, 2) instructor, F(1, 300) = 7.745, p < 0.006, power = 0.792, partial η² = 0.025, and 3) time of day, F(1, 300) = 6.291, p < 0.013, power = 0.705, partial η² = 0.021. The effect of prior critical thinking skill (CCTST pretest) was moderately strong, accounting for more than 6% of the variance in total critical thinking performance. The effects of instructor and time of day were smaller, accounting for 2.5 and 2%, respectively, of total critical thinking performance variance. Critical thinking improvement associated with CCTST pretest score was approximately 2.5 times greater than for instructor and nearly three times greater than for time of day.

Effect of Writing on Component Critical Thinking Performance

The MANCOVA test indicated that analysis, inference, and evaluation critical thinking skills differed significantly between the writing and nonwriting groups, Wilks' λ = 0.919, F(3, 296) = 8.746, p < 0.0001, power = 0.995, partial η² = 0.081 (see Table 4). The strength of the relationship between writing and component critical thinking performance was modest but significant, accounting for more than 8% of the variance in critical thinking performance.

Table 4. Multivariate analysis of covariance for the writing and nonwriting groups. Tested covariables included gender, ethnicity, class standing, age, prior critical thinking skill (CCTST pretest), academic term, time of day, and instructor.

Specifically, significant gains in analysis and inference skills were observed in the writing group but not the nonwriting group. No statistically significant gains in evaluation skill were observed in either group (see Table 5 ). National percentile rank equivalents for CCTST component raw scores indicated the writing group gained 10.51 percentile in analysis skill (42nd to 52nd percentile), 6.05 percentile in inference skill (45th to 52nd percentile), and 5.16 percentile in evaluation skill (46th to 52nd percentile). The nonwriting group showed a national percentile rank change of −4.43 percentile in analysis skill (47th to 42nd percentile), −2.23 percentile in inference skill (42nd to 40th percentile), and 1.37 percentile in evaluation (44th to 45th percentile; see Figure 3 ). Critical thinking performance for the writing group was 15 times greater for analysis and 8 times greater for inference skills than for the nonwriting group. Although neither the writing nor the nonwriting group showed significant gains in evaluation skill, the writing group showed more than 3 times greater improvement than did the nonwriting group.

Table 5. Comparison of writing and nonwriting group performance based on critical thinking component skill raw scores (CCTST subscales). Score range was 0–7 (analysis), 0–16 (inference), and 0–11 (evaluation).

Figure 3. Effect of writing on component critical thinking national percentile rank. Comparison of component critical thinking national percentile gains between writing and nonwriting groups. Percentile ranking was computed using CCTST raw scores, an equivalency scale from Insight Assessment, and a linear conversion script in SPSS.

The MANCOVA test of analysis, inference, and evaluation skills indicated that gender, ethnicity, age, class standing, academic term, and time of day did not significantly affect critical thinking performance. Critical thinking performance was affected by prior analysis, inference, and evaluation skill (CCTST component pretest scores) and instructor (see Table 4 ). Specifically, component pretest scores had a large effect on critical thinking, accounting for 38% (analysis), 32% (inference), and 39% (evaluation) of critical thinking performance variance. The effect of instructor was smaller, accounting for 4.4% of variation in critical thinking skill. The effect of prior component critical thinking skill was approximately 4.5 times greater than the effect of writing, and nearly 9 times greater than the effect of instructor.

Student Thought Question Performance

Critical thinking performance on student essays was evaluated by applying a thesis-based essay rubric (see Supplemental Appendix 2) to initial submissions and final revised essays. Average weekly performance during the academic term is shown in Figure 4. A comparison of initial essays indicated that students improved 53.3% from week 1 (average score of 27.9%) to week 7 (average score of 81.2%). A similar comparison of final essays showed that students improved 32.5% from week 1 (average score of 54.1%) to week 7 (average score of 86.6%). The largest change between initial and final essays occurred in week 1 (change of 26.2%) and decreased each week thereafter (24.8, 23.9, 18.8, 8, 7.8, and 5.4% for weeks 2 through 7, respectively). These results showed that students produced little evidence of critical thinking skill in their writing early in the term but improved dramatically on both initial and revised essay submissions by the end of the term.
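The reported improvements can be checked directly from the week 1 and week 7 rubric averages given above; a short Python snippet illustrating the arithmetic follows (values are those reported in the text).

```python
# Week 1 and week 7 average rubric scores reported in the text (percent)
initial = {"week1": 27.9, "week7": 81.2}   # initial essay submissions
revised = {"week1": 54.1, "week7": 86.6}   # final revised essays

print(round(initial["week7"] - initial["week1"], 1))  # 53.3, gain on initial essays
print(round(revised["week7"] - revised["week1"], 1))  # 32.5, gain on revised essays
print(round(revised["week1"] - initial["week1"], 1))  # 26.2, week 1 initial-to-final change
```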

Figure 4. Profile of change in critical thinking performance in the writing group. Comparison of student writing performance on weekly initial and revised essays. Essay scores were derived using a thesis-based critical thinking rubric (see Supplemental Appendix 2). Average essay scores were computed across writing sections.

DISCUSSION

The purpose of this study was to discover whether writing could measurably influence critical thinking performance in general education biology. Results indicated that students in the writing group significantly outperformed their nonwriting peers in both total critical thinking skill and the component critical thinking skills of analysis and inference. The writing and nonwriting groups were highly similar initially and began the academic term with comparable critical thinking ability (45th and 42nd national percentile for the writing and nonwriting groups, respectively). By the end of the term, writing students had improved their critical thinking skill to above the 52nd percentile, whereas nonwriting students had decreased to below the 40th percentile. In addition to writing, prior critical thinking skill and course instructor significantly affected critical thinking performance, with prior critical thinking skill having the largest effect on critical thinking gains of any variable tested. Further analysis of the writing group showed that the largest gains in critical thinking occurred during the first few weeks of the term, with more gradual improvement during the remainder of the term. A comparison of average critical thinking performance on initial and revised essays showed that thinking skills improvement was greater on initial essays (53%) than on final essays (33%). Collectively, the results of this study indicated that students who experienced writing in general education biology significantly improved their critical thinking skills.

The covariance analysis provided a partial means to separate the effects of writing, prior critical thinking skill, instructor, and multiple covariables from total and component critical thinking gains. The analysis of total critical thinking skill indicated that writing students raised their critical thinking skill from below the national average to above the national average within an academic quarter, whereas nonwriting students remained below the national average. This observation is important because it shows that students can develop critical thinking skills within a fairly short 9-wk period, and that writing can play a role in that process. A similar study showed that critical thinking skills improve over 15 wk (Quitadamo, Brahler, and Crouch, unpublished results); however, that study provided no insight into whether critical thinking skills could be changed over a shorter period of time, in a different academic setting, or in response to instructional variables such as writing.

Although critical thinking gains were influenced by writing, they did not appear to be affected by gender, ethnicity, class standing, or age. In fact, statistical results indicated that these variables collectively had a very small effect on critical thinking performance. Gender distribution was nearly identical across the writing and nonwriting groups and was predominantly female (nearly 62%). Ethnic distribution was also highly similar across the two groups, although the sample was largely Caucasian (>84%). Class standing varied somewhat more across the writing and nonwriting groups, with the sample largely composed of underclassmen (70%). Although nearly three-quarters of the sample was between 18 and 21 years of age, nearly 10% was over 21, with a fair number of older nontraditional students represented. It is possible that a more diverse sample would have produced different results, or it may be that the individuals participating in this study responded particularly well to writing. Although further investigation of these variables is necessary and important, it was beyond the scope of the current study.

The analysis of component skills provided greater insight into the particular critical thinking skills that students changed in response to writing. Specifically, writing students significantly improved their analysis and inference skills whereas nonwriting students did not. Writing students also improved their evaluation skills much more than nonwriting students, although not significantly. These results indicate that the process of writing helps students develop improved analytical and inference skills. Prior research indicates that the writing to learn strategy is effective because students must conceptually organize and structure their thoughts as well as their awareness of thinking processes ( Langer and Applebee, 1987 ; Ackerman, 1993 ; Holliday, 1994 ; Rivard, 1994 ). More specifically, as students begin to shape their thoughts at the point of construction and continually analyze, review, and clarify meaning through the processes of drafting and revision, they necessarily engage and apply analysis and inference skills ( Klein, 1999 ; Hand and Prain, 2002 ). In this study, the process of writing appears to have influenced critical thinking gains. It also seems likely that writing students experienced a greater cognitive demand than nonwriting students simply because the writing act required them to hypothesize, debate, and persuade ( Rivard, 1994 ; Hand and Prain, 2002 ) rather than memorize as was the case in nonwriting control courses.

Conversely, the lack of any significant change in analysis, inference, or evaluation skills in the nonwriting group indicated that the traditional lab instruction used in the general education biology control courses did not help students develop critical thinking skills. Based on the results of this study, it could be argued that traditional lab instruction actually prevents the development of critical thinking skills, which presents a rather large problem when one considers how frequently these traditional methods are used in general education biology courses. One also has to consider that the critical thinking gains seen in the writing group might have resulted from the relative absence of traditional lab instruction rather than from writing alone. Additional research will be necessary to gain further insight into this question. Either way, changes to the traditional model of lab instruction will be necessary if the goal is to enhance the critical thinking abilities of general education biology students.

The variable that had the largest impact on critical thinking performance gains was prior critical thinking skill. This phenomenon was previously observed by Quitadamo, Brahler, and Crouch (unpublished results) in a related study that investigated the effect of Peer Led Team Learning on critical thinking performance. That study focused on the critical thinking performance of undergraduate science and math majors at a major research university and found that, in addition to Peer Led Team Learning, prior critical thinking skill significantly influenced critical thinking performance (Quitadamo, Brahler, and Crouch, unpublished results). Specifically, students with the highest prior critical thinking skill showed the largest performance gains, whereas students with low initial skill were at a comparative disadvantage. The fact that prior critical thinking skill also had a large effect on critical thinking performance in this study increases the generalizability of the observation and underscores its importance. Simply put, students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills, not because they lack the cognitive hard-wiring to perform but because they lack the tools to build their knowledge. Is it reasonable or just to expect otherwise comparable students to perform at similar levels when only some of them have the keys for success? If we hope to improve the perception of science in this country, we need to educate people on how to think about important scientific issues, not simply argue a position based on one school of thought. By helping general education students develop critical thinking skills, it is hoped that they will be better able to think rationally about science.

The observation that students who come to general education biology with greater critical thinking skills leave with the largest skill gains has important implications for the K–12 school system as well. If a high proportion of students are coming to institutions of higher education lacking critical thinking skills, why are these skills not being explicitly taught in the K–12 system? Ideally, students would learn the foundational tenets of critical thinking at an earlier age, and be able to refine and hone these skills as they progress through the K–20 education system. The results of this study reinforce the idea that students should be explicitly taught critical thinking skills and be expected to practice them as early and often as possible.

Although its effect was smaller than that of writing or prior critical thinking skill, the instructor variable also played a significant role in student critical thinking performance, accounting for 2.5% of the total variance in critical thinking gains. Determining the particular qualities of each instructor that contributed to student critical thinking success, and further separating instructor and writing effects, will require additional research. Previous research indicates that teaching style positively influences certain aspects of student learning (Grasha, 1994; Hativa et al., 2001; Bain, 2004), but the qualities that specifically influence student critical thinking gains have not been sufficiently investigated. Additional research in this area is necessary.

Faculty considering whether to use writing in the laboratory may wonder how much time and energy it takes to implement, whether efforts to change will translate into improved student learning, and how these changes affect disciplinary content. From a practical perspective, implementing writing did not take more time and effort per se; rather, it required faculty to reconceptualize how they spent their instructional time. Instead of individually developing course materials, writing faculty collaborated to a greater extent than nonwriting faculty on course design and on assessments that required students to demonstrate their critical thinking skill. Interviews of faculty from the writing and nonwriting groups indicated that writing faculty felt the course was less work because they collaborated with colleagues and because students demonstrated improved thinking skill. Writing faculty generally became more comfortable with the new model after ∼2–3 wk, when students began to show observable changes in writing proficiency and critical thinking. Together, collaboration with colleagues and observed gains in critical thinking tended to create a positive feedback loop that helped sustain writing faculty efforts. In contrast, nonwriting faculty similarly wanted their students to think better but were convinced that traditional methods would be more effective, and so remained closed to change. There were some logistical challenges with writing, such as scheduling computer labs where students could draft and revise their weekly essay responses under instructor and teaching assistant supervision. Teaching assistants (and faculty) also needed to be trained to evaluate writing using a rubric. Finally, with regard to content coverage, no lecture or laboratory content was cut in order to implement writing, because writing and nonwriting students performed the same lab activities. Collectively, the benefits of using writing in the laboratory should encourage faculty who want their students to learn to think critically to give it a try.

Future Directions

This study showed that writing affects student critical thinking skill in a nonmajors biology course, but the results have generated more questions than answers. How does writing specifically produce gains in critical thinking performance? What factors influence students' prior critical thinking skill? How do instructors specifically influence student gains in critical thinking? Future studies that analyze student essays in more detail would provide greater insight into how writing influences critical thinking skill. Writing could also be used in other nonmajors science courses, such as chemistry, geology, or physics, to determine the transferability of this method. Additional studies that investigate student prior critical thinking skill and instructor variables are also necessary. These future studies would further contribute to the knowledge base in this area and address some of its identified limitations (Ebert-May et al., 1997; Daempfle, 2002). Results from these studies would also increase the generalizability of the results reported here.

CONCLUSIONS

Building on existing research and on the basis of several lines of evidence presented in this study, we conclude that writing positively influences critical thinking performance for general education biology students. Students with prior critical thinking skill may have a comparative advantage over other general education biology students who have not developed these same skills. To rectify that inequity, critical thinking skills should be explicitly taught early and used often throughout the K–20 academic process. Because particular instructors appear to improve student critical thinking more than others, students should be discerning in their choice of instructors if they want to improve their critical thinking skills. Whether writing will prove useful as a method to improve critical thinking skills in other general education science courses will likely depend on a host of factors, but it has potential. Further study of writing in general education science will be necessary to verify these results and to discover the breadth and depth of how writing affects critical thinking skill.

ACKNOWLEDGMENTS

We thank Drs. Holly Pinkart, Roberta Soltz, Phil Mattocks, and James Johnson and undergraduate researchers Matthew Brewer, Dayrk Flaugh, Adam Wallace, Colette Watson, Kelly Vincent, and Christine Weller for their valuable contributions to this study. The authors also acknowledge the generous financial support provided by the Central Washington University Office of the Provost and the Office of the Associate Vice President for Undergraduate Studies.

REFERENCES

  • Ackerman J. M. (1993). The promise of writing to learn. Writ. Commun. 10(3), 334–370.
  • American Association for the Advancement of Science (1989). Science for All Americans: A Project 2061 Report on Literacy Goals in Science, Mathematics, and Technology. Washington, DC.
  • Applebee A. N. (1984). Writing and reasoning. Rev. Educ. Res. 54(4), 577–596.
  • Association of American Colleges and Universities (2005). Liberal Education Outcomes: A Preliminary Report on Student Achievement in College. Washington, DC.
  • Bain K. (2004). What the Best College Teachers Do. Cambridge, MA: Harvard University Press.
  • Beers T., McIssac C., Henderson B., Gainen J. (1994). Writing: Thesis and Support Scoring Guide. Accessed 25 August 2006. http://www.insightassessment.com/pdf_files/RUB_WTHS.PDF
  • Bruffee K. A. (1984). Collaborative learning and the "conversation of mankind." Coll. Engl. 46(7), 635–653.
  • Business-Higher Education Forum and American Council on Education (2003). Building a Nation of Learners: The Need for Changes in Teaching and Learning To Meet Global Challenges. Washington, DC.
  • Bybee R. W., Fuchs B. (2006). Preparing the 21st century workforce: a new reform in science and technology education. J. Res. Sci. Teach. 43(4), 349–352.
  • Campbell D. T., Stanley J. C. (1963). Experimental and Quasi-experimental Designs for Research. Boston, MA: Houghton Mifflin.
  • Carnevale A. P., American Society for Training and Development (1990). Workplace Basics: The Essential Skills Employers Want. San Francisco, CA: Jossey-Bass.
  • Center for Critical Thinking and Moral Critique (1996). ICAT Critical Thinking Essay Test. Rohnert Park, CA: Sonoma State University.
  • Champagne A., Kouba V., Mintzes J., Wandersee J., Novak J. (1999). Written product as performance measures. In: Assessing Science Understanding: A Human Constructivist View. New York: Academic Press, 224–248.
  • Collier K. G. (1980). Peer-group learning in higher education: the development of higher order skills. Stud. High. Educ. 5(1), 55–61.
  • Daempfle P. A. (2002). Instructional Approaches for the Improvement of Reasoning in Introductory College Biology Courses: A Review of the Research. New York: U.S. Department of Education.
  • Ebert-May D., Brewer C., Allred S. (1997). Innovation in large lectures—teaching for active learning. Bioscience 47(9), 601–607.
  • Ennis R. H. (1985). A logical basis for measuring critical thinking skills. Educ. Leadership 43(2), 44–48.
  • Ennis R. H., Weir E. (1985). The Ennis-Weir Critical Thinking Essay Test. Pacific Grove, CA: Midwest Publications.
  • Facione P. A. (1990a). The California Critical Thinking Skills Test—College Level. Technical Report 1: Experimental Validation and Content Validity. Millbrae, CA: Insight Assessment.
  • Facione P. A. (1990b). The California Critical Thinking Skills Test—College Level. Technical Report 3: Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST. Millbrae, CA: Insight Assessment.
  • Facione P. A. (1990c). The California Critical Thinking Skills Test—College Level. Technical Report 4: Interpreting the CCTST, Group Norms, and Sub-Scores. Millbrae, CA: Insight Assessment.
  • Facione P. A. (1991). Using the California Critical Thinking Skills Test in Research, Evaluation, and Assessment. Millbrae, CA: Insight Assessment.
  • Facione P. A., American Philosophical Association (1990). Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Research Findings and Recommendations. Millbrae, CA: Insight Assessment.
  • Facione P. A., Facione N. C., Giancarlo C. A. (1992). Test Manual: The California Critical Thinking Disposition Inventory. Millbrae, CA: Insight Assessment.
  • Facione P. A., Facione N. C., Insight Assessment (2004). Test of Everyday Reasoning. Accessed 30 June 2006. http://www.insightassessment.com/test-ter.html
  • Giancarlo C. A., Facione P. A. (2001). A look across four years at the disposition toward critical thinking among undergraduate students. J. Gen. Educ. 50(1), 29–55.
  • Grasha A. F. (1994). A matter of style: the teacher as expert, formal authority, personal model, facilitator, and delegator. Coll. Teach. 42(4), 142–149.
  • Hand B., Prain V. (2002). Teachers implementing writing-to-learn strategies in junior secondary science: a case study. Sci. Educ. 86(6), 737–755.
  • Hativa N., Barak R., Simhi E. (2001). Exemplary university teachers: knowledge and beliefs regarding effective teaching dimensions and strategies. J. High. Educ. 72(6), 699–729.
  • Holliday W. G. (1994). The reading-science learning-writing connection: breakthroughs, barriers, and promises. J. Res. Sci. Teach. 31(9), 877–893.
  • Holmes J., Clizbe E. (1997). Facing the 21st century. Bus. Educ. Forum 52(1), 33–35.
  • Holyoak A. R. (1998). A plan for writing throughout (not just across) the biology curriculum. Am. Biol. Teach. 60(3), 186–190.
  • Howard V. A. (1990). Thinking on paper: a philosopher's look at writing. In: Howard V. A. (ed.), Varieties of Thinking: Essays from Harvard's Philosophy of Education Research Center. New York: Routledge, 84–92.
  • Insight Assessment, Blohm S. (2005). Annual number of users for the CCTST Form 2000. Accessed 8 December 2006.
  • Jones E. A., Hoffman S., Moore L. M., Ratcliff G., Tibbets S., Click B. (1995). National Assessment of College Student Learning: Identifying College Graduates' Essential Skills in Writing, Speech and Listening, and Critical Thinking. Final Project Report, Report no. NCES-95-001. University Park, PA: U.S. Department of Education, Office of Educational Research and Improvement.
  • Jones G. M., Carter G. (1998). Small groups and shared constructions. In: Mintzes J. J., Wandersee J. H., Novak J. D. (eds.), Teaching Science for Understanding: A Human Constructivist View. San Diego, CA: Academic Press, 261–279.
  • Kelly G. J., Chen C. (1999). The sound of music: constructing science as sociocultural practices through oral and written discourse. J. Res. Sci. Teach. 36(8), 883–915.
  • Keys C. W. (1999). Revitalizing instruction in scientific genres: connecting knowledge production with writing to learn in science. Sci. Educ. 83(2), 115–130.
  • Keys C. W., Hand B., Prain V., Collins S. (1999). Using the science writing heuristic as a tool for learning from laboratory investigations in secondary science. J. Res. Sci. Teach. 36(10), 1065–1084.
  • Klein P. (1999). Reopening inquiry into cognitive processes in writing-to-learn. Educ. Psychol. Rev. 11(3), 203–270.
  • Klein P. D. (2004). Constructing scientific explanations through writing. Instr. Sci. 32(3), 191–231.
  • Klymkowsky M. W. (2006). Can nonmajors courses lead to biological literacy? Do majors courses do any better? Cell Biol. Educ. 4, 42–44.
  • Kokkala I., Gessell D. A. (2003). Writing science effectively: biology and English students in an author-editor relationship. J. Coll. Sci. Teach. 32(4), 252–257.
  • Kurfiss J. G., Association for the Study of Higher Education (1988). Critical Thinking: Theory, Research, Practice, and Possibilities. Washington, DC: George Washington University.
  • Langer J. A., Applebee A. N. (1985). Learning to write: learning to think. Educ. Horizons 64(1), 36–38.
  • Langer J. A., Applebee A. N. (1987). How Writing Shapes Thinking: A Study of Teaching and Learning. NCTE Research Report no. 22. Urbana, IL: National Council of Teachers of English.
  • Lawson A. E. (2001). Using the learning cycle to teach biology concepts and reasoning patterns. J. Biol. Educ. 35(4), 165–169.
  • Malcom S. M., Abdallah J., Chubin D. E., Grogan K. (2005). A System of Solutions: Every School, Every Student. Washington, DC: American Association for the Advancement of Science.
  • Marzano R. J. (1991). Fostering thinking across the curriculum through knowledge restructuring. J. Reading 34(7), 518–525.
  • National Academy of Sciences, National Academy of Engineering, Institute of Medicine (2005). Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future. Washington, DC: Committee on Prospering in the Global Economy of the 21st Century.
  • National Research Council (1995). National Science Education Standards. Washington, DC: National Academy Press.
  • National Research Council (2003). Bio 2010: Transforming Undergraduate Education for Future Research Biologists. Washington, DC: Committee on Undergraduate Biology Education to Prepare Research Scientists for the 21st Century.
  • National Science Foundation (1996). Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering, and Technology. Washington, DC: Directorate for Education and Human Resources.
  • Office of Educational Research and Improvement (1991). Striving for Excellence: The National Education Goals. Washington, DC.
  • Project Kaleidoscope (2006). Transforming America's Scientific and Technological Infrastructure: Recommendations for Urgent Action. Washington, DC: National Science Foundation.
  • Resnick L. B. (1987). Education and Learning To Think. Washington, DC: National Academy Press.
  • Rivard L. P. (1994). A review of writing to learn in science: implications for practice and research. J. Res. Sci. Teach. 31(9), 969–983.
  • Springer L., Donovan S. S., Stanne M. E. (1999). Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: a meta-analysis. Rev. Educ. Res. 69(1), 21–51.
  • Steglich C. S. (2000). A writing assignment that changes attitudes in biology classes. Am. Biol. Teach. 62(2), 98–101.
  • Taylor K. L., Sobota S. J. (1998). Writing in biology: an integration of disciplines. Am. Biol. Teach. 60(5), 350–353.
  • Tessier J. (2006). Writing assignment in a nonmajor introductory ecology class. J. Coll. Sci. Teach. 35(4), 25–29.
  • Tobin K. G., Tippins D. J., Gallard A. J. (1994). Research on instructional strategies for teaching science. In: Gabel D. L. (ed.), Handbook of Research on Science Teaching and Learning. New York: Macmillan, 45–93.
  • Tsui L. (1998). A review of research on critical thinking. ASHE Annual Meeting Paper, Miami, FL, 5–8 November 1998.
  • Tsui L. (2002). Fostering critical thinking through effective pedagogy: evidence from four institutional case studies. J. High. Educ. 73(6), 740–763.
  • U.S. Department of Education (1990). National Goals for Education. Washington, DC.
  • Watson G., Glaser E. M. (1980). Watson-Glaser Critical Thinking Appraisal. Cleveland, OH: The Psychological Corporation (Harcourt Brace Jovanovich).

Submitted: 27 November 2006 Revised: 16 February 2007 Accepted: 19 February 2007

© 2007 by The American Society for Cell Biology

Learning to Improve: Using Writing to Increase Critical Thinking Performance in General Education Biology

Ian J. Quitadamo

Department of Biological Sciences, Central Washington University, Ellensburg, WA 98926-7537

Martha J. Kurtz

Department of Chemistry, Central Washington University, Ellensburg, WA 98926-7539

ABSTRACT

Increasingly, national stakeholders express concern that U.S. college graduates cannot adequately solve problems and think critically. As a set of cognitive abilities, critical thinking skills provide students with tangible academic, personal, and professional benefits that may ultimately address these concerns. As an instructional method, writing has long been perceived as a way to improve critical thinking. In the current study, the researchers compared the critical thinking performance of students who experienced a laboratory writing treatment with that of students who experienced traditional quiz-based laboratory instruction in a general education biology course. The effects of writing were determined within the context of multiple covariables. Results indicated that the writing group significantly improved critical thinking skills whereas the nonwriting group did not. Specifically, analysis and inference skills increased significantly in the writing group but not in the nonwriting group. Writing students also showed greater gains in evaluation skills; however, these gains were not significant. In addition to writing, prior critical thinking skill and instructor significantly affected critical thinking performance, whereas other covariables such as gender, ethnicity, and age did not. With improved critical thinking skill, general education biology students will be better prepared to solve problems as engaged and productive citizens.

INTRODUCTION

A National Call To Improve Critical Thinking in Science

In the past several years, an increasing number of national reports indicate a growing concern over the effectiveness of higher education teaching practices and the decreased science (and math) performance of U.S. students relative to other industrialized countries ( Project Kaleidoscope, 2006 ). A variety of national stakeholders, including business and educational leaders, politicians, parents, and public agencies, have called for long-term transformation of the K–20 educational system to produce graduates who are well trained in science, can engage intelligently in global issues that require local action, and in general are better able to solve problems and think critically. Specifically, business leaders are calling for graduates who possess advanced analysis and communication skills, for instructional methods that improve lifelong learning, and ultimately for an educational system that builds a nation of innovative and effective thinkers ( Business-Higher Education Forum and American Council on Education, 2003 ). Education leaders are similarly calling for institutions of higher education to produce graduates who think critically, communicate effectively, and who employ lifelong learning skills to address important scientific and civic issues ( Association of American Colleges and Universities, [AACU] 2005 ).

Many college faculty consider critical thinking to be one of the most important indicators of student learning quality. In its 2005 national report, the AACU indicated that 93% of higher education faculty perceived analytical and critical thinking to be an essential learning outcome (AACU, 2005), whereas 87% of undergraduate students indicated that college experiences contributed to their ability to think analytically and creatively. This same AACU report showed that only 6% of undergraduate seniors demonstrated critical thinking proficiency based on Educational Testing Service standardized assessments from 2003 to 2004. During the same time frame, data from the ACT Collegiate Assessment of Academic Proficiency test showed a similar trend, with undergraduates improving their critical thinking less than 1 SD from freshman to senior year. Thus, it appears a discrepancy exists between faculty expectations of critical thinking and students' ability to perceive and demonstrate critical thinking proficiency using standardized assessments (AACU, 2005).

Teaching that supports the development of critical thinking skills has become a cornerstone of nearly every major educational objective since the Department of Education released its six goals for the nation's schools in 1990. In particular, goal three of the National Goals for Education stated that more students should be able to reason, solve problems, and apply knowledge. Goal six specifically stated that college graduates must be able to think critically (Office of Educational Research and Improvement, 1991). Since 1990, American education has tried—with some success—to make a fundamental shift from traditional teacher-focused instruction to more student-centered constructivist learning that encourages discovery, reflection, and, in general, is thought to improve student critical thinking skill. National science organizations have supported this trend with recommendations to improve the advanced thinking skills that support scientific literacy (American Association for the Advancement of Science, 1989; National Research Council, 1995; National Science Foundation, 1996).

More recent reports describe the need for improved biological literacy as well as international competitiveness ( Bybee and Fuchs, 2006 ; Klymkowsky, 2006 ). Despite the collective call for enhanced problem solving and critical thinking, educators, researchers, and policymakers are discovering a lack of evidence in existing literature for methods that measurably improve critical thinking skills ( Tsui, 1998 , 2002 ). As more reports call for improved K–20 student performance, it is essential that research-supported teaching and learning practices be used to better help students develop the cognitive skills that underlie effective science learning ( Malcom et al., 2005 ; Bybee and Fuchs, 2006 ).

Critical Thinking

Although they are not always apparent to many college students, the academic and personal benefits of critical thinking are well established; students who can think critically tend to get better grades, are often better able to use reasoning in daily decisions (U.S. Department of Education, 1990), and are generally more employable (Carnevale and American Society for Training and Development, 1990; Holmes and Clizbe, 1997; National Academy of Sciences, 2005). By focusing on instructional efforts that develop critical thinking skills, it may be possible to increase student performance while satisfying national stakeholder calls for educational improvement and an increased ability to solve problems as engaged and productive citizens.

Although academics and business professionals consider critical thinking skill to be a crucial outcome of higher education, many would have difficulty defining exactly what critical thinking is. Historically, there has been little agreement on how to conceptualize critical thinking. Of the literally dozens of definitions that exist, one of the most organized efforts to define (and measure) critical thinking emerged from research done by Peter Facione and others in the early 1990s. Their consensus work, referred to as the Delphi report, was accomplished by a group of 46 leading theorists, teachers, and critical thinking assessment specialists from a variety of academic and business disciplines ( Facione and American Philosophical Association, 1990 ). Initial results from the Delphi report were later confirmed in a national survey and replication study ( Jones et al., 1995 ). In short, the Delphi panel expert consensus describes critical thinking as a “process of purposeful self-regulatory judgment that drives problem-solving and decision-making” ( Facione and American Philosophical Association, 1990 ). This definition implies that critical thinking is an intentional, self-regulated process that provides a mechanism for solving problems and making decisions based on reasoning and logic, which is particularly useful when dealing with issues of national and global significance.

The Delphi conceptualization of critical thinking encompasses several cognitive skills that include: 1) analysis (the ability to break a concept or idea into component pieces in order to understand its structure and inherent relationships), 2) inference (the skills used to arrive at a conclusion by reconciling what is known with what is unknown), and 3) evaluation (the ability to weigh and consider evidence and make reasoned judgments within a given context). Other critical thinking skills that are similarly relevant to science include interpretation, explanation, and self-regulation ( Facione and American Philosophical Association, 1990 ). The concept of critical thinking includes behavioral tendencies or dispositions as well as cognitive skills ( Ennis, 1985 ); these include the tendency to seek truth, to be open-minded, to be analytical, to be orderly and systematic, and to be inquisitive ( Facione and American Philosophical Association, 1990 ). These behavioral tendencies also align closely with behaviors considered to be important in science. Thus, an increased focus on teaching critical thinking may directly benefit students who are engaged in science.

Prior research on critical thinking indicates that students' behavioral dispositions do not change in the short term ( Giancarlo and Facione, 2001 ), but cognitive skills can be developed over a relatively short period of time (Quitadamo, Brahler, and Crouch, unpublished results). In their longitudinal study of behavioral disposition toward critical thinking, Giancarlo and Facione (2001) discovered that undergraduate critical thinking disposition changed significantly after two years. Specifically, significant changes in student tendency to seek truth and confidence in thinking critically occurred during the junior and senior years. Also, females tended to be more open-minded and have more mature judgment than males ( Giancarlo and Facione, 2001 ). Although additional studies are necessary to confirm results from the Giancarlo study, existing research seems to indicate that changes in undergraduate critical thinking disposition are measured in years, not weeks.

In contrast to behavioral disposition, prior research indicates that critical thinking skills can be measurably changed in weeks. In their study of undergraduate critical thinking skill in university science and math courses, Quitadamo, Brahler, and Crouch (unpublished results) showed that critical thinking skills changed within 15 wk in response to Peer Led Team Learning (a national best practice for small group learning). This preliminary study provided some evidence that undergraduate critical thinking skills could be measurably improved within an academic semester, but provided no information about whether critical thinking skills could be changed during a shorter academic quarter. It was also unclear whether the development of critical thinking skills was a function of chronological time or whether it was related to instructional time.

Numerous studies provide anecdotal evidence for pedagogies that improve critical thinking, but much of existing research relies on student self-report, which limits the scope of interpretation. From the literature it is clear that, although critical thinking skills are some of the most valued outcomes of a quality education, additional research investigating the effects of instructional factors on critical thinking performance is necessary ( Tsui, 1998 , 2002 ).

Writing and Critical Thinking

Writing has been widely used as a tool for communicating ideas, but less is known about how writing can improve the thinking process itself (Rivard, 1994; Klein, 2004). Writing is thought to be a vehicle for improving student learning (Champagne and Kouba, 1999; Kelly and Chen, 1999; Keys, 1999; Hand and Prain, 2002), but too often it is used as a means to regurgitate content knowledge and derive prescribed outcomes (Keys, 1999; Keys et al., 1999). Historically, writing is thought to contribute to the development of critical thinking skills (Kurfiss and Association for the Study of Higher Education, 1988). Applebee (1984) suggested that writing improves thinking because it requires an individual to make his or her ideas explicit and to evaluate and choose among the tools necessary for effective discourse. Resnick (1987) stressed that writing should provide an opportunity to think through arguments and that, if used in such a way, it could serve as a "cultivator and an enabler of higher order thinking." Marzano (1991) suggested that writing used as a means to restructure knowledge improves higher-order thinking. In this context, writing may provide an opportunity for students to think through arguments and use higher-order thinking skills to respond to complex problems (Marzano, 1991).

Writing has also been used as a strategy to improve conceptual learning. Initial work focused on how the recursive and reflective nature of the writing process contributes to student learning (Applebee, 1984; Langer and Applebee, 1985, 1987; Ackerman, 1993). However, conclusions from early writing-to-learn studies were limited by confounding research designs and mismatches between writing activities and measures of student learning (Ackerman, 1993). Subsequent work has focused on how writing within disciplines helps students both to learn content and to learn how to think. Specifically, writing within disciplines is thought to require deeper analytical thinking (Langer and Applebee, 1987), which is closely aligned with critical thinking.

The influence of writing on critical thinking is less well defined in science. Researchers have repeatedly called for more empirical investigations of writing in science; however, few studies provide such evidence (Rivard, 1994; Tsui, 1998; Daempfle, 2002; Klein, 2004). In his extensive review of writing research, Rivard (1994) indicated that gaps in writing research limit its inferential scope, particularly within the sciences. Specifically, Rivard and others indicate that, despite the volume of writing students are asked to produce during their education, they are not learning to use writing to improve their awareness of thinking processes (Resnick, 1987; Howard, 1990). Existing studies are limited because writing has been used either in isolation or outside authentic classroom contexts. Factors like gender, ethnicity, and academic ability that are not directly associated with writing but may nonetheless influence its effectiveness have also not been sufficiently accounted for in previous work (Rivard, 1994).

A more recent review by Daempfle (2002) similarly indicates the need for additional research to clarify relationships between writing and critical thinking in science. In his review, Daempfle identified nine empirical studies that generally support the hypothesis that students who experience writing (and other nontraditional teaching methods) have higher reasoning skills than students who experience traditional science instruction. Of the relatively few noninstructional variables identified in those studies, gender and major did not affect critical thinking performance; however, the amount of time spent on and the explicitness of instruction to teach reasoning skills did affect overall critical thinking performance. Furthermore, the use of writing and other nontraditional teaching methods did not appear to negatively affect content knowledge acquisition ( Daempfle, 2002 ). Daempfle justified his conclusions by systematically describing the methodological inconsistencies for each study. Specifically, incomplete sample descriptions, the use of instruments with insufficient validity and reliability, the absence of suitable comparison groups, and the lack of statistical covariate analyses limit the scope and generalizability of existing studies of writing and critical thinking ( Daempfle, 2002 ).

Writing in the Biological Sciences

The conceptual nature of biology and its reliance on the scientific method as a means of understanding make the field a natural place to teach critical thinking through writing. Some work has been done in this area, with literature describing approaches to writing in the biological sciences that range from linked biology and English courses to writing across the biology curriculum and the directed use of writing to improve reasoning in biology courses (Ebert-May et al., 1997; Holyoak, 1998; Taylor and Sobota, 1998; Steglich, 2000; Lawson, 2001; Kokkala and Gessell, 2003; Tessier, 2006). In their work on integrated biology and English, Taylor and Sobota (1998) discussed several problem areas that affected both biology and English students, including anxiety and frustration associated with writing, difficulty expressing thoughts clearly and succinctly, and a tendency to have strong negative responses to writing critique. Although the authors delineated the usefulness of several composition strategies for writing in biology (Taylor and Sobota, 1998), it was unclear whether student data were used to support their recommendations. Kokkala and Gessell (2003) used English students to evaluate articles written by biology students. Biology students first reflected on initial editorial comments made by English students, and then resubmitted their work for an improved grade. In turn, English students had to justify their editorial comments with written work of their own. Qualitative results generated from a list of reflective questions at the end of the writing experience seemed to indicate that both groups of students improved their editorial skills and writing logic. However, no formal measures of student editorial skill were collected before the biology-English student collaboration, so no definitive conclusions about the usefulness of this strategy could be drawn.

Taking a slightly different tack, Steglich (2000) informally assessed student attitudes in nonmajors biology courses and noted that writing produced positive changes in student attitudes toward biology. However, the author acknowledged that this work was not a research study. Finally, Tessier (2006) showed that students enrolled in a nonmajors ecology course significantly improved their technical writing skills and committed fewer errors of fact regarding environmental issues in response to a writing treatment. Attitudes toward environmental issues also improved (Tessier, 2006). Although this study surveyed students at the beginning and the end of the academic term and also tracked student progress during the quarter, instrument validity and reliability were not provided. The generalizability of the results was further limited by an overreliance on student self-reports and a small sample size.

Each of the studies described above peripherally supports a relationship between writing and critical thinking. Although not explicitly an investigation of critical thinking, one relatively recent study provides results that support a stronger connection between writing and reasoning ability (Daempfle, 2002). Ebert-May et al. (1997) used a modified learning cycle instructional method and small group collaboration to increase reasoning ability in general education biology students. A quasi-experimental pretest/posttest control group design was used on a comparatively large sample of students, and considerable thought was given to controlling extraneous variables across the treatment and comparison groups. A multifaceted assessment strategy based on writing, standardized tests, and student interviews was used to quantitatively and qualitatively evaluate student content knowledge and thinking skill. Results indicated that students in the treatment group significantly outperformed control group students on reasoning and process skills as measured by the National Association of Biology Teachers (NABT) content exam. Notably, student content knowledge did not differ significantly between the treatment and control sections, indicating that development of thinking skill did not occur at the expense of content knowledge (Ebert-May et al., 1997). Interview data indicated that students experiencing the writing and collaboration-based instruction changed how they perceived the construction of biological knowledge and how they applied their reasoning skills. Although the Ebert-May study is one of the more complete investigations of writing and critical thinking to date, several questions remain. Supporting validity and reliability data for the NABT test were not included in the study, making interpretation of results somewhat less certain. In addition, the NABT exam is designed to assess high school biology performance, not college performance (Daempfle, 2002). Perhaps more importantly, the NABT exam does not explicitly measure critical thinking skills.

Collectively, it appears that additional research is necessary to establish a more defined relationship between writing and critical thinking in science ( Rivard, 1994 ; Tsui, 1998 , 2002 ; Daempfle, 2002 ). The current study addresses some of the gaps in previous work by evaluating the effects of writing on critical thinking performance using relatively large numbers of students, suitable comparison groups, valid and reliable instruments, a sizable cadre of covariables, and statistical analyses of covariance. This study uses an experimental design similar to that of the Ebert-May et al. (1997) study but incorporates valid and reliable test measures of critical thinking that can be used both within and across different science disciplines.

Purpose of the Study

Currently there is much national discussion about increasing the numbers of students majoring in various science fields ( National Research Council, 2003 ; National Academy of Sciences, 2005 ). Although this is a necessary and worthwhile goal, attention should also be focused on improving student performance in general education science because these students will far outnumber science majors for the foreseeable future. If college instructors want general education students to think critically about science, they will need to use teaching methods that improve student critical thinking performance. In many traditional general education biology courses, students are not expected to work collaboratively, to think about concepts as much as memorize facts, or to develop and support a written thesis or argument. This presents a large problem when one considers the societal role that general education students will play as voters, community members, and global citizens. By improving their critical thinking skills in science, general education students will be better able to deal with the broad scientific, economic, social, and political issues they will face in the future.

The problem addressed by this study was whether writing could improve student critical thinking performance in general education biology courses. How might writing in general education biology affect the analysis, inference, and evaluation skills that are inherent to critical thinking? What level of critical thinking skill do students bring to nonmajors biology courses? Can their critical thinking skills be measurably improved using writing? What other factors affect the development of critical thinking skills? When do student critical thinking skills begin to change, and by how much? In this study, the effect of writing on critical thinking performance was investigated using the California Critical Thinking Skills Test (CCTST) at the beginning (pretest) and end (posttest) of 10 sections of general education biology at a regional comprehensive university in the Pacific Northwest. Several research questions framed this investigation:

Does writing in laboratory affect critical thinking performance in general education biology? Does the development of analysis, inference, and evaluation skills differ between students who experience writing versus those who experience traditional laboratory instruction? What measurable effect do factors like gender, ethnicity, and prior thinking skill have on changes in critical thinking in general education biology? If critical thinking skills change during an academic quarter, when does that take place?

MATERIALS AND METHODS

Study Context.

The study took place at a state-funded regional comprehensive university in the Pacific Northwest. All participants were nonmajor undergraduates who were taking biology to satisfy their general education science requirement. Ten total sections of general education biology offered over three academic quarters (one academic year) were included in the study. Four of the 10 sections implemented a writing component during weekly laboratory meetings (N = 158); six traditional quiz-based laboratory sections served as a nonwriting control group (N = 152). Only scores from students who had completed both the initial (pretest) and end-of-quarter (posttest) critical thinking assessments were included in the data analysis. A breakdown of participant demographics for the writing and nonwriting groups is provided in Table 1 .

Table 1. Demographics for the writing and nonwriting groups

Demographics profile for the study sample. n values in parentheses.

a "Other" includes the "choose not to answer" response.

Each course section included a lecture component offered four times per week for 50 min and a laboratory component that met once a week for 2 h. Course lecture sections were limited to a maximum enrollment of 48 students, with two concurrent lab sections of 24 students. Two different instructors taught five writing sections and five other instructors taught 11 traditional sections over three consecutive quarters. Each course instructor materially participated in teaching laboratory with the help of one graduate assistant per lab section (two graduate students per course section). None of the instructors from treatment sections had implemented writing in the laboratory before the start of this study. Writing instructors were chosen on the basis of personal dissatisfaction with traditional laboratory teaching methods and willingness to try something new.

Strong efforts were made to establish equivalency between writing and nonwriting course sections a priori. Course elements that were highly similar included common lecture rooms, the use of similar (in most cases identical) textbooks, and a lab facility coordinated by a single faculty member. More specifically, three similarly appointed lecture rooms outfitted with contemporary instructional technology including dry erase boards, media cabinets, a networked computer, and digital projection were used to teach the nonmajors biology courses. The same nonmajors biology textbook was used across the writing and most of the nonwriting sections. All laboratory sections used a common lab facility and were taught on the same day of the week. Although the order in which specific labs were taught differed among sections, a common laboratory manual containing prescriptive exercises covering the main themes of biology (scientific method, cellular biology and genetics, natural selection and evolution, kingdoms of life, and a mammalian dissection) was used across all writing and nonwriting lab sections.

The primary differences between course sections were the inclusion of a writing component in the laboratory and the amount of time devoted to laboratory activities. Sections that experienced the writing treatment completed the prescriptive lab exercises in the first hour and engaged in writing during the second hour of the lab. Nonwriting sections allocated 2 h for the prescriptive lab exercises and included a traditional laboratory quiz rather than a writing assignment. The degree to which the writing and nonwriting sections included small group collaboration in laboratory varied, and all course sections differed with regard to individual instructor teaching style. Although all course sections used traditional lecture exams during the quarter to assess content knowledge, the degree to which rote memorization-based exam questions were used to evaluate student learning varied.

Description of the Writing Treatment

On the first day of lecture, students in the writing treatment group were told that their laboratory performance would be evaluated using collaborative essays instead of traditional quizzes. A brief overview of the writing assignments was included in associated course syllabi. During the first laboratory session of the quarter, students were grouped into teams of three or four individuals, and the criteria for completing weekly writing assignments were further explained.

The decision to use collaborative groups to support writing in the laboratory was partly based on existing literature (Collier, 1980; Bruffee, 1984; Tobin et al., 1994; Jones and Carter, 1998; Springer et al., 1999) and on prior research by Quitadamo, Brahler, and Crouch (unpublished results), who showed that Peer Led Team Learning (one form of collaborative learning) helped to measurably improve undergraduate critical thinking skills. Small group learning was also used in the nonwriting control sections to a greater or lesser extent, depending on individual instructor preference.

Baseline critical thinking performance was established in the academic quarters preceding the writing experiment to more specifically attribute changes in critical thinking to the writing treatment. Concurrent nonwriting course sections were also used as comparison groups. The historical baseline provided a way to determine what student performance had been before experiencing the writing treatment, whereas the concurrent nonwriting groups allowed for a direct comparison of critical thinking performance during the writing treatment. Pretest scores indicating prior critical thinking skill were also used to further establish comparability between the writing and nonwriting groups.

Laboratory activities were coordinated for all sections by a single faculty member who taught in the nonwriting group. All faculty and graduate assistants met regularly to discuss course progress and laboratory procedure, and to coordinate resources. Nonwriting faculty drafted quizzes that addressed laboratory content knowledge. Writing faculty collaboratively crafted a consensus essay question, or thought question, designed to elicit student critical thinking and the ability to apply content knowledge. Each thought question was designed so that students had to apply lecture concepts and build on their conceptual understanding by integrating actual laboratory experiences (see Supplemental Appendix 1, available online, for thought question examples). Weekly thought questions became progressively more difficult as the term progressed. Initial planning meetings took place just before the beginning of the academic quarter and included graduate assistant training to help them learn to consistently evaluate student writing using a modified thesis-based essay rubric (see Supplemental Appendix 2; Beers et al., 1994). A range of sample essays from poor to high quality was used to calibrate graduate assistant scoring and ensure consistency between assistants from different laboratory sections within the writing group. All graduate assistants and course instructors applied the thesis-based rubric to sample essays and worked toward consensus. Initial training ended when all graduate assistants scored within 0.5 points of each other on at least two sample essays.
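
The calibration criterion described above (every grader scoring within 0.5 rubric points of the others on at least two sample essays) can be expressed as a simple pairwise check. The sketch below is illustrative only; the function name, rater labels, and scores are hypothetical and are not drawn from the study's training materials.

```python
# Hypothetical sketch of the rater-calibration criterion described above:
# training ends when every pair of graduate assistants scores within
# 0.5 rubric points of each other on at least two sample essays.
from itertools import combinations

def raters_calibrated(scores_by_essay, tolerance=0.5, essays_required=2):
    """scores_by_essay: list of dicts mapping rater label -> rubric score for one sample essay."""
    agreed_essays = 0
    for essay_scores in scores_by_essay:
        pairs = combinations(essay_scores.values(), 2)
        if all(abs(a - b) <= tolerance for a, b in pairs):
            agreed_essays += 1
    return agreed_essays >= essays_required

# Example with made-up scores from three graduate assistants (GA1-GA3).
sample_rounds = [
    {"GA1": 4.0, "GA2": 4.5, "GA3": 4.0},   # spread 0.5 -> counts toward the criterion
    {"GA1": 2.0, "GA2": 3.0, "GA3": 2.5},   # spread 1.0 -> does not count
    {"GA1": 5.0, "GA2": 5.0, "GA3": 4.5},   # spread 0.5 -> counts toward the criterion
]
print(raters_calibrated(sample_rounds))  # True: two sample essays meet the criterion
```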

Students were given weekly thought questions before beginning laboratory to help them frame their efforts during laboratory exercises. Students completed the prescriptive lab activities during the first hour, and then each student group relocated to an assigned computer lab in the same building and worked around a common computer terminal to draft a collective response to the weekly thought question. Students were allowed to use any suitable information or materials (laboratory observations, laboratory manuals, lecture notes, textbooks, the Internet, etc.) to help them address their thought question. Internal group discussions allowed students to argue individual viewpoints as they worked toward group agreement on each thought question. Responses to thought questions were written in a standard five-paragraph essay format. Each essay included an introduction with a group-generated thesis statement, two to three body paragraphs that provided sufficient detail to support the thesis statement, and a summary paragraph that concluded the essay. Students were not allowed to work on essays outside of the laboratory environment.

Initial essay drafts were composed in Microsoft Word and submitted to the graduate assistant by the end of the laboratory period using the campus e-mail system. Graduate assistants evaluated each group's essay (typically six per lab section) and assigned an initial grade based on the thesis-based essay rubric. Graduate assistants made comments and suggestions electronically using Microsoft Word's commenting and track-changes tools. Evaluated essays were e-mailed back to each student group, which addressed the comments and suggestions during the subsequent week's laboratory writing time. Each student group submitted a final draft that was re-evaluated and assigned a final grade. Beginning in the second week, students revised their essay from the previous week and then generated an initial draft for the current week's thought question, all within the lab writing hour. This was done to help students become more proficient writers within a short period of time. Overall, students in the writing group completed eight essays that, along with lab book scores, constituted 25% of their overall course grade. An identical percentage was used to calculate traditional quiz and lab book scores in all nonwriting course sections.

At the end of the quarter, each writing group member completed a peer evaluation for all group members, including themselves (see Supplemental Appendix 3 ). This was done to help students reflect on and evaluate their own performance, maximize individual accountability within the group, and make sure students received credit proportional to their contributions. The average peer evaluation score for each student was included as 5% of the final course grade.

Collectively, this approach to writing and evaluation was used to 1) help students reflect on and discuss deficiencies in their collective and written work, 2) provide an opportunity for students to explicitly address deficiencies in thesis development and general writing skill, 3) provide a suitable reward for student efforts to revise their work relative to established performance benchmarks, 4) improve individual accountability within each group, and 5) help students develop more efficient and effective writing skills that collectively might lead to improved critical thinking skill.

Assessment of Critical Thinking

Using critical thinking to indicate student learning performance is particularly useful because it can be measured within and across disciplines. Various instruments are available to assess critical thinking ( Watson and Glaser, 1980 ; Ennis and Weir, 1985 ; Facione, 1990b ; Center for Critical Thinking and Moral Critique, 1996 ); however, only the CCTST measures cognitive and meta-cognitive skills associated with critical thinking, is based on a consensus definition of critical thinking, and has been evaluated for validity and reliability for measuring critical thinking at the college level ( Facione, 1990a ; Facione et al., 1992 , 2004 ). The CCTST measures cognitive skills of analysis, inference, evaluation, induction, and deduction, with results expressed as raw scores or national percentile equivalents based on a national norming sample of students from 4-yr colleges and universities. Construct validity for the CCTST is high as indicated by greater than 95% consensus of the Delphi panel experts on the component skills of critical thinking. Test reliability (calculated using the KR–20 internal consistency method) is 0.78–0.84 for the form used in this study, a value considered to be within the recommended range for tests that measure a wide range of critical thinking skills ( Facione, 1991 ). The CCTST norming sample for 4-yr colleges and universities is based on a stratified sample of 2000 students from various disciplines, with approximately 30% of the norming sample comprised of science and math students. Approximately 20,000 college students complete the CCTST each year ( Insight Assessment and Blohm, 2005 ).
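
The KR-20 value cited above for CCTST reliability is the standard Kuder-Richardson 20 coefficient for dichotomously scored items. The following is a minimal sketch of that computation on made-up response data; it does not use actual CCTST items, which are proprietary.

```python
import numpy as np

def kr20(responses):
    """Kuder-Richardson 20 internal consistency for dichotomous (0/1) items.
    responses: 2-D array, rows = examinees, columns = items."""
    X = np.asarray(responses, dtype=float)
    k = X.shape[1]                          # number of items (34 on the CCTST form)
    p = X.mean(axis=0)                      # proportion answering each item correctly
    q = 1.0 - p
    total_var = X.sum(axis=1).var(ddof=1)   # variance of examinee total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

# Illustrative data only: 5 examinees x 6 items.
demo = [[1, 1, 0, 1, 0, 1],
        [1, 0, 0, 1, 0, 0],
        [1, 1, 1, 1, 1, 1],
        [0, 0, 0, 1, 0, 0],
        [1, 1, 1, 1, 0, 1]]
print(round(kr20(demo), 2))  # ~0.91 for this toy data set
```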

The CCTST contains 34 questions and is a 45-min timed assessment of critical thinking. An online version of the CCTST was administered in this study, which allowed the researchers to collect student demographics data including gender, ethnicity, age, and several others at the same time critical thinking skill was measured. Total critical thinking skill as well as analysis, inference, and evaluation component critical thinking skills ( Facione, 1990c ) were determined for each CCTST administration and compared across the writing and nonwriting groups.

Research Design

A quasi-experimental pretest/posttest control group design was used for this study to determine whether critical thinking performance in the writing group differed significantly from the nonwriting group. This design was chosen in order to compare critical thinking performance between intact groups, and because it was not feasible to randomly assign students from one course section to another within the sample. Frequency distributions of pretest/posttest changes in total critical thinking skill and analysis, inference, and evaluation component critical thinking skills were constructed to provide some indication of sample randomness and to inform assumptions for subsequent statistical analyses of covariance (see Figure 1 , A–D).

Figure 1.

(A–D) Frequency distribution of change in critical thinking skills. Distribution of change in critical thinking skill for the experimental sample. Changes are indicated using raw scores from CCTST pre- and posttests for total critical thinking skill (A) as well as analysis (B), inference (C), and evaluation (D) component critical thinking skills.

The pretest/posttest control group design was also used in order to minimize internal validity threats that could potentially compete with the effects of the writing treatment on student critical thinking performance. This design is widely used in educational research, and generally controls for most threats to internal validity ( Campbell and Stanley, 1963 ). Internal threats that remain a concern include history, maturation, pretest sensitization, selection, and statistical regression toward the mean. In the current study, history and maturation threats were minimized to the extent that the CCTST pretest and posttest were administered only 9 wk apart, and class standing and age covariables that indicate maturation were included in the statistical analysis. Pretest sensitization and selection are larger concerns for this design. Pretest sensitization was minimized in several ways: 1) prior critical thinking skill indicated by the CCTST pretest was used as a covariable in statistical analyses, 2) pretest/posttest to posttest only comparison studies conducted by Insight Assessment indicate CCTST pretest sensitization is minimized ( Facione, 1990a ), and 3) neither the students, instructors, nor the test administrators have access to the correct answers on the CCTST, so repeat performance on the posttest is less likely. Selection threats were also reduced by using CCTST pretest scores in the statistical analyses, thereby making it more difficult to detect statistically significant differences in critical thinking performance between the writing and nonwriting groups. Statistical regression toward the mean, which was observed to some extent in this study, was minimized because this study used a valid and reliable instrument to assess critical thinking ( Facione, 1990a ). Regression threats were also minimized to the extent that students with higher initial scores regressed much less than students with lower initial scores.

The generalizability of study results is limited because all data were collected at a single university. Specific threats to external validity include selection-treatment interaction and treatment diffusion. These threats were minimized because writing was mandatory for all treatment group participants, thereby minimizing volunteer effects. Because the writing also took considerable student effort, it is less likely that treatment diffusion occurred. In summary, the pretest/posttest control group design was used to minimize internal and external validity threats and maximize the ability to determine the effects of writing on student critical thinking performance.

Study Variables and Data Analysis

Effect of Writing on Critical Thinking Performance.

General education biology students were divided into writing and nonwriting groups (independent variable). Changes in CCTST pretest/posttest scores (dependent variable) were determined to discover whether writing influenced student critical thinking performance. Two CCTST outcome measures were used to statistically test for writing effect: 1) raw scores for total critical thinking skill, and 2) raw scores for analysis, inference, and evaluation component skills. Results were reported using raw scores and corresponding national percentile rank so that critical thinking performance outcomes would be more meaningful and intuitive. Conversion of CCTST raw scores to national percentile ranking was done using SPSS (SPSS, Inc., Chicago, IL) statistical software and a linear estimation conversion script based on an equivalency scale from Insight Assessment (Millbrae, CA).
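
The raw-score-to-percentile conversion was performed in SPSS using an equivalency scale from Insight Assessment; a comparable linear-interpolation step is sketched below. The anchor values shown are placeholders for illustration, not the actual norming scale.

```python
import numpy as np

# Placeholder equivalency scale: CCTST raw scores (0-34) paired with
# national percentile ranks. The real scale comes from Insight Assessment
# and is not reproduced here.
raw_anchor        = np.array([0,  5, 10, 15, 20, 25, 30, 34])
percentile_anchor = np.array([1,  8, 22, 45, 68, 85, 96, 99])

def raw_to_percentile(raw_scores):
    """Linear interpolation between anchor points of the equivalency scale."""
    return np.interp(raw_scores, raw_anchor, percentile_anchor)

# Illustrative group means (hypothetical values, not the study's data).
pretest_means  = np.array([15.2, 15.0])
posttest_means = np.array([16.4, 14.5])
print(raw_to_percentile(pretest_means))                              # percentile equivalents
print(raw_to_percentile(posttest_means) - raw_to_percentile(pretest_means))  # percentile change
```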

Several covariables were included in the analysis to increase statistical accuracy and precision, and to more specifically isolate the effects of writing on critical thinking performance. CCTST pretest scores were used to indicate initial critical thinking skill. Gender and ethnicity helped to account for male/female or race-specific changes in critical thinking performance and were also used to identify potential sources of performance bias. Academic term and time of day were used to account for critical thinking differences due to the time of year each course was offered and the time of day each student took the course, respectively. Class standing and age were used to indicate maturation related to time in college and chronological age, respectively. Finally, the instructor covariable was used to account for performance differences due to individual teaching styles.

Statistical Analysis of Effect of Writing.

Several statistical analyses were conducted to determine the effects of writing on critical thinking performance in general education biology. An analysis of covariance (ANCOVA) test provided insight regarding differences in overall critical thinking performance between the writing and nonwriting groups. Changes in CCTST total raw scores and national percentile ranking were used as composite measures of critical thinking (Facione, 1990c) in this initial analysis. Second, changes in the component critical thinking skills of analysis, inference, and evaluation were evaluated using a multivariate analysis of covariance (MANCOVA) test because this analysis involved three dependent variables. The ANCOVA and MANCOVA tests also provided some insight into the effect the covariables had on critical thinking performance in general education biology. Collectively, these statistical tests allowed for a more accurate and precise analysis because variance associated with the covariables could be more specifically isolated from the writing treatment. Mean, SE, and effect size were also compared between the writing and nonwriting groups. Effect size, represented in standard units, was used to compare the magnitude of the writing effect in the study.
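
The ANCOVA and MANCOVA described above were run in SPSS. For readers working in other environments, a roughly equivalent specification is sketched below in Python with statsmodels; the file name and column names are hypothetical, and this is not the study's actual analysis script.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from statsmodels.multivariate.manova import MANOVA

# Hypothetical data frame: one row per student who completed pre- and posttests.
# Assumed columns: group, cctst_change, analysis_change, inference_change,
# evaluation_change, pretest, gender, ethnicity, class_standing, age, term,
# time_of_day, instructor.
df = pd.read_csv("cctst_scores.csv")

# ANCOVA: change in total CCTST raw score by group, adjusting for covariables.
# (SPSS defaults to Type III sums of squares; Type II is used here for simplicity.)
ancova = ols(
    "cctst_change ~ C(group) + pretest + C(gender) + C(ethnicity) "
    "+ C(class_standing) + age + C(term) + C(time_of_day) + C(instructor)",
    data=df,
).fit()
print(sm.stats.anova_lm(ancova, typ=2))

# MANCOVA: analysis, inference, and evaluation change scores analyzed jointly.
mancova = MANOVA.from_formula(
    "analysis_change + inference_change + evaluation_change ~ "
    "C(group) + pretest + C(gender) + C(ethnicity) + C(class_standing) "
    "+ age + C(term) + C(time_of_day) + C(instructor)",
    data=df,
)
print(mancova.mv_test())  # includes Wilks' lambda for each term
```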

Analysis of Thought Question Performance.

Performance on weekly thought questions was analyzed to discover specifically when and how much student critical thinking skills changed during the academic term. This analysis also provided context for CCTST critical thinking performance measures. Specifically, average scores from a representative sample of writing course sections (approximately 100 students) were used to compare initial essay drafts across the weeks of the term to discover when students began to show changes in their first attempt at each essay. Weekly performance on final revised essays was also compared to determine how student final submissions changed over time. Finally, the weekly difference between each initial essay and each final essay was compared to determine how much the revision process changed during the term. These calculations collectively helped to provide a profile of critical thinking performance over time.
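
One way to tabulate the weekly comparisons described above is sketched below, assuming a hypothetical long-format file of rubric scores (week, draft stage, score as a percentage of rubric points). It is illustrative only and does not reproduce the study's scoring data.

```python
import pandas as pd

# Hypothetical long-format rubric scores: one row per group essay draft.
# Assumed columns: week (1-7), stage ("initial" or "final"), score_pct (0-100).
scores = pd.read_csv("essay_rubric_scores.csv")

weekly = (
    scores.groupby(["week", "stage"])["score_pct"]
    .mean()
    .unstack("stage")              # columns: initial, final
)
weekly["revision_gain"] = weekly["final"] - weekly["initial"]

print(weekly.round(1))
# Change across the term on first attempts and on revised submissions:
print("initial-draft change, week 1 to 7:",
      round(weekly["initial"].iloc[-1] - weekly["initial"].iloc[0], 1))
print("final-draft change, week 1 to 7:",
      round(weekly["final"].iloc[-1] - weekly["final"].iloc[0], 1))
```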

RESULTS

Participant Demographics

Student demographics provided in Table 1 indicated an overall distribution of approximately 49% freshmen, 31% sophomores, 11% juniors, and 9% seniors. Approximately 74% of the writing group students were freshmen and sophomores, whereas 82% of the nonwriting group was underclassmen. Overall, 61% of the sample was female and 39% male, with near identical gender distribution across the writing and nonwriting groups. The predominant ethnicity in the sample was Caucasian (>83%), with Asian American (5%), Latino/Hispanic (3%), African American (2%), and Native American (1%) students comprising the remainder of the sample. About 6% of the sample classified themselves as having some other ethnicity or chose not to identify their ethnic heritage.

Statistical Assumptions

Analysis of covariance and multivariate analysis of covariance tests were used to compare critical thinking performance between the writing and nonwriting groups. The evaluated assumptions for the ANCOVA and MANCOVA tests were homogeneity of slopes, homogeneity of covariances, and normality. An analysis evaluating the homogeneity of slopes assumption indicated that the relationship between the covariables and the critical thinking performance dependent variable did not differ significantly by the writing/nonwriting independent variable for the ANCOVA test, F(2, 307) = 1.642, p = 0.195, power = 0.346, partial η² = 0.011, or the MANCOVA test, F(6, 610) = 1.685, p = 0.122, power = 0.645, partial η² = 0.016. These results confirmed that both analyses of covariance met the homogeneity of slopes assumption. The homogeneity of covariance assumption was tested using Levene's and Box's tests. Levene's test results for the ANCOVA indicated that error variances were not equal across the writing and nonwriting groups, F(1, 308) = 7.139, p = 0.008. Similarly, Box's test results indicated that covariance was not equal for the writing and nonwriting groups, F(6, 684,530) = 4.628, p < 0.001. These results indicated that the ANCOVA/MANCOVA tests did not meet the homogeneity of covariance assumption. To more fully evaluate this assumption, distributions of total and component critical thinking skill were constructed (see Figure 1, A–D). Furthermore, the writing and nonwriting groups were highly similar in size, and no post hoc tests were conducted. On the basis of these data, it was determined that the ANCOVA and MANCOVA tests were the best statistical measures to answer the research questions. Finally, the normality assumption was evaluated using the previously constructed frequency distributions for total change in critical thinking (Figure 1A) as well as change in analysis (Figure 1B), inference (Figure 1C), and evaluation (Figure 1D) critical thinking skills. Frequency distributions of total and component critical thinking dependent variables indicated that each approximated a standard normal curve.
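
The assumption checks reported above correspond to standard tests that are also available outside SPSS. A minimal sketch follows, reusing the hypothetical column names from the earlier analysis snippet; Box's M test is omitted because it is not part of the common Python statistics libraries.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from scipy import stats

df = pd.read_csv("cctst_scores.csv")  # hypothetical columns as before

# Homogeneity of slopes: the group x pretest interaction should be nonsignificant.
slopes = ols("cctst_change ~ C(group) * pretest", data=df).fit()
print(sm.stats.anova_lm(slopes, typ=2).loc["C(group):pretest"])

# Homogeneity of variance across groups (Levene's test).
writing = df.loc[df["group"] == "writing", "cctst_change"]
nonwriting = df.loc[df["group"] == "nonwriting", "cctst_change"]
print(stats.levene(writing, nonwriting))

# Normality of the change-score distribution (complementing the Figure 1 histograms).
print(stats.shapiro(df["cctst_change"]))
```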

Effect of Writing on Total Critical Thinking Performance

The ANCOVA test of total critical thinking performance showed that writing and nonwriting groups differed significantly, F(1, 300) = 19.357, p < 0.0001, power = 0.992, partial η² = 0.061 (see Table 2). The strength of the relationship between the writing/nonwriting groups and critical thinking performance was modest but significant, accounting for more than 6% of the variance in critical thinking performance.
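
For reference, partial eta-squared expresses an effect's share of variance and can be recovered directly from the reported F statistic and its degrees of freedom:

```latex
\eta_p^2 \;=\; \frac{SS_{\text{effect}}}{SS_{\text{effect}} + SS_{\text{error}}}
         \;=\; \frac{F \, df_1}{F \, df_1 + df_2}
         \;=\; \frac{19.357 \times 1}{19.357 \times 1 + 300} \;\approx\; 0.061
```

which matches the "more than 6% of the variance" interpretation given above.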

Table 2. ANCOVA results for total critical thinking performance

Analysis of covariance for the writing and nonwriting groups. Tested covariables included gender, ethnicity, class standing, age, prior critical thinking skill (CCTST pre-test), academic term, time of day, and instructor.

a Significance tested at 0.05 level.

Descriptive statistics of total critical thinking performance in the writing and nonwriting groups were also calculated (see Table 3 ). The writing group showed an average CCTST raw score change of 1.18 compared with the nonwriting group, which showed an average raw score change of −0.51. These critical thinking raw scores equated to gains in national percentile rank of 7.47 (45th to 53rd percentile) for the writing group and −2.09 (42nd to 40th percentile) for the nonwriting group. Critical thinking improvement in the writing group was approximately nine times greater than the nonwriting group (see Figure 2 ).

Table 3. Writing effect on total critical thinking performance: CCTST raw scores

Comparison of writing and nonwriting group performance based on CCTST raw scores. CCTST raw score range was 0–34; n values in parentheses.

Figure 2.

Effect of writing on total critical thinking national percentile rank. Comparison of total critical thinking national percentile gains between writing and nonwriting groups. Percentile ranking was computed using CCTST raw scores, an equivalency scale from Insight Assessment, and a linear conversion script in SPSS.

The ANCOVA test of total critical thinking skill indicated that gender, ethnicity, age, class standing, and academic term did not significantly affect critical thinking performance (see Table 2). Covariables that significantly affected total critical thinking performance included 1) CCTST pretest score, F(1, 300) = 19.713, p < 0.0001, power = 0.993, partial η² = 0.062, 2) instructor, F(1, 300) = 7.745, p < 0.006, power = 0.792, partial η² = 0.025, and 3) time of day, F(1, 300) = 6.291, p < 0.013, power = 0.705, partial η² = 0.021. The effect of prior critical thinking skill (CCTST pretest) was moderately strong, accounting for more than 6% of the variance in total critical thinking performance. The effects of instructor and time of day were smaller, accounting for 2.5% and 2%, respectively, of the variance in total critical thinking performance. The effect associated with CCTST pretest score was approximately 2.5 times greater than that of instructor and nearly 3 times greater than that of time of day.

Effect of Writing on Component Critical Thinking Performance

The MANCOVA test indicated that analysis, inference, and evaluation critical thinking skills differed significantly between the writing and nonwriting groups, Wilks' λ = 0.919, F(3, 296) = 8.746, p < 0.0001, power = 0.995, partial η² = 0.081 (see Table 4). The strength of the relationship between writing and component critical thinking performance was modest but significant, accounting for more than 8% of the variance in critical thinking performance.

Table 4. MANCOVA results for component critical thinking performance

Multivariate analysis of covariance for the writing and nonwriting groups. Tested covariables included gender, ethnicity, class standing, age, prior critical thinking skill (CCTST pretest), academic term, time of day, and instructor.

Specifically, significant gains in analysis and inference skills were observed in the writing group but not the nonwriting group. No statistically significant gains in evaluation skill were observed in either group (see Table 5). National percentile rank equivalents for CCTST component raw scores indicated that the writing group gained 10.51 percentile points in analysis skill (42nd to 52nd percentile), 6.05 percentile points in inference skill (45th to 52nd percentile), and 5.16 percentile points in evaluation skill (46th to 52nd percentile). The nonwriting group showed a national percentile rank change of −4.43 percentile points in analysis skill (47th to 42nd percentile), −2.23 percentile points in inference skill (42nd to 40th percentile), and 1.37 percentile points in evaluation (44th to 45th percentile; see Figure 3). Critical thinking gains for the writing group were 15 times greater for analysis and 8 times greater for inference skills than those of the nonwriting group. Although neither the writing nor the nonwriting group showed significant gains in evaluation skill, the writing group showed more than 3 times greater improvement than did the nonwriting group.

Table 5. Effect of writing on component critical thinking performance

Comparison of writing and nonwriting group performance based on critical thinking component skill raw scores (CCTST subscales). Score range was 0–7 (analysis), 0–16 (inference), and 0–11 (evaluation).

Figure 3.

Effect of writing on component critical thinking national percentile rank. Comparison of component critical thinking national percentile gains between writing and nonwriting groups. Percentile ranking was computed using CCTST raw scores, an equivalency scale from Insight Assessment, and a linear conversion script in SPSS.

The MANCOVA test of analysis, inference, and evaluation skills indicated that gender, ethnicity, age, class standing, academic term, and time of day did not significantly affect critical thinking performance. Critical thinking performance was affected by prior analysis, inference, and evaluation skill (CCTST component pretest scores) and instructor (see Table 4 ). Specifically, component pretest scores had a large effect on critical thinking, accounting for 38% (analysis), 32% (inference), and 39% (evaluation) of critical thinking performance variance. The effect of instructor was smaller, accounting for 4.4% of variation in critical thinking skill. The effect of prior component critical thinking skill was approximately 4.5 times greater than the effect of writing, and nearly 9 times greater than the effect of instructor.

Student Thought Question Performance

Critical thinking performance on student essays was evaluated by applying a thesis-based essay rubric (see Supplemental Appendix 2 ) on initial submissions and final revised essays. Average weekly performance during the academic term is shown in Figure 4 . A comparison of initial essays indicated that students improved 53.3% from week 1 (average score of 27.9%) to week 7 (average score of 81.2%). A similar comparison of final essays showed that students improved 32.5% from week 1 (average score of 54.1%) to week 7 (average score of 86.6%). The largest changes between initial and final essays occurred in week 1 (change of 26.2%), and decreased each week thereafter (24.8, 23.9, 18.8, 8, 7.8, and 5.4% for weeks 2 through 7, respectively). These results showed that students produced little evidence of critical thinking skill in their writing early in the term, but improved dramatically on both initial and revised essay submissions by the end of the term.

Figure 4.

Profile of change in critical thinking performance in writing group. Comparison of student writing performance on weekly initial and revised essays. Essay scores were derived using a thesis-based critical thinking rubric (see Supplemental Appendix 2 ). Average essay scores were computed across writing sections.

DISCUSSION

The purpose of this study was to discover whether writing could measurably influence critical thinking performance in general education biology. Results indicated that students from the writing group significantly outperformed their nonwriting peers in both total critical thinking skill and the component critical thinking skills of analysis and inference. The writing and nonwriting groups were highly similar initially and began the academic term with comparable critical thinking ability (45th and 42nd national percentile for writing and nonwriting, respectively). By the end of the term, writing students had improved their critical thinking skill to above the 52nd percentile whereas nonwriting students decreased to below the 40th percentile. In addition to writing, prior critical thinking skill and course instructor significantly affected critical thinking performance, with prior critical thinking skill having the largest effect on critical thinking gains of any variable tested. Further analysis of the writing group showed that the largest gains in critical thinking occurred during the first few weeks of the term, with gradual improvement during the remainder of the term. A comparison of average critical thinking performance on initial essays and revised essays showed that thinking skills improvement was greater on initial essays (53%) than on final essays (33%). Collectively, the results of this study indicated that students who experienced writing in general education biology significantly improved their critical thinking skills.

The covariance analysis that was conducted provided a partial means to separate out the effects of writing, prior critical thinking skill, instructor, and multiple covariables from total and component critical thinking gains. The analysis of total critical thinking skill indicated that writing students changed their critical thinking skill from below the national average to above the national average within an academic quarter, whereas nonwriting students remained below the national average. This observation is important because it shows that students can develop critical thinking skills within a fairly short 9-wk period of time, and that writing can play a role in that process. A similar study showed that critical thinking skills improve over 15 wk (Quitadamo, Brahler, and Crouch, unpublished results); however, that study provided no insight into whether critical thinking skills could be changed over a shorter period of time, in a different academic setting, or in response to instructional variables such as writing.

Although critical thinking gains were influenced by writing, they did not appear to be affected by gender, ethnicity, class standing, or age. In fact, statistical results indicated that these variables collectively had a very small effect on critical thinking performance. Gender distribution was nearly identical across the writing and nonwriting groups, and was predominantly female (nearly 62%). Ethnic distribution was also highly similar across the writing and nonwriting groups, but the sampling was largely Caucasian (>84%). Class standing varied a little more across the writing and nonwriting groups, with the sample largely comprised of underclassmen (70%). Although nearly three-quarters of the sample was between 18 and 21 years of age, nearly 10% was over 21, with a fair number of older nontraditional students represented. It is possible that a more diverse sample would have produced different results, or it may be that the individuals participating in this study responded particularly well to writing. Although further investigation of these variables is necessary and important, it was beyond the scope of the current study.

The analysis of component skills provided greater insight into the particular critical thinking skills that students changed in response to writing. Specifically, writing students significantly improved their analysis and inference skills whereas nonwriting students did not. Writing students also improved their evaluation skills much more than nonwriting students, although not significantly. These results indicate that the process of writing helps students develop improved analytical and inference skills. Prior research indicates that the writing to learn strategy is effective because students must conceptually organize and structure their thoughts as well as their awareness of thinking processes ( Langer and Applebee, 1987 ; Ackerman, 1993 ; Holliday, 1994 ; Rivard, 1994 ). More specifically, as students begin to shape their thoughts at the point of construction and continually analyze, review, and clarify meaning through the processes of drafting and revision, they necessarily engage and apply analysis and inference skills ( Klein, 1999 ; Hand and Prain, 2002 ). In this study, the process of writing appears to have influenced critical thinking gains. It also seems likely that writing students experienced a greater cognitive demand than nonwriting students simply because the writing act required them to hypothesize, debate, and persuade ( Rivard, 1994 ; Hand and Prain, 2002 ) rather than memorize as was the case in nonwriting control courses.

Conversely, the lack of any significant change in analysis, inference, or evaluation skills in the nonwriting group indicated that the traditional lab instruction used in the general education biology control courses did not help students develop critical thinking skills. Based on the results of this study, it could be argued that traditional lab instruction actually prevents the development of critical thinking skills, which presents a rather large problem when one considers how frequently these traditional methods are used in general education biology courses. One also has to consider that the critical thinking gains seen in the writing group might also have resulted from the relative absence of traditional lab instruction rather than writing alone. Additional research will be necessary to gain further insight into this question. Either way, changes to the traditional model of lab instruction will be necessary if the goal is to enhance the critical thinking abilities of general education biology students.

The variable that had the largest impact on critical thinking performance gains was prior critical thinking skill. This phenomenon was previously observed by Quitadamo, Brahler, and Crouch (unpublished results) in a related study that investigated the effect of Peer Led Team Learning on critical thinking performance. That study focused on science and math major undergraduate critical thinking performance at a major research university, and found that, in addition to Peer Led Team Learning, prior critical thinking skill significantly influenced critical thinking performance (Quitadamo, Brahler, and Crouch, unpublished results). Specifically, students with the highest prior critical thinking skill showed the largest performance gains, whereas students with low initial skill were at a comparative disadvantage. The fact that prior critical thinking skill also had a large effect on critical thinking performance in this study increases the generalizability of the observation and underscores its importance. Simply put, students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills, not because they lack the cognitive hard-wiring to perform but because they lack the tools to build their knowledge. Is it reasonable or just to expect otherwise comparable students to perform at similar levels when only some of them have the keys for success? If we hope to improve the perception of science in this country, we need to educate people on how to think about important scientific issues, and not simply argue a position based on one school of thought. By helping general education students to develop critical thinking skills, it is hoped that they will be better able to think rationally about science.

The observation that students who come to general education biology with greater critical thinking skills leave with the largest skill gains has important implications for the K–12 school system as well. If a high proportion of students are coming to institutions of higher education lacking critical thinking skills, why are these skills not being explicitly taught in the K–12 system? Ideally, students would learn the foundational tenets of critical thinking at an earlier age, and be able to refine and hone these skills as they progress through the K–20 education system. The results of this study reinforce the idea that students should be explicitly taught critical thinking skills and be expected to practice them as early and often as possible.

Although its effect was smaller than writing or prior critical thinking skill, the instructor variable also played a significant role in student critical thinking performance, accounting for 2.5% of the total variance in critical thinking gains. Determining the particular qualities of each instructor that contributed to student critical thinking success and further separating instructor and writing effects will require additional research. Previous research indicates that teaching style positively influences certain aspects of student learning ( Grasha, 1994 ; Hativa et al., 2001 ; Bain, 2004 ), but the qualities that specifically influence student critical thinking gains have not been sufficiently investigated. Additional research in this area is necessary.

Faculty considering whether to use writing in the laboratory may wonder how much time and energy it takes to implement, whether efforts to change will translate into improved student learning, and how these changes affect disciplinary content. From a practical perspective, implementing writing did not take more time and effort per se; rather, it required faculty to reconceptualize how they spent their instructional time. Instead of individually developing course materials, writing faculty collaborated to a greater extent than nonwriting faculty on course design and assessments that required students to demonstrate their critical thinking skill. Interviews of faculty from the writing and nonwriting groups indicated that writing faculty felt the course was less work because they collaborated with colleagues and because students demonstrated improved thinking skill. Writing faculty generally became more comfortable with the new model after ∼2–3 wk, when students began to show observable changes in writing proficiency and critical thinking. Together, collaboration with colleagues and observed gains in critical thinking tended to create a positive feedback loop that helped to sustain writing faculty efforts. In contrast, nonwriting faculty similarly wanted their students to think better but were convinced that traditional methods would be more effective, and so remained closed to change. There were some logistical challenges with writing, such as scheduling computer labs where students could draft and revise their weekly essay responses under instructor and teaching assistant supervision. Teaching assistants (and faculty) also needed to be trained to evaluate writing using a rubric. Finally, with regard to content coverage, no lecture or laboratory content was cut in order to implement writing because writing and nonwriting students both performed the same lab activities. Collectively, the benefits of using writing in laboratory should encourage faculty who want their students to learn to think critically to give it a try.

Future Directions

This study showed that writing affects student critical thinking skill in a nonmajors biology course, but the results have generated more questions than they have answered. How does writing specifically produce gains in critical thinking performance? What factors influence student prior critical thinking skill? How do instructors specifically influence student gains in critical thinking? Future studies that analyze student essays in more detail would provide greater insight into how writing influences critical thinking skill. Writing could also be used in other nonmajors science courses such as chemistry, geology, or physics to determine the transferability of this method. Additional studies that investigate student prior critical thinking skill and instructor variables are also necessary. These future studies would further contribute to the knowledge base in this area and address some of the limitations identified in earlier work (Ebert-May et al., 1997; Daempfle, 2002). Results from these studies would also increase the generalizability of the results of this study.

CONCLUSIONS

Building on existing research and on the basis of several lines of evidence presented in this study, we conclude that writing positively influences critical thinking performance for general education biology students. Those students with prior critical thinking skill may have a comparative advantage over other general education biology students who have not developed these same skills. To rectify that inequity, critical thinking skills should be explicitly taught early and used often during the K–20 academic process. Because it appears that particular instructors improve student critical thinking skills more than others, students should be discerning in their choice of instructors if they want to improve their critical thinking skills. Whether writing as a method to improve critical thinking skills will prove useful in other general education science courses will likely depend on a host of factors, but it has potential. Further study of writing in general education science will be necessary to verify these results and to discover the breadth and depth of how writing affects critical thinking skill.

ACKNOWLEDGMENTS

We thank Drs. Holly Pinkart, Roberta Soltz, Phil Mattocks, and James Johnson and undergraduate researchers Matthew Brewer, Dayrk Flaugh, Adam Wallace, Colette Watson, Kelly Vincent, and Christine Weller for their valuable contributions to this study. The authors also acknowledge the generous financial support provided by the Central Washington University Office of the Provost and the Office of the Associate Vice President for Undergraduate Studies.

  • Ackerman J. M. The promise of writing to learn. Writ. Commun. 1993;10(3):334–370.
  • American Association for the Advancement of Science. Washington, DC: 1989. Science for All Americans: A Project 2061 Report on Literacy Goals in Science, Mathematics, and Technology.
  • Applebee A. N. Writing and reasoning. Rev. Educ. Res. 1984;54(4):577–596.
  • Association of American Colleges and Universities. Washington, DC: 2005. Liberal Education Outcomes: A Preliminary Report on Student Achievement in College.
  • Bain K. Cambridge, MA: Harvard University Press; 2004. What the Best College Teachers Do.
  • Beers T., McIssac C., Henderson B., Gainen J. Writing: thesis and support scoring guide. 1994 [accessed 25 August 2006]. http://www.insightassessment.com/pdf_files/RUB_WTHS.PDF
  • Bruffee K. A. Collaborative learning and the “conversation of mankind.” Coll. Engl. 1984;46(7):635–653.
  • Business-Higher Education Forum, and American Council on Education. Washington, DC: 2003. Building a Nation of Learners: The Need for Changes in Teaching and Learning To Meet Global Challenges.
  • Bybee R. W., Fuchs B. Preparing the 21st century workforce: a new reform in science and technology education. J. Res. Sci. Teach. 2006;43(4):349–352.
  • Campbell D. T., Stanley J. C. Boston, MA: Houghton Mifflin Company; 1963. Experimental and Quasi-experimental Designs for Research.
  • Carnevale A. P. American Society for Training and Development. San Francisco, CA: Jossey-Bass; 1990. Workplace Basics: The Essential Skills Employers Want.
  • Center for Critical Thinking and Moral Critique. Rohnert Park, CA: Sonoma State University; 1996. ICAT Critical Thinking Essay Test.
  • Champagne A., Kouba V., Mintzes J., Wandersee J., Novak J. Written product as performance measures. In: Assessing Science Understanding: A Human Constructivist View. New York: Academic Press; 1999. pp. 224–248.
  • Collier K. G. Peer-group learning in higher education: the development of higher order skills. Stud. High. Educ. 1980;5(1):55–61.
  • Daempfle P. A. New York: U.S. Department of Education; 2002. Instructional Approaches for the Improvement of Reasoning in Introductory College Biology Courses: A Review of the Research.
  • Ebert-May D., Brewer C., Allred S. Innovation in large lectures—teaching for active learning. Bioscience. 1997;47(9):601–607.
  • Ennis R. H. A logical basis for measuring critical thinking skills. Educ. Leadership. 1985;43(2):44–48.
  • Ennis R. H., Weir E. Pacific Grove, CA: Midwest Publications; 1985. The Ennis-Weir Critical Thinking Essay Test.
  • Facione P. A. Millbrae, CA: Insight Assessment; 1990a. The California Critical Thinking Skills Test—College Level. Technical Report 1: Experimental Validation and Content Validity.
  • Facione P. A. Millbrae, CA: Insight Assessment; 1990b. The California Critical Thinking Skills Test—College Level. Technical Report 3: Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST.
  • Facione P. A. Millbrae, CA: Insight Assessment; 1990c. The California Critical Thinking Skills Test—College Level. Technical Report 4: Interpreting the CCTST, Group Norms, and Sub-Scores.
  • Facione P. A. Millbrae, CA: Insight Assessment; 1991. Using the California Critical Thinking Skills Test in Research, Evaluation, and Assessment.
  • Facione P. A. American Philosophical Association. Millbrae, CA: Insight Assessment; 1990. Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Research Findings and Recommendations.
  • Facione P. A., Facione N. C., Giancarlo C. A. Millbrae, CA: Insight Assessment; 1992. Test Manual: The California Critical Thinking Disposition Inventory.
  • Facione P. A., Facione N. C. Test of Everyday Reasoning. Insight Assessment. 2004 [accessed 30 June 2006]. http://www.insightassessment.com/test-ter.html
  • Giancarlo C. A., Facione P. A. A look across four years at the disposition toward critical thinking among undergraduate students. J. Gen. Educ. 2001;50(1):29–55.
  • Grasha A. F. A matter of style: the teacher as expert, formal authority, personal model, facilitator, and delegator. Coll. Teach. 1994;42(4):142–149.
  • Hand B., Prain V. Teachers implementing writing-to-learn strategies in junior secondary science: a case study. Sci. Educ. 2002;86(6):737–755.
  • Hativa N., Barak R., Simhi E. Exemplary university teachers: knowledge and beliefs regarding effective teaching dimensions and strategies. J. High. Educ. 2001;72(6):699–729.
  • Holliday W. G. The reading-science learning-writing connection: breakthroughs, barriers, and promises. J. Res. Sci. Teach. 1994;31(9):877–893.
  • Holmes J., Clizbe E. Facing the 21st century. Bus. Educ. Forum. 1997;52(1):33–35.
  • Holyoak A. R. A plan for writing throughout (not just across) the biology curriculum. Am. Biol. Teach. 1998;60(3):186–190.
  • Howard V. A. Thinking on paper: a philosopher's look at writing. In: Howard V. A., editor. Varieties of Thinking: Essays from Harvard's Philosophy of Education Research Center. New York: Routledge; 1990. pp. 84–92.
  • Insight Assessment (Blohm S.). Annual number of users for the CCTST form 2000. 2005 [accessed 8 December 2006].
  • Jones E. A., Hoffman S., Moore L. M., Ratcliff G., Tibbets S., Click B., III. National Assessment of College Student Learning: Identifying College Graduates' Essential Skills in Writing, Speech and Listening, and Critical Thinking. Final project report. Report no. NCES-95-001. University Park, PA: U.S. Department of Education, Office of Educational Research and Improvement; 1995.
  • Jones G. M., Carter G. Small groups and shared constructions. In: Mintzes J. J., Wandersee J. H., Novak J. D., editors. Teaching Science for Understanding: A Human Constructivist View. San Diego, CA: Academic Press; 1998. pp. 261–279.
  • Kelly G. J., Chen C. The sound of music: constructing science as sociocultural practices through oral and written discourse. J. Res. Sci. Teach. 1999;36(8):883–915.
  • Keys C. W. Revitalizing instruction in scientific genres: connecting knowledge production with writing to learn in science. Sci. Educ. 1999;83(2):115–130.
  • Keys C. W., Hand B., Prain V., Collins S. Using the science writing heuristic as a tool for learning from laboratory investigations in secondary science. J. Res. Sci. Teach. 1999;36(10):1065–1084.
  • Klein P. Reopening inquiry into cognitive processes in writing-to-learn. Educ. Psychol. Rev. 1999;11(3):203–270.
  • Klein P. D. Constructing scientific explanations through writing. Instr. Sci. 2004;32(3):191–231.
  • Klymkowsky M. W. Can nonmajors courses lead to biological literacy? Do majors courses do any better? Cell Biol. Educ. 2006;4:42–44.
  • Kokkala I., Gessell D. A. Writing science effectively: biology and English students in an author-editor relationship. J. Coll. Sci. Teach. 2003;32(4):252–257.
  • Kurfiss J. G. Association for the Study of Higher Education. Washington, DC: George Washington University; 1988. Critical Thinking: Theory, Research, Practice, and Possibilities.
  • Langer J. A., Applebee A. N. Learning to write: learning to think. Educ. Horizons. 1985;64(1):36–38.
  • Langer J. A., Applebee A. N. Urbana, IL: National Council of Teachers of English; 1987. How Writing Shapes Thinking: A Study of Teaching and Learning. NCTE research report no. 22.
  • Lawson A. E. Using the learning cycle to teach biology concepts and reasoning patterns. J. Biol. Educ. 2001;35(4):165–169.
  • Malcom S. M., Abdallah J., Chubin D. E., Grogan K. A System of Solutions: Every School, Every Student. Washington, DC: American Association for the Advancement of Science; 2005.
  • Marzano R. J. Fostering thinking across the curriculum through knowledge restructuring. J. Reading. 1991;34(7):518–525.
  • National Academy of Sciences, National Academy of Engineering, Institute of Medicine. Washington, DC: Committee on Prospering in the Global Economy of the 21st Century; 2005. Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future.
  • National Research Council. Washington, DC: National Academy Press; 1995. National Science Education Standards.
  • National Research Council. Washington, DC: Committee on Undergraduate Biology Education to Prepare Research Scientists for the 21st Century; 2003. Bio 2010: Transforming Undergraduate Education for Future Research Biologists.
  • National Science Foundation. Washington, DC: Directorate for Education and Human Resources; 1996. Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering, and Technology.
  • Office of Educational Research and Improvement. Washington, DC: 1991. Striving for Excellence: The National Education Goals.
  • Project Kaleidoscope. Washington, DC: National Science Foundation; 2006. Transforming America's Scientific and Technological Infrastructure: Recommendations for Urgent Action.
  • Resnick L. B. Education and Learning To Think. Washington, DC: National Academy Press; 1987.
  • Rivard L. P. A review of writing to learn in science: implications for practice and research. J. Res. Sci. Teach. 1994;31(9):969–983.
  • Springer L., Donovan S. S., Stanne M. E. Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: a meta-analysis. Rev. Educ. Res. 1999;69(1):21–51.
  • Steglich C. S. A writing assignment that changes attitudes in biology classes. Am. Biol. Teach. 2000;62(2):98–101.
  • Taylor K. L., Sobota S. J. Writing in biology: an integration of disciplines. Am. Biol. Teach. 1998;60(5):350–353.
  • Tessier J. Writing assignment in a nonmajor introductory ecology class. J. Coll. Sci. Teach. 2006;35(4):25–29.
  • Tobin K. G., Tippins D. J., Gallard A. J. Research on instructional strategies for teaching science. In: Gabel D. L., editor. Handbook of Research on Science Teaching and Learning. New York: Macmillan; 1994. pp. 45–93.
  • Tsui L. A review of research on critical thinking. ASHE annual meeting paper, Miami, FL; November 1998. pp. 5–8.
  • Tsui L. Fostering critical thinking through effective pedagogy: evidence from four institutional case studies. J. High. Educ. 2002;73(6):740–763.
  • U.S. Department of Education. Washington, DC: 1990. National Goals for Education.
  • Watson G., Glaser E. M. Watson-Glaser Critical Thinking Appraisal. Cleveland, OH: The Psychological Corporation (Harcourt Brace Jovanovich); 1980.

Metacognitive writing strategies, critical thinking skills, and academic writing performance: A structural equation modeling approach

  • Published: 24 November 2022
  • Volume 18, pages 237–260 (2023)


  • Mark Feng Teng (ORCID: orcid.org/0000-0002-5134-8504)
  • Mei Yue (ORCID: orcid.org/0000-0003-0688-1040)


The present study adopted a structural equation modeling approach to examine Chinese university students' metacognition, critical thinking skills, and academic writing. In particular, this research explored whether metacognitive awareness can foster critical thinking and, in turn, enhance academic writing. The measure of metacognitive writing strategies covered metacognitive knowledge and regulation in academic writing. The measure of learners' critical thinking encompassed five skills: inference, recognition of assumptions, deduction, interpretation, and evaluation of arguments. Academic writing was assessed with an internal test. The participants were 644 third-year students at a Chinese university. Three models were tested: (1) the role of metacognition in academic writing; (2) the role of metacognition in critical thinking; and (3) the correlations among metacognition, critical thinking skills, and academic writing. The results indicated significant relationships among the three variables, and the implications of these findings are discussed.
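To make the modeling approach concrete, the sketch below shows how a three-variable structural model of this general shape could be specified in Python with the semopy package. It is only an illustration under stated assumptions: the indicator names (mk1, ct_inference, writing_score, and so on) and the data file are hypothetical, and the study's actual measurement model, data, and estimation settings are not reproduced here.

```python
# Minimal SEM sketch (assumed, not the authors' code):
# metacognition -> critical thinking -> academic writing.
import pandas as pd
import semopy

# Hypothetical column names standing in for questionnaire items and test scores.
model_desc = """
Metacognition =~ mk1 + mk2 + mr1 + mr2
CriticalThinking =~ ct_inference + ct_assumptions + ct_deduction + ct_interpretation + ct_evaluation
CriticalThinking ~ Metacognition
writing_score ~ Metacognition + CriticalThinking
"""

data = pd.read_csv("survey_and_scores.csv")  # hypothetical file containing the columns above

model = semopy.Model(model_desc)
model.fit(data)                       # maximum likelihood estimation by default
print(model.inspect())                # loadings and path coefficients
print(semopy.calc_stats(model))       # fit indices such as CFI and RMSEA
```

In a lavaan-style description like this, `=~` defines latent variables from observed indicators and `~` defines the structural regressions, so each of the three models described in the abstract corresponds to changing which paths are included.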



Acknowledgements

The study was supported by a project from the Education Department of Hainan Province (Project number: Hnky2020ZD-9). We appreciate Professor Chuang Wang's help in proofreading this article.

Author information

Authors and Affiliations

Center for Linguistic Sciences, Beijing Normal University, Zhuhai, China

Mark Feng Teng

School of Foreign Languages, Hainan University, Haikou, China

Mei Yue

Corresponding author

Correspondence to Mei Yue.

Ethics declarations

This study involved human participants. Informed consent was obtained from all individual participants involved in the study. All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and comparable ethical standards.

Conflict of interest

The authors declare that they have no conflicts of interest.


About this article

Teng, M.F., Yue, M. Metacognitive writing strategies, critical thinking skills, and academic writing performance: A structural equation modeling approach. Metacognition Learning 18, 237–260 (2023). https://doi.org/10.1007/s11409-022-09328-5


Received: 27 March 2022

Accepted: 07 November 2022

Published: 24 November 2022

Issue Date: April 2023


Keywords: Critical thinking; Metacognitive knowledge; Metacognitive regulation; Metacognition; Academic writing


2.1: Breaking down critical thinking into categories


  • Rachel Bell, Jim Bowsher, Eric Brenner, Serena Chu-Mraz, Liza Erpelo, Kathleen Feinblum, Nina Floro, Gwen Fuller, Chris Gibson, Katharine Harer, Cheryl Hertig, Lucia Lachmayr, Eve Lerman, Nancy Kaplan-Beigel, Nathan Jones, Garry Nicol, Janice Sapigao, Leigh Anne Shaw, Paula Silva, Jessica Silver-Sharp, Mine Suer, Mike Urquidez, Rob Williams, Karen Wong, and Susan Zoughbie
  • Skyline College


WHAT IS CRITICAL THINKING?

  • Critical thinking is a set of skills designed to help the thinker analyze, assess and question a given situation or reading.
  • Critical thinking skills push the thinker to reject simplistic conclusions based on human irrationality, false assumptions, prejudices, biases and anecdotal evidence.
  • Critical thinking skills give thinkers confidence to engage with complex issues that have several answers and points of view, and to accept that opinions and insights can change with new information.

WHAT DO CRITICAL THINKERS DO?

  • Consider all sides of an issue
  • Judge well the quality of an argument
  • Judge well the credibility of sources
  • Create convincing arguments using sound evidence and analysis
  • Effectively recognize and use ethos (ethics), pathos (empathy) and logos (logic) in argument

WHY IS IT IMPORTANT?

People will listen to and respect critical thinkers with these abilities because…

  • Considering all sides of an issue means they are open-minded, informed, and mindful of alternatives and other points of view.
  • Judging well the quality of an argument means they can effectively identify and evaluate another’s reasons, assumptions and conclusions and not be fooled into believing false or unsubstantiated claims.
  • Judging well the credibility of sources means they can recognize and present the most reputable, trustworthy and convincing evidence.
  • Creating convincing arguments using sound evidence and analysis means they can formulate plausible hypotheses and draw conclusions which are thoughtful and verifiable.
  • Effectively recognizing and using ethos, pathos and logos in argument means they construct well-crafted points using a balance of morality and ethics, consideration and empathy for others, as well as sound and logical reasoning.

HOW DO I USE CRITICAL THINKING?

Breaking the analysis of a topic or text (one written by you or by another author) into categories will help you examine it thoroughly and critically. Use these questions to assist you:

Clarity: Is it understandable and can the meaning be clearly grasped?

  • Is the main idea clear?
  • Can examples be added to better illustrate the points?
  • Are there confusing or unrelated points?

Accuracy: Is it free from errors or distortions—is it true?

  • Do I need to verify the truth of the claims?
  • Is credible evidence used correctly and fairly?
  • Is additional research needed?

Precision: Is it exact with specific details?

  • Can the wording be more exact?
  • Are the claims too general?
  • Are claims supported with concrete evidence?

Relevance: How does it relate to the topic or assignment?

  • Does it help illuminate the topic or assignment?
  • Does it provide new or important information?
  • Who does the content have the most relevance for?

Depth: Does it contain complexities and delve into the larger implications?

  • What are some of the complexities explored?
  • What are some of the difficulties that should be addressed?
  • What are the larger implications or impact?

Breadth: Does it encompass multiple viewpoints?

  • Do I need to look at this from another perspective?
  • What other people would have differing viewpoints?
  • Do I need to look at this in other ways?

Logic: Do the parts make sense together and are there no contradictions?

  • Do all the points work together logically to prove one clear argument?
  • Does one paragraph follow logically from the next?
  • Does the evidence directly prove the main points?

Significance: Does it focus on what is important?

  • Is this the most important aspect to consider?
  • Which of the facts or points are the most important?
  • Does it examine a larger significance?

Fairness: Is it justifiable and not self-serving or one-sided?

  • Do I have any vested interest in this issue that can affect my reaction?
  • Is personal bias or a hidden agenda driving the point?
  • Are the viewpoints of others sympathetically represented?

Use this chart to help you apply these critical thinking categories to a particular text or topic:
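The chart itself is not reproduced here. As a stand-in, the short sketch below encodes the same categories as a simple Python checklist you could keep alongside your notes; the variable and function names are illustrative only, not part of the original material.

```python
# Illustrative checklist of the critical thinking categories described above.
CRITICAL_THINKING_CATEGORIES = {
    "Clarity": "Is it understandable, and can the meaning be clearly grasped?",
    "Accuracy": "Is it free from errors or distortions?",
    "Precision": "Is it exact, with specific details?",
    "Relevance": "How does it relate to the topic or assignment?",
    "Depth": "Does it delve into complexities and larger implications?",
    "Breadth": "Does it encompass multiple viewpoints?",
    "Logic": "Do the parts make sense together, without contradictions?",
    "Significance": "Does it focus on what is important?",
    "Fairness": "Is it justifiable and not self-serving or one-sided?",
}

def remaining_categories(notes: dict) -> list:
    """Return the categories not yet addressed in your notes on a text or topic."""
    return [name for name in CRITICAL_THINKING_CATEGORIES if name not in notes]

# Example: remaining_categories({"Clarity": "Main idea is clear", "Logic": "No contradictions"})
```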

Critical thinking definition


Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement.

The critical thinking process involves the active and skillful analysis, assessment, synthesis, and evaluation of information obtained from observation, knowledge, reflection, or conversation, and it serves as a guide to belief and action, which is why it is so often emphasized in education and academia.

Some even view it as a backbone of modern thought.

However, it is a skill, and skills must be trained and practiced to reach their full potential.

People turn to various approaches to improve their critical thinking, such as:

  • Developing technical and problem-solving skills
  • Engaging in more active listening
  • Actively questioning their assumptions and beliefs
  • Seeking out more diversity of thought
  • Cultivating intellectual curiosity

Is critical thinking useful in writing?

Critical thinking can help you plan your paper and make it more concise, though this is not obvious at first. Here are some of the questions you should ask yourself when bringing critical thinking into your writing:

  • What information should be included?
  • Which information resources should the author look to?
  • What degree of technical knowledge should the report assume its audience has?
  • What is the most effective way to show information?
  • How should the report be organized?
  • How should it be designed?
  • What tone and level of language difficulty should the document have?

Using critical thinking is not just a matter of outlining your paper; it also raises the question of how critical thinking can be used to solve problems within the topic you are writing about.

Say you have a PowerPoint presentation on how critical thinking can reduce poverty in the United States. You will first have to define critical thinking for your audience, and then use critical thinking questions and related terms to familiarize them with your methods and start the thinking process.


Critical Thinking: A Simple Guide and Why It’s Important


Critical Thinking: A Simple Guide and Why It’s Important was originally published on Ivy Exec .

Strong critical thinking skills are crucial for career success, regardless of educational background. Critical thinking embodies the ability to engage in astute and effective decision-making, adding invaluable dimensions to professional growth.

At its essence, critical thinking is the ability to analyze, evaluate, and synthesize information in a logical and reasoned manner. It’s not merely about accumulating knowledge but harnessing it effectively to make informed decisions and solve complex problems. In the dynamic landscape of modern careers, honing this skill is paramount.

The Impact of Critical Thinking on Your Career

☑ Problem-Solving Mastery

Visualize critical thinking as the Sherlock Holmes of your career journey. It facilitates swift problem resolution akin to a detective unraveling a mystery. By methodically analyzing situations and deconstructing complexities, critical thinkers emerge as adept problem solvers, rendering them invaluable assets in the workplace.

☑ Refined Decision-Making

Navigating dilemmas in your career path resembles traversing uncertain terrain. Critical thinking acts as a dependable GPS, steering you toward informed decisions. It involves weighing options, evaluating potential outcomes, and confidently choosing the most favorable path forward.

☑ Enhanced Teamwork Dynamics

Within collaborative settings, critical thinkers stand out as proactive contributors. They engage in scrutinizing ideas, proposing enhancements, and fostering meaningful contributions. Consequently, the team evolves into a dynamic hub of ideas, with the critical thinker recognized as the architect behind its success.

☑ Communication Prowess

Effective communication is the cornerstone of professional interactions. Critical thinking enriches communication skills, enabling the clear and logical articulation of ideas. Whether in emails, presentations, or casual conversations, individuals adept in critical thinking exude clarity, earning appreciation for their ability to convey thoughts seamlessly.

☑ Adaptability and Resilience

Perceptive individuals adept in critical thinking display resilience in the face of unforeseen challenges. Instead of succumbing to panic, they assess situations, recalibrate their approaches, and persist in moving forward despite adversity.

☑ Fostering Innovation

Innovation is the lifeblood of progressive organizations, and critical thinking serves as its catalyst. Proficient critical thinkers possess the ability to identify overlooked opportunities, propose inventive solutions, and streamline processes, thereby positioning their organizations at the forefront of innovation.

☑ Confidence Amplification

Critical thinkers exude confidence derived from honing their analytical skills. This self-assurance radiates during job interviews, presentations, and daily interactions, catching the attention of superiors and propelling career advancement.

So, how can one cultivate and harness this invaluable skill?

✅ Developing Curiosity and Inquisitiveness:

Embrace a curious mindset by questioning the status quo and exploring topics beyond your immediate scope. Cultivate an inquisitive approach to everyday situations. Encourage a habit of asking “why” and “how” to deepen understanding. Curiosity fuels the desire to seek information and alternative perspectives.

✅ Practice Reflection and Self-Awareness:

Engage in reflective thinking by assessing your thoughts, actions, and decisions. Regularly introspect to understand your biases, assumptions, and cognitive processes. Cultivate self-awareness to recognize personal prejudices or cognitive biases that might influence your thinking. This allows for a more objective analysis of situations.

✅ Strengthening Analytical Skills:

Practice breaking down complex problems into manageable components. Analyze each part systematically to understand the whole picture. Develop skills in data analysis, statistics, and logical reasoning. This includes understanding correlation versus causation, interpreting graphs, and evaluating statistical significance.

✅ Engaging in Active Listening and Observation:

Actively listen to diverse viewpoints without immediately forming judgments. Allow others to express their ideas fully before responding. Observe situations attentively, noticing details that others might overlook. This habit enhances your ability to analyze problems more comprehensively.

✅ Encouraging Intellectual Humility and Open-Mindedness:

Foster intellectual humility by acknowledging that you don’t know everything. Be open to learning from others, regardless of their position or expertise. Cultivate open-mindedness by actively seeking out perspectives different from your own. Engage in discussions with people holding diverse opinions to broaden your understanding.

✅ Practicing Problem-Solving and Decision-Making:

Engage in regular problem-solving exercises that challenge you to think creatively and analytically. This can include puzzles, riddles, or real-world scenarios. When making decisions, consciously evaluate available information, consider various alternatives, and anticipate potential outcomes before reaching a conclusion.

✅ Continuous Learning and Exposure to Varied Content:

Read extensively across diverse subjects and formats, exposing yourself to different viewpoints, cultures, and ways of thinking. Engage in courses, workshops, or seminars that stimulate critical thinking skills. Seek out opportunities for learning that challenge your existing beliefs.

✅ Engage in Constructive Disagreement and Debate:

Encourage healthy debates and discussions where differing opinions are respectfully debated.

This practice fosters the ability to defend your viewpoints logically while also being open to changing your perspective based on valid arguments. Embrace disagreement as an opportunity to learn rather than a conflict to win. Engaging in constructive debate sharpens your ability to evaluate and counter arguments effectively.

✅ Utilize Problem-Based Learning and Real-World Applications:

Engage in problem-based learning activities that simulate real-world challenges. Work on projects or scenarios that require critical thinking skills to develop practical problem-solving approaches. Apply critical thinking in real-life situations whenever possible.

This could involve analyzing news articles, evaluating product reviews, or dissecting marketing strategies to understand their underlying rationale.

In conclusion, critical thinking is the linchpin of a successful career journey. It empowers individuals to navigate complexities, make informed decisions, and innovate in their respective domains. Embracing and honing this skill isn’t just an advantage; it’s a necessity in a world where adaptability and sound judgment reign supreme.

So, as you traverse your career path, remember that the ability to think critically is not just an asset but the differentiator that propels you toward excellence.

The Pros and Cons of AI in Special Education


Special education teachers fill out mountains of paperwork, customize lessons for students with a wide range of learning differences, and attend hours of bureaucratic meetings.

It’s easy to see why it would be tempting to outsource parts of that job to a robot.

While there may never be a special educator version of “Star Wars”’ protocol droid C-3PO, generative artificial intelligence tools, including ChatGPT and others built on the large language models created by its developer, OpenAI, can help special education teachers perform parts of their job more efficiently, allowing them to spend more time with their students, experts and educators say.

But those shortcuts come with plenty of cautions, they add.

Teachers need to review artificial intelligence’s suggestions carefully to ensure that they are right for specific students. Student data—including diagnoses of learning differences or cognitive disorders—need to be kept private.

Even special educators who have embraced the technology urge colleagues to proceed with care.

“I’m concerned about how AI is being presented right now to educators, that it’s this magical tool,” said Julie Tarasi, who teaches special education at Lakeview Middle School in the Park Hill school district near Kansas City, Mo. She recently completed a course in AI sponsored by the International Society for Technology in Education. “And I don’t think that the AI literacy aspect of it is necessarily being [shared] to the magnitude that it should be with teachers.”

Park Hill is cautiously experimenting with AI’s potential as a paperwork partner for educators and an assistive technology for some students in special education.

The district is on the vanguard. Only about 1 in 6 principals and district leaders—16 percent—said their schools or districts were piloting AI tools or using them in a limited manner with students in special education, according to a nationally representative EdWeek Research Center survey conducted in March and April.

AI tools may work best for teachers who already have a deep understanding of what works for students in special education, and of the tech itself, said Amanda Morin, a member of the advisory board for the learner-variability project at Digital Promise, a nonprofit organization that works on equity and technology issues in schools.

“If you feel really confident in your special education knowledge and experience and you have explored AI [in depth], I think those two can combine in a way that can really accelerate the way you serve students,” Morin said.

But “if you are a novice at either, it’s not going to serve your students well because you don’t know what you don’t know yet,” she added. “You may not even know if the tool is giving you a good answer.”

Here are some of the areas where Park Hill educators and other school and district leaders see AI’s promise for special education—and what caveats to look out for:

Promise: Reducing the paperwork burden.

Some special education teachers spend as many as eight hours a week writing student-behavior plans, progress reports, and other documentation.

“Inevitably, we’re gonna get stuck, we’re gonna struggle to word things,” Tarasi said. AI can be great for busting through writer’s block or finding a clearer, more objective way to describe a student’s behavior, she said.

What’s more, tools such as Magic School—an AI platform created for K-12 education—can help special education teachers craft the student learning goals that must be included in an individualized education program, or IEP.

“I can say ‘I need a reading goal to teach vowels and consonants to a student,’ and it will generate a goal,” said Tara Bachmann, Park Hill’s assistive-technology facilitator. “You can put the criteria you want in, but it makes it measurable, then my teachers can go in and insert the specifics about the student” without involving AI, Bachmann said.

These workarounds can cut the process of writing an IEP by up to 30 minutes, Bachmann said—giving teachers more time with students.

AI can also come to the rescue when a teacher needs to craft a polite, professional email to a parent after a stress-inducing encounter with their child.

Some Park Hill special education teachers use “Goblin,” a free tool aimed at helping neurodivergent people organize tasks, to take the “spice” out of those messages, Tarasi said.

A teacher could write “the most emotionally charged email. Then you hit a button called ‘formalize.’ And it makes it like incredibly professional,” Bachmann said. “Our teachers like it because they have a way to release the emotion but still communicate the message to the families.”

Caveat: Don’t share personally identifiable student information. Don’t blindly embrace AI’s suggestions.

Teachers must be extremely careful about privacy issues when using AI tools to write documents—from IEPs to emails—that contain sensitive student information, Tarasi said.

“If you wouldn’t put it on a billboard outside of the school, you should not be putting it into any sort of AI,” Tarasi said. “There’s no sense of guaranteed privacy.”

Tarasi advises her colleagues to “absolutely not put in names” when using generative AI to craft documents, she said. While including students’ approximate grade level may be OK in certain circumstances, inputting their exact age or mentioning a unique diagnosis is a no-no.

To be sure, if the information teachers put into AI is too vague, educators might not get accurate suggestions for their reports. That requires a balance.

“You need to be specific without being pinpoint,” Tarasi said.
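To illustrate the balance Tarasi describes, here is a minimal, hypothetical sketch in Python of how a goal-drafting request could be sent to a general-purpose model while keeping names, exact ages, and diagnoses out of the prompt. It uses the OpenAI Python client only as an example; it is not the Magic School or Goblin tools mentioned above, and the function, model name, and wording are assumptions.

```python
# Hypothetical sketch: request a measurable, IEP-style goal without personally identifiable details.
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

def draft_reading_goal(skill: str, grade_band: str) -> str:
    # Only a skill and an approximate grade band are sent; no names, exact ages, or diagnoses.
    prompt = (
        f"Draft one measurable reading goal for a {grade_band} student working on {skill}. "
        "Include mastery criteria and a timeline, and leave placeholders for any "
        "student-specific details the teacher will add offline."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; substitute whatever your district approves
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Example: draft_reading_goal("vowels and consonants", "early elementary")
```

The teacher, not the tool, then fills in the student-specific criteria, mirroring the workflow Bachmann describes above.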

Caveat: AI works best for teachers who already understand special education

Another caution: Although AI tools can help teachers craft a report or customize a general education lesson for students in special education, teachers need to already have a deep understanding of their students to know whether to adopt its recommendations.

Relying solely on AI tools for lesson planning or writing reports “takes the individualized out of individualized education,” Morin said. “Because what [the technology] is doing is spitting out things that come up a lot” as opposed to carefully considering what’s best for a specific student, like a good teacher can.

Educators can tweak their prompts—the questions they ask AI—to get better, more specific advice, she added.

“A seasoned special educator would be able to say ‘So I have a student with ADHD, and they’re fidgety’ and get more individualized recommendations,” Morin said.

Promise: Making lessons more accessible.

Ensuring students in special education master the same course content as their peers can require teachers to spend hours simplifying the language of a text to an appropriate reading level.

Generative AI tools can accomplish that same task—often called “leveling a text”—in just minutes, said Josh Clark, the leader of the Landmark School, a private school in Massachusetts serving children with dyslexia and other language-based learning differences.

“If you have a class of 30 kids in 9th grade, and they’re all reading about photosynthesis, then for one particular child, you can customize [the] reading level without calling them out and without anybody else knowing and without you, the teacher, spending hours,” Clark said. “I think that’s a super powerful way of allowing kids to access information they may not be able to otherwise.”
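
Text leveling follows the same prompt-and-rewrite pattern as the goal-drafting example above. The sketch below is a generic illustration of asking a language model to rewrite a passage at a target grade level; the model name, prompt, and level_text helper are assumptions, and this is not the workflow of any specific product named in this article.

```python
# Hypothetical sketch: rewriting ("leveling") a passage at a target reading level.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def level_text(passage: str, grade_level: int) -> str:
    """Ask the model to simplify a passage while keeping the content intact."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "You simplify texts for students without changing the facts."},
            {"role": "user",
             "content": f"Rewrite this at roughly a grade-{grade_level} reading level:\n\n{passage}"},
        ],
    )
    return response.choices[0].message.content


passage = ("Photosynthesis is the process by which chlorophyll-containing organisms "
           "convert light energy into chemical energy stored as glucose.")
print(level_text(passage, grade_level=4))
```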

Similarly, in Park Hill, Bachmann has used Canva—a design tool with a version specifically geared toward K-12 schools and therefore age-appropriate for many students—to help a student with cerebral palsy create the same kind of black-and-white art his classmates were making.

Kristen Ponce, the district’s speech and language pathologist, has used Canva to provide visuals for students in special education as they work to be more specific in their communication.

Case in point: One of Ponce's students loves to learn about animals, but he has a very clear idea of what he's looking for, she said. If the student just says “bear,” Canva will pull up a picture of, for instance, a brown grizzly. But the student may have been thinking of a polar bear.

That gives Ponce the opportunity to tell him, “We need to use more words to explain what you’re trying to say here,” she said. “We were able to move from ‘bear’ to ‘white bear on ice.’”

Caveat: It’s not always appropriate to use AI as an accessibility tool.

Not every AI tool can be used with every student. For instance, there are age restrictions for tools like ChatGPT, which isn't intended for children under 13, or for users under 18 without a parent's permission, Bachmann said. (ChatGPT does not independently verify a user's age.)

“I caution my staff about introducing it to children who are too young, and [remind them] that we try to focus on what therapists and teachers can do collectively to make life easier for [students],” she said.

“Accessibility is great,” she said. But when a teacher is thinking about “unleashing a child freely on AI, there is caution to it.”

Promise: Using AI tools to help students in special education communicate.

Park Hill is just beginning to use AI tools to help students in special education express their ideas.

One recent example: A student with a traumatic brain injury that affected her language abilities made thank you cards for several of her teachers using Canva.

“She was able to generate personal messages to people like the school nurses,” Bachmann said. “To her physical therapist who has taken her to all kinds of events outside in the community. She said, ‘You are my favorite therapist.’ She got very personal.”

There may be similar opportunities for AI to help students in special education write more effectively.

Some students with learning and thinking differences have trouble organizing their thoughts or getting their point across.

“When we ask a child to write, we’re actually asking them to do a whole lot of tasks at once,” Clark said. Aspects of writing that might seem relatively simple to a traditional learner—word retrieval, grammar, punctuation, spelling—can be a real roadblock for some students in special education, he said.

“It’s a huge distraction,” Clark said. The student may “have great ideas, but they have difficulty coming through.”

Caveat: Students may miss out on the critical-thinking skills writing builds.

Having students with language-processing differences use AI tools to better express themselves holds potential, but if it is not done carefully, students may miss developing key skills, said Digital Promise’s Morin.

AI “can be a really positive adaptive tool, but I think you have to be really structured about how you’re doing it,” she said.

ChatGPT or a similar tool may be able to help a student with dyslexia or a similar learning difference “create better writing, which I think is different than writing better,” Morin said.

Since it’s likely that students will be able to use those tools in the professional world, it makes sense that they begin using them in school, she said.

But the tools available now may not adequately explain the rationale behind the changes they make to a student’s work or help students express themselves more clearly in the future.

“The process is just as important as the outcome, especially with kids who learn differently, right?” Morin said. “Your process matters.”

Clark agreed on the need to move cautiously. His own school is trying what he described as “isolated experiments” in using AI to help students with language-processing differences express themselves better.

The school is concentrating, for now, on older students preparing to enter college. Presumably, many will be able to use AI to complete some postsecondary assignments. “How do we make sure it’s an equal playing field?” Clark said.

A version of this article appeared in the May 22, 2024 edition of Education Week as The Pros and Cons of AI in Special Education

Critical Writing/Critical Thinking

Target Audience:

Individuals who are competent in basic writing skills but want to strengthen their ability to communicate through their government writing, as well as technical staff who want to construct more compelling, logical, and persuasive arguments and whose job it is to write or review a variety of regulatory program documents, including findings, decision documents, technical reports, recommendations, comment letters, informational documents, and planning documents.

Summary and Objectives:

This intensive, hands-on writing course is designed to achieve excellence in writing through clarity in thinking. Critical thinking involves evaluating information to reach a well-justified decision or recommendation, and critical writing is the clear, unambiguous communication of the information. The goal of this course is to develop participants' skills as writers and editors/reviewers in order to make U.S. Fish and Wildlife Service policy and science writing clear, well-organized, persuasive, and logical.

Upon completion of this course, participants will be able to:

  • Engage in the various stages of the writing process (invention, arrangement, selection, writing, editing, and revision) when composing documents.
  • Consider the essential elements of content, organization, tone, and clarity when writing.
  • Construct sound, logical, and compelling arguments.
  • Develop, organize, and link ideas, resulting in clear, persuasive, and logical writing.
  • Use critical thinking skills to differentiate between fact and opinion, identify author bias, develop inferential skills, and recognize logical fallacies and faulty reasoning.
  • Apply simple techniques (such as the IRAC format: Issue, Rule, Application, Conclusion) that make documents analytical, reasoned, and understandable.
  • Write well-organized sentences, paragraphs, and documents using proper conventions with reference to legal and biological standards.
  • Apply proper grammar, sentence structure, and plain language rules.
  • Diagnose problems in writing samples and determine improvements that could strengthen the piece.
  • Edit and revise their writing so that it is concise, contains relevant information, and is free of errors.

This course does not cover how to write journal articles for publication in scientific journals; for this type of training, please see Scientific Writing for Publication: Course and Clinic (CSP3182).

Competencies Addressed:

Critical Thinking - Awareness, Writing - Intermediate, Written Communication - Intermediate, Communicating Science - Intermediate 

Questions and Registration

Upcoming Sessions of This Course

Session date and time: Mon, 06/03/2024, 08:00 to Fri, 06/07/2024, 12:00 (MDT)
Location: Classroom, Albuquerque, NM

Session Contact

*DOI PIV card holders may use the button above to register for courses directly in DOI Talent. If you are not affiliated with DOI, follow the instructions for External, Non-DOI learners to obtain an account. If you need help with registration, contact the session contact.

    Critical Thinking Skills. A poster created for our English language arts and social studies departments (humanities) to use in their classroom, as well as in the library. ... English Writing Skills. Essay Writing Skills. Writing Words. Sentence starters. A. Autumn Oldeng. Educational Technology. Thinking Skills. Learning Theory. Teaching Tips.