
Writing to Think: Critical Thinking and the Writing Process

“Writing is thinking on paper.” (Zinsser, 1976, p. vii)

Google the term “critical thinking.” How many hits are there? On the day this tutorial was completed, Google found about 65,100,000 results in 0.56 seconds. That’s an impressive number, and it grows larger every day, because the nation’s educators, business leaders, and political representatives worry about the level of critical thinking skills among today’s students and workers.

What is Critical Thinking?

Simply put, critical thinking is sound thinking. Critical thinkers work to delve beneath the surface of sweeping generalizations, biases, clichés, and other quick observations that characterize ineffective thinking. They are willing to consider points of view different from their own, seek and study evidence and examples, root out sloppy and illogical argument, discern fact from opinion, embrace reason over emotion or preference, and change their minds when confronted with compelling reasons to do so. In sum, critical thinkers are flexible thinkers equipped to become active and effective spouses, parents, friends, consumers, employees, citizens, and leaders. Every area of life, in other words, can be positively affected by strong critical thinking.

Released in January 2011, an important four-year study of college students concluded that by graduation “large numbers [of American undergraduates] didn’t learn the critical thinking, complex reasoning and written communication skills that are widely assumed to be at the core of a college education” (Rimer, 2011, para. 1). The University designs curriculum, creates support programs, and hires faculty to help ensure you won’t be one of the students “[showing] no significant gains in . . . ‘higher order’ thinking skills” (Rimer, 2011, para. 4). One way the University works to help you build those skills is through writing projects.

Writing and Critical Thinking

Say the word “writing” and most people think of a completed publication. But say the word “writing” to writers, and they will likely think of the process of composing. Most writers would agree with novelist E. M. Forster, who wrote, “How can I know what I think until I see what I say?” (Forster, 1927, p. 99). Experienced writers know that the act of writing stimulates thinking.

Inexperienced and experienced writers have very different understandings of composition. Novice writers often make the mistake of believing they have to know what they’re going to write before they can begin writing. They often compose a thesis statement before asking questions or conducting research. In the course of their reading, they might even disregard material that counters their pre-formed ideas. This is not writing; it is recording.

In contrast, experienced writers begin with questions and work to discover many different answers before settling on those that are most convincing. They know that the act of putting words on paper or a computer screen helps them invent thought and content. Rather than trying to express what they already think, they express what the act of writing leads them to think as they put down words. More often than not, in other words, experienced writers write their way into ideas, which they then develop, revise, and refine as they go.

What has this notion of writing to do with critical thinking? Everything.

Consider the steps of the writing process: prewriting, outlining, drafting, revising, editing, seeking feedback, and publishing. These steps are not followed in a determined or strict order; instead, the effective writer knows that as they write, it may be necessary to return to an earlier step. In other words, in the process of revision, a writer may realize that the order of ideas is unclear. A new outline may help that writer re-order details. As they write, the writer considers and reconsiders the effectiveness of the work.

The writing process, then, is not just a mirror image of the thinking process: it is the thinking process. Confronted with a topic, an effective critical thinker/writer

  • asks questions
  • seeks answers
  • evaluates evidence
  • questions assumptions
  • tests hypotheses
  • makes inferences
  • employs logic
  • draws conclusions
  • predicts readers’ responses
  • creates order
  • drafts content
  • seeks others’ responses
  • weighs feedback
  • criticizes their own work
  • revises content and structure
  • seeks clarity and coherence

Example of Composition as Critical Thinking

“Good writing is fueled by unanswerable questions” (Lane, 1993, p. 15).

Imagine that you have been asked to write about a hero or heroine from history. You must explain what challenges that individual faced and how they conquered them. Now imagine that you decide to write about Rosa Parks and her role in the modern Civil Rights movement. Take a moment and survey what you already know. She refused to get up out of her seat on a bus so a White man could sit in it. She was arrested. As a result, Blacks in Montgomery protested, launching the Montgomery Bus Boycott. Martin Luther King, Jr. took up leadership of the cause, and ultimately a movement was born.

Is that really all there is to Rosa Parks’s story? What questions might a thoughtful writer ask? Here are a few:

  • Why did Rosa Parks refuse to get up on that particular day?
  • Was hers a spontaneous or planned act of defiance?
  • Did she work? Where? Doing what?
  • Had any other Black person refused to get up for a White person?
  • What happened to that individual or those individuals?
  • Why hadn’t that person or those persons received the publicity Parks did?
  • Was Parks active in Civil Rights before that day?
  • How did she learn about civil disobedience?

Even just these few questions could lead to potentially rich information.

Factual information would not be enough, however, to satisfy an assignment that asks for an interpretation of that information. The writer’s job for the assignment is to convince the reader that Parks was a heroine; to do so, the writer must make an argument and support it. The writer must establish standards of heroic behavior. More questions arise:

  • What is heroic action?
  • What are the characteristics of someone who is heroic?
  • What do heroes value and believe?
  • What are the consequences of a hero’s actions?
  • Why do they matter?

Now the writer has even more research and more thinking to do.

By the time they have raised questions and answered them, raised more questions and answered them, and so on, they are ready to begin writing. But even then, new ideas will arise in the course of planning and drafting, inevitably leading the writer to more research and thought, to more composition and refinement.

Ultimately, every step of the way over the course of composing a project, the writer is engaged in critical thinking because the effective writer examines the work as they develop it.

Why Writing to Think Matters

Writing practice builds critical thinking, which empowers people to “take charge of [their] own minds” so they “can take charge of [their] own lives . . . and improve them, bringing them under [their] self command and direction” (Foundation for Critical Thinking, 2020, para. 12). Writing is a way of coming to know and understand the self and the changing world, enabling individuals to make decisions that benefit themselves, others, and society at large. Your knowledge alone – of law, medicine, business, or education, for example – will not be enough to meet future challenges. You will be tested by new, unexpected circumstances, and when they arise, the open-mindedness, flexibility, reasoning, discipline, and discernment you have learned through writing practice will help you meet those challenges successfully.

References

Forster, E. M. (1927). Aspects of the novel. Harcourt, Brace & Company.

The Foundation for Critical Thinking. (2020, June 17). Our concept and definition of critical thinking. https://www.criticalthinking.org/pages/our-concept-of-critical-thinking/411

Lane, B. (1993). After the end: Teaching and learning creative revision. Heinemann.

Rimer, S. (2011, January 18). Study: Many college students not learning to think critically. The Hechinger Report. https://www.mcclatchydc.com/news/nation-world/national/article24608056.html

Zinsser, W. (1976). On writing well: The classic guide to writing nonfiction. HarperCollins.


7 The Writing Process

Kathy Boylan

Communication skills, including writing, are among the most important soft skills (employable skills that have more to do with emotional intelligence, such as common sense, communication, problem-solving, and collaboration) that students learn in college, because most professions require high competency in written communication. With emails, memos, letters, texts, and even tweets, most people spend a fair amount of time at work communicating via the written word, which can be a chance for one to shine or to falter. Whether you are messaging a colleague, writing to your manager, creating the company newsletter, or writing a press release to the media, your writing skills can easily boost or hinder your career, even if you do not have a “writing” profession. Basically, writing skills make a difference in how you are perceived in college and in the workplace. That is why it is important to follow expected guidelines, use the steps of the writing process, and make sure that all of your writing is coherent, concise, credible, and correct.

1. What is the writing process?

2. What is prewriting?

3. What is a thesis statement?

4. How to organize and arrange.

5. How to write a rough draft.

6. What is revising?

7. What is done during editing & proofreading & formatting?

8. What are other types of academic writing?

1.   What is the writing process?

No matter what type of writing you are doing, academic writing, professional writing, or personal writing, it can be made easier by using the writing process.  The writing process consists of the different stages that a writer follows to produce a good piece of writing.  Although different sources may label and group the stages in various ways, the stages of the writing process are essentially as follows:

Prewriting – Deciding what to write about (the topic) and gathering information to support or explain what you want to say about your subject, and planning how to organize your ideas in a way that effectively develops the topic.

Drafting – Writing the first copy of the piece (essay, article, etc.). This is often called the rough draft. Ultimately, you should have multiple copies or drafts of your work.

Revising – Reconsidering the ideas and content of the essay as well as refining the style and structure of the paper.

Editing/Proofreading – Correcting grammar, punctuation, spelling, and mechanics.

Publishing – Sharing the final draft with others.

Figure 4.1 The Writing Process


However, the writing process is not a series of neatly developed steps and may differ somewhat for everyone.  Sometimes ideas do not flow easily, and the essay that you originally start out to write is not the essay that you end up writing.  Often the stages proceed erratically and overlap; the important thing is to keep writing and improving until a final product is achieved.  The more that you write, the better you will become as a writer.

2.   What is prewriting?

Prewriting describes all of the thinking and planning that precedes the actual writing of a paper.

Much careful thought needs to be given to the assignment in general at the beginning of prewriting before focusing on your topic.

  • First, understand the writing assignment and its limits. Consider the assignment’s length. Always know the expected length of a writing assignment. A two-page paper has a much narrower topic than a ten-page paper would have. If there is no page limit, consider the nature of the assignment to suggest its length. A summary of a chapter will be much shorter than the original chapter. An analysis of a poem may well be longer than the poem itself.
  • Next, determine the assignment’s purpose. Most writing has one of three general purposes:
  • Expressive writing conveys personal feelings or impressions to the audience.
  • Informative writing enlightens the audience about something.
  • Persuasive writing attempts to convince the audience to think or act in a certain way.

Other more specific purposes can include entertaining, analyzing, hypothesizing, assessing, summarizing, questioning, reporting, recommending, suggesting, evaluating, describing, recounting, requesting, and instructing.

  • Next, determine the assignment’s audience .  You must determine to whom you are writing.  An audience can be an individual or a group.  An audience can be general or specialized.  Once you define your audience, you must determine how much the audience already knows about the subject to know how much or little background information should be included. You should also determine how best to approach your audience in terms of language, rhetorical strategies, purposes for reading, and background knowledge.
  • Then consider the assignment’s occasion. The occasion for which you are writing will determine the formality and scope of a writing project. An in-class writing assignment will differ from an out-of-class formal assignment. A memo for fellow office workers will differ from a report written for the company’s president. A letter to an aunt will differ from a letter written to a bank to request a personal loan.
  • Finally, assess your own previous knowledge of the subject . Before writing, you need to determine what you already know about a subject, what you need to find out about the subject, and what you think about the subject.  Personal essays draw upon your own experiences and observations; research essays require you to gain new knowledge through research.

Topic Choice

The next step in prewriting, and often the hardest, is choosing a topic for an essay if one has not been assigned.  Choosing a viable general topic for an assignment is an essential step. Sometimes your instructor will give you an idea to begin an assignment, and other times your instructor will ask you to come up with a topic on your own. A captivating topic covers what an assignment will be about and fits the assignment’s purpose and its audience.  There are various methods you may use to discover an appropriate topic for your writing.

Using Experience and Observations

When selecting a topic, you may also want to consider something that interests you or something based on your own life and personal experiences. Even everyday observations can lead to interesting topics. After writers think about their experiences and observations, they often take notes on paper to better develop their thoughts. These notes help writers discover what they have to say about their topic.

Have you seen an attention-grabbing story on your local news channel? Many current issues appear on television, in magazines, and on the Internet. These can all provide inspiration for your writing. Our library’s database Issues and Controversies (http://infoguides.virginiawestern.edu/az.php) is a first-rate source.

Reading plays a vital role in all the stages of the writing process, but it first figures in the development of ideas and topics. Different kinds of documents can help you choose a topic and develop that topic. For example, a magazine advertising the latest research on the threat of global warming may catch your eye in the supermarket. This cover may interest you, and you may consider global warming as a topic, or maybe a novel’s courtroom drama sparks your curiosity about a particular lawsuit or legal controversy. After you choose a topic, critical reading is essential to the development of that topic. While reading almost any document, you evaluate the author’s point of view by thinking about the main idea and its support. When you judge the author’s argument, you discover more about the author’s opinion as well as your own. If these steps already seem daunting, remember that even the best writers need to use prewriting strategies to generate ideas.

The steps in the writing process may seem time consuming at first, but following these steps will save you time in the future. The more you plan in the beginning by reading and using prewriting strategies, the less time you may spend writing and editing later because your ideas will develop more swiftly.  Prewriting strategies depend on your critical reading skills. Reading prewriting exercises (and outlines and drafts later in the writing process) will further develop your topic and ideas. As you continue to follow the writing process, you will see how to use critical reading skills to assess your own prewriting exercises.

Freewriting

Freewriting (also called brainstorming) is an exercise in which you write freely (jot, list, write paragraphs, dialog, take off on tangents: whatever “free” means to you) about a topic for a set amount of time (usually three to five minutes or until you run out of ideas or energy). Jot down any thoughts that come to your mind. Try not to worry about what you are saying, how it sounds, whether it is good or true, grammar, spelling, or punctuation. Instead, write as quickly as you can without stopping. If you are stuck, just copy the same word or phrase repeatedly until you come up with a new thought or write about why you cannot continue. Just keep writing; that is the power of this technique!

Writing often comes easier when you have a personal connection with the topic. Remember, to generate ideas in your freewriting, think about readings that you have enjoyed or that have challenged your thinking. Then write about it. Doing this may lead your thoughts in interesting directions.  Quickly recording your thoughts on paper will help you discover what you have to say about a topic. When writing quickly, try not to doubt or question your ideas, but if you do, write those, too. Allow yourself to write freely and unselfconsciously. Once you start writing with few limitations, you may find you have more to say than you first realized. Your flow of thoughts can lead you to discover more ideas about the topic as well as different perspectives on it. Freewriting may even lead you to discover another topic that excites you even more than your original idea.  Freewriting can also be used to narrow a topic and/or to develop supporting ideas once a broad topic has been chosen.

Journaling is another useful strategy for generating topic and content ideas. A journal can be a place to explore different topic ideas and a source of possible topics for future papers.

Some prewriting strategies can be used together. For example, you could use experience and observations to come up with a topic related to your course studies. Then you could use freewriting to describe your topic in more detail and figure out what you have to say about it.

Focusing the Topic

Once a general topic has been assigned to or chosen by you, you must decide on the scope of the topic. Broad topics always need to be narrowed down to topics that are more specific. Then you need to determine what you are going to say about the subject. Two ways to help narrow a general subject to a more specific topic are probing and focused freewriting.

  • Probing is asking a series of questions about the topic. Who? What? Where? When? Why? How? As you choose your topic, answering these questions can help you revisit the ideas you already have and generate new ways to think about your topic. You may also discover aspects of the topic that are unfamiliar to you and that you would like to learn more about. All these idea-gathering techniques will help you plan for future work on your assignment.

For example, if you were writing about tattoos, then you might ask yourself the following questions: Who do you know that has tattoos, or who are some celebrities with memorable tattoos? What kinds of tattoos do people usually get–what symbols and what words? Where do people place tattoos on their bodies, or where do people go to get tattoos–tattoo parlors? When do people get tattoos–is it after some memorable event or life stage? Why do people get tattoos? Finally, how do people get tattoos–what is the actual process?

  • Focused Freewriting is freewriting again and again, with each freewriting cycle becoming more focused (also called looping); it can yield a great deal of useful material. Try this by taking the most compelling idea from one freewriting and starting the next with it.

Developing a Topic

The following checklist can help you decide if your narrowed topic is a possible topic for your assignment:

  • Why am I interested in this topic?
  • Would my audience be interested and why?
  • Do I have prior knowledge or experience with this topic? If so, would I be comfortable exploring this topic and sharing my experiences?
  • Why do I want to learn more about this topic?
  • Is this topic specific? What specifics or details about this topic stand out to me?
  • Does it fit the purpose of the assignment, and will it meet the required length of the assignment?

3. What is a thesis statement?

Once the topic has been narrowed to a workable subject, determine what you are going to say about it; you need to come up with your controlling or main idea. A thesis is the main idea of an essay. It communicates the essay’s purpose with clear and concise wording and indicates the direction and scope of the essay. It should not just be a statement of fact, nor should it be an announcement of your intentions. It should be an idea, an opinion of yours that needs to be explored, expanded, and developed into an argument.

A thesis statement:

  • tells the reader how you will interpret the significance of the subject matter under discussion.
  • is a road map for the paper; in other words, it tells the reader what to expect from the rest of the paper.
  • is an interpretation of a question or subject, not the subject itself. The subject, or topic, of an essay might be World War II or Moby Dick; a thesis must then offer a way to understand the war or the novel.
  • makes a claim that others might dispute.
  • is usually a single sentence somewhere in the introductory paragraph that presents the writer’s argument to the reader. However, as essays get longer, a single sentence is usually not enough to contain a complex thesis. The rest of the paper, the body of the essay, gathers and organizes evidence that will persuade readers of the logic of the writer’s interpretation.

If an assignment asks you to take a position or develop a claim about a subject, you may need to convey that position or claim in a thesis statement near the beginning of your draft. The assignment may not explicitly state that the writer needs a thesis statement because the instructor may assume the writer will include one. When an assignment asks you to analyze, to interpret, to compare and contrast, to demonstrate cause and effect, or to take a stand on an issue, it is likely that you are being asked to develop a thesis and to support it persuasively.

How do I get a thesis?

A thesis is the result of a lengthy thinking process. Formulating a thesis is not the first thing you do after reading an essay assignment. Before you develop an argument on any topic, you have to collect and organize evidence, look for possible relationships between known facts (such as surprising contrasts or similarities), and think about the significance of these relationships. (See chapter on argument for more detailed information on building an argument.)   Once you have done this thinking, you will probably have a “working thesis,” a basic or main idea, an argument that you can support with evidence.  It is deemed a “working thesis” because it is a work in progress, and it is subject to change as you move through the writing process.  Writers use all kinds of techniques to stimulate their thinking and to help them clarify relationships or comprehend the broader significance of a topic to arrive at a thesis statement.

For example, there is the question strategy. One way to start identifying and narrowing a thesis idea is to form a question that you want to answer; for example, “Do cats have a positive effect on people with depression? If so, what are three effects?” The question sends you off to explore for answers. You then begin developing support. The first answer you might find is that petting cats lowers blood pressure, and you might then ask how that works. From your findings (research, interviews, background reading, etc.), you might detail how that happens physically, or you might describe historical evidence. You could explain medical research that illustrates the concept. Then you have your first supporting point – as well as the first prong of your thesis: Cats have a positive effect on people with depression because they can lower blood pressure . . . . When you start with a specific question and find the answers, the argument falls into place. The answer to the question becomes the thesis, and how the answer was conceived becomes the supporting points (and, usually, the topic sentences for each point).

How do I know if my thesis is strong?

If there is time, run it by the instructor or make an appointment at the Composition and Literature Center to get some feedback (https://www.howardcc.edu/programs-courses/academics/academic-divisions/english-world-languages/resources/). Even if you do not have time to get advice elsewhere, you can do some thesis evaluation of your own.

When reviewing the first draft and its working thesis, ask the following:

  • Is my thesis statement an opinion, and is it a complete thought? Beware of posing a question as your thesis statement. Your thesis should answer a question that the audience may have about your topic. Also, be sure that your thesis statement is a complete sentence rather than just a phrase stating your topic.
  • Have I taken a position that others might challenge or oppose? If your thesis simply states facts that no one would, or even could, disagree with, it is possible that you are simply providing a summary, rather than making an argument.
  • Is my thesis statement provable? Can I establish its validity through the evidence and explanation that I offer in my essay?
  • Is my thesis statement specific? Thesis statements that are too vague often do not have a strong argument. If your thesis contains words like “good” or “successful,” see if you could be more specific: Why is something “good”? What specifically makes something “successful”?
  • Does my thesis pass the “So what?” test? If a reader’s first response is, “So what?” then you need to clarify, to forge a relationship, or to connect to a larger issue.
  • Does my essay support my thesis specifically and without wandering? If your thesis and the body of your essay do not seem to go together, one of them has to change. It is okay to change your working thesis to reflect things you have figured out in the course of writing your paper. Remember, always reassess and revise your writing as necessary.
  • Does my thesis pass the “how and why?” test? If a reader’s first response is “how?” or “why?” your thesis may be too open-ended and lack guidance for the reader. See what you can add to give the reader a better take on your position right from the beginning.

To create a thesis statement, simply follow this formula:

TOPIC  +  CLAIM  =  THESIS STATEMENT

  • Animals + Dogs make better pets than cats. = When it comes to animals, dogs make better pets than cats because they are more trainable, more social, and more empathetic.
  • Movies & Emotions + Titanic evoked many emotions. = The movie Titanic evoked many emotions from its audience.
  • Arthur Miller & Death of a Salesman + Miller’s family inspired the Loman family. = Arthur Miller’s family and their experiences during the Great Depression inspired the creation of the Loman family in his play Death of a Salesman.

For more information on bad, good, and better thesis statements from the writing center at the University of Evansville, go here (https://tinyurl.com/y8sfjale).

Exercise: Creating Effective Thesis Statements

Using the formula, create effective thesis statements for the following topics:

  • Drone Technology
  • Helicopter Parents

Then have a partner check your thesis statements to see if they pass the tests to be strong thesis statements.

Once a working thesis statement has been created, it is time to begin building the body of the essay. Get all of the key supporting ideas written down, and then you can begin to flesh out the body paragraphs by reading, asking, observing, researching, connecting personal experiences, etc. Use the information below to maintain the internal integrity of the paragraphs and smooth the flow of your ideas.

4. How to organize and arrange?

Once you have generated supporting ideas for the main idea of your paper, you need to arrange those ideas in some type of order. Clustering and outlining can help organize the ideas.

Clustering (also called idea mapping) is a way of visually arranging ideas. Begin clustering by writing the topic in the center of a sheet of paper. Circle the topic, and then surround it with words and phrases that identify the major points to be discussed in the paper. Continue the process until all supporting details and secondary details have been listed. Many writers like this method because the shapes show how the ideas relate or connect, and writers can find a focused topic from the connections mapped. Using clustering, you might discover connections between topics that you had not thought of before.

Figure 4.2 Clustering


Outlining lists the major supporting details in a tentative order and includes secondary supporting details.

Figure 4.3 Traditional Formal Outline


Before you write, you need to decide how to organize your ideas. You need to determine the rhetorical mode(s) that will be used and the order of the supporting ideas. Broadly speaking, there are nine basic rhetorical modes: narration, description, exemplification, process, cause and effect, comparison and contrast, classification and division, definition, and argument. However, most complex writing begins with an argument of some sort and then uses a combination of modes to relay one’s message.

Structure of a Paragraph and Essay

All formal paragraphs and essays have a title, a beginning or introduction, a middle (a body of supporting paragraphs), and an end or conclusion.

A title sits at the top of your paragraph or essay, but it is often the last thing you create because until the paper is complete, you do not really know what your final product will be. A good title makes people want to read your paper; it does not merely repeat the topic sentence or thesis statement; it hints at your main idea. It is not a complete sentence but a phrase or phrases that indicate your topic.

An effective introduction captures your readers’ attention and arouses their curiosity. In a paragraph, it is often your topic sentence; in an essay, it usually includes your thesis statement, which narrows your subject, claims something specific and significant, and conveys your purpose and often your form of organization. You can include a question, tell a story, use a quotation, give interesting facts or statistics, give background information, or outline a problem and/or a solution. Do not tell the reader what your topic is; show them. Do not be vague and mysterious. Do not refer back to your title. Do not apologize for what you are about to say, and be original. The important thing is that you hook your readers’ attention and motivate them to continue reading.

Your body of supporting evidence should be organized, unified, and coherent. The support can be organized using chronological order, spatial order, or emphatic order. Each supporting paragraph should have its own topic sentence and be developed with valuable supporting details. In an essay, the supporting ideas should support your thesis statement. Use transitional words or phrases to establish connections between paragraphs and different ideas. Use parallel structure throughout your paper, and use repetition sparingly and only when it is effective and necessary. Be consistent in tense, number, and person throughout your paper as well. The entire body of supporting evidence should focus on supporting your main idea without straying off topic or including unrelated ideas.

Your conclusion should let readers know that you are finished and not leave them with any unanswered questions. It may issue a call to action, or it may simply summarize a long and complex paper. The conclusion may repeat some of the ideas from the introduction, but it should not be a replica of that paragraph. It may restate your main idea. The conclusion can be either hopeful or hopeless, depending on the mood of your paper. You may leave your reader with some final important facts, a compelling example, or a final visual image. It is important that you do not go off in a new direction in your conclusion. Do not make sweeping generalizations, and again, do not apologize for any of your ideas. Once these arrangements and ideas have been decided, construct an outline.

Figure 4.4 The Essay Structure


Using a Clear Organizational Pattern

Depending on your topic, you might find it beneficial to use one of these common organizational patterns, either within individual paragraphs or within the entire essay:

Process analysis

A process analysis paragraph is used to describe how something is made or to explain the steps for how something is done.

The first key to growing good tomatoes is to give the seedlings plenty of room. Make sure to transplant them to small pots once they get their first leaves. Even when they are just starting out in pots, they need plenty of light, air, and heat. Make sure to warm the ground in advance by covering it in plastic sheeting for a couple of weeks. When you are ready to plant them in soil, plant them deeply enough so they can put down some strong roots. Mulch next, and once the stems of the tomato plants have reached a few inches in height, cut off the lower leaves to avoid fungi. Carefully prune the suckers that develop in the joints of the developing stems.

Chronological

Chronological arrangement presents information in time order.

As soon as I arrived at the farmers’ market, I bought a large bag of lettuce. I walked around the corner and saw the biggest, most gorgeous sunflower I had ever seen. I bought it and added it to my lettuce bag. The flower was so big that I had to hold the bag right in front of me to keep it from being bumped. At the Wilson Pork Farm booth, I tasted a little pulled pork. You guessed it—I had to buy a quart of it. I went on with a plastic quart container in my left hand and my lettuce and flower in my right hand. I was handling it all just fine until I saw a huge hanging spider plant I had to have. Ever so gently, I placed my pulled pork container inside the spider fern plant pot. Now I was holding everything right in front of me as I tried to safely make my way through the crowd. That is when I met up with little Willie. Willie was about seven years old, and he was playing tag with his brother. I am not sure where their mother was, but Willie came running around the corner and smacked right into me. You are probably thinking that poor Willie had pulled pork all over his clothes and an upside-down plant on his head, but no, not at all. I was the one. Willie didn’t even notice. He was too busy chasing his brother.

General-to-specific

A common paragraph format is to present a general idea and then give examples.

The displays at the farmers’ market do not lack for variety. You will see almost every kind of fresh, locally grown food you can imagine. The featured fruits on a given day might be as varied as pomegranates, persimmons, guava, jackfruit, and citron. Vegetables might include shiitake mushrooms, artichokes, avocados, and garlic. Some vendors also sell crafts, preserves, seeds, and other supplies suitable for starting your own garden.

Specific-to-general

The reverse of the above format is to give some examples and then summarize them with a general idea.

Your sense of smell is awakened by eighteen varieties of fresh roma tomatoes. Your mouth waters at the prospect of sampling the fresh breads. Your eye catches a glimpse of the colors of handmade, embroidered bags. You linger to touch a perfectly ripe peach. Your ears catch the strain of an impromptu jug band. A walk up and down the aisles of your local farmers’ market will engage all of your senses.

Spatial

A paragraph using spatial organization presents details as you would naturally encounter them, such as from top to bottom or from the inside to the outside. In other words, details are presented based on their physical location.

From top to bottom, the spice booth at our farmers’ market is amazing. Up high, vendors display artwork painstakingly made with spices. At eye level, you see at least ten different fresh spices in small baggies. On the tabletop sits an assortment of tasting bowls with choices ranging from desserts to drinks to salads. Below the table, but out of the way of customers, are large bags of the different spices. Besides being a great use of space, the spice booth looks both professional and charming.

5. How to Write a Rough Draft

Make the Writing Process Work for You! What makes the writing process beneficial to writers is that it encourages alternatives to standard practices and motivates you to develop your best ideas. For instance, the following approaches, done alone or in combination with others, may improve your writing and help you move forward in the writing process:

  • Begin writing with the part you know the most about. The purpose of a first draft is to get ideas down on paper that can then be revised. Consider beginning with the body paragraphs and drafting the introduction and conclusion later. You can start with the third point in your outline if ideas come easily to mind, or you can start with the first or second point. Although paragraphs may vary in length, keep in mind that short paragraphs may contain insufficient support, and readers may find the writing abrupt. Long paragraphs may be wordy and may lose your reader’s interest. As a guideline, try to write paragraphs longer than one sentence but shorter than the length of an entire double-spaced page.
  • Write one supporting point at a time and then stop. As long as you complete the assignment on time, you may choose how many paragraphs you complete in one sitting. Pace yourself. On the other hand, try not to procrastinate. Writers should always meet their deadlines.
  • Take short breaks to refresh your mind. This tip might be most useful if you are writing a multipage report or essay. Still, if you are antsy or cannot concentrate, take a break to let your mind rest, but do not let breaks extend too long. If you spend too much time away from your essay, you may have trouble starting again. You may forget key points or lose momentum. Try setting an alarm to limit your break, and when the time is up, return to your desk to write.
  • Be reasonable with your goals. If you decide to take ten-minute breaks, try to stick to that goal. If you told yourself that you need more facts, then commit to finding them. Holding yourself to your own goals will create successful writing assignments.
  • Keep your audience and purpose in mind as you write. These aspects of writing are just as important when you are writing a single paragraph for your essay as when you are considering the direction of the entire essay.
  • Of all of these considerations, keeping your purpose and your audience at the front of your mind is key to writing success. If your purpose is to persuade, for example, you will present your facts and details in the most logical and convincing way you can for the particular audience you have in mind. If your audience values logic, points that rely on reason, facts, documented information, and the like will provide the persuasion to which those readers best respond. Some writers find it useful to keep the purpose and audience at the top of every page, highlighted in some way, as a reminder of the targets of each point.
  • Your purpose will guide your mind as you compose your sentences. Your audience will guide word choice. Are you writing for experts, for a general audience, for other college students, or for people who know very little about your topic? Keep asking yourself what your readers, with their background and experience, need to know to understand your ideas. How can you best express your ideas, so they are meaningful and memorable and your communication is effective?
  • Write knowing that the revision and editing processes lie ahead, so leave plenty of time for those stages.

You may want to identify your purpose and audience on an index card that you clip to your paper (or keep next to your computer). On that card, you may want to write notes to yourself—perhaps about what that audience might not know or what it needs to know—so that you will be sure to address those issues when you write. It may be a good idea to state exactly what you want to explain to that audience, or the subject about which you want to inform them or persuade them.

Writing at Work

Many of the documents you produce at work target a particular audience for a particular purpose. You may find that it is highly advantageous to know as much as you can about your target audience and to prepare your message to reach that audience, even if the audience is a coworker or your boss. Menu language is a common example. Descriptions like “organic romaine” and “free-range chicken” are intended to appeal to a certain type of customer though perhaps not to the same customer who craves a thick steak. Similarly, mail-order companies research the demographics of the people who buy their merchandise. Successful vendors customize product descriptions in catalogs to appeal to their buyers’ tastes. For example, the product descriptions in a skateboarder catalog will differ from the descriptions in a clothing catalog for mature adults.

Tips to Avoid Writer’s Block

Set up scheduled times to write, set deadlines for completing different parts of your essay, and avoid perfectionism; perfecting the draft comes later in the writing process.

Maintaining Internal Integrity of Paragraphs

A paragraph needs to provide links between its ideas. Here are some techniques you can put into practice.

Linkages

Paragraphs with unity flow well so that readers can follow along easily. You need to present an idea and then link the rest of the ideas in the paragraph together. Do not leave any unifying for your readers to do mentally. Do it all for them.

Not all the booths at a farmers’ market feature food. One couple has a booth that sells only fresh flowers. They display some flowers in antique containers and sell the flowers, the containers, or both. A clothesline above our heads displays a variety of dried flowers. A table holds about fifty vases of varying sizes, and they are all full of flowers. Some vases hold only one kind of long-stem flowers. Others hold mixtures of uncut flowers. Still others display gorgeous arrangements. Both the man and the woman wear a wreath of flowers on their heads. The whole display is so attractive and smells so fabulous that it really draws people.

Parallelism

Parallelism means that you maintain the same general wording and format for similar situations throughout the paragraph so that once readers figure out what is going on, they can easily understand the whole paragraph.

The history of this farmers’ market followed a typical pattern. It started out in the 1970s as a co-op of local farmers, featuring a small city block of modest tables and temporary displays every Saturday morning from April to October from 9 a.m. to 1 p.m. In the early 1990s, with the help of a grant from the city, the market expanded its footprint to a larger, more centrally located city block with ample parking. It benefited greatly from the installation of permanent booths, electrical outlets, and a ready water supply. These amenities drew far more customers and merchants. Its popularity reached unprecedented levels by 2000, when the city offered to help with the staffing needed to keep it open from 9 a.m. to 5 p.m. on Saturdays and from noon to 5 p.m. on Sundays. Recently, discussions began about how to open the market on weeknights in the summer from 5 p.m. to 8 p.m.

Consistency

A paragraph with consistency uses the same point of view and the same verb tense throughout. In other words, if you are using third person in the beginning of the paragraph, you use it throughout the paragraph. If you are using present tense to start the paragraph, you stick with it.

There comes a time each year when you must begin the all-important step of actually harvesting your vegetable garden. You will want to pick some of your vegetables before they are fully ripe. Eggplants, cucumbers, and squash fall into this category because they can further ripen once you have picked them. On the other hand, you will find that tomatoes, pumpkins, and most melons really need to ripen fully before you harvest them. You should also keep in mind that you will need plenty of storage space for your bounty. If you have a good harvest, you might want to have a few friends in mind, especially as recipients for your squash and cucumbers.

Using Transitions

Transitions within paragraphs are words that connect one sentence to another so that readers can follow the intended meanings of sentences and relationships between sentences. Transitions may also smooth the flow between body paragraphs. The following table shows some commonly used transition words:

Commonly Used Transition Words

To compare/contrast

after that, again, also, although, and then, but, despite, even though, finally, first/second/third/etc., however, in contrast, in the same way, likewise, nevertheless, next, on the other hand, similarly, then

To signal cause and effect

as a result, because, consequently, due to, hence, since, therefore, thus

To show sequence or time

after, as soon as, at that time, before, during, earlier, finally, immediately, in the meantime, later, meanwhile, now, presently, simultaneously, so far, soon, until, then, thereafter, when, while

To indicate place or direction

above, adjacent to, below, beside, beyond, close, nearby, next to, north/south/east/west, opposite, to the left/right

To present examples

for example, for instance, in fact, to illustrate, specifically

To suggest relationships

and, also, besides, further, furthermore, in addition, moreover, too

6. What Is Revising?

Once a rough draft is created, take some time to step away from the essay to gain a fresh perspective. Then begin revising. Revising means reexamining and rethinking the first draft: adding and deleting ideas extensively; rearranging ideas, sentences, or paragraphs; and rewriting sentences and paragraphs for more variety, better flow, and more precise word choices. Oftentimes, you may write three or four drafts before you are finally satisfied with a final draft. For easier revision, follow these tips:

  • Take time between the first draft and the later revisions so that you can approach the draft more objectively.
  • Revise on hard copy rather than on the computer screen. Do not delete any drafts; do label each successive one. Allow yourself and others to annotate (comment on and offer suggestions for improving) your draft.
  • Read the draft aloud.  Better yet, have someone else read it aloud.
  • Take advantage of opportunities to get feedback; however, do not become overwhelmed by feedback.
  • Do not allow ego to get in the way of a successful paper.
  • Revise for overall meaning and structure. Does the essay develop a central point clearly and logically, and are the purpose, tone, and point of view suited to the essay’s audience?
  • Revise for paragraph development. Check that your paragraphs are logically ordered, unified, and specific.
  • Revise sentence structure. Make your sentences consistent with your overall tone, varied in type and length, emphatic, and economical.
  • Finally, revise for word choices. Aim for an appropriate level of diction, word choices that do not overstate or understate, specific rather than general terms, strong verbs, only necessary modifiers, and original and nonsexist language.
  • When you get your essays back, read and heed your instructor’s comments. They can help improve your future essays. If you do not understand your grade or the comments, schedule a conference to discuss them with your instructor. As you revise future essays, revisit the mistakes you made before and be sure you avoid repeating them.

7. What Is Done during Editing, Proofreading, and Formatting?

  • To edit, search for grammatical errors, check punctuation, check spelling, and look over sentence style and word choices one last time. See Chapters 8, 9, and 10 for additional help.
  • To proofread, look for surface errors, such as typos, incorrect spacing, or formatting problems.
  • To format, be sure that you are following the formatting style your instructor requires, whether it is Modern Language Association (MLA), American Psychological Association (APA), etc. For more information, see the Brown Library Citation Styles Guide (http://infoguides.virginiawestern.edu/citations).
  • Overall, look carefully for any error, large or small, that may weaken the essay’s message or undermine its credibility.

8. What Are Other Types of Academic Writing?

There are many different types of writing that you will be asked to produce during your academic and professional careers. Always be clear about what your boss or professor expects in an assignment before you begin writing. Below is just a sample of the various assignments you may be given:

Personal/reflective writing assignment – personal expression about an experience, event, situation, or information.

Expository writing assignment – writing that explains, describes, or informs.

Case study – a written report about a situation, group, or person that one has studied.

Review – writing that summarizes as well as analyzes the strengths and weaknesses of a piece of writing, a show, or an event.

Technical report – a clear, detailed report of the procedures undertaken and the results obtained during a scientific or technical procedure.

Lab report – writing that details the steps taken and the results of a scientific experiment.

Book report – writing that summarizes the contents of a book along with some commentary on the writer’s opinion of the book.

Critical analysis/critique – an informed review and an analysis of the significance of a piece of writing or an event.

Bibliography – a full list of all resources consulted during a research project.

Annotated bibliography – a list of all resources consulted for a research project that also includes a summary and analysis of each resource.

Literature review – writing that focuses on a specific research topic and the critical aspects of the literature consulted during the research process.

Research paper – the final product of an extended period of research, critical thinking, and composition that encompasses the writer’s own ideas supported by a combination of primary and secondary sources.

E-mail – writing composed and delivered via electronic mail.

Web writing – web content, which needs to be direct, concise, and credible.

Oral presentation of written report – an effective summary of a project to be delivered in front of an audience; may include visual aids.

Midterm/final exam essay – exams often include short essay questions that must be written in a limited amount of time.

Resume and other business writing – writing that must communicate pertinent information in a concise, easy-to-read format.

Key Takeaways

  • All writers rely on steps and strategies to begin the writing process.
  • The steps in the writing process are prewriting, drafting, revising, editing/proofreading, and publishing.
  • Prewriting is the transfer of ideas from abstract thoughts into words, phrases, and sentences on paper.
  • A good topic interests the writer, appeals to the audience, and fits the purpose of the assignment. Writers often choose a general topic first and then narrow the focus to a more specific topic.
  • A strong thesis statement is key to having a focused and unified essay.
  • Rough drafts are opportunities to get ideas down onto paper to get a first look at how your ideas will work together.
  • Revising improves your writing in terms of supporting ideas, organization, sentence flow, and word choice.
  • Editing spots and corrects any errors in grammar, mechanics, spelling, and formatting.
  • Regardless of the type of assignment you may be given in college or in work, it benefits you to follow a writing process, to put in the work necessary to understand your subject and audience, and to communicate your ideas confidently and coherently.

CC Licensed Content, Shared Previously

English Composition I, Lumen Learning, CC-BY 4.0.

Rhetoric and Composition, John Barrett, et al., CC-BY-SA 3.0.

Writing for Success, CC-BY-NC-SA 3.0.

Image Credits

Figure 4.1 “The Writing Process,” Kalyca Schultz, Virginia Western Community College, CC-0.

Figure 4.2 “Clustering,” Kalyca Schultz, Virginia Western Community College, CC-0.

Figure 4.3 “Traditional Formal Outline,” Kalyca Schultz, Virginia Western Community College, CC-0.

Figure 4.4 “The Essay Structure,” Kalyca Schultz, Virginia Western Community College, CC-0.

Figure 4.5 “Point of View,” Kalyca Schultz, Virginia Western Community College, CC-0.

The Writing Process Copyright © 2021 by Kathy Boylan is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.


CBE Life Sci Educ, v.6(2), Summer 2007

Learning to Improve: Using Writing to Increase Critical Thinking Performance in General Education Biology

Ian J. Quitadamo* and Martha J. Kurtz†

*Department of Biological Sciences, Central Washington University, Ellensburg, WA 98926-7537

†Department of Chemistry, Central Washington University, Ellensburg, WA 98926-7539

Increasingly, national stakeholders express concern that U.S. college graduates cannot adequately solve problems and think critically. As a set of cognitive abilities, critical thinking skills provide students with tangible academic, personal, and professional benefits that may ultimately address these concerns. As an instructional method, writing has long been perceived as a way to improve critical thinking. In the current study, the researchers compared critical thinking performance of students who experienced a laboratory writing treatment with those who experienced traditional quiz-based laboratory in a general education biology course. The effects of writing were determined within the context of multiple covariables. Results indicated that the writing group significantly improved critical thinking skills whereas the nonwriting group did not. Specifically, analysis and inference skills increased significantly in the writing group but not the nonwriting group. Writing students also showed greater gains in evaluation skills; however, these were not significant. In addition to writing, prior critical thinking skill and instructor significantly affected critical thinking performance, whereas other covariables such as gender, ethnicity, and age were not significant. With improved critical thinking skill, general education biology students will be better prepared to solve problems as engaged and productive citizens.

INTRODUCTION

A National Call to Improve Critical Thinking in Science

In the past several years, an increasing number of national reports indicate a growing concern over the effectiveness of higher education teaching practices and the decreased science (and math) performance of U.S. students relative to other industrialized countries ( Project Kaleidoscope, 2006 ). A variety of national stakeholders, including business and educational leaders, politicians, parents, and public agencies, have called for long-term transformation of the K–20 educational system to produce graduates who are well trained in science, can engage intelligently in global issues that require local action, and in general are better able to solve problems and think critically. Specifically, business leaders are calling for graduates who possess advanced analysis and communication skills, for instructional methods that improve lifelong learning, and ultimately for an educational system that builds a nation of innovative and effective thinkers ( Business-Higher Education Forum and American Council on Education, 2003 ). Education leaders are similarly calling for institutions of higher education to produce graduates who think critically, communicate effectively, and who employ lifelong learning skills to address important scientific and civic issues ( Association of American Colleges and Universities, [AACU] 2005 ).

Many college faculty consider critical thinking to be one of the most important indicators of student learning quality. In its 2005 national report, the AACU indicated that 93% of higher education faculty perceived analytical and critical thinking to be an essential learning outcome (AACU, 2005), whereas 87% of undergraduate students indicated that college experiences contributed to their ability to think analytically and creatively. This same AACU report showed that only 6% of undergraduate seniors demonstrated critical thinking proficiency based on Educational Testing Service standardized assessments from 2003 to 2004. During the same time frame, data from the ACT Collegiate Assessment of Academic Proficiency test showed a similar trend, with undergraduates improving their critical thinking less than 1 SD from freshman to senior year. Thus, it appears a discrepancy exists between faculty expectations of critical thinking and students’ ability to perceive and demonstrate critical thinking proficiency using standardized assessments (AACU, 2005).

Teaching that supports the development of critical thinking skills has become a cornerstone of nearly every major educational objective since the Department of Education released its six goals for the nation's schools in 1990. In particular, goal three of the National Goals for Education stated that more students should be able to reason, solve problems, and apply knowledge. Goal six specifically stated that college graduates must be able to think critically ( Office of Educational Research and Improvement, 1991 ). Since 1990, American education has tried—with some success—to make a fundamental shift from traditional teacher-focused instruction to more student-centered constructivist learning that encourages discovery, reflection, and in general is thought to improve student critical thinking skill. National science organizations have supported this trend with recommendations to improve the advanced thinking skills that support scientific literacy ( American Association for Higher Education, 1989 ; National Research Council, 1995 ; National Science Foundation, 1996 ).

More recent reports describe the need for improved biological literacy as well as international competitiveness ( Bybee and Fuchs, 2006 ; Klymkowsky, 2006 ). Despite the collective call for enhanced problem solving and critical thinking, educators, researchers, and policymakers are discovering a lack of evidence in existing literature for methods that measurably improve critical thinking skills ( Tsui, 1998 , 2002 ). As more reports call for improved K–20 student performance, it is essential that research-supported teaching and learning practices be used to better help students develop the cognitive skills that underlie effective science learning ( Malcom et al., 2005 ; Bybee and Fuchs, 2006 ).

Critical Thinking

Although they are not always transparent to many college students, the academic and personal benefits of critical thinking are well established; students who can think critically tend to get better grades, are often better able to use reasoning in daily decisions ( U.S. Department of Education, 1990 ), and are generally more employable ( Carnevale and American Society for Training and Development, 1990 ; Holmes and Clizbe, 1997 ; National Academy of Sciences, 2005 ). By focusing on instructional efforts that develop critical thinking skills, it may be possible to increase student performance while satisfying national stakeholder calls for educational improvement and increased ability to solve problems as engaged and productive citizens.

Although academics and business professionals consider critical thinking skill to be a crucial outcome of higher education, many would have difficulty defining exactly what critical thinking is. Historically, there has been little agreement on how to conceptualize critical thinking. Of the literally dozens of definitions that exist, one of the most organized efforts to define (and measure) critical thinking emerged from research done by Peter Facione and others in the early 1990s. Their consensus work, referred to as the Delphi report, was accomplished by a group of 46 leading theorists, teachers, and critical thinking assessment specialists from a variety of academic and business disciplines ( Facione and American Philosophical Association, 1990 ). Initial results from the Delphi report were later confirmed in a national survey and replication study ( Jones et al., 1995 ). In short, the Delphi panel expert consensus describes critical thinking as a “process of purposeful self-regulatory judgment that drives problem-solving and decision-making” ( Facione and American Philosophical Association, 1990 ). This definition implies that critical thinking is an intentional, self-regulated process that provides a mechanism for solving problems and making decisions based on reasoning and logic, which is particularly useful when dealing with issues of national and global significance.

The Delphi conceptualization of critical thinking encompasses several cognitive skills that include: 1) analysis (the ability to break a concept or idea into component pieces in order to understand its structure and inherent relationships), 2) inference (the skills used to arrive at a conclusion by reconciling what is known with what is unknown), and 3) evaluation (the ability to weigh and consider evidence and make reasoned judgments within a given context). Other critical thinking skills that are similarly relevant to science include interpretation, explanation, and self-regulation (Facione and American Philosophical Association, 1990). The concept of critical thinking includes behavioral tendencies or dispositions as well as cognitive skills (Ennis, 1985); these include the tendency to seek truth, to be open-minded, to be analytical, to be orderly and systematic, and to be inquisitive (Facione and American Philosophical Association, 1990). These behavioral tendencies also align closely with behaviors considered to be important in science. Thus, an increased focus on teaching critical thinking may directly benefit students who are engaged in science.

Prior research on critical thinking indicates that students' behavioral dispositions do not change in the short term (Giancarlo and Facione, 2001), but cognitive skills can be developed over a relatively short period of time (Quitadamo, Brahler, and Crouch, unpublished results). In their longitudinal study of behavioral disposition toward critical thinking, Giancarlo and Facione (2001) discovered that undergraduate critical thinking disposition changed significantly after two years. Specifically, significant changes in student tendency to seek truth and confidence in thinking critically occurred during the junior and senior years. Also, females tended to be more open-minded and have more mature judgment than males (Giancarlo and Facione, 2001). Although additional studies are necessary to confirm results from the Giancarlo study, existing research seems to indicate that changes in undergraduate critical thinking disposition are measured in years, not weeks.

In contrast to behavioral disposition, prior research indicates that critical thinking skills can be measurably changed in weeks. In their study of undergraduate critical thinking skill in university science and math courses, Quitadamo, Brahler, and Crouch (unpublished results) showed that critical thinking skills changed within 15 wk in response to Peer Led Team Learning (a national best practice for small group learning). This preliminary study provided some evidence that undergraduate critical thinking skills could be measurably improved within an academic semester, but provided no information about whether critical thinking skills could be changed during a shorter academic quarter. It was also unclear whether the development of critical thinking skills was a function of chronological time or whether it was related to instructional time.

Numerous studies provide anecdotal evidence for pedagogies that improve critical thinking, but much of the existing research relies on student self-report, which limits the scope of interpretation. From the literature it is clear that, although critical thinking skills are among the most valued outcomes of a quality education, additional research investigating the effects of instructional factors on critical thinking performance is necessary (Tsui, 1998, 2002).

Writing and Critical Thinking

Writing has been widely used as a tool for communicating ideas, but less is known about how writing can improve the thinking process itself (Rivard, 1994; Klein, 2004). Writing is thought to be a vehicle for improving student learning (Champagne and Kouba, 1999; Kelly and Chen, 1999; Keys, 1999; Hand and Prain, 2002), but is too often used as a means to regurgitate content knowledge and derive prescribed outcomes (Keys, 1999; Keys et al., 1999). Historically, writing has been thought to contribute to the development of critical thinking skills (Kurfiss and Association for the Study of Higher Education, 1988). Applebee (1984) suggested that writing improves thinking because it requires an individual to make his or her ideas explicit and to evaluate and choose among tools necessary for effective discourse. Resnick (1987) stressed that writing should provide an opportunity to think through arguments and that, if used in such a way, it could serve as a "cultivator and an enabler of higher order thinking." Marzano (1991) suggested that writing used as a means to restructure knowledge improves higher-order thinking. In this context, writing may provide an opportunity for students to think through arguments and use higher-order thinking skills to respond to complex problems (Marzano, 1991).

Writing has also been used as a strategy to improve conceptual learning. Initial work focused on how the recursive and reflective nature of the writing process contributes to student learning (Applebee, 1984; Langer and Applebee, 1985, 1987; Ackerman, 1993). However, conclusions from early writing to learn studies were limited by confounding research designs and mismatches between writing activities and measures of student learning (Ackerman, 1993). Subsequent work has focused on how writing within disciplines helps students to learn content and how to think. Specifically, writing within disciplines is thought to require deeper analytical thinking (Langer and Applebee, 1987), which is closely aligned with critical thinking.

The influence of writing on critical thinking is less defined in science. Researchers have repeatedly called for more empirical investigations of writing in science; however, few provide such evidence (Rivard, 1994; Tsui, 1998; Daempfle, 2002; Klein, 2004). In his extensive review of writing research, Rivard (1994) indicated that gaps in writing research limit its inferential scope, particularly within the sciences. Specifically, Rivard and others indicate that, despite the volume of writing students are asked to produce during their education, they are not learning to use writing to improve their awareness of thinking processes (Resnick, 1987; Howard, 1990). Existing studies are limited because writing has been used either in isolation or outside authentic classroom contexts. Factors like gender, ethnicity, and academic ability that are not directly associated with writing but may nonetheless influence its effectiveness have also not been sufficiently accounted for in previous work (Rivard, 1994).

A more recent review by Daempfle (2002) similarly indicates the need for additional research to clarify relationships between writing and critical thinking in science. In his review, Daempfle identified nine empirical studies that generally support the hypothesis that students who experience writing (and other nontraditional teaching methods) have higher reasoning skills than students who experience traditional science instruction. Of the relatively few noninstructional variables identified in those studies, gender and major did not affect critical thinking performance; however, the amount of time spent on and the explicitness of instruction to teach reasoning skills did affect overall critical thinking performance. Furthermore, the use of writing and other nontraditional teaching methods did not appear to negatively affect content knowledge acquisition (Daempfle, 2002). Daempfle justified his conclusions by systematically describing the methodological inconsistencies for each study. Specifically, incomplete sample descriptions, the use of instruments with insufficient validity and reliability, the absence of suitable comparison groups, and the lack of statistical covariate analyses limit the scope and generalizability of existing studies of writing and critical thinking (Daempfle, 2002).

Writing in the Biological Sciences

Biology's conceptual nature and its reliance on the scientific method as a means of understanding make it a natural field in which to teach critical thinking through writing. Some work has been done in this area, with the literature describing approaches that range from linked biology and English courses and writing across the biology curriculum to the directed use of writing to improve reasoning in biology courses (Ebert-May et al., 1997; Holyoak, 1998; Taylor and Sobota, 1998; Steglich, 2000; Lawson, 2001; Kokkala and Gessell, 2003; Tessier, 2006). In their work on integrated biology and English, Taylor and Sobota (1998) discussed several problem areas that affected both biology and English students, including anxiety and frustration associated with writing, difficulty expressing thoughts clearly and succinctly, and a tendency to have strong negative responses to writing critique. Although the authors delineate the usefulness of several composition strategies for writing in biology (Taylor and Sobota, 1998), it was unclear whether student data were used to support their recommendations. Kokkala and Gessell (2003) used English students to evaluate articles written by biology students. Biology students first reflected on initial editorial comments made by English students, and then resubmitted their work for an improved grade. In turn, English students had to justify their editorial comments with written work of their own. Qualitative results generated from a list of reflective questions at the end of the writing experience seemed to indicate that both groups of students improved editorial skills and writing logic. However, no formal measures of student editorial skill were collected before the biology-English student collaboration, so no definitive conclusions on the usefulness of this strategy could be made.

Taking a slightly different tack, Steglich (2000) informally assessed student attitudes in nonmajors biology courses and noted that writing produced positive changes in student attitudes toward biology. However, the author acknowledged that this work was not a research study. Finally, Tessier (2006) showed that students enrolled in a nonmajors ecology course significantly improved their technical writing skills and committed fewer errors of fact regarding environmental issues in response to a writing treatment. Attitudes toward environmental issues also improved (Tessier, 2006). Although this study surveyed students at the beginning and the end of the academic term and also tracked student progress during the quarter, instrument validity and reliability were not provided. The generalizability of results was further limited by an overreliance on student self-reports and a small sample size.

Each of the studies described above peripherally supports a relationship between writing and critical thinking. Although not explicitly an investigation of critical thinking, results from a relatively recent study support a stronger connection between writing and reasoning ability (Daempfle, 2002). Ebert-May et al. (1997) used a modified learning cycle instructional method and small group collaboration to increase reasoning ability in general education biology students. A quasi-experimental pretest/posttest control group design was used on a comparatively large sample of students, and considerable thought was given to controlling extraneous variables across the treatment and comparison groups. A multifaceted assessment strategy based on writing, standardized tests, and student interviews was used to quantitatively and qualitatively evaluate student content knowledge and thinking skill. Results indicated that students in the treatment group significantly outperformed control group students on reasoning and process skills as indicated by the National Association of Biology Teachers (NABT) content exam. Notably, student content knowledge did not differ significantly between the treatment and control sections, indicating that development of thinking skill did not occur at the expense of content knowledge (Ebert-May et al., 1997). Interview data indicated that students experiencing the writing and collaboration-based instruction changed how they perceived the construction of biological knowledge and how they applied their reasoning skills. Although the Ebert-May study is one of the more complete investigations of writing and critical thinking to date, several questions remain. Supporting validity and reliability data for the NABT test were not included in the study, making interpretation of results somewhat less certain. In addition, the NABT exam is designed to assess high school biology performance, not college performance (Daempfle, 2002). Perhaps more importantly, the NABT exam does not explicitly measure critical thinking skills.

Collectively, it appears that additional research is necessary to establish a more defined relationship between writing and critical thinking in science (Rivard, 1994; Tsui, 1998, 2002; Daempfle, 2002). The current study addresses some of the gaps in previous work by evaluating the effects of writing on critical thinking performance using relatively large numbers of students, suitable comparison groups, valid and reliable instruments, a sizable set of covariables, and statistical analyses of covariance. This study uses an experimental design similar to that of the Ebert-May et al. (1997) study but incorporates valid and reliable test measures of critical thinking that can be used both within and across different science disciplines.

Purpose of the Study

Currently there is much national discussion about increasing the numbers of students majoring in various science fields (National Research Council, 2003; National Academy of Sciences, 2005). Although this is a necessary and worthwhile goal, attention should also be focused on improving student performance in general education science because these students will far outnumber science majors for the foreseeable future. If college instructors want general education students to think critically about science, they will need to use teaching methods that improve student critical thinking performance. In many traditional general education biology courses, students are not expected to work collaboratively, to think about concepts as much as memorize facts, or to develop and support a written thesis or argument. This presents a large problem when one considers the societal role that general education students will play as voters, community members, and global citizens. By improving their critical thinking skills in science, general education students will be better able to deal with the broad scientific, economic, social, and political issues they will face in the future.

The problem addressed by this study was whether writing could improve student critical thinking performance in general education biology courses. How might writing in general education biology affect the analysis, inference, and evaluation skills that are inherent to critical thinking? What level of critical thinking skill do students bring to nonmajors biology courses? Can their critical thinking skills be measurably improved using writing? What other factors affect development of critical thinking skills? When do student critical thinking skills begin to change, and how much? In this study, the effect of writing on critical thinking performance was investigated using the California Critical Thinking Skills Test (CCTST) at the beginning (pretest) and end (posttest) of 10 sections of general education biology at a regional comprehensive university in the Pacific Northwest. Several research questions framed this investigation:

Does writing in laboratory affect critical thinking performance in general education biology? Does the development of analysis, inference, and evaluation skills differ between students who experience writing versus those who experience traditional laboratory instruction? What measurable effect do factors like gender, ethnicity, and prior thinking skill have on changes in critical thinking in general education biology? If critical thinking skills change during an academic quarter, when does that take place?

MATERIALS AND METHODS

Study context.

The study took place at a state-funded regional comprehensive university in the Pacific Northwest. All participants were nonmajor undergraduates who were taking biology to satisfy their general education science requirement. Ten total sections of general education biology offered over three academic quarters (one academic year) were included in the study. Four of the 10 sections implemented a writing component during weekly laboratory meetings (N = 158); six traditional quiz-based laboratory sections served as a nonwriting control group (N = 152). Only scores from students who had completed both the initial (pretest) and end-of-quarter (posttest) critical thinking assessments were included in the data analysis. A breakdown of participant demographics for the writing and nonwriting groups is provided in Table 1.
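Only students with both pretest and posttest scores entered the analysis. That matching step can be sketched as follows; the function name and the dict-based record layout are illustrative assumptions, not artifacts of the study:

```python
def matched_records(pretest, posttest):
    """Keep only students who completed both CCTST administrations.

    pretest, posttest: dicts mapping student id -> raw CCTST score.
    Returns (id, pre, post) tuples sorted by student id.
    """
    ids = sorted(set(pretest) & set(posttest))
    return [(sid, pretest[sid], posttest[sid]) for sid in ids]
```

Students who took only one of the two assessments simply drop out of the intersection, which mirrors the inclusion criterion described above.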

Demographics for the writing and nonwriting groups

Sample             Class distribution (%)                 Gender distribution (%)
                   Fr     So     Jr     Sr    2nd Sr      M      F
Writing (158)      44.9   33.5   15.2   3.8   2.5         38.6   61.4
No writing (152)   53.3   28.3   7.2    9.2   2.0         38.2   61.8
Overall (310)      49.0   31.0   11.3   6.5   2.3         38.4   61.6

Sample             Ethnic distribution (%)
                   Caucasian   Hispanic   African American   Native American   Asian   Other*
Writing (158)      84.8        1.9        2.5                0                 4.4     6.3
No writing (152)   81.6        4.6        1.3                1.3               5.9     5.3
Overall (310)      83.2        3.2        1.9                0.6               5.2     5.8

Demographic profile for the study sample; n values in parentheses.

* "Other" includes the "choose not to answer" response.

Each course section included a lecture component offered four times per week for 50 min and a laboratory component that met once a week for 2 h. Course lecture sections were limited to a maximum enrollment of 48 students, with two concurrent lab sections of 24 students. Two different instructors taught five writing sections and five other instructors taught 11 traditional sections over three consecutive quarters. Each course instructor materially participated in teaching laboratory with the help of one graduate assistant per lab section (two graduate students per course section). None of the instructors from treatment sections had implemented writing in the laboratory before the start of this study. Writing instructors were chosen on the basis of personal dissatisfaction with traditional laboratory teaching methods and willingness to try something new.

Strong efforts were made to establish equivalency between writing and nonwriting course sections a priori. Course elements that were highly similar included common lecture rooms, the use of similar (in most cases identical) textbooks, and a lab facility coordinated by a single faculty member. More specifically, three similarly appointed lecture rooms outfitted with contemporary instructional technology including dry erase boards, media cabinets, a networked computer, and digital projection were used to teach the nonmajors biology courses. The same nonmajors biology textbook was used across the writing and most of the nonwriting sections. All laboratory sections used a common lab facility and were taught on the same day of the week. Although the order in which specific labs were taught differed among sections, a common laboratory manual containing prescriptive exercises covering the main themes of biology (scientific method, cellular biology and genetics, natural selection and evolution, kingdoms of life, and a mammalian dissection) was used across all writing and nonwriting lab sections.

The primary differences between course sections were the laboratory writing component and the amount of time devoted to laboratory activities. Sections that experienced the writing treatment completed the prescriptive lab exercises in the first hour and engaged in writing during the second hour of the lab. Nonwriting sections allocated 2 h for the prescriptive lab exercises and included a traditional laboratory quiz rather than a writing assignment. The degree to which the writing and nonwriting sections included small group collaboration in laboratory varied, and all course sections differed with regard to individual instructor teaching style. Although all course sections used traditional lecture exams during the quarter to assess content knowledge, the degree to which rote memorization-based exam questions were used to evaluate student learning varied.

Description of the Writing Treatment

On the first day of lecture, students in the writing treatment group were told that their laboratory performance would be evaluated using collaborative essays instead of traditional quizzes. A brief overview of the writing assignments was included in associated course syllabi. During the first laboratory session of the quarter, students were grouped into teams of three or four individuals, and the criteria for completing weekly writing assignments were further explained.

The decision to use collaborative groups to support writing in the laboratory was partly based on existing literature (Collier, 1980; Bruffee, 1984; Tobin et al., 1994; Jones and Carter, 1998; Springer et al., 1999) and prior research by Quitadamo, Brahler, and Crouch (unpublished results), who showed that Peer Led Team Learning (one form of collaborative learning) helped to measurably improve undergraduate critical thinking skills. Small group learning was also used in the nonwriting treatment groups to a greater or lesser extent depending on individual instructor preference.

Baseline critical thinking performance was established in the academic quarters preceding the writing experiment to more specifically attribute changes in critical thinking to the writing treatment. Concurrent nonwriting course sections were also used as comparison groups. The historical baseline provided a way to determine what student performance had been before experiencing the writing treatment, whereas the concurrent nonwriting groups allowed for a direct comparison of critical thinking performance during the writing treatment. Pretest scores indicating prior critical thinking skill were also used to further establish comparability between the writing and nonwriting groups.

Laboratory activities were coordinated for all sections by a single faculty member who taught in the nonwriting group. All faculty and graduate assistants met regularly to discuss course progress and laboratory procedure and to coordinate resources. Nonwriting faculty drafted quizzes that addressed laboratory content knowledge. Writing faculty collaboratively crafted a consensus essay, or thought question, designed to elicit student critical thinking and ability to apply content knowledge. Each thought question was designed so that students had to apply lecture concepts and build on their conceptual understanding by integrating actual laboratory experiences (see Supplemental Appendix 1, available online, for thought question examples). Weekly thought questions became progressively more difficult as the term progressed. Initial planning meetings took place just before the beginning of the academic quarter and included graduate assistant training to help assistants learn to consistently evaluate student writing using a modified thesis-based essay rubric (see Supplemental Appendix 2; Beers et al., 1994). A range of sample essays from poor to high quality was used to calibrate graduate assistant scoring and ensure consistency between assistants from different laboratory sections within the writing group. All graduate assistants and course instructors applied the thesis-based rubric to sample essays and worked toward consensus. Initial training ended when all graduate assistants scored within 0.5 points of each other on at least two sample essays.
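The stopping rule for grader training — all graduate assistants within 0.5 points of one another on at least two sample essays — can be sketched as a simple range check. The function name and data layout below are illustrative assumptions:

```python
def calibration_reached(essay_scores, tolerance=0.5, essays_needed=2):
    """Check whether grader calibration has been reached.

    essay_scores: one list of grader rubric scores per sample essay.
    Calibration is reached once at least `essays_needed` essays show
    every grader within `tolerance` points of every other grader,
    i.e., the spread (max - min) of scores does not exceed `tolerance`.
    """
    agreed = sum(
        1 for scores in essay_scores
        if max(scores) - min(scores) <= tolerance
    )
    return agreed >= essays_needed
```

Checking the max-min spread is equivalent to checking every pairwise difference, since the largest pairwise gap is exactly the range of the scores.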

Students were given weekly thought questions before beginning laboratory to help them frame their efforts during laboratory exercises. Students completed the prescriptive lab activities during the first hour, and then each student group relocated to an assigned computer lab in the same building and worked around a common computer terminal to draft a collective response to the weekly thought question. Students were allowed to use any suitable information or materials (laboratory observations, laboratory manuals, lecture notes, textbooks, the Internet, etc.) to help them address their thought question. Internal group discussions allowed students to argue individual viewpoints as they worked toward group agreement on each thought question. Essays responding to thought questions followed a standard five-paragraph format. Each essay included an introduction with a group-generated thesis statement, two to three body paragraphs that provided sufficient detail to support the thesis statement, and a summary paragraph that concluded the essay. Students were not allowed to work on essays outside of the laboratory environment.

Initial essay drafts were composed in Microsoft Word and submitted to the graduate assistant by the end of the laboratory period using the campus e-mail system. Graduate assistants evaluated each group's essay (typically six per lab section) and assigned an initial grade based on the thesis-based essay rubric. Graduate assistants made comments and suggestions electronically using Microsoft Word revising and track changes tools. Evaluated essays were e-mailed back to each student group, which addressed comments and suggestions during the subsequent week's laboratory writing time. Each student group submitted a final draft that was re-evaluated and assigned a final grade. During the second week, students both revised their essay from the previous week and then generated an initial draft for the current week's thought question, all within the lab writing hour. This was done to help students become more proficient writers within a short period of time. Overall, students in the writing group completed eight essays that, along with lab book scores, constituted 25% of their overall course grade. An identical percentage was used to calculate traditional quiz and lab book scores in all nonwriting course sections.

At the end of the quarter, each writing group member completed a peer evaluation for all group members, including themselves (see Supplemental Appendix 3 ). This was done to help students reflect on and evaluate their own performance, maximize individual accountability within the group, and make sure students received credit proportional to their contributions. The average peer evaluation score for each student was included as 5% of the final course grade.

Collectively, this approach to writing and evaluation was used to 1) help students reflect on and discuss deficiencies in their collective and written work, 2) provide an opportunity for students to explicitly address deficiencies in thesis development and general writing skill, 3) provide a suitable reward for student efforts to revise their work relative to established performance benchmarks, 4) improve individual accountability within each group, and 5) help students develop more efficient and effective writing skills that collectively might lead to improved critical thinking skill.

Assessment of Critical Thinking

Using critical thinking to indicate student learning performance is particularly useful because it can be measured within and across disciplines. Various instruments are available to assess critical thinking (Watson and Glaser, 1980; Ennis and Weir, 1985; Facione, 1990b; Center for Critical Thinking and Moral Critique, 1996); however, only the CCTST measures cognitive and metacognitive skills associated with critical thinking, is based on a consensus definition of critical thinking, and has been evaluated for validity and reliability for measuring critical thinking at the college level (Facione, 1990a; Facione et al., 1992, 2004). The CCTST measures cognitive skills of analysis, inference, evaluation, induction, and deduction, with results expressed as raw scores or national percentile equivalents based on a national norming sample of students from 4-yr colleges and universities. Construct validity for the CCTST is high as indicated by greater than 95% consensus of the Delphi panel experts on the component skills of critical thinking. Test reliability (calculated using the KR-20 internal consistency method) is 0.78–0.84 for the form used in this study, a value considered to be within the recommended range for tests that measure a wide range of critical thinking skills (Facione, 1991). The CCTST norming sample for 4-yr colleges and universities is based on a stratified sample of 2000 students from various disciplines, with approximately 30% of the norming sample composed of science and math students. Approximately 20,000 college students complete the CCTST each year (Insight Assessment and Blohm, 2005).
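The KR-20 statistic cited above is a standard internal-consistency measure for dichotomously scored (right/wrong) items. A minimal sketch of its computation, assuming per-student 0/1 item responses (the data layout is hypothetical, not the CCTST's actual format):

```python
def kr20(item_responses):
    """Kuder-Richardson formula 20 for dichotomous (0/1) item data.

    item_responses: list of per-student lists, each holding k 0/1 item
    scores. KR-20 = (k / (k - 1)) * (1 - sum(p*q) / var_total), where
    p is the proportion correct on an item, q = 1 - p, and var_total
    is the (population) variance of students' total scores.
    """
    n = len(item_responses)
    k = len(item_responses[0])
    # Sum of p*q over items.
    pq_sum = 0.0
    for i in range(k):
        p = sum(student[i] for student in item_responses) / n
        pq_sum += p * (1 - p)
    # Population variance of total test scores.
    totals = [sum(student) for student in item_responses]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - pq_sum / var_t)
```

Values near the reported 0.78–0.84 band indicate that the items behave consistently enough to support a single total score.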

The CCTST contains 34 questions and is a 45-min timed assessment of critical thinking. An online version of the CCTST was administered in this study, which allowed the researchers to collect student demographics data including gender, ethnicity, age, and several others at the same time critical thinking skill was measured. Total critical thinking skill as well as analysis, inference, and evaluation component critical thinking skills ( Facione, 1990c ) were determined for each CCTST administration and compared across the writing and nonwriting groups.

Research Design

A quasi-experimental pretest/posttest control group design was used for this study to determine whether critical thinking performance in the writing group differed significantly from the nonwriting group. This design was chosen in order to compare critical thinking performance between intact groups, and because it was not feasible to randomly assign students from one course section to another within the sample. Frequency distributions of pretest/posttest changes in total critical thinking skill and in the analysis, inference, and evaluation component critical thinking skills were constructed to provide some indication of sample randomness and to inform assumptions for subsequent statistical analyses of covariance (see Figure 1, A–D).


Figure 1. (A–D) Frequency distributions of change in critical thinking skill for the experimental sample. Changes are indicated using raw scores from CCTST pre- and posttests for total critical thinking skill (A) as well as the analysis (B), inference (C), and evaluation (D) component critical thinking skills.
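The change-score distributions in Figure 1 are tallies of posttest-minus-pretest differences across paired raw scores. A minimal sketch of that tabulation (the function name is illustrative):

```python
from collections import Counter

def change_distribution(pre, post):
    """Tally posttest-minus-pretest raw-score changes, one per student.

    pre, post: parallel sequences of raw CCTST scores for the same
    students. Returns a Counter mapping change -> frequency.
    """
    return Counter(b - a for a, b in zip(pre, post))
```

The resulting counts can be plotted directly as the histograms shown in the figure, with positive keys indicating score gains and negative keys indicating losses.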

The pretest/posttest control group design was also used in order to minimize internal validity threats that could potentially compete with the effects of the writing treatment on student critical thinking performance. This design is widely used in educational research and generally controls for most threats to internal validity (Campbell and Stanley, 1963). Internal threats that remain a concern include history, maturation, pretest sensitization, selection, and statistical regression toward the mean. In the current study, history and maturation threats were minimized to the extent that the CCTST pretest and posttest were administered only 9 wk apart, and class standing and age covariables that indicate maturation were included in the statistical analysis. Pretest sensitization and selection are larger concerns for this design. Pretest sensitization was minimized in several ways: 1) prior critical thinking skill indicated by the CCTST pretest was used as a covariable in statistical analyses, 2) pretest/posttest to posttest-only comparison studies conducted by Insight Assessment indicate CCTST pretest sensitization is minimal (Facione, 1990a), and 3) none of the students, instructors, or test administrators had access to the correct answers on the CCTST, so repeat performance on the posttest was less likely. Selection threats were also reduced by using CCTST pretest scores in the statistical analyses, thereby making it more difficult to detect statistically significant differences in critical thinking performance between the writing and nonwriting groups. Statistical regression toward the mean, which was observed to some extent in this study, was minimized because this study used a valid and reliable instrument to assess critical thinking (Facione, 1990a). Regression threats were also minimized to the extent that students with higher initial scores regressed much less than students with lower initial scores.

The generalizability of study results is limited because all data were collected at a single university. Specific threats to external validity include selection-treatment interaction and treatment diffusion. Selection-treatment interaction was minimized because writing was mandatory for all treatment group participants, thereby reducing volunteer effects. Because the writing also took considerable student effort, treatment diffusion was less likely to occur. In summary, the pretest/posttest control group design was used to minimize internal and external validity threats and to maximize the ability to determine the effects of writing on student critical thinking performance.

Study Variables and Data Analysis

Effect of Writing on Critical Thinking Performance.

General education biology students were divided into writing and nonwriting groups (independent variable). Changes in CCTST pretest/posttest scores (dependent variable) were determined to discover whether writing influenced student critical thinking performance. Two CCTST outcome measures were used to statistically test for writing effect: 1) raw scores for total critical thinking skill, and 2) raw scores for analysis, inference, and evaluation component skills. Results were reported using raw scores and corresponding national percentile rank so that critical thinking performance outcomes would be more meaningful and intuitive. Conversion of CCTST raw scores to national percentile ranking was done using SPSS (SPSS, Inc., Chicago, IL) statistical software and a linear estimation conversion script based on an equivalency scale from Insight Assessment (Millbrae, CA).
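The linear estimation conversion can be sketched as a piecewise-linear interpolation between anchor points on an equivalency scale. The anchor values below are hypothetical placeholders (the actual Insight Assessment scale is proprietary and is not reproduced here):

```python
from bisect import bisect_right

# Hypothetical raw-score/percentile anchor pairs standing in for the
# Insight Assessment equivalency scale; the real values differ.
RAW_ANCHORS = [0, 10, 15, 20, 25, 34]   # CCTST raw scores (range 0-34)
PCT_ANCHORS = [1, 20, 45, 70, 90, 99]   # national percentile ranks

def raw_to_percentile(raw: float) -> float:
    """Linearly interpolate a CCTST raw score to a national percentile."""
    if raw <= RAW_ANCHORS[0]:
        return float(PCT_ANCHORS[0])
    if raw >= RAW_ANCHORS[-1]:
        return float(PCT_ANCHORS[-1])
    i = bisect_right(RAW_ANCHORS, raw) - 1           # left anchor index
    x0, x1 = RAW_ANCHORS[i], RAW_ANCHORS[i + 1]
    y0, y1 = PCT_ANCHORS[i], PCT_ANCHORS[i + 1]
    return y0 + (y1 - y0) * (raw - x0) / (x1 - x0)
```

With real anchor values, scores that fall between anchors map proportionally between the corresponding percentile ranks, which is the behavior a linear conversion script in SPSS would produce.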

Several covariables were included in the analysis to increase statistical accuracy and precision, and to more specifically isolate the effects of writing on critical thinking performance. CCTST pretest scores were used to indicate initial critical thinking skill. Gender and ethnicity helped to account for male/female or race-specific changes in critical thinking performance and were also used to identify potential sources of performance bias. Academic term and time of day were used to account for critical thinking differences due to the time of year each course was offered and the time of day each student took the course, respectively. Class standing and age were used to indicate maturation related to time in college and chronological age, respectively. Finally, the instructor covariable was used to account for performance differences due to individual teaching styles.

Statistical Analysis of Effect of Writing.

Several statistical analyses were conducted to determine the effects of writing on critical thinking performance in general education biology. First, an analysis of covariance (ANCOVA) test provided insight regarding differences in overall critical thinking performance between the writing and nonwriting groups. Changes in CCTST total raw scores and national percentile ranking were used as composite measures of critical thinking (Facione, 1990c) in this initial analysis. Second, changes in particular component critical thinking skills (analysis, inference, and evaluation) were evaluated using a multivariate analysis of covariance (MANCOVA) test because three dependent variables were involved. The ANCOVA and MANCOVA tests also provided some insight into the effect the covariables had on critical thinking performance in general education biology. Collectively, these statistical tests allowed for a more accurate and precise analysis because variance associated with the covariables could be more specifically isolated from the writing treatment. Mean, SE, and effect size were also compared between the writing and nonwriting groups. Effect size, expressed in standard units, was used to compare the magnitude of the writing effect in the study.
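Although the study used SPSS, the partial η2 effect sizes reported with each test follow directly from an F statistic and its degrees of freedom. A minimal sketch of that relationship (not the authors' code):

```python
def partial_eta_squared(f_stat: float, df_effect: int, df_error: int) -> float:
    """Partial eta-squared recovered from an F test:
    eta_p^2 = (F * df_effect) / (F * df_effect + df_error)."""
    return (f_stat * df_effect) / (f_stat * df_effect + df_error)

# Example: the writing effect in Table 2, F(1, 300) = 19.357,
# corresponds to a partial eta^2 of about 0.061 (~6% of variance).
print(round(partial_eta_squared(19.357, 1, 300), 3))  # 0.061
```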

Analysis of Thought Question Performance.

Performance on weekly thought questions was analyzed to discover specifically when and how much student critical thinking skills changed during the academic term. This analysis also provided context for CCTST critical thinking performance measures. Specifically, average scores from a representative sample of writing course sections (approximately 100 students) were used to compare initial essay drafts across the weeks of the term to discover when students began to show changes in their first attempt at each essay. Weekly performance on final revised essays was also compared to determine how student final submissions changed over time. Finally, the weekly difference between each initial essay and each final essay was compared to determine how much the revision process changed during the term. These calculations collectively helped to provide a profile of critical thinking performance over time.

Participant Demographics

Student demographics provided in Table 1 indicated an overall distribution of approximately 49% freshmen, 31% sophomores, 11% juniors, and 9% seniors. Approximately 74% of the writing group were freshmen and sophomores, whereas 82% of the nonwriting group were underclassmen. Overall, 61% of the sample was female and 39% male, with nearly identical gender distribution across the writing and nonwriting groups. The predominant ethnicity in the sample was Caucasian (>83%), with Asian American (5%), Latino/Hispanic (3%), African American (2%), and Native American (1%) students comprising the remainder of the sample. About 6% of the sample classified themselves as having some other ethnicity or chose not to identify their ethnic heritage.

Statistical Assumptions

Analysis of covariance and multivariate analysis of covariance tests were used to compare critical thinking performance between the writing and nonwriting groups. The evaluated assumptions for the ANCOVA and MANCOVA tests were homogeneity of slopes, homogeneity of covariances, and normality. An analysis evaluating the homogeneity of slopes assumption indicated that the relationship between the covariables and the critical thinking performance dependent variable did not differ significantly by the writing/nonwriting independent variable for either the ANCOVA test, F(2, 307) = 1.642, p = 0.195, power = 0.346, partial η2 = 0.011, or the MANCOVA test, F(6, 610) = 1.685, p = 0.122, power = 0.645, partial η2 = 0.016. These results confirmed that both analyses of covariance met the homogeneity of slopes assumption. The homogeneity of covariance assumption was tested using Levene's and Box's tests. Levene's test results for the ANCOVA indicated that error variances were not equal across the writing and nonwriting groups, F(1, 308) = 7.139, p = 0.008. Similarly, Box's test results indicated that covariance was not equal for the writing and nonwriting groups, F(6, 684,530) = 4.628, p < 0.001. These results indicated that the ANCOVA/MANCOVA tests did not meet the homogeneity of covariance assumption. To evaluate this assumption more fully, distributions of total and component critical thinking skill were constructed (see Figure 1, A–D). Because the writing and nonwriting groups were nearly equal in size and no post hoc tests were conducted, this violation was unlikely to bias the results. On the basis of these data, it was determined that the ANCOVA and MANCOVA tests were the best statistical measures to answer the research questions. Finally, the normality assumption was evaluated using the previously constructed frequency distributions for total change in critical thinking (Figure 1A) as well as change in analysis (Figure 1B), inference (Figure 1C), and evaluation (Figure 1D) critical thinking skills.
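Levene's test used above is, in essence, an ANOVA performed on the absolute deviations of each observation from its group mean. A self-contained sketch of the statistic (illustrative only; the study used SPSS, and real use would also refer W to an F distribution for the p value):

```python
def levene_w(*groups):
    """Mean-centered Levene statistic for homogeneity of variance.
    Larger W indicates stronger evidence of unequal group variances."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    # Absolute deviation of each observation from its group mean.
    z = [[abs(y - sum(g) / len(g)) for y in g] for g in groups]
    zbar_i = [sum(zi) / len(zi) for zi in z]    # per-group mean deviation
    zbar = sum(sum(zi) for zi in z) / n         # grand mean deviation
    between = sum(len(zi) * (m - zbar) ** 2 for zi, m in zip(z, zbar_i))
    within = sum((zij - m) ** 2 for zi, m in zip(z, zbar_i) for zij in zi)
    return (n - k) / (k - 1) * between / within
```

W is compared against an F(k − 1, n − k) distribution: identical groups give W = 0, while markedly different spreads inflate W, as was the case for the writing/nonwriting comparison above.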
Frequency distributions of total and component critical thinking dependent variables indicated that each approximated a standard normal curve.

Effect of Writing on Total Critical Thinking Performance

The ANCOVA test of total critical thinking performance showed that the writing and nonwriting groups differed significantly, F(1, 300) = 19.357, p < 0.0001, power = 0.992, partial η2 = 0.061 (see Table 2). The strength of the relationship between the writing/nonwriting groups and critical thinking performance was modest but significant, accounting for more than 6% of the variance in critical thinking performance.

ANCOVA results for total critical thinking performance

Treatment        F        df    p(a)    Power   Effect size
Writing          19.357   300   0.000   0.992   0.061
CCTST pretest    19.713   300   0.000   0.993   0.062
Instructor       7.745    300   0.006   0.792   0.025
Time of day      6.291    300   0.013   0.705   0.021
Gender           0.226    300   0.635   0.076   0.001
Ethnicity        2.326    300   0.128   0.330   0.008
Age              0.453    300   0.502   0.103   0.002
Class standing   0.002    300   0.962   0.050   0.000
Academic term    2.387    300   0.123   0.338   0.008

Analysis of covariance for the writing and nonwriting groups. Tested covariables included gender, ethnicity, class standing, age, prior critical thinking skill (CCTST pre-test), academic term, time of day, and instructor.

a Significance tested at 0.05 level.

Descriptive statistics of total critical thinking performance in the writing and nonwriting groups were also calculated (see Table 3 ). The writing group showed an average CCTST raw score change of 1.18 compared with the nonwriting group, which showed an average raw score change of −0.51. These critical thinking raw scores equated to gains in national percentile rank of 7.47 (45th to 53rd percentile) for the writing group and −2.09 (42nd to 40th percentile) for the nonwriting group. Critical thinking improvement in the writing group was approximately nine times greater than the nonwriting group (see Figure 2 ).

Writing effect on total critical thinking performance: CCTST raw scores

Treatment          Mean raw score      SEM             Raw CT change
                   Pre      Post       Pre     Post
Writing (158)      15.84    17.02      0.32    0.36    1.18
Nonwriting (152)   15.46    14.95      0.34    0.43    −0.51
Overall (310)      15.65    16.00      0.33    0.40    0.34

Comparison of writing and nonwriting group performance based on CCTST raw scores. CCTST raw score range was 0–34; n values in parentheses.


Effect of writing on total critical thinking national percentile rank. Comparison of total critical thinking national percentile gains between writing and nonwriting groups. Percentile ranking was computed using CCTST raw scores, an equivalency scale from Insight Assessment, and a linear conversion script in SPSS.

The ANCOVA test of total critical thinking skill indicated that gender, ethnicity, age, class standing, and academic term did not significantly affect critical thinking performance (see Table 2). Covariables that significantly affected total critical thinking performance included 1) CCTST pretest score, F(1, 300) = 19.713, p < 0.0001, power = 0.993, partial η2 = 0.062, 2) instructor, F(1, 300) = 7.745, p = 0.006, power = 0.792, partial η2 = 0.025, and 3) time of day, F(1, 300) = 6.291, p = 0.013, power = 0.705, partial η2 = 0.021. The effect of prior critical thinking skill (CCTST pretest) was moderately strong, accounting for more than 6% of the variance in total critical thinking performance. The effects of instructor and time of day were smaller, accounting for 2.5% and 2.1%, respectively, of total critical thinking performance variance. The effect associated with the CCTST pretest score was approximately 2.5 times greater than that of instructor and nearly 3 times greater than that of time of day.

Effect of Writing on Component Critical Thinking Performance

The MANCOVA test indicated that analysis, inference, and evaluation critical thinking skills differed significantly between the writing and nonwriting groups, Wilks λ = 0.919, F(3, 296) = 8.746, p < 0.0001, power = 0.995, partial η2 = 0.081 (see Table 4). The strength of the relationship between writing and component critical thinking performance was modest but significant, accounting for more than 8% of the variance in critical thinking performance.

MANCOVA results for component critical thinking performance

Treatment            Wilks λ   F        df    p       Power   Effect size
Writing              0.919     9.746    296   0.000   0.995   0.081
Analysis pretest     0.623     59.737   296   0.000   1.000   0.377
Inference pretest    0.681     46.222   296   0.000   1.000   0.319
Evaluation pretest   0.613     62.398   296   0.000   1.000   0.387
Gender               0.984     1.602    296   0.189   0.420   0.016
Ethnicity            0.983     1.756    296   0.156   0.456   0.017
Age                  0.988     1.153    296   0.328   0.456   0.012
Class standing       0.978     2.186    296   0.090   0.553   0.022
Instructor           0.956     4.508    296   0.004   0.880   0.044
Quarter              0.991     0.899    296   0.442   0.246   0.009
Time of day          0.980     2.022    296   0.111   0.517   0.020

Multivariate analysis of covariance for the writing and nonwriting groups. Tested covariables included gender, ethnicity, class standing, age, prior critical thinking skill (CCTST pretest), academic term, time of day, and instructor.

Specifically, significant gains in analysis and inference skills were observed in the writing group but not the nonwriting group. No statistically significant gains in evaluation skill were observed in either group (see Table 5 ). National percentile rank equivalents for CCTST component raw scores indicated the writing group gained 10.51 percentile in analysis skill (42nd to 52nd percentile), 6.05 percentile in inference skill (45th to 52nd percentile), and 5.16 percentile in evaluation skill (46th to 52nd percentile). The nonwriting group showed a national percentile rank change of −4.43 percentile in analysis skill (47th to 42nd percentile), −2.23 percentile in inference skill (42nd to 40th percentile), and 1.37 percentile in evaluation (44th to 45th percentile; see Figure 3 ). Critical thinking performance for the writing group was 15 times greater for analysis and 8 times greater for inference skills than for the nonwriting group. Although neither the writing nor the nonwriting group showed significant gains in evaluation skill, the writing group showed more than 3 times greater improvement than did the nonwriting group.

Effect of writing on component critical thinking performance

Component skill       Writing (n = 158)     Nonwriting (n = 152)
                      Raw score   SEM       Raw score   SEM
Analysis (pre)        4.22        0.11      4.35        0.11
Analysis (post)       4.54        0.10      3.99        0.13
Analysis (change)     0.33        0.11      −0.36       0.14
Inference (pre)       7.42        0.18      7.07        0.20
Inference (post)      7.91        0.20      6.83        0.23
Inference (change)    0.48        0.16      −0.24       0.21
Evaluation (pre)      4.20        0.14      4.04        0.15
Evaluation (post)     4.57        0.15      4.13        0.17
Evaluation (change)   0.37        0.15      0.09        0.17

Comparison of writing and nonwriting group performance based on critical thinking component skill raw scores (CCTST subscales). Score range was 0–7 (analysis), 0–16 (inference), and 0–11 (evaluation).


Effect of writing on component critical thinking national percentile rank. Comparison of component critical thinking national percentile gains between writing and nonwriting groups. Percentile ranking was computed using CCTST raw scores, an equivalency scale from Insight Assessment, and a linear conversion script in SPSS.

The MANCOVA test of analysis, inference, and evaluation skills indicated that gender, ethnicity, age, class standing, academic term, and time of day did not significantly affect critical thinking performance. Critical thinking performance was affected by prior analysis, inference, and evaluation skill (CCTST component pretest scores) and instructor (see Table 4 ). Specifically, component pretest scores had a large effect on critical thinking, accounting for 38% (analysis), 32% (inference), and 39% (evaluation) of critical thinking performance variance. The effect of instructor was smaller, accounting for 4.4% of variation in critical thinking skill. The effect of prior component critical thinking skill was approximately 4.5 times greater than the effect of writing, and nearly 9 times greater than the effect of instructor.

Student Thought Question Performance

Critical thinking performance on student essays was evaluated by applying a thesis-based essay rubric (see Supplemental Appendix 2) to initial submissions and final revised essays. Average weekly performance during the academic term is shown in Figure 4. A comparison of initial essays indicated that students improved 53.3 percentage points from week 1 (average score of 27.9%) to week 7 (average score of 81.2%). A similar comparison of final essays showed that students improved 32.5 percentage points from week 1 (average score of 54.1%) to week 7 (average score of 86.6%). The largest change between initial and final essays occurred in week 1 (26.2 points), and the gap decreased each week thereafter (24.8, 23.9, 18.8, 8, 7.8, and 5.4 points for weeks 2 through 7, respectively). These results showed that students produced little evidence of critical thinking skill in their writing early in the term, but improved dramatically on both initial and revised essay submissions by the end of the term.
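The weekly gains above are simple differences of the reported average scores; a quick check of the endpoint arithmetic (values taken directly from the text):

```python
# Average essay scores (%) reported for weeks 1 and 7.
initial_wk1, initial_wk7 = 27.9, 81.2   # first-draft essays
final_wk1, final_wk7 = 54.1, 86.6       # revised essays

initial_gain = round(initial_wk7 - initial_wk1, 1)    # 53.3 points
final_gain = round(final_wk7 - final_wk1, 1)          # 32.5 points
wk1_revision_gap = round(final_wk1 - initial_wk1, 1)  # 26.2 points
```

The narrowing gap between drafts and revisions over the term suggests that students needed progressively less revision to produce a polished essay.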


Profile of change in critical thinking performance in writing group. Comparison of student writing performance on weekly initial and revised essays. Essay scores were derived using a thesis-based critical thinking rubric (see Supplemental Appendix 2 ). Average essay scores were computed across writing sections.

The purpose of this study was to discover whether writing could measurably influence critical thinking performance in general education biology. Results indicated that students from the writing group significantly outperformed their nonwriting peers in both total critical thinking skill and the component critical thinking skills of analysis and inference. The writing and nonwriting groups were highly similar and began the academic term with comparable critical thinking ability (45th and 42nd national percentile, respectively). By the end of the term, writing students had improved their critical thinking skill to above the 52nd percentile, whereas nonwriting students had decreased to below the 40th percentile. In addition to writing, prior critical thinking skill and course instructor significantly affected critical thinking performance, with prior critical thinking skill having the largest effect on critical thinking gains of any variable tested. Further analysis of the writing group showed that the largest gains in critical thinking occurred during the first few weeks of the term, with more gradual improvement during the remainder of the term. A comparison of average critical thinking performance on initial and revised essays showed that improvement was greater on initial essays (53%) than on final essays (33%). Collectively, the results of this study indicated that students who experienced writing in general education biology significantly improved their critical thinking skills.

The covariance analysis provided a partial means to separate the effects of writing, prior critical thinking skill, instructor, and multiple covariables from total and component critical thinking gains. The analysis of total critical thinking skill indicated that writing students moved from below the national average to above the national average within an academic quarter, whereas nonwriting students remained below the national average. This observation is important because it shows that students can develop critical thinking skills within a fairly short 9-wk period, and that writing can play a role in that process. A similar study showed critical thinking skills improve over 15 wk (Quitadamo, Brahler, and Crouch, unpublished results); however, that study provided no insight into whether critical thinking skills could be changed over a shorter period of time, in a different academic setting, or in response to instructional variables such as writing.

Although critical thinking gains were influenced by writing, they did not appear to be affected by gender, ethnicity, class standing, or age. In fact, statistical results indicated that these variables collectively had a very small effect on critical thinking performance. Gender distribution was nearly identical across the writing and nonwriting groups, and was predominantly female (nearly 62%). Ethnic distribution was also highly similar across the writing and nonwriting groups, but the sample was largely Caucasian (>84%). Class standing varied somewhat more across the writing and nonwriting groups, with the sample composed largely of underclassmen (70%). Although nearly three-quarters of the sample was between 18 and 21 years of age, nearly 10% was over 21, with a fair number of older nontraditional students represented. It is possible that a more diverse sample would have produced different results, or it may be that the individuals participating in this study responded particularly well to writing. Although further investigation of these variables is necessary and important, it was beyond the scope of the current study.

The analysis of component skills provided greater insight into the particular critical thinking skills that students changed in response to writing. Specifically, writing students significantly improved their analysis and inference skills whereas nonwriting students did not. Writing students also improved their evaluation skills much more than nonwriting students, although not significantly. These results indicate that the process of writing helps students develop improved analytical and inference skills. Prior research indicates that the writing to learn strategy is effective because students must conceptually organize and structure their thoughts as well as their awareness of thinking processes ( Langer and Applebee, 1987 ; Ackerman, 1993 ; Holliday, 1994 ; Rivard, 1994 ). More specifically, as students begin to shape their thoughts at the point of construction and continually analyze, review, and clarify meaning through the processes of drafting and revision, they necessarily engage and apply analysis and inference skills ( Klein, 1999 ; Hand and Prain, 2002 ). In this study, the process of writing appears to have influenced critical thinking gains. It also seems likely that writing students experienced a greater cognitive demand than nonwriting students simply because the writing act required them to hypothesize, debate, and persuade ( Rivard, 1994 ; Hand and Prain, 2002 ) rather than memorize as was the case in nonwriting control courses.

Conversely, the lack of any significant change in analysis, inference, or evaluation skills in the nonwriting group indicated that the traditional lab instruction used in the general education biology control courses did not help students develop critical thinking skills. Based on the results of this study, it could be argued that traditional lab instruction actually prevents the development of critical thinking skills, which presents a rather large problem when one considers how frequently these traditional methods are used in general education biology courses. One also has to consider that the critical thinking gains seen in the writing group might also have resulted from the relative absence of traditional lab instruction rather than writing alone. Additional research will be necessary to gain further insight into this question. Either way, changes to the traditional model of lab instruction will be necessary if the goal is to enhance the critical thinking abilities of general education biology students.

The variable that had the largest impact on critical thinking performance gains was prior critical thinking skill. This phenomenon was previously observed by Quitadamo, Brahler, and Crouch (unpublished results) in a related study that investigated the effect of Peer Led Team Learning on critical thinking performance. That study focused on science and math major undergraduate critical thinking performance at a major research university, and found that, in addition to Peer Led Team Learning, prior critical thinking skill significantly influenced critical thinking performance (Quitadamo, Brahler, and Crouch, unpublished results). Specifically, students with the highest prior critical thinking skill showed the largest performance gains, whereas students with low initial skill were at a comparative disadvantage. The fact that prior critical thinking skill also had a large effect on critical thinking performance in this study increases the generalizability of the observation and underscores its importance. Simply put, students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills, not because they lack the cognitive hard-wiring to perform but because they lack the tools to build their knowledge. Is it reasonable or just to expect otherwise comparable students to perform at similar levels when only some of them have the keys for success? If we hope to improve the perception of science in this country, we need to educate people on how to think about important scientific issues, and not simply argue a position based on one school of thought. By helping general education students to develop critical thinking skills, it is hoped that they will be better able to think rationally about science.

The observation that students who come to general education biology with greater critical thinking skills leave with the largest skill gains has important implications for the K–12 school system as well. If a high proportion of students are coming to institutions of higher education lacking critical thinking skills, why are these skills not being explicitly taught in the K–12 system? Ideally, students would learn the foundational tenets of critical thinking at an earlier age, and be able to refine and hone these skills as they progress through the K–20 education system. The results of this study reinforce the idea that students should be explicitly taught critical thinking skills and be expected to practice them as early and often as possible.

Although its effect was smaller than writing or prior critical thinking skill, the instructor variable also played a significant role in student critical thinking performance, accounting for 2.5% of the total variance in critical thinking gains. Determining the particular qualities of each instructor that contributed to student critical thinking success and further separating instructor and writing effects will require additional research. Previous research indicates that teaching style positively influences certain aspects of student learning ( Grasha, 1994 ; Hativa et al., 2001 ; Bain, 2004 ), but the qualities that specifically influence student critical thinking gains have not been sufficiently investigated. Additional research in this area is necessary.

Faculty considering whether to use writing in the laboratory may wonder how much time and energy it takes to implement, whether efforts to change will translate into improved student learning, and how these changes affect disciplinary content. From a practical perspective, implementing writing did not take more time and effort per se; rather, it required faculty to reconceptualize how they spent their instructional time. Instead of individually developing course materials, writing faculty collaborated to a greater extent than nonwriting faculty on course design and assessments that required students to demonstrate their critical thinking skill. Interviews of faculty from the writing and nonwriting groups indicated that writing faculty felt the course was less work because they collaborated with colleagues and because students demonstrated improved thinking skill. Writing faculty generally became more comfortable with the new model after ∼2–3 wk, when students began to show observable changes in writing proficiency and critical thinking. Together, collaboration with colleagues and observed gains in critical thinking tended to create a positive feedback loop that helped to sustain writing faculty efforts. Nonwriting faculty also wanted their students to think better, but they remained convinced that traditional methods were more effective and so were closed to change. There were some logistical challenges with writing, such as scheduling computer labs where students could draft and revise their weekly essay responses under instructor and teaching assistant supervision. Teaching assistants (and faculty) also needed to be trained to evaluate writing using a rubric. Finally, with regard to content coverage, no lecture or laboratory content was cut to implement writing, because writing and nonwriting students performed the same lab activities.
Collectively, the benefits of using writing in laboratory should encourage faculty who want their students to learn to think critically to give it a try.

Future Directions

This study showed that writing affects student critical thinking skill in a nonmajors biology course, but the results have generated more questions than have been answered. How does writing specifically produce gains in critical thinking performance? What factors influence student prior critical thinking skill? How do instructors specifically influence student gains in critical thinking? Future studies that analyze student essays in more detail would provide greater insight into how writing influences critical thinking skill. Using writing in other nonmajor science courses such as chemistry, geology, or physics could also be done to determine the transferability of this method. Additional studies that investigate student prior critical thinking skill and instructor variables are also necessary. These future studies would further contribute to the knowledge base in this area, and also address some of its identified limitations ( Ebert-May et al., 1997 ; Daempfle, 2002 ). Results from these studies would also increase the generalizability of the results from this study.

CONCLUSIONS

Building on existing research and on the basis of several lines of evidence presented in this study, we conclude that writing positively influences critical thinking performance for general education biology students. Students with prior critical thinking skill may have a comparative advantage over general education biology peers who have not developed these same skills. To rectify that inequity, critical thinking skills should be explicitly taught early and used often during the K–20 academic process. Because particular instructors appear to improve student critical thinking skills more than others, students should be discerning in their choice of instructors if they want to improve their critical thinking skills. Whether writing will prove useful as a method to improve critical thinking skills in other general education science courses will likely depend on a host of factors, but it has potential. Further study of writing in general education science will be necessary to verify these results and to discover the breadth and depth of how writing affects critical thinking skill.

ACKNOWLEDGMENTS

We thank Drs. Holly Pinkart, Roberta Soltz, Phil Mattocks, and James Johnson and undergraduate researchers Matthew Brewer, Dayrk Flaugh, Adam Wallace, Colette Watson, Kelly Vincent, and Christine Weller for their valuable contributions to this study. The authors also acknowledge the generous financial support provided by the Central Washington University Office of the Provost and the Office of the Associate Vice President for Undergraduate Studies.

REFERENCES

  • Ackerman J. M. The promise of writing to learn. Writ. Commun. 1993;10(3):334–370.
  • American Association for the Advancement of Science. Washington, DC: 1989. Science for All Americans. A Project 2061 Report on Literacy Goals in Science, Mathematics, and Technology.
  • Applebee A. N. Writing and reasoning. Rev. Educ. Res. 1984;54(4):577–596.
  • Association of American Colleges and Universities. Washington, DC: 2005. Liberal Education Outcomes: A Preliminary Report on Student Achievement in College.
  • Bain K. Cambridge, MA: Harvard University Press; 2004. What the Best College Teachers Do.
  • Beers T., McIssac C., Henderson B., Gainen J. Writing: thesis and support scoring guide. 1994 [accessed 25 August 2006]. http://www.insightassessment.com/pdf_files/RUB_WTHS.PDF.
  • Bruffee K. A. Collaborative learning and the “conversation of mankind.” Coll. Engl. 1984;46(7):635–653.
  • Business-Higher Education Forum and American Council on Education. Washington, DC: 2003. Building a Nation of Learners: The Need for Changes in Teaching and Learning To Meet Global Challenges.
  • Bybee R. W., Fuchs B. Preparing the 21st century workforce: a new reform in science and technology education. J. Res. Sci. Teach. 2006;43(4):349–352.
  • Campbell D. T., Stanley J. C. Boston, MA: Houghton Mifflin Company; 1963. Experimental and Quasi-experimental Designs for Research.
  • Carnevale A. P. American Society for Training and Development. San Francisco, CA: Jossey-Bass; 1990. Workplace Basics: The Essential Skills Employers Want.
  • Center for Critical Thinking and Moral Critique. Rohnert Park, CA: Sonoma State University; 1996. ICAT Critical Thinking Essay Test.
  • Champagne A., Kouba V., Mintzes J., Wandersee J., Novak J. Written product as performance measures. In: Assessing Science Understanding: A Human Constructivist View. New York: Academic Press; 1999. pp. 224–248.
  • Collier K. G. Peer-group learning in higher education: the development of higher order skills. Stud. High. Educ. 1980;5(1):55–61.
  • Daempfle P. A. New York: U.S. Department of Education; 2002. Instructional Approaches for the Improvement of Reasoning in Introductory College Biology Courses: A Review of the Research.
  • Ebert-May D., Brewer C., Allred S. Innovation in large lectures—teaching for active learning. Bioscience. 1997;47(9):601–607.
  • Ennis R. H. A logical basis for measuring critical thinking skills. Educ. Leadership. 1985;43(2):44–48.
  • Ennis R. H., Weir E. Pacific Grove, CA: Midwest Publications; 1985. The Ennis-Weir Critical Thinking Essay Test.
  • Facione P. A. Millbrae, CA: Insight Assessment; 1990a. The California Critical Thinking Skills Test—College Level. Technical Report 1. Experimental Validation and Content Validity.
  • Facione P. A. Millbrae, CA: Insight Assessment; 1990b. The California Critical Thinking Skills Test—College Level. Technical Report 3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST.
  • Facione P. A. Millbrae, CA: Insight Assessment; 1990c. The California Critical Thinking Skills Test—College Level. Technical Report 4. Interpreting the CCTST, Group Norms, and Sub-Scores.
  • Facione P. A. Millbrae, CA: Insight Assessment; 1991. Using the California Critical Thinking Skills Test in Research, Evaluation, and Assessment.
  • Facione P. A. American Philosophical Association. Millbrae, CA: Insight Assessment; 1990. Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Research Findings and Recommendations.
  • Facione P. A., Facione N. C., Giancarlo C. A. Millbrae, CA: Insight Assessment; 1992. Test Manual: The California Critical Thinking Disposition Inventory.
  • Facione P. A., Facione N. C. Test of everyday reasoning. Insight Assessment. 2004 [accessed 30 June 2006]. http://www.insightassessment.com/test-ter.html.
  • Giancarlo C. A., Facione P. A. A look across four years at the disposition toward critical thinking among undergraduate students. J. Gen. Educ. 2001;50(1):29–55.
  • Grasha A. F. A matter of style: the teacher as expert, formal authority, personal model, facilitator, and delegator. Coll. Teach. 1994;42(4):142–149.
  • Hand B., Prain V. Teachers implementing writing-to-learn strategies in junior secondary science: a case study. Sci. Educ. 2002;86(6):737–755.
  • Hativa N., Barak R., Simhi E. Exemplary university teachers: knowledge and beliefs regarding effective teaching dimensions and strategies. J. High. Educ. 2001;72(6):699–729.
  • Holliday W. G. The reading-science learning-writing connection: breakthroughs, barriers, and promises. J. Res. Sci. Teach. 1994;31(9):877–893.
  • Holmes J., Clizbe E. Facing the 21st century. Bus. Educ. Forum. 1997;52(1):33–35.
  • Holyoak A. R. A plan for writing throughout (not just across) the biology curriculum. Am. Biol. Teach. 1998;60(3):186–190.
  • Howard V. A. Thinking on paper: a philosopher's look at writing. In: Howard V. A., editor. Varieties of Thinking: Essays from Harvard's Philosophy of Education Research Center. New York: Routledge; 1990. pp. 84–92.
  • Insight Assessment, Blohm S. Annual number of users for the CCTST form 2000. 2005 [accessed 8 December 2006].
  • Jones E. A., Hoffman S., Moore L. M., Ratcliff G., Tibbets S., Click B., III. Report no. NCES-95-001. University Park, PA: U.S. Department of Education, Office of Educational Research and Improvement; 1995. National Assessment of College Student Learning: Identifying College Graduates' Essential Skills in Writing, Speech and Listening, and Critical Thinking. Final project report.
  • Jones G. M., Carter G. Small groups and shared constructions. In: Mintzes J. J., Wandersee J. H., Novak J. D., editors. Teaching Science for Understanding: A Human Constructivist View. San Diego, CA: Academic Press; 1998. pp. 261–279.
  • Kelly G. J., Chen C. The sound of music: constructing science as sociocultural practices through oral and written discourse. J. Res. Sci. Teach. 1999;36(8):883–915.
  • Keys C. W. Revitalizing instruction in scientific genres: connecting knowledge production with writing to learn in science. Sci. Educ. 1999;83(2):115–130.
  • Keys C. W., Hand B., Prain V., Collins S. Using the science writing heuristic as a tool for learning from laboratory investigations in secondary science. J. Res. Sci. Teach. 1999;36(10):1065–1084.
  • Klein P. Reopening inquiry into cognitive processes in writing-to-learn. Ed. Psychol. Rev. 1999;11(3):203–270.
  • Klein P. D. Constructing scientific explanations through writing. Instr. Sci. 2004;32(3):191–231.
  • Klymkowsky M. W. Can nonmajors courses lead to biological literacy? Do majors courses do any better? Cell. Biol. Educ. 2006;4:42–44.
  • Kokkala I., Gessell D. A. Writing science effectively: biology and English students in an author-editor relationship. J. Coll. Sci. Teach. 2003;32(4):252–257.
  • Kurfiss J. G. Association for the Study of Higher Education. Washington, DC: George Washington University; 1988. Critical Thinking: Theory, Research, Practice, and Possibilities.
  • Langer J. A., Applebee A. N. Learning to write: learning to think. Educ. Horizons. 1985;64(1):36–38.
  • Langer J. A., Applebee A. N. Urbana, IL: National Council of Teachers of English; 1987. How Writing Shapes Thinking: A Study of Teaching and Learning. NCTE research report no. 22.
  • Lawson A. E. Using the learning cycle to teach biology concepts and reasoning patterns. J. Biol. Educ. 2001;35(4):165–169.
  • Malcom S. M., Abdallah J., Chubin D. E., Grogan K. A System of Solutions: Every School, Every Student. Washington, DC: American Association for the Advancement of Science; 2005.
  • Marzano R. J. Fostering thinking across the curriculum through knowledge restructuring. J. Reading. 1991;34(7):518–525.
  • National Academy of Sciences, National Academy of Engineering, Institute of Medicine. Washington, DC: Committee on Prospering in the Global Economy of the 21st Century; 2005. Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future.
  • National Research Council. Washington, DC: National Academy Press; 1995. National Science Education Standards.
  • National Research Council. Washington, DC: Committee on Undergraduate Biology Education to Prepare Research Scientists for the 21st Century; 2003. Bio 2010, Transforming Undergraduate Education for Future Research Biologists.
  • National Science Foundation. Washington, DC: Directorate for Education and Human Resources; 1996. Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering, and Technology.
  • Office of Educational Research and Improvement. Washington, DC: 1991. Striving for Excellence: The National Education Goals.
  • Project Kaleidoscope. Washington, DC: National Science Foundation; 2006. Transforming America's Scientific and Technological Infrastructure: Recommendations for Urgent Action.
  • Resnick L. B. Education and Learning To Think. Washington, DC: National Academy Press; 1987.
  • Rivard L. P. A review of writing to learn in science: implications for practice and research. J. Res. Sci. Teach. 1994;31(9):969–983.
  • Springer L., Donovan S. S., Stanne M. E. Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: a meta-analysis. Rev. Educ. Res. 1999;69(1):21–51.
  • Steglich C. S. A writing assignment that changes attitudes in biology classes. Am. Biol. Teach. 2000;62(2):98–101.
  • Taylor K. L., Sobota S. J. Writing in biology: an integration of disciplines. Am. Biol. Teach. 1998;60(5):350–353.
  • Tessier J. Writing assignment in a nonmajor introductory ecology class. J. Coll. Sci. Teach. 2006;35(4):25–29.
  • Tobin K. G., Tippins D. J., Gallard A. J. Research on instructional strategies for teaching science. In: Gabel D. L., editor. Handbook of Research on Science Teaching and Learning. New York: Macmillan; 1994. pp. 45–93.
  • Tsui L. A review of research on critical thinking. ASHE annual meeting paper. Miami, FL: November 1998. pp. 5–8.
  • Tsui L. Fostering critical thinking through effective pedagogy: evidence from four institutional case studies. J. High. Educ. 2002;73(6):740–763.
  • U.S. Department of Education. Washington, DC: 1990. National Goals for Education.
  • Watson G., Glaser E. M. Watson-Glaser Critical Thinking Appraisal. Cleveland, OH: The Psychological Corporation (Harcourt Brace Jovanovich); 1980.


The Writing and Critical Thinking Connection

How Writing A-Z Helps Students Become Better Critical Thinkers

Writing is an exercise in critical thinking. Every writing assignment demands that students think ahead, consider their audience, and rethink their wording or organization to ensure that their composition meets a specific goal: to persuade, to inform or explain, to communicate ideas, or to tell a story.

As students explore the world of writing and experiment with their own styles and approaches, they develop critical thinking skills in multiple ways, and Writing A-Z has resources that can help you use writing as a critical thinking exercise in your classroom.

Audience Consideration

Writing does not happen in a vacuum—there is always an audience and context for every composition. The more students write, the better they become at analyzing not only who is reading their compositions (students, parents, teachers, etc.), but also what their audience wants or expects. Encourage students to be mindful of their audience’s needs, and with time, they will develop a repertoire of persuasive strategies for various audiences.

Revision

The revision process encourages students to think about their own thinking. When students reread a draft, they confront the ideas and priorities that were most salient when they wrote it. A student may have learned a new skill or fact since drafting, and that new knowledge lets them see their ideas in a different light during revision. When students practice the metacognitive habits embedded in the revision process often enough, they strengthen both their critical thinking skills and their predictive reading abilities.

Collaboration

Along with revision, peer review is a key element of the writing process, and it supports a student’s understanding of their audience. When students collaborate through peer review, they train themselves to anticipate the thoughts, concerns, and analyses of their peers. The constructive feedback students receive from their peers not only develops a sense of community but also opens students’ minds to different perspectives and approaches to writing.


  • Review Article
  • Open access
  • Published: 11 January 2023

The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature

  • Enwei Xu (ORCID: orcid.org/0000-0001-6424-8169), Wei Wang & Qingxia Wang

Humanities and Social Sciences Communications, volume 10, Article number: 16 (2023)

Subject: Science, technology and society

Collaborative problem-solving has been widely embraced in classroom instruction on critical thinking, which is regarded both as the core of competency-based curriculum reform and as a key competence for learners in the 21st century. However, the effectiveness of collaborative problem-solving in promoting students’ critical thinking remains uncertain. This study presents the major findings of a meta-analysis of 36 empirical studies published in international educational journals during the 21st century, conducted to identify the effectiveness of collaborative problem-solving in promoting students’ critical thinking and to determine, based on evidence, whether and to what extent collaborative problem-solving raises or lowers critical thinking. The findings show that (1) collaborative problem-solving is an effective teaching approach for fostering students’ critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]); (2) with respect to the dimensions of critical thinking, collaborative problem-solving significantly enhances students’ attitudinal tendency (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas its effect on students’ cognitive skills is only upper-middle (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]); and (3) teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all influence critical thinking and can be viewed as important moderating factors in its development. On the basis of these results, recommendations are made for further study and for instruction that better supports students’ critical thinking in the context of collaborative problem-solving.

Introduction

Although critical thinking has a long history in research, the concept of critical thinking, which is regarded as an essential competence for learners in the 21st century, has recently attracted more attention from researchers and teaching practitioners (National Research Council, 2012 ). Critical thinking should be the core of curriculum reform based on key competencies in the field of education (Peng and Deng, 2017 ) because students with critical thinking can not only understand the meaning of knowledge but also effectively solve practical problems in real life even after knowledge is forgotten (Kek and Huijser, 2011 ). The definition of critical thinking is not universal (Ennis, 1989 ; Castle, 2009 ; Niu et al., 2013 ). In general, the definition of critical thinking is a self-aware and self-regulated thought process (Facione, 1990 ; Niu et al., 2013 ). It refers to the cognitive skills needed to interpret, analyze, synthesize, reason, and evaluate information as well as the attitudinal tendency to apply these abilities (Halpern, 2001 ). The view that critical thinking can be taught and learned through curriculum teaching has been widely supported by many researchers (e.g., Kuncel, 2011 ; Leng and Lu, 2020 ), leading to educators’ efforts to foster it among students. In the field of teaching practice, there are three types of courses for teaching critical thinking (Ennis, 1989 ). The first is an independent curriculum in which critical thinking is taught and cultivated without involving the knowledge of specific disciplines; the second is an integrated curriculum in which critical thinking is integrated into the teaching of other disciplines as a clear teaching goal; and the third is a mixed curriculum in which critical thinking is taught in parallel to the teaching of other disciplines for mixed teaching training. 
Furthermore, numerous measuring tools have been developed by researchers and educators to measure critical thinking in the context of teaching practice. These include standardized measurement tools, such as WGCTA, CCTST, CCTT, and CCTDI, which have been verified by repeated experiments and are considered effective and reliable by international scholars (Facione and Facione, 1992 ). In short, descriptions of critical thinking, including its two dimensions of attitudinal tendency and cognitive skills, different types of teaching courses, and standardized measurement tools provide a complex normative framework for understanding, teaching, and evaluating critical thinking.

Cultivating critical thinking in curriculum teaching can start with a problem, and one of the most popular critical thinking instructional approaches is problem-based learning (Liu et al., 2020). Duch et al. (2001) noted that problem-based learning in group collaboration is progressive active learning, which can improve students’ critical thinking and problem-solving skills. Collaborative problem-solving is the organic integration of collaborative learning and problem-based learning; it places learners at the center of the learning process and uses ill-structured problems in real-world situations as the starting point for learning (Liang et al., 2017). Students learn the knowledge needed to solve problems in a collaborative group, reach a consensus on problems in the field, and form solutions through social methods such as dialogue, interpretation, questioning, debate, negotiation, and reflection, thereby promoting the development of learners’ domain knowledge and critical thinking (Cindy, 2004; Liang et al., 2017).

Collaborative problem-solving has been widely used in the teaching practice of critical thinking, and several studies have attempted systematic reviews and meta-analyses of the empirical literature on critical thinking from various perspectives. However, little attention has been paid to the impact of collaborative problem-solving on critical thinking. Examining how critical thinking instruction should be implemented within collaborative problem-solving is therefore the best route to developing and enhancing it, yet the issue remains unexplored, leaving many teachers poorly equipped to teach critical thinking (Leng and Lu, 2020; Niu et al., 2013). For example, Huber (2016) reported meta-analytic findings from 71 publications on gains in critical thinking over various time frames in college, with the aim of determining whether critical thinking is truly teachable. These authors found that learners significantly improve their critical thinking while in college and that gains differ with factors such as teaching strategy, intervention duration, subject area, and teaching type. That study, however, did not establish the usefulness of collaborative problem-solving in fostering students’ critical thinking, nor did it reveal whether significant variations existed among these elements. A meta-analysis of 31 pieces of educational literature was conducted by Liu et al. (2020) to assess the impact of problem-solving on college students’ critical thinking. These authors found that problem-solving could promote the development of critical thinking among college students and proposed establishing a reasonable group structure for problem-solving in follow-up work to improve students’ critical thinking.
Additionally, previous empirical studies have reached inconclusive and even contradictory conclusions about whether and to what extent collaborative problem-solving increases or decreases critical thinking levels. As an illustration, Yang et al. ( 2008 ) carried out an experiment on the integrated curriculum teaching of college students based on a web bulletin board with the goal of fostering participants’ critical thinking in the context of collaborative problem-solving. These authors’ research revealed that through sharing, debating, examining, and reflecting on various experiences and ideas, collaborative problem-solving can considerably enhance students’ critical thinking in real-life problem situations. In contrast, collaborative problem-solving had a positive impact on learners’ interaction and could improve learning interest and motivation but could not significantly improve students’ critical thinking when compared to traditional classroom teaching, according to research by Naber and Wyatt ( 2014 ) and Sendag and Odabasi ( 2009 ) on undergraduate and high school students, respectively.

The above studies show inconsistency regarding the effectiveness of collaborative problem-solving in promoting students’ critical thinking. It is therefore essential to conduct a thorough and trustworthy review to determine whether, and to what degree, collaborative problem-solving raises or lowers critical thinking. Meta-analysis is a quantitative approach for synthesizing data from separate studies that address the same research question. It characterizes the overall effect by averaging the effect sizes of numerous quantitative studies, reducing the uncertainty inherent in any single study and producing more conclusive findings (Lipsey and Wilson, 2001).
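The averaging a meta-analysis performs is not a simple mean: each study is weighted by its precision, and between-study variability is folded in. A minimal sketch of one standard approach, the DerSimonian–Laird random-effects model, is below; this is a generic illustration with made-up numbers, not RevMan's exact implementation or data from this paper.

```python
# Sketch of how a meta-analysis pools per-study effect sizes
# (DerSimonian-Laird random-effects model). All numbers are hypothetical.
import math

def pool_random_effects(effects, variances):
    """Return the pooled effect size and its 95% confidence interval."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q statistic and the between-study variance tau^2
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Random-effects weights incorporate tau^2
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-study standardized mean differences and their variances:
es, (lo, hi) = pool_random_effects([0.9, 0.7, 1.1, 0.5], [0.04, 0.09, 0.05, 0.02])
print(f"pooled ES = {es:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

When the studies are homogeneous (Q ≤ df), tau² collapses to zero and the model reduces to fixed-effect pooling.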

This paper used a meta-analytic approach and carried out a meta-analysis to examine the effectiveness of collaborative problem-solving in promoting students’ critical thinking in order to make a contribution to both research and practice. The following research questions were addressed by this meta-analysis:

What is the overall effect size of collaborative problem-solving in promoting students’ critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills)?

If the effects found in the included studies are heterogeneous across experimental designs, how do the various moderating variables account for the disparities between study conclusions?

This research followed the strict procedures (e.g., database searching, identification, screening, eligibility checking, merging, duplicate removal, and analysis of included studies) of Cooper’s (2010) meta-analysis approach for examining quantitative data from separate studies focused on the same research topic. The relevant empirical research published in international educational journals within the 21st century was subjected to this meta-analysis using RevMan 5.4. The consistency of the data extracted independently by two researchers was tested with Cohen’s kappa coefficient, and a publication bias test and a heterogeneity test were run on the sample data to ascertain the quality of this meta-analysis.
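The inter-coder consistency check mentioned above compares observed agreement with the agreement expected by chance. A minimal sketch of Cohen's kappa follows; the ratings are hypothetical, not the paper's actual coding data.

```python
# Minimal sketch of Cohen's kappa for inter-coder agreement:
# kappa = (observed agreement - chance agreement) / (1 - chance agreement).
# The example ratings are hypothetical.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    # Chance agreement: product of each rater's marginal proportions
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

a = ["include", "include", "exclude", "include", "exclude", "include"]
b = ["include", "exclude", "exclude", "include", "exclude", "include"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```

Kappa of 1.0 means perfect agreement; values above roughly 0.6 are conventionally read as substantial agreement.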

Data sources and search strategies

There were three stages to the data collection process for this meta-analysis, as shown in Fig. 1 , which shows the number of articles included and eliminated during the selection process based on the statement and study eligibility criteria.

Figure 1. Flowchart of the records identified, included, and excluded during study selection.

First, the databases used to systematically search for relevant articles were the journal papers of the Web of Science Core Collection and the Chinese Core source journal, as well as the Chinese Social Science Citation Index (CSSCI) source journal papers included in CNKI. These databases were selected because they are credible platforms that are sources of scholarly and peer-reviewed information with advanced search tools and contain literature relevant to the subject of our topic from reliable researchers and experts. The search string with the Boolean operator used in the Web of Science was “TS = (((“critical thinking” or “ct” and “pretest” or “posttest”) or (“critical thinking” or “ct” and “control group” or “quasi experiment” or “experiment”)) and (“collaboration” or “collaborative learning” or “CSCL”) and (“problem solving” or “problem-based learning” or “PBL”))”. The research area was “Education Educational Research”, and the search period was “January 1, 2000, to December 30, 2021”. A total of 412 papers were obtained. The search string with the Boolean operator used in the CNKI was “SU = (‘critical thinking’*‘collaboration’ + ‘critical thinking’*‘collaborative learning’ + ‘critical thinking’*‘CSCL’ + ‘critical thinking’*‘problem solving’ + ‘critical thinking’*‘problem-based learning’ + ‘critical thinking’*‘PBL’ + ‘critical thinking’*‘problem oriented’) AND FT = (‘experiment’ + ‘quasi experiment’ + ‘pretest’ + ‘posttest’ + ‘empirical study’)” (translated into Chinese when searching). A total of 56 studies were found throughout the search period of “January 2000 to December 2021”. From the databases, all duplicates and retractions were eliminated before exporting the references into Endnote, a program for managing bibliographic references. In all, 466 studies were found.

Second, the studies that matched the inclusion and exclusion criteria for the meta-analysis were chosen by two researchers after they had reviewed the abstracts and titles of the gathered articles, yielding a total of 126 studies.

Third, two researchers thoroughly reviewed each included article’s whole text in accordance with the inclusion and exclusion criteria. Meanwhile, a snowball search was performed using the references and citations of the included articles to ensure complete coverage of the articles. Ultimately, 36 articles were kept.

Two researchers carried out this entire process together, and after discussion and negotiation to resolve any emerging differences, a consensus rate of 94.7% was reached.

Eligibility criteria

Since not all the retrieved studies matched the criteria for this meta-analysis, eligibility criteria for both inclusion and exclusion were developed as follows:

The publication language of the included studies was limited to English and Chinese, and the full text could be obtained. Articles that did not meet the publication language and articles not published between 2000 and 2021 were excluded.

The research design of the included studies must be empirical and quantitative studies that can assess the effect of collaborative problem-solving on the development of critical thinking. Articles that could not identify the causal mechanisms by which collaborative problem-solving affects critical thinking, such as review articles and theoretical articles, were excluded.

The research method of the included studies must be a randomized controlled experiment, a quasi-experiment, or a natural experiment; these designs have a higher degree of internal validity and can plausibly provide evidence that collaborative problem-solving and critical thinking are causally related. Articles with non-experimental research methods, such as purely correlational or observational studies, were excluded.

The participants of the included studies were only students in school, including K-12 students and college students. Articles in which the participants were non-school students, such as social workers or adult learners, were excluded.

The research results of the included studies must report the statistics needed to gauge the impact on critical thinking (e.g., sample size, mean, or standard deviation). Articles that lacked specific measurement indicators for critical thinking, making the effect size incalculable, were excluded.

Data coding design

In order to perform a meta-analysis, it is necessary to collect the most important information from the articles, codify that information’s properties, and convert descriptive data into quantitative data. Therefore, this study designed a data coding template (see Table 1 ). Ultimately, 16 coding fields were retained.

The designed data-coding template consisted of three pieces of information. Basic information about the papers was included in the descriptive information: the publishing year, author, serial number, and title of the paper.

The variable information for the experimental design had three variables: the independent variable (instruction method), the dependent variable (critical thinking), and the moderating variable (learning stage, teaching type, intervention duration, learning scaffold, group size, measuring tool, and subject area). Depending on the topic of this study, the intervention strategy, as the independent variable, was coded into collaborative and non-collaborative problem-solving. The dependent variable, critical thinking, was coded as a cognitive skill and an attitudinal tendency. And seven moderating variables were created by grouping and combining the experimental design variables discovered within the 36 studies (see Table 1 ), where learning stages were encoded as higher education, high school, middle school, and primary school or lower; teaching types were encoded as mixed courses, integrated courses, and independent courses; intervention durations were encoded as 0–1 weeks, 1–4 weeks, 4–12 weeks, and more than 12 weeks; group sizes were encoded as 2–3 persons, 4–6 persons, 7–10 persons, and more than 10 persons; learning scaffolds were encoded as teacher-supported learning scaffold, technique-supported learning scaffold, and resource-supported learning scaffold; measuring tools were encoded as standardized measurement tools (e.g., WGCTA, CCTT, CCTST, and CCTDI) and self-adapting measurement tools (e.g., modified or made by researchers); and subject areas were encoded according to the specific subjects used in the 36 included studies.
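The coding template described above can be pictured as one record per coded study. The sketch below is our own illustration of that scheme; the field names and the example values are hypothetical, not an actual entry from the 36 included studies.

```python
# Illustrative record for the data-coding template: descriptive information,
# experimental-design variables (independent, dependent, and moderating),
# and the data needed to compute an effect size. Field names are our own.
from dataclasses import dataclass

@dataclass
class CodedStudy:
    # descriptive information
    serial_number: int
    author: str
    year: int
    title: str
    # experimental-design variables
    intervention: str           # "collaborative" or "non-collaborative"
    outcome_dimension: str      # "cognitive skills" or "attitudinal tendency"
    learning_stage: str         # e.g., "higher education", "high school"
    teaching_type: str          # "mixed", "integrated", or "independent"
    intervention_duration: str  # e.g., "1-4 weeks"
    group_size: str             # e.g., "4-6 persons"
    learning_scaffold: str      # "teacher", "technique", or "resource"
    measuring_tool: str         # "standardized" or "self-adapting"
    subject_area: str
    # data information for the effect-size calculation
    n_experimental: int
    mean_experimental: float
    sd_experimental: float
    n_control: int
    mean_control: float
    sd_control: float

example = CodedStudy(
    serial_number=1, author="(hypothetical)", year=2010, title="(hypothetical)",
    intervention="collaborative", outcome_dimension="cognitive skills",
    learning_stage="higher education", teaching_type="integrated",
    intervention_duration="4-12 weeks", group_size="4-6 persons",
    learning_scaffold="teacher", measuring_tool="standardized",
    subject_area="biology", n_experimental=40, mean_experimental=78.2,
    sd_experimental=6.1, n_control=38, mean_control=72.5, sd_control=6.4,
)
```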

The data information contained three metrics for measuring critical thinking: sample size, average value, and standard deviation. It is vital to remember that studies with different experimental designs often require different formulas for the effect size; this paper used the standardized mean difference (SMD) formula proposed by Morris for pretest-posttest-control group designs (2008, p. 369; see Supplementary Table S3).
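As a concrete illustration, Morris's SMD for a pretest-posttest-control design is the difference in pre-to-post gains divided by the pooled pretest standard deviation, multiplied by a small-sample bias correction. The sketch below uses hypothetical numbers; the function name and inputs are illustrative, not the paper's actual code.

```python
import math

def smd_ppc(n_t, m_t_pre, sd_t_pre, m_t_post,
            n_c, m_c_pre, sd_c_pre, m_c_post):
    """Standardized mean difference for a pretest-posttest-control
    design: the difference in gains over the pooled pretest SD,
    with the small-sample correction factor c_p (Morris, 2008)."""
    df = n_t + n_c - 2
    # pooled pretest standard deviation
    sd_pool = math.sqrt(((n_t - 1) * sd_t_pre ** 2 +
                         (n_c - 1) * sd_c_pre ** 2) / df)
    c_p = 1 - 3 / (4 * df - 1)          # small-sample bias correction
    gain_diff = (m_t_post - m_t_pre) - (m_c_post - m_c_pre)
    return c_p * gain_diff / sd_pool

# Hypothetical study: treatment gains 8 points, control gains 2,
# both groups with a pretest SD of 10.
d = smd_ppc(n_t=30, m_t_pre=50, sd_t_pre=10, m_t_post=58,
            n_c=30, m_c_pre=51, sd_c_pre=10, m_c_post=53)
print(round(d, 2))  # → 0.59
```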

Procedure for extracting and coding data

Following the data coding template (see Table 1), the information from the 36 papers was extracted by two researchers, who entered it into Excel (see Supplementary Table S1). During data extraction, the results of each study were extracted separately if an article contained multiple studies on critical thinking, or if a study assessed different dimensions of critical thinking. For instance, Tiwari et al. (2010) examined critical thinking outcomes at four time points, which were treated as separate studies, and Chen (2013) included the two outcome variables of attitudinal tendency and cognitive skills, which were regarded as two studies. After discussion and negotiation during data extraction, the two researchers' inter-rater consistency was roughly 93.27%. Supplementary Table S2 details the key characteristics of the 36 included articles with 79 effect quantities, covering descriptive information (e.g., the publishing year, author, serial number, and title of the paper), variable information (e.g., independent variables, dependent variables, and moderating variables), and data information (e.g., mean values, standard deviations, and sample sizes). The sample data were then tested for publication bias and heterogeneity in the RevMan 5.4 software, and the test results were used to conduct the meta-analysis.
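The inter-rater consistency reported above can be illustrated with a minimal percent-agreement sketch; the coded values below are hypothetical (the paper's actual coding was done in Excel):

```python
def percent_agreement(coder_a, coder_b):
    """Share (in %) of coding decisions on which two raters agree."""
    if len(coder_a) != len(coder_b):
        raise ValueError("raters must code the same fields")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100 * matches / len(coder_a)

# Hypothetical codes for five fields of one study: the raters
# agree on four of the five fields.
rater1 = ["higher_ed", "mixed", "4-6", "WGCTA", "science"]
rater2 = ["higher_ed", "mixed", "2-3", "WGCTA", "science"]
print(percent_agreement(rater1, rater2))  # → 80.0
```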

Publication bias test

Publication bias arises when the sample of studies included in a meta-analysis does not accurately reflect the overall body of research on the relevant subject, and it can undermine the reliability and accuracy of the meta-analysis. The sample data therefore need to be checked for publication bias (Stewart et al., 2006). A popular check is the funnel plot: publication bias is unlikely when the data points are evenly dispersed on both sides of the average effect size and concentrated toward the upper region of the plot. In the funnel plot for this analysis (see Fig. 2), the data are evenly dispersed within the upper portion of the effective region, indicating that publication bias is unlikely here.

Figure 2: This funnel plot shows the result of the publication bias test for the 79 effect quantities across the 36 studies.
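Beyond visual inspection of the funnel plot, asymmetry can be checked numerically. The sketch below implements Egger's regression intercept (regressing the standard normal deviate, effect/SE, on precision, 1/SE), a common complement to the funnel plot; the effect sizes and standard errors are hypothetical, and this is not the RevMan procedure used in the paper.

```python
def egger_intercept(effects, ses):
    """Egger's regression test sketch: regress effect / SE on
    1 / SE via ordinary least squares.  An intercept far from
    zero suggests funnel-plot asymmetry."""
    y = [e / s for e, s in zip(effects, ses)]
    x = [1 / s for s in ses]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx  # the regression intercept

# Hypothetical effect sizes and standard errors
effects = [0.70, 0.85, 0.90, 0.75, 0.82]
ses = [0.20, 0.25, 0.15, 0.30, 0.18]
print(round(egger_intercept(effects, ses), 2))
```

When every study estimates the same true effect with no reporting bias, the intercept is essentially zero, because effect/SE is then exactly proportional to 1/SE.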

Heterogeneity test

The results of a heterogeneity test on the effect sizes determine which effect model is appropriate for the meta-analysis. It is common practice to gauge the degree of data heterogeneity using the I² value: I² ≥ 50% is typically understood to denote medium-to-high heterogeneity, which calls for a random effect model; otherwise, a fixed effect model ought to be applied (Lipsey and Wilson, 2001). The heterogeneity test in this paper (see Table 2) revealed an I² of 86% with significant heterogeneity (P < 0.01). To ensure accuracy and reliability, the overall effect size was therefore calculated using the random effect model.
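The I² statistic can be computed from Cochran's Q, the weighted sum of squared deviations of each effect size from the fixed-effect mean. A minimal sketch with hypothetical effect sizes and variances:

```python
def heterogeneity(effects, variances):
    """Cochran's Q and I²: the share of total variation across
    effect sizes due to heterogeneity rather than sampling error."""
    w = [1 / v for v in variances]                      # inverse-variance weights
    fixed_mean = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed_mean) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2

# Hypothetical effect sizes with equal sampling variances
q, i2 = heterogeneity([0.2, 0.5, 0.8, 1.1], [0.04, 0.04, 0.04, 0.04])
print(round(q, 2), round(i2, 1))  # → 11.25 73.3
```

Here I² exceeds 50%, so under the Lipsey and Wilson rule of thumb cited above, a random effect model would be selected.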

The analysis of the overall effect size

Given this heterogeneity, the meta-analysis utilized a random effect model to examine the 79 effect quantities from the 36 studies. In accordance with Cohen's criterion (Cohen, 1992), the analysis results, shown in the forest plot of the overall effect (see Fig. 3), make clear that the cumulative effect size of collaborative problem-solving is 0.82, which is statistically significant (z = 12.78, P < 0.01, 95% CI [0.69, 0.95]) and indicates that it can encourage learners to practice critical thinking.

Figure 3: This forest plot shows the analysis result of the overall effect size across the 36 studies.
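A random-effects pooled estimate of the kind reported above (effect size with a z statistic and 95% CI) can be sketched with the DerSimonian-Laird estimator of between-study variance; the paper itself used RevMan 5.4, and the data below are hypothetical.

```python
import math

def random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooling: estimate the
    between-study variance tau^2, reweight the studies, and return
    the pooled effect, its z statistic, and a 95% CI."""
    w = [1 / v for v in variances]
    sw = sum(w)
    fixed_mean = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - fixed_mean) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)       # between-study variance
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    z = pooled / se
    return pooled, z, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical effect sizes and variances
pooled, z, ci = random_effects([0.2, 0.5, 0.8, 1.1], [0.04] * 4)
print(round(pooled, 2), round(z, 2))  # → 0.65 3.36
```

Because tau² is added to every study's variance, heterogeneous studies receive more even weights than under a fixed effect model, and the confidence interval widens accordingly.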

In addition, this study examined the two distinct dimensions of critical thinking to better understand the precise contributions that collaborative problem-solving makes to its growth. The findings (see Table 3) indicate that collaborative problem-solving improves both cognitive skills (ES = 0.70) and attitudinal tendency (ES = 1.17), with significant intergroup differences (chi² = 7.95, P < 0.01). It is essential to point out, however, that the improvement in students' attitudinal tendency is much more pronounced, with a significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas the gain in learners' cognitive skills is more modest, at an upper-middle level (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

The analysis of moderator effect size

The 79 effect quantities in the full forest plot underwent a two-tailed test, which revealed significant heterogeneity (I² = 86%, z = 12.78, P < 0.01), indicating differences between effect sizes that may have been influenced by moderating factors beyond sampling error. Subgroup analysis was therefore used to explore the moderating factors that might produce this heterogeneity (the learning stage, learning scaffold, teaching type, group size, intervention duration, measuring tool, and subject area included in the 36 experimental designs) and to identify the key factors that influence critical thinking. The findings (see Table 4) indicate that the moderating factors have advantageous effects on critical thinking. The subject area (chi² = 13.36, P < 0.05), group size (chi² = 8.77, P < 0.05), intervention duration (chi² = 12.18, P < 0.01), learning scaffold (chi² = 9.03, P < 0.01), and teaching type (chi² = 7.20, P < 0.05) are all significant moderators that can be applied to support the cultivation of critical thinking. However, since the learning stage (chi² = 3.15, P = 0.21 > 0.05) and the measuring tool (chi² = 0.08, P = 0.78 > 0.05) did not show significant intergroup differences, we are unable to explain why these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving. The precise outcomes are as follows:

Various learning stages influenced critical thinking positively, without significant intergroup differences (chi² = 3.15, P = 0.21 > 0.05). High school ranked first in effect size (ES = 1.36, P < 0.01), followed by higher education (ES = 0.78, P < 0.01) and middle school (ES = 0.73, P < 0.01). These results show that, despite the learning stage's beneficial influence on cultivating learners' critical thinking, we are unable to explain why it is essential in the context of collaborative problem-solving.

Different teaching types had varying degrees of positive impact on critical thinking, with significant intergroup differences (chi² = 7.20, P < 0.05). The effect sizes were ranked as follows: mixed courses (ES = 1.34, P < 0.01), integrated courses (ES = 0.81, P < 0.01), and independent courses (ES = 0.27, P < 0.01). These results indicate that the most effective approach to cultivating critical thinking through collaborative problem-solving is the mixed-course teaching type.

Various intervention durations significantly improved critical thinking, with significant intergroup differences (chi² = 12.18, P < 0.01). The effect sizes tended to increase with longer intervention durations, and the improvement in critical thinking reached a significant level (ES = 0.85, P < 0.01) after more than 12 weeks of training. These findings indicate that intervention duration is positively correlated with the impact on critical thinking: the longer the intervention, the greater the effect.

Different learning scaffolds influenced critical thinking positively, with significant intergroup differences (chi² = 9.03, P < 0.01). The teacher-supported learning scaffold (ES = 0.92, P < 0.01) displayed a high level of significant impact, while the resource-supported (ES = 0.69, P < 0.01) and technique-supported (ES = 0.63, P < 0.01) learning scaffolds attained medium-to-high levels of impact. These results show that the teacher-supported learning scaffold has the greatest impact on cultivating critical thinking.

Various group sizes influenced critical thinking positively, and the intergroup differences were statistically significant (chi² = 8.77, P < 0.05). The effect on critical thinking showed a general declining trend with increasing group size: the overall effect size for groups of 2–3 people was the largest (ES = 0.99, P < 0.01), and when the group size exceeded 7 people, the improvement in critical thinking fell to a lower-middle level (ES < 0.5, P < 0.01). These results show that the impact on critical thinking is negatively associated with group size: as the group grows, the overall impact declines.

Various measuring tools registered positive influences on critical thinking, but without significant intergroup differences (chi² = 0.08, P = 0.78 > 0.05). The self-adapted measurement tools obtained an upper-medium level of effect (ES = 0.78), whereas the overall effect size of the standardized measurement tools was the largest, reaching a significant level of effect (ES = 0.84, P < 0.01). These results show that, despite the measuring tool's beneficial role in assessing the cultivation of critical thinking, we are unable to explain why it is crucial in fostering critical thinking through collaborative problem-solving.

Different subject areas showed varying degrees of impact on critical thinking, and the intergroup differences were statistically significant (chi² = 13.36, P < 0.05). Mathematics had the greatest overall impact, achieving a significant level of effect (ES = 1.68, P < 0.01), followed by science (ES = 1.25, P < 0.01) and medical science (ES = 0.87, P < 0.01), both of which also achieved significant levels of effect. Programming technology was the least effective (ES = 0.39, P < 0.01), reaching only a medium-low level of effect, below education (ES = 0.72, P < 0.01) and other fields such as language, art, and the social sciences (ES = 0.58, P < 0.01). These results suggest that scientific fields (e.g., mathematics, science) may be the most effective subject areas for cultivating critical thinking through collaborative problem-solving.
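The between-group chi² statistics reported throughout these subgroup analyses can be sketched as the total Q minus the within-subgroup Q values, compared against a chi-square distribution with (number of subgroups − 1) degrees of freedom. The two subgroups below are hypothetical:

```python
def q_between(groups):
    """Between-subgroup heterogeneity: total Q minus the sum of
    within-subgroup Q values.  Each group is a pair of lists
    (effects, variances)."""
    def q_stat(effects, variances):
        w = [1 / v for v in variances]
        mean = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
        return sum(wi * (e - mean) ** 2 for wi, e in zip(w, effects))
    all_effects = [e for es, _ in groups for e in es]
    all_vars = [v for _, vs in groups for v in vs]
    q_total = q_stat(all_effects, all_vars)
    q_within = sum(q_stat(es, vs) for es, vs in groups)
    return q_total - q_within

# Two hypothetical subgroups (e.g., two teaching types)
groups = [([1.2, 1.4], [0.04, 0.04]),
          ([0.4, 0.6], [0.04, 0.04])]
print(round(q_between(groups), 1))  # → 16.0
```

With 1 degree of freedom, a value of 16.0 far exceeds the 5% critical value of 3.84, so these hypothetical subgroups would differ significantly.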

The effectiveness of collaborative problem solving with regard to teaching critical thinking

According to this meta-analysis, using collaborative problem-solving as an intervention strategy in critical thinking teaching has a considerable impact on cultivating learners' critical thinking as a whole and a favorable promotional effect on both of its dimensions. Certain studies have found that collaborative problem-solving, the most frequently used critical thinking teaching strategy in curriculum instruction, can considerably enhance students' critical thinking (e.g., Liang et al., 2017; Liu et al., 2020; Cindy, 2004), and this meta-analysis provides convergent data support for those views. Thus, the findings not only address the first research question, regarding the overall effect of collaborative problem-solving on critical thinking and on its two dimensions (attitudinal tendency and cognitive skills), but also strengthen our confidence in using this intervention approach in classroom teaching.

Furthermore, the associated improvements in attitudinal tendency are much stronger, while the corresponding improvements in cognitive skills are only marginally better. Certain studies note that cognitive skills differ from attitudinal tendency in classroom instruction: the former, as a key ability, is cultivated through gradual accumulation, while the latter, as an attitude, is affected by the teaching context (e.g., a novel and exciting teaching approach, challenging and rewarding tasks) (Halpern, 2001; Wei and Hong, 2022). Collaborative problem-solving as a teaching approach is exciting and interesting, as well as rewarding and challenging: it centers on the learner, examines ill-structured problems in authentic situations, and can inspire students to fully realize their problem-solving potential, which significantly improves their attitudinal tendency toward solving problems (Liu et al., 2020). Just as collaborative problem-solving influences attitudinal tendency, attitudinal tendency influences cognitive skills when attempting to solve a problem (Liu et al., 2020; Zhang et al., 2022), and stronger attitudinal tendencies are associated with improved learning achievement and cognitive ability in students (Sison, 2008; Zhang et al., 2022). It can be seen that collaborative problem-solving affects critical thinking as a whole as well as its two specific dimensions, and this study illuminates the nuanced links between cognitive skills and attitudinal tendencies. To fully develop students' capacity for critical thinking, future empirical research should pay closer attention to cognitive skills.

The moderating effects of collaborative problem solving with regard to teaching critical thinking

To further explore the key factors that influence critical thinking, subgroup analysis was used to examine the moderating effects that might produce the considerable heterogeneity. The findings show that the moderating factors included in the 36 experimental designs (teaching type, learning stage, group size, learning scaffold, intervention duration, measuring tool, and subject area) could all support the cultivation of critical thinking through collaborative problem-solving. Among them, the effect size differences for the learning stage and the measuring tool were not significant, so we cannot explain why these two factors are crucial to cultivating critical thinking through collaborative problem-solving.

In terms of the learning stage, all stages influenced critical thinking positively, but without significant intergroup differences, so we are unable to explain why this factor is crucial in fostering the growth of critical thinking. Although higher education accounts for 70.89% of all the empirical studies performed by researchers, high school may be the most appropriate learning stage at which to foster students' critical thinking through collaborative problem-solving, since it has the largest overall effect size. This phenomenon may be related to students' cognitive development and needs to be examined further in follow-up research.

With regard to teaching type, mixed-course teaching may be the best method for cultivating students' critical thinking. Relevant studies have shown that, in the actual teaching process, if students are trained in thinking methods alone, the methods they learn remain isolated and divorced from subject knowledge, which hinders the transfer of those methods; conversely, if students' thinking is trained only within subject teaching, without systematic method training, it is challenging to apply to real-world circumstances (Ruggiero, 2012; Hu and Liu, 2015). Teaching critical thinking as a mixed course, in parallel with other subject teaching, can achieve the best effect on learners' critical thinking, and explicit critical thinking instruction is more effective than less explicit instruction (Bensley and Spero, 2014).

In terms of intervention duration, the overall effect size shows an upward tendency with longer intervention times; thus, intervention duration is positively correlated with the impact on critical thinking. Critical thinking, as a key competency for students in the 21st century, is difficult to improve meaningfully within a brief intervention. Instead, it is developed over a lengthy period through consistent teaching and the progressive accumulation of knowledge (Halpern, 2001; Hu and Liu, 2015). Future empirical studies ought therefore to take these constraints into account and plan for longer periods of critical thinking instruction.

With regard to group size, a group of 2–3 persons has the highest effect size, and the comprehensive effect size generally decreases as group size increases. This outcome is in line with some research findings; for example, a group of two to four members has been found most appropriate for collaborative learning (Schellens and Valcke, 2006). Notably, the meta-analysis results also indicate that even when the group size exceeds 7 people, collaborative problem-solving still improves critical thinking, albeit at a lower level. This may be because learning scaffolds (technique support, resource support, and teacher support) improve the frequency and effectiveness of interaction among group members, and a collaborative group with more members can increase the diversity of views, which helps cultivate critical thinking through collaborative problem-solving.

With regard to the learning scaffold, all three kinds of learning scaffolds can enhance critical thinking. Among them, the teacher-supported learning scaffold has the largest overall effect size, demonstrating the interdependence of effective learning scaffolds and collaborative problem-solving. This outcome is in line with earlier findings: encouraging learners to collaborate, generate solutions, and develop critical thinking skills through learning scaffolds is a successful strategy (Reiser, 2004; Xu et al., 2022); learning scaffolds can lower task complexity and unpleasant feelings while enticing students to engage in learning activities (Wood et al., 2006); and learning scaffolds are designed to help students apply learning approaches more successfully within the collaborative problem-solving process, with teacher-supported scaffolds exerting the greatest influence on critical thinking because they are more targeted, informative, and timely (Xu et al., 2022).

With respect to the measuring tool, although standardized measurement tools (such as the WGCTA, CCTT, and CCTST) have been acknowledged as reliable and valid by experts worldwide, only 54.43% of the studies included in this meta-analysis adopted them for assessment, and the results indicated no intergroup differences. These results suggest that not all teaching circumstances are appropriate for measuring critical thinking with standardized tools. According to Simpson and Courtney (2002, p. 91), "the measuring tools for measuring thinking ability have limits in assessing learners in educational situations and should be adapted appropriately to accurately assess the changes in learners' critical thinking." As a result, to gauge more fully and precisely how learners' critical thinking has evolved, standardized measuring tools must be properly modified for collaborative problem-solving learning contexts.

With regard to the subject area, the comprehensive effect size of the scientific subjects (e.g., mathematics, science, medical science) is larger than that of language arts and the social sciences. Recent international education reforms have noted that critical thinking is a basic part of scientific literacy. Students with scientific literacy can prove the rationality of their judgments using accurate evidence and reasonable standards when facing challenges or poorly structured problems (Kyndt et al., 2013), which makes critical thinking crucial for developing scientific understanding and applying it to practical problem solving related to science, technology, and society (Yore et al., 2007).

Suggestions for critical thinking teaching

Beyond the points made in the discussion above, the following suggestions are offered for critical thinking instruction utilizing the approach of collaborative problem-solving.

First, teachers should put special emphasis on the two core elements, collaboration and problem-solving, and design real problems based on collaborative situations. This meta-analysis provides evidence that collaborative problem-solving has a strong synergistic effect on promoting students' critical thinking. Asking questions about real situations and allowing learners to take part in critical discussions of real problems during class instruction are key ways to teach critical thinking, rather than simply reading speculative articles without practice (Mulnix, 2012). Furthermore, the improvement of students' critical thinking is realized through cognitive conflict with other learners in the problem situation (Yang et al., 2008). It is therefore essential for teachers to design real problems and to encourage students to discuss, negotiate, and argue within collaborative problem-solving situations.

Second, teachers should design and implement mixed courses to cultivate learners' critical thinking through collaborative problem-solving. Critical thinking can be taught through curriculum instruction (Kuncel, 2011; Leng and Lu, 2020), with the goal of cultivating critical thinking that transfers flexibly to real problem-solving situations. This meta-analysis shows that mixed-course teaching has a highly substantial impact on the cultivation and promotion of learners' critical thinking. Teachers should therefore design and implement mixed-course teaching that combines real collaborative problem-solving situations with the knowledge content of specific disciplines; teach methods and strategies of critical thinking based on poorly structured problems to help students master critical thinking; and provide practical activities in which students interact with one another to develop knowledge construction and critical thinking.

Third, teachers, particularly preservice teachers, should receive more training in critical thinking, and they should also be conscious of the ways in which teacher-supported learning scaffolds can promote it. The learning scaffold supported by teachers had the greatest impact on learners' critical thinking, in addition to being more directive, targeted, and timely (Wood et al., 2006). Critical thinking can only be effectively taught when teachers recognize its significance for students' growth and use the proper approaches when designing instructional activities (Forawi, 2016). To enable teachers to create learning scaffolds that cultivate learners' critical thinking through collaborative problem-solving, it is therefore essential to concentrate on teacher-supported learning scaffolds and to enhance the training of teachers, especially preservice teachers, in teaching critical thinking.

Implications and limitations

There are certain limitations in this meta-analysis that future research can correct. First, the search languages were restricted to English and Chinese, so pertinent studies written in other languages may have been overlooked, leaving the pool of reviewed articles incomplete. Second, some data in the included studies were missing, such as whether teachers were trained in the theory and practice of critical thinking, the average age and gender of learners, and differences in critical thinking among learners of various ages and genders. Third, as is typical for review articles, more studies were released while this meta-analysis was being conducted, so its coverage has a time limit. As the relevant research develops, future studies focusing on these issues will be highly relevant and needed.

Conclusions

This study successfully addressed the magnitude of collaborative problem-solving's impact on fostering students' critical thinking, a question that received scant attention in earlier research. The following conclusions can be made:

Regarding the results obtained, collaborative problem-solving is an effective teaching approach for fostering learners' critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]). With respect to the dimensions of critical thinking, collaborative problem-solving significantly and effectively improves students' attitudinal tendency, with a significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]); its improvement of students' cognitive skills is more modest, reaching only an upper-middle level of impact (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

As demonstrated by both the results and the discussion, all seven moderating factors found across the 36 studies have varying degrees of beneficial effect on students' critical thinking. The teaching type (chi² = 7.20, P < 0.05), intervention duration (chi² = 12.18, P < 0.01), subject area (chi² = 13.36, P < 0.05), group size (chi² = 8.77, P < 0.05), and learning scaffold (chi² = 9.03, P < 0.01) all have a positive impact on critical thinking and can be viewed as important moderating factors affecting how it develops. Since the learning stage (chi² = 3.15, P = 0.21 > 0.05) and measuring tool (chi² = 0.08, P = 0.78 > 0.05) did not demonstrate significant intergroup differences, we are unable to explain why these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving.

Data availability

All data generated or analyzed during this study are included within the article and its supplementary information files, and the supplementary information files are available in the Dataverse repository: https://doi.org/10.7910/DVN/IPFJO6 .

Bensley DA, Spero RA (2014) Improving critical thinking skills and meta-cognitive monitoring through direct infusion. Think Skills Creat 12:55–68. https://doi.org/10.1016/j.tsc.2014.02.001


Castle A (2009) Defining and assessing critical thinking skills for student radiographers. Radiography 15(1):70–76. https://doi.org/10.1016/j.radi.2007.10.007

Chen XD (2013) An empirical study on the influence of PBL teaching model on critical thinking ability of non-English majors. J PLA Foreign Lang College 36 (04):68–72


Cohen A (1992) Antecedents of organizational commitment across occupational groups: a meta-analysis. J Organ Behav. https://doi.org/10.1002/job.4030130602

Cooper H (2010) Research synthesis and meta-analysis: a step-by-step approach, 4th edn. Sage, London, England

Cindy HS (2004) Problem-based learning: what and how do students learn? Educ Psychol Rev 51(1):31–39

Duch BJ, Gron SD, Allen DE (2001) The power of problem-based learning: a practical “how to” for teaching undergraduate courses in any discipline. Stylus Educ Sci 2:190–198

Ennis RH (1989) Critical thinking and subject specificity: clarification and needed research. Educ Res 18(3):4–10. https://doi.org/10.3102/0013189x018003004

Facione PA (1990) Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Eric document reproduction service. https://eric.ed.gov/?id=ed315423

Facione PA, Facione NC (1992) The California Critical Thinking Dispositions Inventory (CCTDI) and the CCTDI test manual. California Academic Press, Millbrae, CA

Forawi SA (2016) Standard-based science education and critical thinking. Think Skills Creat 20:52–62. https://doi.org/10.1016/j.tsc.2016.02.005

Halpern DF (2001) Assessing the effectiveness of critical thinking instruction. J Gen Educ 50(4):270–286. https://doi.org/10.2307/27797889

Hu WP, Liu J (2015) Cultivation of pupils’ thinking ability: a five-year follow-up study. Psychol Behav Res 13(05):648–654. https://doi.org/10.3969/j.issn.1672-0628.2015.05.010

Huber K (2016) Does college teach critical thinking? A meta-analysis. Rev Educ Res 86(2):431–468. https://doi.org/10.3102/0034654315605917

Kek MYCA, Huijser H (2011) The power of problem-based learning in developing critical thinking skills: preparing students for tomorrow’s digital futures in today’s classrooms. High Educ Res Dev 30(3):329–341. https://doi.org/10.1080/07294360.2010.501074

Kuncel NR (2011) Measurement and meaning of critical thinking (Research report for the NRC 21st Century Skills Workshop). National Research Council, Washington, DC

Kyndt E, Raes E, Lismont B, Timmers F, Cascallar E, Dochy F (2013) A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings? Educ Res Rev 10(2):133–149. https://doi.org/10.1016/j.edurev.2013.02.002

Leng J, Lu XX (2020) Is critical thinking really teachable?—A meta-analysis based on 79 experimental or quasi experimental studies. Open Educ Res 26(06):110–118. https://doi.org/10.13966/j.cnki.kfjyyj.2020.06.011

Liang YZ, Zhu K, Zhao CL (2017) An empirical study on the depth of interaction promoted by collaborative problem solving learning activities. J E-educ Res 38(10):87–92. https://doi.org/10.13811/j.cnki.eer.2017.10.014

Lipsey M, Wilson D (2001) Practical meta-analysis. International Educational and Professional, London, pp. 92–160

Liu Z, Wu W, Jiang Q (2020) A study on the influence of problem based learning on college students’ critical thinking-based on a meta-analysis of 31 studies. Explor High Educ 03:43–49

Morris SB (2008) Estimating effect sizes from pretest-posttest-control group designs. Organ Res Methods 11(2):364–386. https://doi.org/10.1177/1094428106291059


Mulnix JW (2012) Thinking critically about critical thinking. Educ Philos Theory 44(5):464–479. https://doi.org/10.1111/j.1469-5812.2010.00673.x

Naber J, Wyatt TH (2014) The effect of reflective writing interventions on the critical thinking skills and dispositions of baccalaureate nursing students. Nurse Educ Today 34(1):67–72. https://doi.org/10.1016/j.nedt.2013.04.002

National Research Council (2012) Education for life and work: developing transferable knowledge and skills in the 21st century. The National Academies Press, Washington, DC

Niu L, Behar HLS, Garvan CW (2013) Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educ Res Rev 9(12):114–128. https://doi.org/10.1016/j.edurev.2012.12.002

Peng ZM, Deng L (2017) Towards the core of education reform: cultivating critical thinking skills as the core of skills in the 21st century. Res Educ Dev 24:57–63. https://doi.org/10.14121/j.cnki.1008-3855.2017.24.011

Reiser BJ (2004) Scaffolding complex learning: the mechanisms of structuring and problematizing student work. J Learn Sci 13(3):273–304. https://doi.org/10.1207/s15327809jls1303_2

Ruggiero VR (2012) The art of thinking: a guide to critical and creative thought, 4th edn. Harper Collins College Publishers, New York

Schellens T, Valcke M (2006) Fostering knowledge construction in university students through asynchronous discussion groups. Comput Educ 46(4):349–370. https://doi.org/10.1016/j.compedu.2004.07.010

Sendag S, Odabasi HF (2009) Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Comput Educ 53(1):132–141. https://doi.org/10.1016/j.compedu.2009.01.008

Sison R (2008) Investigating Pair Programming in a Software Engineering Course in an Asian Setting. 2008 15th Asia-Pacific Software Engineering Conference, pp. 325–331. https://doi.org/10.1109/APSEC.2008.61

Simpson E, Courtney M (2002) Critical thinking in nursing education: literature review. Mary Courtney 8(2):89–98

Stewart L, Tierney J, Burdett S (2006) Do systematic reviews based on individual patient data offer a means of circumventing biases associated with trial publications? Publication bias in meta-analysis. John Wiley and Sons Inc, New York, pp. 261–286

Tiwari A, Lai P, So M, Yuen K (2010) A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med Educ 40(6):547–554. https://doi.org/10.1111/j.1365-2929.2006.02481.x

Wood D, Bruner JS, Ross G (2006) The role of tutoring in problem solving. J Child Psychol Psychiatry 17(2):89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x

Wei T, Hong S (2022) The meaning and realization of teachable critical thinking. Educ Theory Practice 10:51–57

Xu EW, Wang W, Wang QX (2022) A meta-analysis of the effectiveness of programming teaching in promoting K-12 students’ computational thinking. Educ Inf Technol. https://doi.org/10.1007/s10639-022-11445-2

Yang YC, Newby T, Bill R (2008) Facilitating interactions through structured web-based bulletin boards: a quasi-experimental study on promoting learners’ critical thinking skills. Comput Educ 50(4):1572–1585. https://doi.org/10.1016/j.compedu.2007.04.006

Yore LD, Pimm D, Tuan HL (2007) The literacy component of mathematical and scientific literacy. Int J Sci Math Educ 5(4):559–589. https://doi.org/10.1007/s10763-007-9089-4

Zhang T, Zhang S, Gao QQ, Wang JH (2022) Research on the development of learners’ critical thinking in online peer review. Audio Visual Educ Res 6:53–60. https://doi.org/10.13811/j.cnki.eer.2022.06.08

Download references

Acknowledgements

This research was supported by the graduate scientific research and innovation project of Xinjiang Uygur Autonomous Region named “Research on in-depth learning of high school information technology courses for the cultivation of computing thinking” (No. XJ2022G190) and the independent innovation fund project for doctoral students of the College of Educational Science of Xinjiang Normal University named “Research on project-based teaching of high school information technology courses from the perspective of discipline core literacy” (No. XJNUJKYA2003).

Author information

Authors and affiliations

College of Educational Science, Xinjiang Normal University, 830017, Urumqi, Xinjiang, China

Enwei Xu, Wei Wang & Qingxia Wang


Corresponding authors

Correspondence to Enwei Xu or Wei Wang.

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Informed consent

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary tables

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Xu, E., Wang, W. & Wang, Q. The effectiveness of collaborative problem solving in promoting students' critical thinking: A meta-analysis based on empirical literature. Humanit Soc Sci Commun 10, 16 (2023). https://doi.org/10.1057/s41599-023-01508-1


Received: 07 August 2022

Accepted: 04 January 2023

Published: 11 January 2023

DOI: https://doi.org/10.1057/s41599-023-01508-1




  • Open access
  • Published: 02 July 2020

Scripting as a pedagogical method to guide collaborative writing: university students’ reflections

  • Kirsi Heinonen (orcid.org/0000-0001-9365-0894) 1,
  • Nore De Grez 2,
  • Raija Hämäläinen 1,
  • Bram De Wever 2 &
  • Sophie van der Meijs 1

Research and Practice in Technology Enhanced Learning volume 15, Article number: 15 (2020)


A collaboration script is a set of instructions used to improve collaborative learning among students in technology-enhanced environments. Previously, university students' perspectives have been under-represented in research on collaboration scripts. In this article, we focus on understanding students' experiences of a scripted collaborative writing task, especially from the perspective of following the script. The study was conducted among undergraduate students (N = 91) taking a master-level educational science course at one university each in Finland and Belgium. Divided into 25 groups, each with three to five members, the students worked on collaborative writing tasks. During their shared writing process within these groups, all students were introduced to a four-phase simultaneous sequential integrating construction script (SSICS). Data from the students' reflection notes on the collaborative writing process were then analysed. The thematic analysis revealed that over half of the students (52%) highlighted how the introduced script provided them with appropriate support for the writing process and therefore used a script-based collaborative writing approach. However, almost as many students (48%) viewed the script as inadequate, or even as impeding their accomplishment of the mutual writing task, and therefore chose to deviate from it. Among the students who decided not to follow script-based writing, some engaged in the collective writing approach (32%), characterised as free-form compared with sequential script-based writing. Others opted for the separate writing approach (16%), dividing the writing assignment into single parts among the group members and composing the text without authentic collaboration.
Based on students' individual reflections, the findings of our study suggest that to enhance script-based learning processes, it is essential to provide guidance for the implementation of the script, additional content-related support or tailored choices.

Introduction

In higher education (HE), collaborative writing is a widely used pedagogical approach that aims to prepare students for their future working life by promoting generic skills, such as collaboration, creativity and critical and reflective thinking (Hodges 2002 ; Jääskelä et al. 2018 ). Web 2.0 technologies offer novel and flexible opportunities to develop collaborative writing interventions by means of technological tools, which intend to facilitate student groups’ ability to co-write and edit each other’s texts, share information, produce and combine new knowledge and collaborate in different ways (Aydin and Yildiz 2014 ). Although current studies on HE have identified several positive consequences of applying the collaborative writing approach in technology-enhanced learning (TEL) settings (e.g. Ellis and Goodyear 2013 ; Ertmer et al. 2011 ; Storch 2005 ), in practice, there have been divergent results concerning students’ experiences with such an environment (De Wever et al. 2015 ). Learning exclusively through technology requires students to possess multifaceted skills; combined with the collaborative writing approach, this makes the learning process even more complex for students. Therefore, ensuring that students engage in productive, high-level and collaborative processes in TEL settings remains a significant challenge (e.g. Hämäläinen 2012 ; Fischer et al. 2013b ).

Collaborative writing as a learning method

The first serious discussions and analyses on collaborative writing emerged during the late 1980s when Mackler ( 1987 ) found that, according to co-authors, group efforts facilitated a better outcome than individually produced documents. Ede and Lunsford ( 1990 ) described collaborative writing as a social process that challenged the collaboration partners through (1) meaningful interaction during all stages of writing, (2) sharing of decision making and responsibility for the written product and (3) a single text as the product of collaborative writing. Later, Posner and Baecker ( 1993 ) proposed a detailed taxonomy of the collaborative writing process. They characterised joint writing in terms of four central elements: roles, activities, methods for document control and writing strategies applied during the process.

In this article, collaborative writing denotes the process of producing a common written document (Storch 2005 ) through shared planning, researching, writing, editing and reviewing by multiple authors (Posner and Baecker 1993 ). Definitions of collaborative writing share the idea that the collaboration comprises alternate phases of writing and interacting with partners. Computer-supported collaborative writing incorporates the key pedagogical idea of computer-supported collaborative learning (CSCL). In short, CSCL is a pedagogical approach where learning takes place via social interaction (Dillenbourg 1999 ; Dillenbourg et al. 2009 ). Specifically, collaborative writing creates a learning experience that constructs social and cognitive activities through interaction (Li 2013 ).

Collaborative writing is ‘an iterative and social process that involves a team focused on a common objective that negotiates, coordinates and communicates during the creation of a common document’ (Lowry and Nunamaker 2003 , p. 12). From the students’ perspective, collaborative writing is recognised as a highly complex process and has been subjected to a wide range of studies. However, there are few clear answers about the best strategy to employ for a collaborative writing task. Additionally, there is no clear understanding of how a specific group adopts a particular strategy or strategies and how these evolve during the writing process (Onrubia and Engel 2009 ). Although collaborative writing is a promising and widely used teaching method in HE contexts, different challenges can be distinguished. Collaborative writing has been recognised as a highly demanding task since individual and collaborative cognitive and metacognitive processes are interwoven (Marttunen and Laurinen 2012 ). Developing truly collaborative writing processes (Mayordomo and Onrubia 2015 ) and conducting particular collaborative writing activities, such as revising, have proven to be very challenging (Wichmann and Rummel 2013 ). For example, prior studies have claimed that some students experience it as awkward to revise or comment on other students’ written thoughts in a collaborative wiki environment (De Wever 2011 ; Kale 2014 ). Although in the CSCL context, the critical role of social interactions in successful collaboration has been emphasised (e.g. Kobbe et al. 2007 ; Vogel et al. 2016 ), earlier research suggests that simply involving students in collaborative learning settings through technology does not automatically ensure pedagogically beneficial interactions among students (Hämäläinen and Häkkinen 2010 ). Thus, there is a need to support students in their collaborative writing processes.

Scripting as a method to promote students’ collaboration

Instructional scaffolding, a process designed to support a certain learning activity, has been recognised as a way to generate successful collaborative learning. Over the last decade, several studies have illustrated the positive effects of socio-cognitive scaffolding via collaboration scripts, especially in HE contexts (e.g. Kobbe et al. 2007 ; De Wever et al. 2015 ; Fischer et al. 2013b ; Rau et al. 2017 ; Vogel et al. 2017 ). A collaboration script is a set of instructions used to improve collaboration among students during the learning process. In practice, the advantages of collaboration scripts are typically described as promoting the reason for students’ interaction in collaborative learning settings (Fischer et al. 2013a ; Kollar and Fischer 2007 ; Rummel and Spada 2005 ). In this light, scripting has been applied in various ways, and previous studies have indicated that the collaborative writing process can be facilitated by dividing it into particular phases and assigning students tasks or roles related to each phase in order to support the creation of a jointly written document (e.g. De Wever et al. 2015 ; Heimbuch et al. 2018 ). Additionally, scripting can be used to specify the roles that students are expected to adopt during collaboration processes or to trigger the roles that are relevant for high-level learning activities (e.g. resolving socio-cognitive conflicts; see Strijbos and Weinberger 2010 ).

In this study, we applied the simultaneous sequential integrating construction script (SSICS) (De Wever et al. 2015), which aimed to enhance collaboration and shared responsibility among the students. The SSICS is a pedagogical model that engages students in an authentic collaborative process of shared knowledge construction by sequencing task-based activities and delineating specific roles to achieve shared responsibility for an entire collaborative task (De Wever et al. 2015). Before starting their co-authored writing task, the small groups were advised that the script was intended to provide a general guideline for collaborative working and writing and that precise adherence to it was not required (Dillenbourg and Tchounikine 2007).

The students were introduced to the four-phase SSICS procedure as follows: student 1 in each small group started at phase 1 (state-of-the-art), student 2 at phase 2 (advantages), student 3 at phase 3 (disadvantages) and student 4 at phase 4 (attention points for drawing conclusions) (Fig. 1). The first phase (state-of-the-art) required each student to read the mandatory source together with one additional source and to start writing a draft for this specific part of the collaborative document. The second phase (advantages) required students to evaluate which aspects of the theoretical ideas or empirical evidence could be applied well. The third phase (disadvantages) demanded that students focus on critical points of view or empirical evidence. Finally, the fourth phase of the script called for students to identify the main points of their task and ensure that these were well balanced. After 1 week of working on a particular phase, the students rotated in the following manner: student 1 moved to phase 2, student 2 moved to phase 3, etc. This required them to read more sources in order to proceed to the next phase. In the subsequent phases, the students were asked to revise the drafts written earlier by their fellow students with regard to their phase-specific perspective. In this way, they advanced through the four phases, which were meant to inspire them to edit other group members' texts, take responsibility for all aspects of the mutual writing assignment and thereby engage thoroughly in the collaborative writing process.
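The rotation described above amounts to a simple cyclic schedule: each of the four students starts at a different phase and advances one phase per week, wrapping from phase 4 back to phase 1. A minimal sketch of that schedule (a hypothetical illustration, not code from the study):

```python
# Sketch of the SSICS rotation: 4 students, 4 phases, one rotation per week.
PHASES = ["state-of-the-art", "advantages", "disadvantages",
          "attention points for drawing conclusions"]

def phase_for(student: int, week: int) -> str:
    """Phase assigned to `student` (1-4) in `week` (1-4).

    Student s starts at phase s in week 1 and moves up one phase each
    week, wrapping from the last phase back to the first.
    """
    index = (student - 1 + week - 1) % len(PHASES)
    return PHASES[index]

print(phase_for(1, 1))  # state-of-the-art
print(phase_for(1, 2))  # advantages (student 1 has rotated onward)
print(phase_for(4, 2))  # state-of-the-art (student 4 wraps around)
```

Over the four weeks every student passes through every phase exactly once, which is what lets each group member revise all sections of the shared document.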

Figure 1. Overview of the four phases introduced in the SSICS (modified from De Wever et al. 2015)

The SSICS procedure presented here aimed to coordinate students' joint writing by associating it with collaborative text production, in which new content creation is founded on the former output of the co-authors (Arvaja et al. 2007; Kobbe et al. 2007). In practice, the script guided students in such a way that, during the first phase, they each had to read different information sources and begin writing a draft for one specific section of the collaborative writing project. In the subsequent phases, students were required to read several more sources and were then asked to revise the drafts made earlier by their fellow students. The script thus stimulated the students' collaboration instead of merely presenting a means for dividing the work. The SSICS was labelled 'sequential integrating', given that students should successively contribute to the text and propose justified modifications (Onrubia and Engel 2009), and 'simultaneous', since each of them should start with one section. The SSICS has been shown to be beneficial in increasing shared responsibility and influencing the degree to which a task is addressed (for the number of sources studied and the amount of time spent on the task, see De Wever et al. 2015).

Notwithstanding the positive consequences of scripting (Fischer et al. 2013b), there is still no guarantee that scripting automatically leads to the intended outcomes in collaborative learning settings. Other research on the effectiveness of CSCL scripts has also reported negative findings. For example, Raes et al. (2016) found no significant improvement that could be attributed to the classroom script intervention. Other studies on CSCL have identified the 'over-scripting' issue, where students considered the introduced script excessively restrictive and therefore declined to work according to it (Dillenbourg 2002; Dillenbourg and Hong 2008). In other cases, students reported that the script did not offer them sufficient support, leading to the 'under-scripting' problem (Dillenbourg and Hong 2008; Weinberger et al. 2009). Finally, previous studies on CSCL scripts have identified a possible divergence between the ideal script and the actual, realised script (Dillenbourg and Jermann 2007; Papadopoulos et al. 2013). Despite significant attention to the outcomes of scripted collaboration and to collaboration processes among students (see the review by Vogel et al.), the ways students experience a scripted collaborative task, and how these experiences are associated with their decisions on how to proceed with the task, have been somewhat neglected. In this respect, the appropriation of scripts has recently garnered attention (Stegmann et al. 2016; Tchounikine 2016).

A script can be perceived as an artefact of which learners develop an understanding, and appropriation is the process by which learners perceive, understand and make the script theirs (Tchounikine 2016 ). To make sense of these pedagogical challenges that may arise when integrating scripts into collaborative learning settings (Häkkinen and Mäkitalo-Siegl 2007 ), it is indispensable to gain new knowledge on how students experience and appropriate the introduced external scripts. Given that students’ experiences with the scripts may shed light on why scripts are followed or disregarded, it is important to investigate such experiences. Our study addresses this twofold research question: How did the university students experience the collaborative scripted tasks in relation to ( a ) the kinds of writing approaches they applied and ( b ) the key restrictions and resources affecting these approaches ?

Context of the study

The study involved undergraduate students from one Finnish university and one Belgian university. The participants were taking an optional master-level course in educational sciences (N = 91; Finland, N = 47 and Belgium, N = 44). In total, 95 students participated in the study, but only 91 of them reflected on their joint writing process in sufficient detail; therefore, only their reflections were used in the final analysis. The demographics of the participants are reported in Table 1.

Both interventions were designed and executed according to the same guidelines. Close cooperation between the responsible professors and teachers at both universities helped ensure similar conditions for the interventions; accordingly, the course design, execution and grading procedure for the collaborative assignments were discussed thoroughly before, during and after the interventions. In both universities, the students were randomly assigned to three- to five-member groups that worked together on collaborative writing tasks for 12 weeks. The students were divided into 25 groups (Finland, N = 14 and Belgium, N = 11): 6 groups consisted of 3 students, 18 groups comprised 4 students and 1 group had 5 students. During their joint writing process in these groups, all students were introduced to the four-phase SSICS procedure. The teachers and the small groups met in class for 2 h per week, and the students worked together out of class on a joint writing assignment, online or face-to-face. They were allowed to decide on the practicalities of working in their respective groups and to use as much time as they needed to accomplish the required task. Although the content of the specific courses varied, the students in both universities were involved in tasks with a similar structure and writing requirement: a recommendation paper (12–15 pages) on a specific subject (wikis, collaboration or sense making) with an in-depth focus on how educational practitioners could use these ideas in developing their work.
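The reported group composition is internally consistent with the participant counts; a quick arithmetic check (illustrative only, using the numbers from the text):

```python
# Consistency check of the reported group composition.
groups = {3: 6, 4: 18, 5: 1}  # group size -> number of groups of that size

total_groups = sum(groups.values())                           # 6 + 18 + 1
total_students = sum(size * n for size, n in groups.items())  # 18 + 72 + 5

print(total_groups)    # 25 (Finland 14 + Belgium 11)
print(total_students)  # 95 enrolled; 91 remained after exclusions
```

The 95 enrolled students match the study description, and excluding the four participants whose reflections lacked detail yields the analysed sample of N = 91.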

All groups wrote the text by using Web 2.0 technologies, that is, a shared Google Docs document or Moodle’s Wiki tool. All students were instructed to follow a four-phase SSICS procedure based on a script used by De Wever et al. ( 2015 ), in which the collaborative writing task was divided into different phases to support the students’ collaboration (Dillenbourg and Tchounikine 2007 ; Kollar et al. 2006 ). In this study, the idea of the SSICS is to guide the groups step by step in co-producing their papers by following particular guidelines through the rotation system. In our previous study, we focused on the effects of the SSICS on students’ learning processes (based on collaborative knowledge construction) and outcomes (De Wever et al. 2015 ; De Grez et al. 2017 ). This present study goes one step further by focusing on students’ experiences and how these are associated with their decisions on how to proceed with the scripted CSCL task. Therefore, after the intervention, all students were individually asked to write reflection notes about their experiences during the scripted, computer-supported collaborative writing task.

Data collection and analysis

Students were asked to produce a written document that reflected on the following guiding questions: (1) What did you do during the complete process of writing the paper? Through which steps did your work proceed? (2) What went well, and what proved to be a challenge? How do you explain this? (3) Please explain how students who will take this course next year—and will thus collaborate on a similar task—should proceed in order to collaborate successfully (Rummel et al. 2009 ). They were asked to individually write at least 300 words on each of the guiding questions and hand in their reflection notes at the conclusion of the course. They wrote their reflection papers electronically and submitted them through the learning management system. To obtain reflections that would be as authentic as possible, these papers were not graded. All students involved in the study were required to sign informed consents.

We analysed the students’ reflective writings using the qualitative thematic analysis approach in an inductive manner (see Braun and Clarke 2006 ). An iterative analysis process was conducted by the first author in close association with the second and the fifth authors. All authors participated in definitive cross-checking of the emerging themes and subthemes. We used the following techniques to assure rigour: (1) The data were collected from similar scripted tasks in both universities and underwent the same collection process. (2) Investigator triangulation was used, with three investigators participating in the applied analysis process (Patton 2001 ). (3) The collaboratively developed codebook used systematic, data-driven analysis. (4) The participants’ abundant written reflections were used to support the results. To identify thematic patterns across the data, our analysis process was carried out in accordance with Braun and Clarke’s ( 2006 ) guidelines and proceeded through the following six main phases.

(1) Becoming familiar with the data. To engage with the entire data set, the students’ reflective writings were read and re-read actively to identify the prevalent topics concerning the scripted collaborative writing task in technology-enhanced environments from the university students’ perspective. By highlighting the potentially meaningful information, the initial ideas were noted for the next phases of the analysis. This phase revealed that four participants had not reflected on their scripted collaborative writing process in the light of the research question; thus, their reflection notes were omitted, and the remaining data set ( N = 91) was included in the subsequent analysis.

(2) Generating the initial codes. Two researchers separately coded 16% of the data set with an inductive approach by obtaining all content that was pertinent to the students’ reflections from the vantage point of the research question. We first formulated the initial codes on whether the students indicated that they had applied or deviated from the introduced script. Descriptions of how the students had tackled the task individually and as a group, in relation to the script, were included under the first codes. Another set of codes was created for clarifications of why the students decided to apply or deviate from the script. To ensure agreement on the appropriate codes, the two authors jointly and critically reflected on the coding. The structured codebook was developed as a result of the reflective discussion. It included 92 initial codes labelled with a few words or short sentences, such as ‘struggling in the beginning’, ‘deviated from the script’ and ‘followed the script’. To identify all potentially relevant meanings of the students’ reflections, individual extracts were coded as many times as possible throughout the data set. Furthermore, the process involved collating the initial codes with relevant extracts to identify meaningful patterns.

(3) Searching for themes. In this phase, similar codes were collated under potential themes and relevant subthemes. This iterative process entailed multiple rounds of profound reading of the data and examining the relationships between divergent codes in the light of how university students reflected on scripted collaborative writing. The meaningfully sorted codes represented three broader themes—script-based, collective writing and separate writing approaches—each consisting of subthemes that were constructed by analysing the properties of individual codes within them. In this process, 18 subthemes were identified. Table 2 illustrates an example of codes applied and themes and subthemes identified by way of a short data section.

(4) Critically reviewing the themes. The potential themes were critically reviewed in conjunction with their relevant subthemes to confirm the codes' internal homogeneity and external heterogeneity (Patton 2001). The first and second authors re-examined the themes that emerged in phase 3 and aligned them with the coded extracts throughout the data set. The authors individually rechecked all the extracts within the potential themes and re-evaluated their correspondence with each other and with the research question. A triangulated review was conducted in accordance with the framework, based on a data-driven classification of the data.

(5) Qualifying and naming the themes. Through joint negotiations, all authors agreed on the final themes and subthemes. The process aimed to determine which aspects of the data represented each of the three main themes and 18 subthemes. The authors ensured that the invoked themes and subthemes each had a distinct focus, did not overlap and directly addressed the research question (Braun and Clarke 2012).

(6) Writing the final report. Finally, to produce a compelling overall report on the research findings, analytical descriptions of the relevant data extracts were incorporated to reveal important insights on the research context (the participants’ names are pseudonyms).

Results on approaches to scripted collaborative writing

A thematic analysis of the students’ reflective writings identified three main themes (script-based, collective and separate writing approaches) and 18 subthemes (see Table 3 ) regarding the scripted collaborative writing task in the technology-enhanced environment. Each small group approached the writing task in a group-specific way; some groups followed script-based writing, while others customised or deviated from the proposed script. The students who applied script-based writing highlighted several positive consequences of following the SSICS procedure, such as providing the structure and the organisation for working and writing, as well as ensuring equal involvement in the joint writing process. The students who characterised their mutual writing process as collective used the free-flow writing strategy and the group members’ abilities rather than the structured support of the SSICS. The students who employed the separate writing strategy viewed the SSICS as hampering the accomplishment of the mutual task and therefore decided to split the writing task into detached portions among the group members. Furthermore, the results indicated the factors that played an important role for the students in either following or deviating from the proposed script. Each main theme, together with the subthemes (highlighted in italics and identified by their numbers in parentheses), will be further elaborated in the following sections.
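The shares reported for the three approaches are mutually consistent; a small sanity check (illustrative only — the only raw count given in the paper is 47 of 91 for the script-based approach):

```python
# Consistency check of the reported shares of the three writing approaches.
script_based_pct, collective_pct, separate_pct = 52, 32, 16
assert script_based_pct + collective_pct + separate_pct == 100

# 47 of 91 students used the script-based approach, which rounds to 52%.
share = round(47 / 91 * 100)
print(share)  # 52
```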

Tackling the collaborative writing task by applying the external script: script-based writing approach

Based on the results, 52% of the students (47 of 91) noted that they had applied the proposed external script to their collaborative task by following the script-based sequential writing strategy. Typical of this pattern, the students reported positive consequences when using the script to complete a joint writing assignment. The following excerpt shows how Matilda viewed applying the collaborative script as an advisable and straightforward working method (subtheme 1.1) in which the initial phase was essential and formed a foundation for the co-writers’ forthcoming work:

I would recommend students follow the rotation system. It is an easy way of working. But it is important to take the first step with a lot of care. The first step is the basis the other students will proceed on. (Matilda)

When describing their approach, the students pointed out that the external script guided and organised their collaborative writing process (subtheme 1.2). The script directed them to strive for a common understanding about the tangible accomplishment of the task, thus setting an aim for their joint writing assignment. Kristian reported how the script supported their collaboration by inspiring his group members to plan and discuss their joint task purposefully in the early phases of the writing process.

The first step for our group and us as individuals was to try to understand what the task was all about. We all read through all the instructions and information that was available by ourselves and then discussed how each of us understood them. We also presented many questions for other members of our group. (Kristian)

In this approach, the participants emphasised the affirmative effects of the script and found that it enhanced the group members’ equal involvement (subtheme 1.3) in the collaborative writing process. First, the students stated that the script supported them in tackling the task as a group. Following the script-based rotation system through the four phases encouraged them to complete one another’s work during their collaborative writing process. By applying the script, all group members felt more able to align their input and create a coherent text together. In the following example, Klaara highlighted how the script provided an explicit distribution of the work and the roles, which called for all her group members’ equal participation in writing the assignment.

Concerning the writing process, the rotation system was a good and practical method. It was my first time working like this, but I will definitely use it in the future. It created structure and clarity on what every group member had to do at each moment. Because of this, the contribution and responsibility of each member came forward. (Klaara)

When describing their approach to script-based writing, the students reported that the script guided their efforts towards genuine collaborative writing. Hence, writing activities, such as editing, commenting and completing the text produced by others (subtheme 1.4), were viewed as forming an integral part of the writing process, in which the proposed script played a crucial role in promoting fertile group practices regarding the co-creation of a new text. The students who followed the script emphasised encouragement and edited their colleagues’ writings without hesitation. Lukas described how applying the script rapidly accustomed his group members to revising one another’s texts and forming their ideas based on previous work.

After the first rotation of roles, editing the writings of the others felt a bit clumsy, but quickly we got used to it. The above-mentioned encouragement for the text editing gave us the courage to edit others’ texts. Although we had agreed with what kind of information was needed to search for in each role and finding information was not difficult, it was pleasantly easy to take a little model of what others had already written and build ideas on them. (Lukas)

Additionally, the students who applied the script believed that the script-based circulation system assisted them considerably in conjoining their divergent writing styles (subtheme 1.5). Hence, they noted that the script supported them in creating a consistent shared text together. Although they reported examples of their initial apprehension about merging significantly different writing styles and levels of experience in academic writing, they felt that the proposed script eventually ensured the production of a well-written, uniform and coherent document. Marie expressed an extremely positive stance towards the script-based writing approach and characterised how the introduced script offered accountability to all group members and helped them amalgamate divergent writing styles into one streamlined document:

The rotation system was a huge plus for the essay. The different writing styles are matching and the text reads as a whole. You feel more responsible for the whole essay, instead of being responsible for just one part of the essay. (Marie)

Although the students reported following the proposed script, a few (13%; 12 of 91) mentioned a variety of additional agreements that they made and/or modifications to the script. These additional agreements (subtheme 1.6) proved to support their collaborative writing. The students found that their consensus on particular methods of close reading and identifying significant knowledge, as well as the application of such methods, was beneficial for the collaborative accomplishment of the task. They made additional agreements on the timetable to enable all group members to complete their writing on time. Supporting their writing colleagues thus empowered them to slightly modify the script, as Thomas described:

At first, we all worked on our topic individually, asking for help when necessary; however, we soon discovered we would be more successful if we checked other’s work often or at least when our phase was finished in order to avoid repetition throughout the recommendation, to know the structure of the rest of the text, and to contribute if necessary with information found. (Thomas)

Tackling the collaborative writing task by deviating from the external script: collective writing approach

Our results revealed that nearly one-third of the students (32%; 29 of 91) consistently emphasised writing collectively but stated that they actively revised or deviated from the proposed script. Typical of this pattern, the students highlighted the significance of group dynamics, supporting and complementing one another during the writing process. First, they reported examples of how they used a free-flowing collaborative writing approach (subtheme 2.1) rather than a sequential scripted writing technique. They expressed how, as a group, they were more successful by ignoring the script-based sequential writing approach and instead engaging in free-flowing collective writing. For example, Silja portrayed how her group collectively settled on an unattached writing approach:

We discovered that it was better to stop following the proposed rotation system. Instead, we chose to work with the articles we were reading. We gave every group member the freedom to complete every part of the paper with relevant information. (Silja)

The students who mentioned that they accomplished their mutual writing task by exploiting the free-flowing technique focused on enhancing the most inadequate properties of the co-authored text by utilising their group resources (such as group dynamics; subtheme 2.2). In addition, some students emphasised their group members’ individual interests as resources for collaborative writing (subtheme 2.3). In this light, Adrien described how deviating from the proposed script led his group to develop an apposite self-script:

This was also the time we started to diverge from the original script. In my opinion, we started to work according to what is the most important need or deficiency of our text. This resulted in dividing the work according to the deficiencies of the text, and on the other hand according to our own interests. ... However, I think the whole process was more free-flowing rather than consciously structured (even though there was some kind of script). I think we came up with a good self-script that worked well in this context, with these specific people and our group dynamics. I think we were also lucky to get this kind of group. (Adrien)

Other students demonstrated how they deviated from the script by permitting their group members to write from the perspective of their individual strengths. Furthermore, they reported how their collaboration evolved from following the introduced script to free-flowing collaborative writing. Thus, they rejected the rotation system and allowed their group members’ unique skills to emerge in their assigned areas of responsibility (subtheme 2.4), as reflected in Saara’s statement:

We divided the tasks again. We took into account the talents of each member. Somebody else and I were better in writing, so we read all parts and processed the feedback … In short, the finalizing of the paper was our duty. Another group member checked the (in-text) references and, if necessary, adjusted them. (Saara)

Moreover, the students felt that the proposed script was difficult to follow (subtheme 2.5), especially in the early phases of the writing process, or hard to understand from the perspective of completing the collaborative writing assignment. Matias described how, in his group’s experience, writing comprehensively together substituted for rotating the scripted roles:

We did not do any interchange of the roles, as was instructed. We tried this once, but ended up writing the essay comprehensively together. The reason for that could be that at first we didn’t understand what we were doing and especially that we didn’t have time for all interchanges of the roles. (Matias)

Other students noticed that the script did not sufficiently support them in writing a smooth shared text as a group. When they recognised that some of their group members were struggling with joint writing, they experienced the proposed script as meaningless (subtheme 2.5) and therefore opted out of following the script instructions. They reported a variety of examples concerning difficulties in writing together. Some group members were unable to write their parts while rotating roles, and writing styles differed radically in some groups. Helena portrayed how these difficulties resulted in abandoning the proposed script:

The next time that we came together, some of us were finished with their part while others were not. Some group members told which difficulties they encountered. We gave each other tips and referred to articles. In this way, we discovered that it was more convenient to not follow the rotation system anymore. (Helena)

Another student reported writing the whole document together (subtheme 2.6), face to face and/or online, focusing on specific aspects of the text. Kaarina described this approach as follows:

After round two of the rotations, we switched to revising every part. At first, we did this by posting online comments and suggestions, but very quickly, we decided that it was easier to revise it face to face. During the face-to-face time, we revised different things. We paid attention to the cohesion of the text, substantive and correct transitions and smooth transitions, correct language use etc. (Kaarina)

Tackling the collaborative writing task by deviating from the external script: separate writing approach

Our results indicated that 16% of the students (15 of 91) rejected the proposed script and settled on the separate writing approach, in which the joint writing task was divided into separate parts instead of being written sequentially and collaboratively. When the students did not know how to proceed with the scripted task appropriately and effectively, they ended up deviating from the script. They felt that the presented script was an inconsistent writing method (subtheme 3.1) for writing together. Lotte outlined how the script offered insufficient support for the group to tackle the joint collaborative task:

Our group started the writing process by following the given script—the four steps. However, after the first week, all of us were confused and overwhelmed with our assigned task … In our work, we saw a lot of overlapping ideas and summaries of the required article. For the next two weeks, all of us were fairly inactive with this paper, as we did not know how to continue writing the paper … The outline served as our new script. We divided the writing task so that every member was responsible for at least one part of the essay … Editing a single text all together was not a good idea. We spent too much time with not much efficiency. (Lotte)

Furthermore, inadequate planning and poor organisation (subtheme 3.2) of the activities in the early phases of the shared writing process were reported as key reasons for deviating from the script. These included tackling the group work without common agreements concerning working practicalities or the organisation of the written document, and hardly discussing anything related to the group task. When the students experienced this lack of structure and agreement, they decided to allocate separate portions of the writing task to each group member. In some groups, inadequate planning caused difficulties with collaboration, as conveyed by Kalle:

Everybody got a part: a state of the art, advantages and disadvantages, attention points and recommendations. Originally, we started working according to the rotation. Hereabout we did not make good agreements and this resulted in chaos. That is why we changed our approach and decided that everybody checked their sources for useful information. (Kalle)

The students found it challenging to follow the script when their group composition suddenly changed, such as when some members dropped out of the course and left the group. Ruben demonstrated how the divergent group size in relation to the number of script phases (subtheme 3.3) forced his group to change their writing strategy in midstream and settle on the separate writing approach, in which each student was responsible for one part of the assignment:

First, we worked to our steps, but after one member left us and we more focused to part which we started with (extension, editing) and added few sentences to conclusion. (Ruben)

The students also provided examples of the difficulties they had encountered in text production as a group (subtheme 3.4). Leena illustrated her experience concerning the complexity of generating shared text, as follows:

We ended up not switching the roles... Because of our different writing and reference styles, we found it difficult to produce text together. Most of the text we produced individually and later combined. (Leena)

When the students experienced difficulties in grouping (subtheme 3.5), such as forming connections with their group members at the beginning of the writing process, they readily rejected the sequential writing style and chose a more individual text production approach in which separately written parts were combined into a complete document. Moreover, the students reported that, regardless of being in a group, they worked individually on their joint writing assignment. Martta described how they worked together but practically alone:

We did our group work quite independently even though we were a group; each one of us always had their own part and we did not talk much to each other about the subject of group work. (Martta)

In this light, the students reported examples of how they worked individually on their own parts by sharing their work online. In such cases, the students identified themselves as more attuned to individual rather than collaborative working strategies. Justin described their separate writing process as follows:

All the members of our group worked very individually—probably because we all were used to individual work. As a group, we did not discuss that much; we worked on Google Drive and shared our work online. Our group worked more like a team, where everyone had their own task to complete; it never really reminded me of actual group work. (Justin)

As indicated in the above example, the script did not provide students with adequate guidance when working as part of a group in a technology-enhanced environment (subtheme 3.6). Additional guidance was necessary, as Justin explained:

I think the idea of this course is good: providing possibilities to work on a demanding task in a group. But it was a challenge because our group did not really bond, and we never got that close-knit feeling of belonging to the same group, partly because we hardly ever saw each other in real life—we worked online almost the whole time. I think it would be a good idea to give the group members a little more to go on at the beginning of the task. (Justin)

Discussion

To date, students’ perspectives have been under-represented in studies on CSCL scripts. Therefore, this study focused on understanding university students’ experiences in scripted collaboration and how their experiences were associated with the decisions they made during a CSCL writing task. We found that three divergent approaches were employed: script-based, collective and separate writing. The students who followed the script-based writing strategy emphasised its positive consequences. The students who applied the collective writing strategy preferred free-flow writing and viewed their group as an essential resource for accomplishing the required task, and therefore decided to disregard script-based writing. Similarly, the students who used the separate writing approach deviated from the script to split the writing task into separate portions among the group members. Over half of the students (52%) reported following the script-based writing approach. They indicated that the script enhanced their collaboration, organised their writing process and ensured the production of a consistent written document. However, nearly as many students (48%) encountered difficulties in following the proposed script and therefore deviated from it when tackling the writing assignment.

At the general level, scripting seemed to enhance students’ collaborative writing by introducing the reason for interaction (Kobbe et al. 2007; Fischer et al. 2013a; Kollar and Fischer 2007; Rummel and Spada 2005; Vogel et al. 2017). Fifty-two percent of the students indicated a positive stance towards the proposed script. In their reflections, the students pointed out that the SSICS promoted their collaboration by triggering shared interaction (see Hämäläinen and Häkkinen 2010) and ensuring the equal involvement of all group members. Furthermore, they highlighted that the script challenged and encouraged them in collaborative text production: editing, commenting on and extending their co-writers’ ideas. Despite this positive finding, 32% of the students experienced the SSICS as meaningless and reported that they deviated from the script and focused on a more free-form collective writing approach. The findings of our study illustrated that encountering difficulties while endeavouring to write together led such groups to stop following the rotation system. The group members’ different levels of ability to produce the text together contributed to their reflections on the script and played an essential role in their deviation from it. Thus, instead of relying on the guidance of the proposed script, they leaned on group dynamics and other students as resources for shared group work (see Arvaja 2012). In such cases, the students viewed script-based writing as intrusive and limiting and accordingly rejected it (cf. over-scripting in Dillenbourg 2002; Dillenbourg and Hong 2008). Additionally, the students who rejected the proposed script and engaged in a collective approach reported that they developed a self-script to accomplish the joint writing task. These interpretations and resolutions represent how scripting collaboration can lead to script appropriation by students, that is, their personal enactment of the proposed script (Tchounikine 2016).

Finally, the findings revealed that 16% of the students applied the separate writing approach, which emerged as they endeavoured to accomplish the writing task individually rather than collaboratively. According to the students’ interpretations, they deviated from the given script by allocating separate segments of the writing task to each group member. These students indicated that difficulties in the early phases of the writing process (i.e. group composition, a lack of effective communication or changes in the number of group members) influenced their decision to proceed with separate writing strategies. In this light, the results are in accordance with previous findings indicating the ineffectiveness of some scripts and potential moderators, such as transactivity and the script level, that is, under-scripting (e.g. Dillenbourg and Hong 2008; Weinberger et al. 2009), as well as additional content- and context-related support (Sadler 2004). In this regard, our findings suggest that the starting phase of the task proved to be essential (see also Kollar and Fischer 2007); therefore, it is crucial to provide guidance in implementing the script. Furthermore, the findings indicate that the students need to find common ground not only at the beginning (Tan 2018) but also during the execution of the task. In line with previously described moderators of the effectiveness of CSCL scripts (Vogel et al. 2016), additional content-related support seems to be an essential element of scripting. Thus, it is necessary to provide content-related support without hampering the students’ motivation, for example by offering sufficient choices.

This study’s findings have to be viewed in light of the following limitations. First, the study did not employ an experimental pretest-posttest design; thus, it did not determine the actual effects of the CSCL script on the students’ collaborative writing process. Regardless of this evident limitation, the study was based on an extensive qualitative data set of the students’ reflective writings, collected within various groups and contexts. Hence, the study revealed significant findings on students’ insights, which emerged in unique ways through distinct group processes. Second, the data were collected from the students’ individual reflection notes concerning the scripted collaborative writing process, which might not thoroughly capture the collaborative process, especially at the group level. Nevertheless, the study was advantageous in indicating several potential reasons why some students experienced the proposed script as worthwhile, while others found that the script hampered them in accomplishing the collaborative writing task. Third, this study did not aim to investigate group-level interaction processes during the scripted collaboration and how these appeared to affect individual meanings and socially shared understandings alike. Therefore, combining these two research pathways—students’ reflective texts and collaborative conversations (Krange and Ludvigsen 2008)—would be fruitful in future work. Finally, the findings were based exclusively on the students’ self-reporting on the scripted collaborative writing process. Further studies might benefit from employing multiple data sources (e.g. video recordings, interviews, quantitative procedures) to gain a deeper understanding of this multidimensional phenomenon. Utilising learning analytics together with students’ reflective writings could also provide more robust evidence concerning the relation between the emerging collaboration process and students’ experiences of scripted collaborative writing in TEL settings.

Our study contributes to current research by providing novel insights into university students’ experiences with scripted collaborative writing in a technology-enhanced environment. First, previous studies have focused on the outcomes of students’ collaboration and on collaboration processes (see, e.g. the review by Vogel et al. 2017), neglecting how students actually experience scripted collaboration and how their experiences are associated with their decisions when proceeding with the task. The present study both reveals divergent insights into 91 students’ individual reflections and offers a unique view into scripted collaborative writing in CSCL environments. Second, pedagogically, an important implication of our study is that its results shed new light on the ineffectiveness of scripting (Weinberger 2008) for particular groups of students. As previous studies have established, students employ different strategies to tackle a collaborative writing task (Onrubia and Engel 2009). For example, Limbu and Markauskaite (2015, p. 398) categorise the following conceptions: ‘(1) the division of work to complete the task, (2) the combination of expertise to produce a good end product, (3) the fusion of ideas and insights to enable deeper understanding and (4) the development of new skills and attitudes for collaborative work’. The current study also distinguishes students’ individual preferences concerning collaborative writing, which seem to contribute to the decisions that they make as a group to accomplish the scripted task. This finding raises intriguing questions regarding the existence of different types of collaborative writing approaches and their relation to the optimal level of script-based support for collaborative writing tasks.

Third, our study provides an important theoretical implication. Although it is well known that a balance must be found to provide optimal support and avoid over-scripting or under-scripting, our results illustrate another possible kind of mismatch: between the provided script and how the students who rejected it envisioned performing the collaborative task as a group. While over-scripting refers to cases where students know what to do and the provided script therefore hampers them (Dillenbourg 2002; Dillenbourg and Hong 2008), scripting mismatch is linked more to group situations in which students do not see the point of either the script or the collaborative task and thus reject the provided script. In fact, the findings suggest some kind of mismatch between the ideal script (Papadopoulos et al. 2013), the task environment and students’ ideas about the script. Following the script theory of guidance (Fischer et al. 2013b), it could be argued that students’ internal scripts conflict with the external script provided, and to keep going, students need a certain level of alignment between internal and external scripts. Either this alignment is sufficient and they follow the actual script, or it is inadequate and they adapt or neglect the external script. Specifically, when the script differs too much from students’ internal ideas and/or preferences (internal script), it is difficult for them to recognise the purpose of the script, and they may feel alienated from it. The experienced usefulness of a script could thus be argued to be one of the most important predictors of script use. Moreover, the way that students understand the external script can influence how they appropriate and realise it to different degrees (see also Stegmann et al. 2016).

Availability of data and materials

The research data are stored at the University of Jyväskylä and at Ghent University. Due to both universities’ privacy policies, the research data cannot be shared.

Abbreviations

  • CSCL: Computer-supported collaborative learning
  • TEL: Technology-enhanced learning
  • HE: Higher education
  • SSICS: Simultaneous sequential integrating construction script

References

Arvaja, M. (2012). Personal and shared experiences as resources for meaning making in a philosophy of science course. International Journal of Computer-Supported Collaborative Learning, 7(1), 85–108. https://doi.org/10.1007/s11412-011-9137-5

Arvaja, M., Salovaara, H., Häkkinen, P., & Järvelä, S. (2007). Combining individual and group-level perspectives for studying collaborative knowledge construction in context. Learning and Instruction, 17(4), 448–459. https://doi.org/10.1016/j.learninstruc.2007.04.003

Aydin, Z., & Yildiz, S. (2014). Using wikis to promote collaborative EFL writing. Language Learning & Technology, 18(1), 160–180.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa

Braun, V., & Clarke, V. (2012). Thematic analysis. In H. Cooper (Ed.), Handbook of research methods in psychology (pp. 57–71). Washington, DC: APA Books.

De Grez, N., Lämsä, J., Hämäläinen, R., Kollar, I., & De Wever, B. (2017). How do students deal with 2 different scripts for a collaborative inquiry scientific reasoning task? Paper presented at the 17th Biennial EARLI Conference for Research on Learning and Instruction.

De Wever, B. (2011). Orchestrating collaborative learning in a wiki-environment. Paper presented at the 14th Biennial Conference for Research on Learning and Instruction (EARLI), Exeter, UK. Retrieved from https://earli.org/sites/default/files/2017-03/BookofAbstractsandSummaries.pdf

De Wever, B., Hämäläinen, R., Voet, M., & Gielen, M. (2015). A wiki task for first-year university students: The effect of scripting students’ collaboration. The Internet and Higher Education, 25, 37–44. https://doi.org/10.1016/j.iheduc.2014.12.002

Dillenbourg, P. (1999). What do you mean by ‘collaborative learning’? In P. Dillenbourg (Ed.), Collaborative learning: Cognitive and computational approaches (pp. 1–16). Amsterdam: Pergamon, Elsevier Science.

Dillenbourg, P. (2002). Over-scripting CSCL: The risks of blending collaborative learning with instructional design. In P. A. Kirschner (Ed.), Three worlds of CSCL: Can we support CSCL? (pp. 61–91). Heerlen: Open University of The Netherlands.

Dillenbourg, P., & Hong, F. (2008). The mechanics of CSCL macro scripts. International Journal of Computer-Supported Collaborative Learning, 3(1), 5–23. https://doi.org/10.1007/s11412-007-9033-1

Dillenbourg, P., Järvelä, S., & Fischer, F. (2009). The evolution of research on computer-supported collaborative learning. In N. Balacheff, S. Ludvigsen, T. de Jong, A. Lazonder, & S. Barnes (Eds.), Technology-enhanced learning (pp. 3–19). Dordrecht: Springer. https://doi.org/10.1007/978-1-4020-9827-7_1

Dillenbourg, P., & Jermann, P. (2007). Designing integrative scripts. In F. Fischer, I. Kollar, H. Mandl, & J. M. Haake (Eds.), Scripting computer-supported collaborative learning: Cognitive, computational, and educational perspectives (pp. 275–301). New York: Springer. https://doi.org/10.1007/978-0-387-36949-5

Dillenbourg, P., & Tchounikine, P. (2007). Flexibility in macro scripts for computer-supported collaborative learning. Journal of Computer Assisted Learning, 23(1), 1–13. https://doi.org/10.1111/j.1365-2729.2007.00191.x

Ede, L. S., & Lunsford, A. A. (1990). Singular texts/plural authors : Perspectives on collaborative writing . Carbondale: Southern Illinois University Press.

Ellis, R., & Goodyear, P. (2013). Students’ experiences of e-learning in higher education: the ecology of sustainable innovation . New York: Routledge.

Ertmer, P. A., Newby, T. J., Liu, W., Tomory, A., Yu, J. H., & Lee, Y. M. (2011). Students’ confidence and perceived value for participating in cross-cultural wiki-based collaborations. Educational Technology Research and Development, 59 (2), 213–228.

Fischer, F., Kollar, I., Stegmann, K., & Wecker, C. (2013b). Toward a script theory of guidance in computer-supported collaborative learning. Educational Psychologist, 48 (1), 56–66. https://doi.org/10.1080/00461520.2012.748005

Fischer, F., Kollar, I., Stegmann, K., Wecker, C., & Zottmann, J. (2013a). Collaboration scripts in computer-supported collaborative learning. In C. E. Hmelo-Silver, C. A. Chinn, C. K. Chan, & A. M. O’Donnell (Eds.), The international handbook of collaborative learning (pp. 403–419). New York: Routledge.

Häkkinen, P., & Mäkitalo-Siegl, K. (2007). Educational perspectives on scripting CSCL. In F. Fischer, I. Kollar, H. Mandl, & J. M. Haake (Eds.), Scripting computer-supported collaborative learning: Cognitive, computational and educational approaches (pp. 263–271). New York: Springer.

Hämäläinen, R. (2012). Methodological reflections: Designing and understanding computer-supported collaborative learning. Teaching in Higher Education, 17 (5), 603–614. https://doi.org/10.1080/13562517.2012.658556

Hämäläinen, R., & Häkkinen, P. (2010). Teachers’ instructional planning for computer-supported collaborative learning: Macro-scripts as a pedagogical method to facilitate collaborative learning. Teaching and Teacher Education, 26 (4), 871–877. https://doi.org/10.1016/j.tate.2009.10.025

Heimbuch, S., Ollesch, L., & Bodemer, D. (2018). Comparing effects of two collaboration scripts on learning activities for wiki-based environments. International Journal of Computer-Supported Collaborative Learning, 13 (3), 331–357. https://doi.org/10.1007/s11412-018-9283-0

Hodges, C. G. (2002). Learning through collaborative writing. Reading, 36 (1), 4–10. https://doi.org/10.1111/1467-9345.00178

Jääskelä, P., Nykänen, S., & Tynjälä, P. (2018). Models for the development of generic skills in Finnish higher education. Journal of Further and Higher Education, 42 (1), 130–142. https://doi.org/10.1080/0309877X.2016.1206858

Kale, U. (2014). Can they plan to teach with Web 2.0? Future teachers’ potential use of the emerging web. Technology, Pedagogy and Education, 23 (4), 471–489.

Kobbe, L., Weinberger, A., Dillenbourg, P., Harrer, A., Hämäläinen, R., Häkkinen, P., & Fischer, F. (2007). Specifying computer-supported collaboration scripts. International Journal of Computer-Supported Collaborative Learning, 2 (2-3), 211–224. https://doi.org/10.1007/s11412-007-9014-4

Kollar, I., & Fischer, F. (2007). Supporting self-regulated learners for a while and what computers can contribute. Journal of Educational Computing Research, 35 (4), 425–435.

Kollar, I., Fischer, F., & Hesse, F. W. (2006). Collaboration scripts–a conceptual analysis. Educational Psychology Review, 18 (2), 159–185. https://doi.org/10.1007/s10648-006-9007-2

Krange, I., & Ludvigsen, S. (2008). What does it mean? Students’ procedural and conceptual problem solving in a CSCL environment designed within the field of science education. International Journal of Computer-Supported Collaborative Learning, 3 (1), 25–51. https://doi.org/10.1007/s11412-007-9030-4

Li, M. (2013). Individual novices and collective experts: Collective scaffolding in wiki-based small group writing. System, 41 (3), 752–769. https://doi.org/10.1016/j.system.2013.07.021

Limbu, L., & Markauskaite, L. (2015). How do learners experience joint writing: University students’ conceptions of online collaborative writing tasks and environments. Computers & Education, 82, 393–408. https://doi.org/10.1016/j.compedu.2014.11.024

Lowry, P. B., & Nunamaker, J. F. (2003). Using internet-based, distributed collaborative writing tools to improve coordination and group awareness in writing teams. IEEE Transactions on Professional Communication, 46 (4), 277–297. https://doi.org/10.1109/TPC.2003.819640

Mackler, T. (1987). Group produced documents: An exploratory study of collaborative writing processes (Unpublished doctoral dissertation). New York: Columbia University Teachers College.

Marttunen, M., & Laurinen, L. (2012). Participant profiles during collaborative writing. Journal of Writing Research, 4 (1), 53–79. https://doi.org/10.17239/jowr-2012.04.01.3

Mayordomo, R. M., & Onrubia, J. (2015). Work coordination and collaborative knowledge construction in a small group collaborative virtual task. The Internet and Higher Education, 25, 96–104. https://doi.org/10.1016/j.iheduc.2015.02.003

Onrubia, J., & Engel, A. (2009). Strategies for collaborative writing and phases of knowledge construction in CSCL environments. Computers & Education, 53 (4), 1256–1265. https://doi.org/10.1016/j.compedu.2009.06.008

Papadopoulos, P. M., Demetriadis, S. N., & Weinberger, A. (2013). ‘Make it explicit!’: Improving collaboration through increase of script coercion. Journal of Computer Assisted Learning, 29 (4), 383–398. https://doi.org/10.1111/jcal.12014

Patton, M. Q. (2001). Qualitative evaluation and research methods (2nd ed.). Thousand Oaks: Sage Publications, Inc.

Posner, I. R., & Baecker, R. M. (1993). How people write together. In R. M. Baecker (Ed.), Readings in groupware and computer-supported cooperative work: Assisting human-human collaboration (pp. 239–250). San Mateo: Morgan Kaufmann.

Raes, A., Schellens, T., De Wever, B., & Benoit, D. F. (2016). Promoting metacognitive regulation through collaborative problem solving on the web: When scripting does not work. Computers in Human Behavior, 58, 325–342.

Rau, M. A., Bowman, H. A., & Moore, J. W. (2017). An adaptive collaboration script for learning with multiple visual representations in chemistry. Computers & Education, 109, 38–55. https://doi.org/10.1016/j.compedu.2017.02.006

Rummel, N., & Spada, H. (2005). Learning to collaborate: An instructional approach to promoting collaborative problem solving in computer-mediated settings. The Journal of the Learning Sciences, 14 (2), 201–241.

Rummel, N., Spada, H., & Hauser, S. (2009). Learning to collaborate while being scripted or by observing a model. International Journal of Computer-Supported Collaborative Learning, 4 (1), 69–92. https://doi.org/10.1007/s11412-008-9054-4

Sadler, T. (2004). Informal reasoning regarding socioscientific issues: A critical review of research. Journal of Research in Science Teaching, 41 (5), 513–536. https://doi.org/10.1002/tea.20009

Stegmann, K., Kollar, I., Weinberger, A., & Fischer, F. (2016). Appropriation from a script theory of guidance perspective: A response to Pierre Tchounikine. International Journal of Computer-Supported Collaborative Learning, 11 (3), 371–379. https://doi.org/10.1007/s11412-016-9241-7

Storch, N. (2005). Collaborative writing: Product, process, and students’ reflections. Journal of Second Language Writing, 14 (3), 153–173. https://doi.org/10.1016/j.jslw.2005.05.002

Strijbos, J. W., & Weinberger, A. (2010). Emerging and scripted roles in computer-supported collaborative learning. Computers in Human Behavior, 26 (4), 491–494. https://doi.org/10.1016/j.chb.2009.08.006

Tan, E. (2018). Effects of two differently sequenced classroom scripts on common ground in collaborative inquiry learning. Instructional Science, 46 (6), 893–919. https://doi.org/10.1007/s11251-018-9460-6

Tchounikine, P. (2016). Contribution to a theory of CSCL scripts: Taking into account the appropriation of scripts by learners. International Journal of Computer-Supported Collaborative Learning, 11 (3), 349–369. https://doi.org/10.1007/s11412-016-9240-8

Vogel, F., Kollar, I., Ufer, S., Reichersdorfer, E., Reiss, K., & Fischer, F. (2016). Developing argumentation skills in mathematics through computer-supported collaborative learning: The role of transactivity. Instructional Science, 44 (5), 477–500. https://doi.org/10.1007/s11251-016-9380-2

Vogel, F., Wecker, C., Kollar, I., & Fischer, F. (2017). Socio-cognitive scaffolding with computer-supported collaboration scripts: A meta-analysis. Educational Psychology Review, 29 (3), 477–511. https://doi.org/10.1007/s10648-016-9361-7

Weinberger, A. (2008). CSCL scripts: Effects of social and epistemic scripts on computer-supported collaborative learning. Berlin: VDM Verlag.

Weinberger, A., Kollar, I., Dimitriadis, Y., Mäkitalo-Siegl, K., & Fischer, F. (2009). Computer-supported collaboration scripts. In N. Balacheff, S. Ludvigsen, T. de Jong, A. Lazonder, & S. Barnes (Eds.), Technology-enhanced learning (pp. 155–173). Dordrecht: Springer. https://doi.org/10.1007/978-1-4020-9827-7_10

Wichmann, A., & Rummel, N. (2013). Improving revision in wiki-based writing: Coordination pays off. Computers & Education, 62, 262–270. https://doi.org/10.1016/j.compedu.2012.10.017

Acknowledgements

The authors would like to thank the participants of the study. The authors would also acknowledge the assistance of research staff at the Department of Education, University of Jyväskylä and Department of Educational Studies, Ghent University.

This work was supported by the Academy of Finland (grant numbers 292466 and 318095) and by the Multidisciplinary Research on Learning and Teaching profiles I and II of the University of Jyväskylä (JYU).

Author information

Authors and affiliations

Department of Education, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä, Finland

Kirsi Heinonen, Raija Hämäläinen & Sophie van der Meijs

Department of Educational Studies, Ghent University, Henri Dunantlaan 2, B-9000, Ghent, Belgium

Nore De Grez & Bram De Wever

Contributions

The initial manuscript draft was written by the first author and was then reviewed, revised, and complemented several times in conjunction with the co-authors. The data analysis was carried out mainly by the first author, with close support from the second and fifth authors. All authors participated in the final cross-checking of the results, and all authors read and approved the final manuscript.

Corresponding author

Correspondence to Kirsi Heinonen .

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

Reprints and permissions

About this article

Cite this article

Heinonen, K., De Grez, N., Hämäläinen, R. et al. Scripting as a pedagogical method to guide collaborative writing: university students’ reflections. RPTEL 15, 15 (2020). https://doi.org/10.1186/s41039-020-00131-x

Received: 28 August 2019

Accepted: 23 March 2020

Published: 02 July 2020

DOI: https://doi.org/10.1186/s41039-020-00131-x

  • Collaborative writing

COLLABORATIVE WRITING AND CRITICAL THINKING

Murnijati (2016). Collaborative writing and critical thinking. S2 thesis, Universitas Pendidikan Indonesia.

This study aims to investigate the elements of critical thinking (CT) demonstrated in students’ writing resulting from collaborative writing, and how collaborative writing (CW) helps develop students’ critical thinking. The study involved 27 Grade XII students, an observer, and the researcher as a participant observer who conducted the teaching program in her regular class. The study employed a qualitative case study approach, drawing data from students’ written texts, classroom observation, and student interviews. The CT elements and CT dispositions were analysed based on the CT theories proposed mainly by Reichenbach, Chaffee, Ennis, and Browne and Keeley. Findings revealed that the students were able to demonstrate the CT elements of issue, arguments, facts, and opinions in their writing. In general, their arguments did not satisfy the expected standards of clarity, sufficiency, and accuracy, and thus were weak or unsound. However, during the CW process, the students demonstrated CT dispositions through their attempts, as typical critical thinkers do. The teaching program showed that CW with heterogeneous groups, well-organized writing phases, the writing task, and the teacher’s role as a facilitator, motivator, and model of CT dispositions provided ample space for students to engage actively in meaningful interactions that encouraged them to use their thinking. To ensure validity, triangulation was conducted through student interviews. The findings suggest that students need to become habituated to learning that involves CT and to be given guidance to improve the quality of their writing and thinking. Meanwhile, the teacher needs to teach writing and CT more explicitly through ELT in the future.

Item Type: Thesis (S2)
Additional Information: Call number: T B.ING MUR c-2016; Supervisors: I. Emi Emilia, II. Bachrudin.
Uncontrolled Keywords: critical thinking elements, critical thinking dispositions, collaborative writing process
Subjects:
Divisions:
Depositing User: Mr mhsinf 2017
Date Deposited: 28 Aug 2017 06:03
Last Modified: 28 Aug 2017 06:03
URI:

Collaborative writing at work: Peer feedback in a blended learning environment

  • Published: 25 August 2020
  • Volume 26, pages 1293–1310 (2021)

Cite this article

  • Teresa López-Pellisa   ORCID: orcid.org/0000-0003-3151-447X 1 ,
  • Neus Rotger   ORCID: orcid.org/0000-0002-5298-2267 2 &
  • Fernando Rodríguez-Gallego   ORCID: orcid.org/0000-0002-6539-0447 1  

This exploratory study aims to analyse the nature of peer feedback during a collaborative writing assignment and to identify the possible effects of feedback on the revision of a text written by university students in a blended learning environment. Under analysis are two different graduate courses in academic writing, during which, over a period of a semester, the students (n = 85) were divided into 25 work groups to carry out a co-evaluation assignment with the support of a technology platform. The results obtained indicate that, when collaborative writing includes peer feedback rather than unidirectional corrections from the teacher, students respond more reflectively and constructively, discuss the content they are working with, and, as a result, make significant changes in their own writing.


All figures are drawn from the survey conducted during the project. They can be consulted in full, in Spanish, in López-Pellisa et al. (2020b). The translations into English are our own.

All examples are drawn from the qualitative interviews conducted during the project. They can be consulted in full, in Spanish, in López-Pellisa et al. (2020a). The translations into English are our own.

Álvarez, I., Espasa, A., & Guasch, T. (2012). The value of feedback in improving collaborative writing assignments in an online learning environment. Studies in Higher Education, 37 (4), 387–400. https://doi.org/10.1080/03075079.2010.510182

Bangert-Drowns, R., Hurley, M., & Wilkinson, B. (2004). The effects of school-based writing to-learn interventions on academic achievement: A meta-analysis. Review of Educational Research, 74 (1), 29–58.

Bradley, L., & Thouësny, S. (2017). Students’ collaborative peer reviewing in an online writing environment. Themes in Science and Technology Education, 10 (2), 69–83.

Bustos Sánchez, A. (2004). Un modelo para blended-learning aplicado a la formación en el trabajo. Compartimos prácticas - ¿compartimos saberes? RIED. Revista Iberoamericana de Educación a Distancia, 7 (1–2), 113–132.

Bustos Sánchez, A. (2009). Escritura colaborativa en línea. Un estudio preliminar para el estudio orientado al análisis del proceso de co-autoría. RIED. Revista Iberoamericana de Educación a Distancia, 12 (2), 33–55.

Carlino, P. (2008). Leer y escribir en la universidad, una nueva cultura. ¿Por qué es necesaria la alfabetización académica? In E. Narváez & D. Cadena (Eds.), Los desafíos de la lectura y la escritura en la educación superior: caminos posibles (pp. 159–194). Cali: Universidad Autónoma de Occidente.

Cassany, D., & Morales, Ó. (2009). Leer y escribir en la universidad: los géneros científicos. In D. Cassany (Comp.), Para ser letrados. Voces y miradas sobre la lectura (pp. 109–128). Barcelona: Paidós.

Castelló, M. (2009). Aprender a escribir textos académicos: ¿copistas, escribas, compiladores o escritores? In J. I. Pozo, & M. Pérez (Coords.), Psicología del aprendizaje universitario: la formación de competencias (pp. 120–133). Madrid: Morata.

Crook, C. K., Gross, H., & Dymott, R. (2006). Assessment relationships in higher education: The tension of process and practice. British Educational Research Journal, 32, 95–114.

Deveci, T. (2018). Student perceptions on collaborative writing in a project-based course. Universal Journal of Educational Research, 6 (4), 721–732. https://doi.org/10.13189/ujer.2018.060415

Dysthe, O., Lillejord, S., Vines, A., & Wasson, B. (2010). Productive e-feedback in higher education. Two models and some critical issues. In A. Ludvigsen, I. Rasmussen, & R. Säljö (Eds.), Learning across sites: New tools, infrastructures and practices (pp. 243–259). Oxford: Pergamon Press.

Espasa, A., & Meneses, J. (2010). Analysing feedback processes in an online teaching and learning environment: An exploratory study. Higher Education, 59 (3), 277–292.

Guasch, T., Espasa, A., & Álvarez, I. (2010). Formative e-feedback in collaborative writing assignments: The effect of the process and time. eLC Research Paper Series, 1, 49–59.

Guasch, T., Espasa, A., Álvarez, I., & Kirschner, P. A. (2013). Effects of feedback on collaborative writing in an online learning environment. Distance Education, 34 (3), 324–338. https://doi.org/10.1080/01587919.2013.835772

Guasch, T., Espasa, A., & Martínez-Melo, M. (2019). The art of questioning in online learning environments: The potentialities of feedback in writing. Assessment & Evaluation in Higher Education, 44 (1), 111–123. https://doi.org/10.1080/02602938.2018.1479373

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77 (1), 81–112. https://doi.org/10.3102/003465430298487

Herder, A., Berenst, J., de Glopper, K., & Koole, T. (2018). Reflective practices in collaborative writing of primary school students. International Journal of Educational Research, 90, 160–174.

Hernández Rojas, G., & Romero, V. (2011). El b-learning en contextos educativos universitarios: posibilidades de uso. In F. Díaz Barriga, G. Hernández, & M. A. Rigo (Eds.), Experiencias educativas con recursos digitales (pp. 95–119). Ciudad de México: Facultad de Psicología-UNAM.

Hernández Rojas, G., Sánchez González, P., Rodríguez Varela, E. I., Caballero Becerril, R. S., & Martínez Martínez, M. (2014). Un entorno b-learning para la promoción de la escritura académica de estudiantes universitarios. Revista Mexicana de Investigación Educativa, 19 (61), 349–375.

Huisman, B., Saab, N., van den Broek, P., & van Driel, J. (2019). The impact of formative peer feedback on higher education students’ academic writing: A meta-analysis. Assessment & Evaluation in Higher Education, 44 (6), 863–880. https://doi.org/10.1080/02602938.2018.1545896

Klein, P. D. (2014). Knowledge construction in collaborative science writing: Strategic simplicity, distributed complexity, and explanatory sophistication. In G. E. Rijlaarsdam, P. D. Klein, P. Boscolo, L. C. Kirkpatrick, & C. E. Gelati (Eds.), Studies in writing, writing as a learning activity (Vol. 28, pp. 300–326). Leiden: Brill.

Laal, M., & Ghodsi, S. M. (2012). Benefits of collaborative learning. Procedia - Social and Behavioral Sciences, 31, 486–490. https://doi.org/10.1016/j.sbspro.2011.12.091

Lindblom-Ylänne, S., & Pihlajamäki, H. (2003). Can a collaborative network environment enhance essay-writing processes? British Journal of Educational Technology, 34 (1), 17–30. https://doi.org/10.1111/1467-8535.d01-3

Lonka, K. (2003). Helping doctoral students to finish their theses. In L. Björk, G. Bräuer, L. Rienecker, G. Ruhmann, & P. Stray Jorgensen (Eds.), Teaching academic writing across Europe. Dordrecht: Kluwer University Press.

López-Pellisa, T., Rodríguez-Gallego, F. & Rotger, N. (2020a). Interviews: Collaborative Writing at Work: Peer Feedback in a Blended Learning Environment. Mendeley Data, V1. https://doi.org/10.17632/ygktp8g3g9.1

López-Pellisa, T., Rotger, N. & Rodríguez-Gallego, F. (2020b). Apéndice A: Cuestionario del proyecto de escritura colaborativa. Mendeley Data, V1. https://doi.org/10.17632/xttwsktvcv.1

Mauri, T., Colomina, R., Clara, M., & Ginesta, A. (2011). Ayudas al aprendizaje en tareas de escritura colaborativa con Moodle. Electronic Journal of Research in Educational Psychology, 9 (3), 1103–1128.

Nokes-Malach, T. J., Richey, J. E., & Gadgil, S. (2015). When is it better to learn together? Insights from research on collaborative learning. Educational Psychology Review, 27 (4), 645–656. https://doi.org/10.1007/s10648-015-9312-8

Nykopp, M., Marttunen, M., & Laurinen, L. (2014). University students’ knowledge construction during face to face collaborative writing. In P. Klein, P. Boscolo, L. Kirkpatrick, & C. Gelati (Eds.), Writing as a learning activity (pp. 277–299). Leiden: Brill Studies in Writing.

Peña, K., Pérez, M., & Rondón, E. (2010). Redes sociales en Internet: reflexiones sobre sus posibilidades para el aprendizaje cooperativo y colaborativo. Revista de Teoría y Didáctica de las Ciencias Sociales, 16, 173–205.

Rogers, P. L. (2001). Traditions to transformations: The forced evolution of higher education. AACE Journal, 9 (1), 47–60.

Rotger, N., López-Pellisa, T. & Rodríguez-Gallego, F. (2020). Evaluation Rubric: Collaborative Writing at Work: Peer Feedback in a Blended Learning Environment. Mendeley Data, V3. https://doi.org/10.17632/rz5m25gbvk.3

Sánchez, E. (Coord.). (2010). La lectura en el aula. Barcelona: Graó.

Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78 (1), 153–189.

Turpo Gebera, O. W. (2010). Contexto y desarrollo de la modalidad educativa blended learning en el sistema educativo iberoamericano. Revista Mexicana de Investigación Educativa, 15 (45), 345–370.

van Steendam, E. (2016). Editorial: Forms of collaboration in writing. Journal of Writing Research, 8 (2), 183–204. https://doi.org/10.17239/jowr-2016.08.02.01

Wolsey, T. (2008). Efficacy of instructor feedback on written work in an online program. International Journal on E-learning, 7 (2), 311–329.

This work was supported by the Institute for Educational Research and Innovation (IRIE) of the Universitat de les Illes Balears (Spain) through a teaching innovation project (PID 181960) led by Teresa López-Pellisa.

Author information

Authors and affiliations

Universitat de les Illes Balears, Palma de Mallorca, Spain

Teresa López-Pellisa & Fernando Rodríguez-Gallego

Universitat Oberta de Catalunya, Barcelona, Spain

Neus Rotger

Corresponding author

Correspondence to Fernando Rodríguez-Gallego .

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Reprints and permissions

About this article

López-Pellisa, T., Rotger, N. & Rodríguez-Gallego, F. Collaborative writing at work: Peer feedback in a blended learning environment. Educ Inf Technol 26, 1293–1310 (2021). https://doi.org/10.1007/s10639-020-10312-2

Received: 14 July 2020

Accepted: 20 August 2020

Published: 25 August 2020

Issue Date: January 2021

DOI: https://doi.org/10.1007/s10639-020-10312-2

  • Collaborative writing
  • Reflective practices
  • Peer feedback
  • Blended learning environment
  • Assessment of learning

COMMENTS

  1. MODULE 12

    MODULE 12 - Critical Thinking, Collaboration & Writing Process - Free download as Word Doc (.doc / .docx), PDF File (.pdf), Text File (.txt) or read online for free. This document provides guidance on the writing process for a college course. It discusses choosing a topic and narrowing it, and asking research questions to develop a thesis statement.

  2. Writing to Think: Critical Thinking and the Writing Process

    The writing process, then, is not just a mirror image of the thinking process: it is the thinking process. Confronted with a topic, an effective critical thinker/writer ... Writing practice builds critical thinking, which empowers people to "take charge of [their] own minds" so they "can take charge of [their] own lives . . . and improve ...

  3. Lesson 2: Critical Thinking, Collaboration, and The Writing Process

    Hello! How's your day? In this lesson, we are going to encounter the following topics: • WRITING AS A PROCESS • THE PRE-WRITING STAGE OF THE WRITING PROCESS • T...

  4. The Writing Process

    7 The Writing Process. Kathy Boylan. Communication skills, including writing, are some of the most important soft skills (employable skills that have more to do with emotional IQ, such as common sense, communication, problem-solving, and collaboration) that students learn when they are in college, because most professions require high competency in written communication, which can be a chance ...

  5. PDF Critical Collaborative Writing Process

    Writing develops personal and social literacy. The act of writing is a way of knowing, a way of thinking, and a way of doing. Writing provides a framework for intellectual growth, critical inquiry, and social imagination through the use of rhetorical inquiry and collaborative learning practices.

  6. PDF A Study of Critical Thinking Skills Practice in Collaborative Writing

    2.2. The Combination of Writing Process and Critical Thinking. Theoretically, the writing process shares similar characteristics with the critical thinking process listed by Haase (2010). The planning stage reflects skills found in conceptualizing and applying information. At the commencement of the writing process, the writer must use these two ...

  7. Collaborative writing: Strategies and activities for writing ...

    Collaborative writing is "an iterative and social process that involves a team focused on a common objective that negotiates, coordinates, and communicates during the creation of a common document" []. Collaborative writing can follow many different strategies [], but five are most common []. These are one-for-all writing, each-in-sequence writing, all-in-parallel writing, all-in-reaction ...

  8. Collaborative Writing

    Collaborative Writing. Collaborative or team writing is the process of producing a written work as a group where all team members contributed to the content and the decisions about how the group will function. Group assignments can be difficult for many students with busy schedules because they require planning, coordination, and frequent ...

  9. Collaborative Learning and Critical Thinking

    Collaborative learning is a relationship among learners that fosters positive interdependence, individual accountability, and interpersonal skills. "Critical thinking" involves asking appropriate questions, gathering and creatively sorting through relevant information, relating new information to existing knowledge, reexamining beliefs ...

  10. Learning to Improve: Using Writing to Increase Critical Thinking

    In short, the Delphi panel expert consensus describes critical thinking as a "process of purposeful self-regulatory judgment that drives problem-solving and decision-making" ... The decision to use collaborative groups to support writing in the laboratory was partly based on existing literature (Collier, 1980; ...

  11. The Writing and Critical Thinking Connection

    When students practice the metacognitive habits embedded in the revision process often enough, they improve their critical thinking skills and predictive reading abilities. Collaboration. Along with revision, peer review is a key element of the writing process and it supports a student's understanding of their audience.

  12. The effectiveness of collaborative problem solving in promoting

    Collaborative problem-solving has been widely embraced in the classroom instruction of critical thinking, which is regarded as the core of curriculum reform based on key competencies in the field ...

  13. What Are Critical Thinking Skills and Why Are They Important?

    According to the University of the People in California, critical thinking skills are important because they are [1]: universal; crucial for the economy; essential for improving language and presentation skills; very helpful in promoting creativity; and important for self-reflection.

  14. Critical Thinking, Collaboration and the Writing Process ...

  16. Collaborative writing: Product, process, and students' reflections

    The study also elicited the learners' reflections on the experience of collaborative writing. The study found that pairs produced shorter but better texts in terms of task fulfilment, grammatical accuracy, and complexity. Collaboration afforded students the opportunity to pool ideas and provide each other with feedback.

  17. Scripting as a pedagogical method to guide collaborative writing

    In higher education (HE), collaborative writing is a widely used pedagogical approach that aims to prepare students for their future working life by promoting generic skills, such as collaboration, creativity, and critical and reflective thinking (Hodges 2002; Jääskelä et al. 2018). Web 2.0 technologies offer novel and flexible opportunities to develop collaborative writing interventions by ...

  18. Critical Thinking, Collaboration and The Writing Process: Lesson 2

  19. Synchronous and Asynchronous Collaborative Writing

    Collaborative writing is effective in improving the accuracy of student writing and critical thinking. Accepting these general claims, it seems justified to say not only that our understanding of writing competences has to be remodeled, but also that completely new opportunities for teaching academic writing have emerged.

  20. The Effects of Collaborative Writing on Students' Writing Fluency: An

    Collaborative or cooperative learning is usually considered an important activity for students at the tertiary level. Since the 1970s, when the communicative language teaching approach was introduced to English language learning, instructors have often employed pair work or group work to help students practice the language together and learn it from each other.

  21. Lesson 2: Critical Thinking, Collaboration & Writing Process

    This document discusses the writing process in three stages: pre-writing, writing, and rewriting. The pre-writing stage involves choosing and limiting a topic, asking research questions to develop a main ...

  22. COLLABORATIVE WRITING AND CRITICAL THINKING

    This study aims to investigate the elements of critical thinking (CT) demonstrated in students' writing resulting from collaborative writing, and how collaborative writing (CW) helps develop students' critical thinking. The study involved 27 Grade XII students, an observer, and the researcher as participant observer, who conducted the teaching program in her regular class.

  23. Collaborative writing at work: Peer feedback in a blended learning

    Writing effectively and accurately is one of the basic learning skills of the twenty-first century that academia must support. In recent years, the introduction of new information and communication technologies in higher education has created new opportunities for collaborative learning and knowledge (Dysthe et al. 2010; Hernández Rojas et al. 2014), to which we must add the increasing ...