Improving Your Test Questions

  • Choosing Between Objective and Subjective Test Items
  • Suggestions for Using and Writing Test Items: multiple-choice, true-false, matching, completion, essay, problem solving, and performance test items
  • Two Methods for Assessing Test Item Quality
  • Assistance Offered by the Center for Innovation in Teaching and Learning (CITL)
  • References for Further Reading

I. Choosing Between Objective and Subjective Test Items

There are two general categories of test items: (1) objective items, which require students to select the correct response from several alternatives or to supply a word or short phrase to answer a question or complete a statement; and (2) subjective or essay items, which permit the student to organize and present an original answer. Objective items include multiple-choice, true-false, matching and completion, while subjective items include short-answer essay, extended-response essay, problem solving and performance test items. For some instructional purposes one or the other item type may prove more efficient and appropriate. To begin our discussion of the relative merits of each type of test item, test your knowledge of these two item types by answering the following questions.

(Circle the correct answer.)

1. Essay exams are easier to construct than objective exams. T F
2. Essay exams require more thorough student preparation and study time than objective exams. T F
3. Essay exams require writing skills where objective exams do not. T F
4. Essay exams teach a person how to write. T F
5. Essay exams are more subjective in nature than are objective exams. T F
6. Objective exams encourage guessing more so than essay exams. T F
7. Essay exams limit the extent of content covered. T F
8. Essay and objective exams can be used to measure the same content or ability. T F
9. Essay and objective exams are both good ways to evaluate a student's level of knowledge. T F

Quiz Answers

1. TRUE: Essay items are generally easier and less time consuming to construct than are most objective test items. Technically correct and content-appropriate multiple-choice and true-false test items require an extensive amount of time to write and revise. For example, a professional item writer produces only 9-10 good multiple-choice items in a day's time.
2. ?: According to research findings, it is still undetermined whether essay tests require or facilitate more thorough (or even different) student study preparation.
3. TRUE: Writing skills do affect a student's ability to communicate the correct "factual" information through an essay response. Consequently, students with good writing skills have an advantage over students who have difficulty expressing themselves through writing.
4. FALSE: Essays do not teach a student how to write, but they can emphasize the importance of being able to communicate through writing. Constant use of essay tests may encourage the knowledgeable student who writes poorly to improve his/her writing ability in order to improve performance.
5. TRUE: Essays are more subjective in nature due to their susceptibility to scoring influences. Different readers can rate identical responses differently; the same reader can rate the same paper differently over time; handwriting, neatness or punctuation can unintentionally affect a paper's grade; and the lack of anonymity can affect the grading process. While impossible to eliminate, scoring influences or biases can be minimized through procedures discussed later in this guide.
6. ?: Both item types encourage some form of guessing. Multiple-choice, true-false and matching items can be correctly answered through blind guessing, yet essay items can be responded to satisfactorily through well-written bluffing.
7. TRUE: Due to the time required by the student to respond to an essay question, only a few essay questions can be included on a classroom exam. Consequently, a larger number of objective items can be tested in the same amount of time, thus enabling the test to cover more content.
8. TRUE: Both item types can measure similar content or learning objectives. Research has shown that students respond almost identically to essay and objective test items covering the same content. Studies by Sax & Collet (1968) and Paterson (1926), conducted forty-two years apart, reached the same conclusion:
"...there seems to be no escape from the conclusions that the two types of exams are measuring identical things" (Paterson, 1926, p. 246).
This conclusion should not be surprising; after all, a well written essay item requires that the student (1) have a store of knowledge, (2) be able to relate facts and principles, and (3) be able to organize such information into a coherent and logical written expression, whereas an objective test item requires that the student (1) have a store of knowledge, (2) be able to relate facts and principles, and (3) be able to organize such information into a coherent and logical choice among several alternatives.
9. TRUE: Both objective and essay test items are good devices for measuring student achievement. However, as seen in the previous quiz answers, there are particular measurement situations where one item type is more appropriate than the other. Following is a set of recommendations for using either objective or essay test items (adapted from Robert L. Ebel, Essentials of Educational Measurement, 1972, p. 144).

Sax, G., & Collet, L. S. (1968). An empirical comparison of the effects of recall and multiple-choice tests on student achievement. Journal of Educational Measurement, 5(2), 169–173. doi:10.1111/j.1745-3984.1968.tb00622.x

Paterson, D. G. (1926). Do new and old type examinations measure different mental functions? School and Society, 24, 246–248.

When to Use Essay or Objective Tests

Essay tests are especially appropriate when:

  • the group to be tested is small and the test is not to be reused.
  • you wish to encourage and reward the development of student skill in writing.
  • you are more interested in exploring the student's attitudes than in measuring his/her achievement.
  • you are more confident of your ability as a critical and fair reader than as an imaginative writer of good objective test items.

Objective tests are especially appropriate when:

  • the group to be tested is large and the test may be reused.
  • highly reliable test scores must be obtained as efficiently as possible.
  • impartiality of evaluation, absolute fairness, and freedom from possible test scoring influences (e.g., fatigue, lack of anonymity) are essential.
  • you are more confident of your ability to express objective test items clearly than of your ability to judge essay test answers correctly.
  • there is more pressure for speedy reporting of scores than for speedy test preparation.

Either essay or objective tests can be used to:

  • measure almost any important educational achievement a written test can measure.
  • test understanding and ability to apply principles.
  • test ability to think critically.
  • test ability to solve problems.
  • test ability to select relevant facts and principles and to integrate them toward the solution of complex problems. 

In addition to the preceding suggestions, it is important to realize that certain item types are better suited than others for measuring particular learning objectives. For example, learning objectives requiring the student to demonstrate or to show may be better measured by performance test items, whereas objectives requiring the student to explain or to describe may be better measured by essay test items. Matching learning objective expectations with certain item types can help you select an appropriate kind of test item for your classroom exam as well as provide a higher degree of test validity (i.e., testing what is supposed to be tested). To further illustrate, several sample learning objectives and appropriate test items are provided below.

Learning Objectives   Most Suitable Test Item
The student will be able to categorize and name the parts of the human skeletal system.   Objective Test Item (M-C, T-F, Matching)
The student will be able to critique and appraise another student's English composition on the basis of its organization.   Essay Test Item (Extended-Response)
The student will demonstrate safe laboratory skills.   Performance Test Item
The student will be able to cite four examples of satire that Twain uses in .   Essay Test Item (Short-Answer)

After you have decided to use an objective exam, an essay exam, or both, the next step is to select the kind(s) of objective or essay item that you wish to include on the exam. To help you make such a choice, the different kinds of objective and essay items are presented in the following section. The various kinds of items are briefly described and compared to one another in terms of their advantages and limitations for use. Also presented is a set of general suggestions for the construction of each item variation.

II. Suggestions for Using and Writing Test Items

The multiple-choice item consists of two parts: (a) the stem, which identifies the question or problem and (b) the response alternatives. Students are asked to select the one alternative that best completes the statement or answers the question. For example:

Sample Multiple-Choice Item

(a) the stem
(b) the response alternatives
*correct response

Advantages in Using Multiple-Choice Items

Multiple-choice items can provide...

  • versatility in measuring all levels of cognitive ability.
  • highly reliable test scores.
  • scoring efficiency and accuracy.
  • objective measurement of student achievement or ability.
  • a wide sampling of content or objectives.
  • a reduced guessing factor when compared to true-false items.
  • different response alternatives which can provide diagnostic feedback.

Limitations in Using Multiple-Choice Items

Multiple-choice items...

  • are difficult and time consuming to construct.
  • lead an instructor to favor simple recall of facts.
  • place a high degree of dependence on the student's reading ability and instructor's writing ability.

Suggestions For Writing Multiple-Choice Test Items

1. When possible, state the stem as a direct question rather than as an incomplete statement.
Undesirable:
Desirable:
2. Present a definite, explicit and singular question or problem in the stem.
Undesirable:
Desirable:
3. Eliminate excessive verbiage or irrelevant information from the stem.
Undesirable:
Desirable:
4. Include in the stem any word(s) that might otherwise be repeated in each alternative.
Undesirable:
5. Use negatively stated stems sparingly. When used, underline and/or capitalize the negative word.
Undesirable:
Desirable:

Item Alternatives

6. Make all alternatives plausible and attractive to the less knowledgeable or skillful student.
Undesirable:
Desirable:
7. Make the alternatives grammatically parallel with each other, and consistent with the stem.
Undesirable:
8. Make the alternatives mutually exclusive.
Undesirable: The daily minimum required amount of milk that a 10-year-old child should drink is
9. When possible, present alternatives in some logical order (e.g., chronological, most to least, alphabetical).
Undesirable:
Desirable:
10. Be sure there is only one correct or best response to the item.
Undesirable:
11. Make alternatives approximately equal in length.
Undesirable:
12. Avoid irrelevant clues such as grammatical structure, well known verbal associations or connections between stem and answer.
Undesirable: (grammatical clue)
... of water behind the dam.

13. Use at least four alternatives for each item to lower the probability of getting the item correct by guessing.

14. Randomly distribute the correct response among the alternative positions throughout the test, so that positions a, b, c, d and e each serve as the correct response in approximately the same proportion.

15. Use the alternatives "none of the above" and "all of the above" sparingly. When used, such alternatives should occasionally be used as the correct response.
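Suggestions 13 and 14 lend themselves to a small sketch. The helper below is hypothetical (not part of the original guide): it builds an answer key in which each alternative position serves as the correct response equally often, then shuffles the order.

```python
import random

# Sketch (hypothetical helper): build an answer key where each position
# a-e is the correct response equally often, then shuffle so the correct
# positions are randomly distributed throughout the test.
def balanced_answer_key(n_items, positions="abcde"):
    key = [positions[i % len(positions)] for i in range(n_items)]
    random.shuffle(key)
    return key

# With five alternatives, blind guessing succeeds on 1/5 of items on
# average; a 40-item key uses each of a-e exactly 8 times.
key = balanced_answer_key(40)
print(sorted(set(key)))  # ['a', 'b', 'c', 'd', 'e']
```

A spreadsheet or a table of random numbers serves the same purpose; the point is only that the balancing happens before the shuffle, so no position is over-represented.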

A true-false item can be written in one of three forms: simple, complex, or compound. Answers can consist of only two choices (simple), more than two choices (complex), or two choices plus a conditional completion response (compound). An example of each type of true-false item follows:

Sample True-False Item: Simple

The acquisition of morality is a developmental process. (True / False)

Sample True-False Item: Complex

Sample True-False Item: Compound

The acquisition of morality is a developmental process. (True / False)

Advantages In Using True-False Items

True-False items can provide...

  • the widest sampling of content or objectives per unit of testing time.
  • an objective measurement of student achievement or ability.

Limitations In Using True-False Items

True-false items...

  • incorporate an extremely high guessing factor. For simple true-false items, each student has a 50/50 chance of correctly answering the item without any knowledge of the item's content.
  • can often lead an instructor to write ambiguous statements due to the difficulty of writing statements which are unequivocally true or false.
  • do not discriminate between students of varying ability as well as other item types.
  • can often include more irrelevant clues than do other item types.
  • can often lead an instructor to favor testing of trivial knowledge.
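The 50/50 guessing factor noted above can be quantified with a short binomial sketch; the function name and the test sizes are illustrative, not from the original guide.

```python
from math import comb

# Sketch: probability of reaching a given score on a true-false test by
# blind guessing alone, treating each item as an independent 50/50 guess.
def p_score_at_least(n_items, n_correct, p=0.5):
    """P(X >= n_correct) for X ~ Binomial(n_items, p)."""
    return sum(comb(n_items, k) * p**k * (1 - p)**(n_items - k)
               for k in range(n_correct, n_items + 1))

# A guesser gets 7+ of 10 items right about 17% of the time, but scoring
# 70% on a 50-item test (35+ right) happens less than 1% of the time.
print(round(p_score_at_least(10, 7), 3))   # 0.172
print(p_score_at_least(50, 35) < 0.01)     # True
```

This is one reason longer true-false tests are more trustworthy than short ones: the guessing factor per item stays at 50%, but the chance of a high total score by guessing alone shrinks rapidly with test length.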

Suggestions For Writing True-False Test Items

1.  Base true-false items upon statements that are absolutely true or false, without qualifications or exceptions.
Undesirable:
Desirable:
2.  Express the item statement as simply and as clearly as possible.
Undesirable:
Desirable:
3.  Express a single idea in each test item.
Undesirable:
Desirable:
4.  Include enough background information and qualifications so that the ability to respond correctly to the item does not depend on some special, uncommon knowledge.
Undesirable:
Desirable:
5.  Avoid lifting statements from the text, lecture or other materials so that memory alone will not permit a correct answer.
Undesirable:
Desirable:
6.  Avoid using negatively stated item statements.
Undesirable:
Desirable:
7.  Avoid the use of unfamiliar vocabulary.
Undesirable:
Desirable:
8.  Avoid the use of specific determiners which would permit a test-wise but unprepared examinee to respond correctly. Specific determiners refer to sweeping terms like "all," "always," "none," "never," "impossible," "inevitable," etc. Statements including such terms are likely to be false. On the other hand, statements using qualifying determiners such as "usually," "sometimes," "often," etc., are likely to be true. When statements do require the use of specific determiners, make sure they appear in both true and false items.
Undesirable:
... required to rule on the constitutionality of a law. (T)
... easier to score than an essay test. (T)
Desirable:
... 180°. (T)
... other molecule of that compound. (T)
... used for the metering of electrical energy used in a home. (F)
9.  False items tend to discriminate more highly than true items. Therefore, use more false items than true items (but no more than 15% additional false items).

In general, matching items consist of a column of stimuli presented on the left side of the exam page and a column of responses placed on the right side of the page. Students are required to match the response associated with a given stimulus. For example:

Sample Matching Test Item

Advantages In Using Matching Items

Matching items...

  • require short periods of reading and response time, allowing you to cover more content.
  • provide objective measurement of student achievement or ability.
  • provide highly reliable test scores.
  • provide scoring efficiency and accuracy.

Limitations in Using Matching Items

Matching items...

  • have difficulty measuring learning objectives requiring more than simple recall of information.
  • are difficult to construct due to the problem of selecting a common set of stimuli and responses.

Suggestions for Writing Matching Test Items

1.  Include directions which clearly state the basis for matching the stimuli with the responses. Explain whether or not a response can be used more than once and indicate where to write the answer.
Undesirable:
Desirable:
2.  Use only homogeneous material in matching items.
Undesirable:
Desirable:
3.  Arrange the list of responses in some systematic order if possible (e.g., chronological, alphabetical).
Undesirable:
Desirable:

4.  Avoid grammatical or other clues to the correct response.
Undesirable:
Desirable:

5.  Keep matching items brief, limiting the list of stimuli to under 10.

6.  Include more responses than stimuli to help prevent answering through the process of elimination.

7.  When possible, reduce the amount of reading time by including only short phrases or single words in the response list.
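Suggestion 6 rests on simple odds: with equal-length, single-use lists, every known match removes one response, so the final unknown match is forced. The sketch below (hypothetical function name) shows how extra responses restore a real chance of error.

```python
# Sketch: odds of guessing one remaining match after a student has
# eliminated the pairs he/she actually knows. With equal-length single-use
# lists the last answer is forced; extra responses blunt the elimination.
def last_match_guess_odds(n_responses, n_known):
    remaining = n_responses - n_known
    return 1 / remaining

# 5 stimuli, 5 single-use responses, 4 known: the 5th is free (odds 1.0).
print(last_match_guess_odds(5, 4))             # 1.0
# Add two extra responses (7 total): the guess drops to 1 in 3.
print(round(last_match_guess_odds(7, 4), 2))   # 0.33
```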

The completion item requires the student to answer a question or to finish an incomplete statement by filling in a blank with the correct word or phrase. For example:

Sample Completion Item

According to Freud, personality is made up of three major systems, the _________, the ________ and the ________.

Advantages in Using Completion Items

Completion items...

  • can provide a wide sampling of content.
  • can efficiently measure lower levels of cognitive ability.
  • can minimize guessing as compared to multiple-choice or true-false items.
  • can usually provide an objective measure of student achievement or ability.

Limitations of Using Completion Items

Completion items...

  • are difficult to construct so that the desired response is clearly indicated.
  • are more time consuming to score when compared to multiple-choice or true-false items.
  • are more difficult to score since more than one answer may have to be considered correct if the item was not properly prepared.

Suggestions for Writing Completion Test Items

1.  Omit only significant words from the statement.
Undesirable: called a nucleus.
Desirable: .
2.  Do not omit so many words from the statement that the intended meaning is lost.
Undesirable:
Desirable:
3.  Avoid grammatical or other clues to the correct response.
Undesirable: decimal system.
Desirable:
4.  Be sure there is only one correct response.
Undesirable: .
Desirable: .
5.  Make the blanks of equal length.
Undesirable: and   (Juno)  .
Desirable: and     (Juno)     .
6.  When possible, delete words at the end of the statement after the student has been presented a clearly defined problem.
Undesirable: .
Desirable: is     (122.5)     .

7.  Avoid lifting statements directly from the text, lecture or other sources.

8.  Limit the required response to a single word or phrase.

The essay test is probably the most popular of all types of teacher-made tests. In general, a classroom essay test consists of a small number of questions to which the student is expected to demonstrate his/her ability to (a) recall factual knowledge, (b) organize this knowledge and (c) present the knowledge in a logical, integrated answer to the question. An essay test item can be classified as either an extended-response essay item or a short-answer essay item. The latter calls for a more restricted or limited answer in terms of form or scope. An example of each type of essay item follows.

Sample Extended-Response Essay Item

Explain the difference between the S-R (Stimulus-Response) and the S-O-R (Stimulus-Organism-Response) theories of personality. Include in your answer (a) brief descriptions of both theories, (b) supporters of both theories and (c) research methods used to study each of the two theories. (10 pts., 20 minutes)

Sample Short-Answer Essay Item

Identify research methods used to study the S-R (Stimulus-Response) and S-O-R (Stimulus-Organism-Response) theories of personality. (5 pts., 10 minutes)

Advantages In Using Essay Items

Essay items...

  • are easier and less time consuming to construct than are most other item types.
  • provide a means for testing a student's ability to compose an answer and present it in a logical manner.
  • can efficiently measure higher order cognitive objectives (e.g., analysis, synthesis, evaluation).

Limitations In Using Essay Items

Essay items...

  • cannot measure a large amount of content or objectives.
  • generally provide low test and test scorer reliability.
  • require an extensive amount of instructor's time to read and grade.
  • generally do not provide an objective measure of student achievement or ability (they are subject to bias on the part of the grader).

Suggestions for Writing Essay Test Items

1.  Prepare essay items that elicit the type of behavior you want to measure.
Learning Objective: The student will be able to explain how the normal curve serves as a statistical model.
Undesirable: Describe a normal curve in terms of: symmetry, modality, kurtosis and skewness.
Desirable: Briefly explain how the normal curve serves as a statistical model for estimation and hypothesis testing.
2.  Phrase each item so that the student's task is clearly indicated.
Undesirable: Discuss the economic factors which led to the stock market crash of 1929.
Desirable: Identify the three major economic conditions which led to the stock market crash of 1929. Discuss briefly each condition in correct chronological sequence and in one paragraph indicate how the three factors were inter-related.
3.  Indicate for each item a point value or weight and an estimated time limit for answering.
Undesirable: Compare the writings of Bret Harte and Mark Twain in terms of settings, depth of characterization, and dialogue styles of their main characters.
Desirable: Compare the writings of Bret Harte and Mark Twain in terms of settings, depth of characterization, and dialogue styles of their main characters. (10 points, 20 minutes)

4.  Ask questions that will elicit responses on which experts could agree that one answer is better than another.

5.  Avoid giving the student a choice among optional items as this greatly reduces the reliability of the test.

6.  For classroom examinations, it is generally recommended to administer several short-answer items rather than only one or two extended-response items.

Suggestions for Scoring Essay Items

ANALYTICAL SCORING: Each answer is compared to an ideal answer and points are assigned for the inclusion of necessary elements. Grades are based on the number of accumulated points, either absolutely (e.g., A = 10 or more points, B = 6-9 points, etc.) or relatively (e.g., A = top 15% of scores, B = next 30% of scores, etc.).
GLOBAL QUALITY: Each answer is read and assigned a score (e.g., grade, total points) based either on the total quality of the response or on its quality relative to other students' answers.
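The two analytical grade-assignment rules (fixed cut-offs versus relative standing) can be sketched as follows. The cut-off values mirror the illustrative figures above and are examples only, not recommendations, and the function names are mine.

```python
# Sketch of the two analytical grading rules. Cut-offs are the
# illustrative values from the text, not recommendations.

def absolute_grade(points):
    """Fixed point thresholds: A for 10+ points, B for 6-9, else C."""
    if points >= 10:
        return "A"
    if points >= 6:
        return "B"
    return "C"

def relative_grades(scores):
    """Rank-based: roughly the top 15% get A, the next 30% B, the rest C."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    grades = [None] * len(scores)
    for rank, i in enumerate(order):
        if rank < 0.15 * len(scores):
            grades[i] = "A"
        elif rank < 0.45 * len(scores):
            grades[i] = "B"
        else:
            grades[i] = "C"
    return grades

print(absolute_grade(11), absolute_grade(7), absolute_grade(3))  # A B C
```

Note the practical difference: under the absolute rule a class where no one reaches 10 points earns no A, while the relative rule always awards its top band, regardless of how well the class performed overall.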

Example Essay Item and Grading Models

"Americans are a mixed-up people with no sense of ethical values. Everyone knows that baseball is far less necessary than food and steel, yet they pay ball players a lot more than farmers and steelworkers."

WHY? Use 3-4 sentences to indicate how an economist would explain the above situation.

Analytical Scoring

Global Quality

Assign scores or grades on the overall quality of the written response as compared to an ideal answer. Or, compare the overall quality of a response to other student responses by sorting the papers into three stacks:

Read and sort each stack again, dividing each into three more stacks.

In total, nine discriminations can be used to assign test grades in this manner. The number of stacks or discriminations can vary to meet your needs.
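The two-pass stack procedure can be expressed as a small sketch (hypothetical helper; in practice each 3-way judgment comes from a reader's impression of quality, not a stored number):

```python
# Sketch: combine the two 3-way sorting passes into nine rank levels.
# Each pass places a paper in a low/middle/high stack (0, 1, 2);
# two passes yield 3 x 3 = 9 distinct discriminations.
def nine_level_rank(first_pass, second_pass):
    """Combine two 3-way judgments into one of nine levels (0 = lowest)."""
    assert first_pass in (0, 1, 2) and second_pass in (0, 1, 2)
    return first_pass * 3 + second_pass

# A paper placed in the top stack on both readings lands in the top level.
print(nine_level_rank(2, 2))  # 8
print(nine_level_rank(0, 0))  # 0
```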

  • Try not to allow factors which are irrelevant to the learning outcomes being measured to affect your grading (e.g., handwriting, spelling, neatness).
  • Read and grade all class answers to one item before going on to the next item.
  • Read and grade the answers without looking at the students' names to avoid possible preferential treatment.
  • Occasionally shuffle papers during the reading of answers to help avoid any systematic order effects (e.g., Sally's "B" work always following Jim's "A" work and thus looking more like "C" work).
  • When possible, ask another instructor to read and grade your students' responses.

Another form of a subjective test item is the problem solving or computational exam question. Such items present the student with a problem situation or task and require a demonstration of work procedures and a correct solution, or just a correct solution. This kind of test item is classified as a subjective type of item due to the procedures used to score item responses. Instructors can assign full or partial credit to either correct or incorrect solutions depending on the quality and kind of work procedures presented. An example of a problem solving test item follows.

Example Problem Solving Test Item

It was calculated that 75 men could complete a strip on a new highway in 70 days. When work was scheduled to commence, it was found necessary to send 25 men on another road project. How many days longer will it take to complete the strip? Show your work for full or partial credit.
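A worked check of the item above using the man-days method (one acceptable work procedure, not the only one): the size of the job is fixed, so men times days stays constant.

```python
total_work = 75 * 70                       # 5250 man-days for the whole strip
remaining_men = 75 - 25                    # 50 men actually available
new_duration = total_work / remaining_men  # 105 days at the reduced crew size
extra_days = new_duration - 70             # 35 days longer than scheduled
print(extra_days)  # 35.0
```

A grader using the analytical model could award partial credit for a correct setup (the 5250 man-days) even when the final arithmetic goes wrong.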

Advantages In Using Problem Solving Items

Problem solving items...

  • minimize guessing by requiring the students to provide an original response rather than to select from several alternatives.
  • are easier to construct than are multiple-choice or matching items.
  • can most appropriately measure learning objectives which focus on the ability to apply skills or knowledge in the solution of problems.
  • can measure an extensive amount of content or objectives.

Limitations In Using Problem Solving Items

Problem solving items...

  • require an extensive amount of instructor time to read and grade.
  • generally do not provide an objective measure of student achievement or ability (subject to bias on the part of the grader when partial credit is given).

Suggestions For Writing Problem Solving Test Items

1.  Clearly identify and explain the problem.
Undesirable:
Desirable:
2.  Provide directions which clearly inform the student of the type of response called for.
Undesirable:
Desirable:
3.  State in the directions whether or not the student must show his/her work procedures for full or partial credit.
Undesirable:
Desirable:
4.  Clearly separate item parts and indicate their point values.
A man leaves his home and drives to a convention at an average rate of 50 miles per hour. Upon arrival, he finds a telegram advising him to return at once. He catches a plane that takes him back at an average rate of 300 miles per hour.
Undesirable:
Desirable:


5.  Use figures, conditions and situations which create a realistic problem.
Undesirable:
Desirable:

6.  Ask questions that elicit responses on which experts could agree that one solution and one or more work procedures are better than others.

7.  Work through each problem before classroom administration to double-check accuracy.

A performance test item is designed to assess the ability of a student to perform correctly in a simulated situation (i.e., a situation in which the student will be ultimately expected to apply his/her learning). The concept of simulation is central in performance testing; a performance test will simulate to some degree a real life situation to accomplish the assessment. In theory, a performance test could be constructed for any skill and real life situation. In practice, most performance tests have been developed for the assessment of vocational, managerial, administrative, leadership, communication, interpersonal and physical education skills in various simulated situations. An illustrative example of a performance test item is provided below.

Sample Performance Test Item

Assume that some of the instructional objectives of an urban planning course include the development of the student's ability to effectively use the principles covered in the course in various "real life" situations common for an urban planning professional. A performance test item could measure this development by presenting the student with a specific situation which represents a "real life" situation. For example,

An urban planning board makes a last minute request for the professional to act as consultant and critique a written proposal which is to be considered in a board meeting that very evening. The professional arrives before the meeting and has one hour to analyze the written proposal and prepare his critique. The critique presentation is then made verbally during the board meeting; reactions of members of the board or the audience include requests for explanation of specific points or informed attacks on the positions taken by the professional.

The performance test designed to simulate this situation would require that the student to be tested role play the professional's part, while students or faculty act the other roles in the situation. Various aspects of the "professional's" performance would then be observed and rated by several judges with the necessary background. The ratings could then be used both to provide the student with a diagnosis of his/her strengths and weaknesses and to contribute to an overall summary evaluation of the student's abilities.

Advantages In Using Performance Test Items

Performance test items...

  • can most appropriately measure learning objectives which focus on the ability of the students to apply skills or knowledge in real life situations.
  • usually provide a degree of test validity not possible with standard paper and pencil test items.
  • are useful for measuring learning objectives in the psychomotor domain.

Limitations In Using Performance Test Items

Performance test items...

  • are difficult and time consuming to construct.
  • are primarily used for testing students individually and not for testing groups. Consequently, they are relatively costly, time consuming, and inconvenient forms of testing.
  • generally do not provide an objective measure of student achievement or ability (subject to bias on the part of the observer/grader).

Suggestions For Writing Performance Test Items

  • Prepare items that elicit the type of behavior you want to measure.
  • Clearly identify and explain the simulated situation to the student.
  • Make the simulated situation as "life-like" as possible.
  • Provide directions which clearly inform the students of the type of response called for.
  • When appropriate, clearly state time and activity limitations in the directions.
  • Adequately train the observer(s)/scorer(s) to ensure that they are fair in scoring the appropriate behaviors.

III. TWO METHODS FOR ASSESSING TEST ITEM QUALITY

This section presents two methods for collecting feedback on the quality of your test items. The two methods include using self-review checklists and student evaluation of test item quality. You can use the information gathered from either method to identify strengths and weaknesses in your item writing. 

Checklist for Evaluating Test Items

EVALUATE YOUR TEST ITEMS BY CHECKING THE SUGGESTIONS WHICH YOU FEEL YOU HAVE FOLLOWED.  

____ When possible, stated the stem as a direct question rather than as an incomplete statement.
____ Presented a definite, explicit and singular question or problem in the stem.
____ Eliminated excessive verbiage or irrelevant information from the stem.
____ Included in the stem any word(s) that might have otherwise been repeated in each alternative.
____ Used negatively stated stems sparingly. When used, underlined and/or capitalized the negative word(s).
____ Made all alternatives plausible and attractive to the less knowledgeable or skillful student.
____ Made the alternatives grammatically parallel with each other, and consistent with the stem.
____ Made the alternatives mutually exclusive.
____ When possible, presented alternatives in some logical order (e.g., chronologically, most to least).
____ Made sure there was only one correct or best response per item.
____ Made alternatives approximately equal in length.
____ Avoided irrelevant clues such as grammatical structure, well known verbal associations or connections between stem and answer.
____ Used at least four alternatives for each item.
____ Randomly distributed the correct response among the alternative positions throughout the test having approximately the same proportion of alternatives a, b, c, d, and e as the correct response.
____ Used the alternatives "none of the above" and "all of the above" sparingly. When used, such alternatives were occasionally the correct response.
____ Based true-false items upon statements that are absolutely true or false, without qualifications or exceptions.
____ Expressed the item statement as simply and as clearly as possible.
____ Expressed a single idea in each test item.
____ Included enough background information and qualifications so that the ability to respond correctly did not depend on some special, uncommon knowledge.
____ Avoided lifting statements from the text, lecture, or other materials.
____ Avoided using negatively stated item statements.
____ Avoided the use of unfamiliar language.
____ Avoided the use of specific determiners such as "all," "always," "none," "never," etc., and qualifying determiners such as "usually," "sometimes," "often," etc.
____ Used more false items than true items (but not more than 15% additional false items).
____ Included directions which clearly stated the basis for matching the stimuli with the response.
____ Explained whether or not a response could be used more than once and indicated where to write the answer.
____ Used only homogeneous material.
____ When possible, arranged the list of responses in some systematic order (e.g., chronologically, alphabetically).
____ Avoided grammatical or other clues to the correct response.
____ Kept items brief (limited the list of stimuli to under 10).
____ Included more responses than stimuli.

____ When possible, reduced the amount of reading time by including only short phrases or single words in the response list.
____ Omitted only significant words from the statement.
____ Did not omit so many words from the statement that the intended meaning was lost.
____ Avoided grammatical or other clues to the correct response.
____ Included only one correct response per item.
____ Made the blanks of equal length.
____ When possible, deleted the words at the end of the statement after the student was presented with a clearly defined problem.
____ Avoided lifting statements directly from the text, lecture, or other sources.
____ Limited the required response to a single word or phrase.
____ Prepared items that elicited the type of behavior you wanted to measure.
____ Phrased each item so that the student's task was clearly indicated.
____ Indicated for each item a point value or weight and an estimated time limit for answering.
____ Asked questions that elicited responses on which experts could agree that one answer is better than others.
____ Avoided giving the student a choice among optional items.
____ Administered several short-answer items rather than 1 or 2 extended-response items.

Grading Essay Test Items

____ Selected an appropriate grading model.
____ Tried not to allow factors which were irrelevant to the learning outcomes being measured to affect your grading (e.g., handwriting, spelling, neatness).
____ Read and graded all class answers to one item before going on to the next item.
____ Read and graded the answers without looking at the student's name to avoid possible preferential treatment.
____ Occasionally shuffled papers during the reading of answers.
____ When possible, asked another instructor to read and grade your students' responses.
____ Clearly identified and explained the problem to the student.
____ Provided directions which clearly informed the student of the type of response called for.
____ Stated in the directions whether or not the student must show work procedures for full or partial credit.
____ Clearly separated item parts and indicated their point values.
____ Used figures, conditions and situations which created a realistic problem.
____ Asked questions that elicited responses on which experts could agree that one solution and one or more work procedures are better than others.

____ Worked through each problem before classroom administration.
____ Prepared items that elicit the type of behavior you wanted to measure.
____ Clearly identified and explained the simulated situation to the student.
____ Made the simulated situation as "life-like" as possible.
____ Provided directions which clearly inform the students of the type of response called for.
____ When appropriate, clearly stated time and activity limitations in the directions.
____ Adequately trained the observer(s)/scorer(s) to ensure that they were fair in scoring the appropriate behaviors.

STUDENT EVALUATION OF TEST ITEM QUALITY 

Using ICES questionnaire items to assess your test item quality.

The following set of ICES (Instructor and Course Evaluation System) questionnaire items can be used to assess the quality of your test items. The items are presented with their original ICES catalogue number. You are encouraged to include one or more of the items on the ICES evaluation form in order to collect student opinion of your item writing quality.

102--How would you rate the instructor's examination questions? (Excellent to Poor)
116--Did the exams challenge you to do original thinking? (Yes, very challenging to No, not challenging)
103--How well did examination questions reflect content and emphasis of the course? (Well related to Poorly related)
118--Were there "trick" or trite questions on tests? (Lots of them to Few if any)
114--The exams reflected important points in the reading assignments. (Strongly agree to Strongly disagree)
122--How difficult were the examinations? (Too difficult to Too easy)
119--Were exam questions worded clearly? (Yes, very clear to No, very unclear)
123--I found I could score reasonably well on exams by just cramming. (Strongly agree to Strongly disagree)
115--Were the instructor's test questions thought provoking? (Definitely yes to Definitely no)
121--How was the length of exams for the time allotted? (Too long to Too short)
125--Were exams adequately discussed upon return? (Yes, adequately to No, not enough)
109--Were exams, papers, reports returned with errors explained or personal comments? (Almost always to Almost never)

IV. ASSISTANCE OFFERED BY THE CENTER FOR INNOVATION IN TEACHING AND LEARNING (CITL)

The information on this page is intended for self-instruction. However, CITL staff members will consult with faculty who wish to analyze and improve their test item writing. The staff can also consult with faculty about other instructional problems. Instructors wishing to acquire CITL assistance can contact [email protected]

V. REFERENCES FOR FURTHER READING

Ebel, R. L. (1965). Measuring educational achievement. Prentice-Hall.

Ebel, R. L. (1972). Essentials of educational measurement. Prentice-Hall.

Gronlund, N. E. (1976). Measurement and evaluation in teaching (3rd ed.). Macmillan.

Mehrens, W. A., & Lehmann, I. J. (1973). Measurement and evaluation in education and psychology. Holt, Rinehart & Winston.

Nelson, C. H. (1970). Measurement and evaluation in the classroom. Macmillan.

Payne, D. A. (1974). The assessment of learning: Cognitive and affective. D.C. Heath & Co.

Scannell, D. P., & Tracy, D. B. (1975). Testing and measurement in the classroom. Houghton Mifflin.

Thorndike, R. L. (1971). Educational measurement (2nd ed.). American Council on Education.

Center for Innovation in Teaching & Learning

249 Armory Building, 505 East Armory Avenue, Champaign, IL 61820

(217) 333-1462

Email: [email protected]

Office of the Provost


Lesson 6.3: Understanding Test Items


There are several common kinds of test items. On objective tests, students answer multiple choice questions, matching items, and fill-in-the-blank items. Subjective test items include short-answer responses and essays. The following are explanations and examples of each, along with tips for educated guessing where applicable. After all, even when we study hard, a bit of test anxiety can cause a sudden mental block!

Multiple Choice Test Items

Explanation:

These test items offer several answer choices. The test prompt (or question) is known as the “stem” for which you choose one or more of the answer options.

1. A simile is

  • a comparison without using “like”
  • a comparison using the word “as”
  • a comparison using either “like” or “as”
  • none of the above

2. Memory devices include

  • associations
  • 1, 3, and 4
  • all of the above

Make sure that all the rules of grammar apply when you match the stem with the option. For example, in example item number 2 above, notice that the stem directs you to look for a plural answer because “devices” is plural. Number 5, then, is the correct answer (answers 1, 3, and 4 are all plural).

Educated Guessing

  • Choose “3” or C, which is more often than not the correct answer (as in example item number 1, above).
  • Choose the longest or most inclusive answer, also as in example item number 1, above.
  • If the test item is for a math quiz, choose the in-between number, or one of the in-between numbers. Example: 1) 432, 2) 77, 3) 12, 4) 2,098. Your chances are better by choosing either 1 or 2.

Matching Test Items

Explanation:

This kind of test item features two columns, a numbered column and a lettered column. Students are asked to match the correct answer with the correct stem.

  • NASA                                                 ___organization device
  • headache                                          ___acronym
  • graphic organizer                           ___symptom of test anxiety

Count the number of items in each column. If there are more on one side, ask if an answer can be used more than once.

Fill-In-the-Blank Test Items

These are items for which you must fill in a word or words.

Fill in the ____________ questions are featured frequently on exams.

Fill-in-the-blank questions usually expect you to write one word per blank. If more than one word is expected, there will be more than one blank space or the blank will be long.

Short Answer Test Items

This type of test item usually involves a short answer of approximately 5-7 sentences. Typical short answer items will address only one topic and require only one “task” (see “essay test items,” below, for a test item requiring more than one task).

Define the term “mnemonic.”

Since many short answer test items ask the student to define a term, here are several ways to expand a definition to achieve the 5-7 sentence desired response (depending upon the course and the teacher’s requirements, that is):

  • the dictionary definition
  • an informal definition (in your own words, for example)
  • give an example of how the term is used
  • give the category in which the term is used, for example, what kind of essays belong in the “expository” genre (argument, explanation, cause and effect, etc.)
  • the history (etymology) of the term
  • include antonyms
  • include synonyms
  • tell what the term is not (some terms mean more than one distinct thing; for example, a “whatchamacallit” is not only a slang term but also a candy bar)

Essay Test Items

This type of test is usually a multi-part prompt requiring several paragraphs or pages to answer. You can make use of writing formulas, for example how to write a basic, five-paragraph essay suitable for most classes. However, for writing classes the task will be expanded as per the type of writing class and the level of writing sophistication required.

In contrast to short answer items, which usually require only one task to answer fully (such as a simple definition), essay prompts typically require several tasks. For example, in the following prompt, I have underlined all of the tasks and decisions required to fully address it:

Many people believe Mark Twain’s book Huckleberry Finn is racist and should not be included on high school reading lists. Others believe that Twain correctly portrayed race relations in the Southern part of the United States in the latter part of the nineteenth century, and, in fact, by telling the story he exposed the negative racial attitudes existing then. Do you think the book should be kept on reading lists? Explain why or why not in relation to today’s emphasis on “political correctness.”

  • compare–show similarities between items
  • contrast–show differences between items
  • sometimes both comparing and contrasting is required
  • define–give a concise meaning (see how to expand definitions, above)
  • describe–recount, characterize, or relate information in narrative form
  • discuss–examine, analyze, and present pros and cons regarding the topic. Detail is essential
  • enumerate–incorporate (which might require listing or outlining) major points in order
  • explain–clarify and interpret the material including explaining how and/or why. Detail is essential
  • Do NOT merely summarize the plot of the work if this is an essay for an English class or a literature class.

UNIT 6, EXERCISE 3.1

List how many tasks need to be accomplished in order to fully respond to the essay prompt below, or another one your instructor will provide for you.

Select a novel or a play and, focusing on one symbol, write an essay analyzing how that symbol functions in the work and what it reveals about the themes of the work as a whole.

How to Learn Like a Pro! Copyright © 2016 by Phyllis Nissila is licensed under a Creative Commons Attribution 4.0 International License , except where otherwise noted.

Creating and Scoring Essay Tests



Essay tests are useful for teachers when they want students to select, organize, analyze, synthesize, and/or evaluate information. In other words, they rely on the upper levels of Bloom's Taxonomy . There are two types of essay questions: restricted and extended response.

  • Restricted Response - These essay questions limit what the student will discuss in the essay based on the wording of the question. For example, "State the main differences between John Adams' and Thomas Jefferson's beliefs about federalism," is a restricted response. What the student is to write about has been expressed to them within the question.
  • Extended Response - These allow students to select what they wish to include in order to answer the question. For example, "In Of Mice and Men , was George's killing of Lennie justified? Explain your answer." The student is given the overall topic, but they are free to use their own judgment and integrate outside information to help support their opinion.

Student Skills Required for Essay Tests

Before expecting students to perform well on either type of essay question, we must make sure that they have the required skills to excel. Following are four skills that students should have learned and practiced before taking essay exams:

  • The ability to select appropriate material from the information learned in order to best answer the question.
  • The ability to organize that material in an effective manner.
  • The ability to show how ideas relate and interact in a specific context.
  • The ability to write effectively in both sentences and paragraphs.

Constructing an Effective Essay Question

Following are a few tips to help in the construction of effective essay questions:

  • Begin with the lesson objectives in mind. Make sure to know what you wish the student to show by answering the essay question.
  • Decide if your goal requires a restricted or extended response. In general, if you wish to see if the student can synthesize and organize the information that they learned, then restricted response is the way to go. However, if you wish them to judge or evaluate something using the information taught during class, then you will want to use the extended response.
  • If you are including more than one essay, be cognizant of time constraints. You do not want to punish students because they ran out of time on the test.
  • Write the question in a novel or interesting manner to help motivate the student.
  • State the number of points that the essay is worth. You can also provide them with a time guideline to help them as they work through the exam.
  • If your essay item is part of a larger objective test, make sure that it is the last item on the exam.

Scoring the Essay Item

One of the downfalls of essay tests is that they lack reliability. Even when teachers grade essays with a well-constructed rubric, subjective decisions are made. Therefore, it is important to be as reliable as possible when scoring your essay items. Here are a few tips to help improve reliability in grading:

  • Determine whether you will use a holistic or analytic scoring system before you write your rubric . With the holistic grading system, you evaluate the answer as a whole, rating papers against each other. With the analytic system, you list specific pieces of information and award points for their inclusion.
  • Prepare the essay rubric in advance. Determine what you are looking for and how many points you will be assigning for each aspect of the question.
  • Avoid looking at names. Some teachers have students put numbers on their essays to try and help with this.
  • Score one item at a time. This helps ensure that you use the same thinking and standards for all students.
  • Avoid interruptions when scoring a specific question. Again, consistency will be increased if you grade the same item on all the papers in one sitting.
  • If an important decision like an award or scholarship is based on the score for the essay, obtain two or more independent readers.
  • Beware of negative influences that can affect essay scoring. These include handwriting and writing style bias, the length of the response, and the inclusion of irrelevant material.
  • Review papers that are on the borderline a second time before assigning a final grade.
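The analytic scoring system described above can be sketched as a simple point tally. The rubric elements and point values here are hypothetical; in practice you would list the specific pieces of information you expect in a strong answer.

```python
# Hypothetical analytic rubric: specific elements and the points
# awarded for including each one.
RUBRIC = {
    "clear thesis": 2,
    "supporting evidence": 3,
    "counterarguments addressed": 2,
    "logical organization": 2,
    "conclusion": 1,
}

def analytic_score(elements_found):
    """Sum the points for each rubric element present in the essay."""
    return sum(points for element, points in RUBRIC.items()
               if element in elements_found)
```

Preparing the rubric before reading any papers, as the first two tips advise, is what keeps the point values from drifting between students.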

Your Article Library

Essay Test: Types, Advantages and Limitations | Statistics



After reading this article you will learn about: 1. Introduction to Essay Test 2. Types of Essay Test 3. Advantages 4. Limitations 5. Suggestions.

Introduction to Essay Test:

The essay tests are still commonly used tools of evaluation, despite the increasingly wider applicability of the short answer and objective type questions.

There are certain outcomes of learning (e.g., organising, summarising, integrating ideas and expressing in one’s own way) which cannot be satisfactorily measured through objective type tests. The importance of essay tests lies in the measurement of such instructional outcomes.

An essay test may give full freedom to the students to write any number of pages. The required response may vary in length. An essay type question requires the pupil to plan his own answer and to explain it in his own words. The pupil exercises considerable freedom to select, organise and present his ideas. Essay type tests provide a better indication of pupil’s real achievement in learning. The answers provide a clue to nature and quality of the pupil’s thought process.

That is, we can assess how the pupil presents his ideas (whether his manner of presentation is coherent, logical and systematic) and how he concludes. In other words, the answer of the pupil reveals the structure, dynamics and functioning of pupil’s mental life.

The essay questions are generally thought to be the traditional type of questions which demand lengthy answers. They are not amenable to objective scoring as they give scope for halo-effect, inter-examiner variability and intra-examiner variability in scoring.

Types of Essay Test:

There can be many types of essay tests:

Some of these are given below with examples from different subjects:

1. Selective Recall.

e.g. What was the religious policy of Akbar?

2. Evaluative Recall.

e.g. Why did the First War of Independence in 1857 fail?

3. Comparison of two things—on a single designated basis.

e.g. Compare the contributions made by Dalton and Bohr to Atomic theory.

4. Comparison of two things—in general.

e.g. Compare Early Vedic Age with the Later Vedic Age.

5. Decision—for or against.

e.g. Which type of examination do you think is more reliable? Oral or Written. Why?

6. Causes or effects.

e.g. Discuss the effects of environmental pollution on our lives.

7. Explanation of the use or exact meaning of some phrase in a passage or a sentence.

e.g., Joint Stock Company is an artificial person. Explain ‘artificial person’ bringing out the concepts of Joint Stock Company.

8. Summary of some unit of the text or of some article.

9. Analysis

e.g. What was the role played by Mahatma Gandhi in India’s freedom struggle?

10. Statement of relationship.

e.g. Why is knowledge of Botany helpful in studying agriculture?

11. Illustration or examples (your own) of principles in science, language, etc.

e.g. Illustrate the correct use of subject-verb position in an interrogative sentence.

12. Classification.

e.g. Classify the following into Physical change and Chemical change with explanation. Water changes to vapour; Sulphuric Acid and Sodium Hydroxide react to produce Sodium Sulphate and Water; Rusting of Iron; Melting of Ice.

13. Application of rules or principles in given situations.

e.g. If you sat halfway between the middle and one end of a seesaw, would a person sitting on the other end have to be heavier or lighter than you in order to make the seesaw balance in the middle? Why?
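A quick torque-balance check of the seesaw item (a worked-answer sketch, not part of the original text): sitting halfway between the pivot and the end gives you half the other rider's lever arm, so balance requires the other rider to weigh half as much.

```python
your_arm = 0.5        # your distance from the pivot, in half-lengths of the seesaw
their_arm = 1.0       # the other rider sits at the very end
your_weight = 100.0   # arbitrary units

# Balance condition: your_weight * your_arm == their_weight * their_arm
their_weight = your_weight * your_arm / their_arm
print(their_weight)  # 50.0 -> the other person must be lighter
```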

14. Discussion.

e.g. Partnership is a relationship between persons who have agreed to share the profits of a business carried on by all or any of them acting for all. Discuss the essentials of partnership on the basis of this definition.

15. Criticism—as to the adequacy, correctness, or relevance—of a printed statement or a classmate’s answer to a question on the lesson.

e.g. What is wrong with the following statement?

The Prime Minister is the sovereign Head of State in India.

16. Outline.

e.g. Outline the steps required in computing the compound interest if the principal amount, rate of interest and time period are given as P, R and T respectively.
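The outline this item asks for can be sketched as a direct computation. With principal P, annual rate R (in percent), and time T (in years), the standard steps are to compute the compounded amount and then subtract the principal:

```python
def compound_interest(P, R, T):
    amount = P * (1 + R / 100) ** T   # step 1: the compounded amount
    return amount - P                  # step 2: interest = amount - principal

print(round(compound_interest(1000, 10, 2), 2))  # 210.0
```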

17. Reorganization of facts.

e.g. The student is asked to interview some persons and find out their opinion on the role of the UN in world peace. In the light of the data thus collected, he/she can reorganise what is given in the textbook.

18. Formulation of questions-problems and questions raised.

e.g. After reading a lesson the pupils are asked to raise related problems- questions.

19. New methods of procedure

e.g. Can you solve this mathematical problem by using another method?

Advantages of the Essay Tests:

1. It is relatively easier to prepare and administer a six-question extended-response essay test than a comparable 60-item multiple-choice test.

2. It is the only means that can assess an examinee’s ability to organise and present his ideas in a logical and coherent fashion.

3. It can be successfully employed for practically all the school subjects.

4. Some of the objectives, such as the ability to organise ideas effectively, to criticise or justify a statement, to interpret, etc., can be best measured by this type of test.

5. Logical thinking and critical reasoning, systematic presentation, etc. can be best developed by this type of test.

6. It helps to induce good study habits such as making outlines and summaries, organising the arguments for and against, etc.

7. The students can show their initiative, the originality of their thought and the fertility of their imagination as they are permitted freedom of response.

8. The responses of the students need not be completely right or wrong. All degrees of comprehensiveness and accuracy are possible.

9. It largely eliminates guessing.

10. Essay tests are valuable for testing the functional knowledge and power of expression of the pupil.

Limitations of Essay Tests:

1. One of the serious limitations of the essay tests is that these tests do not give scope for larger sampling of the content. You cannot sample the course content so well with six lengthy essay questions as you can with 60 multiple-choice test items.

2. Such tests encourage selective reading and emphasise cramming.

3. Moreover, scoring may be affected by spelling, good handwriting, coloured ink, neatness, grammar, length of the answer, etc.

4. The long-answer type questions are less valid and less reliable, and as such they have little predictive value.

5. It requires excessive time on the part of students to write; and for examiners, reading and assessing essays is very time-consuming and laborious.

6. It can be assessed only by a teacher or competent professionals.

7. Improper and ambiguous wording handicaps both the students and valuers.

8. Mood of the examiner affects the scoring of answer scripts.

9. There is a halo effect: judgement biased by previous impressions.

10. The scores may be affected by the examiner's personal bias or partiality for a particular point of view, his way of understanding the question, the weightage he gives to different aspects of the answer, favouritism and nepotism, etc.

Thus, the potential disadvantages of essay type questions are :

(i) Poor predictive validity,

(ii) Limited content sampling,

(iii) Score unreliability, and

(iv) Scoring constraints.

Suggestions for Improving Essay Tests:

The teacher can sometimes, through essay tests, gain improved insight into a student’s abilities, difficulties and ways of thinking and thus have a basis for guiding his/her learning.

(A) While Framing Questions:

1. Give adequate time and thought to the preparation of essay questions, so that they can be re-examined, revised and edited before they are used. This would increase the validity of the test.

2. The item should be so written that it will elicit the type of behaviour the teacher wants to measure. If one is interested in measuring understanding, he should not ask a question that will elicit an opinion; e.g.,

“What do you think of Buddhism in comparison to Jainism?”

3. Use words which themselves give directions e.g. define, illustrate, outline, select, classify, summarise, etc., instead of discuss, comment, explain, etc.

4. Give specific directions to students to elicit the desired response.

5. Indicate clearly the value of the question and the time suggested for answering it.

6. Do not provide optional questions in an essay test because—

(i) It is difficult to construct questions of equal difficulty;

(ii) Students do not have the ability to select those questions which they will answer best;

(iii) A good student may be penalised because he is challenged by the more difficult and complex questions.

7. Prepare and use a relatively large number of questions requiring short answers rather than just a few questions involving long answers.

8. Do not start essay questions with such words as list, who, what, whether. If we begin the questions with such words, they are likely to be short-answer questions and not essay questions, as we have defined the term.

9. Adapt the length of the response and complexity of the question and answer to the maturity level of the students.

10. The wording of the questions should be clear and unambiguous.

11. It should be a power test rather than a speed test. Allow a liberal time limit so that the essay test does not become a test of speed in writing.

12. Supply the necessary training to the students in writing essay tests.

13. Questions should be graded from simple to complex so that all the testees can answer at least a few questions.

14. Essay questions should provide value points and marking schemes.

(B) While Scoring Questions:

1. Prepare a marking scheme, suggesting the best possible answer and the weightage given to the various points of this model answer. Decide in advance which factors will be considered in evaluating an essay response.
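A marking scheme of this kind can be made concrete with a small sketch. The value points and weightages below are hypothetical, purely for illustration:

```python
# Hypothetical marking scheme for one essay question: each value point of the
# model answer carries a weightage, decided in advance of any scoring.
MARKING_SCHEME = {
    "defines the key term": 2,
    "gives a relevant example": 3,
    "states the conclusion": 1,
}

def score_response(points_credited):
    """Sum the weightages of the value points credited in a response."""
    return sum(MARKING_SCHEME[p] for p in points_credited)

# A response covering the definition and conclusion, but lacking an example:
print(score_response(["defines the key term", "states the conclusion"]))  # → 3
```

Fixing the weights before reading any paper is what lets the same standard be applied uniformly across scripts and scorers.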

2. While assessing the essay response, one must:

a. Use appropriate methods to minimise bias;

b. Pay attention only to the significant and relevant aspects of the answer;

c. Be careful not to let personal idiosyncrasies affect assessment;

d. Apply a uniform standard to all the papers.

3. The examinee’s identity should be concealed from the scorer. In this way we can avoid the “halo effect” or bias which may affect the scoring.

4. Check your marking scheme against actual responses.

5. Once the assessment has begun, the standard should not be changed, nor should it vary from paper to paper or reader to reader. Be consistent in your assessment.

6. Grade only one question at a time for all papers. This will help you minimise the halo effect by letting you become thoroughly familiar with just one set of scoring criteria and concentrate completely on it.

7. The mechanics of expression (legibility, spelling, punctuation, grammar) should be judged separately from what the student writes, i.e. the subject matter content.

8. If possible, have two independent readings of the test and use the average as the final score.


CAVEON SECURITY INSIGHTS BLOG

The World's Only Test Security Blog

Pull up a chair among Caveon's experts in psychometrics, psychology, data science, test security, law, education, and oh-so-many other fields and join in the conversation about all things test security.

Constructing Test Items (Guidelines & 7 Common Item Types)

Posted by Erika Johnson

December 7, 2023 at 6:16 PM


Introduction

Let's say you have been given the task of building an examination for your organization.

After two weeks of panicking about how you would do this (and definitely not procrastinating the work that must be done), you are finally ready to begin the test development process.

But you can't help but ask yourself:

  • Where in the world do you begin?
  • Why do you need to create this exam?
  • And while you know you need to construct test items, which item types are the best fit for your exam?
  • Who is your audience?
  • How do you determine all that?

Luckily for you, Caveon has an amazing team of experts on hand to help you every step of the way: Caveon Secure Exam Development (C-SEDs). Whether working with our team or trying your hand at test development yourself, here's some information on item best practices to help guide you on your way.

Table of Contents

  • The Benefits of Identifying Your Exam’s Purpose
  • What Is a Minimally Qualified Candidate (MQC)?
  • Common Exam Types
  • Common Item Types
  • General Guidelines for Constructing Test Items
  • Conclusion & More Resources

Determine Your Purpose for Testing: Why and Who

First things first.

Before creating your test, you need to determine your purpose:

  • Why you are testing your candidates, and
  • Who exactly will be taking your exam

Assessing your testing program's purpose (the "why" and "who" of your exam) is the first vital step of the development process. You do not want to test just to test; you want to scope out the reason for your exam. Ask yourself:

  • Why is this exam important to your organization?
  • What are you trying to achieve by having your test takers sit for it?

Consider the following:

  • Is your organization interested in testing to see what was learned at the end of a course presented to students?
  • Are you looking to assess if an applicant for a job has the necessary knowledge to perform the role?
  • Are candidates trying to obtain certification within a certain field?

The Benefits of Identifying Your Exam's Purpose

Learning the purpose of your exam will help you come up with a plan on how best to set up your exam—which exam type to use, which type of exam items will best measure the skills of your candidates (we will discuss this in a minute), etc.

Determining your test's purpose will also help you figure out your testing audience, which will ensure your exam is testing your examinees at the right level.

Whether they are students still in school, individuals looking to qualify for a position, or experts looking to get certification in a certain product or field, it’s important to make sure your exam is actually testing at the appropriate level.

For example, your test scores will not be valid if your items are too easy or too hard, so keeping the minimally qualified candidate (MQC) in mind during all of the steps of the exam development process will ensure you are capturing valid test results overall.

What Is the MQC?

MQC is the acronym for “minimally qualified candidate.”

The MQC is a conceptualization of the assessment candidate who possesses the minimum knowledge, skills, experience, and competence to just meet the expectations of a credentialed individual.

If the credential is entry level, the expectations of the MQC will be less than if the credential is designated at an intermediate or expert level.

Think of an ability continuum that goes from low ability to high ability. Somewhere along that ability continuum, a cut point will be set. Those candidates who score below that cut point are not qualified and will fail the test. Those candidates who score above that cut point are qualified and will pass.

The minimally qualified candidate, though, should just barely make the cut. It’s important to focus on the word “qualified,” because even though this candidate will likely gain more expertise over time, they are still deemed to have the requisite knowledge and abilities to perform the job or understand the subject.
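The cut-point idea on the ability continuum can be sketched in a few lines. The scores, the cut score, and the candidate labels below are hypothetical, chosen only to illustrate that the MQC just barely passes:

```python
# A cut point divides the ability continuum: score at or above it and you pass.
CUT_SCORE = 70  # hypothetical cut score on a 0-100 scale

candidates = {"novice": 55, "mqc": 70, "expert": 93}
results = {name: ("pass" if score >= CUT_SCORE else "fail")
           for name, score in candidates.items()}
print(results)  # → {'novice': 'fail', 'mqc': 'pass', 'expert': 'pass'}
```

Note that the minimally qualified candidate sits exactly at the cut and still passes; that is the whole point of the "qualified" in MQC.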

Factors to Consider when Constructing Your Test

Now that you’ve determined the purpose of your exam and identified the audience, it’s time to decide on the exam type and which item types to use that will be most appropriate to measure the skills of your test takers.

First up, your exam type.

The type of exam you choose depends on what you are trying to test and the kind of tool you are using to deliver your exam.

You should always make sure the software you use to develop and deliver your exam is thoroughly vetted—here's an outline of some of the most important things to look for in your testing engine:

Choosing your Testing Engine

Next up, the type of exam and items you choose.

The type of exam and type(s) of items you choose depend on your measurement goals and what you are trying to assess. It is essential to take all of this into consideration before moving forward with development.

Here are some common exam types to consider:

Fixed-Form Exam

Fixed-form delivery is a method of testing where every test taker receives the same items. An organization can have more than one fixed-item form in rotation, using the same items that are randomized on each live form. Additionally, forms can be made using a larger item bank and published with a fixed set of items equated to a comparable difficulty and content area match.

Computer Adaptive Testing (CAT)

A CAT exam is a test that adapts to the candidate's ability in real time by selecting different questions from the bank in order to provide a more accurate measurement of their ability level on a common scale. Every time a test taker answers an item, the computer re-estimates the test taker’s ability based on all the previous answers and the difficulty of those items. The computer then selects the next item that the test taker should have a 50% chance of answering correctly.
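One common way to realize that 50%-chance rule is to pick the unused item whose difficulty sits closest to the current ability estimate under a simple Rasch model. This is a simplified sketch under that assumption, not any vendor's actual algorithm (real CAT engines use full IRT estimation):

```python
import math

def p_correct(theta, b):
    """Rasch-model probability that ability theta answers an item of difficulty b correctly."""
    return 1 / (1 + math.exp(-(theta - b)))

def next_item(theta, bank):
    """Select the item whose difficulty is closest to theta, so p_correct is near 0.5."""
    return min(bank, key=lambda b: abs(b - theta))

bank = [-2.0, -1.0, 0.0, 1.0, 2.0]   # hypothetical item difficulties on the common scale
theta = 0.0                          # provisional ability estimate after previous answers
b = next_item(theta, bank)
print(b, round(p_correct(theta, b), 2))  # → 0.0 0.5
```

After each response, theta would be re-estimated from all answers so far and the selection repeated.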

Linear on the Fly Testing (LOFT)

A LOFT exam is a test where the items are drawn from an item pool and presented in such a way that each person sees a different set of items. The difficulty of the overall test is controlled to be equal for all examinees. LOFT exams utilize automated item generation (AIG) to create large item banks.

The above three exam types can be used with any standard item type.

Before moving on, however, there is another more innovative exam type to consider if your delivery method allows for it: 

Performance-Based Testing

A performance-based assessment measures the test taker's ability to apply the skills and knowledge learned beyond typical methods of study and/or learned through research and experience. For example, a test taker in a medical field may be asked to draw blood from a patient to show they can competently perform the task. Or a test taker wanting to become a chef may be asked to prepare a specific dish to ensure they can execute it properly.

Once you've decided on the type of exam you'll use, it's time to choose your item types.

There are many different item types to choose from (you can check out a few of our favorites in this article).

While utilizing more item types on your exam won’t ensure you have more valid test results, it’s important to know what’s available in order to decide on the best item format for your program.

Here are a few of the most common items to consider when constructing your test:

Multiple-Choice

A multiple-choice item is a question where a candidate is asked to select the correct response from a choice of four (or more) options.

Multiple Response

A multiple response item is an item where a candidate is asked to select more than one response from a select pool of options (i.e., “choose two,” “choose three,” etc.)

Short Answer

Short answer items ask a test taker to synthesize, analyze, and evaluate information, and then to present it coherently in written form.

Matching

A matching item requires test takers to connect a definition/description/scenario to its associated correct keyword or response.

Build List

A build list item challenges a candidate’s ability to identify and order the steps/tasks needed to perform a process or procedure.

Discrete Option Multiple Choice ™ (DOMC)

DOMC™ is known as the “multiple-choice item makeover.” Instead of showing all the answer options at once, DOMC presents options randomly, one at a time. For each option, the test taker chooses “yes” or “no.” Once the item has been answered correctly or incorrectly, the next item is presented. DOMC has been used by award-winning testing programs to prevent cheating and test theft. You can learn more about the DOMC item type in this white paper.
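The one-option-at-a-time flow described above might be sketched as follows. This is a simplification of the published description, not Caveon's implementation: we mark the keyed option with a trailing `*` for illustration, and real DOMC delivery may end before every option is shown.

```python
import random

def present_domc(options, respond):
    """Present shuffled options one at a time; `respond` maps an option to 'yes'/'no'."""
    order = list(options)
    random.shuffle(order)                # options appear in random order
    for opt in order:
        answer = respond(opt)            # the test taker judges this option alone
        if opt.endswith("*"):
            # keyed option reached: endorsing it is correct, rejecting it is not
            return "correct" if answer == "yes" else "incorrect"
        if answer == "yes":
            return "incorrect"           # endorsing a distractor ends the item
    return "incorrect"                   # no option was endorsed

# A respondent who recognizes the key always scores "correct", whatever the order.
print(present_domc(["Paris*", "Lyon", "Nice"],
                   lambda o: "yes" if o.endswith("*") else "no"))  # → correct
```

Because each option is judged in isolation, a test taker cannot compare options against one another, which is what removes much of the testwiseness advantage.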

SmartItem ™

A self-protecting item, otherwise known as a SmartItem, employs a proprietary technology resistant to cheating and theft. A SmartItem contains multiple variations, all of which work together to cover an entire learning objective completely. Each time the item is administered, the computer generates a random variation. SmartItem technology has numerous benefits, including curbing item development costs and mitigating the effects of testwiseness. You can learn more about the SmartItem in this infographic and this white paper.

What Are the General Guidelines for Constructing Test Items?

Regardless of the exam type and item types you choose, focusing on some best practice guidelines can set up your exam for success in the long run.

There are many guidelines for creating tests (see this handy guide, for example), but this list sticks to the most important points. Little things can really make a difference when developing a valid and reliable exam!

Institute Fairness

Although you want to ensure that your items are difficult enough that not everyone gets them correct, you never want to trick your test takers! Keeping your wording clear and making sure your questions are direct and not ambiguous is very important. For example, asking a question such as “What is the most important ingredient to include when baking chocolate chip cookies?” does not set your test taker up for success. One person may argue that sugar is the most important, while another test taker may say that the chocolate chips are the most necessary ingredient. A better way to ask this question would be “What is an ingredient found in chocolate chip cookies?” or “Place the following steps in the proper order when baking chocolate chip cookies.”

Stick to the Topic at Hand

When creating your items, ensuring that each item aligns with the objective being tested is very important. If the objective asks the test taker to identify genres of music from the 1990s, and your item is asking the test taker to identify different wind instruments, your item is not aligning with the objective.

Ensure Item Relevancy

Your items should be relevant to the task that you are trying to test. Coming up with ideas to write on can be difficult, but avoid asking your test takers to identify trivial facts about your objective just to find something to write about. If your objective asks the test taker to know the main female characters in the popular TV show Friends , asking the test taker what color Rachel’s skirt was in episode 3 is not an essential fact that anyone would need to recall to fully understand the objective.

Gauge Item Difficulty

As discussed above, remembering your audience when writing your test items can make or break your exam. To put it into perspective, if you are writing a math exam for a fourth-grade class, but you write all of your items on advanced trigonometry, you have clearly not met the difficulty level for the test taker.

Inspect Your Options

When writing your options, keep these points in mind:

  • Always make sure your correct option is 100% correct, and your incorrect options are 100% incorrect. By using partially correct or partially incorrect options, you will confuse your candidate. Doing this could keep a truly qualified candidate from answering the item correctly.
  • Make sure your distractors are plausible. If your correct response logically answers the question being asked, but your distractors are made up or even silly, it will be very easy for any test taker to figure out which option is correct. Thus, your exam will not properly discriminate between qualified and unqualified candidates.
  • Try to make your options parallel to one another. Ensuring that your options are all worded similarly and are approximately the same length will keep one from standing out from another, helping to remove that testwiseness effect.

Constructing test items and creating entire examinations is no easy undertaking.

This article will hopefully help you identify your specific purpose for testing and determine the exam and item types you can use to best measure the skills of your test takers.

We’ve also gone over general best practices to consider when constructing items, and we’ve sprinkled helpful resources throughout to help you on your exam development journey.

(Note: This article helps you tackle the first step of the 8-step assessment process: Planning & Developing Test Specifications.)

To learn more about creating your exam—including how to increase the usable lifespan of your exam—review our ultimate guide on secure exam creation and also our workbook on evaluating your testing engine, leveraging secure item types, and increasing the number of items on your tests.

And if you need help constructing your exam and/or items, our award-winning exam development team is here to help!

Erika Johnson

Erika is an Exam Development Manager in Caveon’s C-SEDs group. With almost 20 years in the testing industry, nine of which have been with Caveon, Erika is a veteran of both exam development and test security. Erika has extensive experience working with new, innovative test designs, and she knows how to best keep an exam secure and valid.

About Caveon

For more than 18 years, Caveon Test Security has driven the discussion and practice of exam security in the testing industry. Today, as the recognized leader in the field, we have expanded our offerings to encompass innovative solutions and technologies that provide comprehensive protection: Solutions designed to detect, deter, and even prevent test fraud.



What makes effective test questions and answers for assessments?

What instructors and administrators need to know

Christine Lee


Thoughtful test questions and answers can help create an effective assessment, one that accurately measures student knowledge. When test questions are crafted with learning objectives in mind, they help foster study habits, influence knowledge retention, and prepare students for eventual summative assessments. Furthermore, when students feel an assessment is fair and relevant, they are less likely to engage in academic misconduct.

Assessment is the intersection at which instructors can provide feedback to guide students but also where instructors gain insights into student learning. In many cases, this feedback exchange can solidify student-teacher relationships and influence learning outcomes. With effective assessments, students can feel seen and supported. And instructors have the information they need to further learning. Thoughtful decisions about test questions and formats can make a difference in this data exchange.

There are many forms of test questions, each with their own strengths when it comes to upholding learning objectives. Some types of questions are efficient and measure breadth of student knowledge whereas other types of questions offer more opportunities to gain insights into higher order thinking.

Some of the most common question types and the roles of each in the realm of assessment are:

  • Multiple-choice
  • True/false
  • Extended matching sets
  • Fill-in-the-blank
  • Short answer
  • Long answer / essay

To that end, this blog post will cover the above question types and then dive into methodology to bolster exam design.

What sets this question type apart?

Multiple-choice questions (MCQs) have the ability to test a wide swath of knowledge in a short amount of time; this characteristic, plus the fact that MCQs enable faster grading and uphold objective scoring, makes them a very popular standardized exam format.

That said, there are many critics of MCQs, some going so far as to say “multiple-choice tests are not catalysts for learning” and that “they incite the bad habit of teaching to tests” (Ramirez, 2013). Multiple research articles, too, indicate multiple-choice questions may result in surface-level study habits. However, they can still be leveraged for effective assessment when utilized appropriately. Multiple-choice questions can be paired with other question types to provide a complementary assessment, or they can themselves be designed to test deeper conceptual understanding.

There are examples of how this question type can be useful in testing reading comprehension and practical knowledge of learned principles. In response to criticism surrounding the inclusion of multiple-choice questions on the Uniform Bar Exam (UBE), The Jacob D. Fuchsberg Law Center at Touro College cites the “case file” format of a 1983 performance test in California, a multiple-choice exam paired with documents typical of a legal case file. Successful completion of this exam did not rely on rote memorization of rules. Rather, this exam used a series of multiple-choice questions to assess the application of relevant theories and practices to true-to-life scenarios presented in the mock case file.

Those considering the value of multiple-choice questions should also keep in mind any summative assessments that lie ahead for students, beyond the scope of a single course. In a recent webinar on the subject of multiple response type questions in nursing programs, Assistant Professor Cheryl Frutchey noted that many of her students at Oklahoma City University’s School of Nursing have been reporting that 70-75% of NCLEX questions are now the “select all that apply” format. In weighing the benefits of a particular question type in determining student success, field-related insights like these may help tip the scale.

True/false

A true/false question asks the exam-taker to judge a statement’s validity. Rather than calling upon powers of memorization, the exam-taker ideally demonstrates their command of verbal knowledge and a working knowledge of a given subject by converting abstract principles to a specific application.

That said, the nature of true/false questions makes it so that even when guessing, the test-taker has a fifty-percent chance of getting the correct answer.
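That fifty-percent chance translates directly into an expected score from blind guessing, which is why true/false items reward guessing more than other formats. A quick illustration with hypothetical test lengths:

```python
# Expected number correct when guessing every item blindly: n_items * p_guess.
def expected_guess_score(n_items, p_guess):
    """Expected correct answers from pure guessing with per-item success probability p_guess."""
    return n_items * p_guess

print(expected_guess_score(20, 0.5))   # → 10.0 on a 20-item true/false test
print(expected_guess_score(20, 0.25))  # → 5.0 on a 20-item four-option MCQ test
```

Half marks for free on true/false versus a quarter on four-option multiple choice; formats like fill-in-the-blank push that floor close to zero.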

The multiple-true-false question is an adaptation of the true-false question that incorporates (and improves upon) elements of the multiple-choice question type, requiring the test-taker to consider all answer options in relation to a given question stem. This hybrid question type differs from “select all that apply” in asking the test-taker to identify both correct and incorrect statements rather than just the “true” ones, shedding light on incorrect or incomplete understandings.

For both true/false and multiple-choice question types, opportunity for feedback is severely limited.

Extended matching sets

Particularly helpful for the usual format of clinical assessments in nursing exams, this item type provides a series of individual questions and a longer list of possible answers for the test-taker to choose from. By design, extended matching set questions prioritize an understanding of the question stems before a correct selection can be made, making it difficult to quickly eliminate incorrect answers from the list.

With an extended list of answers to accompany perhaps only a handful of question stems, this question type encourages the test-taker to process information within each question before parsing relevant answers from the provided list, emphasizing a deeper subject mastery than simple memorization can provide.

Fill-in-the-blank

A known benefit of free response question types like fill-in-the-blank is the decreased possibility of guessing the correct answer. Since the exam-taker must provide an answer that fits contextually within the provided question stem, fill-in-the-blank questions are more likely to exercise language skills.

In a recent study composed of 134 final-year undergraduate dental students at the University of Peradeniya, 90% found fill-in-the-blank questions more challenging than the same question in multiple-choice format, and only 19% reported encountering fill-in-the-blank questions during their time in the program. By withholding answer choices that lead to quick answer recall, fill-in-the-blank questions can effectively gauge an exam-taker’s understanding. Though, as revealed above, the prevalence and/or feasibility of this item type may vary from program to program. And again, feedback is minimal with this type of question.

Short answer

Short-answer questions are valuable for measuring a test-taker’s understanding of a subject beyond simple recall. Preparing for an assessment with this question type promotes study habits that reinforce comprehension over memorization, thus increasing the likelihood that the test-taker will retain this knowledge.

For example: After using ExamSoft to convert their assessment format from multiple-choice to short-answer questions, the Donald & Barbara Zucker School of Medicine at Hofstra/Northwell conducted a survey to measure student attitudes about the switch. Sixty-four percent of the 274 students surveyed thought that short-answer questions better equipped them for a clinical setting. By exercising abilities in critical thinking, reasoning, and communication, the free-response format of this question type allows the cultivation of skills necessary for the workplace.

Long answer / essay

Long answer or essay questions allow individual students to formulate their unique ideas and responses to demonstrate their understanding of a concept. This question type is the one that can most easily measure higher-order thinking and depth of knowledge, though at the same time, it may not cover a wide range of said knowledge.

Marking essay questions can be a time burden on instructors; additionally, long answers involve some measure of subjective scoring. They may also measure writing skills as well as subject-specific knowledge.

Beyond building assessments using all of these common question types, ExamSoft users can:

  • Supplement individual questions with audio, video, or image attachments
  • Create “hotspot” questions for exam-takers to select an area of an image as an answer
  • Tag questions with categories, including learning objectives and accreditation criteria. Additionally, ExamSoft offers robust item analysis.
  • Explore various question types offered by ExamSoft , such as bowtie, matrix, and drag-and-drop.

With Gradescope , instructors can:

  • Accommodate a variety of question types with audio, video, or image attachments
  • Utilize item analysis to measure exam design effectiveness, particularly for multiple-choice questions
  • Grade question by question with answer groups and AI-assisted grading instead of student-by-student to promote more objective scoring
  • Use Dynamic Rubrics to ensure students receive detailed insight into how points were awarded or deducted. Dynamic Rubrics also allow for flexibility to adjust grading criteria midstream to account for later accommodations for all students.

Examplify, ExamSoft’s test-taking application, offers several built-in exam tools for test-takers to use, including:

  • Highlighter and notepad
  • Programmable spreadsheet
  • Scientific and graphing calculators

Gradescope accommodates a variety of assignment types and enables:

  • Grading of paper-based exams, bubble sheets, and homework
  • Programming assignments (graded automatically or manually)
  • Creation of online assignments that students answer right on Gradescope

Assessment is a crucial part of education, no matter the subject or level. Assessments are tools to measure how much a student has learned, though with the right post-exam data, they can be much more, even serving as learning opportunities themselves. But not all assessments are created equal; a poorly written exam or exam item may skew results, giving instructors a false sense of student learning.

Effective exam items provide an accurate demonstration of what students know, and they also support fair and equitable testing. To get the most out of your assessments, it’s important to write well-constructed exam items with every student in mind and then test item efficacy.

There are two general categories of exam items: objective items and subjective items. Objective test items have a clear correct answer; item types can include multiple choice, true/false, short answer, and fill-in-the-blank items. Subjective items, on the other hand, may have a range of correct answers. Answers to subjective questions often involve persuasive/defensible arguments or present various options for in-depth discernment. Test items like these usually come in the form of long answers, essays, or performance-based evaluations.

According to the Eberly Center for Teaching Excellence and Educational Innovation at Carnegie Mellon University, “There is no single best type of exam question: the important thing is that the questions reflect your learning objectives.” It is the educator’s job to determine whether a subjective or objective test item will better align with their learning objectives.

If you want students to explain the symbolism in a literary text, subjective items like short answers and essays are usually best. Objective test items are great if you want to make sure your students can recall facts or choose the best argument to support a thesis. If you want your students to match medical terms to their definitions, a matching task, which is an objective item, may be your best bet. No matter the subject, it is imperative to ensure the question types serve the intended learning objectives.

As you consider exam items, and whether you’re going to use objective or subjective items, it’s important to keep cognitive complexity in mind. Bloom’s Taxonomy can help with planning not only curriculum but also assessment. Bloom’s consists of six levels of cognitive understanding. From the lowest to highest order, these are:

  • Remember
  • Understand
  • Apply
  • Analyze
  • Evaluate
  • Create

As you move up the ladder from recall to creation, there is a gradual shift from objective to subjective exam items. If students are new to the concepts you’re teaching, it’s often best to focus on the first three levels with objective items and set an appropriate knowledge foundation. As students progress through a course or program, you can start to assess the top three levels of cognition with subjective exam items to determine higher-order thinking and capability. While some courses may span the full range, from testing factual recall to having students synthesize and create their own ideas, many introductory classes address only parts of Bloom’s Taxonomy. More advanced courses, like graduate seminars, may target the higher-order categories of analyze, evaluate, and create.

You might assess students’ grasp of the “remember” level with a multiple-choice question about the date of a significant period in history, whereas testing students’ skills in “evaluate” may look like a persuasive essay prompting students to argue and support their stance on a topic with no single correct position, such as the interpretation of metaphors in written works.

As exam creators, we may sometimes write an item that is difficult for students to understand. After writing an item, ask yourself if the question or statement could be written more clearly. Are there double negatives? Have you used passive voice construction? Are you attempting to teach the concept in the question stem itself? Often, the more concise the item is, the better. If possible, do not use absolutes such as “never” and “always.” We’re writing questions, not riddles; it is best practice to test the students’ knowledge, not how well they read. The point is to focus on student knowledge acquisition and effectively convey the point of the question.

Avoid idioms and colloquialisms that may not be clear to international students. Questions containing regional references demonstrate bias. Also consider references that may exclude historically marginalized groups. For instance, an item that refers to a regional sport may not be as clear to these groups as a sport with international reach. Another example is the infamous critique of the SAT question referring to “regattas.” This term, which might be familiar to one certain socioeconomic group and completely unfamiliar to others, is simultaneously not a measure of aptitude.

Using psychometrics, specific and widely accepted statistical measures of exam data, you can test the reliability of your exam and items. One way to measure exam reliability through psychometrics is the item Difficulty Index, or p-value. Simply put, what percentage of exam-takers answered a specific question correctly?

If the p-value is low, the item may be too difficult; if the p-value is high, the item may be too easy. However, this data point alone is not a strong measure of reliability and should be used in context with other psychometric measures. If a difficult question also has a high Discrimination Index and a high Point Biserial value, you can more confidently say that only the higher-order thinkers answered correctly while the lower performers did not. A high corresponding Point Biserial value also tells you that, in general, students who performed well on this item, albeit a difficult one, performed well on the overall exam. Used together, psychometrics give you a solid, holistic picture of item performance and of whether your question was well written.
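To make these definitions concrete, here is a minimal sketch in plain Python, using an invented 0/1 response matrix rather than any particular platform’s implementation, that computes one item’s Difficulty Index (p-value), an upper/lower-group Discrimination Index, and its Point Biserial correlation with the total score:

```python
# Illustrative item statistics from a scored response matrix.
# Data are invented; each row is one student's results (1 = correct)
# on a 5-item exam.
responses = [
    [1, 1, 1, 1, 0],
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 0],
    [0, 1, 0, 1, 0],
    [1, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],
]

def pearson(x, y):
    """Pearson correlation; with a 0/1 x this is the point biserial."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

item = 0  # analyze the first item
item_scores = [row[item] for row in responses]
totals = [sum(row) for row in responses]

# Difficulty Index (p-value): proportion of examinees answering correctly.
p_value = sum(item_scores) / len(item_scores)

# Discrimination Index: item p-value in the top half of examinees
# (ranked by total exam score) minus the p-value in the bottom half.
ranked = sorted(range(len(totals)), key=lambda i: totals[i], reverse=True)
half = len(ranked) // 2
upper = [item_scores[i] for i in ranked[:half]]
lower = [item_scores[i] for i in ranked[-half:]]
discrimination = sum(upper) / half - sum(lower) / half

# Point Biserial: correlation between the 0/1 item score and the total
# score; high values mean strong overall performers get this item right.
point_biserial = pearson(item_scores, totals)

print(f"p-value={p_value:.2f}, D={discrimination:.2f}, r_pb={point_biserial:.2f}")
```

This sketch splits examinees at the median total score for the Discrimination Index; real exam platforms often use the top and bottom 27% instead.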

Psychometric analysis measures include:

  • Difficulty (p-value)
  • Discrimination Index
  • Upper and Lower Difficulty Indexes
  • Point Biserial Correlation Coefficient
  • Kuder-Richardson Formula 20
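Of these, the Kuder-Richardson Formula 20 summarizes whole-test internal consistency from the item difficulties and the variance of total scores. A minimal sketch with invented data, assuming dichotomous 0/1 scoring and population variance:

```python
# Kuder-Richardson Formula 20 on an invented 0/1 response matrix:
# KR20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / var(totals)),
# where k is the item count, p_i the proportion correct on item i,
# q_i = 1 - p_i, and var is the population variance of total scores.
responses = [
    [1, 1, 1, 1, 0],
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 0],
    [0, 1, 0, 1, 0],
    [1, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],
]

k = len(responses[0])  # number of items
n = len(responses)     # number of examinees
totals = [sum(row) for row in responses]

mean_total = sum(totals) / n
var_total = sum((t - mean_total) ** 2 for t in totals) / n

pq_sum = 0.0
for i in range(k):
    p = sum(row[i] for row in responses) / n  # item difficulty p_i
    pq_sum += p * (1 - p)

kr20 = (k / (k - 1)) * (1 - pq_sum / var_total)
print(f"KR-20 = {kr20:.2f}")  # values near 1 indicate high internal consistency
```

The low value here simply reflects the tiny made-up sample; KR-20 is meaningful only with realistic numbers of items and examinees.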

The above strategies for writing and optimizing exam items are by no means exhaustive, but considering them as you create your exams will improve your questions immensely. By delivering assessments with a data-driven digital exam platform, instructors, exam creators, and programs can use the results of carefully created exams to improve learning outcomes, teaching strategies, retention rates, and more.

Teachers Institute

Types of Questions in Teacher Made Achievement Tests: A Comprehensive Guide

When it comes to assessing students’ learning, teachers often turn to achievement tests they’ve created themselves. These tests are powerful tools that can provide both educators and learners with valuable insights into academic progress and understanding. But what types of questions make up these teacher-made tests? Understanding the various types of test items is crucial for designing assessments that are not only effective but also fair and comprehensive. Let’s dive into the world of objective and essay-type questions to see how they function and how best to construct them.

Objective Type Test Items

Objective test items are those that require students to select or provide a very short response to a question, with one clear, correct answer. This section will explore the different types of objective test items, their uses, and tips for constructing them.

Supply Type Items

  • Short Answer Questions: These require students to recall and provide brief responses.
  • Fill-in-the-Blank: Here, students must supply a word or phrase to complete a statement.
  • Numerical Problems: Often used in math and science, these items require the calculation and provision of a numerical answer.

When constructing supply type items, clarity is key. Questions should be direct, and the required answer should be unambiguous. Avoid complex phrasing and ensure that the blank space provided is proportional to the expected answer’s length.

Selection Type Items

  • Multiple-Choice Questions (MCQs): Students choose the correct answer from a list of options.
  • True/False Questions: These require students to determine the veracity of a statement.
  • Matching Items: Students must pair related items from two lists.

For selection type items, it’s important to construct distractors (wrong answers) that are plausible. This prevents guessing and encourages students to truly understand the material. In multiple-choice questions, for example, the incorrect options should be common misconceptions or errors related to the subject matter.

Essay Type Test Items

Essay test items call for longer, more detailed responses from students. These questions evaluate not just recall of information but also critical thinking, organization of thoughts, and the ability to communicate effectively through writing.

Extended Response Essay Questions

  • Exploratory Essays : These require a thorough investigation of a topic, often without a strict length constraint.
  • Argumentative Essays : Students must take a stance on an issue and provide supporting evidence.

In extended response essay questions, students should be given clear guidelines regarding the scope and depth of the response expected. Rubrics can be very helpful in setting these expectations and in guiding both the grading process and the students’ preparation.

Restricted Response Essay Questions

  • Reflective Essays : These typically involve a shorter response, reflecting on a specific question or scenario.
  • Analysis Essays : Students dissect a particular concept or event within a set framework.

Restricted response essay questions are valuable for assessing specific skills or knowledge within a limited domain. When constructing these items, ensure the question is focused and that students are aware of any word or time limits.

Examples and Guidelines for Constructing Effective Test Items

Now that we’ve understood the types of questions, let’s look at some examples and guidelines for creating effective test items.

Objective Item Construction

  • Multiple-Choice Example: “What is the capital of France? A) Madrid B) Paris C) Rome D) Berlin” – Ensure there’s only one correct answer.
  • True/False Example: “The Great Wall of China is visible from space.” – Provide a statement that is not ambiguously phrased.

When constructing objective items, make sure the question is based on important content, not trivial facts. The length of the test should be sufficient to cover the breadth of the material, and the items should vary in difficulty to gauge different levels of student understanding.

Essay Item Construction

  • Extended Response Example: “Discuss the impact of the Industrial Revolution on European society.” – This question allows for a broad exploration of the topic.
  • Restricted Response Example: “Describe two methods of conflict resolution and their effectiveness in workplace settings.” – This question limits the scope to two methods and a specific context.

Essay questions should be open-ended to encourage students to think critically and creatively. However, they should also be specific enough to prevent off-topic responses. Providing a clear rubric can help students understand what is expected in their answers and assist teachers in grading consistently.

Teacher-made achievement tests with a mix of objective and essay type questions can provide a comprehensive assessment of student learning. By understanding the different types of questions and following the guidelines for constructing them, educators can create fair, reliable, and valid assessments. This ensures that the results truly reflect students’ knowledge and skills, allowing for targeted feedback and further instructional planning.

What do you think? How can teachers balance the need for comprehensive assessment with the practical limitations of test administration time? Do you think one type of test item is more effective than the other in measuring student learning?



Center for Teaching

Writing good multiple choice test questions.

Brame, C. (2013) Writing good multiple choice test questions. Retrieved [todaysdate] from https://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/.

Constructing an Effective Stem

Constructing Effective Alternatives

Additional Guidelines for Multiple Choice Questions

Considerations for Writing Multiple Choice Items that Test Higher-order Thinking

Additional Resources

Multiple choice test questions, also known as items, can be an effective and efficient way to assess learning outcomes. Multiple choice test items have several potential advantages:

Reliability: Reliability is defined as the degree to which a test consistently measures a learning outcome. Multiple choice test items are less susceptible to guessing than true/false questions, making them a more reliable means of assessment. The reliability is enhanced when the number of MC items focused on a single learning objective is increased. In addition, the objective scoring associated with multiple choice test items frees them from problems with scorer inconsistency that can plague scoring of essay questions.

Validity: Validity is the degree to which a test measures the learning outcomes it purports to measure. Because students can typically answer a multiple choice item much more quickly than an essay question, tests based on multiple choice items can typically focus on a relatively broad representation of course material, thus increasing the validity of the assessment.

The key to taking advantage of these strengths, however, is construction of good multiple choice items.

A multiple choice item consists of a problem, known as the stem, and a list of suggested solutions, known as alternatives. The alternatives consist of one correct or best alternative, which is the answer, and incorrect or inferior alternatives, known as distractors.

1. The stem should be meaningful by itself and should present a definite problem. A stem that presents a definite problem allows a focus on the learning outcome. A stem that does not present a clear problem, however, may test students’ ability to draw inferences from vague descriptions rather than serving as a more direct test of students’ achievement of the learning outcome.

2. The stem should not contain irrelevant material, which can decrease the reliability and the validity of the test scores (Haladyna and Downing 1989).

3. The stem should be negatively stated only when significant learning outcomes require it. Students often have difficulty understanding items with negative phrasing (Rodriguez 1997). If a significant learning outcome requires negative phrasing, such as identification of dangerous laboratory or clinical practices, the negative element should be emphasized with italics or capitalization.

4. The stem should be a question or a partial sentence. A question stem is preferable because it allows the student to focus on answering the question rather than holding the partial sentence in working memory and sequentially completing it with each alternative (Statman 1988). The cognitive load is increased when the stem is constructed with an initial or interior blank, so this construction should be avoided.

1. All alternatives should be plausible. The function of the incorrect alternatives is to serve as distractors, which should be selected by students who did not achieve the learning outcome but ignored by students who did achieve the learning outcome. Alternatives that are implausible don’t serve as functional distractors and thus should not be used. Common student errors provide the best source of distractors.

2. Alternatives should be stated clearly and concisely. Items that are excessively wordy assess students’ reading ability rather than their attainment of the learning objective.

3. Alternatives should be mutually exclusive. Alternatives with overlapping content may be considered “trick” items by test-takers, excessive use of which can erode trust and respect for the testing process.

4. Alternatives should be homogenous in content. Alternatives that are heterogeneous in content can provide cues to students about the correct answer.

5. Alternatives should be free from clues about which response is correct. Sophisticated test-takers are alert to inadvertent clues to the correct answer, such as differences in grammar, length, formatting, and language choice in the alternatives. It’s therefore important that alternatives

  • have grammar consistent with the stem.
  • are parallel in form.
  • are similar in length.
  • use similar language (e.g., all unlike textbook language or all like textbook language).

6. The alternatives “all of the above” and “none of the above” should not be used. When “all of the above” is used as an answer, test-takers who can identify more than one alternative as correct can select the correct answer even if unsure about other alternative(s). When “none of the above” is used as an alternative, test-takers who can eliminate a single option can thereby eliminate a second option. In either case, students can use partial knowledge to arrive at a correct answer.

7. The alternatives should be presented in a logical order (e.g., alphabetical or numerical) to avoid a bias toward certain positions.

8. The number of alternatives can vary among items as long as all alternatives are plausible. Plausible alternatives serve as functional distractors, which are those chosen by students who have not achieved the objective but ignored by students who have achieved the objective. There is little difference in difficulty, discrimination, and test score reliability among items containing two, three, and four distractors.

Additional Guidelines

1. Avoid complex multiple choice items , in which some or all of the alternatives consist of different combinations of options. As with “all of the above” answers, a sophisticated test-taker can use partial knowledge to achieve a correct answer.

2. Keep the specific content of items independent of one another. Savvy test-takers can use information in one question to answer another question, reducing the validity of the test.

When writing multiple choice items to test higher-order thinking, design questions that focus on higher levels of cognition as defined by Bloom’s taxonomy . A stem that presents a problem that requires application of course principles, analysis of a problem, or evaluation of alternatives is focused on higher-order thinking and thus tests students’ ability to do such thinking. In constructing multiple choice items to test higher order thinking, it can also be helpful to design problems that require multilogical thinking, where multilogical thinking is defined as “thinking that requires knowledge of more than one fact to logically and systematically apply concepts to a …problem” (Morrison and Free, 2001, page 20). Finally, designing alternatives that require a high level of discrimination can also contribute to multiple choice items that test higher-order thinking.

  • Burton, Steven J., Sudweeks, Richard R., Merrill, Paul F., and Wood, Bud. How to Prepare Better Multiple Choice Test Items: Guidelines for University Faculty, 1991.
  • Cheung, Derek and Bucat, Robert. How can we construct good multiple-choice items? Presented at the Science and Technology Education Conference, Hong Kong, June 20-21, 2002.
  • Haladyna, Thomas M. Developing and Validating Multiple-Choice Test Items, 2nd edition. Lawrence Erlbaum Associates, 1999.
  • Haladyna, Thomas M. and Downing, S. M. Validity of a taxonomy of multiple-choice item-writing rules. Applied Measurement in Education, 2(1), 51-78, 1989.
  • Morrison, Susan and Free, Kathleen. Writing multiple-choice test items that promote and measure critical thinking. Journal of Nursing Education, 40: 17-24, 2001.


KnowledgeMag

Essay Type Test: Advantages, Disadvantages, Limitations

admin

In this article, we are going to discuss Essay Type Test: Advantages, Disadvantages, and Limitations.


An essay test requires students to respond fully to a question or prompt, organizing their ideas systematically and composing an original answer. An essay test is designed to evaluate students’ ability to write an argumentative, logical essay.

Advantages Of Essay Type Test

1. Study Habits: These exams encourage students to develop effective study habits.

2. Reduce Guessing: Guesswork can be reduced to a large extent.

3. Easy to Construct: Such tests are easier to construct and administer.

4. Degree of Comprehensiveness: This test is helpful for measuring all degrees of comprehensiveness and accuracy.

5. Logical Thinking: These tests aid the development of students’ logical thinking, critical reasoning, and methodical presentation skills.

6. Creativity: Such tests give the child an opportunity to show creativity, originality of thought, fertility of imagination, and so on.

7. Enhance Thinking Ability: These tests are considered best for measuring the ability to organize ideas effectively, the ability to criticize or justify a statement, the ability to interpret, and so on.

8. Complex Learning: They are helpful for measuring complex learning outcomes.

9. Individual Differences: They are helpful for determining individual differences.

Defects of the Essay-Type Test as Viewed by Students

1. Low Objectivity: Essay-type tests are less objective, and so they lack validity.

2. Cramming Habit: Essay-type tests reinforce the child’s habit of cramming.

3. Lack of Reliability: The reliability of essay-type tests is low compared with multiple-choice or other objective-type questions.

4. Selective Study: A student is constrained to study a selective portion of the course, and he often guesses at the questions likely to appear, which introduces an element of chance into the exam.

5. Subjectivity of Scoring: The subjectivity of scoring is the biggest drawback of an essay-type test; individual likes and dislikes play a significant part in the marking.

6. Burden on Students: It puts a lot of pressure on students, keeping them busy and under nervous tension.

7. Scope of Favouritism: Partiality is another flaw of essay-type assessments, since instructors tend to award more credit to their favoured candidates.

8. Time-Consuming: It is time-consuming for students, as the speed of writing and a good writing style consume much of their time.

9. Low Validity: It has low content validity.

Defects of Essay-Type Test from the Point of View of the Teacher

1. The teacher’s primary and foremost aim becomes the outright success of all his or her students; for that reason, he covers only the limited, most important contents of the syllabus, and his focus is on getting the maximum number of students through the examination.

2. The teacher’s instructional programme becomes entirely examination-oriented, and the essential principles of teaching are given the least thought.

3. The teacher is compelled to encourage his students to cram, which is not a sound psychological method of teaching.

4. Since the teacher is judged by the results of his students, everything becomes subservient to the examination.

5. To show good results, the teacher sometimes devotes a good deal of time to guessing likely exam questions, which affects his teaching.

Mathematical Question

In math practice tests, students answer different types of questions involving equations and problem-solving, which also builds quantitative reasoning. This area includes questions on math topics like algebra, geometry, data analysis, and problem-solving.


KnowledgeMag Team


Advantages and Disadvantages of Different Types of Test Questions

  • October 23, 2018
  • Maryellen Weimer, PhD

It’s good to regularly review the advantages and disadvantages of the most commonly used test questions and the test banks that now frequently provide them.

Multiple-choice questions

Advantages

  • Quick and easy to score, by hand or electronically
  • Can be written so that they test a wide range of higher-order thinking skills
  • Can cover lots of content areas on a single exam and still be answered in a class period

Disadvantages

  • Often test literacy skills: “if the student reads the question carefully, the answer is easy to recognize even if the student knows little about the subject” (p. 194)
  • Provide unprepared students the opportunity to guess, and with guesses that are right, they get credit for things they don’t know
  • Expose students to misinformation that can influence subsequent thinking about the content
  • Take time and skill to construct (especially good questions)

True-false questions

Advantages

  • Quick and easy to score

Disadvantages

  • Considered to be “one of the most unreliable forms of assessment” (p. 195)
  • Often written so that most of the statement is true save one small, often trivial bit of information that then makes the whole statement untrue
  • Encourage guessing, and reward students for correct guesses

Short-answer questions

Advantages

  • Quick and easy to grade
  • Quick and easy to write

Disadvantages

  • Encourage students to memorize terms and details, so that their understanding of the content remains superficial

Essay questions

Advantages

  • Offer students an opportunity to demonstrate knowledge, skills, and abilities in a variety of ways
  • Can be used to develop student writing skills, particularly the ability to formulate arguments supported with reasoning and evidence

Disadvantages

  • Require extensive time to grade
  • Encourage use of subjective criteria when assessing answers
  • If used in class, necessitate quick composition without time for planning or revision, which can result in poor-quality writing

Questions provided by test banks

Advantages

  • Save instructors the time and energy involved in writing test questions
  • Use the terms and methods that are used in the book

Disadvantages

  • Rarely involve analysis, synthesis, application, or evaluation (cross-discipline research documents that approximately 85 percent of the questions in test banks test recall)
  • Limit the scope of the exam to text content; if used extensively, may lead students to conclude that the material covered in class is unimportant and irrelevant

We tend to think that these are the only test question options, but there are some interesting variations. The article that prompted this review proposes one: Start with a question, and revise it until it can be answered with one word or a short phrase. Do not list any answer options for that single question, but attach to the exam an alphabetized list of answers. Students select answers from that list. Some of the answers provided may be used more than once, some may not be used, and there are more answers listed than questions. It’s a ratcheted-up version of matching. The approach makes the test more challenging and decreases the chance of getting an answer correct by guessing.

Remember, students do need to be introduced to any new or altered question format before they encounter it on an exam.

Editor’s note: The list of advantages and disadvantages comes in part from the article referenced here. It also cites research evidence relevant to some of these advantages and disadvantages.

Reference: McAllister, D., and Guidice, R.M. (2012). This is only a test: A machine-graded improvement to the multiple-choice and true-false examination. Teaching in Higher Education, 17 (2), 193-207.

Reprinted from The Teaching Professor, 28.3 (2014): 8. © Magna Publications. All rights reserved.


TIP Sheet HOW TO TAKE ESSAY TESTS

There are basically two types of exams:

Objective - requires answers of a word or short phrase, or the selection of an answer from several available choices that are provided on the test.

Essay - requires answers to be written out at some length. The student functions as the source of information.

An essay exam requires you to see the significance and meaning of what you know. It tests your knowledge and understanding of the subject and your skill in reading and writing. To be successful on an essay exam, you must:

  • Prove immediately that you know the material.
  • Make your meaning unmistakably clear.
  • Employ a reasonable organization and show sufficient thought development.
  • Make every word count.
  • Be specific.
  • Use your own voice and style.

When you are writing an essay as part of an exam, all this must be done within what amounts to a first draft written in a very limited amount of time. As with all writing, if you think of your essay as being produced in three stages, you can tackle the test in an organized fashion. The three stages are pre-writing, writing, and revision. Suggestions for each of these stages follow.

The last section addresses preparation for essay exams.

PRE-WRITING

Your first impulse in a writing exam is probably to read the question and start writing immediately, especially when you see those seconds ticking away on the clock. RESIST THAT IMPULSE! You can't successfully address the subject until you know precisely what you're required to do, you understand and have thought about the subject, and you are organized in how you approach the specific points you wish to make in your answer.

1.  Understanding what to do:

  • When you get your copy of the exam, read through to make sure you understand what is expected of you. FOLLOW THE INSTRUCTIONS EXACTLY!
  • Underline or circle key words that direct the approach your answer should take. Some of the most common key words are:

Agree/Disagree: State your position and support it with facts.
Comment or Evaluate: State your position and support it with facts, discussing the issue and its merits.
Analyze: Break down into all the parts or divisions, looking at the relationships between them.
Compare/Contrast: Show differences and similarities.
Describe/Discuss: Examine in detail.
Explain: Tell why something is as it is.
Illustrate: Give examples and relate them to the statement in question.
Prove/Defend: Demonstrate why something is true.
Interpret: Explain the significance or meaning of something.
List/State: Make a list of points or facts.
Summarize: Hit the high points.

2.  Understanding the subject

  • When you are confident that you understand the instructions, direct your attention to the topic.
  • Collect your ideas.
  • Formulate a thesis. Make sure it is a strong, concise statement that specifically addresses the question.
  • Think of as many specific details and facts as you can that support the thesis.

3.  Getting organized

  • Jot your ideas down on paper, in very brief format.
  • Evaluate your ideas in light of the question. Ask yourself repeatedly: "Does this apply to the question I'm supposed to answer?" Select only those ideas most relevant to your purpose.
  • Number your ideas in order of appropriate sequence (first step to last step, most important to least important, etc.)

WRITING

1.  Remember your thesis. Now stick to it, referring back to it periodically throughout your essay. This gives your essay unity and coherence, and helps ensure that you are not digressing.

2.  Write in an orderly fashion. If you suddenly think of a new point, jot it down in a margin or on scratch paper until you find an appropriate place for it. Don't just put it into the middle of what you were writing.

3.  Avoid:

  • Repeating, in other words, what you have already said.
  • Digressing into material that does not answer the question.
  • Language that is too broad or general. Be specific.
  • Bluffing. This far too common practice of using elegant but empty language to conceal ignorance or lack of effort rarely works, and often irritates the reader(s).
4.  Write as legibly as you can. If you want, write on every other line so you have room to add later. When you want to cross something off, simply draw a straight line through it. This is much better than scribbling out an entire passage.

5.  If you run out of time, simply write "Ran out of time" at the close of the essay. This is much better than adding a hurriedly tacked on, and possibly incoherent, conclusion.

REVISION

Essay examinations are difficult because of the time pressures, yet you should always try to leave a few minutes at the end to proofread your essay.

1.  Ask yourself, before you hand in the essay:

  • Did I provide the information requested? That is, did I "explain" or "define" as the directions asked?
  • Is the answer simply, clearly, and logically organized?
  • Do I stick to my thesis statement? Is there unnecessary information in here?
  • Did I proofread to check content and/or mechanical errors?

2.  Proofreading:

  • Gives you a chance to catch and correct errors in content.
  • Gives you a chance to correct your mechanical errors.
  • Allows you to add material that may have occurred to you after writing the essay.

3.  You should proofread for:

  • Complete sentences (watch for fragments, comma-splices, and run-ons).
  • Words omitted, or one word used when you meant another.
  • Logical transitions between sentences and paragraphs.
  • Unnecessary repetition of words or ideas.
  • Spelling errors.

4.  Essay-type tests depend a great deal on your basic writing skills - organization, punctuation, grammar, and spelling. If your answer is not clearly written, your instructor won't be able to find it! Here are some basic guidelines to keep in mind as you take an essay test:

  • Read the directions carefully! Read every part of the directions!
  • Give yourself time to answer each question. Quickly look over the entire exam and budget your time per question accordingly.
  • Above all, stay calm. You are being asked to show competence, not perfection.
  • If you are not too sure about one question, leave it and go back.
  • When given a choice, answer the questions you know best.
  • State your points and support ideas clearly - don't make the instructor have to look for them.
  • Go back to check and proofread all of your answers.

PREPARING FOR ESSAY EXAMS

WRITING A SUCCESSFUL ESSAY EXAM BEGINS ON DAY ONE

1.  Study regularly as you go along.

  • Take careful lecture notes.
  • Read all material when assigned.
  • Become familiar with vocabulary.
  • Keep a study list of all main ideas.

2.  Final preparation

  • Review lecture notes and reading material.
  • Find a classmate or friend willing to talk over key ideas and implications.
  • Try to anticipate questions . This is very important!  Use your lecture notes to zero in on points that the instructor emphasized.
  • Think through the material and write up the best possible essay questions you can.
  • Then answer those questions.
  • Pinpoint key points that you would like to make when answering each question.
  • Put your answer into outline form or write it out completely.
  • For each potential test question, use mnemonics or other memory techniques to move the information to your long-term memory for the exam.
  • Create a list of the clue words for each point you wish to make.
  • Create a mnemonic device to memorize those points.

3.  Come to the exam confident that you have something specific to say on all possible topics.

KEY WORDS COMMONLY FOUND ON ESSAY EXAMS

Compare: Look for qualities or characteristics that resemble each other. Emphasize similarities among them, but in some cases also mention differences.

Contrast: Stress the dissimilarities, differences, or unlikenesses of things, qualities, events, or problems.

Criticize: Express your judgement about the merit or truth of the factors or views mentioned. Give the results of your analysis of these factors, discussing their limitations and good points.

Define: Give concise, clear, and authoritative meanings. Don't give details, but make sure to give the limits of the definitions. Show how the thing you are defining differs from things in other classes.

Describe: Recount, characterize, sketch, or relate in sequence or story form.

Diagram: Give a drawing, chart, plan, or graphic answer. Usually you should label a diagram. In some cases, add a brief explanation or description.

Discuss: Examine, analyze carefully, and give reasons pro and con. Be complete, and give details.

Enumerate: Write in list or outline form, giving points concisely one by one.

Evaluate: Carefully appraise the problem, citing both advantages and limitations. Emphasize the appraisal of authorities and, to lesser degree, your personal evaluation.

Explain: Clarify, interpret, and spell out the material you present. Give reasons for differences of opinion or of results, and try to analyze causes.

Illustrate: Use a figure, picture, diagram, or concrete example to explain or clarify a problem.

Interpret: Translate, give examples of, solve, or comment on, a subject, usually giving your judgment about it.

Justify: Prove or give reasons for decisions or conclusions, taking pains to be convincing.

List: As in "enumerate," write an itemized series of concise statements.

Outline: Organize a description under main points and subordinate points, omitting minor details and stressing the arrangement or classification of things.

Prove: Establish that something is true by citing factual evidence or giving clear logical reasons.

Relate: Show how things are related to, or connected with, each other or how one causes another, or is like another.

Review: Examine a subject critically, analyzing and commenting on the important statements to be made about it.

Sketch: Present the main points in a brief outline, omitting details; in some contexts, provide a simple drawing.

State: Present the main points in brief, clear sequence, usually omitting details, illustrations, or examples.

Summarize: Give the main points or facts in condensed form, like the summary of a chapter, omitting details and illustrations.

Trace: In narrative form describe progress, development, or historical events from some point of origin.

Identify or characterize: means "distinguish this term, or this person from all others that are similar." Both are clear injunctions to be as specific as possible.

Illustrate or exemplify: means "giving examples," showing thereby, rather than by definition, that you understand the concept.

TRANSITIONAL WORDS AND PHRASES

To achieve unity and coherence, writers use transitional words and phrases. Transitional expressions clarify the relationships between clauses, sentences, and paragraphs, helping guide the readers along. The following is a partial list of transitional expressions.

To Add or Show Sequence: again, also, and, and then, besides, equally important, finally, first, further, furthermore, in addition, in the first place, last, moreover, next, second, still, too

To Compare: also, in the same way, likewise, similarly

To Contrast: although, and yet, but, but at the same time, despite, even so, even though, for all that, however, in contrast, in spite of, nevertheless, notwithstanding, on the contrary, on the other hand, regardless, still, though, whereas, yet

To Give Examples or Intensify: after all, an illustration of, even, for example, for instance, indeed, in fact, it is true, of course, specifically, that is, to illustrate, truly

To Indicate Place: above, adjacent to, below, elsewhere, farther on, here, near, nearby, on the other side, opposite to, there, to the east, to the left

To Indicate Time: after a while, afterward, as long as, as soon as, at last, at length, at that time, before, earlier, formerly, immediately, in the meantime, in the past, lately, later, meanwhile, now, presently, shortly, simultaneously, since, so far, soon, subsequently, then, thereafter, until, until now, when

To Repeat, Summarize, or Conclude: all in all, altogether, as has been said, in brief, in conclusion, in other words, in particular, in short, in simpler terms, in summary, on the whole, that is, therefore, to put it differently, to summarize

To Show Cause or Effect: accordingly, as a result, because, consequently, for this purpose, hence, otherwise, since, then, therefore, thereupon, this, to this end, with this object.


Butte College | 3536 Butte Campus Drive, Oroville CA 95965 | General Information (530) 895-2511


University of Wisconsin Whitewater

  • LEARN Center
  • Research-Based Teaching Tips

Short Answer & Essay Tests

Strategies, Ideas, and Recommendations from the Faculty Development Literature

General Strategies

Save essay questions for testing higher levels of thought (application, synthesis, and evaluation), not recall of facts. Appropriate tasks for essays include:

Comparing: Identify the similarities and differences between...
Relating cause and effect: What are the major causes of...? What would be the most likely effects of...?
Justifying: Explain why you agree or disagree with the following statement.
Generalizing: State a set of principles that can explain the following events.
Inferring: How would character X react to the following?
Creating: What would happen if...?
Applying: Describe a situation that illustrates the principle of...
Analyzing: Find and correct the reasoning errors in the following passage.
Evaluating: Assess the strengths and weaknesses of...

There are three drawbacks to giving students a choice. First, some students will waste time trying to decide which questions to answer. Second, you will not know whether all students are equally knowledgeable about all the topics covered on the test. Third, since some questions are likely to be harder than others, the test could be unfair.

Tests that ask only one question are less valid and reliable than those with a wider sampling of test items. In a fifty-minute class period, you may be able to pose three essay questions or ten short answer questions.

To reduce students' anxiety and help them see that you want them to do their best, give them pointers on how to take an essay exam. For example:

  • Survey the entire test quickly, noting the directions and estimating the importance and difficulty of each question. If ideas or answers come to mind, jot them down quickly.
  • Outline each answer before you begin to write. Jot down notes on important points, arrange them in a pattern, and add specific details under each point.

Writing Effective Test Questions

Avoid vague questions that could lead students to different interpretations. If you use the word "how" or "why" in an essay question, students will be better able to develop a clear thesis. Examples of essay and short-answer questions:

Poor: What are three types of market organization? In what ways are they different from one another?

Better: Define oligopoly. How does oligopoly differ from both perfect competition and monopoly in terms of number of firms, control over price, conditions of entry, cost structure, and long-term profitability?

Poor: Name the principles that determined postwar American foreign policy.

Better: Describe three principles on which American foreign policy was based between 1945 and 1960; illustrate each of the principles with two actions of the executive branch of government.

If you want students to consider certain aspects or issues in developing their answers, set them out in a separate paragraph. Leave each question on a line by itself.

Write a model answer yourself and use it to help you revise the question, as needed, and to estimate how much time students will need to complete the question. If you can answer the question in ten minutes, students will probably need twenty to thirty minutes. Use these estimates in determining the number of questions to ask on the exam. Give students advice on how much time to spend on each question.
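
The two-to-three-times rule of thumb above is easy to turn into arithmetic. The helper below is an illustrative sketch, not something from the article; the function name and default values are invented for the example.

```python
# Illustrative sketch of the timing rule of thumb: students need roughly
# two to three times the instructor's own answering time per question.

def questions_per_period(instructor_minutes, period_minutes=50, factor=3):
    """Rough count of questions of a given size that fit in one period,
    using the conservative end (3x) of the rule of thumb by default."""
    student_minutes = instructor_minutes * factor
    return period_minutes // student_minutes

# A question the instructor answers in 10 minutes costs students ~30 minutes,
# so only one such question fits comfortably in a 50-minute period.
print(questions_per_period(10))  # -> 1

# Short-answer items the instructor answers in 2 minutes allow about 8.
print(questions_per_period(2))   # -> 8
```

Using the milder 2x factor yields more questions per period, which is one way to trade depth for coverage when planning the exam.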

Decide which specific facts or ideas a student must mention to earn full credit and how you will award partial credit. Below is an example of a holistic scoring rubric used to evaluate essays:

  • Full credit-six points: The essay clearly states a position, provides support for the position, and raises a counterargument or objection and refutes it.
  • Five points: The essay states a position, supports it, and raises a counterargument or objection and refutes it. The essay contains one or more of the following ragged edges: evidence is not uniformly persuasive, counterargument is not a serious threat to the position, some ideas seem out of place.
  • Four points: The essay states a position and raises a counterargument, but neither is well developed. The objection or counterargument may lean toward the trivial. The essay also seems disorganized.
  • Three points: The essay states a position, provides evidence supporting the position, and is well organized. However, the essay does not address possible objections or counterarguments. Thus, even though the essay may be better organized than the essay given four points, it should not receive more than three points.
  • Two points: The essay states a position and provides some support but does not do it very well. Evidence is scanty, trivial, or general. The essay achieves its length largely through repetition of ideas and inclusion of irrelevant information.
  • One point: The essay does not state the student's position on the issue. Instead, it restates the position presented in the question and summarizes evidence discussed in class or in the reading.

Try not to bias your grading by carrying over your perceptions about individual students. Some faculty ask students to put a number or pseudonym on the exam and to place that number / pseudonym on an index card that is turned in with the test, or have students write their names on the last page of the blue book or on the back of the test.
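
The pseudonym bookkeeping described above can be sketched in a few lines; all names, pseudonyms, and scores here are invented for illustration.

```python
# Minimal sketch of anonymous grading: exams carry only a pseudonym, and the
# pseudonym-to-name mapping (the "index card") is consulted only after all
# grading is done. All names, pseudonyms, and scores are invented.

roster = {"blue-falcon": "A. Student", "quiet-river": "B. Learner"}

# Grade against pseudonyms only...
scores_by_pseudonym = {"blue-falcon": 5, "quiet-river": 4}

# ...then resolve real names once grading is finished.
scores_by_name = {roster[p]: s for p, s in scores_by_pseudonym.items()}
print(scores_by_name)  # {'A. Student': 5, 'B. Learner': 4}
```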

Before you begin grading, you will want an overview of the general level of performance and the range of students' responses.

Identify exams that are excellent, good, adequate, and poor. Use these papers to refresh your memory of the standards by which you are grading and to ensure fairness over the period of time you spend grading.

Shuffle papers before scoring the next question to distribute your fatigue factor randomly. By randomly shuffling papers you also avoid ordering effects.
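
That routine can be sketched as follows, assuming papers are tracked by a numeric ID (the IDs are invented for the example):

```python
# Reshuffling the stack before each question distributes grader fatigue
# randomly, so no student's answer is always graded last.
import random

paper_ids = list(range(1, 31))       # a stack of 30 exam papers
for question in ("Q1", "Q2", "Q3"):
    random.shuffle(paper_ids)        # a fresh random order per question
    for paper_id in paper_ids:
        pass                         # grade this paper's answer to `question`
```

Seeding the generator with `random.seed` would make the orders reproducible if you ever need to reconstruct them.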

Don't let handwriting, use of pen or pencil, format (for example, many lists), or other such factors influence your judgment about the intellectual quality of the response.

Write brief notes on strengths and weaknesses to indicate what students have done well and where they need to improve. The process of writing comments also keeps your attention focused on the response. And your comments will refresh your memory if a student wants to talk to you about the exam.

Focus on the organization and flow of the response, not on whether you agree or disagree with the students' ideas. Experienced faculty note, however, that students tend not to read their returned final exams, so you probably do not need to comment extensively on those.

Most faculty tire after reading ten or so responses. Take short breaks to keep up your concentration. Also, try to set limits on how long to spend on each paper so that you maintain your energy level and do not get overwhelmed. However, research suggests that you read all responses to a single question in one sitting to avoid extraneous factors influencing your grading (for example, time of day, temperature, and so on).

Wait two days or so and review a random set of exams without looking at the grades you assigned. Rereading helps you increase your reliability as a grader. If your two scores differ, take the average.
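
That averaging step amounts to the following; the exam labels and scores are invented for illustration.

```python
# Two independent grading passes over the same sample of exams, averaged
# where they differ. Exam labels and scores are invented for illustration.
first_pass  = {"exam07": 5, "exam12": 3, "exam19": 4}
second_pass = {"exam07": 5, "exam12": 4, "exam19": 4}

final = {exam: (first_pass[exam] + second_pass[exam]) / 2 for exam in first_pass}
print(final)  # {'exam07': 5.0, 'exam12': 3.5, 'exam19': 4.0}
```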

This protects students' privacy when you return tests or when students pick them up.

Returning Essay Exams

A quick turnaround reinforces learning and capitalizes on students' interest in the results. Try to return tests within a week or so.

Give students a copy of the scoring guide or grading criteria you used. Let students know what a good answer included and the most common errors the class made. If you wish, read an example of a good answer and contrast it with a poor answer you created. Give students information on the distribution of scores so they know where they stand.

Some faculty break the class into small groups to discuss answers to the test. Unresolved questions are brought up to the class as a whole.

Ask students to tell you what was particularly difficult or unexpected. Find out how they prepared for the exam and what they wish they had done differently. Pass along to next year's class tips on the specific skills and strategies this class found effective.

Include a copy of the test with your annotations on ways to improve it, the mistakes students made in responding to various questions, the distribution of students' performance, and comments that students made about the exam. If possible, keep copies of good and poor exams.

The Strategies, Ideas and Recommendations Here Come Primarily From:

Gross Davis, B. Tools for Teaching. San Francisco: Jossey-Bass, 1993.

McKeachie, W. J. Teaching Tips. (10th ed.) Lexington, Mass.: Heath, 2002.

Walvoord, B. E. and Johnson Anderson, V. Effective Grading. San Francisco, Jossey-Bass, 1998.

And These Additional Sources...

Brooks, P. Working in Subject A Courses. Berkeley: Subject A Program, University of California, 1990.

Cashin, W. E. "Improving Essay Tests." Idea Paper, no. 17. Manhattan: Center for Faculty Evaluation and Development in Higher Education, Kansas State University, 1987.

Erickson, B. L., and Strommer, D. W. Teaching College Freshmen. San Francisco: Jossey-Bass, 1991.

Fuhrmann, B. S., and Grasha, A. F. A Practical Handbook for College Teachers. Boston: Little, Brown, 1983.

Jacobs, L. C., and Chase, C. I. Developing and Using Tests Effectively: A Guide for Faculty. San Francisco: Jossey-Bass, 1992.

Jedrey, C. M. "Grading and Evaluation." In M. M. Gullette (ed.), The Art and Craft of Teaching. Cambridge, Mass.: Harvard University Press, 1984.

Lowman, J. Mastering the Techniques of Teaching. San Francisco: Jossey-Bass, 1984.

Ory, J. C. Improving Your Test Questions. Urbana: Office of Instructional Resources, University of Illinois, 1985.

Tollefson, S. K. Encouraging Student Writing. Berkeley: Office of Educational Development, University of California, 1988.

Unruh, D. Test Scoring Manual: Guide for Developing and Scoring Course Examinations. Los Angeles: Office of Instructional Development, University of California, 1988.

Walvoord, B. E. Helping Students Write Well: A Guide for Teachers in All Disciplines. (2nd ed.) New York: Modern Language Association, 1986.


COMMENTS

  1. PDF PREPARING EFFECTIVE ESSAY QUESTIONS

    An essay question is a test item which contains the following elements: 1. Requires examinees to compose rather than select their response. 2. Elicits student responses that must consist of more than one sentence. ... understanding them, educators may use an essay question when another item type would . Essay Questions: Essay Questions:

  2. Improving Your Test Questions

    An essay test item can be classified as either an extended-response essay item or a short-answer essay item. The latter calls for a more restricted or limited answer in terms of form or scope. An example of each type of essay item follows. Sample Extended-Response Essay Item

  3. Lesson 6.3: Understanding Test Items

    Short Answer Test Items Explanation. This type of test item usually involves a short answer of approximately 5-7 sentences. Typical short answer items will address only one topic and require only one "task" (see "essay test items," below, for a test item requiring more than one task). Example. Define the term "mnemonic." Tips

  4. Tips for Creating and Scoring Essay Tests

    There are two types of essay questions: restricted and extended response. ... If your essay item is part of a larger objective test, make sure that it is the last item on the exam. Scoring the Essay Item . One of the downfalls of essay tests is that they lack in reliability. Even when teachers grade essays with a well-constructed rubric ...

  5. Essay Test: Types, Advantages and Limitations

    Advantages of the Essay Tests: 1. It is relatively easier to prepare and administer a six-question extended- response essay test than to prepare and administer a comparable 60-item multiple-choice test items. 2. It is the only means that can assess an examinee's ability to organise and present his ideas in a logical and coherent fashion. 3.

  6. Constructing Test Items (Guidelines & 7 Common Item Types)

    A LOFT exam is a test where the items are drawn from an item bank pool and presented on the exam in a way that each person sees a different set of items. The difficulty of the overall test is controlled to be equal for all examinees. LOFT exams utilize automated item generation ( AIG) to create large item banks.

  7. PDF Construction and Marking of Essay-type Test Items: a Practical

    essay-type test items can be challenging for educators. The process requires careful consideration of several factors, such as the learning objectives, the scope of the question, the format, the ...

  8. PDF Essay Exams: Common Question Types

    Essay Exams: Common Question Types, Spring 2009. Rev. Summer 2014. 1 of 2 Essay Exams: Common Question Types When approaching any essay exam, it is important to identify what kind of response is expected—that is, what is being asked of you and what information you are required to include.

  9. PDF Essay Items

    factual information. In this chapter on essay items, you'll find other tools that you can use to assess other types of outcomes. Here, we'll focus on essay items —those items where the test taker is expected to write a coherent and informative response to a ques-tion. Forget about that Friday spelling test or even the SAT—essay

  10. What makes effective test questions and answers for assessments?

    Test items like these usually come in the form of long answers, essays, or performance-based evaluations. According to the Eberly Center for Teaching Excellence and Educational Innovation at Carnegie Mellon University , "There is no single best type of exam question: the important thing is that the questions reflect your learning objectives."

  11. PDF Handbook on Test Development: Helpful Tips for Creating Reliable and

    2.2 Writing Essay Test Items Essay items are useful when examinees have to show how they arrived at an answer. A test of writing ability is a good example of the kind of test that should be given in an essay response format. This type of item, however, is difficult to score reliably and can require a significant amount of time to be graded.

  12. Essay type test

    It defines essay tests as requiring students to compose lengthy responses of several paragraphs. Essay tests measure higher-level thinking like analysis, synthesis, and evaluation. They give students freedom in how they respond. Essay tests can assess recall, writing ability, understanding, and factual knowledge.

  13. Types of Questions in Teacher Made Achievement Tests: A Comprehensive

    Essay Type Test Items. Essay test items call for longer, more detailed responses from students. These questions evaluate not just recall of information but also critical thinking, organization of thoughts, and the ability to communicate effectively through writing.

  14. Writing Good Multiple Choice Test Questions

    1. Avoid complex multiple choice items, in which some or all of the alternatives consist of different combinations of options. As with "all of the above" answers, a sophisticated test-taker can use partial knowledge to achieve a correct answer. 2. Keep the specific content of items independent of one another.

  15. Essay Type Test: Advantages, Disadvantages, Limitations

    Essay-type tests are less objective and so lack validity. 2. Cramming Habit. Essay-type tests encourage cramming habits in students. 3. Lack of Reliability. The reliability of essay-type tests is low compared to multiple-choice or other objective-type questions. 4.

  16. Advantages, Disadvantages of Different Types of Test Questions

    Advantages. Save instructors the time and energy involved in writing test questions. Use the terms and methods that are used in the book. Disadvantages. Rarely involve analysis, synthesis, application, or evaluation (cross-discipline research documents that approximately 85 percent of the questions in test banks test recall) Limit the scope of ...

  17. PDF Is This a Trick Question?

    test item types are discussed: multiple choice, true-false, matching, completion, and essay. Information covers the appropriate use of each item type, advantages and disadvantages of each item type, and characteristics of well written items. Suggestions for addressing higher order thinking skills for each item type are also presented.

  18. PDF Classroom Tests: Writing and Scoring Essay

    item format. Instead, it is much more desirable to present students with a problem to solve, and to evaluate students with respect to the processes they used to solve the problem. Examples of item types measuring deep understanding include essay and short-answer questions. Essay and short-answer items are sometimes referred to as constructed-response items.

  19. Essay Tests

    Essay Tests. There are basically two types of exams: Objective - requires answers of a word or short phrase, or the selection of an answer from several available choices that are provided on the test. Essay - requires answers to be written out at some length. The student functions as the source of information.

  20. Short Answer & Essay Tests

    Ask students to write more than one essay. Tests that ask only one question are less valid and reliable than those with a wider sampling of test items. In a fifty-minute class period, you may be able to pose three essay questions or ten short answer questions. Give students advice on how to approach an essay or short-answer test.

  21. (PDF) CONSTRUCTION AND MARKING OF ESSAY-TYPE TEST ITEMS ...

    This study aims to test the reliability and items characteristics of the essay test instrument to measure the higher-order thinking skills of social science lessons of junior high school grade 8.

  22. (PDF) Essay Items

    This encyclopedic entry on essay items provides a general definition of this item type, scoring procedures, and challenges to gathering validity evidence.

  23. SUPPLY TYPE or SUBJECTIVE TYPE of TEST ITEMS

    1. Constructed response tests, also known as supply type items, require students to create and supply their own answers rather than selecting from multiple choices. 2. There are two main types of constructed response tests: short answer/completion type and essay type items. Completion type questions require students to supply a word, symbol, or number to complete a statement, while essay ...