A Refresh on Writing Quality Multiple Choice Questions

To ensure that multiple choice questions are accurate and reliable measures of student achievement, it may be time for a refresh on how to write them.
by LSA Learning & Teaching Technology Consultants

Multiple choice questions are one of the most popular assessment techniques in higher education because they offer many advantages: they are easy and reliable to grade, they can be used throughout a course for quick, formative assessment, and they are flexible enough to test a broad range of topics.

However, one drawback of multiple choice questions is that unprepared students can potentially guess and receive credit, since the correct answer is selected from a set of options rather than recalled. The best practices below can help ensure that your multiple choice questions are accurate and reliable measures of student achievement.

Multiple choice questions are made up of two parts: (1) the stem and (2) several choices, including one correct option and several distractors. The stem is the first part of the question; it should introduce the central idea clearly and briefly, in positive terms that avoid words such as “not” or “except.” The second part is made up of the choices: one correct answer along with several alternatives, also known as distractors (“Designing good multiple choice questions”).

Let’s start with best practices for writing the question stem 

  • The stem should stand alone as a clear, meaningful question rather than an incomplete statement. Otherwise, students may be tested on how well they draw inferences instead of how well they understand content. Essentially, students should be able to read the stem, cover the options, and generate the correct answer without even seeing the options. For example, a stem such as “Which of the following is a true statement?” does not provide any meaningful content. See the example in Figure 1 below.
Figure 1.
  • Be sure to eliminate any excessive or irrelevant information from the stem unless the purpose is to explicitly assess whether a student can interpret and synthesize information (for example, to determine the most likely diagnosis for a patient). 
  • Avoid using negatively stated stems, which can increase the risk that a student will overlook the negative. Occasionally, negative items are appropriate for objectives dealing with health or safety issues, where knowing what not to do is important. In these scenarios, the negative should be emphasized with bold, italics, underlining, or capitalization to help avoid confusion.
  • Avoid multiple true-false questions, which require students to select all of the options that are “true.” This type of question forces students to guess what the instructor had in mind when writing it, because it is easy to argue about ambiguities between completely true and completely false. This type of stem also tends to assess recall of an isolated fact.

Best practices for writing alternatives/distractors

  • Each question should include at least three distractors to minimize students’ ability to guess the correct answer. However, more than five alternatives can make the test too cumbersome for students, so most instructors aim for four choices in total.
  • Make sure the stem does not contain grammatical clues that would help students guess the correct answer, such as the article “a” or “an” before the options, verb forms that agree with only one option, or singular versus plural nouns.
  • Try to keep the options about the same length and at the same level of detail; otherwise, students may be inclined to select the longest, most detailed option. Some test-takers believe that if it’s too long, it can’t be wrong.
  • Don’t let the distractors get too long, or you risk testing students’ reading speed rather than their content knowledge. One way to reduce reading time is to move words and phrases that repeat across the options into the stem.
  • Make sure the incorrect alternatives are plausible. The best distractors often come from common student misconceptions and should appeal to low scorers who have not mastered the material. This helps discriminate between students who actually know the answer and those who are merely guessing well after eliminating unrealistic choices.
  • Write the distractors before the correct answer. When the correct answer is written first, the distractors often end up as variations of it, and a wise student will select the option that combines the most common elements of the other choices.
  • Avoid using “all of the above,” because test-takers who can identify more than one alternative as correct can ultimately select the correct answer. Conversely, if “none of the above” is the correct answer, students who can eliminate a single option can then eliminate all options. In both cases, students can use partial knowledge to arrive at the correct answer. Flaws like these can even be screened for automatically; see the sketch after this list.
  • Present the alternatives in some logical order, such as alphabetical or numerical. Students know that correct answers are most often placed at C or D, so always picking one of those options improves their odds of guessing correctly; a content-neutral ordering removes that positional clue.
  • Closely review the final exam to look for overlapping content. Often students will use information in one question to answer another question. However, locking test questions so students cannot return to previous questions is not recommended.
Figure 2.
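
Several of the guidelines above are mechanical enough to screen for automatically before a human review. The following is a minimal sketch of such a screening pass in Python; the dictionary format with “stem”, “options”, and “answer” keys is an assumed structure chosen for illustration, not a Canvas or LMS format.

```python
# A lightweight check for common multiple choice pitfalls described above.
# The item format (stem/options/answer keys) is an assumed example structure.
import statistics

NEGATIVE_WORDS = {"not", "except", "never"}

def check_question(question):
    """Return a list of warnings for one multiple choice item."""
    warnings = []
    stem = question["stem"]
    options = question["options"]

    # Flag negatively worded stems (acceptable only when the negative is emphasized).
    if NEGATIVE_WORDS & set(stem.lower().split()):
        warnings.append("Stem contains a negative word; emphasize it or rephrase.")

    # Flag "all/none of the above", which rewards partial knowledge.
    for option in options:
        if option.lower() in ("all of the above", "none of the above"):
            warnings.append(f'Avoid "{option}" as an alternative.')

    # Flag one option that is much longer than the rest, since test-takers
    # tend to pick the longest, most detailed choice.
    lengths = [len(o) for o in options]
    if max(lengths) > 2 * statistics.median(lengths):
        warnings.append("One option is much longer than the others.")

    # Check option count: at least three distractors, i.e. four choices total.
    if len(options) < 4:
        warnings.append("Fewer than three distractors; guessing is easier.")

    return warnings

# Example usage with a deliberately flawed item.
item = {
    "stem": "Which of the following is not a true statement?",
    "options": ["Mitochondria produce ATP", "All of the above"],
    "answer": 0,
}
for w in check_question(item):
    print("WARNING:", w)
```

A check like this cannot judge whether distractors are plausible or the content is correct; it only catches surface-level slips such as an unemphasized negative or a conspicuously long option.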

Creating questions that assess higher-order thinking

It is challenging, but possible, to write multiple choice questions that test higher-order thinking. The stem can be written in a way that requires students to apply course principles, analyze a problem, or evaluate something. For example, students in biochemistry can be asked to “interpret the plot and apply biochemical principles to choose the best answer” (Brame, 2013). See an example test question in Figure 2.

Canvas Quiz Analysis Measurements

One of the benefits of multiple choice questions is that they can easily provide useful data when administered in Canvas. Canvas quiz statistics provide data and visualizations that can be used to reflect on and refine future quizzes and quiz questions, including the highest and lowest quiz scores, the average amount of time spent on a quiz, and the distribution of student responses.

  • Discrimination Index - helps identify potential issues with quiz questions by displaying the correlation between how well students scored on a particular question and their overall score on the whole quiz. A high value indicates that students who answered this question correctly also did well on the test overall; a low value indicates that students who scored well on the whole quiz still got this question wrong. Reviewing questions with low values can help you quickly identify quiz questions that may need to be reframed or reworded for clarity.
  • Reliability - measures the test’s internal consistency: if several questions are designed to measure the same information, test-takers should answer them in a similar way. To check this, go to the Question Breakdown section in Canvas and compare questions that test the same content. If the percentage of students who answered correctly is similar across those questions, the exam is considered reliable. If, however, there is a drastic difference in how students score, it is worth evaluating the quality of the questions, including the stem and distractors; for example, a poorly worded or negatively structured stem could be confusing students.
  • Difficulty - shows how hard the question is to answer correctly. The index is computed as the proportion of students who answered it correctly. Ideally, all of a question’s incorrect answers should be equally appealing to the students who miss it (“Canvas Quiz Item Analysis”). A sketch of how these measures can be computed from raw scores follows this list.
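
If you are curious what these measures look like under the hood, the following minimal sketch computes all three from a 0/1 score matrix using standard psychometric formulas: difficulty as the proportion correct, discrimination as the item-total (point-biserial) correlation, and reliability as KR-20. Canvas computes and displays these statistics for you, and its internal computations may differ in detail; the small score matrix below is made-up example data.

```python
# A minimal sketch of the three item-analysis measures, computed from a
# 0/1 score matrix (rows = students, columns = questions). The formulas
# (proportion correct, point-biserial correlation, KR-20) are standard
# psychometrics; Canvas's internal computation may differ in detail.
import numpy as np

scores = np.array([  # 5 students x 4 questions, 1 = correct (example data)
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
])

totals = scores.sum(axis=1)

# Difficulty: proportion of students answering each question correctly.
difficulty = scores.mean(axis=0)

# Discrimination: correlation between each item and the total score.
discrimination = np.array([
    np.corrcoef(scores[:, j], totals)[0, 1] for j in range(scores.shape[1])
])

# Reliability (KR-20): internal consistency of the whole quiz.
k = scores.shape[1]
item_variance = (difficulty * (1 - difficulty)).sum()
kr20 = (k / (k - 1)) * (1 - item_variance / totals.var())

print("difficulty:", difficulty)
print("discrimination:", np.round(discrimination, 2))
print("KR-20 reliability:", round(kr20, 2))
```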

If you would like help writing multiple choice exams or using the Canvas Quizzes tool, contact the Learning and Teaching Consultants!

 

References:

Brame, C. (2013). Writing good multiple choice test questions. Vanderbilt University Center for Teaching. Retrieved July 1, 2022, from https://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/

Instructure Community. Canvas Quiz Item Analysis. https://community.canvaslms.com/t5/Canvas-Resource-Documents/Canvas-Quiz-Item-Analysis/ta-p/387082

Yale Poorvu Center for Teaching and Learning. Designing good multiple choice questions. https://poorvucenter.yale.edu/MultipleChoiceQuestions

 

Release Date: 10/05/2023