Closed-response questions can be tailored to provide a number
of benefits. Below are some suggestions for maximizing the benefits
of using and analyzing information from closed-response
questions.
A) Get data that are as precise as you need without imposing
too much of a burden on respondents.
Example:
If you are trying to track whether students' ages are related
to their understanding of certain science concepts, it is better
to ask for specific ages rather than age ranges (such as
"5-8", "9-12", and "13-16").
Conversely, you may not always need as much precision as you
could get. If you are collecting data on how technology is being
used in a particular class, it might be appropriate to ask how
many class periods in a week certain software programs are used,
but not necessary to ask how many minutes, because:
- Enough is known about the average class period to get an
approximate sense of the number of minutes
- There is no theory underlying the evaluation that suggests
that small variations in minutes are going to make a difference
- Asking respondents to count minutes would impose too much
of a burden on them and discourage them from filling out the
questionnaire
B) Use common response scales for sets of items if you want to
be able to compare results across sets or generate summary data
such as sums, means, medians, and modes.
Example:
You are posing questions about how often trainees use computers
for in-class work such as online discussions and research
on the Web.
You ask respondents to rate the frequency of each task on a
response scale, where 1=never; 2=1-3 times a month; 3=4-7 times
a month; and 4=8 or more times a month. This allows you to draw
summary conclusions such as, "55% of trainees report that
their classes use the computer for online discussion sessions
4-7 times a month. On the other hand, only 24% report that
their classes use computers to do research on the Web that
often."
C) For all questions that share a response scale, apply the response
scale in a consistent direction (e.g., low to high or high to
low), and use the same number and set of rating categories.
Example:
You have the following questions:
- How useful were the materials?
- How useful was the instructor?
- How useful were the group discussions?
- How useful was the follow-up?
You choose the following 4-point response scale for the first
question: 1=poor; 2=fair; 3=good; 4=excellent. Instead of using
that response scale for all of the questions, however, you mistakenly
use different ones. Below are examples of response scale errors
that make it difficult to compare results across questions:
Questions | Response scale errors | Type of violation
1. How useful were the materials? | 1=poor; 2=fair; 3=good; 4=excellent | No violation
2. How useful was the instructor? | 1=excellent; 2=good; 3=fair; 4=poor | Reverse order of question 1's response scale
3. How useful were the group discussions? | 1=unsatisfactory; 2=satisfactory; 3=fair; 4=outstanding | Same order as question 1, but the response categories are different
4. How useful was the follow-up? | 1=poor; 2=fair; 3=good; 4=very good; 5=excellent | Has 5 instead of 4 response categories
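If a reversed scale like error 2 has already slipped into collected data, the usual repair is to reverse-code that item before comparing it with the others. The sketch below is illustrative only and assumes the 4-point scale shown above; note that errors 3 and 4 (different labels, different numbers of categories) cannot be patched this way, which is why consistency matters at design time.

```python
def reverse_code(value, scale_min=1, scale_max=4):
    """Map a rating from a reversed scale onto the direction the other items use
    (e.g., on a 1-4 scale, 4=poor becomes 1 and 1=excellent becomes 4)."""
    return scale_max + scale_min - value

# Invented ratings for question 2, collected on the reversed scale
# (1=excellent; 2=good; 3=fair; 4=poor) from the table above.
instructor_reversed = [1, 2, 1, 3, 2]

# After reverse-coding, low again means "poor" and high means "excellent",
# so the item can be summarized alongside question 1.
instructor = [reverse_code(v) for v in instructor_reversed]
print(instructor)                          # [4, 3, 4, 2, 3]
print(sum(instructor) / len(instructor))   # mean on the common 1-4 direction
```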
D) Use response scales with an even number of response
categories if that will make informed respondents more likely to
commit to a position. Odd-numbered response scales make it too easy
for respondents to be non-committal "fence-sitters" because
they can select the midpoint.
Example:
In the opinion questions below, the "No opinion" response
category enables the respondent to be a fence-sitter. Though
such an option is appropriate where respondents may lack the
knowledge to form an opinion, it is inappropriate here because
all of the respondents attended the workshop and therefore know
enough to express an opinion.
Questions | Strongly disagree | Disagree | No opinion | Agree | Strongly agree
1. The content of the training workshop was interesting to me. | 1 | 2 | 3 | 4 | 5
2. Brainstorming with other workshop participants was useful. | 1 | 2 | 3 | 4 | 5
3. I intend to use the workshop materials in my courses. | 1 | 2 | 3 | 4 | 5
4. The follow-up activities were useful to me. | 1 | 2 | 3 | 4 | 5
It would be better to eliminate the "No opinion" option
and reduce the scale to four categories (Strongly disagree, Disagree,
Agree, Strongly agree).
E) Label ALL response categories. Labels make the categories
clearer, and the clearer they are, the more likely multiple
respondents are to interpret them the same way.
Example of what NOT to do:
Questions | Very dissatisfied |  |  | Very satisfied
How satisfied are you with the training? | 1 | 2 | 3 | 4
Example of what to do:
Questions | Very dissatisfied | Somewhat dissatisfied | Somewhat satisfied | Very satisfied
How satisfied are you with the training? | 1 | 2 | 3 | 4
F) Build in alternative choices that make it possible for all
respondents to answer the question, even if the premise of the
question does not apply to them or the other choices do not capture
what they want to say. Otherwise, you force them to either be
untruthful or leave the answer blank.
Below are examples. In each question, the alternative choices
are the final items in the list (for example, "other,"
"don't remember," and "don't understand the question").
Examples from a questionnaire to teachers about their school
computer program:
Check which of the following activities you use computers for
in your classes:
__ for drill and practice
__ for administrative tasks such as grading
__ for student projects
__ as visual aids in large group instruction
__ other (specify here: _______________________)
What has your principal done to support your school's technology
program?
___ initiated staff development about technology
___ hired computer resource staff
___ regularly updated computer equipment
___ led fundraisers for the purchase of software
___ written grant proposals for hardware purchases
___ the school has a technology program but the principal
has done nothing to support it
___ the school does not have a technology program
___ other (specify here: _______________________)
Over the past year, about how much of your time in class did
you spend engaging your students in constructivist activities?
___ 0%
___ 1% to 25%
___ 26% to 50%
___ 51% to 75%
___ 76% to 100%
___ don't remember
___ don't understand the question
NOTE: Some of the alternative choices in the questions above
are needed because the questions rest on false premises. For
more on false premises, see Step 5.