Step 2: Find and adapt instruments (R).

The search for instruments should be guided by the constructs you want to measure (e.g., knowledge of planetary science, workshop satisfaction). Constructs are simply the categories of phenomena you decide to attend to in answering your evaluation questions.
Search the OERL instrument collection, which contains many different types of instruments from many different types of evaluation projects. Then search for other instrument resources from publishers, states, districts, special programs, professional organizations, and other library collections (e.g., Buros Mental Measurements Yearbooks). To get tips from professional peers about resources, you can pose questions on listservs such as Evaltalk (administered by the American Evaluation Association, at www.eval.org) or participate in other resource and information-sharing online communities hosted on Web sites such as the Math Forum (mathforum.org), Digital Library for Earth System Education (www.dlese.org), and Tapped In (tappedin.sri.com).
For the evaluation, screen potential instruments on the basis of their technical quality, content, and appropriateness for the sample of people from whom you want to collect data. If the instruments are not appropriate in their current state, you will need to adapt them and establish their validity and reliability with your intended respondents. Table 8 shows examples of decisions made about the adaptability of specific instruments.
Table 8. Determining adaptability of specific instruments

| Hypothetical instrument | Original respondents | Intended respondents | Differences between original use and new use | Adaptable? |
| --- | --- | --- | --- | --- |
| An interview protocol that solicits attitudes toward science | High school students | Elementary students | Same content, different grade levels | Yes, if the language and topics addressed can be appropriately simplified. |
| A workshop evaluation questionnaire | Post-secondary faculty | High school faculty | Same evaluative dimensions, different setting | Yes, if the items cover common dimensions of workshops (e.g., quality of content, pace, overall satisfaction), with appropriate changes to the names of people, places, and workshop components. |
| A learning assessment of students' knowledge of the advantages and disadvantages of different governmental structures | Students in U.S. communities | Students in other English-speaking nations | Same content, same language, different respondent culture | No, because the problem prompts and criteria for scoring in this particular test are too dependent on cultural differences and could not be sufficiently revised without major effort. |
| A questionnaire designed to screen students' readiness for administering a computer network | Students who have studied Windows-based systems | Students who have studied Linux-based systems | Same topic, different content | No, because the systems are too different to adapt the instrument without major overhaul. |