Instrument Data Preparation

Transforming instrument data into a systematic, error-free format that can be analyzed according to the evaluation plan

After the administration of a data-gathering instrument, several steps are usually required before the data are ready for the intended form of analysis.
Qualitative data typically require some form of reduction before meaningful analysis of project impact is possible. It is possible, for example, to scrutinize case study observations or unstructured verbal data for common themes and to devise coding systems based on these themes. Sometimes, quantitative information can be extracted directly from the data (e.g., the amount of time spent on a training concept). Even when the intent is to develop richly descriptive comparative case studies, considerable work is required to transform the raw qualitative data (e.g., observer notes, interview transcripts) into clear, compelling narratives with a consistent structure.
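Once a coding system has been devised, one simple reduction step is to tally how many coded segments fall under each theme. The sketch below is a minimal illustration only, not a prescribed procedure: the segment file, its theme_code column, and the file name are hypothetical, and developing the codes themselves remains interpretive work done by the analyst.

```python
import csv
from collections import Counter


def tally_theme_codes(coded_segments_path):
    """Count how many coded transcript segments were assigned to each theme.

    Assumes a hypothetical CSV in which each row is one segment and the
    column "theme_code" holds the code the analyst assigned to it.
    """
    counts = Counter()
    with open(coded_segments_path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["theme_code"].strip()] += 1
    return counts


if __name__ == "__main__":
    # Hypothetical file name; prints themes from most to least frequent.
    for theme, n in tally_theme_codes("coded_segments.csv").most_common():
        print(f"{theme}: {n} segments")
```

A tabulation like this supports simple quantitative summaries of the coded data while the full transcripts remain available for the narrative case studies.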
Quantitative data typically need to be readied for computer analysis using statistical methods selected as the best means of answering the evaluation questions. This preparation begins with data checking, in which the raw data are examined and any inconsistencies are resolved. Next comes data reduction, in which the data are entered according to a predetermined file format and set of codes. Data entry should then be verified by a second coder or a second entry process. Last, data cleaning should be conducted if the resulting data file contains cases that are incomplete, inaccurate, or nonsensical.
For example, finding a code of "6" for a question that used a 4-point scale suggests a coding error. More serious problems arise when data scores defy reasonable patterns. For example, if a student has a pretest score of 70 and a posttest score of 30 on the same test, this suggests that the posttest was not completed or coded properly. If left uncorrected, the scores from this case would seriously distort the results of most moderately sized evaluations.
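Checks like these can be scripted so that every case is screened the same way before analysis. The sketch below is a minimal example under stated assumptions, not a prescribed procedure: the column names (student_id, q1-q3, pretest, posttest), the 4-point scale, and the drop threshold are all hypothetical and would need to match the actual file format and codebook; cases it flags are then reviewed and corrected (or removed) by hand.

```python
import csv

# Hypothetical coding scheme: items q1-q3 use a 4-point scale, and the same
# test is given as pretest and posttest. Adjust to the actual codebook.
SCALE_ITEMS = ["q1", "q2", "q3"]
SCALE_MAX = 4
MAX_PLAUSIBLE_DROP = 30   # hypothetical cutoff for a pretest-to-posttest drop


def check_case(row):
    """Return a list of problems found in one case (one row of the data file)."""
    problems = []

    # Range check: a code of "6" on a 4-point scale indicates a coding error.
    for item in SCALE_ITEMS:
        value = (row.get(item) or "").strip()
        if not value:
            problems.append(f"{item} is missing")
        elif not value.isdigit() or not 1 <= int(value) <= SCALE_MAX:
            problems.append(f"{item}={value!r} is not a valid 1-{SCALE_MAX} code")

    # Consistency check: a large drop from pretest to posttest on the same test
    # suggests the posttest was not completed or was not coded properly.
    try:
        pre, post = int(row["pretest"]), int(row["posttest"])
    except (KeyError, TypeError, ValueError):
        problems.append("pretest or posttest score missing or non-numeric")
    else:
        if pre - post > MAX_PLAUSIBLE_DROP:
            problems.append(f"posttest ({post}) implausibly low versus pretest ({pre})")

    return problems


def write_cleaning_report(data_path, report_path):
    """Screen every case in the coded data file and list those needing review."""
    with open(data_path, newline="") as data, open(report_path, "w") as report:
        for row in csv.DictReader(data):
            for problem in check_case(row):
                report.write(f"case {row.get('student_id', '?')}: {problem}\n")


if __name__ == "__main__":
    # Hypothetical file names.
    write_cleaning_report("coded_data.csv", "cleaning_report.txt")
```

The same checks can be rerun after corrections are made to confirm that the cleaned file passes before analysis begins.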

User-Friendly Handbook for Project Evaluation
Program Evaluation Standards, A7 Systematic Information: The information collected, processed, and reported in an evaluation should be systematically reviewed, and any errors found should be corrected.
Program Evaluation Standards, A8 Analysis of Quantitative Information: Quantitative information in an evaluation should be appropriately and systematically analyzed so that evaluation questions are effectively answered.
Program Evaluation Standards, A9 Analysis of Qualitative Information: Qualitative information in an evaluation should be appropriately and systematically analyzed so that evaluation questions are effectively answered.

Reports of Test Construction Practices

Including detailed descriptions of instrument development and characteristics in all reporting

Reporting on the instrument construction process, and on the resulting characteristics and use of the instrument, should be sufficiently detailed to show the adequacy of the instrument for its original use and for other potential uses. The technical qualities outlined above should be included in this description.

Standards for Educational and Psychological Testing

Making descriptions and copies of instruments available to the research and evaluation community

Sharing descriptions and copies of instruments with the research and evaluation community is beneficial because it increases the breadth and quality of the resources available to the community. There are many venues for this sharing, including technical reports, presentations to stakeholders, published reports and articles, and Web sites such as OERL.

Standards for Educational and Psychological Testing