Glossary of Plan Components

The glossary for sound project evaluation plans is organized into four sections corresponding to the following plan components: (1) Project Description, (2) Evaluation Overview, (3) Design, and (4) Analysis Process.

Quality criteria for plan components are also available. The alignment table shows how glossary and criteria entries for plan components align with evaluation standards.

Project Description

Describes the project that will be evaluated so that the reader of the plan will understand the scope of the evaluation and the association between the project's components and its intended outcomes (e.g., impacts and payoffs).

Note: The evaluation plan need not describe the project if the plan is embedded in the project proposal.

Project Features

Describes the project's features (e.g., philosophy, rationale, goals, objectives, strategies, activities, procedures, location, duration, resources).

Project Participants, Audiences, & Other Stakeholders

Identifies individuals or groups participating in, affected by, or invested in the project.

Project Context

Identifies external influences on the project that will impact the proposed evaluation design (e.g., the timing of the project relative to other factors or events; organizational/institutional, historical, economic, political, and social conditions; demographic characteristics of project participants).

Evaluation Overview

Describes the purposes and questions that will drive the evaluation, as well as the credentials of the evaluator and the anticipated involvement of stakeholders in the evaluation.

Evaluation Purposes

Describes the goals and objectives of the evaluation. These should be focused on identifying the project's strengths and weaknesses, as well as its accomplishments and challenges, in terms of how well its implementation will be carried out (formative evaluation) and/or how successful it will be in achieving its intended outcomes (summative evaluation).

This section of the plan may also propose additional "goal-free" purposes that involve gathering and inductively analyzing data in order to understand dimensions of the project that were not anticipated when its goals were set.

Evaluation Questions

States the questions that will be answered through data collection, analysis, and interpretation. Evaluation questions are developed from the evaluation goals and objectives and state specific information needs. Questions focus on aspects and outcomes of the project that are important to the stakeholders.

Evaluator Credibility

Specifies the evaluator's credentials.

Stakeholder Involvement

Describes what interests the various stakeholders will have in the evaluation, and what roles they will play in it.

Design

Describes the strategies and procedures that will be used to gather and analyze data, as well as those that will be used to periodically review the course of the evaluation.

Methodological Approach

Specifies

  • formative or summative approaches that will be taken;
  • types of data that will be needed (e.g., quantitative, qualitative, pre-post, longitudinal); and
  • sources of the data (e.g., participants, documents).
Information Sources & Sampling

Describes the sources of information that will be used in the evaluation; these sources may include

  • records and archival documents that contain relevant information;
  • the entire population of participants in the project, if data are to be collected on all of them; and
  • the sample or samples of participants or other informants that will be observed or solicited for information, in order to maximize the generalizability of the findings to the population from which the sample or samples are to be drawn (a brief sampling sketch follows this list).
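
A minimal sketch of the simple-random-sampling idea, assuming Python's standard library; the roster, sample size, and seed below are illustrative placeholders, not recommendations for any particular evaluation:

    import random

    # Hypothetical roster of project participants (placeholder names).
    roster = [f"participant_{i:03d}" for i in range(1, 201)]

    # Fixing the seed makes the draw reproducible, so the plan can
    # document exactly how the sample was obtained.
    random.seed(42)

    # Simple random sample of 30 participants, drawn without replacement.
    sample = random.sample(roster, k=30)

    print(f"Sampled {len(sample)} of {len(roster)} participants")
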
Instruments

Describes the design and content of the instruments that will be used to collect and analyze data (e.g., survey questionnaires, interview protocols, observation forms, learning assessments).

Data Collection Procedures & Schedule

Describes how the data and other information will be gathered to meet the criteria of validity and reliability. Also describes the intended frequency, order, and duration of the various data collection activities.

Meta-Evaluation

Describes procedures that will be undertaken to review the quality of the evaluation being conducted.

Analysis Process

Describes the type or types of analyses that will be conducted (e.g., quantitative, qualitative, mixed methods) and procedures that will be used for examining results and ensuring their trustworthiness, such as

  • training that will be conducted to ensure reliable coding and scoring of data,
  • systematic checks of the data to remove errors (see the sketch after this list), and
  • procedures for reducing and summarizing the data.
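
As one hedged illustration of such systematic checks, the Python sketch below flags out-of-range values in a set of hypothetical survey scores; the 1-to-5 scale and the scores themselves are assumptions made for the example:

    # Hypothetical raw scores from a survey item scored 1 through 5.
    raw_scores = [4, 5, 2, 9, 3, -1, 4, 5]

    # Keep in-range records; flag the rest for correction rather than
    # silently dropping or analyzing them.
    valid = [s for s in raw_scores if 1 <= s <= 5]
    flagged = [s for s in raw_scores if not 1 <= s <= 5]

    print(f"Kept {len(valid)} records; flagged {len(flagged)}: {flagged}")
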
Quantitative Analysis

Describes in general terms what procedures will be used to analyze numeric data (a brief sketch follows this list):

  • Organizing the data
  • Verifying it
  • Summarizing it
  • Examining relationships among variables (e.g., Pearson product-moment correlations, multiple regression, factor analysis)
  • Using inferential statistical techniques to test for significant differences between comparison groups (e.g., t-tests, analyses of variance, analyses of covariance)
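
A brief, hedged sketch of two of these procedures, assuming Python with scipy and entirely made-up scores for a treatment and a control group:

    from scipy import stats

    # Hypothetical post-test scores (placeholder numbers only).
    treatment = [78, 85, 81, 90, 74, 88, 83, 79, 86, 80]
    control = [72, 77, 70, 83, 69, 75, 74, 71, 79, 73]

    # Independent-samples t-test for a difference between group means.
    t_stat, p_value = stats.ttest_ind(treatment, control)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

    # Pearson product-moment correlation between two hypothetical
    # variables, e.g., sessions attended and post-test score.
    attendance = [10, 12, 11, 14, 9, 13, 12, 10, 13, 11]
    r, r_p = stats.pearsonr(attendance, treatment)
    print(f"r = {r:.2f}, p = {r_p:.4f}")

An actual plan would name the specific tests and variables in advance rather than illustrate them generically, as is done here.
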
Qualitative Analysis

Describes the qualitative analysis procedures that will be used to compile, analyze, and interpret the data in order to find themes, patterns, and trends.
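
As a small illustration of the compiling step, the sketch below tallies hypothetical codes applied to interview segments; the code labels are placeholders, and real qualitative analysis would pair any such counts with close reading of the underlying text:

    from collections import Counter

    # Hypothetical codes assigned to interview segments during coding.
    coded_segments = [
        "peer_support", "time_constraints", "peer_support",
        "mentor_access", "time_constraints", "peer_support",
        "mentor_access", "materials_quality",
    ]

    # Tally code frequencies as a first pass at spotting candidate themes.
    for code, count in Counter(coded_segments).most_common():
        print(f"{code}: {count}")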

Not sure where to start?  
Try reading some user scenarios for plans.