
Alignment Table for Plan Components

All Components

The alignment table for sound project evaluation plans can be viewed either as a whole, displaying all components, or as four separate tables, one for each major plan component: (1) Project Description, (2) Evaluation Overview, (3) Design, and (4) Analysis Process. See the alignment table overview for a general description of what appears in the alignment tables.

The glossary and quality criteria entries for plan components are also available on their own.

Each component below is presented with its glossary entry, its quality criteria, and the related Program Evaluation Standards.
Project Description

Describes the project that will be evaluated so that the reader of the report will understand the scope of the evaluation and the association between the project's components and its intended outcomes (e.g., impacts and payoffs).

Note: The evaluation plan need not describe the project if the plan is embedded in the project proposal.

Project Features

Describes the project's features (e.g., philosophy, rationale, goals, objectives, strategies, activities, procedures, location, duration, resources).

The plan should provide an overview of the following features of the targeted project:

  • project goals (both explicit and implicit) and objectives
  • principal project activities designed to achieve the goals
  • expected short-term and long-term outcomes

If available, additional overview information should be provided about:

  • project location and implementation sites
  • project duration
  • resources used to implement the project

If more than one site will implement the project, the plan should, where possible, describe the sites and the variation expected across them.

A1 Program Documentation
The program being evaluated should be described and documented clearly and accurately, so that the program is clearly identified.

Project Participants, Audiences, & Other Stakeholders

Identifies individuals or groups participating in, or otherwise affected by or invested in the project.

The different stakeholder groups should be identified and their relationships to the project summarized, along with whatever is already known about their perspectives that has shaped the evaluation design proposed in the plan.

U1 Stakeholder Identification
Persons involved in or affected by the evaluation should be identified, so that their needs can be addressed.

Project Context

Identifies external influences on the project that will impact the proposed evaluation design (e.g., the timing of the project relative to other factors or events; organizational/institutional, historical, economic, political, and social conditions; demographic characteristics of project participants).

An understanding of contextual factors is necessary if an evaluation is to be realistic and responsive to the conditions within which the project operates.

A2 Context Analysis
The context in which the project exists should be examined in enough detail, so that its likely influences on the project can be identified.

Evaluation Overview

Describes the purposes and questions that will drive the evaluation, as well as the credentials of the evaluator and the anticipated involvement of stakeholders in the evaluation.

Evaluation Purposes

Describes the goals and objectives of the evaluation. These should focus on identifying the project's strengths and weaknesses, as well as its accomplishments and challenges, in terms of how well its implementation is carried out (formative evaluation), how successful it is in achieving intended outcomes (summative evaluation), or both.

This section of the plan may also propose additional "goal-free" purposes that involve gathering and inductively analyzing data in order to understand dimensions of the project that were not anticipated when its goals were set.

The purposes of the evaluation should be stated in terms of goals and intended uses of results by stakeholders.

The evaluation should focus on whether or not promised project components are delivered and should compare project outcomes against the assessed needs of the targeted participants or other beneficiaries. It should also be directed at finding unanticipated outcomes, both positive and negative.

A3 Described Purposes and Procedures
The purposes and procedures of the evaluation should be monitored and described in enough detail, so that they can be identified and assessed.

Evaluation Questions

States the questions that will be answered through data collection, analysis, and interpretation. Evaluation questions are developed from the evaluation goals and objectives and state specific information needs. They focus on aspects and outcomes of the project that are important to the stakeholders.

Evaluation questions that address context, implementation, and outcome variables provide the perspective not only for the eventual interpretation of results but also for understanding the conditions under which those results are obtained.

The questions should be justified against the following criteria:

  • To which stakeholders will answers to the questions be useful, and how?
  • How will answers to the questions provide new information?

The plan can also state questions that are worth answering but will not be addressed in the evaluation due to constraints (e.g., limited time or resources, insufficiency of available data-gathering techniques).

Evaluator Credibility

Specifies the evaluator's credentials.

The professional qualifications of the evaluator should be specified in order to build trust in the evaluation as it unfolds.

U2 Evaluator Credibility
Persons conducting the evaluation should be both trustworthy and competent to perform the evaluation, so that the evaluation findings achieve maximum credibility and acceptance.

Stakeholder Involvement

Describes what interests the various stakeholders will have in the evaluation, and what roles they will play in it.

The plan should describe how the positions and perspectives of the stakeholders will be taken into account throughout the evaluation, from planning to data collection, analysis, and interpretation. Stakeholder involvement in the evaluation can be beneficial because stakeholders can help the evaluator better understand project goals and objectives, shape evaluation questions, recommend data sources, and review findings. As a consequence of being involved, stakeholders are more likely to find the results credible, useful, and relevant, and less likely to curtail evaluation operations or hinder accurate and appropriate uses of the results.

F2 Political Viability
The evaluation should be planned and conducted with anticipation of the different positions of various interest groups, so that their cooperation may be obtained, and so that possible attempts by any of these groups to curtail evaluation operations or to bias or misapply the results can be averted or counteracted.

Design

Describes the strategies and procedures that will be used to gather and analyze data, as well as those that will be used to periodically review the course of the evaluation.

Methodological Approach

Specifies:

  • formative or summative approaches that will be taken
  • types of data that will be needed (e.g., quantitative, qualitative, pre-post, longitudinal)
  • sources of the data (e.g., participants, documents)

The plan should describe the proposed methodological approaches and how, within the constraints of time and cost, they will yield data that help answer the evaluation questions. The data gathered will need to be aligned with the goals that the project is intended to achieve. The data can vary, however, in how directly they indicate the attainment of project goals. Most projects are more likely to show effects on proximal outcomes than on distal outcomes that are either logically or temporally remote. (For example, a project has been designed to improve high school students' motivation to learn science. A proximal measure of the project's success would be student self-reports of interest in science content gathered immediately before and after the project. A distal measure would be whether the students decide to study science in college.)
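
To make the proximal measure in the example concrete, the following minimal Python sketch shows one way the pre-post self-report comparison might be analyzed. The ratings, the 1-to-5 interest scale, and the choice of a paired t-test are all invented for illustration; nothing here is prescribed by the alignment table.

    # Hypothetical pre-post comparison of student-reported interest in science.
    # Ratings and the 1-5 scale are invented; a paired t-test is one common
    # choice for before/after data from the same students.
    from scipy import stats

    pre  = [2, 3, 2, 4, 3, 2, 3, 1, 2, 3]   # ratings before the project
    post = [3, 4, 3, 4, 4, 3, 4, 2, 3, 4]   # same students, after the project

    t_stat, p_value = stats.ttest_rel(post, pre)
    mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
    print(f"mean gain = {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")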

Furthermore, the approaches should be grounded in respected methodological frameworks and best-practice literature. This increases the chance that project features and context that are likely to make a difference in project operations and outcomes will be identified.

Methodological approaches that look narrowly at project inputs and solely examine the results of quantitative outcome measures may not capture all the noteworthy influences, impacts, and outcomes of a complex project. Qualitative and mixed method approaches present alternative ways of detecting impacts, especially unanticipated ones. To corroborate evaluation findings and to provide multiple perspectives, it is highly desirable that evaluators measure multiple outcomes and gather data from multiple sources (triangulation).

Important constraints on the evaluation design (e.g., lack of random assignment of respondents to treatment and comparison groups, or lack of data on long-term effects) should also be stated at this point in the plan.

U3 Information Scope and Selection
Information collected should be broadly selected to address pertinent questions about the project and be responsive to the needs and interests of clients and other specified stakeholders.

F3 Cost Effectiveness
The evaluation should be efficient and produce information of sufficient value, so that the resources expended can be justified.

Information Sources & Sampling

Describes the sources of information that will be used in the evaluation, which may include:

  • records and archival documents that contain relevant information
  • the entire population of participants in the project, if data will be collected on all of them
  • the sample or samples of participants or other informants that will be observed or solicited for information, chosen to maximize the generalizability of the findings to the population from which they are drawn

The sources of information that will be used in the evaluation should be described in enough detail to build confidence that the information will be sufficient to meet the evaluation's purposes.

The groups selected to provide information (e.g., administrators, teachers, students, parents) should be identified and briefly described. If a sample is to be drawn, the description should contain:

  • the sample selection criteria (e.g., the lowest achievers, the best instructors)
  • the process by which the sample is to be selected (e.g., random, purposive)
  • the proposed sample size
  • whether or not any comparison or control groups will be included
  • whether and how participants will be assigned to treatment and comparison groups

The extent to which the sample will be representative of the entire population should be indicated. The description of the sample should be detailed enough for users of the report to judge its representativeness and appropriateness given the scope, context, and resources of the evaluation.
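
As an illustration only, the sketch below shows how the sampling decisions listed above (selection process, sample size, group assignment) might be operationalized in Python. The roster, the sample size of 120, and simple random selection and assignment are assumptions made for the example, not recommendations.

    # Hypothetical simple random sampling and random assignment. The seed is
    # fixed so the draw can be documented and reproduced in the plan.
    import random

    random.seed(42)
    population = [f"student_{i:04d}" for i in range(1, 1201)]  # invented roster
    sample = random.sample(population, k=120)                  # assumed sample size

    random.shuffle(sample)
    treatment, comparison = sample[:60], sample[60:]           # random assignment
    print(len(treatment), len(comparison))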

A3 Described Purposes and Procedures
The purposes and procedures of the evaluation should be monitored and described in enough detail, so that they can be identified and assessed.

A4 Defensible Information Sources
The sources of information used in a program evaluation should be described in enough detail, so that the adequacy of the information can be assessed.

Instruments

Describes the design and content of the instruments that will be used to collect and analyze data (e.g., survey questionnaires, interview protocols, observation forms, learning assessments).

The plan should describe the nature of the various instruments and how they will be used to gather the needed information. Instruments should be used as intended in order for the data produced to be reliable and valid.

A3 Described Purposes and Procedures
The purposes and procedures of the evaluation should be monitored and described in enough detail, so that they can be identified and assessed.

Data Collection Procedures & Schedule

Describes how the data and other information will be gathered to meet the criteria of validity and reliability. Also describes the intended frequency, order, and duration of the various data collection activities.

The plan should describe how and when data will be obtained from the various sources and how the sources will provide corroboration and multiple perspectives.

A description of the data collection activities and their intent will provide context for the eventual judgment and interpretation of evaluation findings and recommendations.

The timing of data collection is important because the project's maturity is likely to have an impact on outcomes.

Hence, this section should describe:

  • how and when an appropriately broad range of data will be collected
  • what steps will be taken to get essential data from the sample and other targeted sources (this might include a human subjects review)
  • what steps will be taken to ensure that the data meet the criteria of validity (e.g., piloting, field testing, stakeholder review)
  • what steps will be taken to ensure that reliability is achieved (e.g., systematic training of data collectors and consistent data collection and scoring procedures; a simple agreement check of this kind is sketched below)
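
As a sketch of the agreement check mentioned above: one simple reliability indicator for trained data collectors is percent agreement and Cohen's kappa between two raters coding the same observations. The codes and labels below are hypothetical.

    # Percent agreement and Cohen's kappa for two raters (invented codes).
    from collections import Counter

    rater_a = ["on-task", "off-task", "on-task", "on-task", "off-task", "on-task"]
    rater_b = ["on-task", "off-task", "on-task", "off-task", "off-task", "on-task"]

    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement expected from each rater's marginal code frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)

    kappa = (observed - expected) / (1 - expected)
    print(f"agreement = {observed:.2f}, kappa = {kappa:.2f}")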

Different models of evaluation present different data collection needs. For example, a formative evaluation requires that ongoing project activities be assessed at points in time that enable project developers to refine the project's components.

F1 Practical Procedures
The evaluation procedures should be practical, to keep disruption to a minimum while needed information is obtained.

A3 Described Purposes and Procedures
The purposes and procedures of the evaluation should be monitored and described in enough detail, so that they can be identified and assessed.

A5 Valid Information
The information-gathering procedures should be chosen or developed and then implemented so that they will assure that the interpretation arrived at is valid for the intended use.

A6 Reliable Information
The information-gathering procedures should be chosen or developed and then implemented so that they will assure that the information obtained is sufficiently reliable for the intended use.

Meta-Evaluation

Describes procedures that will be undertaken to review the quality of the evaluation being conducted.

Evaluation purposes and procedures should be reviewed periodically, particularly during longitudinal evaluations, to determine whether the evaluation design, instruments, and procedures are adequately capturing the project's implementation, impacts, and outcomes.

A12 Meta-Evaluation
The evaluation itself should be formatively and summatively evaluated against … standards, so that its conduct is appropriately guided and, on completion, stakeholders can closely examine its strengths and weaknesses.

Analysis Process

Describes the type or types of analyses that will be conducted (e.g., quantitative, qualitative, mixed methods) and procedures that will be used for examining results and ensuring their trustworthiness, such as:

  • training that will be conducted to ensure reliable coding and scoring of data
  • systematic checks of the data to remove errors
  • procedures for reducing and summarizing the data
Quantitative Analysis

Describes in general terms the procedures that will be used to analyze numeric data:

  • organizing the data
  • verifying it
  • summarizing it
  • examining relationships among variables (e.g., Pearson product-moment correlations, multiple regression, factor analyses)
  • applying inferential statistical techniques to test for significant differences between comparison groups (e.g., t-tests, analyses of variance, analyses of covariance)

The proposed quantitative analysis procedures should be appropriate to the evaluation questions being addressed and the characteristics of the information being analyzed. The practical significance (e.g., effect sizes) and replicability, as well as statistical significance, should be considered when drawing inferences and formulating conclusions from quantitative analyses. Analyses of effects for identifiable subgroups should be planned, as appropriate, because a program may have differential effects for them.
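
As one hedged illustration of pairing statistical and practical significance, the sketch below runs an independent-samples t-test and computes Cohen's d with a pooled standard deviation. The scores are invented, and the particular test and effect-size measure are assumptions; other choices can be equally defensible.

    # Invented treatment/comparison scores; t-test plus Cohen's d (pooled SD).
    import statistics
    from scipy import stats

    treatment  = [78, 85, 82, 90, 74, 88, 81, 79, 86, 84]
    comparison = [72, 80, 75, 83, 70, 78, 76, 74, 79, 77]

    t_stat, p_value = stats.ttest_ind(treatment, comparison)

    n1, n2 = len(treatment), len(comparison)
    s1, s2 = statistics.stdev(treatment), statistics.stdev(comparison)
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    d = (statistics.mean(treatment) - statistics.mean(comparison)) / pooled
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {d:.2f}")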

Potential weaknesses in the quantitative data analysis, along with their possible influence on interpretations and conclusions, should be explained.

A8 Analysis of Quantitative Information
Quantitative information in an evaluation should be appropriately and systematically analyzed so that evaluation questions are effectively answered.

A7 Systematic Information
The information collected, processed, and reported in an evaluation should be systematically reviewed, and any errors found should be corrected.

Qualitative Analysis

Describes the qualitative analysis procedures that will be used to compile, analyze, and interpret the data in order to find themes, patterns, and trends.

The proposed qualitative analysis procedures should be appropriate to the evaluation questions being addressed and the characteristics of the information being analyzed. As the evaluation progresses, the accuracy of findings from qualitative data will need to be confirmed by gathering evidence from more than one source and by subjecting inferences to independent verification.
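
One lightweight aid to that kind of cross-source confirmation is to tally which data sources support each coded theme, so that themes resting on a single source stand out for independent verification. The themes and sources in this sketch are hypothetical placeholders.

    # Tally data sources per coded theme; flag single-source themes.
    from collections import defaultdict

    coded_excerpts = [
        ("interview",   "peer collaboration"),
        ("interview",   "time pressure"),
        ("observation", "peer collaboration"),
        ("survey",      "peer collaboration"),
        ("observation", "time pressure"),
        ("survey",      "instructor support"),
    ]

    sources_by_theme = defaultdict(set)
    for source, theme in coded_excerpts:
        sources_by_theme[theme].add(source)

    for theme, sources in sorted(sources_by_theme.items()):
        note = "" if len(sources) > 1 else "  <- single source; verify independently"
        print(f"{theme}: {len(sources)} source(s){note}")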

Potential weaknesses in the qualitative data analysis, along with their possible influence on interpretations and conclusions, should be described.

A9 Analysis of Qualitative Information
Qualitative information in an evaluation should be appropriately and systematically analyzed so that evaluation questions are effectively answered.

A7 Systematic Information
The information collected, processed, and reported in an evaluation should be systematically reviewed, and any errors found should be corrected.

Not sure where to start?  
Try reading some user scenarios for plans.