
Under-Represented Populations Annotated Plan Excerpts


Design

The plan excerpts below are accompanied by annotations identifying how each excerpt represents the Design Criteria.


Excerpt 1 [College of St. Scholastica]

Methodological Approach:
Describes use of multiple evaluation approaches

A comprehensive approach to evaluation will be employed, including both process and outcome evaluation. Process evaluation will be conducted by reviewing the content and strategies for the workshops and weekend, by monitoring the implementation of activities conducted to achieve the objectives, and by observing the participation of students in the various activities. Outcome evaluation will examine the impact of the program on students, teachers, and families.

Data Collection Procedures & Schedule:
Relates project goals to data collection procedures

The program coordinator will design and implement evaluation instruments to assess the degree of interest and learning. These will be distributed to participants throughout the program. At the beginning of the program participants will be asked to state their expectations and to respond to a survey on attitudes toward science and mathematics … Girls will be asked to complete the survey after participating in Science Connections for a full year. They will be asked if their expectations for the program have been realized. Girls will be asked to describe the nature of scientific inquiry and to identify fields of interest to them; it is expected that girls' interest and awareness of scientific career opportunities will significantly increase over the course of a year. In addition, each girl will be asked to select a scientific career and prepare a written description of what kind of training (specifically what courses) would be required through high school and college to achieve this career. This activity will be repeated annually and examined for changes.

Specifies design of formative evaluation

Following each workshop, participants will evaluate the workshop and provide feedback. These evaluations will be monitored on an ongoing basis to allow the program coordinator to make modifications and changes immediately to better serve the participants.
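The pre/post attitude survey and its annual repetition described above lend themselves to a simple change-score summary. Below is a minimal sketch of how such a summary might be computed, assuming a Likert-style instrument; the item names and 1–5 scale are hypothetical, since the actual instruments will be designed by the program coordinator.

```python
# Hypothetical sketch: scoring a Likert-style attitude survey before and
# after a program year and reporting the average change per item.
# Item names and the 1-5 scale are illustrative assumptions, not taken
# from the plan itself.

from statistics import mean

# Each dict maps item -> rating (1 = strongly disagree ... 5 = strongly agree)
pre_responses = [
    {"enjoys_science": 3, "aware_of_careers": 2},
    {"enjoys_science": 4, "aware_of_careers": 3},
]
post_responses = [
    {"enjoys_science": 4, "aware_of_careers": 4},
    {"enjoys_science": 5, "aware_of_careers": 4},
]

def item_means(responses):
    """Average rating for each survey item across participants."""
    items = responses[0].keys()
    return {item: mean(r[item] for r in responses) for item in items}

def change_scores(pre, post):
    """Post-minus-pre difference in the average rating for each item."""
    pre_m, post_m = item_means(pre), item_means(post)
    return {item: post_m[item] - pre_m[item] for item in pre_m}

print(change_scores(pre_responses, post_responses))
# e.g. {'enjoys_science': 1.0, 'aware_of_careers': 1.5}
```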

 

Excerpt 2 [Dartmouth College]

Methodological Approach

Instruments

Data Collection Procedures & Schedule:
Describes multiple evaluation methods and specifies instrument purposes

The proposed evaluation will involve two primary research components to investigate the types of issues described above. First, survey questionnaires will be administered to all first-year women who have previously indicated an interest in science through a college-administered academic interest survey. We envision two rounds of questionnaires. A fall quarter survey will be administered to gather background information on students' past experiences in science (e.g., number and types of courses taken in high school, previous experience in science-related work settings, past contact with professionals in the scientific community); current plans and expectations of the prevalence of science during their first two years (e.g., the types of courses to be taken during their first and second years, their projected likelihood of declaring science as a major); and current projections of their lives following graduation.

A second survey administered later in the spring quarter will provide a second "snapshot" of students' plans and activities regarding courses actually taken during the first year, plans for sophomore course loads and likelihood of declaring a science major, and thoughts on post-graduate training and work in the sciences. The study's survey component will provide important baseline information on the population of incoming women with initial interest in science as a whole, and provide an important backdrop for the more in-depth interview component described in the following section.

Information Sources & Sampling

The central component of the proposed evaluation will be a closer examination using an interview format with a selected sample of participants in the internship program. We propose to follow closely 8-10 women who represent a rich cross-section of individuals participating in the internships. The women will be selected to represent a range of internships from the different disciplines (natural science, physical science, math, engineering), involvement of both female and male faculty sponsors, involvement of both faculty and non-college/industry sponsors, and representation of minority students. Information gathered from the fall survey will also allow us to select individuals representing a range of pre-college experiences along a variety of dimensions (e.g., moderate vs. high levels of science preparation in high school; previous work experience in lab-settings; parents or significant adults already in the sciences, etc.).
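The interview sample described above is a purposive selection intended to cover several dimensions at once (discipline, sponsor type, pre-college experience, and so on). One minimal way to sketch such a selection is a greedy pass that keeps adding the candidate who covers the most not-yet-represented values; the field names and candidate records below are hypothetical, and the rule itself is only an illustration, not the proposal's procedure.

```python
# Hypothetical sketch of purposive sampling: pick a small interview panel
# that covers as many dimension values as possible. Field names and the
# greedy coverage rule are illustrative assumptions.

candidates = [
    {"id": 1, "discipline": "physical science", "sponsor": "female faculty",
     "setting": "college", "prep": "high"},
    {"id": 2, "discipline": "engineering", "sponsor": "male faculty",
     "setting": "industry", "prep": "moderate"},
    {"id": 3, "discipline": "math", "sponsor": "female faculty",
     "setting": "college", "prep": "moderate"},
    # ... remaining internship participants
]

def select_panel(candidates, size=8):
    """Greedily choose candidates who add the most not-yet-covered values."""
    covered, panel = set(), []
    while candidates and len(panel) < size:
        def gain(c):
            return len({(k, v) for k, v in c.items() if k != "id"} - covered)
        best = max(candidates, key=gain)
        panel.append(best)
        covered |= {(k, v) for k, v in best.items() if k != "id"}
        candidates = [c for c in candidates if c is not best]
    return panel

print([c["id"] for c in select_panel(candidates, size=2)])
```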

 

Excerpt 3 [Anonymous 1]

Data Collection Procedures & Schedule:
Describes multiple procedures

The current evaluation plan for the program is three-fold:

1. Classroom observation to determine classroom climate issues. The observers meet with the faculty to discuss teaching techniques, student/student interaction, faculty/student interaction, student participation, and other areas that will be of benefit to the faculty and, in turn, the students.

Specifies relationship between data to be collected and evaluation purposes

2. Evaluation by the students of the program to determine effectiveness of teachers and relevancy of materials and activities. General information about the program is gathered at the same time.

3. The program targets high school juniors, so it is necessary to wait two years to determine whether the program is effective in recruiting students into engineering or computer science at the University. Reports are prepared from university registration records to determine whether students who participated in the program enrolled in engineering or computer science at the university. Follow-up studies will be performed to check graduation rates for the participants.
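The enrollment check in item 3 is essentially a record-matching exercise: program participants are cross-referenced against university registration records and counted by major. A minimal sketch of that calculation follows, with hypothetical ID and major fields standing in for the real registration data.

```python
# Hypothetical sketch: cross-reference program participants with university
# registration records to estimate how many later enrolled in engineering or
# computer science. Record layouts and major names are illustrative assumptions.

participants = {"A102", "A417", "B033"}          # program participant IDs
registration = [                                  # university registration records
    {"id": "A102", "major": "Computer Science"},
    {"id": "A417", "major": "Biology"},
    {"id": "C555", "major": "Engineering"},
]

TARGET_MAJORS = {"Engineering", "Computer Science"}

enrolled = [r for r in registration
            if r["id"] in participants and r["major"] in TARGET_MAJORS]

rate = len(enrolled) / len(participants)
print(f"{len(enrolled)} of {len(participants)} participants "
      f"({rate:.0%}) enrolled in a target major")
```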

 

Excerpt 4 [University of Washington]

Methodological Approach:
Describes provisions for feedback between evaluation activities and project implementation

Using the informal feedback and formal evaluation questionnaires gathered from the pilot testing at the University of Washington and Stevens Institute of Technology, revisions and modifications will be made to the training materials, the individual handbooks, and the step-by-step approach for implementing the Mentoring Training Program…

Describes design for disseminating information

An aggressive dissemination plan will be developed, focusing on three primary and other secondary targets. The primary targets will include WEPAN member institutions (80); institutions that have attended the WEPAN Regional Training Seminars (96); and other institutions with Colleges of Engineering and Supporting Sciences (150). The secondary targets will include other women's professional organizations, such as SWE (Society of Women Engineers), NAWE (National Association of Women in Education), and AWIS (American Women in Science), as well as ASEE (American Society of Engineering Educators).

Articles will be submitted for publication in professional journals, such as the Journal of Women and Minorities in Science and Engineering; presentations will be given at professional association meetings, such as AWIS, SWE, NAWE, and WEPAN (Women in Engineering Program Advocates Network). Training will be delivered three times a year at the WEPAN Regional Training Seminars and at pre-conference workshops at the national WEPAN Conference.
(…)

Methodological Approach:
Identifies uses of external evaluators

Evaluation is a critical component in the accountability of this organization in attaining its goals. Individual evaluations are conducted on each of the program components, as well as on the WIE Initiative as a whole with respect to increasing recruitment and retention. External evaluators, including corporate and faculty boards, are used to assess the quality and effectiveness of these programs…

Data Collection Procedures & Schedule:
Presents evaluation timeline in table format

The evaluation timeline spans June of Year 1 through December of Year 2:

Description                                                 Schedule
1.  Design and develop MTP                                  June–Sept (Year 1)
1a. Form Task Force                                         June (Year 1)
1b. Develop Student Handbooks                               June–Sept (Year 1)
1c. Develop Faculty Handbooks                               June–Sept (Year 1)
1d. Develop Professional Handbooks                          June–Sept (Year 1)
1e. Develop Step-by-Step Plan for Administrators            June–Sept (Year 1)
1f. Develop Evaluation Instruments                          July (Year 1)
2.  Increase Participants (Students, Faculty)               June–Sept (Year 1)
3.  Collect Retention Data                                  Nov (Year 1) – Dec (Year 2)
4.  Students, Faculty & Corporate Board Review Materials    Sept (Year 1)
5.  Make Modifications                                      Oct (Year 1)
6.  Pilot Test at UW and Modify                             Nov (Year 1)
7.  Pilot Test at Stevens Institute                         Jan–Feb (Year 2)
8.  Evaluate and Modify                                     Aug (Year 1) – Sept (Year 2)
9.  Prepare Materials for Printing, Packing & Publication   July–Sept (Year 2)
10. Deliver Workshop for Administrators at WRTS             Sept (Year 2)
11. Disseminate MTP Materials                               Oct–Dec (Year 2)
 

Excerpt 5 [Miami-Dade Community College]

Instruments

Data Collection Procedures & Schedule:
Describes assessment of student outcomes

Achievement on the college level (objective 1.1) will be measured using a version of the Elementary Algebra Skills test, which is one of the four placement tests in mathematics and English, each designed to provide information about readiness for an entry-level course. These tests comprise the Florida Multiple Assessment Programs and Services: Assessment and Placement Services Colleges and Universities Program (Multiple Assessment Programs & Services of the College Board, 1984). This instrument contains thirty-five multiple-choice items dealing with topics found in most first-year algebra courses. Students will take a different form of the test twice: pretest and posttest. Achievement will be determined by raw scores (the number of correct answers). On the middle school level, achievement (objective 2.1) will be measured using the mathematics portion of the Stanford Achievement Test, which comprises four tests representing a sample of the major components of school mathematics curricula in each grade. This annual test contains a basic, multiple-choice battery. Achievement will be determined by scaled scores.
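The college-level measure reduces to a pretest/posttest comparison of raw scores on the thirty-five-item test. A minimal sketch of that summary, using hypothetical score data, is shown below.

```python
# Hypothetical sketch: summarizing pretest/posttest raw scores (number of
# correct answers out of 35 items) for the college-level algebra measure.
# The score lists are illustrative; real data would come from the two test forms.

from statistics import mean

pretest_scores  = [12, 18, 15, 22, 9]    # raw scores, pretest form
posttest_scores = [19, 24, 21, 28, 16]   # raw scores, posttest form

gains = [post - pre for pre, post in zip(pretest_scores, posttest_scores)]

print(f"Mean pretest:  {mean(pretest_scores):.1f} / 35")
print(f"Mean posttest: {mean(posttest_scores):.1f} / 35")
print(f"Mean gain:     {mean(gains):.1f} items")
```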
