Plans: Curriculum Development

Annotated Plan Excerpts

Excerpt 1 [University of Tennessee, Chattanooga]

Methodological Approach: Specifies design using experimental and control groups and describes measures

To answer this question, pairs of calculus courses will be selected with which to perform a control/experimental trial. One course in each pair will be taught in the traditional manner; the other will make use of the new strategies, materials, and computer demonstrations. We will use objective tests, grades at the end of the courses, the number of withdrawals, the kinds of questions students can answer on the tests, the number of students majoring in a scientific field at the beginning and end of the courses, the problem-solving techniques used by students, and the number of females and minorities who enroll in and successfully complete the course and who register for the next course. Via discussion questions we will attempt to measure students' abilities to communicate mathematical ideas effectively to others. We will gather information concerning student perceptions by interviewing and/or surveying students.
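
A minimal sketch of how outcomes from one traditional/experimental course pair might be compared statistically, assuming the scores and withdrawal counts below stand in for real course data (all numbers are hypothetical):

    # Sketch: comparing one traditional/experimental course pair on
    # end-of-course scores and withdrawals. All data are hypothetical.
    from scipy import stats

    traditional_scores = [72, 65, 80, 58, 74, 69, 77, 61, 70, 66]
    experimental_scores = [78, 71, 85, 64, 80, 75, 83, 68, 76, 72]

    # Welch's two-sample t-test on end-of-course scores.
    t_stat, p_value = stats.ttest_ind(experimental_scores, traditional_scores,
                                      equal_var=False)
    print(f"scores: t = {t_stat:.2f}, p = {p_value:.3f}")

    # Chi-square test on completion vs. withdrawal counts.
    completion_table = [[42, 8],   # traditional: completed, withdrew
                        [47, 3]]   # experimental: completed, withdrew
    chi2, p_wd, dof, _ = stats.chi2_contingency(completion_table)
    print(f"withdrawals: chi2 = {chi2:.2f}, p = {p_wd:.3f}")
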

Instruments: Describes instrument development process

The tests constructed for comparison purposes will include both mechanical and conceptual problems on questions the students are unlikely to have seen; the tests will go beyond template problems. A nationally representative advisory panel will be formed to provide advice and assistance, and we will use this panel to help us review all materials. A local committee of faculty in fields that rely on the application of calculus will help us determine the proper levels of emphasis on various topics and the kinds of applications appropriate for each affected major area.

Excerpt 2 [North Carolina State University]

Instruments: Describes multiple methods

Evaluation will include both quantitative and qualitative assessments, with special attention to women and minorities, using existing survey instruments as well as standardized assessment questions developed to test intellectual outcomes (see Table 7). The comprehensive evaluation strategy will yield longitudinal information about significant affective, intellectual, and behavioral outcomes based on an entire population of students and faculty.

Excerpt 3 [Oregon State University]

Meta-Evaluation

My role as an outside evaluator was to advise the project on devising evaluation plans, consult with the project director on evaluation issues, direct the data gathering and analyses, and meet with the national advisory panel. The original proposal called for evaluation in the final year, yet early and ongoing evaluations of the project proved more useful, since adjustments and corrections could be implemented as needed.

Excerpt 4 [University of Oklahoma]

Methodological Approach: Specifies formative evaluation design
Instruments: Describes multiple methods

The evaluation plan for the proposed project
will include formative and summative evaluations.
Information for the two formative evaluation questions
("Is this project working as anticipated?"
and "Are any significant changes needed?")
will be gathered through mid-semester and end-of-semester
student interviews and questionnaires, faculty
interviews, and observations from the Oversight
Committee. In addition, simple assessment techniques
will be used throughout the semester to assess
student learning as the courses unfold.

Methodological Approach: Specifies summative evaluation design

Information necessary to answer the summative evaluation questions ("Will the retention rate of students be improved?", "Can students retain concepts and knowledge from previous courses?", and "Can students apply these concepts to solve comprehensive design problems?") will be gathered by tracking retention rates, standardized exams, performance in the capstone course, scores on the Fundamentals of Engineering exam (a national licensing exam), and surveys of employers conducted two years after graduation.
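
A minimal sketch of the retention-rate tracking, assuming enrollment records have been reduced to one row per student with a flag for continued enrollment (the column names and data are hypothetical):

    # Sketch: retention rate per cohort from simplified enrollment records.
    import pandas as pd

    records = pd.DataFrame({
        "student_id": [1, 2, 3, 4, 5, 6, 7, 8],
        "cohort": ["old", "old", "old", "old", "new", "new", "new", "new"],
        "retained": [True, False, True, False, True, True, True, False],
    })

    # Fraction of each cohort still enrolled the following semester.
    print(records.groupby("cohort")["retained"].mean())
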

Excerpt 5 [Occidental College]

Data Collection Procedures & Schedule: Collecting data from experimental and control groups

During the first three years of the program we will follow two groups of Occidental students, a control group and the project group. We will follow the two groups through their mathematics and physics courses and evaluate them in the same way. We will also follow these two groups through their four years at Occidental, and we will continue to evaluate and follow subsequent groups of students who enroll in the integrated course.

Instruments: Describes multiple methods

During the first year the evaluator will create
the measurement tools. These measurement tools
may consist of a quantitative attitude assessment,
a test of basic skills in calculus and physics,
and a problem solving and synthesis assessment.
They may also include such qualitative assessments
as journals kept by selected students in the control
and project groups and interviews (entrance, progress,
exit, and post-program summative) with members
of both groups. Finally, we will consider including
a qualitative and quantitative measure of students'
ability to discuss and analyze a significant problem
in depth. Pairs of students from each group will
be observed (or videotaped) by the evaluator as
they discuss, analyze and solve a problem. Although
this type of evaluation is more challenging to
conduct and analyze, we believe that such qualitative
results will be more interesting than the quantitative
results of a written test.

Excerpt 6 [Anonymous 1]

Data Collection Procedures & Schedule: Describes collection schedule

Students in the experimental Integrated Introductory Biology-Chemistry Lab will complete a questionnaire at the beginning of their participation in the lab (Spring semester, freshman year), at the conclusion of the lab, and after their summer research internships (Fall semester, junior year).

Instruments: Describes measurement of outcomes using several questionnaires

Questionnaires will measure
the following outcome variables:
- Research knowledge: understanding of hypothesis development, experimental design, sampling, and the collection, analysis, and interpretation of data; appreciation of the integrated nature of modern science and the complexities of research in biology and chemistry.
- Research skills: proficiency in a variety of specific research techniques; self-reported ability to design and conduct experiments and to report findings both orally and in writing.
- Commitment to science: intent to continue
with a major in biology or other field of science,
and intent to pursue a career in science.
The questionnaire at the conclusion of the lab
will also examine the students' experience in
this innovative course and their reactions to
both the content and the process of the course.
Data will be gathered on the following:
- Mentorship atmosphere: extent to which students
had access to faculty, graduate and undergraduate
teaching assistants and the extent to which
they received helpful support from them.
- Evaluation of course content: extent to which
students found the lab lectures, discussion
sections, and individual comeback time to be
challenging, interesting, and educationally
valuable.
The questionnaire at the conclusion of the summer
research internship will likewise examine the
students' reaction to their internships. It will
also provide an opportunity to assess participants'
reactions to the career seminar course and its
link to their own experience in the field.

Excerpt 7 [Virginia Polytechnic Institute and State University]

Data Collection Procedures & Schedule: Relating survey items to evaluation questions

The efficacy and accessibility of the tutorials will be evaluated through student surveys and personal interviews at mid-term and at the end of the course. The survey will address the central questions of this project:
- Were the tutorials useful, i.e., did they improve students' understanding of the material and the effectiveness of their study time?
- Was Internet access to the information sufficient and easy?
- How can the tutorials be made more effective and accessible?

Instruments: Describes multiple methods

Since the tutorials will also be distributed on diskettes, the survey will also ask the students how often they used the tutorials in local mode. The personal interviews should provide insight into how students actually use the tutorials and how they take advantage of the hypermedia design. The tutorial design and content, and the instructions on Internet access, will be refined after each evaluation. The success of this project will be determined by how effectively the hypermedia tutorials actually help students learn. The major emphasis when refining the hypermedia designs will be on end-user effectiveness, including design, content, accessibility, and ease of use.

Excerpt 8 [SUNY Stony Brook]

Information Sources & Sampling: Describes sample selection process
Methodological Approach: Specifies design and how effectiveness is to be judged

To measure the project's impact on student performance and attitudes, a random sample of students will be selected from courses implementing the project's major goals (changes in modes of instruction, use of technology, coordination with other disciplines) and compared to a matched control group with similar GPAs, majors, etc., taking an unreformed version of the same course, a previous (unreformed) offering of the course, or a course judged to have similar general goals and clientele. Immediate and longer-term performance will be compared for the two groups.
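
A minimal sketch of one way to build such a matched control group, assuming student IDs, majors, and GPAs are available for both pools; greedy nearest-neighbor matching on GPA within major is an illustrative choice, not something the excerpt specifies:

    # Sketch: greedy nearest-neighbor matching on GPA within the same major.
    # All student records are hypothetical.
    reform_students = [("A", "math", 3.1), ("B", "physics", 2.7), ("C", "math", 3.8)]
    control_pool = [("P", "math", 3.0), ("Q", "math", 3.7),
                    ("R", "physics", 2.8), ("S", "physics", 3.5)]

    matches, used = {}, set()
    for sid, major, gpa in reform_students:
        # Candidates: same major, not already matched to another student.
        candidates = [(cid, cgpa) for cid, cmajor, cgpa in control_pool
                      if cmajor == major and cid not in used]
        if not candidates:
            continue  # no usable match; analyze this student unpaired
        cid, _ = min(candidates, key=lambda c: abs(c[1] - gpa))
        matches[sid] = cid
        used.add(cid)

    print(matches)  # {'A': 'P', 'B': 'R', 'C': 'Q'}
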

Excerpt 9 [Anonymous 2]

Methodological Approach
Data Collection Procedures & Schedule: Describes multiple methods

Recognizing the innovative nature of the project and the complexity of its environment, we will use a combination of quantitative and qualitative methods to evaluate the effect of interdisciplinary courses on student learning and beliefs and to document the faculty's creative process as they develop or reconfigure their courses. We will use statistics, pre-post data collection, interviews, and observation not only to measure the project's outcomes but also to capture the experience of both students and faculty. Much of what the project hopes to accomplish is a change in attitude, a revitalization of the teaching and learning processes that is best described qualitatively.

Excerpt 10 [Duke University]

Data Collection Procedures & Schedule: Specifies using comparison group

Students' problem-solving abilities were tested
at the beginning of the course, and we will test
them again at the end. This testing focuses on
"non-routine" problems; strategies used
by the freshmen are being compared with those
used by "expert" problem solvers (first-year
graduate students in mathematics).

Excerpt 11 [Five College Consortium]

Methodological Approach: Describes summative evaluation process

Summative Evaluation
The summative evaluation can be conceptualized
in terms of internal and external components.
The internal component entails the assessment
of the extent to which the program goals have
been met; the external component deals with determination
of the effectiveness of the program relative to
a traditional calculus sequence.

Specifies evaluation design related to project goals

The goals of the program in terms of student
outcomes are that the students (i) master the
basic concepts and manipulations; (ii) be able
to follow the mathematics in technical expositions;
(iii) be able to describe a specific situation
in the language of mathematics; (iv) be able to
use mathematics intelligently in unfamiliar settings;
and (v) develop positive attitudes to mathematics
and feel empowered to use it. The first four outcomes
are in the cognitive domain, while the fifth is
in the affective domain.

Instruments: Specifies multiple instruments
Data Collection Procedures & Schedule
Methodological Approach: Explains rationale of design

Mastery of the basic concepts and manipulations is readily measured through periodic testing of the students. Tests will be developed for this purpose and administered uniformly across classrooms and campuses. The second, third, and fourth goals will be assessed through tests constructed to contain non-routine problems. Scoring schemes will be designed to take into account aspects such as the ability to translate a given situation into mathematical terms and the creativity of the proposed solution. To assess the final goal, an attitude instrument will be developed in which students are asked to evaluate the effect of the course on their attitudes to mathematics. This instrument will be administered at the end of each course. This approach is preferred to a pre-test/post-test paradigm for evaluating change because it avoids "over-testing" of students and because, prior to the course, students may not be able to anchor the rating scale. After the course, the students will have a better context in which to evaluate their attitudes and changes in them. In addition, the percentages of students who continue from the first course to the second to the third will be compared with the corresponding data from the traditional sequence.
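
A minimal sketch of the continuation-percentage comparison, assuming per-student enrollment flags for the three-course sequence (the data are hypothetical):

    # Sketch: course-to-course continuation rates for two groups.
    def continuation_rates(enrollments):
        """enrollments: (took_course1, took_course2, took_course3) per student."""
        n1 = sum(1 for e in enrollments if e[0])
        n12 = sum(1 for e in enrollments if e[0] and e[1])
        n123 = sum(1 for e in enrollments if e[0] and e[1] and e[2])
        return n12 / n1, n123 / n12

    program = [(True, True, True), (True, True, False),
               (True, False, False), (True, True, True)]
    traditional = [(True, False, False), (True, True, False),
                   (True, False, False), (True, True, True)]

    print("program:     1->2 %.2f, 2->3 %.2f" % continuation_rates(program))
    print("traditional: 1->2 %.2f, 2->3 %.2f" % continuation_rates(traditional))
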

Describes use of comparison group

To evaluate the effectiveness of the program
relative to traditional calculus sequences, comparisons
will be made between students who participated
in the program and students who did not. Unfortunately,
it is not feasible to design a well-controlled
experiment in which subjects are randomly assigned
to the two approaches. Given this, several other
methods will be designed to compare the approaches.

Information Sources & Sampling
Data Collection Procedures & Schedule

The first method is to identify a group of students
from the five colleges who are currently enrolled
in a calculus sequence. These students will be
given the tests described above and their performance
will be compared with that of the students in
the program when it is implemented.
The second method is to obtain impressions of
the instructors in the program as well as other
instructors in related subjects who have the opportunity
to observe the students in their own classes.
The instructors will be asked to provide an assessment
of the preparedness of the students to apply mathematical
concepts to solve problems. In particular, the
instructors will be asked to compare the performance
of the students who participated in the program
with that of students in a traditional calculus
sequence. This information will be anecdotal and
hence will not lend itself to statistical analysis;
however, it will provide supporting evidence of
the effectiveness of the program.
The third method is to carry out extensive post-course interviews with the students who have completed one or more courses in the program. Such interviews will also be carried out with students who have taken the traditional courses. Among other things, the comparisons between the two will provide valuable information regarding the extent to which the program is successful in developing positive attitudes towards mathematics.

Excerpt 12 [University of Michigan]

Methodological Approach: Specifies design using control groups

There are three major components of this intervention; they will be operationalized as the following independent variables: calculators (used in course/not used in course); training of instructors (trained/not trained); and new syllabus (in use/not in use). There are also a number of other factors that need to be included as independent variables. Demographic information on the instructors will be valuable. For example, we will want to consider whether the instructor is a TA or a faculty member. (...)
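
A minimal sketch of how the 2x2x2 design could be encoded and analyzed, assuming one row per student with a course grade as the outcome; the column names, simulated data, and choice of an ordinary least squares model with interactions are illustrative assumptions, not part of the plan:

    # Sketch: crossed independent variables in a factorial model.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 120
    df = pd.DataFrame({
        "calculator": rng.integers(0, 2, n),     # used in course / not
        "trained": rng.integers(0, 2, n),        # instructor trained / not
        "new_syllabus": rng.integers(0, 2, n),   # in use / not
    })
    # Simulated grades with small main effects, for illustration only.
    df["grade"] = (2.6 + 0.2 * df["calculator"] + 0.1 * df["trained"]
                   + 0.15 * df["new_syllabus"] + rng.normal(0, 0.4, n))

    # Main effects plus all interactions; instructor demographics (e.g.,
    # TA vs. faculty) could enter as additional covariates.
    model = smf.ols("grade ~ calculator * trained * new_syllabus", data=df).fit()
    print(model.summary())
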

Instruments: Describes multiple methods
Methodological Approach: Specifies how effectiveness is to be judged

Students also bring a variety of different experiences
and learning styles to this situation. Some of
these may be very relevant to how they respond
to this innovation. In addition to recording their
age, gender, and year in school, we should obtain
a measure of their preparation, both in high school
and college mathematics classes. Using their SAT
or ACT scores, we can explore whether these purported
measures of aptitude actually relate to their
success in the new calculus. Since cognitive abilities
exert their influence at different times for different
individuals, we will include several questionnaires
to assess the cognitive complexity and extant
learning styles and beliefs of the students as
they begin calculus. These factors may influence
the effectiveness of the innovation and may even
determine whether a student benefits or actually
does worse in the new system. We will also give
a questionnaire that measures various attitudes
towards mathematics, both at the beginning and
end of the course.
The assessments above will be made in both the
calculator and non-calculator sections of Calculus
1 and Calculus 2.
Operationalization of the dependent variables is somewhat more difficult; benefits are expected to be wide-ranging and ill-defined.
The students are expected to think more, think
better, be able to apply concepts in new areas,
generally have more positive attitudes toward
mathematics, and ultimately to take more mathematics
and science classes. The most straightforward
dependent measures will be based on the performance
of the students. Grades in the calculus course,
grades in subsequent mathematics courses, and
grades in subsequent science courses will be entered
for each student. We will also record the number
of mathematics courses each graduate of the calculus
course takes in the following year and the dropout
rate during calculus (both of these are not pure
performance measures; they include a large attitudinal
component). We will also ask the instructors of the current course and subsequent mathematics courses if they perceive any difference in the students' (as a whole) thinking ability, attitudes toward mathematics, and calculation ability. This will also be done with the instructors of selected courses in other departments that rely heavily on their students' calculus preparation (e.g., Economics, Chemistry, and Engineering).
Students will be administered the aforementioned attitude questionnaire, which will include items assessing their beliefs about their ability in mathematics, how much effort they put into mathematics, the intrinsic interest of the subject, its potential usefulness to them, etc. Outcome measures will also include the instructors' end-of-term student evaluations of the instructors and courses, combined with more detailed questionnaires. At a later time, the data from the mid-semester feedback may be combined and used for a qualitative part of the summative evaluation, in addition to its use as a formative evaluation tool.

Excerpt 13 [Five College Consortium]

Methodological Approach: Describes formative and summative approaches

Evaluation of the program will include formative
and summative components. The formative component
will entail monitoring the program during its
implementation to provide feedback to the project
staff so that any problems or weaknesses in the
program can be corrected. The summative evaluation
will be conducted for each of the semester courses
and for the whole sequence to determine if the
goals and objectives of the program have been
met. Complete evaluations will be prepared for
the third-year review and again for the Final
Report.

Excerpt 14 [University of Colorado, Boulder]

Methodological Approach: Specifies design using experimental and control groups

We envision that hands-on-homework assignments will get students to apply what they learn in the classroom to what they see every day, with enormous impact. This fundamental benefit is not an easy one to assess directly, and we will have to rely on much anecdotal information. However, we propose to also obtain a more quantitative assessment by using the following procedure:
- randomly divide each class into two equal-sized groups;
- give, on a regular basis, hands-on-homework to only one group; and
- evaluate the effectiveness of hands-on-homework by holding consensus group evaluations at two points in the semester and by statistically comparing the overall performance of the two groups.
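
A minimal sketch of the random division step, assuming a simple class roster (the names and seed are illustrative):

    # Sketch: randomly splitting a class into two equal-sized groups.
    import random

    roster = [f"student_{i:02d}" for i in range(1, 31)]
    random.seed(42)        # fixed seed so the assignment is reproducible
    random.shuffle(roster)

    half = len(roster) // 2
    hands_on_group = roster[:half]    # receives hands-on-homework
    regular_group = roster[half:]     # does not
    print(len(hands_on_group), len(regular_group))  # 15 15
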

Excerpt 15 [Gettysburg College]

Instruments: Describes instruments and scoring procedures

For the final evaluation of the project, both
objective and open-ended questions will be prepared
by the external evaluator, working with the project
staff. Rubrics for grading the student-produced
responses will be created and inter-rater reliability
will be established so scoring can be conducted
by instructors at different schools with as high
a degree of uniformity as possible.
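
A minimal sketch of how inter-rater reliability might be checked before scoring is distributed across schools, assuming two raters have scored the same set of responses on the rubric; Cohen's kappa is a common choice, though the excerpt does not name a specific statistic:

    # Sketch: percent agreement and Cohen's kappa for two raters.
    # The rubric scores below are hypothetical.
    from collections import Counter

    rater_a = [3, 2, 4, 4, 1, 3, 2, 4, 3, 2]
    rater_b = [3, 2, 4, 3, 1, 3, 2, 4, 2, 2]

    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement from each rater's marginal score frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[s] * freq_b[s] for s in set(rater_a) | set(rater_b)) / n**2

    kappa = (observed - expected) / (1 - expected)
    print(f"agreement = {observed:.2f}, kappa = {kappa:.2f}")  # 0.80, 0.72
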

Data Collection Procedures & Schedule: Specifies how effectiveness will be judged

We have recruited approximately ten sites where multiple sections of astronomy-related courses are taught, so that we will be able to compare the pre- to post-test gains of students using our materials with those of students using traditional lecture and text approaches (as we have already done at the University of Wisconsin). Students in both groups will answer knowledge/concept/application items as well as items about their reactions to learning and doing astronomy. (...)

Describes scope of data collection and use of external evaluator

All measures and modules use Gettysburg College as the main site. These data will be gathered from a number of sites nationally. We will make every effort to get several classes' results for each exercise. All data will be maintained, analyzed, and summarized by an external evaluator.

Excerpt 16 [Iowa State University]

Appendix C
Assessment of Student Learning: Do the Materials Produce Results?

The acid test of the curricular materials we
develop is to assess what students have learned
as a result of using them. To illustrate our methods
for carrying this out, we describe here our ongoing
assessment of the preliminary materials included
in this proposal.

Instruments
Methodological Approach: Describes comparison of treatment and control groups on final exams

- We are examining the performance on the final exam of the chemistry students who tested the "Chemical Thermodynamics" worksheets (Appendix D). We are determining the average grade these students achieved on thermodynamics-related questions on the final exam, and comparing it to the average grade on the same questions earned by students who did not use our worksheet materials. We will examine whether there is any difference in the grades of the two groups of students, specifically on the thermodynamics questions. However, we will also examine the grades on the complete final exam of the two groups of students. If there is any statistically significant difference between the two groups on the complete exam, then that will have to be taken into account in evaluating any possible differences that may appear on the thermodynamics grades.
- We are following a similar procedure with the physics students in the lab-recitation sections that used the "Thermal Measurements" worksheets included in this proposal as Appendix F. We are examining their grades on thermodynamics-related questions on the final exam, and comparing those grades to those of students who used the standard version of the Pre-Lab instead. We will also examine the relative grades on the complete final exam of the two groups, to determine if any normalization is required when judging the grades on the thermodynamics questions.
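
A minimal sketch of the two-step comparison just described, assuming per-student thermodynamics subscores and complete-exam totals for both groups (all numbers are hypothetical); expressing the thermodynamics score as a share of the exam total is one illustrative way to normalize:

    # Sketch: thermodynamics-question comparison with a whole-exam check.
    from statistics import mean

    worksheet = {"thermo": [8.1, 7.4, 9.0, 6.8, 8.5], "total": [78, 71, 88, 65, 80]}
    standard = {"thermo": [6.9, 7.0, 8.2, 6.1, 7.5], "total": [75, 72, 85, 63, 79]}

    # Step 1: raw difference on the thermodynamics questions.
    diff_thermo = mean(worksheet["thermo"]) - mean(standard["thermo"])
    # Step 2: difference on the complete exam; if significant, normalize,
    # e.g., by comparing the thermo score as a share of each student's total.
    diff_total = mean(worksheet["total"]) - mean(standard["total"])

    share_ws = mean(t / tot for t, tot in zip(worksheet["thermo"], worksheet["total"]))
    share_st = mean(t / tot for t, tot in zip(standard["thermo"], standard["total"]))

    print(f"thermo diff {diff_thermo:.2f}, total diff {diff_total:.2f}")
    print(f"thermo share: worksheet {share_ws:.3f} vs standard {share_st:.3f}")
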
The preliminary assessment method described above has several weaknesses, which we plan to correct in future evaluations of student learning:

Describes planned improvements to instrumentation

- The physics and chemistry final course exams stress quantitative problem solving and do not necessarily assess students' qualitative conceptual understanding of thermodynamics topics. In the future, we will add one or two qualitative, concept-oriented questions to these exams. In some cases, students will be asked to write explanations of the reasoning they use to obtain their answers. (Both PIs are involved in teaching these courses, and in addition we have obtained agreement in principle from other course instructors to include this type of material on exams.)

Describes plans to better control important variables

- In the case of the physics material included as Appendix F, we had two distinct groups assigned in the testing procedure: (1) those who used only the preliminary versions of our worksheets (given as a Pre-Lab in four lab sections); (2) those who used only the traditional version of the Pre-Lab, which stressed quantitative problem solving (given to all other lab sections, 14 in all). In this case, the new materials were substituted for the standard materials in the test sections. As a result, no additional study time was expended (presumably) by the students using the materials being tested. However, those students involved in testing the chemistry materials took this on as an additional task beyond their other class activities (due to logistical difficulties on our part). In this case, there was indeed additional study time expended by the students. This, of course, may have had an independent effect on their exam performance, beyond any particular utility of the new materials. In the future, we will try to ensure that student learning is assessed by examining groups of students that have expended comparable amounts of class/study time on the thermodynamics topics, and who differ only in which materials they have used for study.

Instruments: Describes additional sources of data

Finally, we will describe two additional forms
of assessment which we are using in other curriculum
development projects, and which we will employ
in this project as well.
- Questions related to topics covered in the curricular materials are given as "pretests," before use of the materials, and then identical or closely related questions are given on course quizzes, midterms, and final exams. Often, students are asked to write explanations of the reasoning they use to obtain their answers. We compare the pretest-to-posttest gains of students who have used the new materials to those of students who use only standard textbooks and study guides (a comparison sketched in code after this list). In this way we gather information about the pedagogical effectiveness of the new materials, relative to those that are presently being used.
- Individual interviews with students are carried
out, and recorded on videotape. During these
interviews, students work through curricular
materials as they "think out loud,"
and the interviewer (faculty member or graduate
student) probes them with additional questions
when necessary. These interviews give significant
additional insight into the utility of the materials,
and help uncover ambiguous or confusing language,
as well as unanticipated gaps in students' background
knowledge.
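
A minimal sketch of the pretest-to-posttest gain comparison referenced in the first item above, assuming matched pre/post scores for each student (the data are hypothetical):

    # Sketch: comparing mean pre-to-post gains between two groups.
    from scipy import stats

    new_pre, new_post = [35, 42, 28, 50, 39], [62, 70, 55, 74, 66]
    std_pre, std_post = [37, 40, 30, 48, 41], [52, 58, 47, 63, 55]

    new_gains = [post - pre for pre, post in zip(new_pre, new_post)]
    std_gains = [post - pre for pre, post in zip(std_pre, std_post)]

    t_stat, p_value = stats.ttest_ind(new_gains, std_gains, equal_var=False)
    print(f"mean gains {sum(new_gains) / len(new_gains):.1f} vs "
          f"{sum(std_gains) / len(std_gains):.1f}; t = {t_stat:.2f}, p = {p_value:.3f}")
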

Excerpt 17 [Vassar College]

Methodological Approach:

Evaluation of the course will be done during
the semester the course is taught as well as two
years after the course is completed and the students
have moved on to acquire additional training and
experience. Dr. Ken Livingston, director of the
Office of Teaching Development and Resources,
will assist in the design and implementation of
the evaluation plan. Dr. Livingston is a professor
of Psychology and the director of Vassar's program
in Cognitive Science.

Specifies immediate feedback on course content

During the semester, students will be given weekly
questionnaires and asked to prepare journal entries
in order to examine their reaction to each exercise
and to determine their comfort level with new
material and methods of quantitative analysis.
I will determine the extent to which the methods
and exercises aid students in overcoming their
math anxiety and give them a greater appreciation
for the use of math in solving geological problems.
I will also undertake an examination of more than
30 senior theses written in the department over
the last two decades to determine the extent to
which model development could have been applied
in those investigations. A compilation of the
results will be used to assist students in the
design of individual modeling projects for the
course and for future senior theses.

Specifies long-term follow-up with students

Evaluations of progress during the semester will be used by Dr. Livingston and me to design a procedure and interview questionnaire to determine the success of this course in meeting its three long-term objectives of 1) developing students' analytical thinking skills, 2) developing students' abilities to engage the geological literature, and 3) teaching students how to model and program. The interview questionnaires will track the progress of students until approximately two years after they have graduated from Vassar. To facilitate evaluation, the tracking will include students in four categories, as follows:
Information
Sources & Sampling:
Describes four comparison groups
|
- Vassar students who previously enrolled in the Computer Methods and Modeling in the Earth Sciences class and who are engaged in some aspect of computer use and/or model development or research during subsequent graduate work or employment.
- Students who previously enrolled in the Computer
Methods and Modeling in the Earth Sciences class
and who are not engaged in employment or studies
that involve computer use or model development.
- Vassar graduates in geology who did not take
the course but are directly involved in graduate
research or employment that requires computer
programming skills or modeling.
- Vassar graduates in geology who did not take
the course and presently are not engaged in computer-related
or modeling research.

Instruments: Describes foci of questionnaires

Tracking of former students will continue for
two years because this is the normative time required
for graduate students to begin work on their own
research. Students will be queried about whether they have been engaged in modeling-based research, and specifically whether the models they have used were written by others or by themselves. Students who did not take the course will be asked whether a lack of modeling experience has in any way shaped the kinds of research questions or job tasks they have been able to work on; for example, were they hindered by a lack of modeling exposure, and how do they feel about their abilities to analyze and solve problems? If they are involved in modeling, we will ask where they acquired the skills necessary to do the work, whether they feel they would have benefited from earlier exposure to the modeling process, and what difficulties they have encountered in undertaking modeling projects.
For students who previously enrolled in the Computer Methods and Modeling in the Earth Sciences class, we will determine whether their early exposure to modeling as undergraduates benefited them in their research or job opportunities. We will also ask whether the course had any lasting impact on how they address problems. Observations and conclusions will be summarized with a view toward improving course design at Vassar and wider application of computer modeling at other institutions.

Excerpt 18 [Anonymous 4]

Methodological Approach: Describes formative approach
Instruments: Describes multiple methods

Formative evaluation will be conducted through student surveys and technical quizzes at the completion of each module. Students will be queried regarding their interest level in the material, adequacy of background preparation, usefulness of the handouts, the knowledge they acquired from the module, relevance to course material, and any suggestions they have for improvement. A similar but expanded survey at the end of the freshman course will request feedback on how the modules mesh in a multidisciplinary, cross-cutting context. Classroom observers from Center X at University Y will conduct an external formative evaluation of the project. The formative evaluation will be used to determine whether the project is meeting its goals and to support continuous improvement of the project.

Methodological Approach: Describes summative approach
Describes comparison of treatment and control groups

A summative evaluation will also be conducted. Surveys will be conducted with two groups of freshman students: (1) students who participated in the project and (2) students who had a traditional freshman engineering course. They will be surveyed about the engineering principles they learned in the freshman year, their interest level in the topics and applications covered, what was most effective, and their overall enthusiasm for engineering. The students' perceptions of the lasting impact of the modules and the effectiveness of vertical integration will be addressed in the senior exit interviews. The faculty and their department chairs will evaluate whether the project assisted in professional development, based on conference proceedings, publications, and potential collaborations. A final measure of success will be whether the project has been successfully adapted into other University Y courses and SMET programs in other colleges and universities.

Excerpt 19 [Anonymous 5]

Methodological Approach: Discusses use of control classes

Because University X runs a significant number of sections of each course in question, it will be fairly straightforward to make direct comparisons between sections that use Supplement W and those that do not.
For each course, we will run approximately half of the sections with Supplement W for its first two semesters. We will use three measures to compare students in sections that use Supplement W with students in sections that do not.

Lists three outcome measures

Student Persistence: What percentage of students finish the course as opposed to withdrawing? (Large numbers of students withdraw from their math courses at University X.)
Success Rate: The percentage of students finishing the course who earn a grade of A, B, or C. Success rates in First Year Math courses at University X have been an issue of institutional concern over the years.
Student Attitudes: Are students happier with their math courses when they have used Supplement W?

Addresses potential drawbacks of grades as an outcome

The first two items can be computed fairly easily by accessing University X's administrative computer systems. While grades may sometimes be a poor assessment measure because instructors control the grades of their own sections, we do not expect this to be an issue at University X for two reasons. First, the courses in question use similar exams across sections and common final exams, so students are being tested at the same level. Second, there are simply too many instructors involved for results to be swayed by individuals who may be tempted to adjust their grades to make a project look good.
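
A minimal sketch of how the first two measures could be computed from course records, assuming one row per enrollment with a final grade or "W" for withdrawal (the data are hypothetical):

    # Sketch: persistence and success rates per section group.
    records = [
        ("supplement_w", "A"), ("supplement_w", "B"), ("supplement_w", "W"),
        ("supplement_w", "C"), ("supplement_w", "D"),
        ("control", "B"), ("control", "W"), ("control", "W"),
        ("control", "C"), ("control", "F"),
    ]

    for group in ("supplement_w", "control"):
        grades = [g for sec, g in records if sec == group]
        finished = [g for g in grades if g != "W"]
        persistence = len(finished) / len(grades)        # finished vs. withdrew
        success = sum(g in "ABC" for g in finished) / len(finished)
        print(f"{group}: persistence {persistence:.2f}, success {success:.2f}")
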
We will use data from student evaluations of
their teachers/courses to gauge the effect of
Supplement W on student attitudes. By averaging
over large numbers of sections and students, we
think that other factors (such as some teachers
being more popular than others) will not cloud
the result.
Finally, we will survey students in Supplement
W sections each term to gain feedback from the
students directly as to what aspects of the system
they like, and which ones could be improved upon.

Excerpt 20 [Oregon State University]

Information Sources & Sampling

Data regarding the impact of the new degree program on the students who enroll in it and on the physics department as a whole will be collected through: interviews with OSU physics professors and college administrators; surveys of physics department alumni from both before and after the computational physics degree program was available; surveys of all students enrolled in the computational physics courses; a series of in-depth interviews and surveys with physics majors enrolled in the computational physics degree program and those enrolled in the standard physics degree program; and a comparative analysis of course and student records for computational and standard physics majors.

Methodological Approach
Information Sources & Sampling

It is anticipated that the most informative data
on the benefits and drawbacks of the program will
be generated through a multi-factored comparison
between students in the new degree program and
those who remain in the standard physics program.
The lead evaluator will use quantitative and qualitative
methods to analyze the similarities and differences
between these two groups of students in terms
of: (1) demographics, (2) their incoming test
scores and academic preparation, (3) their career
interests, (4) their course enrollment trajectory
and course performance, (5) their interest and
engagement in their course work, (6) their involvement
in research, (7) their career prospects, and eventually,
(8) their placement in jobs or enrollment in graduate
programs related to their major. Data collected
on this program will be shared with program administrators
and physics department faculty to inform them
of the impact of the new program and the means
for improving it. This information will then be
disseminated to college administrators and physics
faculty nationwide to assist them in making decisions
about such programs at their own universities.

Excerpt 21 [Gadsden State Community College]

Information Sources & Sampling
Data Collection Procedures & Schedule: Presents schedule for collecting three types of evaluation data

Evaluation of the NSF Calculus Grant: Improving Student Learning in Calculus Through Effective Implementation of Model Activities

The schedule below covers three types of evaluation data: an attitude survey, interviews, and the status of students.

Spring 2000 (January)
- Attitude survey: administer to the Calculus I class
- Interviews: using stratified methods, select 15 students to interview at the end of the semester
- Status of students: gather information concerning student success rate and success in future math courses (A, B, or C)

Spring 2000 (May)
- Attitude survey: administer to the Calculus I class
- Interviews: conduct student interviews
- Status of students: gather information concerning student success rate and enrollment in future math courses

Fall 2000 (August)
- Attitude survey: administer to the Calculus I class
- Interviews: using stratified methods, select 15 new students to interview at the end of the semester

Fall 2000 (December)
- Attitude survey: administer to the Calculus I and II classes
- Interviews: conduct interviews of the Calculus I and II students
- Status of students: gather information concerning student success rate and enrollment in future math courses

Spring 2001 (January)
- Attitude survey: administer to the Calculus I, II, and III classes
- Interviews: using stratified methods, select 15 new students to interview at the end of the semester

Spring 2001 (May)
- Attitude survey: administer to the Calculus I, II, and III classes
- Interviews: conduct interviews of the Calculus I, II, and III students
- Status of students: gather information concerning student success rate and enrollment in future math courses

Fall 2001 (August)
- Attitude survey: administer to the Calculus I, II, III, and ___ classes
- Interviews: using stratified methods, select 15 new students to interview at the end of the semester

Fall 2001 (December)
- Attitude survey: administer to the Calculus I, II, III, and ___ classes
- Interviews: conduct interviews of the Calculus I, II, III, and ___ students
- Status of students: gather information concerning student success rate and enrollment in future math courses

Spring 2002 (January)
- Attitude survey: administer to the Calculus I, II, III, and ___ classes
- Interviews: using stratified methods, select 15 new students to interview at the end of the semester

Spring 2002 (May)
- Attitude survey: administer to the Calculus I, II, III, and ___ classes
- Interviews: conduct interviews of the Calculus I, II, III, and ___ students
- Status of students: gather information concerning student success rate and enrollment in future math courses

Excerpt 22 [Western Michigan University]

Describes the project

The project is intended to give students a conceptual
understanding of physical and geometrical optics
at the sophomore level and is specially intended
for future high school teachers. Explicit goals
include familiarization of students with the conceptual
foundations of wave behavior, electromagnetic
waves, interferometry, spectroscopy, diffraction,
and the basic properties of image forming systems.
The teaching methodology is based on discovery-type laboratory work supplemented by lectures and group work. The laboratory manual was written by the author as part of this grant and is available upon request.

Information Sources & Sampling: Lists the information sources

The project is evaluated on the basis of:
1. Performance on four regular in-class examinations and the final exam.
2. Student performance as judged by lab work, lab reports, and homework.
3. Mid-semester student evaluations.
4. Comments on the end-of-semester evaluations.
5. Performance on a set of conceptual questions in optics.
6. A special assignment for teaching majors.
7. A term project for all other majors.

Describes what useful data the information sources will yield

Item 1 tests understanding of textbook material
and lab results. Item 2 tests the effectiveness
of the lab work and the assigned reading. Items
3 and 4 assess the degree of interactive engagement.
Item 5 tests conceptual understanding on an instrument
incorporating research on student understanding
in physical optics. These questions are administered
either as in-class small group projects, examination
questions, or homework. In the course of a semester
all these questions are administered in one fashion
or another.

Excerpt 23 [Anonymous 6]

Information Sources & Sampling
Data Collection Procedures & Schedule: Aligns anticipated outcomes to proposed data sources and frequencies of data collection

Evaluation Plan

Expected outcome: Students' mastery of research methods in geophysics will be enhanced, including: planning research, collecting data, analyzing data, evaluating the success of a research project, and reporting/communicating outcomes.
- Data to be collected: observations of students' work noted in a daily log (by the course instructor; each class day)
- Data to be collected: student research field reports (by the course instructor; when turned in)

Expected outcome: Students' understanding of "real world" constraints on geophysical research will increase and impact their performance on research assignments.
- Data to be collected: observations of students' work noted in a daily log (by the course instructor; each class day)
- Data to be collected: student research field reports (by the course instructor; when turned in)

Expected outcome: Students' mastery of the geophysical content of the course will be greater than in the past.
- Data to be collected: comparison of answers on final examinations before and during the project period (by the course instructor; end of each semester when the course is taught)

Expected outcome: Students' ability to work effectively and efficiently as part of a research team will improve from the beginning to the end of each semester.
- Data to be collected: student self-ratings of specific teamwork skills (by students; beginning and end of semester)
- Data to be collected: observations of students' work noted in a daily log (by the course instructor; each class day)

Expected outcome: Students' readiness for and success in finding employment in positions related to geophysical work will improve.
- Data to be collected: student self-ratings of readiness and confidence (by students; beginning and end of semester)
- Data to be collected: post-graduation job placements (by students, the instructor, and other advisors; annually at graduation)

Excerpt 24 [Middle Tennessee State University]

Instruments: Describes multiple methods

The evaluation of the revised pedagogy for the first semester of the College Physics sequence will have three components. First is the administration of the Force Concept Inventory (FCI) at the beginning and end of the first-semester course. This will give a measure of the gain in student understanding of the concepts that tend to give students the most trouble in introductory mechanics. Second will be written evaluations and oral interviews with students who have completed the first-semester course with the new pedagogy. Third will be written evaluations and oral interviews with the faculty who have taught using the new pedagogy (approximately 6 faculty over the duration of the project). There is no universal agreement on the utility and interpretation of FCI results, so it is important to perform the evaluations and interviews to try to get a more complete picture of the efficacy of the revised pedagogy in the College Physics sequence, in terms of both student learning and student attitudes.
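
The excerpt does not prescribe a statistic, but FCI pre/post results are often summarized with Hake's class-average normalized gain; a minimal sketch with hypothetical scores:

    # Sketch: normalized gain <g> = actual gain / maximum possible gain.
    def normalized_gain(pre_percent, post_percent):
        return (post_percent - pre_percent) / (100.0 - pre_percent)

    pre_avg, post_avg = 41.0, 63.0   # hypothetical class-average FCI scores
    print(f"<g> = {normalized_gain(pre_avg, post_avg):.2f}")  # 0.37
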

Excerpt 25 [Oregon State University]

Instruments: Describes questionnaires and interviews for site visits
Describes collection of baseline data

During the winter and spring of 2003 the assessment instruments (questionnaires and site-visit interview scripts) will be written and tested. Four questionnaires will be created. First, a short-answer pre-workshop questionnaire will be created for each potential instructor at the beginning of his or her participation in the project. This questionnaire will provide a means of recording instructor expectations and assessing the instructor's previous experience in teaching vector calculus, setting a baseline from which to determine any change in the instructor's beliefs or practice over the course of the project. A second questionnaire will be designed for these instructors after the end of their first workshop. Two other questionnaires will be used as follow-up, one to be given to instructors after each year of their experience with the materials. Potential site visit candidates will be chosen from the pool of all the instructors who participate in the first workshop.
The selected site visits will provide a deeper assessment of the instructors' use of information from the workshops and instructor's guide. The goal is to evaluate the effectiveness of the workshops and instructor's guide from the instructor's perspective as well as from an outside perspective. The site visits will include a brief interview with the instructor prior to the classroom observation, the observation itself, and a follow-up interview with the instructor after the observation. The pre-observation interview will establish the instructor's goals for the class or laboratory session, and the post-observation interview will focus on the instructor's views of that particular class session. If possible, each site visit will include observations of at least two consecutive class or laboratory sessions.

Excerpt 26 [Anonymous]

Information Sources & Sampling: Overviews three types of measures
Data Collection Procedures & Schedule: Describes a scheduled data collection day
Information Sources & Sampling: Describes use of an existing assessment system created by FLAG, a different NSF-funded project

The project will be assessed through the use of (1) standardized course evaluation forms, (2) scientifically designed student, faculty, and alumni surveys, and (3) classroom assessment techniques (CATs) from the Field-Tested Learning Assessment Guide (FLAG) project [1]. Assessment will be facilitated by University X's extensive ongoing assessment effort. For example, each spring semester the University devotes a day without classes to assessment. The Chemistry Department will use this "Assessment Day" for the administration of the student and faculty surveys.
...
For assessment of student learning, instructors will use the curriculum, instruction, and assessment (CIA) model for course development [1]. Explicitly stated course goals will be categorized and then matched to CATs as listed on the FLAG web site [1b]. The use of CATs as assessment instruments should not only provide valid measures of student learning but should also enhance the success of the project by driving student learning beyond the superficial levels required for traditional classroom assessment instruments such as multiple choice tests [1].

[1] (a) Angelo, T.A.; Cross, K.P. Classroom Assessment Techniques: A Handbook for College Teachers, 2nd ed.; Jossey-Bass: San Francisco, CA, 1993. (b) Field-Tested Learning Assessment Guide, National Institute for Science Education, http://www.wcer.wisc.edu/nise/CL1/flag/, accessed April 2001.

Excerpt 27 [Anonymous]

Overviews the evaluation
Information Sources & Sampling: Describes quantitative measures pertaining to teachers' implementation of instructional practices learned in a training program

Our evaluation plan will measure the impact of the hands-on optics activities at <University XYZ> by tracking the activities of high school teachers in the <ABC program> and the careers of <University XYZ> Physics majors. In the following sections, we describe how we will measure whether the hands-on optics experiences have improved the optics education of both of these groups.
High school science teachers in the program
We will track our success in improving optics education for high school science teachers on two levels. First, we will track the teachers' use of optics in their own curriculum. The extent to which they implement hands-on activities from the courses, as well as new activities they develop themselves, provides an objective, concrete measure of the improvement of their optics understanding. The quantitative measures we will track are: the number of activities generated, the amount of classroom time the teachers use for optics-related activities, the number of collaborative projects the teachers initiate (with colleagues or high school students), the number of students impacted by these activities and collaborative projects, and the number of downloads from the Share Base web site. Further, the teachers in the program will meet for a "Share Day" to evaluate the effectiveness of the activities they use with their students. In addition to gauging the interest generated in students by the optics activities, the teachers will also report whether the use of optics in their classrooms has resulted in improvements in student understanding of Physics, as evidenced by their grades.
Second, through the teachers, we will track whether there is a correlation between the use of optics in their classrooms and the following objective measures: the number of students choosing to take Physics in their schools, the number of seniors taking Physics who choose to go to college, and the number of seniors choosing Physics and Optics careers.

Excerpt 28 [Anonymous]

Data Collection Procedures & Schedule: Describes use of repeated measures to investigate a sustained effect in a quasi-experimental design

<University XYZ> Physics Majors
To measure the impact of improved optics education on our Physics majors and to quantify the long-term benefit of hands-on optical experiences, we will track our graduates to learn which skills they use in their careers. These evaluations will occur at 1, 3, 5, and 10 years after graduation. The Physics department maintains an extensive alumni database from which we will be able to generate a statistically meaningful number of responses. We will compare the use of optics by alumni from the following categories: students who did not take the Physics: Optics course, students who took the course before the hands-on experiences were added, and students who took the course after the changes were made. Using a checklist of skills as well as open-ended questions, we will determine whether the necessary skills were learned in optics-based classes and whether we prepared undergraduates successfully for their SMET careers.