Plans: Curriculum Development

Plan Excerpts with Annotations


Excerpt 1
[Oregon State University]
Calculus students are benefiting from a joint effort
involving universities, two- and four-year colleges,
high schools, and the high-technology industry. The first
pilot testing at institutions other than Oregon State
University will begin in Fall 1990. We sought
a diverse range of institutions, and ten schools have
firmly committed to begin pilot testing.

Evaluation Purposes:
Specifies evaluation components

There are primarily three components to the evaluation
of the project's activities:
- evaluation of effects on students
- evaluation of materials
- evaluation of instructional training.

Evaluation Questions:
Addresses math curriculum

The Calculus Connections Grant concentrated on evaluating
the project in the following areas:
- Have the in-service institutes adequately prepared
the grant-award winners, the high school calculus
teachers, to use the Hewlett-Packard 28 series calculators
and the Dick/Patton curriculum materials?
- Have the Calculus Advanced Placement scores changed
in comparison to scores prior to the project?
- What can be done to improve the in-service institutes?
- How well are the grant recipients disseminating
the project materials?
- How well did the calculus students fare?


Excerpt 2
[University of Tennessee, Chattanooga]

Evaluation Questions:
Addresses instructional materials

To evaluate the success of the project, we need to
answer the major question: Do the materials meet the
goals of a "fully-realized, practically teachable,
and readily transportable course" which addresses
mainstream students and which uses the power of technology
to illuminate the concepts?

Evaluator Credibility:
Describes credentials and roles of evaluators

An evaluation team has been assembled to work
with the investigators in preparing and carrying out
the evaluation. The members of the evaluation team
are X, Y, and Z. Two are social scientists with extensive
experience in the evaluation of programs in mathematics
and science, and two are mathematicians who are involved
in other calculus reform projects. Three of the four
will be involved only in preparing the evaluation
instruments; Z will also oversee
the implementation of the plan.


Excerpt 3
[Virginia Polytechnic Institute and State University]

Evaluation Questions:
Addresses tutorials

The efficacy and accessibility of the tutorials will
be evaluated through student surveys and personal interviews
at mid-term and at the end of the course. The survey
will try to answer the central questions of this project:
- Were the tutorials useful, i.e., did they improve
students' understanding of the material and the
effectiveness of their study time?
- Was Internet access to the information sufficient
and convenient?
- How can the tutorials be made more effective and
accessible?


Excerpt 4
[Occidental College]

Evaluation Purposes

There are five broad assessments we hope to make
of the program: (1) continuation or persistence in
science and mathematics by the students, (2) personal
and scientific attitudes of the students, (3) basic
skills in calculus and physics, (4) problem-solving
and synthesis abilities in calculus and physics, and
(5) the ability to reason about and discuss a physical concept
in depth.


Excerpt 5
[University of Oklahoma]

Evaluation Purposes

The evaluation plan for the proposed project will
include formative and summative evaluations. The evaluation
activities are designed to collect information that
will provide data-based, criterion-referenced answers
to the following questions.

Evaluation Questions:
Addresses formative and summative evaluation

Formative: (1) Is this project working as anticipated?
(2) Are any significant changes needed?
Summative: (1) Will the retention rate of Sooner City
students be improved? (2) Can the Sooner City students
retain concepts and knowledge from previous courses?
(3) Can the Sooner City students apply these concepts
to solve comprehensive design problems?


Excerpt 6
[SUNY Stony Brook]

Evaluation Purposes

The evaluation looks at all areas of the project
but has selected three specific areas for more intensive
investigation. These areas provide lenses that bring
many disparate activities into focus.

Evaluation Questions

- Technology. To what extent has its use increased? How does
technology fit with the traditional goals and methods,
and with the other project goals?
- Cooperative Learning. How is it used? What is
the response of students and professors?
- Precalculus. It was considered important to select
one specific discipline area on which to focus in
order to examine how implementation takes place
in a concrete setting.

Stakeholder Involvement
Evaluation Questions

The areas listed above were selected by the evaluator
in consultation with, and with the approval of, the
principal investigator and the executive committee.
In each of the above areas, the evaluation is concerned
with the question: How does this engage professors
and students in more active teaching/learning and
critical thinking?


Excerpt 7
[Five College Consortium]

Evaluation Purposes:
Relates evaluation goals to project goals

Our brief is to determine whether the "Math"
project succeeds in its stated goals of (1) creating
interdisciplinary courses that will motivate students
to learn, apply, and appreciate mathematics, and (2)
developing planning and instructional approaches that
support the creation and presentation of such courses.
Meetings with the working faculty groups (except Group
5) in November put a finer point on these goals, but
did not alter their configuration. Faculty concern
with student learning was paramount. The single most
consistent desire was to inculcate in students an
attitude, often glossed as "mathematical maturity,"
which would enable them to attack novel, open-ended
problems confidently and productively. They anticipated
that mathematically mature students would be more
likely to pursue courses requiring mathematical competence
and would be more successful in them. Other goals
also found frequent expression: that students be able
to express mathematical ideas in clear English prose
and that they understand the math they encounter in
daily life, especially the statistical presentation
of data.


Excerpt 8
[Utah State University]

Evaluation Purposes:
Relates evaluation questions to project goals

The Planning Evaluation will assess understanding
of project goals, objectives, strategies, and timelines
(National Science Foundation, 1993). Questions investigated
in this phase of the case study include:
- What are measurable outcomes that will enable
the investigators to determine the effect on students
(male and female) of course materials that are free
of gender bias and contain material that is pertinent
to the lives of female students?
- What do the investigators predict will be the
impact over time on attitudes and interest in teaching
science among project participants?

Evaluation Purposes

The Summative Evaluation will assess the degree to
which project goals and objectives have been met.
This mixed-method evaluation strategy will provide
a rich narrative of the change process and answer
the following questions:

Evaluation Questions

- Do course materials that are pertinent to the
lives and interests of female students result in
improved attitudes and achievement in physics, and
do such materials affect the attitudes and achievement
in physics of male students?
- Do course materials that are free of gender bias
result in improved attitudes toward and mastery
of physics among female preservice teachers?


Excerpt 9
[Iowa State University]

Evaluation Purposes
Evaluator Credibility

Evaluation Plan: Central to the development
process of these curricular materials is an intensive
cycle of ongoing assessment and research aimed
at testing and improving their effectiveness.
Conceptual quizzes based on the materials will
be given both as pretests and posttests. On these
quizzes, students are frequently asked to explain
the reasoning they used to arrive at their answers.
Along with results from in-class discussions and
group work making use of the materials, these
provide real-time feedback and allow the repair
of unclear or confusing passages, the addition of
activities (hard or easy as the situation demands),
and, occasionally, thorough rewrites of whole sections.
Conceptual diagnostic questions will be presented
on midterm and final exams; student answers and
written explanations allow comparison with results
in previous courses and with results that have
been reported by other researchers in physics
and chemistry education. This comparison provides
information about the pedagogical efficacy of
the curricular materials. A detailed description
of our evaluation and assessment methods, including
examples of ongoing work, is contained in Appendix
C.

Dr. Barbara Sawrey, Professor of Chemistry
at the University of California at San Diego and
Vice-Chair for Education, has agreed to serve
as an external evaluator for this project. Professor
Sawrey is a leader in the development of multimedia
to assist student learning of scientific concepts.
In addition, Professor Lillian C. McDermott and
Professor Paula R. L. Heron of the University
of Washington, Seattle, have agreed to consult
with us on an informal basis during the course
of this project. Professor Alan Van Heuvelen (Ohio
State University) has also agreed to be a consultant.


Excerpt 10
[Oregon State University]

Evaluator Credibility

Third-party formative and summative evaluation
will be provided by the University of Wisconsin-Madison's
Learning through Evaluation, Adaptation and
Dissemination Center [LEAD]. LEAD's Educational
Technology team has developed a national reputation
for its evaluations of educational reforms that
utilize high-performance computer technologies.
As the official evaluator for the Education, Outreach,
and Training programs [EOT] of the NSF-funded
National Partnership for Advanced Computational
Infrastructure [NPACI], this team has experience
in assessing, analyzing, and disseminating the
impact of computer-assisted learning on a diverse
mix of student populations. This same team also
has extensive experience evaluating programs designed
to recruit and retain women and underrepresented
minorities in the fields of science, math, engineering,
and technology.

Evaluation Purposes

The purpose of the evaluation will be twofold:
(1) to assess whether the OSU degree program in
computational physics is filling an unmet need
and producing graduates who are better equipped
for the 21st century's technology-driven laboratories
and workforce; and (2) to determine the OSU program's
strengths and weaknesses so that the program may
be improved and successful computational physics
degree programs can be developed at other universities
as well.


Excerpt 11
[Oregon State University]

Evaluation Purposes:
Addresses formative and summative evaluation

The goal of the formative and summative evaluation is to assess the usefulness and effectiveness of the workshops and instructor's materials that are being designed to help potential instructors use the Bridge Project materials. The questions of interest are:

Evaluation Questions

- What role do the workshops and the instructor's guide play in preparing instructors to use these materials?
- In what ways do the prospective instructors view the workshops and instructor's guide as useful?
- What changes, if any, must instructors make in their teaching philosophy and pedagogical practices in order to use these materials effectively?
- To what extent are these changes facilitated by the workshops and instructor's guide?
To answer these questions, we propose an evaluation cycle that would include pre- and post-workshop questionnaires at the time of the summer workshops, six site visits at selected beta-testing sites, and e-mail communication with instructors as they use the instructional materials with their students.


Excerpt 12
[University of California, Santa Barbara]

Evaluation Purposes:
Shows relationship between project goals and evaluation activities

For each project goal below, "Activity" describes the implementation activity contributing to the evaluation, and "Assessment" describes the specific assessment strategy.

I. Development of scientific literacy skills among college students:
a) Critical analysis of scientific arguments
   Activity: On-line activity in which students analyze two papers on plate tectonic theory.
   Assessment: The analysis provides an assessment of students' abilities to evaluate the validity of scientific arguments.
b) Building scientific arguments through use of data and key scientific concepts
   Activity: "Speed writing" heuristic in which students choose appropriate information from a collection of data to support a scientific claim.
   Assessment: Analysis of how students "put together" the pieces of an argument provides insight into ways of modifying instruction to address the learners' needs.
c) Development of writing skills
   Activity: Use of large-scale data sets for student writing.
   Assessment: Research into the epistemological and lexical features of scientific writing among students (see the section on educational research).
d) Understanding of the nature of science
   Activity: Pre- and post-test on the nature of science; instruments will be posted and accessible from all sites.
   Assessment: Measure of gains in students' understanding of the nature of science.

II. Creation of well-researched oceanography courses with inquiry-based pedagogy:
a) Institutional collaboration and support for inquiry-based teaching
   Activity: On-line discussions regarding issues of teaching; a list-serve to discuss pedagogy and uses of the technology.
   Assessment: Document the history of pedagogical issues facing implementers of inquiry-based pedagogy.
b) On-line inquiry
   Activity: Web page of successful science lesson plans, student assignments, and examples of student work.
   Assessment: Model lessons (lectures, labs, student projects) provide ongoing data regarding the extent to which instructors are able to achieve the goals of inquiry-based pedagogy.

III. Creation of effective online software to support inquiry-based pedagogy:
a) Software robustness
   Activity: Online bug-report system.
   Assessment: Number of bug reports over time.
b) Software effectiveness
   Activity: List-serve discussions, student online surveys, and requests for new features by instructors.
   Assessment: Online feedback, workshop discussions, and instructor interviews provide information about whether software features are supporting a variety of instructors' needs.

Table 7. Issues and strategies for evaluation.