Reports: Teacher Education

Annotations and Report Excerpts

Excerpt 1 [Los Angeles Collaborative]

Evaluation Purposes

The Evaluation and Training Institute (ETI) is conducting
an independent, third-party evaluation of the Los Angeles
Collaborative for Teacher Excellence. This formative
report examines the progress made by LACTE in Year One
and provides qualitative and quantitative findings.

Excerpt 2 [Oklahoma Collaborative]

Evaluation Purposes

We view evaluation as a cooperative effort aimed at
improving the effectiveness of the O-TEC initiatives
and we see the evaluation process as a dialogue between
the evaluators, the participating institutions, and
the principal investigators. Therefore, throughout the
evaluation process, we have attempted to walk a line
between program participation and neutral evaluation.
We believe that this approach yields the best data in
such a complex situation, and that this is
reflected in the quality and comprehensiveness of the
information we present below.

Identifies similar projects
Stakeholder Involvement

The National Science Foundation has funded other initiatives
along with O-TEC under the Collaboratives for Excellence
in Teacher Preparation (CETP) program. These grants
have focused extensively on outcomes of educational
reform. In general, the findings of these grants suggest
that educational reforms have beneficial effects on
teacher preparation in math and science. However, O-TEC
has two components that are unique: the Summer Academy
program and the Master-Teacher-in-Residence. Therefore,
our evaluation to date has focused for the most part
on these components of the grant. Nonetheless, curriculum
reform is taking place in Oklahoma, and the evaluation
team therefore is preparing to address this issue in
detail in our next round of studies. Finally, the most
difficult challenge of the O-TEC initiative is to foster
collaboration within and between institutions. Consequently,
assessment of collaboration has been a principal concern
of the O-TEC evaluation team.

Describes relation between evaluation goals and project goals

Our evaluation strategy has been to act, in essence,
as a contract research organization. Other O-TEC participants
have been charged with the responsibility of providing
extensive demographic analysis of the O-TEC participants.
This has permitted the evaluation team to devote most
of its resources to conducting studies concerning each
of the strategic goals of the grant. This report describes
two sets of these studies in detail: the May 1997
evaluation of the 1996-1997 cohort of MTIRs and the
qualitative and quantitative analysis of the summer
academies. We also provide preliminary findings concerning
curriculum revision across the participating institutions
and describe the extent to which collaboration occurs
within and across institutions. The latter two reports
are preliminary because they are based, in large part,
on information gathered from ongoing site visits.

Excerpt 3 [Maryland Collaborative]

Evaluation Questions: Suggests periodic reassessment of evaluation plan

The following questions serve as the a priori research
questions (a posteriori questions will emerge throughout
the research period):
- What is the nature of the faculty and teacher candidates'
beliefs and attitudes concerning the nature of mathematics
and science, the interdisciplinary teaching and learning
of mathematics and science to diverse groups (at both
the higher education and the upper elementary and middle
levels), and the use of technology in teaching and
learning mathematics and science?
- Do the faculty and teacher candidates perceive the
instruction in the MCTP as responsive to prior knowledge,
addressing conceptual change, establishing connections
among disciplines, incorporating technology, promoting
reflection on changes in thinking, stressing logic
and fundamental principles as opposed to memorization
of unconnected facts, and modeling the kind of teaching/learning
they would like to see at the upper elementary and middle
levels?

Evaluation Questions: States global questions to which evaluation findings may be generalized

Answers to those questions will address the
following global research questions driving
teacher education research:
- How do teacher candidates construct the various
facets of their knowledge bases?
- What nature of teacher knowledge is requisite for
effective teaching in a variety of contexts?
- What specific analogies, metaphors, pitfalls, examples,
demonstrations, and anecdotes should be taught by content/method
professors so that teacher candidates have
some knowledge to associate with specific content
topics?
[these additional evaluation questions were
posed in another report:]
- Is there a difference between the MCTP teacher candidates
and the non-MCTP teacher candidates' attitudes
and beliefs about mathematics and science?
- Do MCTP teacher candidates' attitudes toward
and beliefs about mathematics and science change over
time as they participate in the MCTP classes?

Excerpt 4 [Oregon Collaborative]

Evaluation Purposes: Critically examines evaluation purposes against project context

Development of the Plan has occupied a great deal of
the Team's time. The latest version, included in Appendix
I, will be reviewed by the Management Team in the Spring.
A concern throughout many of the discussions has been
what we believe OCEPT can reasonably be expected
to affect over the life of the grant and what will
take a longer period of time. OCEPT is a very large
Collaborative, involving thirty-four different colleges
and universities in the State. Seventeen different institutions
offer Teacher Education programs, all organized a bit
differently; some only at the undergraduate level and
others only at the post-baccalaureate level. Student
mobility is considerable, with many teacher education
programs serving students who have completed some or
most of their undergraduate math and science course
work at other in-state or out-of-state institutions.

Describes intended outcomes

The Cooperative Agreement spells out a number of outcomes
that should be achievable. The Plan reflects the belief
that over the next 4 years OCEPT can have an impact on
faculty, their students and their colleagues; on Mentor
Teams and the development of more extended professional
communities; on the recruitment and retention of individuals
currently underrepresented as teachers of K-12 math and
science; and on the collaboration process itself, involving
more communication across institutions and disciplines.
As the project continues to develop, the Team will need
to devise more specific ways to assess the particular
impacts or changes influenced by OCEPT.

Critically assesses evaluation procedures

In particular, we need to develop better ways to document
and assess change in our future teachers. Such change
might manifest itself as an interest in teaching sparked
by the encouragement of a particular faculty
member; a change in attitude about teaching math and
science; a change in the number of math and science
courses of particular kinds taken by pre-service elementary
students and secondary math and science students; and
a change in actual classroom or field-experience
practices.

Excerpt 5 [Montana Collaborative]

Evaluation Questions: Listed by project component

[evaluation questions are outlined for each of the STEP project components]
- Course Revision
    - What is the nature of the team approach?
    - What are the characteristics of professional development activities?
    - What sorts of courses are emerging?
    - What are the site-specific issues?
    - What are the technology needs/accomplishments?
    - How is coordination & sequencing accomplished among/between mathematics and science courses, and with other education courses?
    - How many students are enrolled in STEP courses?
    - How many faculty members, teaching assistants, and K-12 teachers are involved in course revision teams?
    - How many teachers and administrators are involved in Model Site Programs?
    - How do the courses fit into the program of study in Teacher Education?
    - How do the courses evolve and become institutionalized?
    - How does course development change over the life of the project?
    - Have courses or components of them been adopted at other colleges?
- Model Schools
    - What are the characteristics of the connection of sites with the universities?
    - What are the processes for improving the student teaching experience?
    - How well does the communication among university personnel, teachers, and students function?
    - What is the nature/effectiveness of school administrators' work?
    - What is the nature/effectiveness of lead teachers' work?
    - What is the nature of student teacher acculturation into teaching?
    - How have the teachers' instructional approaches changed?
    - How many teachers and administrators are involved in the model site program?
    - What multiplier effects have taken place?

Excerpt 6 [Anonymous 2]

Evaluation Purposes: Describes evaluation goals

III. What are the principal
goals of the evaluation of Program A?
The evaluation of Program A aims to document and assess
the overall effectiveness of each of the key components
of the program strategy and, equally important, to determine
how well these components work together to achieve the
program's goals. Our inquiries will focus on the
effectiveness of Program A:
- In helping teachers acquire valuable content and
pedagogical knowledge in environmental sciences;
- In helping teachers acquire and exercise leadership
skills (as defined by the Foundation and the Leadership
Team of Program A);
- In helping teachers pursue change in science education
in their classrooms, schools, districts, communities,
professional networks, and/or associations; and
- In improving students' opportunity to learn
valuable environmental science content.

Excerpt 7 [Evaluation Cooperative Services Unit, MN]

Evaluation Purposes: Describes evaluation goals and relates them to project goals

The purpose of the evaluation is to document the activities
of the project and relate those activities to the impact
that computational science has on teacher and student
attitudes, knowledge, and behavior in mathematics and
science. The project evaluation consists of three parts:
(a) documentation of the activities conducted by the
project, (b) assessment of changes in teachers' attitudes,
knowledge, and behavior, and (c) assessment of changes
in students' attitudes, knowledge, and behavior. The
activities of the project and the assessment of the
effects of the activities will be related to the goals
of the project as identified in the funded proposal.

Describes purpose of evaluation report

This report consists of a description of the activities
of the first and second years of the project. The primary
purpose of the report is to document the actual activities
of the program and relate the activities to the original
program proposal. This report also includes preliminary
information related to the summative evaluation questions
based on the first cadre's opportunity to try some of
the computational science resources and strategies in
their classrooms during the past year. This information
was collected so that the project staff could design
the second summer's training sessions and modify the
first summer's training sessions for the new cadre (Cadre
2) to reinforce those activities that seemed of value
and add activities to address any areas of need.

Excerpt 8 [Oregon Collaborative]

Stakeholder Involvement: Identifies evaluators

The PI, Project Coordinator and Evaluation Coordinator
identified individuals early on, before OCEPT received
the award, who had backgrounds in evaluation and research
and whom we thought would be interested in serving on
the Team. A series of meetings took place between February
1997 and Fall 1997 to develop the Evaluation Plan framework
and proposed activities. By late Fall, membership on
the Team had stabilized at ten individuals: the PI,
Project Coordinator, Evaluation Coordinator,
the Science Specialist for the Oregon Department of
Education, four Mentor Team members, a Faculty Fellow
and the director of a Teaching and Learning Center at
a local community college. The backgrounds represented
include two- and four-year, private and public institutions;
and math, science and teacher education.

Excerpt 9 [Anonymous 3]

Evaluator Credibility

As the associate director for research and evaluation
at University XX from 1987 to 1991, I was aware of
Project A, but didn't play an active role in
the project. After the project was completed, I served
as the final evaluator, interviewing about 30 participating
teachers in person and by phone. Because of my history
as evaluator of the first generation of Project A,
I had the opportunity to evaluate its second generation.
For this later generation I played a different role:
I was involved with the program from the beginning
and served as an internal and formative evaluator.
I visited the summer sessions and documented the program
development. From my perspective as evaluator on both
generations of the program, I offer the following
three observations about the ideas that underlie Project
A. In brief, they are the vital importance of teacher
collaboration, connection to the standards-based curriculum,
and the flexibility of the Project A "model."

Excerpt 10 [Philadelphia Collaborative]

Stakeholder Involvement

In Year 3, several subcommittees were formed at the
suggestion of the Evaluation Subcommittee. These committees
consisted of faculty, teachers, and staff whose programs
"overlapped" (e.g., faculty from TU and
CCP Math Departments, Math Education faculty, Philadelphia
School District math teachers, Science faculty from
TU and CCP, Science Education faculty). An additional
purpose for forming these subcommittees was to foster
collaboration in a number of areas, including a collaboration
on issues of assessment and process evaluation.