Reports: Teacher Education
Annotations and Report Excerpts

Excerpt 1 [Los Angeles Collaborative]

Qualitative and Quantitative Analysis: Describes data processing procedures

During the course of Year One, ETI staff engaged in a variety of evaluation activities.
- Thematic review. Focus groups, interviews, and field notes were transcribed and scrutinized for common themes and trends.
- Survey analysis. Faculty and student surveys were cleaned and keypunched. The resulting data were analyzed using SPSS-PC, a Windows-based statistical analysis package.

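The survey-analysis pass maps directly onto any modern statistics tool. Below is a minimal pandas sketch of a cleaning-and-frequencies step of the kind described; the file name and column names are hypothetical, for illustration only.

```python
import pandas as pd

# Load the keypunched survey records (hypothetical file and columns).
df = pd.read_csv("faculty_survey.csv")

# Basic cleaning: drop fully blank records, standardize one response field.
df = df.dropna(how="all")
df["q1"] = df["q1"].str.strip().str.lower()

# Frequency table for one item -- the kind of output SPSS-PC produced.
print(df["q1"].value_counts(normalize=True))
```
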
Excerpt 2 [Philadelphia Collaborative]

Qualitative and Quantitative Analysis: Describes coding procedure

The course survey in Spring 1996 contained 34 multiple-choice questions; of these, 21 related specifically to students' attitudes about the course. The 21-item scale was reliable (coefficient alpha = 0.93). Four open-ended questions were included on the student survey. One of the open-ended questions asked whether students had observed anything in the course that they thought they could incorporate into their own later teaching. Responses to the open-ended questions were sorted into two broad categories: "Methods Used in the Course" and "Skills Gained by the Participant."

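Coefficient (Cronbach's) alpha can be computed directly from the item-response matrix. A minimal NumPy sketch follows; the data are synthetic stand-ins, so the output will not reproduce the survey's 0.93.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items (21 here)
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of scale totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic example: 21 attitude items scored 1-5 by 200 respondents.
rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(200, 21))
print(round(cronbach_alpha(scores), 2))
```
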
Excerpt 3 [Rocky Mountain Collaborative]

Qualitative Analysis: Describes data limitations

Limits to Value of Survey Responses. Responses of students on the Course Checklist provide valuable, but limited, information on the courses affiliated with the RMTEC Project. Responses are brief, and no follow-up was possible. Students were not asked to formulate their own criteria, and some students commented that the instrument did not include items they wanted to see. In addition, the surveys offer student perspectives while ignoring instructor perspectives. By attempting to revise and improve their courses, instructors are inevitably put into disequilibrium, and it may take time to integrate instructional strategies into their repertoire. Thus, for example, instructors may struggle with ways to best integrate experimentation, inquiry modes of instruction, and cooperative learning into their existing methods. Depending upon the experiences of the instructor, preliminary efforts may not be completely successful. Hence, survey responses are summarized here with caution as to their meaning and interpretation.

Quantitative Analysis: Describes the generation of summary statistics

Analyses of Survey Responses. For this report, responses were analyzed in aggregate (collapsing across institutions and courses). In addition, frequencies were computed separately for each individual course, and copies of these analyses are being sent directly to the respective instructors (with typed notes from narrative responses to the open-ended questions).

Quantitative Analysis: Provides response rate

Characteristics of Respondent Sample. The "Demographic Information" report prepared by the Evaluation Team (Spring 1996) indicates that 474 students were enrolled in RMTEC courses during spring semester. Responses to the course evaluation survey were available from 402 students (a response rate of 85%). One section of a course was eliminated from subsequent analyses, leaving a final sample size of 362.

Quantitative Analysis: Describes analysis of subgroup data; describes data limitations

Sub-Group Analyses. The RMTEC Project includes a commitment to accommodating diversity in the backgrounds of its students. Because of this commitment, we examined responses separately according to three student variables: gender, ethnic background, and intention to teach. Tests of statistical significance were computed as admittedly unsatisfactory, though possibly interesting, measures of differences in satisfaction between the various sub-groups participating in the RMTEC Project. (The students did not comprise a randomly selected sample, making statistical inferences questionable; sample sizes tended to be quite small for some of the subgroups, reducing the power of the analyses and the generalizability of the findings; and statistical significance does not equate to practical significance, so any resulting differences should be considered thoughtfully in terms of the possible impact of their magnitude.)

Quantitative Analysis: Describes creation of composite variables, data limitations, and tests of statistical significance (t-tests and analysis of variance)

Global Scores on Checklist (Mean Comparisons). A composite variable was computed across all items on the checklist. Although the ordinality of responses to the scale can be questioned (roughly, the numbers run from "no implementation of strategy" to "implementation that was not helpful" to "implementation of strategy that was somewhat to very to extremely helpful"), the single composite variable provides a rough estimate of students' perceptions of the degree to which course strategies were implemented effectively in their courses. A high value on the variable represents positive perceptions of strategy implementation. t tests indicated no significant differences between men (n=156, M=56.55, SD=20.74) and women (n=200, M=58.16, SD=22.18). Students who declared an intention to teach (n=154, M=62.49, SD=18.95) were more positive in their perceptions than students who did not plan to teach (n=187, M=52.70, SD=22.15). Differences between teaching and non-teaching students could have arisen because the two groups tended to take different courses. To contend with this possibility, we compared teaching and non-teaching students only in the content courses (mathematics and chemistry courses, excluding education and science/mathematics methods courses). Differences were present overall, but when chemistry and mathematics courses were examined separately, statistically significant differences persisted only in the chemistry courses. Finally, there was a non-significant trend for subgroups to differ according to ethnicity (one-way ANOVA, p<.08).

Quantitative Analysis: Describes chi-square analysis of subgroup data

Analysis of Individual Items by Subgroups (Chi-Square Analyses). After conducting the comparisons on the global composite variable by subgroups, individual crosstabs were computed for each item with each sub-group. As a rough guide to differences between subgroups, chi-square tests were conducted. In the analyses by gender, only a few items showed distributions that varied between men and women, and no overall patterns were discernible (6 out of 28 items produced statistically significant chi-squares). Consistent with the t test on intention to teach, there were many differences in distributions between students who did and did not intend to teach (18 out of 28 chi-square analyses were significant at the p<.05 level). In general, students who were not preparing to teach more often selected the "didn't happen" option than did students who were planning to teach. Conversely, students who declared a teaching intention were more positive overall, using higher levels of "helpfulness" responses. Finally, the same series of chi-square tests was computed with ethnicity, but the small sample sizes of some ethnic groups, the vacant cells of many tables, and the complexity of the patterns make interpretation questionable (6 out of 28 items produced significant chi-square tests).

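Each per-item screen is a crosstab followed by a chi-square test. A minimal SciPy sketch for one item; the counts are illustrative, not the RMTEC tallies.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical crosstab for one checklist item: rows are intention to
# teach (yes / no); columns are the response options, from "didn't
# happen" through increasing levels of "helpful".
table = np.array([
    [12, 30, 55, 57],   # intend to teach
    [48, 52, 50, 37],   # do not intend to teach
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")

# Repeating this for each of the 28 items and flagging p < .05 gives
# the kind of per-item screen described above.
```
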
Qualitative Analysis: Describes how the categorization scheme emerged inductively from examining responses

Qualitative Comments. At the end of the survey, two open-ended questions were posed: (1) Please add any other comments you would like to make about the strengths of this course (any comments about which aspect of the course produced the most learning for you will be very helpful); and (2) What recommendations do you have for improving this course? Examples of comments are included in Appendix C. The following categories emerged in analysis of the comments: instructors; Teachers-in-Residence and teaching assistants; teaching methods; working in groups; assignments; alignment of course content with students' need to learn about teaching strategies; assessment strategies; relationship between this course and other courses; books, labs, and other learning supplements; technological support; structural variables; work load; and other global evaluative comments.

Excerpt 4 [Oklahoma Collaborative]

Quantitative Analysis: Describes quantitative analyses appropriate to evaluation questions and goals

We will present the assessment data in terms of change in scores, not in terms of the absolute value of scores. There are two reasons for this. First, the goal of the summer academies is to present education reform materials and methods that may change attitudes about educational reform; assessing the change in attitudes is thus the most logical way to evaluate the performance of the summer academies. Second, the absolute values of item endorsement may depend upon numerous factors, such as participants' backgrounds, and examination of absolute scores does not address the central goal of O-TEC, which is to change attitudes and teaching behaviors.

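Analyzing change rather than absolute scores amounts to differencing matched pre/post measurements. A minimal sketch under that assumption; the column names and values are hypothetical, not O-TEC data.

```python
import pandas as pd
from scipy import stats

# Hypothetical matched pre/post attitude scores for six participants.
df = pd.DataFrame({
    "pre":  [2.8, 3.1, 2.5, 3.4, 2.9, 3.0],
    "post": [3.4, 3.3, 3.1, 3.6, 3.5, 3.2],
})
df["change"] = df["post"] - df["pre"]

# A paired t test asks whether the mean change differs from zero --
# the change-in-scores framing described above.
t, p = stats.ttest_rel(df["post"], df["pre"])
print(f"mean change={df['change'].mean():.2f}, t={t:.2f}, p={p:.3f}")
```
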
Quantitative Analysis: Describes quantitative analyses of subgroup data

In addition to examining overall patterns of change in scale scores, the attitudinal data were also examined to see whether there were significant differences between groups on the basis of (1) teaching service status (high school/college/in-service teacher), (2) gender, (3) ethnic identification, and (4) program site.

Quantitative Analysis: Describes analysis of variance

Measure to Evaluate Satisfaction with the Summer Academies. The Final Evaluation survey was designed to assess students' global attitudes toward their summer academy experiences and consisted of 12 statements rated for agreement on a Likert-type scale from 1 (strongly disagree) to 5 (strongly agree). Analysis of the Final Evaluation surveys consisted of determining the frequency, range, and mean level (average) of responses to each question for each summer academy site, and an analysis of variance (ANOVA) to determine whether significant differences existed between the average levels of satisfaction at different summer academy sites. Unlike the attitudinal survey, analyses of Final Evaluation results were not performed by gender, ethnicity, and other demographics because this information could not be matched to the anonymous questionnaires.

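The per-site summaries plus the between-site ANOVA can be expressed compactly. A minimal pandas/SciPy sketch; the sites and ratings are hypothetical, not the Final Evaluation data.

```python
import pandas as pd
from scipy import stats

# Hypothetical Final Evaluation ratings (1-5 Likert) keyed by academy site.
df = pd.DataFrame({
    "site":   ["A"] * 5 + ["B"] * 5 + ["C"] * 5,
    "rating": [4, 5, 4, 3, 5, 3, 4, 3, 4, 3, 5, 5, 4, 5, 4],
})

# Frequency, range, and mean of responses per site, as described...
print(df.groupby("site")["rating"].agg(["count", "min", "max", "mean"]))

# ...followed by a one-way ANOVA for site differences in satisfaction.
groups = [g.values for _, g in df.groupby("site")["rating"]]
F, p = stats.f_oneway(*groups)
print(f"F={F:.2f}, p={p:.3f}")
```
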
Excerpt 5 [Montana Collaborative]

Quantitative Analysis: Summarizes survey data in table format

Table 1 provides information on the courses surveyed by campus and by content area.

Table 1: Courses Surveyed

                   | Education | Mathematics | Science | Totals
MSU-Bozeman        |     2     |      4      |    4    |   10
U. of Montana      |     2     |      3      |    1    |    6
MSU-Billings       |     1     |      2      |    1    |    4
MSU-Northern       |     2     |      1      |    1    |    4
Western MT College |     0     |      2      |    5    |    7
Totals by Area     |     7     |     12      |   12    |   31

Identifies where the respondents came from

Twelve (12) mathematics and twelve (12) science content classes were surveyed, along with seven (7) education methods courses. Twenty-one (21) courses were required by elementary education programs and ten (10) by secondary education programs. All surveys were completed by students during regularly scheduled class time.

Specifies response rates for sample subgroups

Fourteen hundred ninety-three (1,493) completed student surveys were collected, including 833 (55.8%) from students indicating that they were planning a career in teaching. Females represented 889 (59.5%) of responses. Caucasians represented 1,379 (92.4%) and Native Americans 46 (3.1%) of the responses. Most respondents had graduated from high school in the 1990s (79.0%) or 1980s (13.1%). Large science courses provided the greatest number of responses, 744 (49.8%); mathematics courses contributed 492 (33.0%); and education methods courses, which have smaller enrollments, contributed 257 (17.2%). A summary of respondent demographics is found in Table 2, with information on gender, ethnicity, year of high school graduation, and course type.

Excerpt 6 [Anonymous 4]

Quantitative Analysis: Summarizes survey data on key topics

The participants were asked how much they learned about eleven content topics in the course. All ten teachers reported learning "a great deal" or a "good amount" on the following topics: Search, Solve, Create & Share (SSCS), Learning Cycle, Benchmarks, National Science Education Standards, and Scientific Inquiry. For the six other course topics, participants also generally reported learning "a great deal" or "a good amount," but one or several teachers reported that they learned only "some" about the topic. Those answering "some" were asked why they did not learn more. Teachers responded that the material was not well presented, that they already knew most of the material, or that they needed to spend more time on it or see more examples.

Quantitative Analysis: Summarizes survey data in table format

11. How much did you learn about:

Question                             | A great deal (4) | A good amount (3) | Some (2) | Little (1) | Mean
Constructivist teaching & learning   |       90%        |                   |   10%    |            | 3.80
Search, Solve, Create & Share (SSCS) |       80%        |        20%        |          |            | 3.80
Learning Cycle                       |       70%        |        30%        |          |            | 3.70
Alternative Assessment               |       50%        |        20%        |   30%    |            | 3.20
BSCS materials                       |       50%        |        40%        |   10%    |            | 3.40
GEMS materials                       |       40%        |        40%        |   20%    |            | 3.20
Benchmarks                           |       70%        |        30%        |          |            | 3.70
National Science Education Standards |       70%        |        30%        |          |            | 3.70
Teaching evolution                   |       10%        |        50%        |   40%    |            | 2.70
Teaching heredity                    |       10%        |        60%        |   30%    |            | 2.80
Scientific inquiry                   |       56%        |        44%        |          |            | 3.56
Overall mean for Question 11         |                  |                   |          |            | 3.42

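Each item mean in the table is just the response percentages weighted by the option values (4, 3, 2, 1). A quick check in Python:

```python
# Option values in table column order.
values = [4, 3, 2, 1]

def item_mean(pcts):
    """pcts: fraction of respondents choosing each option, in table order."""
    return sum(v * p for v, p in zip(values, pcts))

print(item_mean([0.90, 0.00, 0.10, 0.00]))  # constructivist item -> 3.8
print(item_mean([0.10, 0.50, 0.40, 0.00]))  # teaching evolution  -> 2.7
```
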
Excerpt 7 [Anonymous 4]

Qualitative and Quantitative Analysis: Presents key findings in text and table formats

Nine teachers participated in the course taught by Dr. A. Course evaluation data indicated the teachers found the course to be challenging and were pleased with the update to their content knowledge. The students were also pleased with Dr. A's knowledge and teaching of the subject matter, as well as her treatment of and responsiveness to students in the class. The teachers evaluated Dr. A's teaching ability in several categories. Dr. A received an overall mean of 4.76 (on a scale of 5 high, 1 low), with means of responses ranging from 4.33 to 5.0. These scores indicate the teachers found Dr. A to be a very effective teacher.

Question (Dr. A)                                             | Agree Strongly (5) | Agree (4) | Not Sure (3) | Disagree (2) | Disagree Strongly (1) | Mean
Was knowledgeable about subject matter                       |        89%         |    11%    |              |              |                       | 4.89
Stimulated interest in subject matter                        |        89%         |    11%    |              |              |                       | 4.89
Utilized visual material effectively during lecture          |        67%         |    22%    |     11%      |              |                       | 4.56
Synthesized, integrated & summarized information effectively |        56%         |    33%    |              |     11%      |                       | 4.33
Was responsive to students' questions                        |       100%         |           |              |              |                       | 5.0
Was available & approachable outside of class (n=8)          |        75%         |   12.5%   |    12.5%     |              |                       | 4.63
Treated students with respect                                |       100%         |           |              |              |                       | 5.0
I would recommend Dr. A to other students                    |        78%         |    22%    |              |              |                       | 4.78
Overall mean for Question 1                                  |                    |           |              |              |                       | 4.76

When asked what techniques, approaches, or teaching methods worked particularly well in the course, students commented on Dr. A's use of scientific articles and her ability to answer student questions in understandable terms. Representative comments were:
- "I really enjoyed reading the articles and then figuring out what I did not know"
- "Article discussion is nice way to [the] concepts."
- "Probing questioning/translation of technicalese to layman's terms."
- "Using articles that explained how researchers used techniques."
When asked how Dr. A might improve, teachers desired more background information on difficult topics and more direction for weaker students or when working in groups. Two comments were:
- "The group work was very frustrating. Except for one group, all the other groups were dysfunctional. Dr. A should require groups to divide the work in specific sections."
- "A glossary of technical terms found in the articles would be useful."

Excerpt 8 [Virtual Economics, National Center for Research in Education]

Quantitative Analysis: Compares responses of two different groups

C. Comparison: Beta versus Final
It is also possible to compare the responses of the teachers reviewing the beta and final versions of Virtual Economics. Aside from the different selection criteria used, the two samples are similar in that both groups use computers and CD-ROMs at similar rates, and the average years of teaching experience is 16 for both samples. The beta sample, however, has substantially more credit hours of college economics and is more involved in economics workshops and seminars.

Excerpt 9 [Montana Collaborative]

Qualitative and Quantitative Analysis: Describes data limitations

1996 Student Survey Report
There was considerable variation in the care taken by respondents in completing the survey. Some students in large-enrollment, entry-level courses (1) did not follow directions; (2) left many questions blank; and/or (3) failed to respond to the open-ended questions. Some students deliberately gave senseless responses to open-ended questions. On the other hand, surveys returned from higher-level education courses were observed to have almost all answers completed according to directions, and the open-ended questions were generally given thoughtful answers. Some classes were exceptions; for example, carefully written student answers were the norm for one large introductory science class. These students demonstrated a facility in responding to instructional questions.

Describes limitations to subgroup comparisons

The largest course provided 160 responses and the smallest just 5. Responses were summarized by (1) gender; (2) ethnicity; (3) course type (mathematics, science, or education); (4) planning to teach vs. other career plans; and (5) elementary vs. secondary education program courses. Comparison of response patterns by gender was confounded by course enrollment: female responses were predominantly from elementary education courses, while male responses were most often from science or mathematics content courses with lower pre-teacher enrollments. Review of responses by ethnicity was limited by the relatively low numbers of minority students surveyed.

Presents rationale for reporting results by subgroup

Reporting data by subject-area subgroups was logical for some instructional questions. Response frequency patterns for many items vary considerably when data are broken out by course type. For example, reported use of graphing calculators is relatively high for mathematics courses but appears low when all survey responses are combined.
Discussion of findings is clustered in the following areas targeted by survey questions: (1) student affect; (2) inquiry instruction methods; (3) technology; and (4) assessment. The open-ended response questions, which may be the most revealing part of the survey, are described at the end of this section.

Quantitative Analysis: Presents quantitative findings in table format

Student affect was the focus of five (5) forced-choice questions. The questions and the tally of student responses are provided below:

Responses to Student Affect Questions (N=1,493)

Question                                                  | Yes        | No        | Don't Know | No Response
Do you think the work you are asked to do is challenging? | 1292 (86%) | 176 (12%) | 17 (1%)    | 8 (1%)
Do you feel free to talk to your instructor individually
about work/progress in course?                            | 1363 (91%) | 87 (6%)   | 40 (3%)    | 3 (<1%)
Are all students (regardless of gender, ethnicity, or
handicap) treated equally in this class?                  | 1414 (95%) | 23 (2%)   | 51 (3%)    | 5 (<1%)

Question                          | Never "1" | Rarely "2" | Sometimes "3" | Frequently "4" | Almost Always "5" | No Response
Encounter materials or activities
that provoke curiosity            | 76 (5%)   | 137 (9%)   | 478 (33%)     | 549 (37%)      | 221 (14%)         | 23 (2%)
Do problems or projects that you
find interesting                  | 66 (4%)   | 160 (11%)  | 557 (37%)     | 519 (35%)      | 169 (11%)         | 22 (2%)

Summarizes key findings

Students reported that the class work they are asked to do is challenging (86%), that they feel free to talk with the instructor (91%), and that all students are treated equally (95%). When students were asked to rate how often they encounter materials or activities that provoke curiosity, 52% reported frequently or almost always; when the response "sometimes" is included, the combined total is 85%, giving a mean response of 3.50. When students were asked how often they do problems or projects they find interesting, 46% reported frequently or almost always; including the response "sometimes" brings the total to 83%, with a mean of 3.38.

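The reported means follow from the response tallies. A quick check in Python for the "problems or projects you find interesting" item (option values 1-5; "No Response" is excluded from the denominator):

```python
tallies = {1: 66, 2: 160, 3: 557, 4: 519, 5: 169}
mean = sum(value * n for value, n in tallies.items()) / sum(tallies.values())
print(round(mean, 2))  # 3.38, matching the summary above
```
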
The open-response questions may reveal the most distinctive information. When students were asked:

Qualitative and Quantitative Analysis: Presents percentages of responses and related comments

How is (or isn't) this course different from other math/science courses you have taken in the past? Out of 1,274 student responses, 1,057 (83%) indicated they found the courses different. Sample positive student comments about why include: "This course is designed more for the student's benefits than my other course. The focus is put on me instead of what the professor wants to accomplish for himself."
(...)
What kind of connections (if any) do you see between the content presented in this class and the world outside of school? Of 1,215 total responses, 961 (79%) indicated students saw connections between class content and the real world. Sample positive student comments include: "The content of this course contains many items that deal with the outside world, from medical applications to everyday activities like dragging a suitcase or pushing a box."