Curriculum Development Stand-Alone Report 2 (Final)

NSF CCLI-A&I Pilot Project Evaluation


Findings

This section presents major qualitative and quantitative findings, arranged by module, for each data collection method used. Quantitative analyses were conducted using univariate descriptive statistics, means comparison tests, and correlations, as appropriate.

Appendix C contains complete frequency distributions for all close-ended responses to the Module Effectiveness Surveys. Participant characteristics were provided by questions Q15-Q21 of the survey, and frequency of use data was derived from question Q14. Mean ratings for questions Q1, Q2, Q3, Q4, Q5 and Q11 were used to assess student satisfaction, and Q6, Q7, Q8, Q9 and Q10 provided impact on learning data. Responses to questions Q1 through Q11 ranged from 1 (strongly disagree) to 5 (strongly agree) on a 5-point Likert-type scale. Impact on learning was further assessed using responses to Q13.
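For readers who wish to reproduce these composites, the brief sketch below (Python with pandas) averages the relevant Likert items per respondent; the file name and Qn column labels are assumptions for illustration, not references to the actual survey data files.

```python
# Minimal sketch, assuming survey responses are stored one row per student
# with hypothetical columns Q1 ... Q14 (the file name is also hypothetical).
import pandas as pd

df = pd.read_csv("module_effectiveness.csv")

# Each Qn item is a 1-5 Likert response (1 = strongly disagree, 5 = strongly agree).
satisfaction_items = ["Q1", "Q2", "Q3", "Q4", "Q5", "Q11"]  # student satisfaction
impact_items = ["Q6", "Q7", "Q8", "Q9", "Q10"]              # impact on learning

# Per-student composite scores are the means of the relevant items.
df["satisfaction"] = df[satisfaction_items].mean(axis=1)
df["impact_on_learning"] = df[impact_items].mean(axis=1)

# Univariate descriptive statistics (M, SD) of the kind reported in the findings.
print(df[["satisfaction", "impact_on_learning"]].agg(["mean", "std"]).round(2))
```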

Sustainable Urban Development - URBS 492

Student Evaluations of Teaching Effectiveness (Close-ended SETs)

Of nine dimensions compared in this analysis of student SETs, mean scores on seven factors dropped slightly from Spring 1999 to Fall 2000, while two dimensions were rated slightly higher by students. Overall, students in both semesters strongly agreed the course was well organized, the instructor showed a strong interest in the subject matter, course objectives and requirements were clearly presented, and assignments were effectively used to enhance learning. However, students in the module-enhanced class agreed slightly more than Spring 1999 students that they were motivated to learn about the subject and that the course had contributed significantly to their knowledge of the subject.

Table 1 provides mean scores and computed differences between semesters on all nine dimensions considered in this analysis. Scores range from 1 (strongly disagree) to 5 (strongly agree), and dimensions are ranked from highest to lowest score received in Fall 2000.

Table 1 Comparison of Student Evaluations of Teaching Effectiveness (Close-ended SETs)

| # | Pre- and Post-Module Student Evaluation Item (URBS 492) | Mean, Spring 1999 | Mean, Fall 2000 | Change |
|---|---|---|---|---|
| 6 | Instructor showed a strong interest in subject matter | 4.73 | 4.70 | -0.03 |
| 2 | Syllabus clearly stated course objectives, requirements, grading policies | 4.59 | 4.44 | -0.15 |
| 3 | Instructor effectively used assignments to enhance learning | 4.41 | 4.30 | -0.11 |
| 12 | Course contributed significantly to my knowledge of subject | 4.18 | 4.22 | 0.04 |
| 10 | Instructor encouraged critical thinking about course topics and material | 4.27 | 4.19 | -0.08 |
| 1 | Class was well organized | 4.23 | 4.19 | -0.04 |
| 13 | Overall, course was taught effectively | 3.95 | 3.85 | -0.10 |
| 11 | Instructor motivated me to learn more about subject | 3.77 | 3.85 | 0.08 |
| 7 | Instructor able to express ideas clearly | 4.00 | 3.81 | -0.19 |
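The change column and ranking in Table 1 follow from simple arithmetic; the sketch below (Python with pandas, using the means reported in the table) subtracts the Spring 1999 mean from the Fall 2000 mean for each dimension and sorts by the Fall 2000 score.

```python
# Sketch of how Table 1's "Change" column and ranking are derived from the
# reported semester means (values taken from the table above; items keyed
# by their SET item numbers).
import pandas as pd

sets = pd.DataFrame({
    "item": [6, 2, 3, 12, 10, 1, 13, 11, 7],
    "spring_1999": [4.73, 4.59, 4.41, 4.18, 4.27, 4.23, 3.95, 3.77, 4.00],
    "fall_2000":   [4.70, 4.44, 4.30, 4.22, 4.19, 4.19, 3.85, 3.85, 3.81],
})

# Change = Fall 2000 mean minus Spring 1999 mean, rounded to two decimals.
sets["change"] = (sets["fall_2000"] - sets["spring_1999"]).round(2)

# Dimensions are ranked from highest to lowest Fall 2000 score, as in Table 1.
print(sets.sort_values("fall_2000", ascending=False).to_string(index=False))
```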

Student Evaluations of Teaching Effectiveness (Open-ended SETs)

Of 27 students who provided open-ended comments on year-end SETs, 20 were selected for review. Fifteen of these (75%) offered generally positive statements, ranging from "the course exposed me to a wide variety of research" to "[I feel] much more confident after taking this class." Negative appraisals often concerned the amount or topic area of the material covered in class. One student felt too much material was covered, another believed "more information could have been conveyed per class," and a third thought the course material should be "more Urban Studies-related."

Overall, the constructive tone, specificity and length of comments provided indicated students were satisfied and highly engaged with the module as an instructional tool. Module activities were specifically mentioned by 65% of the students. Of these, two students offered unconditional positive feedback regarding their satisfaction with the course, such as "I learned about computer programs and research tools that were new to me, especially SPSS and ArcView" and "I gained professionally with the PowerPoint lessons."

Eleven students provided a mix of criticisms and suggestions regarding the module. Most indicated they liked the "lecture/lab split," but felt more time should be allocated to the lab to allow for more "hands-on" experience, more "interaction with the module activities as a class," and more time to work on the variety of computer programs introduced. Students generally felt the module assignments were "valuable," "informative," and "challenging enough to be interesting," although several suggested the module could be "streamlined," "updated," or contain more visual components to improve its clarity.

Focus Groups

Of the 12 students randomly sampled from URBS 492, a majority (75%) had enrolled in the course primarily as a prerequisite for their graduate or undergraduate degree programs. Others had taken the course to enhance their job prospects or to advance their interest in research. Students were unaware of the module component of this course until after enrollment.

Module Relevance

Students generally believed all components of the class were important, but that the hands-on lab exercises provided an essential opportunity to bridge the gap between abstract concepts and actual application. Several particularly appreciated how assigned readings prepared them for the lecture material, which was then reinforced through practical application in the lab assignments. One student summarized, "The different components definitely support each other."

Most concurred that one of the module's primary benefits was to provide an alternative mode of learning for students with different learning styles and abilities. Students acknowledged some information is not always best conveyed, nor easily understood, through written or verbal means alone. One hearing-impaired student, who agreed to be identified, could not envision how the course would be taught effectively without the module, as it afforded him greater access to the lecture material. Several students also believed their exposure to new research tools, especially PowerPoint, provided both immediate and long-term academic and professional benefits. One student asserted, "In a year, you will pull out the book and use it again."

Content and Function

Most dissension among the group regarding the module's effectiveness centered on aspects of its content and function. The discussion became more animated as concerns over the module's level of detail and discrepancies between the manual and the actual Websites surfaced. In particular, several students felt the instructions were too detailed, lengthy or repetitive, which reduced their clarity as well as students' interest. One student remarked, "I ignored everything between the bolded text," and another admitted, "I had to skip a lot to get [the exercises] done in time." Some students felt this step-by-step instructional approach made the module less than challenging academically; however, one student, who identified herself as "less skilled," believed this approach was imperative for her to be able to "reduce anxiety" and complete assignments. She, however, also agreed instructions were frequently repeated.

To address the problem of academic fit, one student then recommended the module include broader concepts and more visuals, which could be accessible through help screens. That would allow students to develop a better sense of context for the exercises and to apply the material elsewhere. Four students verbally agreed with this suggestion and several others nodded.

Regarding discrepancies in the manual, students felt weekly updates in the form of handouts or printed Web-pages would be helpful. Several acknowledged the difficulty in keeping the manual up-to-date, but most found it absolutely essential. One stated, "Without it, I would have quit."

Students generally believed the tool was appropriate for their varied academic levels and computer backgrounds, although several specifically stated the use of cartoon characters was "inappropriate," "silly," and even mildly "condescending." A few students, however, "appreciated the effort to make it less dry" and "to humanize it." All agreed the tools presented were relevant and useful, and the majority felt the module had enhanced their communication skills and improved their access to information. Overall, students expressed great interest in seeing the module retained and developed, stating, "I have faith in what it can become," and "It was not bad for a first version!"

Module Effectiveness Surveys

Characteristics of Participants

Surveys were administered to all 27 students enrolled in URBS 492 in Fall 2000, and 23 participants (85%) provided demographic data. Students ranged in age from 20 to 40 (M=27.9, SD=5.7), and 14 (61%) were female. Of 22 students who provided ethnicity data, 17 (77%) identified as exclusively white, 3 (14%) as multiethnic, and 2 (9%) as another ethnicity.

A majority of students (91%) were employed at least part-time and worked from 10 to 60 hours per week (M=32.8, SD=14.5). Graduate students comprised 57% of the class, 35% were juniors and 9% were seniors. Masters of Public Administration students made up 52% of the class, and another 30% were Urban Studies majors.

Students' self-assessment of prior experience on five course-related dimensions did not vary by sex, age, class level or ethnicity. Students' level of experience on each dimension did, however, vary considerably. For instance, while most students (88%) characterized their prior Internet experience as medium to high, 54% and 65% rated their Investigator and PowerPoint experience as low, respectively. Additionally, more than 90% of the class reported low pre-module exposure to SPSS and ArcView.

Frequency of Use

Students in URBS 492 reported relatively high usage of the module over the course of the 15-week semester. All students used the module at least 5-6 times during the semester, and the class average was 11-12 uses per semester. Eight of 24 students (33%) reported they had used the module 15 or more times in 15 weeks. Frequency of use did not vary as a function of sex, age or ethnicity, and number of hours worked per week was unrelated to frequency of use. Frequency of use was, however, positively related to students' class level (r=.43 p<.05) and to overall satisfaction with the module (r=.56 p<.01).
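As an illustration of how such correlations might be computed, the sketch below (Python with scipy) reports Pearson r and its p-value for frequency of use (Q14) against a class-level variable and overall satisfaction (Q11); the column names and data file are assumptions for illustration.

```python
# Hedged sketch: Pearson correlations with significance tests, assuming a
# hypothetical survey export with Q14 (frequency of use), Q11 (overall
# satisfaction), and an ordinally coded class_level column.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("module_effectiveness.csv")

for predictor in ["class_level", "Q11"]:
    pair = df[["Q14", predictor]].dropna()           # drop missing responses pairwise
    r, p = pearsonr(pair["Q14"], pair[predictor])
    print(f"frequency of use vs. {predictor}: r = {r:.2f}, p = {p:.3f}")
```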

Student Satisfaction

Students reported moderate satisfaction with the module's overall quality (M=3.70 SD=.87) as measured by response to Q11. Students more strongly agreed the module was applicable to course material (M=4.44 SD=.80), clearly integrated with the goals of the course (M=4.11 SD=1.12) and not too difficult (M=4.00 SD=1.02). Most students (67%) felt they had received sufficient instruction to use the module effectively, although 12 (44%) were unsure or disagreed that the module was technically easy to use. Women were somewhat more satisfied with the module than men, but satisfaction did not vary by age, class level or ethnicity.

Students who found the module technically easy to use also used it more frequently (r=.58 p<.01), felt they had sufficient instruction to use it effectively (r=.80 p<.01), and were very satisfied with its overall quality (r=.87 p<.01). Those who believed the module was applicable to course material felt it was clearly integrated with the goals of the course (r=.59 p<.01), believed they had been adequately instructed on its use (r=.61 p<.01), and were also highly satisfied with its overall quality (r=.69 p<.01).

Impact on Learning

Students generally agreed the module increased their confidence in accessing new technology (M=3.93 SD=.96), improved skills relevant to career goals (M=3.81 SD=.79), and enhanced their interest in social science inquiry (M=3.74 SD=.81); those who used the module most frequently were the most likely to report it had a positive impact on their learning. Students only moderately agreed the module motivated them to learn more about computers (M=3.67 SD=1.14) or made course work more engaging (M=3.63 SD=.93). Impact on learning scores did not vary by student sex, age, class level or ethnicity.

Students' self-ratings of their pre- and post-class experience on five dimensions, as measured by Q13, are illustrated in Figure 1. Before and after ratings of experience with Investigator, SPSS, ArcView and PowerPoint software increased from low to medium on average. Students indicated less dramatic changes in Internet experience, as most students (88%) reported medium to high Internet experience before this course. An overall before and after competency rating, derived by summing Q13 scores on all five dimensions, revealed students' mean self-appraisals increased from 7.43 (before this class) to 11.04 (after this class) on a scale from 5 (low competence) to 15 (high competence).
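A rough sketch of that overall competency rating follows (Python with pandas): each of the five Q13 dimensions is assumed to be coded 1 (low) to 3 (high) before and after the class, so the summed scale runs from 5 to 15; the column names are hypothetical.

```python
# Sketch of the overall before/after competency rating described above.
import pandas as pd

df = pd.read_csv("module_effectiveness.csv")   # hypothetical survey export

dimensions = ["internet", "investigator", "spss", "arcview", "powerpoint"]
before_cols = [f"q13_{d}_before" for d in dimensions]  # each coded 1 (low) to 3 (high)
after_cols  = [f"q13_{d}_after" for d in dimensions]

# Summing five 1-3 ratings yields a 5 (low) to 15 (high) competency scale.
df["competency_before"] = df[before_cols].sum(axis=1)
df["competency_after"]  = df[after_cols].sum(axis=1)

# Mean self-appraisals before and after the class.
print(df[["competency_before", "competency_after"]].mean().round(2))
```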

Students who used the module most frequently reported the highest overall after-class competency scores (r=.54 p<.01). Those who found the module easy to use were more likely to report the module increased their confidence in accessing new technology (r=.55 p<.01) and enhanced their interest in social science inquiry (r=.41 p<.05). Students who felt the module motivated them to learn more about computers also reported increased confidence in accessing new technology (r=.57 p<.01), found the course work more engaging (r=.61 p<.01), believed career-relevant skills had been improved (r=.40 p<.05), and used the module more often (r=.50 p<.05). Students who reported increased confidence in accessing new technology as a result of using the module were also more likely to be satisfied with its overall quality (r=.57 p<.01).

Strengths and Weaknesses

Twenty-two students provided open-ended comments regarding the module's major strengths. Of these, 7 (32%) referred to the tool's ability to provide a valuable introduction to a wide variety of research methods, useful information and computer software. Half of these students further indicated they would use the module as a resource in the future. Five students (23%) felt the hands-on approach to learning afforded by the module was its key strength, while 5 others (23%) felt the web-based design made the tool easy to use and highly accessible. One student felt the module was "interesting and fun," while another specifically valued learning ArcView and SPSS.

As for weaknesses, students most often mentioned the tool seemed technically outdated in places (45%) and that some functions did not coincide with the printed instructions. Seven respondents (32%) suggested the module was somewhat tedious, too detailed and wordy, and might benefit from more visual aids and "greater clarity without oversimplification." One student felt the online instructions were repeated too often and another suggested the booklet might be reformatted and spiral-bound.

Figure 1. Students' self-ratings of pre- and post-class experience on five dimensions (Q13).
