Math 124/5 Calculus Reform:
Midyear Report: January 2004

Department of Mathematics
University of Washington



Introduction

In Autumn 2001, with the aid of Tools for Transformation funding, our department embarked on a three-year project to improve both student and instructor satisfaction in Math 124/125, the first two quarters of our calculus course for science and engineering students. The reform model that has emerged features continuous improvement through ongoing feedback from students, teaching assistants, and course instructors. We believe that this midyear update, in conjunction with the annual reports for the academic years 2001-02 and 2002-03, paints a picture of successful reform. In this short update, we focus on an overview of assessment results to date.

Assessment Overview

From the very beginning of our project, assessment has been a centerpiece. Much of this assessment has been external (via OEA or CIDR) and has provided quantitative measures of success. Results from Small Group Instructional Diagnosis (SGID) surveys conducted by CIDR, OEA student tracking, student evaluations, and an OEA-designed set of supplementary questions have provided a wealth of useful feedback. For example, over 6000 students have participated in SGIDs since Spring 2001. In addition, ongoing internal assessment is built into our reformed calculus model via weekly course instructor meetings and TA worksheet training workshops. Having a dynamic reform model in place that can take advantage of these continuous streams of assessment data has been central to our success.

Supplementary questions

In Autumn 2001, OEA helped to design eleven supplementary questions aimed at assessing the success of our reform efforts. (Actually, twelve questions were designed, but two were merged into one question at the end of the first year of the TFT project. See Appendix A for details and the list of questions.) These questions are appended to the usual OEA student evaluation forms.

We now have data from seven academic quarters, included in Appendix A. We want to highlight two points in particular. First, student satisfaction with their overall learning has been consistently high during the entire project; see question 12 in 2001-02 and question 11 in subsequent quarters. Second, the variance on all of the questions has tightened as the project has progressed. We believe this reflects the fact that, as instructors become acquainted with the new curriculum, they offer students a more uniformly improved experience.

Any curricular reform will experience initial glitches, and ours was no exception. The comparatively low-ranking responses to the homework and worksheet questions during the first year of reform indicated that additional work was needed. (We obtained similar feedback through the first rounds of SGID surveys and through the weekly course instructor meetings.) We concluded that TA training in the use and implementation of worksheets was needed, and we introduced such training in 2002-03. The idea was that every TA would go through weekly training in Math 124 worksheets the first time he or she used them, and similarly for Math 125 worksheets. Responses to supplementary question 10 indicate the success achieved in this way during 2002-03.

As 2002-03 was the year worksheet training was introduced, the weekly sessions were attended by all TAs, and the sessions evolved into TA equivalents of the weekly instructor meetings. Weekly instructor meetings are a core component of the reformed course: regular interaction between course instructors encourages sharing of ideas, builds collegiality, and offers everyone a sense of ownership in the course. Because the majority of continuing TAs went through worksheet training last year, this year's training sessions are attended mostly by new TAs. A small dip in Math 125 worksheet perception in Autumn 2003 indicates that a one-time requirement to attend worksheet workshops was slightly off the mark. (Math 124 fared better, since most new TAs were assigned to Math 124, so a wider segment of the Math 124 TAs was required to attend the weekly meetings than was the case for Math 125.) What we really need is an analogue of the weekly course instructor meetings for the TAs. Consequently, beginning in Autumn 2004, both Math 124 and Math 125 will feature weekly TA meetings, attended by all TAs, to discuss not only worksheet implementation but also broader issues related to course instruction.

SGIDs

As noted above, over 6000 students have participated in SGIDs. A breakdown can be found in Appendix C. Without exception, the faculty involved have voiced their belief that the feedback they obtained at midquarter led them to institute changes that improved the course. Taken as a whole, themes have emerged that help us fine-tune the course. For example, early on we learned through SGIDs that instructors need to pay more attention to explaining the relevance of each part of the course package: the lectures, the worksheets, and the homework. The specific SGIDs for Autumn 2003 are available in Appendix D; SGIDs for previous quarters are accessible via the annual reports cited above. Comparing these with the SGIDs contained in the previous two annual reports shows very similar positive results.

OEA tracking

OEA was asked to obtain information on the success of our reform. For comparison, we selected the academic year 1997-98, the last year prior to the implementation of any course changes. (As noted in previous reports, the department implemented a number of curricular changes prior to the major calculus reform efforts discussed here; see Appendix E of the 2002-03 Report for an account of these changes.) We have attached the OEA tracking results in Appendix B.

The data contained in the final table of the tracking results concern the future success of our students. There, one can see the mean grade performance of our Math 124/5 "graduates" in subsequent courses. In almost all cases the mean grades have improved.

In the first section of the OEA tracking report, we see increased numbers of Math 124 students going on to take Math 125. Additionally, the breakdown shows a dramatic increase in this measure of success for female students. Grades for both Math 124 and Math 125 improved from 1997-98 to the first year of reform, 2001-02. Interestingly, while mean grades for men and women are similar in Math 124, women have performed better in Math 125. The mean grade trends are highlighted in the graph below:

All of these findings are consistent with the data we previously presented on pass rates, which we reproduce here:

Math 124/125 Pass Rate Comparison

Course      1997-98*            2001-02*            2002-03*
Math 124    81% (=1375/1693)    88% (=1590/1801)    87% (=1538/1758)
Math 125    82% (=1048/1281)    85% (=1405/1662)    87% (=1377/1590)
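A brief note on the arithmetic behind the table (the formula below is our shorthand, not OEA notation): each pass rate is the number of passing students divided by the enrollment shown in parentheses. For example, using the Math 124 figures for 1997-98,

\[
\text{pass rate} \;=\; \frac{\#\,\text{passing}}{\#\,\text{enrolled}} \;=\; \frac{1375}{1693} \;\approx\; 0.812 \;\approx\; 81\%.
\]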

Student Evaluations

We have seen a dramatic and sustained improvement in student evaluation data. For example, the data for Math 124 is given below:

Math 124 Student Evaluation Comparison

Question                                                                1997-98*  2001-02*  2002-03*  Aut 2003*
1. The course as a whole was:                                              3.6       3.9       3.9       4.0
3. The instructor's contribution to the course was:                       3.8       4.1       4.2       4.3
4. The instructor's effectiveness in teaching the subject matter was:     3.7       4.0       4.1       4.2
18. The amount you learned in the course was:                             3.1       3.3       3.6       3.7

* The numbers for questions 1, 3, and 4 are adjusted scores; the numbers for question 18 are unadjusted. All numbers are weighted averages of OEA data, weighted by the number of students who turned in evaluation forms.
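For readers interested in how these weighted averages are formed, a sketch of the computation (the notation below is ours, not OEA's): each section's score is weighted by the number of evaluation forms returned in that section,

\[
\bar{x} \;=\; \frac{\sum_i n_i\, x_i}{\sum_i n_i},
\]

where \(x_i\) denotes the OEA score for section \(i\) and \(n_i\) the number of students in section \(i\) who turned in evaluation forms.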

Broad faculty involvement

Our faculty are deeply invested in their mathematical research and teaching, and they enjoy the opportunity to share their enthusiasm at all levels of instruction. For this reason, it is our core belief that a successful course curriculum should attract a broad spectrum of the faculty as instructors. It is satisfying to report that a diverse collection of faculty have taught the newly reformed version of calculus. A list of the faculty involved can be found in Appendix E. Of the 37 different instructors (since the implementation of reform in Autumn 2001), over half are tenure-track faculty, including 13 tenured full professors.


Appendix A
Supplementary Questions: Tables A1, A2, A3


The questions for 2002-03 and Autumn 2003 are as follows:
  1. Class sessions (lecture) provided opportunities for student questions.
  2. Class sessions (quiz sections) provided opportunities for student questions.
  3. Time spent in lecture was useful to my learning.
  4. Time spent in quiz section was useful to my learning.
  5. The course encouraged student commitment to learning.
  6. The instructor made efforts to align instructor and student expectations.
  7. The lectures and the textbook worked well together in this course.
  8. The textbook was useful to my learning in this course.
  9. The homework contributed to my learning in this course.
  10. Worksheets contributed to my understanding of course content. (Leave blank if worksheets were not used in your class.)
  11. I was satisfied with my learning in this course.

During 2001-02, the initial year of reform, question 9 was divided into two separate questions, 9* and 10*:

9*. The textbook homework contributed to my learning in this course.
10*. The supplementary homework contributed to my learning in this course.

Thus, when comparing the data below:

questions 9* and 10* in 2001-02 together correspond to question 9 in subsequent quarters;
question 11 in 2001-02 corresponds to question 10 in subsequent quarters;
question 12 in 2001-02 corresponds to question 11 in subsequent quarters.

Table A1: Math 124/5 Supplementary Questions Results, Autumn 2003 vs. Math 124/5 Year 2002-03 vs. Math 124/5 Year 2001-02.
Autumn 2003 based on 10 sections of Math 124 and 6 sections of Math 125.
(Note: all sections have approximately 80 students; the honors Math 124 section is omitted.)
Academic Year 2002-03 based on 24 sections of Math 124 and 21 sections of Math 125.
Academic Year 2001-02 based on 21 sections of Math 124 and 13 sections of Math 125.
Data gathered by OEA in conjunction with end of quarter student evaluations.
(Key: labeled bar = median; box = 25th through 75th percentiles; line = range of scores.
Survey scores range from "7 = strongly agree" through "4 = neutral" to "1 = strongly disagree".)

Table A2: Math 124 Supplementary Questions Results, Autumn 2003 vs. Math 124 Year 2002-03 vs. Math 124 Year 2001-02.
Autumn 2003 based on 10 sections of Math 124.
(Note: all sections have approximately 80 students; the honors Math 124 section is omitted.)
Academic Year 2002-03 based on 24 sections of Math 124.
Academic Year 2001-02 based on 21 sections of Math 124.
Data gathered by OEA in conjunction with end of quarter student evaluations.
(Key: labeled bar = median; box = 25th through 75th percentiles; line = range of scores.
Survey scores range from "7 = strongly agree" through "4 = neutral" to "1 = strongly disagree".)



Table A3: Math 125 Supplementary Questions Results, Autumn 2003 vs. Math 125 Year 2002-03 vs. Math 125 Year 2001-02.
Autumn 2003 based on 6 sections of Math 125.
(Note: all sections have approximately 80 students.)
Academic Year 2002-03 based on 21 sections of Math 125.
Academic Year 2001-02 based on 13 sections of Math 125.
Data gathered by OEA in conjunction with end of quarter student evaluations.
(Key: labeled bar = median; box = 25th through 75th percentiles; line = range of scores.
Survey scores range from "7 = strongly agree" through "4 = neutral" to "1 = strongly disagree".)


Appendix B: OEA TFT Results


Appendix C: SGID numbers


Appendix D: SGIDs-Autumn 2003


Appendix E: Calculus Instructors, Autumn 2001 - Autumn 2003