
Improving Student Understanding With TEAL

John W. Belcher

Introduction

Over the last three years, the MIT Physics Department has been introducing major changes in the way that 8.02, Electromagnetism I, is taught at the Institute, through the TEAL (Technology Enhanced Active Learning) Project [Supported by the d'Arbeloff Fund for Excellence in MIT Education, the MIT/Microsoft iCampus Alliance, the MIT School of Science, and NSF] (Belcher 2001). After being taught as a prototype twice, in fall 2001 and fall 2002, TEAL went to a large-scale implementation for the first time in spring 2003. In the first two prototype years of the program, student reaction as judged by commentary in The Tech was generally positive (Chen 2001), but in spring 2003 the student reaction ranged from positive to mixed (Li 2003) to very negative (Agarwal 2003, LeBon 2003), with numerous questions raised about the format.

In this article, I address the educational efficacy of the TEAL format, using assessment results from TEAL fall 2001, TEAL spring 2003, and a control group from spring 2002, when on-term 8.02 was taught in the traditional lecture/recitation format. This assessment strongly suggests that the learning gains in TEAL are significantly greater than those in the traditional lecture/recitation format. This result is consistent with many other studies of introductory physics education over the last two decades. It is also consistent with the much lower failure rate for spring 2003 8.02 (a few percent) compared to 8.02 failure rates in recent years (7% to 13%).

I also discuss, with hindsight, the missteps we made in the transition from the prototype course to the mainline course in spring 2003 that contributed to the adverse student reaction. Many of these missteps had to do with insufficient training of both students and instructional staff for teaching and learning in this new format. The major lessons of the TEAL experience for educational innovation at the Institute are: (1) any serious educational reform effort at MIT must be accompanied by a robust assessment effort; and (2) any move from small-scale innovation to large-scale implementation requires careful thought about a number of design issues, training chief among them.

 

Motivations for Change

The TEAL format is centered on an "interactive engagement" approach, merging lectures, recitations, and desktop laboratory work into a technologically and collaboratively rich experience. It is taught in a highly interactive, hands-on environment, with extensive use of networked laptops in a classroom especially designed for this approach (the d'Arbeloff Classroom, 26-152). We are not the first to try this format. "Studio Physics" loosely denotes a format instituted in 1994 at Rensselaer Polytechnic Institute by Jack Wilson. This pedagogy has been modified and elaborated on at a number of other universities, notably in NCSU's SCALE-UP program under Robert Beichner. We have expanded on the work of others by adding a large component centered on active and passive visualizations of electromagnetic phenomena.

What is the motivation for this transition to such a different mode of teaching introductory physics? First, the traditional lecture/recitation format for teaching 8.01 and 8.02 has had a 40-50% attendance rate, even with spectacularly good lecturers (e.g., Professor Walter Lewin), and a 10% or higher failure rate. Second, a range of educational innovations in teaching freshman physics at universities other than MIT over the last few decades has demonstrated that pedagogies using "interactive engagement" methods produce higher learning gains than the traditional lecture format (e.g., see Halloun and Hestenes 1985, Hake 1998, Crouch and Mazur 2001), usually accompanied by lower failure rates. Finally, the mainline introductory physics courses at MIT do not have a laboratory component. This is quite remarkable: to my knowledge, MIT is the only major educational institution in the United States whose mainline introductory physics courses lack a laboratory component. The motivations for moving to the TEAL format were therefore to increase student engagement with the course by using teaching methods that have been successful at other institutions (including Harvard; see Crouch and Mazur 2001), and to reintroduce a laboratory component into the mainline physics courses after a 30-year absence.

 

The TEAL Format Spring 2003

In the TEAL classroom, nine students sit together at a round table, with a total of thirteen tables. In five hours of class per week (two two-hour sessions and one one-hour problem-solving session led by graduate student TAs), the students are exposed to a mixture of presentations, desktop experiments, and collaborative exercises. The course was broken down into six sections, each taught by a physics faculty member assisted by a physics graduate student, an upper-level undergraduate who had previously taken the course, and a member of the Physics Demonstration Group. In spring 2003, Professors Wit Busza, Michael Feld, Eric Hudson, David Litster, Ernest Moniz, Jr., and Dr. Justin Kasper led the six sections of 8.02.

Students were assigned to groups of three and remained in those groups for the entire term. In the two prototype versions of the course, we assigned students to groups based on their scores on an electromagnetism pre-test, discussed below, using heterogeneous grouping (i.e., each group contained a range of student backgrounds as measured by the pre-test score). In spring 2003, because of the logistics of dealing with over 500 students, we assigned students to groups randomly. The grade in spring 2003 was based on: in-class activities, desktop experiment summaries, and worksheets; standard weekly problem sets; questions about reading assignments that were turned in electronically before each class; three one-and-one-half-hour exams; and a final. Three-quarters of the exams consisted of the standard "analytic" problems traditionally asked in 8.02; the remaining quarter consisted of multiple-choice conceptual questions similar to those asked in class and on the pre- and post-tests. Students typically score lower on these multiple-choice questions because they test concepts that may not be well understood, and because there is no partial credit.
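For readers curious about the mechanics of heterogeneous grouping, the sketch below shows one simple way such an assignment can be made from pre-test scores: rank the class, split it into thirds, and draw each group of three from the top, middle, and bottom thirds. This is an illustrative sketch only, written for this article; it is not the procedure or code actually used in 8.02, and the student names and scores are invented.

# Illustrative sketch of heterogeneous grouping by pre-test score
# (hypothetical; not the actual 8.02 assignment procedure).
def heterogeneous_groups(scores):
    """scores: dict mapping student -> pre-test score.
    Returns groups of three, each spanning the score range."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    n = len(ranked) // 3
    top, middle, bottom = ranked[:n], ranked[n:2 * n], ranked[2 * n:3 * n]
    # Students beyond the first 3*n would need separate handling; omitted here.
    return [list(trio) for trio in zip(top, middle, bottom)]

if __name__ == "__main__":
    pretest = {"A": 92, "B": 85, "C": 77, "D": 70, "E": 64,
               "F": 51, "G": 48, "H": 40, "I": 33}
    for group in heterogeneous_groups(pretest):
        print(group)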

The course was not curved; in other words, the cut-lines for the various letter grade boundaries were announced at the beginning of the term. Because collaboration is a central element of the course, it was important that the class not be graded on a curve, either in fact or in appearance, so as to encourage students with stronger backgrounds to help students with weaker backgrounds. Also, the cut-lines in the course were set in such a way that a student who consistently did not attend class could not get an A. This was a deliberate policy to encourage attendance, based on the belief that at least part of the reason for the traditionally high failure rates in 8.02 is a lack of student engagement with the course.

 

Successes and Failures In The Large-Scale Implementation

In many ways we were pleased with the results of the large-scale implementation of TEAL in the spring of 2003. The physics faculty teaching the course felt that students were learning more with this new method of instruction than in the traditional lecture/recitation format. This feeling was borne out by our detailed assessment results [see "TEAL Assessment and Evaluation"]. To summarize those results, the learning gains in TEAL spring 2003 by standard measures are about twice those in the traditional lecture/recitation format. The finding that interactive-engagement teaching methods produce about twice the average normalized learning gain of traditional instruction replicates the results of many studies at other universities, including Harvard.
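For reference, the average normalized gain used in such comparisons is the standard measure of Hake (1998): the fraction of the possible pre-test-to-post-test improvement that a class actually achieves,

\[
\langle g \rangle = \frac{\%\langle \mathrm{post} \rangle - \%\langle \mathrm{pre} \rangle}{100\% - \%\langle \mathrm{pre} \rangle} .
\]

Saying that TEAL produces "about twice" the gain therefore means that, on the same pre-/post-test instrument, the TEAL classes closed roughly twice as large a fraction of this gap as the lecture/recitation classes did.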

What was disappointing, however, was that much of the student reaction to the course in spring 2003 was mixed to negative. The CEG overall course score for spring 2003 was 3.7/7.0, a very low rating. What accounts for this glaring discrepancy between learning gains and student satisfaction with the spring 2003 course?

In hindsight, we made a number of missteps that contributed to this situation. For example, our prototypes were taught in off-term 8.02. Two-thirds of the population in off-term 8.02 consisted of upper-class students who had failed either 8.01 or 8.02 in their freshman year, and one-third consisted of freshmen who had received credit for 8.01, most of whom had an excellent high school physics background that included an introduction to electromagnetism. In any case, almost all of the students in the prototype course had seen the material before at some level, and thus had some comfort level with it. This was not the case in spring 2003, when some students entering the course had never seen the material before. Our introductory material did not take this into account, and many of these students felt lost at the beginning of the course.

To compound this error, we used group work extensively in class. Although in the prototype courses we grouped students according to background (that is, every group had a range of prior knowledge based on the pre-test), in spring 2003 we simply assigned students to groups randomly, because we thought the spring population was more uniform in its background than that of the fall-term course, and because we did not think we could make heterogeneous assignments in a timely way with 550 students. The result was that some of our groups consisted entirely of students who had never seen the material before. A frequent student complaint in our focus groups and in the course surveys was that "the blind can't lead the blind" in group work, and the homogeneous groups that resulted from random assignment certainly contributed to that reaction. It also contributed to the students' perception that they were not learning enough in class because of the emphasis on students teaching themselves. Students complained that they did most of their learning outside class, and only came to class because they knew class participation was part of their grade.

Another factor was that the sections in spring 2003 were led by faculty who had never taught in this format before; the prototype courses were taught by Peter Dourmashkin and me. Although we did train the faculty in the teaching methods used in the course, with hindsight our training was not thorough enough to prepare them for the new environment in the d'Arbeloff Classroom, both in terms of the technology in the room and the teaching methods used in "interactive engagement." In particular, we provided the teaching staff with PowerPoint presentations for the material to be covered in a given class, and many students felt that the section leaders went through this material too rapidly. They preferred more traditional board work, which moderates the pace at which material is presented.

Moreover, we did not do enough to train the student groups themselves in collaborative work. Ideally, collaborative work is a positive experience for everyone in the group: students with weaker backgrounds can learn from more advanced peers who have recently struggled with the same concepts, and students with stronger backgrounds find that the best way to clarify their own understanding of the material is to explain it to others. But for groups to function in this way, instructors need to train students to understand the purpose of group work. We did not do a good job of setting out the mechanics of group work, and in particular we did not set up mechanisms for corrective action for groups that were not working.

Finally, many students did not find the experiments useful: they were unsure of what they were supposed to learn from them, and the experiments were long enough that students frequently did not have a chance to finish them.

 

Future Directions

Because the TEAL Project has had a robust assessment effort from the outset, we have been able to understand and document the successes and failures of the implementation over the course of the last three years, and to learn from them. For TEAL to succeed in the long term, it is crucial that we improve the learning environment for the students. In particular, since we feel that class attendance is a central part of this teaching method, we must structure the course so that coming to class is seen by the students as a profitable use of their time. The changes we plan to make in the future are: (1) heterogeneous grouping, and more training of students in collaborative methods; (2) more extensive training for the course teaching staff, including section leaders, graduate student TAs, and undergraduate TAs; (3) an increase in the number of course teaching staff (students felt we were understaffed during class); (4) fewer experiments, better explained and better integrated into the course material; and (5) better planning of individual classes, breaking our active learning sessions into smaller units that can be more closely overseen by the teaching staff.

The lessons of the TEAL experience thus far for educational innovation at the Institute are, first, that any serious educational reform effort at MIT must be accompanied by a robust assessment effort. One needs some quantitative measure of the effectiveness of instruction to gauge whether the innovation is actually producing results superior, or at least equal, to those of what it is replacing. Second, as is well known in educational circles, the most perilous part of any innovation is the move from small-scale innovation to large-scale implementation. With hindsight, we feel that our major misstep in this transition was not training course personnel and students adequately to prepare them for this new method of teaching.
