
Teach Talk

Enhanced Conceptual Understanding

David L. Darmofal

Four years ago, I was co-teaching our department's undergraduate course in aerodynamics, 16.100. This subject is taken by about 40 students each year, split between juniors and seniors. My faculty colleagues and I decided that the final exam should be different from the exams we had given in previous semesters. Our old exams were often variations on homework problems, "plug-and-chug," or "prove that" questions. This time, we wanted to test the students' ability to integrate concepts and apply them in a more complex, open-ended problem, i.e., the type of problem they would face as practicing engineers. Though we had the best of intentions, the final exam was an unqualified disaster. Students resoundingly said that it was the toughest exam they had ever taken at MIT. Many students and, as a result, the faculty were clearly shaken by the exam.

Although we thought our students were achieving a deep level of conceptual understanding through our teaching, they were not. As a result, the final exam assessed skills that the students had not had a good opportunity to develop through the subject's pedagogy. Since we felt strongly that conceptual understanding was a primary goal of our subject, we needed to change our teaching.


A New Pedagogy

Conceptual understanding is often hindered by previous knowledge or experiences that may conflict with the new knowledge. In recent years, faculty throughout MIT have changed their pedagogy seeking to improve conceptual understanding [Breslow, L., "Educational Innovation Moving Ahead at Full Speed," TeachTalk, MIT Faculty Newsletter, Vol. XIII, No. 1, September 2000]. We chose to implement in-class concept questions following Mazur [Mazur, E., Peer Instruction: A User's Manual, Prentice-Hall, Upper Saddle River, NJ, 1997]. In a typical class, two to three concept questions are given to students, with time for individual reflection following each question. After a check of how well students have understood a question, small-group discussions may be held. In addition, the instructor will usually clarify misconceptions and lead students in further exploration of the concept. In 16.100, we measured class response with PRS [Personal Response System, http://www.educue.com], a handheld electronic response system. PRS has several advantages over hand raising or flash cards, including the anonymity of student responses and the generation of assessment data for analyzing aggregate performance.
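To give a concrete sense of the aggregate data such a system produces, the short sketch below tallies the class's answers to one concept question. It is only an illustration: the comma-separated input format and file name are hypothetical, not the actual PRS export format.

    from collections import Counter
    import csv

    def tally_responses(path):
        """Tally anonymous answers to one concept question.

        Assumes a hypothetical CSV with one row per transmitter:
        transmitter_id,answer (e.g., "1047,B").
        """
        counts = Counter()
        with open(path, newline="") as f:
            for row in csv.reader(f):
                if len(row) == 2:  # skip malformed or blank rows
                    counts[row[1].strip().upper()] += 1
        total = sum(counts.values())
        for choice, n in sorted(counts.items()):
            print(f"{choice}: {n:3d}  ({100.0 * n / total:4.1f}%)")
        return counts

    # e.g., tally_responses("concept_q3.csv") prints one line per
    # answer choice -- the instructor's quick read on the class.

A distribution like this, displayed immediately in class, tells the instructor whether to move on or to launch a peer discussion.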

Our experience with concept questions has shown that students must have some engagement with the material prior to class. In 16.100, we give reading assignments and graded homework that are due prior to in-class discussion. Encouraging self-directed learning through pre-class homework leaves students better prepared for class and lets faculty focus on the important concepts and misconceptions. I personally believe this adds significant value to the classroom experience by allowing our faculty to do what they do best.

In addition to modifying our pedagogy, we have also changed our exams from a written to an oral format. While a written exam can only assess the information that appears on paper, i.e., the final outputs of a student's thought process, an oral exam is an active assessment that can provide greater insight into how students understand and relate concepts. Oral exams are also adaptive to each student: if a student is stuck or has misunderstood a question, the faculty member can help the individual. Rather than wasting the assessment opportunity, the dynamic adaptivity of an oral exam raises the likelihood of an effective assessment. Finally, practicing engineers face daily the real-time need to apply rational arguments based on fundamental principles. By using oral exams, we can directly assess this ability.


The Impact

Quantifying the impact of pedagogical change on learning is a difficult task. Our approach is to take data from a variety of sources and draw our conclusions from the aggregate. While any single source is suspect, taken together, the results become convincing.

The generation of lift on an airfoil is a topic rife with misconceptions, owing to the (usually inaccurate) folklore about how airplanes fly, and is further complicated by knowledge gained in previous courses. On the first day of the fall 2000 and fall 2001 semesters, I gave the students a survey on aerodynamic concepts that included an open-ended question on lift generation. In the fall 2001 semester, students were also asked to explain lift generation in the mid-term oral exam. In Figure 1, the responses are divided into five groups. The momentum change and streamline curvature response is arguably the best answer, but only 10% of the students offered this explanation at the beginning of the semester. For the second response, students correctly explained that a net pressure difference acting on the airfoil produces lift, but then incorrectly offered the Bernoulli effect as the underlying cause of this pressure difference. At the beginning of the semester, this was the most popular answer, at over 60%.
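For readers who have not seen the streamline-curvature argument, the step it relies on is the momentum equation normal to a curved streamline for steady, inviscid flow (the notation here is the standard textbook one, not necessarily what we use in class):

    \[ \frac{\partial p}{\partial n} = \frac{\rho V^2}{R} \]

where p is the pressure, ρ the density, V the local flow speed, R the radius of curvature of the streamline, and n the direction pointing away from the center of curvature. Flow curving over the upper surface of an airfoil must therefore have pressure increasing away from the surface toward the undisturbed atmosphere, i.e., below-atmospheric pressure on the surface itself, which is precisely the net pressure difference that produces lift.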

We use a series of concept questions concentrating on understanding lift generation through momentum changes and reaction forces. The first question involves the impingement of a water jet on a cylinder (see Figure 2). Although many students believe the jet will propel the cylinder away from the stream, in actuality the cylinder rotates into the stream. A simple momentum balance leads directly to the connection between lift generation and momentum change, exactly our intended result! When we use this question, we include an in-class demonstration in which the cylinder is visibly drawn into the stream. As evidenced in Figure 1, the active-learning pedagogy had made a substantial impact by the mid-term exam in fall 2001, with over 60% of students giving the momentum-based explanation of lift.
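The momentum balance behind this question fits in one line. Treating the jet as steady, with mass flow rate ṁ, entering with velocity V_in and leaving, after following the cylinder's curved surface, with velocity V_out (illustrative symbols, not the notation from the exam):

    \[ \vec{F}_{\mathrm{cyl\ on\ jet}} = \dot{m}\,(\vec{V}_{\mathrm{out}} - \vec{V}_{\mathrm{in}}),
       \qquad
       \vec{F}_{\mathrm{jet\ on\ cyl}} = -\dot{m}\,(\vec{V}_{\mathrm{out}} - \vec{V}_{\mathrm{in}}) \]

Because the surface deflects the jet toward the cylinder, V_out − V_in points toward the cylinder, so the reaction force on the cylinder points back toward the jet: the cylinder is drawn into the stream, just as the demonstration shows. An airfoil that turns the oncoming flow downward generates lift by exactly the same reaction.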

We have also assessed the students' ability to integrate several concepts, using a question from the disastrous fall 1998 written final exam as the basis for the fall 2001 final oral exam. While a significant shift in performance occurred (see Figure 3), several caveats exist. In particular, in the written exam, students had several other questions to answer and could adopt the strategy of spending less time on this specific question. Thus, we believe the apparent performance gains were in part due to the more effective assessment strategy.

Student reactions to the new pedagogy have been overwhelmingly positive. In Figure 4, end-of-semester student evaluations from fall 2001 show a dramatic improvement over fall 2000 in the perceived effectiveness of the lectures, in-class exercises, and assignments. We note that we used active learning in both years, but in fall 2000 our pre-class assignments were not difficult and required little engagement with the material. Student comments also show that an initial opposition to the new learning style fades as students recognize the effectiveness of the new approach.


Closing Thoughts

Since the final exam debacle a few years ago, my teaching and, I believe, student conceptual understanding have greatly improved. The last few years have been personally very rewarding, as my classroom has become an active environment with a focus on conceptual understanding.
