REFLECTIVE MEMO
UNIFIED THERMODYNAMICS
Fall Term 2005
Learning Objectives
1. What are the learning objectives (expressed as measurable outcomes) for this subject?
The subject learning objectives are contained on the course web page. I do not have any changes to recommend to these.
2. To what extent were you able to integrate the CDIO skills specified for this subject in the Curriculum Plan of 2002?
I did not specifically implement any CDIO syllabus items since these are largely covered in the systems portion of Unified. However, several of the in-class experiments I did required 2.1.2 Modeling and 2.1.3 Estimation and Qualitative Analysis (see item 3 d) below). Both skills were taught at the "Introduce" level.
Teaching and Assessment Methods
3. What teaching and assessment methods did you use and what evidence indicates these methods were successful or not?
a) Prepared lecture notes were available on the web for all of the material: http://web.mit.edu/16.unified/www/FALL/thermodynamics/index.html. These notes have evolved over several years, starting from a set of handwritten lecture notes. Each year I augment them when I find specific areas of difficulty from mud responses, etc. In general I am quite happy with the notes. In the end-of-term evaluations, 98% of the respondents rated the web page (for all of Unified) Very Effective (60%) or Generally Effective (38%), and 98% rated the prepared lecture notes (for all of Unified) Very Effective (80%) or Generally Effective (18%).
b) I used 25 concept questions over the 12 lectures, with responses taken on the PRS system. The performance and answers for these were provided on the web page. I continue to find these very useful for engaging the class in the material while I am lecturing. 93% of the respondents on the SEF rated the lectures for all of Unified as Very Effective (33%) or Generally Effective (60%), and 86% rated the concept questions via PRS as Very Effective (44%) or Generally Effective (42%). When asked specifically whether "Professor Waitz effectively uses active learning techniques," 94% of the respondents Agreed (46%) or Strongly Agreed (48%). Several positive comments about the PRS/concept questions also appeared in the written comments from the end-of-term evaluations. In general my teaching reviews were good (shown below), so I think the students found my lectures and homework assignments helpful.
Fall 2005 Course Survey
16.010/020 Unified Engineering I & II
[Per-question instructor rating statistics; the question text was not preserved in this copy. Nine items, each presented 56 times with 54-56 responses: eight had mean scores between 4.17 and 4.62 (standard deviations 0.72-0.8), and one had a mean of 1.89 (standard deviation 0.44).]
c) I used mud cards for each lecture. 66% of the respondents on the end-of-term evaluations said the mud cards were Very Effective (11%) or Generally Effective (53%). Although this is lower than in prior years, I still found the mud cards valuable for providing feedback on each lecture. Even when there were very few respondents the feedback was helpful, so I will continue to use them.
d) I did 4 small demonstrations/in-class experiments. The students seemed to like these activities since they allowed them to apply the subject material to a real problem. All of them took the same form: I asked the students to estimate something (using the concepts from class), then performed the experiment, and then discussed the result in light of their estimates. The activities thus had three primary objectives: to engage the students in the material we were working on and show them how to apply it, to highlight the various simplifications and assumptions in the engineering models we use, and to give the students practice in estimating the parameters required as inputs to the models (e.g., the volume of the room or the weight of an object).
e) I wheeled the CFM56 into class for one of the lectures. The students very much enjoyed this. I find that it prompts a degree of interest and a depth of questioning that I do not get otherwise. Here is a quote from the SEF essay responses to the "best parts of the class" question: "I love Waitz's lecture on the CFM-56 7B engine."
f) We introduced the concept of the "bonus quiz" in Unified this year. Twice during my 12 lectures I gave short unannounced quizzes (Bonus Quiz 1, Bonus Quiz 2). Together these were worth 5 points on the quiz grade (and Joe B was not adjusted -- so they were truly bonus points). I thought these worked well, and the students also found them effective (see below). Indeed, the fact that the students felt pop quizzes were effective is a ringing endorsement, since pop quizzes are generally not viewed in a favorable light. I think the students came to class better prepared.
The “minute quizzes” are intended to encourage you to come to class prepared, and to give you a periodic assessment of your understanding, without much pressure. The minute quizzes have been effective in achieving these goals.
1. Strongly agree
2. Agree
3. Neutral
4. Disagree
5. Strongly disagree
g) I used homeworks and exams to assess student learning. Each of the homework problems and exams was coded to specific subject learning outcomes. Homeworks T1 and T3 were designed to directly support the Systems Problems focusing on the Water Rocket. The overall weighting was 5% class participation, 30% homework, and 65% quizzes. In addition, the use of the PRS system in class and for the self-assessment activity, together with the collection of the mud cards, gave me a large amount of data on class performance. We also collected data on time spent on various activities (its primary use is to make sure the class stays within bounds).
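The stated weighting amounts to a simple weighted average of the three component scores. A minimal sketch (the function name and example scores below are mine, not from the course materials):

```python
def course_grade(participation, homework, quizzes):
    """Weighted course grade per the stated weighting:
    5% class participation, 30% homework, 65% quizzes.
    All component scores are on a 0-100 scale."""
    return 0.05 * participation + 0.30 * homework + 0.65 * quizzes

# e.g., full participation, 85 on homework, 70 on quizzes gives 76:
print(course_grade(100, 85, 70))
```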
Data Source                  | Formative | Summative | Feedback for Me
PRS System                   |     X     |     .     |        X
Time spent                   |     X     |     .     |        X
Muddiest Part of the Lecture |     X     |     .     |        X
Homework                     |     X     |     X     |        X
Quizzes                      |     .     |     X     |        X
End-of-term SEF data         |     .     |     .     |        X
Class attendance             |     X     |     .     |        .
Bonus quizzes                |     X     |     X     |        X
Student Learning
4. How well did the students perform on each subject learning objective? (Where possible, make reference to specific data to support your conclusions.)
I will start with a general discussion of the performance on the homeworks and the quiz, followed by a discussion of the performance on each learning objective.
The performance on the homework and the time spent were both good, with students on average performing well (see plot below). The one exception is Homework T3, which was designed to support the Water Rocket problem (along with Homework T1). This problem was longer than intended (2 hours) and many of the students did not do well on it. Nonetheless, I thought it was valuable for providing a direct link between the systems problems and the thermo material. It also allowed the systems problem assignments to be a little shorter than in the past.
The following plot shows the overall class performance on the quiz. Middle-B performance for the quiz (Joe B) was 68% -- about 10 points lower than is typical for my exams. The class average was only 62%. I knew when I set the exam that it was a hard and long exam (I even told the students this before the exam). The distinguishing element was problem 3. It required them to think deeply and represent the behavior of a device using thermodynamic concepts and processes they had seen in class. This is the opposite of the homework problems, where I told them what the processes were. In general the class did poorly on this question, and this led to the poor performance on the exam as a whole. The question-by-question performance on the quiz is shown in the table below, with each question labeled in terms of the learning objectives that were addressed. More detailed notes on the quiz performance follow.
Quiz | 1a | 1b | 2 | 3a | 3b | 3c | 3d | 4 | 5 | Total |
LO# | 5 | 1 | 2,3,4,6 | 1,2 | 2,4,5 | 4 | 4 | 4 | 4 | na |
Mean | 68% | 86% | 72% | 62% | 49% | 34% | 8% | 86% | 70% | 62.0% |
StDev | 37% | 23% | 30% | 24% | 28% | 32% | 17% | 26% | 27% | 14.2% |
Weight | 5% | 5% | 15% | 7% | 8% | 10% | 10% | 20% | 20% | 100% |
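As a sanity check, the reported total is consistent with the per-question means and weights in the table above, to within the rounding of the per-question means (a quick sketch; the variable names are mine):

```python
# Per-question mean scores and weights from the quiz table (questions 1a..5).
means = [68, 86, 72, 62, 49, 34, 8, 86, 70]    # percent
weights = [5, 5, 15, 7, 8, 10, 10, 20, 20]     # percent; sums to 100

# Weighted average of the per-question means.
total = sum(m * w for m, w in zip(means, weights)) / sum(weights)
print(f"Weighted quiz mean: {total:.1f}%")  # prints 62.2%, vs. 62.0% reported
```

The 0.2-point discrepancy is what one would expect from the per-question means themselves being rounded to whole percentages.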
Key problem areas on the quiz were:
1a) Some students focused only on the system for the definition of irreversible, versus system + surroundings -- but they did okay on this in general.
1b) In general the students did well on this. Some forgot to say that heat is an energy TRANSFER.
2) Most students did okay on this. Some could not draw a Brayton cycle (about 1/3). Of the 2/3 that could, about half got it all correct; the other half couldn't figure out how to represent an afterburner. This latter item was less important to me than drawing the Brayton cycle and identifying the work as the enclosed area (some missed this latter point too -- maybe 10% of the class).
3a) Most did a fair job of describing the energy exchange process -- about 2/3 did not realize the weight continues to drop during the slow heat transfer process. Most used the right words (heat, work, different forms of energy). Some (maybe 15%) treated the two chambers as one system and never described the flow of heat between the two.
3b) About 1/3 got this all right. About 1/3 incorrectly assumed the first process was q-s. About 1/3 were way off base and could not describe rational processes (4-5 students did not even define a process).
3c) About 1/3 got a good start on (or even completed) the non-q-s constant-pressure analysis. About 1/3 assumed q-s adiabatic. About 1/3 got nowhere.
3d) Maybe 20% of the class had something rational here. Maybe 40% thought the temperature was the average of T2 top and T1 bottom, not recognizing that the weight continues to drop and that work is done. And about 40% left this blank or said nothing rational at all.
4) In general the class did well on this problem. The most common flaw was that a student would write Tt and T but never box an answer and say what Tvessel is. Still, the average performance on this problem was quite high. I have the feeling that a few of the students plugged the numbers they were given into the equation they knew they had to use and came up with the right answer by luck. But most who got it right described why Tvessel = Ttatm.
5) Most students understood that this was a q-s adiabatic process. About 50% did not realize that the atmospheric conditions were stagnation conditions, so they went through some complicated math (solving for the velocity) to get to the stagnation temperature at the outlet. But by and large about 90% of the class got this far. Then about 60% of the class solved for work instead of shaft work. And again, many (50%) got wrapped around the axle with the velocities instead of realizing they were working with stagnation conditions. In general, the performance on this was okay, but not great.
Based on the above data, here is my assessment of the student performance on each of the subject learning objectives:
Both of these learning objectives were addressed in homework T1 and quiz questions 1b and 3a. Based on the very good performance, I believe most of the students achieved these learning objectives. Indeed, they took home the message (as stressed in lectures and recitations) to describe these processes in terms of different forms of energy and transfers of energy (heat and work).
There were three homeworks that supported this learning objective (T7, T8, T11). I also started the discussion of heat engines with a thorough description of how an IC engine works. And I brought the CFM-56 into the class for one of the lectures. However, I did not directly assess the student performance on this learning objective on the quiz.
Historically the students do very well on this learning objective since it stresses the mechanical elements of solving a thermodynamics problem, versus conceptual understanding. Many homework problems supported this learning objective (T2, T3, T4, T5, T6, T7, T8, T9, T10, T11). The performance on these homework problems and the good performance on quiz problems 2, 4 and 5 is evidence that the students achieved this learning objective. However, as noted in the discussion of quiz problem 3 above, the students' facility with these concepts is at a beginner level. When they were told what the processes are, they did a good job analyzing the behavior. When they were given a physical description of an event (quiz problem 3), they were (in general) not able to determine the appropriate thermodynamic processes. This latter item is a higher level skill and their poor performance is not unexpected based on the limited experience they have.
The students' performance in this area was fair, as evidenced by quiz question 1a and homeworks T12 and T13.
This learning objective is an extension of learning objective 4. The students did well on this as noted above.
Continuous Improvement
5. What actions did you take this semester to improve the subject as a result of previous reflections or input from students or colleagues?
6. What did you learn about your teaching and assessment methods this semester?
7. What actions do you recommend to improve this subject in the future?
Information Sharing
8. To whom have you forwarded this reflective memo?