REFLECTIVE MEMO

UNIFIED PROPULSION

Spring Term 2004, Ian Waitz

(Previous Reflective Memo is here)

Learning Objectives


1. What are the learning objectives (expressed as measurable outcomes) for this subject?

The subject learning objectives are contained on the course web page. Note that these objectives cover a wide range of new topics given the time available for the students to learn the material. The objectives thus correspond to lower levels of cognitive ability (explain, describe, estimate, apply).


2. To what extent were you able to integrate the CDIO skills specified for this subject in the Curriculum Plan of 2002?

I did not specifically implement any CDIO syllabus items since these are largely covered in the systems portion of Unified. However, some of the in-class concept questions and homeworks required 2.1.2 Modeling and 2.1.3 Estimation and Qualitative Analysis. I would say both skills were taught at the "Introduce" level.


Teaching and Assessment Methods

3. What teaching and assessment methods did you use and what evidence indicates these methods were successful or not?
a) Prepared lecture notes were available on the web for all of the material: http://web.mit.edu/16.unified/www/SPRING/propulsion/index.html. These notes have evolved over several years, starting from a set of handwritten lecture notes. Each year I try to augment them when I find specific areas of difficulty from mud responses, etc. I am quite happy with them at this point. In the end-of-term evaluations, 100% of the respondents rated both the web page and the prepared lecture notes (for all of Unified) Somewhat Effective or Very Effective; in both cases, the majority of students felt these resources were very effective.

b) I used 16 concept questions over the 9 lectures, with responses taken on the PRS system. Student performance on, and answers to, these questions were posted on the web page. I continue to find these very useful for engaging the class in the material while I am lecturing. 97% of the respondents on the SEF rated the in-class exercises as Very Effective or Somewhat Effective, and several positive comments were made about the PRS/concept questions in the written comments from the end-of-term evaluations. In general my teaching reviews were good (see below), so I think the students found my lectures to be helpful to them.


c) I used mud cards for each lecture, responded to them the evening the lecture was delivered, and put the responses up on the web, linked to the relevant areas of the online notes. See for example the P1 mud responses. 66% of the respondents on the end-of-term evaluations said the mud cards were Very Effective or Somewhat Effective; however, the majority (52%) found them only Somewhat Effective. Nonetheless, I still found the mud cards valuable for the feedback they provide on each lecture; even in cases where there were very few respondents they were helpful, so I will continue to use them. However, Steve Hall found (by looking at web page hits) that very few students were reading the online mud responses. Since responding takes a great deal of time (and I already have 3 years of questions and answers posted), I will discontinue responding to these questions online.


d) I wrote short assessments of how each lecture went. See for example the P4 mud responses. These were mostly helpful for me, although I did use them to stress the important points from each lecture. In general, based on Steve's data, I think very few students (6 or 8) read these, and then only the night before the quiz. We have saturated the students in terms of available material on the web; further information goes unread.

e) For 4 of the lectures the students and I wheeled the CFM56 engine into the classroom so I could point to specific features as I was presenting material. As in the past, the comments from the students on this were very positive. It was particularly helpful for the section on ideal cycle analysis and the section on velocity triangles. On several occasions I found myself having conversations with students, both in class and after class, that would not have happened if not motivated by the presence of the engine (e.g. "Why are all the blades loose?", "How do they light the combustor?", and "How much does one of these cost?"). I also gave a homework assignment that required the students to take measurements on the engine so they could directly see how the models (in this case velocity triangles) applied to the real article.

f) Based on Steve Hall's suggestion (and his good experience with the technique) I used a new method of getting the students actively involved in the recitation sections. Instead of doing any lecturing, I brought in old exam questions and had the students break up into groups of 2-4 and work the problems at the board. In most cases we made it through 2-3 questions. The students felt this was valuable, as shown below (thanks to Steve for the data).

"Working at the board in recitations is a more useful technique than working at my desk on a problem or watching the instructor work a problem, and is a more effective way to learn during recitation."
1. Strongly agree
2. Agree
3. Neutral
4. Disagree
5. Strongly disagree
6. I have not been to a recitation where students work problems at the board.
Average of 1-5 responses: 2.07
32 students agreed or strongly agreed; 7 disagreed.


"I enjoy working at the board in recitations, and prefer working at the board to working at my desk on a problem or watching the instructor work a problem."
Average of 1-5 responses: 2.45

Student Learning

4. How well did the students perform on each subject learning objective? (Where possible, make reference to specific data to support your conclusion.)

The use of the PRS system in class and the collection of, and response to, the mud cards gave me a large amount of data on the class performance (formative and summative). Also, as in the past in Unified, we collected data on time spent on various activities; the primary use of these data is to make sure the class workload stays within bounds.

Data Source                    Formative   Summative   Feedback for Me
PRS System                         X           .              X
Time spent                         X           .              X
Muddiest Part of the Lecture       X           .              X
Homework                           X           X              X
Quiz                               .           X              X
End-of-term SEF data               .           .              X
Class attendance                   .           .              X

The performance on the learning objectives was overall fair, with less than half the class scoring above the Joe B level (middle B = 76.5 for propulsion) on the propulsion section of Unified. The class average was 72.2 with a median of 74.8; the average was about 2 points lower than in the past. The homeworks were all generally within the specified time bounds. The performance on the homeworks was about 10% better than in prior years, whereas the quiz performance was several percent lower (65.6% for Joe B = 75% in 2004 versus 74.7% for Joe B = 78% in 2003). I attribute this change relative to previous years to students copying old homework solutions rather than working the problems themselves. This was the first time all the solutions were available, and in several instances I saw students with the solutions in hand before they had even tried the homework. This is a continuing challenge, since it is difficult to develop high-quality new homework problems each year that reinforce the key points. Specific comments on the learning objectives follow:

A. To be able to explain at a level understandable by a high school senior or non-technical person what the various terms are in the integral momentum equation and how jet propulsion works (Not directly assessed this year).

B. To be able to apply control volume analysis and the integral momentum equation to estimate the forces produced by aerospace propulsion systems (Homeworks P1, P2, P3 and Quiz Problem 2). I do not believe the majority of the students achieved the learning objective, and I was very disappointed in the performance on this material. It is covered in Fluids as well, and 3 out of 9 homeworks were devoted to it. Yet on the quiz not one student in the class received full credit, even though the problem was nearly identical to the first homework problem and very similar to problems on previous quizzes that we went over in recitation. The average performance was 63%, with many sign errors, incorrect evaluations of dot-products, and missing or extra terms. The homework scores were 94%, 87%, and 87% for P1-P3, respectively. These scores are more than 10 points better than those of the prior year; they are thus inconsistent with the quiz scores and are likely further evidence that the students were relying on the solutions.
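
For reference, the relation being assessed is the integral momentum theorem; a minimal sketch of a standard form (the notation in the course notes may differ) is:

    \[ \sum \vec{F} \;=\; \frac{d}{dt}\int_{V} \rho\,\vec{u}\;dV \;+\; \oint_{S} \rho\,\vec{u}\,(\vec{u}\cdot\hat{n})\;dA \]

which, for steady flow through a simple engine control volume with uniform inlet and exit conditions, reduces to the familiar thrust expression

    \[ T \;=\; \dot{m}\,(u_e - u_0) \;+\; (p_e - p_0)\,A_e . \]

The sign errors and dot-product mistakes noted above typically arise in evaluating the \((\vec{u}\cdot\hat{n})\) flux term.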

C. To be able to describe the principal figures of merit for aircraft engine and rocket motor performance and explain how they are related to vehicle performance (Quiz Problems 1 and 4). I believe most of the students achieved this learning objective. The quiz averages for these two problems were 73% and 69% for the aircraft/gas turbine and rocket questions, respectively. The performance was several percentage points better than in the previous year -- I stressed this material more this year in lecture and recitation (by going over similar problems from the prior year's quiz). On the first problem, a few students were clearly confused about the unique overall arrangement of the engine and lift fan in the example, and this contributed to reducing the average score. For Problem 4, I think many students did not review or remember the relevant material from Thermodynamics in the fall semester well enough to connect the pump performance to the exit velocity of the rocket motor.
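
For context, the figures of merit at issue are the standard ones; a brief sketch, with notation assumed rather than copied from the quiz:

    \[ I_{sp} = \frac{F}{\dot{m}_p\,g_0}, \qquad \mathrm{TSFC} = \frac{\dot{m}_f}{F}, \qquad \eta_{prop} = \frac{2}{1 + u_e/u_0}, \]

where \(F\) is thrust, \(\dot{m}_p\) the propellant mass flow, \(\dot{m}_f\) the fuel mass flow, and \(u_0, u_e\) the flight and exit velocities (the propulsive efficiency form is for a simple jet with fuel mass neglected). The link to vehicle performance comes through the overall efficiency \(\eta_{overall} = \eta_{thermal}\,\eta_{prop}\) and, for aircraft, the range and endurance relations noted under objective D.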

D. Given weight, geometry, and aerodynamic and propulsion system performance information, to be able to estimate the power required for flight, the range, the endurance, and the time-to-climb for an aircraft (Homeworks P4 and P5, and Quiz Problem 3). I believe most of the students achieved this learning objective. The students performed well on the homework problems (84% and 94%, approximately 10 points higher than in the prior year, possibly due to reliance on the solutions). Nonetheless, these are straightforward applications of dynamics, and the students seem to do well with the mechanics of solving such problems. This objective was only partially addressed on the quiz, by checking that the students could write down the major equations connecting these parameters to mission performance in Quiz Problem 3.
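
For reference, the central relations here are the power required for level flight and the Breguet range equation; a minimal sketch under standard assumptions (constant L/D, TSFC, and flight speed):

    \[ P_{req} = D\,u_0 = \frac{W\,u_0}{L/D}, \qquad R = \frac{u_0}{g\,\mathrm{TSFC}}\,\frac{L}{D}\,\ln\frac{W_i}{W_f}, \]

where \(W_i\) and \(W_f\) are the initial and final aircraft weights and TSFC is expressed as fuel mass flow per unit thrust. The endurance follows from the same weight-flow balance with time rather than distance as the integration variable.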

E. Given mass fractions, and propulsion system performance information, to be able to estimate the range and velocity of single-stage rockets. (Homework P6, Quiz Problem 4). I believe most of the students achieved this learning objective. They did well on P6 (90%, again 10 points higher than in prior years). The performance on Quiz Problem 4 was 69% as discussed above.
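
The governing relation here is the rocket equation; a brief sketch, with notation assumed:

    \[ \Delta u = c\,\ln\frac{m_0}{m_f} = g_0\,I_{sp}\,\ln\frac{m_0}{m_f}, \]

where \(c = g_0 I_{sp}\) is the effective exhaust velocity, \(m_0\) and \(m_f\) are the initial and final masses, and \(m_f/m_0\) is the final mass fraction.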

F. To be able to describe the principal design parameters and constraints that set the performance of gas turbine engines, and to apply ideal-cycle analysis to a gas turbine engine to relate thrust and fuel burn to component-level performance parameters and flight conditions (Homework P7). I think most of the students did not achieve this learning objective. The one problem I gave them with this as a focus involved a fair amount of number-crunching in Excel/Matlab. Since they had only about an hour for the homework assignment, I think they spent most of that time wrestling with Excel/Matlab and very little time thinking about the results. The average on the homework was 71%, compared to 66% last year.
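
To make the scope of such an assignment concrete, here is a minimal sketch of an ideal-turbojet parameter sweep of the kind P7 asked for, written in Python rather than Excel/Matlab. All numerical values are illustrative assumptions, not taken from the assignment; the cycle relations are the standard ideal-cycle results.

import math

# Sketch of an ideal-turbojet cycle sweep: specific thrust and TSFC
# versus compressor pressure ratio. Values below are assumed, not
# taken from the P7 assignment.
gamma = 1.4                          # ratio of specific heats for air (assumed)
R = 287.0                            # gas constant for air, J/(kg K)
cp = gamma * R / (gamma - 1.0)       # specific heat at constant pressure
T0 = 220.0                           # ambient temperature, K (assumed cruise)
M0 = 0.85                            # flight Mach number (assumed)
Tt4 = 1600.0                         # turbine inlet total temperature, K (assumed)
h = 4.3e7                            # fuel heating value, J/kg (assumed)

a0 = math.sqrt(gamma * R * T0)       # ambient speed of sound, m/s
u0 = M0 * a0                         # flight speed, m/s
tau_r = 1.0 + 0.5 * (gamma - 1.0) * M0**2   # ram (total/static) temperature ratio
tau_lam = Tt4 / T0                   # cycle enthalpy ratio, tau_lambda

print(" pi_c   F/mdot (N s/kg)   TSFC (kg/N/hr)")
for pi_c in (5.0, 10.0, 20.0, 30.0, 40.0):
    tau_c = pi_c ** ((gamma - 1.0) / gamma)          # ideal compressor temperature ratio
    tau_t = 1.0 - (tau_r / tau_lam) * (tau_c - 1.0)  # turbine ratio from work balance
    # Exit velocity for a fully expanded nozzle (ideal-cycle result):
    u9 = a0 * math.sqrt((2.0 / (gamma - 1.0))
                        * (tau_lam / (tau_r * tau_c))
                        * (tau_r * tau_c * tau_t - 1.0))
    spec_thrust = u9 - u0                            # thrust per unit air mass flow
    f = cp * T0 * (tau_lam - tau_r * tau_c) / h      # fuel-air ratio from burner balance
    tsfc = f / spec_thrust * 3600.0                  # fuel mass per unit thrust per hour
    print(f"{pi_c:5.0f}   {spec_thrust:15.1f}   {tsfc:14.5f}")

Each pass through the loop applies the work balance for the turbine temperature ratio and the fully expanded nozzle result for the exit velocity; the point of such an exercise is to see how specific thrust and TSFC trade off as the compressor pressure ratio rises, which is the thinking step the students ran out of time for.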

G. To be able to explain at a level understandable by a high school senior or non-technical person the energy exchange processes that underlie the workings of a multistage compressor or turbine, and to be able to use velocity triangles and the Euler Turbine Equation to estimate the performance of a compressor or turbine stage (Homeworks P8 and P9, Quiz Problems 3 and 5). I think most of the students achieved this learning objective, despite poor performance on one of the quiz problems related to this area. The performance on the homeworks was 83% and 89%, much higher than the prior year scores of 61% and 70% -- again, it is unclear how much of this resulted from students relying on the homework solutions. The class average on Quiz Problem 3 was poor at 60%. This problem required the students to link together several ideas, including the steady flow energy equation and the integral momentum equation, in a way they had not been asked to before. The poor performance didn't surprise me, since it is typical of what happens when students are asked to combine new concepts before they are fully comfortable with the individual concepts. The performance on Quiz Problem 5 was 76%, 81%, and 63% for the three parts, with most of the points being lost in the details of drawing the velocity triangles in part 5c. This performance is consistent with prior year results and suggests a fair understanding of the basics of energy exchange processes in gas turbine engines.
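
For reference, the Euler Turbine Equation connects the stagnation temperature change across a blade row to the change in angular momentum; a minimal sketch, with notation assumed:

    \[ c_p\,(T_{t2} - T_{t1}) = \omega\,(r_2\,v_{\theta 2} - r_1\,v_{\theta 1}), \]

which for constant radius gives work per unit mass \(w = U\,(v_{\theta 2} - v_{\theta 1})\), with \(U = \omega r\) the blade speed. The velocity triangles encode the decomposition \(\vec{v} = \vec{U} + \vec{w}\) of the absolute velocity into blade speed plus the velocity relative to the blade, which is the construction students struggled with in part 5c.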


Continuous Improvement

5. What actions did you take this semester to improve the subject as a result of previous reflections or input from students or colleagues?

a) I tried to re-use all the homeworks (this did not work well -- see above).

b) Per the suggestion of Mark Spearing, I used a prepared equation sheet for the exam rather than allowing the students to make their own. I handed this out a few days before the exam (this worked well).

c) In recitations I had the students break up into groups of 3 or 4 and simultaneously work problems on the board (this worked well -- see above).

d) I reworked the learning objectives to make them more concise. This did not reflect a change in content (I am happy with these now).


6. What did you learn about your teaching and assessment methods this semester?
I learned that in an over-stuffed, time-competitive environment (i.e., one where other disciplines and subjects are competing for the students' time), providing the solutions to one section of material (the propulsion homework problems) will cause the students to rely to a great extent on the solutions instead of working the problems themselves. This can have a negative impact on the students' achievement of the learning objectives (as evidenced by historically high homework scores combined with historically low quiz performance).

7. What actions do you recommend to improve this subject in the future?
a) I recommend this material receive two additional lectures (11 instead of 9). I believe that part of the reason for the lackluster performance on some of the learning objectives is that I am stuffing too much into 9 lectures. Alternatively, we could cut the aircraft performance material and move it into the systems, fluids, or Unified lectures. This material needs to live somewhere because it ties everything together, but it doesn't necessarily have to be part of the propulsion lectures.

b) Do not repeat the experiment of having all the homework solutions available. This means that I will have to create a new set of homeworks, which is a lot of work.

c) No longer respond to the mud cards online (since it takes a great deal of time and the students do not appear to be reading them). Rather, respond to them briefly at the start of the next class.

Information Sharing

8. To whom have you forwarded this reflective memo?

a) Unified teaching staff

b) 16.50 instructor

c) 16.050 instructor

d) Doris Brodeur and Diane Soderholm


Appendix: Subject Syllabus

• All course materials (syllabus, notes, homework, solutions, quiz) can be found on the Unified web page.