The iCampus celebration, "Learning Without Barriers/Technology Without Borders," featured not only a symposium to honor the MIT-Microsoft alliance, but also live demonstrations of some educational technology initiatives that emerged from the seven-year partnership.
The MIT Museum offered compelling glimpses of technologies that promise to revolutionize social arenas, from classrooms to hospital wards. These include Technology-Enabled Active Learning (TEAL); iLabs; Classroom Learning Partner (CLP) software; and the Huggable, a robotic companion animal.
Peter Dourmashkin, senior lecturer in physics at MIT and associate director of the Experimental Study Group, was on hand to present TEAL.
The TEAL classroom, Dourmashkin explained, started with real estate: Instead of having the lecturer poised before an inert mass of students, followed at some later date by a separate "hands-on" lab, the TEAL class is based on tables of nine students--three groups of three--with the professor and teaching assistants alternately lecturing and mingling at the tables as students work with test apparatus linked to laptops. The walls are lined with projection screens as well as traditional chalkboards, enabling everyone to view sophisticated visualizations and simulations that bring the material to life.
"Our model was a music class," Dourmashkin said, "in which faculty can see how everybody learns." Students learn from one another as well as from the instructor, and no one gets to hide in the back of the auditorium, passive and unengaged.
TEAL has been used in Physics 8.02 (Electricity and Magnetism) since 2005, and the road to change has not been without bumps. "We have to work hard at training our faculty, even the Nobel Prize winners," Dourmashkin said. Among other things, the TEAL instructors have learned that it's a good idea to mix up the groups halfway through the semester, to break up unhealthy work relationships and give everyone a fresh start.
Nevertheless, he and his colleague John Belcher have demonstrated a 20 to 30 percent improvement in students' conceptual understanding of the course material, relative to their peers.
Global lab partners
One of the star achievements of the iCampus alliance, iLab, was demonstrated by Jesús del Alamo, professor of electrical engineering. This technology allows instructors and students anywhere in the world to access electrical lab equipment set up at an iLab server; all they need is an Internet connection.
Students in Singapore, Greece, Sweden and Africa have taken advantage of the iLab, including one configured for MIT Course 6.002 (Circuits and Electronics). (Curricula and course materials can also be downloaded via MIT OpenCourseWare.)
Remote participants can set up experiments and change the parameters to suit their needs, then receive the raw data in real time. "That's the beauty of the iLab," said del Alamo. The new and improved version, 6.1, was on view for the first time at the symposium.
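The workflow del Alamo described, configuring an experiment's parameters remotely and receiving the raw data back, can be pictured with a minimal sketch. This is not the real iLab API; the class, method names, and the simulated voltage-sweep experiment are all illustrative assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the remote-lab workflow described above: a client
# sets experiment parameters, triggers a run, and receives raw data points.
# Here the "remote" experiment is simulated locally as a voltage sweep
# across a resistor; none of these names come from the actual iLab software.

@dataclass
class LabSession:
    parameters: dict = field(default_factory=dict)

    def set_parameter(self, name: str, value: float) -> None:
        """Configure one experiment parameter before the run."""
        self.parameters[name] = value

    def run(self) -> list[tuple[float, float]]:
        """Simulate the experiment and return raw (voltage, current) pairs."""
        v_start = self.parameters.get("v_start", 0.0)
        v_stop = self.parameters.get("v_stop", 1.0)
        steps = int(self.parameters.get("steps", 5))
        resistance = self.parameters.get("resistance_ohms", 100.0)
        data = []
        for i in range(steps):
            v = v_start + (v_stop - v_start) * i / (steps - 1)
            data.append((v, v / resistance))  # Ohm's law: I = V / R
        return data

session = LabSession()
session.set_parameter("v_stop", 2.0)
session.set_parameter("steps", 5)
raw = session.run()
```

The point of the pattern is that the client never touches the instrument directly: it only sends a parameter set and gets raw measurements back, which is what lets one physical bench serve students on any continent.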
Notes from underground
Elsewhere at the museum, Kimberle Koile, of the Computer Science and Artificial Intelligence Laboratory, showed off a new wireless technology for classroom teachers.
Classroom Learning Partner (CLP) software, running on tablet PCs, allows teachers to get instant and anonymous feedback on how well the class is assimilating curriculum materials. The tablet PCs can display complex diagrams, maps and other visually dense materials, but more importantly, they are equipped to interpret both sketches and handwriting. Teachers therefore can ask students concept questions; their handwritten answers are instantly graphed in a histogram.
As Koile noted, "Teachers can figure out instantly how well students are understanding the material." Koile is currently using CLP in an introductory computer science course at MIT, but she also successfully tried the technology out in her son's first-grade class in Lexington.
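The aggregation step behind that instant feedback can be sketched in a few lines. This is an assumption-laden illustration, not CLP's actual code: in CLP the answers arrive as interpreted handwriting from tablet PCs, whereas here they are plain strings.

```python
from collections import Counter

# Minimal sketch of the step described above: anonymous student answers to
# a concept question are tallied into a histogram the teacher can scan at
# a glance. Function name and normalization are illustrative assumptions.

def answer_histogram(answers: list[str]) -> dict[str, int]:
    """Count how many students gave each distinct (normalized) answer."""
    return dict(Counter(a.strip().lower() for a in answers))

# Example: five anonymous responses to a circuits concept question.
responses = ["Series", "parallel", "series ", "Parallel", "series"]
hist = answer_histogram(responses)  # {"series": 3, "parallel": 2}
```

Even this toy version shows why anonymity matters to the design: the teacher sees the shape of the class's understanding without any answer being attributable to a student.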
One demo drew an interested crowd of school kids.
This was the Huggable, the plush product of a collaborative effort in the Media Lab that was supported by an iCampus grant. On hand to explain its functionality, which is still in development, was Dan Stiehl, research assistant in robotic life at the Media Lab.
As Stiehl explained, the medical profession has long recognized the therapeutic value of companion animals, but many patients who would benefit from animal company are in environments that cannot support pets: places like hospitals, nursing homes and rehab facilities.
The Huggable is essentially a robotic companion animal that is not only cute and soft but responsive, measuring how much and how it is being handled. Its eyes can act as remote cameras, and it can also gather auditory and motion data from the patient, determining, for example, whether the child holding it is rocking back and forth in a manner suggesting anxiety or fear.
Because the bearlike toy can be linked electronically to the nurses' station, Stiehl said, "The Huggable is a team member," extending the reach of busy hospital staff. The first pilot trials will get underway in 2007 in Scotland, in collaboration with Highlands and Islands Enterprises.