MIT Reports to the President 1999–2000


The MIT Artificial Intelligence Laboratory has as its principal intellectual goal the understanding of human intelligence. As a practical matter, the AI Lab develops the mathematics and engineering of intelligent systems and artifacts.

The MIT AI Lab has been in continuous existence since 1959, and currently has 24 faculty members and senior and principal research scientists. The majority of the faculty come from the Department of Electrical Engineering and Computer Science, along with some from Brain and Cognitive Sciences, Mechanical Engineering, and Aeronautics and Astronautics.

Financial support is provided by the Defense Advanced Research Projects Agency (DARPA), the Office of Naval Research (ONR), the National Aeronautics and Space Administration (NASA), the National Science Foundation (NSF), Nippon Telegraph and Telephone (NTT), Ford Motor Company, Yamaha Motor Corporation, the EPOCH Foundation (Taiwan), Flex Foot Inc., Alphatech Inc., Amgen Inc., Mitsubishi Research Laboratories, and Microsoft.

During the last year the AI Lab continued in its lead role for the NTT/MIT collaboration. Through this collaboration seventeen projects were funded in the AI Lab and the Laboratory for Computer Science. This collaboration is slated to run for three more years, until June 2003.

The AI Lab began a new collaboration with the Laboratory for Computer Science on MIT Project Oxygen. This project aims to create pervasive human-centered computing as the new way that people interact with computers.

Along with LCS, the AI Lab has secured funding from DARPA for this project, and has set up an industrial collaboration with Hewlett-Packard, Acer, Nokia, Delta, Philips, and NTT.

The research activities of the laboratory are divided into eleven general areas: learning, core artificial intelligence, information management, medical vision, general vision, vision applied to people and activity, medical robotics, robotics, cognitive architectures, language, and new models of computation. Two-page research abstracts of the 134 individual projects at the Lab can be found on the laboratory's World Wide Web site. Some of the highlights of the year are as follows.

Professor Leslie Kaelbling and her students made progress on making reinforcement learning practical on real robots. Professor Tommi Jaakkola has applied kernel methods and graphical models to biological problems such as protein sequence analysis and gene identification, and to large-scale medical diagnosis problems. Professor Tomaso Poggio and his students have extended their theoretical results on Support Vector Machines, and have investigated learning mechanisms in visual cortex.

Professor Tomaso Poggio and his students have studied artificial markets, Professor Tomas Lozano-Perez and his students have studied search methods with applications to understanding protein folding, and Dr. Olin Shivers and his students have worked on formal methods in specialized computer languages.

Mr. Michael Coen and a large number of graduate and undergraduate students have developed new applications for the Intelligent Room; in particular, they have continued the development of the MetaGlue agent programming system. Professor Randall Davis and his students have developed systems that allow users to sketch mechanical designs, and that transform those sketches into formal mechanical descriptions which can be manipulated by simulation software. Dr. Howie Shrobe and Professor Davis have worked together on complementary aspects of design automation, namely design rationale capture, so that the design produced is annotated with the designers' own reasoning about their design choices. Dr. Shrobe and Dr. Bob Laddaga have developed new techniques for making software self-adaptive. Professor Lynn Stein has collaborated with Professor David Karger of the Laboratory for Computer Science on personalized information environments for navigating the World Wide Web.

Professor Eric Grimson and his students have worked on a wide variety of visual techniques specialized to many different anatomical regions to support surgery, diagnosis, and training. These methods and applications have included MRA segmentation, understanding vasculature images, orthopedic imagery, segmentation of brain tumors in MRI, surveying sources of pediatric epilepsy, and three-dimensional reconstruction, registration, tracking, and visualization. Professor Berthold Horn and his students have worked on new methods of tomography using incoherent light sources, rather than more specialized radiation sources.

Professor Edward Adelson and his students have worked on understanding material properties from images, intending these properties ultimately to serve as a source of constraint in object recognition. Professor Berthold Horn and his students have studied analog circuits for image smoothing and segmentation, directed towards building smarter chips for vision. Professor Paul Viola and his students have developed new techniques for understanding handwriting, and for automatic target recognition. Visiting Professor Lisa McIlrath and her students have developed new VLSI-based circuits for early image processing.

Professor Tomaso Poggio and his students built a system that produces photorealistic text-to-audiovisual speech synthesis, and built reliable static face detectors. Professor Paul Viola and his students built a real-time face detector that is able to recognize faces as people walk along corridors. They continued their work on a system that uses multiple cameras at fixed positions and is able to produce three-dimensional reconstructions of people moving through the area. Professor Eric Grimson and his students further developed their work with cameras tracking people and vehicles, including new classification techniques, solving for the three-dimensional relationship between cameras whose viewpoints partially overlap, and carrying out detailed geometric analyses of motion. In November of 1999, Professor Trevor Darrell joined the MIT faculty and the Artificial Intelligence Lab. He has set up a new perceptual interface program, using cameras in intelligent rooms and new theoretical constraints applied to real-time stereo of people moving about, gesturing, and producing facial expressions.

The Artificial Intelligence Lab has become involved in a number of projects applying robotics to medical applications. These include new artificial knees and legs for amputees (Dr. Hugh Herr), laparoscopic surgery simulation (Dr. Ken Salisbury), robotic wheelchairs (Ms. Holly Yanco working with Professor Rodney Brooks), simulation of arthroscopic surgery (Professor Eric Grimson), and understanding low-level motor control in humans (Professor Steve Massaquoi).

Professor Gill Pratt and his students have worked on a large number of walking robots. They have built a new bipedal walking robot called M2, have continued their work on the dinosaur robot Troody, and have furthered the development of series-elastic actuators. Dr. Ken Salisbury and his students have developed new techniques for haptic interaction, and have investigated autonomous digging robots for subsurface planetary exploration.

Professor Rodney Brooks and his students have worked on a number of robots, Cog, Kismet, Coco, and the M4 Head, incorporating complete cognitive architectures into them. Dr. Cynthia Breazeal, working with a number of other students, built a broad cognitive architecture for Kismet, based on an underlying emotional state. Subsystems included a visual attention system, a system that understands prosody in people's voices, an affect-based phoneme generation system, a facial emotional display system, a behavior-releasing system, and a system for taking turns with people in interactive speech. Together they allow Kismet to interact with naïve subjects, engaging in emotion-laden, but meaning-free, conversations.

Dr. Boris Katz and his students have applied natural language techniques to information retrieval problems, especially on the Web. They have been developing systems that can understand the content of Web pages, or that allow pages to be annotated in English, so that the right information is retrieved in response to an English-language query.

Dr. Tom Knight and his students have developed a way of getting digital control over the molecular biochemistry of living cells. The system compiles simple computations into DNA strings, which are inserted into living E. coli cells. The group demonstrated molecular signaling in one class of cells, which performed a simple digital computation and molecularly signaled another class of cells, which in turn performed a further simple computation and changed the cells' luminescence to indicate the result. Dr. Knight and his students also worked on new silicon-based architectures in which fine-grained data ownership is enforced at the lowest levels in order to provide a new class of security implementations. Professors Abelson and Sussman have continued their work developing computation on amorphous structures.

More information about the AI Lab can be found on the laboratory's World Wide Web site.

Rodney A. Brooks