Artificial Intelligence Laboratory

The MIT Artificial Intelligence Laboratory (AI Lab) has as its principal intellectual goal the understanding of human intelligence. As a practical matter the AI Lab develops the mathematics and engineering of intelligent systems and artifacts.

The MIT AI Lab has been in continuous existence since 1959 and currently has 22 faculty members and senior and principal research scientists. The majority of the faculty come from the Department of Electrical Engineering and Computer Science, along with some from Brain and Cognitive Sciences and Aeronautics and Astronautics.

Financial support is provided by the Defense Advanced Research Projects Agency (DARPA), the Office of Naval Research (ONR), the National Aeronautics and Space Administration (NASA), the National Science Foundation (NSF), Nippon Telegraph and Telephone (NTT), Ford Motor Company, Yamaha Motor Corporation, the EPOCH Foundation (Taiwan), Flex Foot Incorporated, Alphatech Incorporated, Nokia, Philips, Hewlett-Packard, Acer, Delta Electronics, Mitsubishi Research Laboratories, and Microsoft.

During the last year, the AI Lab completed its third year of partnership with NTT. The Laboratory for Computer Science also participates in this partnership, which is administered at MIT through the AI Lab. Conversely, the MIT Oxygen Project is administered through the Laboratory for Computer Science, and the AI Lab participates in it through subcontracts. This last year marked the completion of the first full year of industrial collaboration under Project Oxygen. The companies involved are Nokia, Philips, Hewlett-Packard, Acer, Delta Electronics, and NTT.

The AI Lab has also been set up to host the campus portion of the Martinos Center for Structural and Functional Biomedical Imaging.


The research activities of the laboratory are divided into ten general areas: foundations of artificial intelligence and learning; applied artificial intelligence and learning; biologically inspired robots and models; computational support for people; medical robots; medical vision; mobile robotics; new models of computation; vision and sound applied to people and activity; and vision techniques. Two-page research abstracts of the 116 individual projects at the lab are available online. Some of the highlights of the year are as follows.

Vision and Sound Applied to People and Activity

Professor Trevor Darrell and his students have developed new methods for tracking people as they move about open spaces, and for estimating their head pose. This work is forming the basis for new modes of interaction between people and computers as part of Project Oxygen.

Professor Eric Grimson and his students have developed new techniques for extracting anatomical structures from various imaging sources. These form the basis for new diagnostic tools.

Medical Robots

Professor Gill Pratt, Dr. Hugh Herr, and their students completed a new version of a prosthetic knee, and put it into clinical trials. The new knee can sense whether the wearer is walking up or down stairs, or at different speeds on flat ground, and adapt its behavior to make walking smoother and easier. The knee is now going into commercial production.

New Models of Computation

Dr. Thomas Knight and his students have successfully implemented computation inside living E. coli cells. They compile computations to DNA strings and insert these into the genome of E. coli. When the RNA transcription mechanism encounters these strings, it implements the desired computation. Cells performing different computations communicate with each other via lactone molecules.
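The scheme above can be pictured with a toy software model (purely illustrative — the `Cell` class, gate functions, and signal values here are assumptions for exposition, not the Knight group's actual toolchain): each cell carries one compiled logic operation, and its output is a lactone-like signal concentration that neighboring cells can sense as input.

```python
# Toy model of cellular logic: each "cell" hosts one gate whose output is
# a secreted signal concentration readable by other cells.

class Cell:
    """A cell carrying one compiled logic operation."""

    def __init__(self, name, gate):
        self.name = name
        self.gate = gate      # function mapping input levels to an output level
        self.output = 0.0     # "lactone" concentration this cell emits

    def step(self, inputs):
        """One round of transcription: recompute output from sensed inputs."""
        self.output = self.gate(*inputs)
        return self.output


def inverter(x):
    # A high input represses the output gene; a low input lets it express.
    return 0.0 if x > 0.5 else 1.0

def and_gate(x, y):
    return 1.0 if x > 0.5 and y > 0.5 else 0.0

# Two inverter cells feed an AND cell via their secreted signals.
a, b = Cell("invA", inverter), Cell("invB", inverter)
c = Cell("and", and_gate)

a.step([0.0])                       # low input  -> emits high
b.step([1.0])                       # high input -> emits low
result = c.step([a.output, b.output])
print(result)                       # 0.0: AND of high and low
```

In the real system the "gate" is a genetic regulatory element and the "wire" is a diffusing signaling molecule, but the composition of simple gates into larger circuits works the same way.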

Computational Support for People

Mr. Krzysztof Gajos, the AI Lab Oxygen coordinator, has led a group of students in developing a deployable software system to form the core of the Project Oxygen Enviro 21 systems. Metaglue is a distributed, always-available, migratory, checkpointing, agent-based backbone on top of which the projects described below have been built. Mr. Jack Costanza has managed an effort to install the necessary hardware, the Metaglue system, and an intelligent office application in a number of faculty and student offices throughout the AI Lab. This is part of our effort to live within the Oxygen system for pervasive, human-centered computation that we are building.
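The checkpointing idea behind an agent backbone of this kind can be sketched minimally (an assumption-laden illustration — the `Agent` class and pickle-based store below are hypothetical, not Metaglue's actual API): an agent serializes its state so that it can be restarted, or migrated to another machine, without losing what it has learned.

```python
# Minimal sketch of a checkpointing agent: state is serialized so a fresh
# process (possibly on another host) can resume where the agent left off.
import pickle

class Agent:
    def __init__(self, name):
        self.name = name
        self.state = {}

    def handle(self, key, value):
        # Process a message by updating local state.
        self.state[key] = value

    def checkpoint(self):
        # Serialize name and state for storage or transfer.
        return pickle.dumps((self.name, self.state))

    @classmethod
    def restore(cls, blob):
        # Recreate the agent from a checkpoint blob.
        name, state = pickle.loads(blob)
        agent = cls(name)
        agent.state = state
        return agent

office = Agent("lights")
office.handle("level", 0.7)
blob = office.checkpoint()      # e.g. written to disk before migration
resumed = Agent.restore(blob)   # recreated, perhaps on another machine
print(resumed.state["level"])   # 0.7
```

A production backbone adds discovery, messaging, and fault tolerance on top of this, but checkpoint-and-restore is what makes "always available" and "migratory" possible.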

Professor Randall Davis and his students have built a system which can understand sketches of mechanical systems and incorporate spoken annotations by the user. The system feeds the models into physical simulators. Since the user is sketching on a virtual whiteboard, the effect is that the "paper" they are using is intelligent and understands what the user is sketching. Professor Davis and his students are working to extend this idea to the domain of software development.

Dr. Howard Shrobe and his students have created a collaboration tool that provides support for meetings in intelligent offices. The system keeps track of decisions that are made in meetings, files them for later use, and provides relevant video clips from the meeting as an annotation for these decisions. In this way it automatically captures the rationale for decisions for later use.

Biologically Inspired Robots and Models

Dr. Cynthia Breazeal, Dr. Brian Scassellati, and other students of Professor Rodney Brooks extended the ideas of sociable robots. Along with Dr. Una-May O'Reilly, they delivered a new version of the robot Kismet to NTT. They developed new theories of social interaction and built models of language acquisition, animate and inanimate discrimination, gaze direction detection, and simple shared attention. All these systems were implemented on the robots Kismet and Cog.

Rodney A. Brooks

More information on the Artificial Intelligence Laboratory can be found online.
