MIT Reports to the President 1997-98

ARTIFICIAL INTELLIGENCE LABORATORY

The Artificial Intelligence Laboratory has reorganized into spatially adjacent groups built around common intellectual interests. We are using these larger groupings to pursue new research agendas and to become more involved in corporate-sponsored research. Much of the year was spent in conversations with NTT, which resulted in a five-year research collaboration agreement covering both the AI Lab and the Laboratory for Computer Science. We will pursue further industrial collaborations over the coming year.

Down in the basement, Professor Gill Pratt and his students are building new generations of walking robots, with applications in urban antiterrorism situations. At the same time, their fundamental work on understanding walking has led to a commercial collaboration with a manufacturer of leg prostheses. Soon, amputees throughout the United States will be walking around on artificial legs with intelligent knees, developed at the AI Lab. Next: ankles.

On the 4th floor, Professors Gerald Sussman and Hal Abelson are trying to make stuff, i.e. matter, smart. The fundamental idea is that as computing gets cheaper, we will be able to embed disposable computers in all manufactured material, even paint. Imagine walking up to a large wall with a can of paint riddled with tiny display elements and simply painting a huge high-resolution display with a standard paintbrush. The goal of the "amorphous computing project" is to develop languages and algorithms so that the display elements can self-organize to allow just this.

We are just completing renovations of the 7th floor to make a large new vision and learning environment. Professor Olivier Faugeras works on the fundamental geometry of computer vision. Professor Berthold Horn is studying the propagation of light through biological material so that he can build visible-light "x-ray" systems for medical imaging. Professor Eric Grimson is leading an active group in medical vision systems. Over the last few years, he has collaborated with Brigham & Women's Hospital, and the image-guided surgery systems they have developed are being used for brain surgery on real patients on a daily basis. These computer vision systems provide surgeons with real-time updates on where their instruments are inside the heads of patients, and relate that information to anatomical features automatically extracted from MRI scans. Professor Grimson works on a number of other vision systems as well; he and his students have developed new real-time surveillance systems that allow cameras monitoring human and traffic activities to learn what is normal in a given situation and to automatically notice when something strange is happening.

In collaboration with Professor Paul Viola, a new variable viewpoint reality system is being built. The ultimate idea is that a sports stadium, or the surface of Mars, might be decorated with 200 video cameras, and a viewer sitting at home gets to choose the viewpoint from which to watch the game, or the exploration, while the system synthesizes an accurate view from that chosen virtual viewpoint. Some NFL refs could have used this system recently! Professor Viola is also working on image database retrieval and on an engineer's workbench in which an engineer sketches, scribbles, and talks while designing, and the power of commercial CAD and analysis systems is brought to bear on the work.
Professor Viola has also been active in machine learning and has recently been joined by Professor Tommi Jaakkola, who brings great expertise in statistical learning techniques. Over the next couple of months, other faculty members are joining the 7th floor, including Professor Whitman Richards of Media Arts & Sciences, whose work has been in fundamental areas of human cognition; Professor Ted Adelson from Brain & Cognitive Sciences, who works in early vision; and Professor Steve Massaquoi who works on understanding the human cerebellum. Professor Tomas Lozano-Perez has recently taken up duties as Associate Department Head of Electrical Engineering and Computer Science, but manages to spend time on the 7th floor where he works in computational biology, applying robotics techniques to understanding the structure of proteins. Professor Tomaso Poggio is running an outpost of the AI Lab over in E25 where the research is centered around computer vision, graphics and machine learning.

On the 8th floor, Professor Patrick Winston is pursuing the fundamental nature of human memory and its coupling to the human sensorimotor system and the language faculty. Professor Bob Berwick is researching the relationship between evolutionary constraints and human language abilities. Dr. Boris Katz is extending his work on practical natural language systems, including the START system, which enables us to use English to query both databases and, more generally, the web. Professor Randall Davis, besides his strong interest in legal aspects of software intellectual property, is also working in collaboration with Dr. Howard Shrobe on a number of projects. These include systems that aid the design process in the mechanical and software domains, and systems that provide information access interfaces to large organizations; they have built a number of systems that are used on a daily basis by the White House and other government agencies for the dissemination of their information. Professor Lynn Stein is working on revamping the whole approach to teaching computer science using a process-based rather than algorithm-based model. She is also collaborating with Professor Lozano-Perez, Dr. Shrobe, and me on human-computer interaction. On the 8th floor, we have built both an intelligent room and an intelligent office, in which we invert the normal relationship between people and computers: we drag computers out into the physical world, where they must interact with humans going about their normal activities and provide them with computational support. This is in contrast to the standard model, in which people are drawn into the virtual world of the computer. Professor Tom Knight is also located on the 8th floor, but all his activity happens up on the 9th floor. Besides developing low-power reversible computers in silicon, Professor Knight has embarked on a radical new approach to computation, biology, and the manipulation of matter.
We have replaced our silicon clean room with a wet biology lab, where Professor Knight is inserting DNA into living E. coli cells, hijacking their natural mechanisms so that they will compute while maintaining and reproducing themselves. The ultimate idea is to couple this with the amorphous computing work on the 4th floor and have self-organizing living cells become molecular engineers that carry out manufacturing processes.

Dr. Ken Salisbury also straddles the 8th and 9th floors; his work includes planetary rovers for NASA's Jet Propulsion Laboratory and haptic interfaces. Haptic interfaces provide a person with a sense of touch and the ability to feel forces, masses, texture, friction, and temperature. This new mode of human-computer interface is enabling all sorts of new applications. Most recently, Dr. Salisbury and his students have developed remote laparoscopic surgical systems in which the surgeon controls tiny robotic manipulators inside a person. The surgeon gets visual feedback from cameras attached to optical fibers inserted in the patient and, more importantly, retains a sense of touch via the haptic interface. Also on the 9th floor, my research group and I are studying the fundamentals of human intelligence by building robots with human form that interact with people in human-like ways, learn in human-like ways, and develop just as humans do.

If you are interested in more information on our research, please look at our web site, http://www.ai.mit.edu, and in particular http://www.ai.mit.edu/lab/abstracts/1998

Rodney Brooks