
Robot wheelchair finds its own way

MIT invention responds to user's spoken commands
Nicholas Roy, left, assistant professor of aeronautics and astronautics, and Seth Teller, professor of computer science and engineering, stand next to the robotic wheelchair they co-designed, which can be navigated by vocal command.
Photo / Patrick Gillooly

MIT researchers are developing a new kind of autonomous wheelchair that can learn the locations within a building and then take its occupant to a requested place in response to a verbal command.

Video (MIT Tech TV): Demonstration of an MIT-designed wheelchair that responds to verbal commands. Courtesy of Nicholas Roy.

Just by saying "take me to the cafeteria" or "go to my room," the wheelchair user would be spared from controlling every twist and turn of the route and could simply sit back and relax as the chair moves from one place to another, guided by a map stored in its memory.

"It's a system that can learn and adapt to the user," says Nicholas Roy, assistant professor of aeronautics and astronautics and co-developer of the wheelchair. "People have different preferences and different ways of referring" to places and objects, he says, and the aim is to have each wheelchair personalized for its user and the user's environment.

Unlike other attempts to program wheelchairs or other mobile devices, which rely on an intensive process of manually capturing a detailed map of a building, the MIT system can learn about its environment in much the same way a person would: by being taken around once on a guided tour, with important places identified along the way. For example, as the wheelchair is pushed around a nursing home for the first time, the patient or a caregiver would say "this is my room," "here we are in the foyer," or "nurse's station."
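
To make the idea concrete, here is a minimal sketch in Python of how a guided tour could pair spoken labels with positions, and how a later command could be resolved against them. The class and function names, the coordinate representation, and the substring matching are illustrative assumptions, not the MIT system's actual design.

```python
# Minimal sketch of the "guided tour" idea: store a position for each spoken
# label during the tour, then match later commands against those labels.
# `label_current_place` and `destination_for` are hypothetical names.

class TourMap:
    def __init__(self):
        self.places = {}  # spoken label -> (x, y) position recorded during the tour

    def label_current_place(self, spoken_label, current_position):
        """Called when the user says e.g. 'this is my room' during the tour."""
        self.places[spoken_label.lower()] = current_position

    def destination_for(self, command):
        """Match a later command like 'take me to the cafeteria' to a stored place."""
        for label, position in self.places.items():
            if label in command.lower():
                return position
        return None


# Example usage: build the map during the tour, then resolve a command.
tour_map = TourMap()
tour_map.label_current_place("my room", (12.4, 3.1))
tour_map.label_current_place("cafeteria", (40.2, 18.7))

goal = tour_map.destination_for("Take me to the cafeteria")
print(goal)  # (40.2, 18.7) -- this goal would then be handed to the navigation system
```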

Also collaborating on the project are Bryan Reimer, a research scientist at MIT's AgeLab, and Seth Teller, professor of computer science and engineering and head of the Robotics, Vision, and Sensor Networks (RVSN) group at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL). Teller says the RVSN group is developing a variety of machines, of various sizes, that can have situational awareness, that is, that can "learn these mental maps, in order to help people do what they want to do, or do it for them." Besides the wheelchair, the devices range in scale from a location-aware cellphone all the way up to an industrial forklift that can transport large loads from place to place outdoors, autonomously.

Outdoors in the open, such systems can rely on GPS receivers to figure out where they are, but inside buildings that method usually doesn't work, so other approaches are needed. Roy and Teller have been exploring the use of WiFi signals, as well as wide-field cameras and laser rangefinders, coupled to computer systems that build an internal map of the environment and track their position within it as they move around.
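
As a rough illustration of the map-building half of that process, the sketch below marks laser beam endpoints on a grid. It assumes the chair's pose is already known (the real system has to estimate pose and map together), and the grid resolution and sensor model are made-up simplifications rather than details of the MIT system.

```python
# A minimal sketch of building an internal map from laser rangefinder data,
# assuming the chair's pose is known. Resolution and data format are assumptions.

import math

RESOLUTION = 0.1  # metres per grid cell (assumed)

def mark_scan(occupancy, pose, scan):
    """Mark the endpoint of each laser beam as an occupied grid cell.

    occupancy: set of (i, j) cells believed to contain obstacles
    pose: (x, y, heading) of the chair in metres / radians
    scan: list of (bearing, range) readings from the rangefinder
    """
    x, y, heading = pose
    for bearing, rng in scan:
        # Project the beam endpoint into world coordinates.
        wx = x + rng * math.cos(heading + bearing)
        wy = y + rng * math.sin(heading + bearing)
        occupancy.add((int(wx / RESOLUTION), int(wy / RESOLUTION)))

# Example: one scan taken from the origin, facing along the x-axis.
grid = set()
mark_scan(grid, (0.0, 0.0, 0.0), [(-0.5, 2.0), (0.0, 1.5), (0.5, 2.0)])
print(len(grid), "occupied cells")
```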

"I'm interested in having robots build and maintain a high-fidelity model of the world," says Teller, whose central research focus is developing machines that have situational awareness.

For now, the wheelchair prototype relies on a WiFi system to build its maps and then navigate through them, which requires setting up a network of WiFi nodes around the facility in advance. After months of preliminary tests on campus, the researchers have begun trials in a real nursing home environment with real patients, at the Boston Home in Dorchester, a facility where all of the nearly 100 patients have partial or substantial loss of muscle control and use wheelchairs.
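
One common way such a WiFi network can support indoor positioning is signal-strength fingerprinting: record the signal strengths seen at known places, then match new measurements against them. The sketch below shows that idea with a simple nearest-neighbour comparison; the access-point names, signal values, and matching rule are illustrative assumptions, and the MIT prototype's actual method may differ.

```python
# A minimal sketch of WiFi signal-strength fingerprinting for indoor
# localization. All names and numbers below are made up for illustration.

def closest_place(fingerprints, observed):
    """Return the stored place whose WiFi signature best matches `observed`.

    fingerprints: {place_name: {access_point: signal_strength_dBm}}
    observed: {access_point: signal_strength_dBm} measured right now
    """
    def distance(stored):
        shared = set(stored) & set(observed)
        if not shared:
            return float("inf")
        # Mean squared difference over access points seen in both signatures.
        return sum((stored[ap] - observed[ap]) ** 2 for ap in shared) / len(shared)

    return min(fingerprints, key=lambda place: distance(fingerprints[place]))


# Example usage with hypothetical access points and signal strengths.
fingerprints = {
    "nurse's station": {"ap-1": -42, "ap-2": -67, "ap-3": -80},
    "cafeteria":       {"ap-1": -75, "ap-2": -50, "ap-3": -58},
}
print(closest_place(fingerprints, {"ap-1": -73, "ap-2": -52, "ap-3": -60}))
# -> "cafeteria"
```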

As the research progresses, Roy says he'd like to add a collision-avoidance system that uses detectors to keep the chair from bumping into other wheelchairs, walls or other obstacles. In addition, Teller says he hopes to add mechanical arms to the chairs, to aid the patients further by picking up and manipulating objects: everything from flipping a light switch to picking up a cup and bringing it to the person's lips.

The research has been funded by Nokia and Microsoft.

A version of this article appeared in MIT Tech Talk on September 24, 2008.
