The Draper Small Autonomous Aerial Vehicle, or DSAAV, is a high-end radio-control helicopter that has been instrumented and computerized to fly autonomously. As a demonstration of its capabilities, the DSAAV team, led by Draper engineer Paul DeBitetto, entered the helicopter in the Sixth Annual International Aerial Robotics Contest held at Epcot Center, and won. Each entry's objective was to autonomously survey a field containing hazardous waste barrels and report the barrel locations to within one meter. The DSAAV located all five hazardous waste barrels on the contest field within the required accuracy.
A short video of one contest flight is available (MPEG format, 1.4 MB).
The sensor suite of the DSAAV includes GPS and an onboard video camera, among other sensors.
Thanks to Mike Bosse for a great picture!
The onboard computer is a PC/104-format 486 running at 50 MHz. It drives the helicopter's servos through a modified radio-control receiver designed and built by Draper engineer Bob Butler. The human test pilot can take over control of the helicopter by flipping a switch on his transmitter, causing the receiver to follow the test pilot's commanded servo positions rather than those of the onboard computer.
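Conceptually, the override works like a two-input multiplexer in front of the servos. The sketch below illustrates that selection logic in software; in the actual DSAAV the switching is done inside the modified receiver hardware, and the names here (ServoCommands, select_servo_commands) are invented for illustration only.

```python
# Illustrative sketch only: in the real DSAAV the arbitration is done in
# the modified radio-control receiver hardware, not in onboard software.
# All names here are hypothetical.

from typing import NamedTuple

class ServoCommands(NamedTuple):
    collective: float     # normalized servo positions, e.g. -1.0 .. 1.0
    cyclic_pitch: float
    cyclic_roll: float
    tail_rotor: float

def select_servo_commands(pilot_override: bool,
                          pilot: ServoCommands,
                          computer: ServoCommands) -> ServoCommands:
    """Follow the human test pilot's sticks whenever the transmitter
    override switch is flipped; otherwise pass the onboard computer's
    commands through to the servos."""
    return pilot if pilot_override else computer
```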
Eric Johnson, a Draper Simulation Laboratory employee, wrote the guidance and control laws for the helicopter. To test his guidance and control algorithms, he wrote a simulation of the helicopter based on a NASA model and implemented it using the laboratory's Simulation Framework. The simulation proved invaluable: Eric's control laws required only minor tuning at the field, and the majority of the control law development was completed without running the helicopter.
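As a rough illustration of this develop-in-simulation pattern, the sketch below closes a simple proportional-derivative altitude loop around a toy one-degree-of-freedom vertical model. It is not Eric's controller or the NASA helicopter model; the gains, constants, and function names are invented. The point is only that a control law can be exercised against a vehicle model long before it ever commands real servos.

```python
# Minimal closed-loop test pattern: a toy 1-DOF heave model standing in
# for the helicopter dynamics, with a PD altitude controller.
# Gains and model constants are illustrative, not the DSAAV values.

DT = 0.02          # integration step, seconds
MASS = 10.0        # vehicle mass, kg
GRAVITY = 9.81

def heave_model(z, vz, thrust):
    """Toy vertical dynamics: thrust up, gravity down (Euler integration)."""
    az = thrust / MASS - GRAVITY
    return z + vz * DT, vz + az * DT

def pd_altitude_controller(z_cmd, z, vz, kp=8.0, kd=6.0):
    """PD law producing a thrust command that settles at z_cmd."""
    return MASS * (GRAVITY + kp * (z_cmd - z) + kd * (0.0 - vz))

def run_sim(z_cmd=5.0, duration=10.0):
    z, vz = 0.0, 0.0
    for _ in range(int(duration / DT)):
        thrust = pd_altitude_controller(z_cmd, z, vz)
        z, vz = heave_model(z, vz, thrust)
    return z  # should end near z_cmd if the gains are sensible

if __name__ == "__main__":
    print("final altitude: %.2f m" % run_sim())
```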
Sample graphical output of the helicopter simulation is shown below. Another testament to the quality of the simulation is that Eric learned to fly radio control (RC) helicopters using his simulation. When he tried his hand at a real RC helicopter, he was able to fly it right away. Most beginning RC helicopter pilots must spend considerable time with the helicopter on the ground taking short hops before they are able to fly.
The onboard video camera is used by the DSAAV to detect objects on the ground below it. Images from the camera are transmitted to a Silicon Graphics workstation on the ground. Software written by Boston University student Michael Bosse analyzes the images one at a time for objects of interest and reports their locations. This information is then fed to a target-tracking Kalman filter written by MIT student Bill Hall, which intelligently combines the single-image information from all images seen during the mission, throwing away false returns and filling in consistent returns that were occasionally missed. Mike Bosse's single-frame recognition software is described on his web site; a paper on the multiple-frame filter is forthcoming.
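The multiple-frame idea can be pictured as a small bank of per-target tracks with gating: each single-image detection either updates the nearest existing track (a simple Kalman update for a stationary barrel) or starts a tentative new one, and only tracks that repeat across enough frames are reported. The sketch below is a hedged illustration of that pattern, not Bill Hall's actual filter; the thresholds and names are invented.

```python
# Illustrative multi-frame fusion for stationary ground targets.
# Each track keeps a position estimate and a scalar variance; a detection
# inside the gate updates the nearest track (scalar Kalman update),
# otherwise it starts a tentative new track.  Tracks seen in enough
# frames are confirmed; one-off detections are discarded as false returns.
# All names and thresholds are hypothetical.

import math

MEAS_VAR = 0.25      # single-image position variance, m^2 (assumed)
GATE_DIST = 2.0      # association gate, meters (assumed)
CONFIRM_HITS = 3     # frames required to confirm a target (assumed)

class Track:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.var = MEAS_VAR   # estimate variance
        self.hits = 1

    def update(self, x, y):
        """Scalar Kalman update for a stationary target."""
        k = self.var / (self.var + MEAS_VAR)
        self.x += k * (x - self.x)
        self.y += k * (y - self.y)
        self.var *= (1.0 - k)
        self.hits += 1

def process_frame(tracks, detections):
    """Associate one image's detections (x, y pairs) with existing tracks."""
    for (x, y) in detections:
        nearest = min(tracks, default=None,
                      key=lambda t: math.hypot(t.x - x, t.y - y))
        if nearest and math.hypot(nearest.x - x, nearest.y - y) < GATE_DIST:
            nearest.update(x, y)
        else:
            tracks.append(Track(x, y))

def confirmed(tracks):
    """Report only targets seen in enough frames."""
    return [(t.x, t.y) for t in tracks if t.hits >= CONFIRM_HITS]
```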
In the future, the team would like to use the images from the onboard camera as a third position-measurement sensor and as a terrain-mapping sensor. This would allow the vehicle to fly and/or land in areas of poor GPS coverage.
Another goal of the DSAAV project is to develop and demonstrate cooperative behavior among autonomous agents. We have developed algorithms that allow cooperating aerial vehicles to share an airspace without colliding, and we plan to extend the concept to ground-air cooperation.
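One way to picture the airspace-deconfliction problem is a pairwise separation check over the flight plans that cooperating vehicles share with each other, as in the toy sketch below. This is only an illustration of the concept; the Draper algorithms themselves are not described here, and the names and threshold are invented.

```python
# Toy pairwise separation check for cooperating vehicles that share
# their intended positions over time.  Purely illustrative.

import math

MIN_SEPARATION = 10.0   # meters (assumed)

def find_conflicts(plans):
    """plans: dict vehicle_id -> list of (x, y, z) positions, one entry
    per time step.  Returns (vehicle_a, vehicle_b, step) for every
    predicted loss of separation."""
    conflicts = []
    ids = sorted(plans)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            for step, (pa, pb) in enumerate(zip(plans[a], plans[b])):
                if math.dist(pa, pb) < MIN_SEPARATION:
                    conflicts.append((a, b, step))
    return conflicts
```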
Brought to you by
The Charles Stark Draper Laboratory
and The Massachusetts Institute of Technology