Drone’s Eye View introduces a novel way to turn any flat surface into an interactive interface. Fumbling with a phone or laptop to connect to a full-sized projector becomes a thing of the past: by removing the barriers of size and portability, Drone’s Eye View is a truly portable interface generator. Simply put, it is a small projector attached to a drone. By combining the portability of picoprojectors with the mobility of drone flight, the system lets a user project large interfaces onto flat surfaces. Even that doesn’t quite capture the power of the system: a Leap Motion gesture-recognition component allows the user not only to control the drone, but also to interact with the interface being projected onto the surface.
Our system is built around a drone with reflective markers attached to it. An external tracking system, OptiTrack, picks up these markers and tracks both the drone and the user. The user also wears a Leap Motion, a hardware sensor that tracks fingers and hands, and provides input by making mid-air hand gestures to control the application. Attached to the drone is a 3D-printed enclosure holding a picoprojector, a microcomputer, voltage-regulation and portable-power circuitry, a USB hub, a Bluetooth transmitter, a USB audio sound card, and a USB Wi-Fi dongle. These parts, connected appropriately, make up our system: Drone’s Eye View.
The application we use to demo this new interface is a music player. You walk over to your home theatre system, and the drone approaches you and projects the music app’s interface. With the swipe of a hand you can change songs, adjust the volume, and play or pause. The interface feels natural, and even when you move away from your speakers, the drone follows you to the next surface.
This is the splash screen for the music application. The second screen shows the song’s album cover while the song plays on the Bluetooth speakers.
Our system takes four custom gestures as input. Using these gestures, the user can control both the drone and the projected application.
Confirmation Gesture: This gesture is similar to pressing a big button on your wrist. Bring your hand down swiftly to trigger a confirmation for drone commands. Depending on the drone’s current state, this can mean take off, turn to face the user, move toward the wall, or land.
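Because a single confirmation gesture means different things in different flight phases, the drone effectively runs a small state machine. The sketch below is illustrative only: the state names, their ordering, and the command names are our assumptions, not taken from the actual implementation.

```python
# Hypothetical sketch of the drone command state machine: one
# "confirmation" gesture advances the drone through its flight phases.
# State and command names are illustrative assumptions.

# Command triggered by the confirmation gesture in each state.
NEXT_COMMAND = {
    "grounded": "takeoff",          # take off and hover
    "hovering": "face_user",        # turn to face the user
    "facing_user": "move_to_wall",  # fly toward the projection surface
    "at_wall": "land",              # land when done
}

# State reached once each command completes.
NEXT_STATE = {
    "takeoff": "hovering",
    "face_user": "facing_user",
    "move_to_wall": "at_wall",
    "land": "grounded",
}

def on_confirmation(state):
    """Return (command, new_state) for a confirmation gesture."""
    command = NEXT_COMMAND[state]
    return command, NEXT_STATE[command]
```

For example, a confirmation while grounded issues the takeoff command and moves the drone into the hovering state; the next confirmation turns it to face the user, and so on around the cycle.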
The swipe gesture is used to change songs: a quick swipe of your hand to the left or right switches the track.
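A left/right swipe can be detected from the hand’s lateral palm velocity, which the Leap Motion reports per frame. A minimal sketch, where the speed threshold and axis convention are our assumptions:

```python
SWIPE_SPEED = 600.0  # mm/s; illustrative threshold, tuned by hand

def detect_swipe(palm_velocity_x):
    """Classify a horizontal palm velocity (mm/s) as a swipe.

    Returns "next" for a fast rightward motion, "previous" for a
    fast leftward motion, and None when the hand moves too slowly
    to count as a swipe.
    """
    if palm_velocity_x > SWIPE_SPEED:
        return "next"
    if palm_velocity_x < -SWIPE_SPEED:
        return "previous"
    return None
```

Thresholding on speed keeps slow, incidental hand movement from skipping tracks.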
With your palm facing down, make a fist. This stops the current song, or plays it if it is paused.
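The fist gesture is a play/pause toggle. A sketch, assuming the Leap Motion’s per-hand grab strength (0.0 for an open hand up to 1.0 for a closed fist) is used to decide when the hand has closed; the threshold value is our assumption:

```python
FIST_THRESHOLD = 0.9  # grab strength above which the hand counts as a fist

def toggle_playback(grab_strength, is_playing):
    """Toggle play/pause when the hand closes into a fist.

    grab_strength: 0.0 (open hand) .. 1.0 (closed fist).
    Returns the new playing state.
    """
    if grab_strength >= FIST_THRESHOLD:
        return not is_playing
    return is_playing
```

In practice the gesture should fire once per fist, so the real system would also debounce: ignore further frames until the hand opens again.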
The volume can be controlled as if there were a slider in mid-air. Simply pick a point above the Leap Motion and pinch your index finger and thumb together. Then slowly raise your hand and the volume goes up; lower your hand and the volume goes down.
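The mid-air slider can be modelled as a mapping from the change in hand height (relative to where the pinch began) to a change in volume. A minimal sketch, assuming heights in millimetres above the sensor, a pinch-strength threshold, and an illustrative sensitivity:

```python
PINCH_THRESHOLD = 0.8      # pinch strength above which the slider is "grabbed"
MM_PER_FULL_RANGE = 200.0  # raising the hand 200 mm spans the whole volume range

def update_volume(start_volume, start_height, current_height, pinch_strength):
    """Return the new volume (0.0..1.0) for the pinch-slider gesture.

    start_volume:   volume when the pinch began
    start_height:   hand height (mm) when the pinch began
    current_height: current hand height (mm)
    pinch_strength: 0.0 (open) .. 1.0 (full pinch)

    While the user pinches, volume tracks hand height relative to the
    pinch's starting point; otherwise the volume is left unchanged.
    """
    if pinch_strength < PINCH_THRESHOLD:
        return start_volume  # slider not grabbed
    delta = (current_height - start_height) / MM_PER_FULL_RANGE
    return max(0.0, min(1.0, start_volume + delta))
```

Anchoring the mapping to the pinch’s starting height means the slider is relative: the user can grab it anywhere in the air rather than at one fixed absolute height.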
This application is one of many possibilities. We can envision several applications where portable, intuitive projection would be quite useful. Beyond mere portability, projection while in motion can be thought of as a new medium for consuming content and augmenting our environments.
Some possible applications are:
You can learn more about what's under the hood and how our system works: How It Works