The WallComputer is a new approach to time-based information interaction. It offers the ease of use and at-a-glance accessibility of devices such as calendars and clocks, while also providing a rich set of interactions generally found only on a general-purpose PC. The system accepts two input modalities: gesture and speech. For gesture input, I wrote a recognition engine designed for touch pads that achieves 99% accuracy with a minimal training set under real-world conditions. A contextual booster dynamically restricts the command set of both modalities to improve recognition accuracy. The recognition software is written in Python, and the rich user interface in ActionScript.
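The paper linked below details the actual engine; as an illustration only, here is a minimal template-matching recognizer in the spirit of classic touch-pad gesture matchers (resample the stroke, normalize position and scale, pick the nearest template). It deliberately keeps orientation sensitivity, since flick direction matters. All function names and the 32-point resolution are illustrative, not taken from the real system.

```python
import math

def resample(points, n=32):
    """Resample a stroke to n roughly evenly spaced points along its path."""
    dists = [math.dist(points[i - 1], points[i]) for i in range(1, len(points))]
    total = sum(dists)
    if total == 0:
        return [points[0]] * n
    interval = total / (n - 1)
    resampled = [points[0]]
    accumulated = 0.0
    pts = list(points)
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and accumulated + d >= interval:
            # interpolate a new point exactly one interval along the path
            t = (interval - accumulated) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)  # continue measuring from the interpolated point
            accumulated = 0.0
        else:
            accumulated += d
        i += 1
    while len(resampled) < n:  # guard against floating-point shortfall
        resampled.append(pts[-1])
    return resampled[:n]

def normalize(points):
    """Translate to the centroid, then scale uniformly by the larger dimension."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    shifted = [(x - cx, y - cy) for x, y in points]
    w = max(abs(x) for x, _ in shifted) or 1.0
    h = max(abs(y) for _, y in shifted) or 1.0
    s = max(w, h)
    return [(x / s, y / s) for x, y in shifted]

def path_distance(a, b):
    """Average point-to-point distance between two equal-length strokes."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(stroke, templates):
    """Return the name of the template closest to the input stroke."""
    query = normalize(resample(stroke))
    best_name, best_d = None, float("inf")
    for name, pts in templates.items():
        d = path_distance(query, normalize(resample(pts)))
        if d < best_d:
            best_name, best_d = name, d
    return best_name
```

Because templates are normalized the same way as the query, a single recorded example per gesture is enough to start matching, which is consistent with the minimal-training-set goal.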
The system has multiple modes: news, weather, calendar, stocks, and automation system control.
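One way the contextual booster could work, sketched below under assumed names: each mode exposes only the commands that make sense in that context, and recognizer scores for out-of-context commands are suppressed before a winner is chosen. The mode names come from the list above; the command vocabularies and the `boost`/`pick` functions are hypothetical.

```python
# Hypothetical per-mode command vocabularies (illustrative, not the real set).
MODE_COMMANDS = {
    "news": {"next", "previous", "weather", "calendar", "stocks", "automation"},
    "automation": {"lights on", "lights off", "news", "weather", "calendar", "stocks"},
}

def boost(scores, mode, penalty=0.0):
    """Suppress (or down-weight) commands that are invalid in the current mode."""
    allowed = MODE_COMMANDS[mode]
    return {cmd: (s if cmd in allowed else s * penalty)
            for cmd, s in scores.items()}

def pick(scores, mode):
    """Choose the highest-scoring command after contextual boosting."""
    boosted = boost(scores, mode)
    return max(boosted, key=boosted.get)
```

Restricting the active vocabulary this way shrinks the set of confusable commands, which is how a context filter can raise recognition accuracy for both speech and gesture.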
The assembled WallComputer is an LCD screen mounted behind a round, waterjet-cut steel frame; it can be either embedded in a wall or hung from it. The picture above shows the software in News mode. Saying "next" or "previous," or performing a spin gesture (think iPod scroll wheel) on the gesture pad (the small circular grey pad at right), steps forward or back through the news stories. Flick gestures and other voice commands switch the system to the other modes.
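A spin gesture on a circular pad can be turned into discrete scroll steps by accumulating the signed change in angle around the pad's center. The sketch below is a minimal version of that idea; the function name, the 30-degree step size, and the sign convention (counterclockwise positive, with the y-axis pointing up) are assumptions, not details of the real system.

```python
import math

def spin_steps(points, center=(0.0, 0.0), step=math.radians(30)):
    """Convert a circular drag into +1/-1 scroll steps, iPod-style.

    Accumulates the signed angular sweep around `center` and emits one
    step each time the finger covers `step` radians in either direction.
    """
    angles = [math.atan2(y - center[1], x - center[0]) for x, y in points]
    acc, steps = 0.0, []
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # unwrap jumps across the +/- pi boundary
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        acc += d
        while acc >= step:
            steps.append(+1)   # counterclockwise in math axes (y up)
            acc -= step
        while acc <= -step:
            steps.append(-1)
            acc += step
    return steps
```

The unwrapping step matters: without it, a finger crossing the pad's "seam" at 180 degrees would register as a nearly full spin in the opposite direction.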
You can find a presentation on the WallComputer here.
You can find an academic paper on the WallComputer here. The paper goes into further detail about implementation and performance.