Thursday, March 12, 2009

SixthSense Wearable Device by MIT Media Lab

At the MIT Media Lab's new Fluid Interfaces Group, Pattie Maes researches the tools we use to work with information and to connect with one another. She works with Pranav Mistry, the genius behind SixthSense, a wearable device that enables new interactions between the real world and the world of data. Their SixthSense demo was the buzz of this year's TED conference. It's a wearable device with a projector that paves the way for profound interaction with our environment. Imagine "Minority Report" and then some.

We've evolved over millions of years to sense the world around us. When we encounter something, someone, or some place, we use our five natural senses to perceive information about it; that information helps us make decisions and choose the right actions to take. But arguably the most useful information is not naturally perceivable with our five senses, namely the data, information, and knowledge that mankind has accumulated about everything, more and more of which is available online. Although the miniaturization of computing devices allows us to carry computers in our pockets, there is no link between our digital devices and our interactions with the physical world. SixthSense bridges this gap, bringing intangible, digital information out into the tangible world and allowing us to interact with it via natural hand gestures. SixthSense frees information from its confines by seamlessly integrating it with reality, thus making the entire world your computer.

The SixthSense prototype consists of a pocket projector, a mirror and a camera, coupled in a pendant-like wearable device. Both the projector and the camera are connected to a mobile computing device in the user's pocket. The projector projects visual information, enabling surfaces, walls and physical objects around us to be used as interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision-based techniques. The software processes the video stream captured by the camera and tracks the locations of the colored markers at the tips of the user's fingers. The movements and arrangements of these markers are interpreted as gestures that act as interaction instructions for the projected application interfaces.
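To make the tracking step concrete, here is a minimal sketch of the color-marker idea in Python. The marker color range, frame format, and function name are illustrative assumptions, not details of the actual SixthSense software, which uses a full computer-vision pipeline on live video.

```python
# Hypothetical sketch: locate a colored fingertip marker in one video frame.
# A frame is modeled as a list of rows, each row a list of (r, g, b) tuples.
# The color bounds below are invented for illustration.

RED_MIN = (180, 0, 0)    # lower RGB bound for a "red" fingertip marker
RED_MAX = (255, 80, 80)  # upper RGB bound

def find_marker(frame):
    """Return the (x, y) centroid of pixels matching the marker color,
    or None if the marker is not visible in the frame."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if (RED_MIN[0] <= r <= RED_MAX[0] and
                    RED_MIN[1] <= g <= RED_MAX[1] and
                    RED_MIN[2] <= b <= RED_MAX[2]):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

Running this per frame yields a stream of fingertip positions; comparing positions across frames is what turns raw pixels into gestures.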

The SixthSense prototype implements several applications that demonstrate the usefulness, viability and flexibility of the system. The map application lets the user navigate a map displayed on a nearby surface using hand gestures similar to those supported by multi-touch systems, zooming in, zooming out or panning with intuitive hand movements. The drawing application lets the user draw on any surface by tracking the movements of the user's index fingertip. SixthSense also recognizes the user's freehand gestures. For example, the system implements a gestural camera that takes a photo of the scene the user is looking at whenever it detects the 'framing' gesture; the user can then stop at any surface or wall and flick through the photos he or she has taken. SixthSense also lets the user draw icons or symbols in the air with the index finger and recognizes those symbols as interaction instructions. For example, drawing a magnifying-glass symbol takes the user to the map application, while drawing an '@' symbol lets the user check e-mail. The system also augments physical objects the user is interacting with by projecting further information about them onto their surfaces. For example, a newspaper can show live video news, or dynamic information can be displayed on a regular piece of paper. The gesture of drawing a circle on the user's wrist projects an analog watch.
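The multi-touch-style map gestures above can be sketched as well. This is a hypothetical illustration, assuming two fingertip marker positions per frame (as produced by a tracker like the one described earlier); the threshold value and names are invented, not part of the real system.

```python
import math

# Hypothetical sketch: interpret two fingertip-marker positions across
# consecutive frames as a pinch-zoom gesture for the projected map.

def classify_pinch(prev_pair, curr_pair, threshold=5.0):
    """Compare fingertip spacing between frames: 'zoom_in' if the
    fingertips spread apart, 'zoom_out' if they close together,
    'none' if the change is below the threshold."""
    def dist(pair):
        (x1, y1), (x2, y2) = pair
        return math.hypot(x2 - x1, y2 - y1)

    delta = dist(curr_pair) - dist(prev_pair)
    if delta > threshold:
        return "zoom_in"
    if delta < -threshold:
        return "zoom_out"
    return "none"
```

Symbol recognition (the magnifying glass or '@') works on the same principle, matching a whole trajectory of tracked fingertip positions against stored templates rather than a single frame-to-frame change.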

The current prototype system costs approximately $350 to build.

The innovative SixthSense project demonstrates that the computer is no longer a distinct object, but a source of intelligence embedded in our environment. By outfitting ourselves with digital accessories, we can continually learn from (and teach) our surroundings. The uses of this tech -- from healthcare to home furnishings, warfare to supermarkets -- are powerful and increasingly real.
