Computing with a wave of the hand (w/ Video)

Friday, December 11, 2009 - 13:42 in Mathematics & Economics

(PhysOrg.com) -- The iPhone's familiar touch screen display uses capacitive sensing, where the proximity of a finger changes the electric field between electrodes embedded in the screen. A competing approach, which uses embedded optical sensors to track the movement of the user's fingers, is just now coming to market. But researchers at MIT's Media Lab have already figured out how to use such sensors to turn displays into giant lensless cameras. On Dec. 19 at Siggraph Asia -- a recent spinoff of Siggraph, the premier graphics research conference -- the MIT team is presenting the first application of its work, a display that lets users manipulate on-screen images using hand gestures.
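
To make the optical-sensing idea concrete, here is a minimal illustrative sketch (not the MIT team's actual algorithm): assume a display whose embedded light sensors can be read out as a low-resolution 2D image, so a hand hovering over the screen darkens a patch of sensors, and its position can be estimated as the centroid of that shadowed region. The function name, grid size, and threshold below are hypothetical assumptions for illustration.

    # Toy sketch, assuming an array of embedded optical sensors readable as a
    # 2D image. This is NOT the researchers' method, only an illustration of
    # tracking a hand's position from per-pixel light readings.
    import numpy as np

    def track_hand(sensor_frame, ambient, threshold=0.2):
        """Return the (row, col) centroid of the region shadowed by a hand,
        or None if nothing is detected.

        sensor_frame -- 2D array of current readings from the in-pixel sensors
        ambient      -- 2D baseline readings taken with nothing over the screen
        threshold    -- minimum fractional drop in light to count as "covered"
        """
        # Fractional drop in light at each sensor relative to the baseline.
        drop = (ambient - sensor_frame) / np.maximum(ambient, 1e-6)
        covered = drop > threshold
        if not covered.any():
            return None
        rows, cols = np.nonzero(covered)
        return rows.mean(), cols.mean()

    # Usage: simulate a 32x32 sensor grid with a hand shadow near the top left.
    ambient = np.full((32, 32), 1.0)
    frame = ambient.copy()
    frame[4:12, 6:14] *= 0.3           # the hand blocks ~70% of the light here
    print(track_hand(frame, ambient))  # approximately (7.5, 9.5)

Tracking the centroid frame to frame would give a simple hover cursor; the MIT display goes further, using the sensor array as a lensless camera to recover richer information about the hand above the screen.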

Read the whole article on Physorg
