Skinput turns your arm into a touchscreen

March 1, 2010 | Source: Physorg.com

Researchers at Carnegie Mellon University and Microsoft Research have developed a new skin-based interface called Skinput that allows the hands and forearms to be used as touchscreens.

Skinput works by detecting the distinct ultralow-frequency sounds produced when different parts of the skin are tapped, allowing users to control audio devices, play games, make phone calls, and navigate hierarchical browsing systems.
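The article doesn't describe the signal processing involved, but the idea of reducing a tap's low-frequency acoustic signature to a handful of numbers can be sketched roughly as follows. This is a minimal illustration, not the researchers' code; the sampling rate, frequency bands, and function names are assumptions.

```python
# Minimal sketch: summarize a tap recording as energy in low-frequency bands.
# Sample rate and band edges are illustrative assumptions, not Skinput's values.
import numpy as np

SAMPLE_RATE_HZ = 5500                        # assumed sensor sampling rate
BAND_EDGES_HZ = [0, 25, 50, 100, 200, 400]   # assumed low-frequency bands


def tap_features(signal: np.ndarray) -> np.ndarray:
    """Return the energy in each low-frequency band of a single tap recording."""
    windowed = signal * np.hanning(len(signal))          # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / SAMPLE_RATE_HZ)
    features = []
    for lo, hi in zip(BAND_EDGES_HZ[:-1], BAND_EDGES_HZ[1:]):
        band = spectrum[(freqs >= lo) & (freqs < hi)]
        features.append(float(np.sum(band ** 2)))        # energy in this band
    return np.asarray(features)
```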

A keyboard, menu, or other graphics are beamed onto the user’s palm and forearm from a pico projector embedded in an armband. An acoustic detector in the armband then determines which part of the display the user has touched. Variations in bone density, size, and mass, as well as filtering effects from soft tissues and joints, make different skin locations acoustically distinct. The software matches sound frequencies to specific skin locations, allowing the system to determine which “skin button” the user pressed.
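Matching an acoustic profile to a skin location is essentially a classification problem: the system is calibrated with labeled example taps, then predicts the location of each new tap. A minimal sketch of that idea, assuming features like those above and a scikit-learn classifier (the classifier choice and button names here are illustrative assumptions, not necessarily what the researchers used):

```python
# Minimal sketch: map a tap's acoustic features to a trained "skin button".
from sklearn.svm import SVC

# Hypothetical skin locations collected during a calibration session.
SKIN_BUTTONS = ["palm", "wrist", "mid_forearm", "upper_forearm", "thumb"]


def train_skin_classifier(training_features, training_labels):
    """Fit a classifier on labeled tap recordings gathered during calibration."""
    clf = SVC(kernel="rbf")
    clf.fit(training_features, training_labels)
    return clf


def locate_tap(clf, features):
    """Predict which skin button a new tap landed on."""
    return clf.predict([features])[0]
```

In use, each tap detected by the armband would be converted to features and passed to locate_tap, and the predicted button would trigger the corresponding action in the projected interface.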