Turn Your Body Into A Touchscreen

As tech-savvy consumers fresh out of the aughts, most of us are unlikely to be impressed by yet another touch interface. But researchers from Microsoft and Carnegie Mellon University have something remarkable up their sleeve: Skinput, a bio-acoustic sensing technology that lets the body itself serve as a large finger-input surface, with no electronics on the hands or fingers.

The average adult has roughly two square meters of skin, yet it has probably never occurred to you that all that surface area could be used to control the devices you carry around. Chris Harrison, a PhD student in the Human-Computer Interaction Institute at Carnegie Mellon and one of Skinput's primary researchers, believes the easy accessibility of our limbs makes them well suited to exactly that kind of interaction.

When a finger taps the skin, the impact generates acoustic signals that travel along the skin's surface and through the underlying tissue and bone. To capture these signals, Harrison and his team built an armband housing an array of bio-acoustic sensors, paired with software that listens for impacts and classifies where on the body they occurred. Variations in bone density, size, and mass, along with the filtering effects of soft tissue and joints, make different body locations acoustically distinct, which is what lets the software tell one tap site from another; the current Skinput prototype gathers signals from a user's arms and hands.
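
To make that pipeline a little more concrete, here is a minimal sketch of how tap classification could work in principle. It is not the researchers' implementation; the feature choices, the sensor data format, and the use of a support-vector machine are illustrative assumptions.

    import numpy as np
    from sklearn.svm import SVC

    # Each tap arrives as a short multi-channel waveform from the armband's sensors.
    # We reduce every channel to two crude features (peak amplitude and total energy)
    # and train a classifier to map those features to the body location that was tapped.

    def extract_features(tap):
        """tap: array of shape (n_channels, n_samples) for a single impact."""
        peaks = np.max(np.abs(tap), axis=1)                  # strongest excursion per channel
        energies = np.sum(tap.astype(float) ** 2, axis=1)    # total energy per channel
        return np.concatenate([peaks, energies])

    def train_classifier(training_taps, locations):
        """training_taps: list of per-tap waveform arrays;
        locations: their labels, e.g. "thumb", "index", "forearm"."""
        X = np.array([extract_features(t) for t in training_taps])
        clf = SVC(kernel="linear")
        clf.fit(X, locations)
        return clf

    def classify_tap(clf, tap):
        """Return the predicted body location for a newly observed tap."""
        return clf.predict([extract_features(tap)])[0]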

Just how accurate can tapping on your arm be? “In our user study, we evaluated several input location sets, which demonstrated our approach could achieve accuracy as high as 95.5 percent for five locations, a sufficient number of buttons for many mobile interactions,” said Harrison. An audio player or portable PC, for example, could be controlled simply by tapping various fingers together.
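
As a purely hypothetical illustration of that idea, five recognizable tap locations could be mapped directly to player commands; the location names and the player interface below are invented for the example, not part of Skinput.

    # Illustrative only: map classified tap locations to audio-player actions.
    TAP_COMMANDS = {
        "thumb": "play_pause",
        "index": "next_track",
        "middle": "previous_track",
        "ring": "volume_up",
        "pinky": "volume_down",
    }

    def handle_tap(player, location):
        """Dispatch a classified tap to the matching action on a player object
        that exposes methods named after the commands above."""
        command = TAP_COMMANDS.get(location)
        if command is not None:
            getattr(player, command)()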

Skinput can also turn your body into a touchscreen of sorts. Harrison's team outfitted its prototype with a MicroVision pico projector that beams a graphical interface onto the user's forearm, turning patches of skin into virtual buttons that respond to taps in real time.

While this technology is at least ten years away from commercialization, its successful implementation could change the way we interact with the web and our data.