
Electromagnetic tech turns walls into touch-sensitive inputs

May 6, 2011 — by LinuxDevices Staff — from the LinuxDevices Archive

Researchers at Microsoft Research claim to have come up with a way to turn any wall into a touch-sensitive surface. The technology, which relies on the body's ability to act as an antenna for ambient electromagnetic radiation, could provide a new way for devices to be controlled from anywhere in the house, according to a report in MIT's Technology Review.

As the May 3 report by Kate Greene notes, the ambient electromagnetic radiation emitted by the electrical wiring inside walls is usually considered just noise. But a Microsoft Research staffer and three colleagues have harnessed it as a way to control devices, the story says.

Microsoft researcher Desney Tan (along with Shwetak Patel, a University of Washington computer science professor, plus Dan Morris and Gabe Cohn, whose affiliations weren't noted) is set to present the paper "Your Noise is My Command: Sensing Gestures Using the Body as an Antenna" at next week's ACM CHI Conference on Human Factors in Computing Systems. In it, they'll describe how a body can turn electromagnetic noise into a usable signal for a gesture-based interface, Greene writes.

A remote control you can't lose

When a person touches a wall with electrical wiring behind it, he or she becomes an antenna that tunes the background radiation, producing a distinct electrical signal that depends on proximity to and location on the wall, according to the story. For example, a touch on a particular spot on the wall behind a couch could be recognized and used to perform an operation such as turning down the volume on a stereo.
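As a rough illustration of that last step, here is a minimal Python sketch of how a recognized touch location might be dispatched to a household command, along the lines of the stereo example; the location labels and the send_command() helper are hypothetical, invented for illustration rather than taken from the researchers' system.

    # Hypothetical dispatcher: maps a touch location reported by a gesture
    # classifier to a device command. Labels and transport are illustrative
    # assumptions, not part of the published system.
    GESTURE_COMMANDS = {
        "couch_wall_left": ("stereo", "volume_down"),
        "couch_wall_right": ("stereo", "volume_up"),
        "hallway_panel": ("lights", "toggle"),
    }

    def send_command(device: str, action: str) -> None:
        # Stand-in for whatever transport a real setup would use
        # (network message, IR blaster, etc.).
        print(f"-> {device}: {action}")

    def handle_touch(location_label: str) -> None:
        # Dispatch the command associated with a recognized touch spot.
        command = GESTURE_COMMANDS.get(location_label)
        if command is not None:
            send_command(*command)

    handle_touch("couch_wall_left")   # prints "-> stereo: volume_down"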


[Image: Turning a wall into a remote control that can't get lost. Source: Microsoft Research via MIT Technology Review]

According to the Technology Review report, test subjects wore grounding straps that were connected to analog-to-digital converters. The resulting data was then fed to backpack-worn laptops (above) for processing via machine-learning algorithms, the story adds.
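For a sense of what that processing chain looks like in code, the Python sketch below reduces digitized samples from a body-worn electrode to spectral features and hands them to a machine-learning classifier; the window handling, feature choice, and SVM model are assumptions made for illustration, not details drawn from the paper.

    # Minimal sketch of the described pipeline: ADC samples -> spectral
    # features -> trained classifier. All parameters are illustrative.
    import numpy as np
    from sklearn.svm import SVC

    def spectral_features(window: np.ndarray) -> np.ndarray:
        # Magnitude spectrum of one fixed-length window (e.g. ~1,000 samples)
        # of the body-coupled signal; power-line noise and its harmonics
        # dominate, and their relative strengths shift with where (and
        # whether) the wall is touched.
        spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
        return spectrum / (np.linalg.norm(spectrum) + 1e-9)

    def train(recordings: list[tuple[np.ndarray, str]]) -> SVC:
        # Train on windows recorded while touching labeled spots on the wall.
        X = np.array([spectral_features(w) for w, _ in recordings])
        y = [label for _, label in recordings]
        model = SVC(kernel="linear")
        model.fit(X, y)
        return model

    def classify(model: SVC, window: np.ndarray) -> str:
        # Each new window yields a predicted touch-location label, which a
        # dispatcher like handle_touch() above could act on.
        return model.predict([spectral_features(window)])[0]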

"Now we can turn any arbitrary wall surface into a touch-input surface," Patel is quoted as saying. The next steps, he's said to have noted, are making the data analysis real-time and to make the system smaller: Ultimately, a smartphone or a watch would be used to analyze inputs and transmit the resulting control data to other devices.

Greene quotes several professors in MIT's Media Lab as saying the body-as-antenna gesture interface shows promise, but a couple of potential stumbling blocks are noted. One is that walls might well require stickers or other markings so users remember where to touch; another is that the signals produced by touches could vary depending on exactly how a person wears the device that's collecting them.

Jonathan Angel can be followed at www.twitter.com/gadgetsense.




