Chris Harrison at Carnegie Mellon University, together with Dan Morris and Desney Tan at Microsoft's research lab in Redmond, Washington, recently unveiled their latest invention, Skinput: a skin-based interface that lets a person use his or her palm and forearm as a touchscreen.
Skinput can be used to play games, control various devices, make phone calls and browse the Internet. A keyboard, menus and a number of other graphics appear on the user's palm and forearm, generated by a pico projector incorporated in an armband.
When the user touches a certain point on his or her palm, an acoustic detector in the armband identifies which spot was activated and performs the corresponding action. The researchers explain that differences in bone density, size and mass, along with the filtering effects of a person's soft tissues and joints, give different locations on the skin distinct acoustic signatures. The acoustic detector is able to distinguish five skin locations with an accuracy of about 95.5 percent.
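To make the idea concrete, the sketch below illustrates the kind of classification step such a system could perform: tap locations are told apart by training a support vector machine on acoustic feature vectors. The synthetic data, feature dimensions and parameters here are invented for illustration and are not the researchers' actual pipeline or dataset.

```python
# Minimal sketch of tap-location classification from acoustic features.
# Assumption: each skin location produces a characteristic "signature"
# (here a Gaussian cluster), standing in for the bone/soft-tissue
# filtering effects described above. All numbers are made up.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

N_LOCATIONS = 5        # five tap locations on the palm/forearm
SAMPLES_PER_LOC = 200  # simulated taps per location
N_FEATURES = 12        # e.g., band-pass amplitudes per sensor channel (assumed)

# One characteristic acoustic signature (cluster centre) per location.
centres = rng.normal(size=(N_LOCATIONS, N_FEATURES))
X = np.vstack(
    [c + 0.3 * rng.normal(size=(SAMPLES_PER_LOC, N_FEATURES)) for c in centres]
)
y = np.repeat(np.arange(N_LOCATIONS), SAMPLES_PER_LOC)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# A simple SVM classifier separates the five locations.
clf = SVC(kernel="rbf")
clf.fit(X_train, y_train)

print(f"Tap-location accuracy: {accuracy_score(y_test, clf.predict(X_test)):.1%}")
```

On cleanly separated synthetic clusters like these, the classifier reaches high accuracy; the interesting engineering in the real system lies in extracting features robust enough from noisy bio-acoustic signals to hit the reported figure.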
Using wireless technology, the system can relay its signals to a cell phone, iPod or computer. It was tested by 20 volunteers, who responded positively to the device and to the fast navigation it provides.
The researchers look forward to presenting their invention in April at the Computer-Human Interaction conference.