Chris Harrison (1,2), Desney Tan (2), Dan Morris (2)

1 Human-Computer Interaction Institute, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213
chris.harrison@cs.cmu.edu

2 Microsoft Research, One Microsoft Way, Redmond, WA 98052
{desney, dan}@microsoft.com

ABSTRACT
We present Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. In particular, we resolve the location of finger taps on the arm and hand by analyzing mechanical vibrations that propagate through the body. We collect these signals using a novel array of sensors worn as an armband.

INTRODUCTION
In a mobile context, users are unlikely to want to carry appropriated surfaces with them (at this point, one might as well just have a larger device). However, there is one surface that has been previously overlooked as an input canvas, and one that happens to always travel with us: our skin.

Appropriating the human body as an input device is appealing not only because we have roughly two square meters of external surface area, but also because much of it is easily accessible by our hands (e.g., arms, upper legs, torso). Furthermore, proprioception, our sense of how our body is configured in three-dimensional space, allows us to accurately interact with our bodies in an eyes-free manner. For example, we can readily flick each of our fingers, touch the tip of our nose, and clap our hands together without visual assistance. Few external input devices can claim this accurate, eyes-free input characteristic and provide such a large interaction area.

In this paper, we present our work on Skinput, a method that allows the body to be appropriated for finger input using a novel, non-invasive, wearable bio-acoustic sensor. The contributions of this paper are:

1) We describe the design of a novel, wearable sensor for bio-acoustic signal acquisition (Figure 1).
2) We describe an analysis approach that enables our system to resolve the location of finger taps on the body.
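To make the second contribution concrete before the full system description, the following is a minimal, purely illustrative sketch of resolving tap location from multi-channel vibration signals: it computes per-channel frequency-band energies from each tap window and trains a generic SVM classifier (scikit-learn). The sampling rate, channel count, window length, feature choice, and classifier are assumptions for illustration only, not the authors' implementation.

# Illustrative sketch only: classify finger-tap location from multi-channel
# bio-acoustic (vibration) tap windows. Channel count, window length, feature
# choice (per-channel FFT band energies), and the SVM classifier are
# assumptions for illustration, not the authors' implementation.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 5500          # assumed sampling rate (Hz)
N_CHANNELS = 10    # assumed number of sensor channels in the armband
WINDOW = 512       # assumed samples per tap window

def band_energy_features(window: np.ndarray, n_bands: int = 8) -> np.ndarray:
    """Per-channel signal energy in n_bands equal-width frequency bands."""
    feats = []
    for ch in window:                          # window shape: (N_CHANNELS, WINDOW)
        spectrum = np.abs(np.fft.rfft(ch)) ** 2
        for band in np.array_split(spectrum, n_bands):
            feats.append(band.sum())
    return np.asarray(feats)

def train_tap_classifier(windows, labels):
    """windows: iterable of (N_CHANNELS, WINDOW) arrays; labels: tap locations."""
    X = np.stack([band_energy_features(w) for w in windows])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
    clf.fit(X, labels)
    return clf

if __name__ == "__main__":
    # Synthetic stand-in data for recorded tap windows, just to exercise the code.
    rng = np.random.default_rng(0)
    locations = ["wrist", "forearm", "elbow"]
    windows = [rng.standard_normal((N_CHANNELS, WINDOW)) for _ in range(60)]
    labels = [locations[i % 3] for i in range(60)]
    clf = train_tap_classifier(windows, labels)
    print(clf.predict([band_energy_features(windows[0])]))

In practice, the useful features and classifier would be dictated by the characteristics of the bio-acoustic signals and sensor design described in the remainder of the paper.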