COLLOQUIUM REPORT
ON
FINGER TRACKING IN REAL-TIME HUMAN COMPUTER INTERACTION
In partial fulfillment of the requirements of MASTER OF COMPUTER APPLICATION
SUBMITTED BY
DEVENDRA SHARMA (1006814016)
2012 ‐ 2013
Under the supervision of
Mr. RUBAN AGGARWAL
CERTIFICATE
ABSTRACT
For a long time research on human-computer interaction (HCI) has been restricted to techniques based on the use of monitor, keyboard and mouse. Recently this paradigm has changed. Techniques such as vision, sound, speech recognition, projective displays and location aware devices allow for a much richer, multi-modal interaction between man and machine.
Finger tracking is the use of the bare hand to operate a computer, making human-computer interaction faster and easier.
Fingertip finding deals with extracting information from hand features and positions. In this method, the position and direction of the fingers are used to obtain the required segmented region of interest.
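The fingertip-finding step described above can be sketched in code. The following is a minimal, illustrative NumPy-only sketch, not the report's actual implementation: it segments a rough skin-coloured region with per-channel RGB thresholds (the threshold values are assumptions; practical systems usually threshold in HSV or YCrCb colour space) and takes the topmost skin pixel as the fingertip, assuming the hand enters the frame from the bottom.

```python
import numpy as np

def segment_skin(frame_rgb, lo=(90, 40, 20), hi=(255, 200, 170)):
    """Very rough skin segmentation by per-channel RGB thresholds.

    The threshold values are illustrative assumptions; real systems
    typically convert to HSV or YCrCb before thresholding.
    """
    lo = np.array(lo)
    hi = np.array(hi)
    # A pixel is "skin" if every channel falls inside its range.
    return np.all((frame_rgb >= lo) & (frame_rgb <= hi), axis=-1)

def find_fingertip(mask):
    """Take the topmost (smallest row index) skin pixel as the fingertip.

    Assumes the hand enters from the bottom of the frame, so an
    extended finger is the highest skin region in the image.
    """
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None  # no skin region detected in this frame
    i = np.argmin(ys)
    return int(xs[i]), int(ys[i])  # (x, y) in image coordinates
```

A contour-based detector (e.g. convex-hull defects in OpenCV) would be more robust, but the topmost-pixel heuristic captures the idea of reducing the segmented region of interest to a single fingertip coordinate.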
The system is able to track finger movement without building a 3D model of the hand. The coordinates and movement of the finger in a live video feed can be used as the coordinates and movement of the mouse pointer for human-computer interaction.
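Mapping the fingertip position in the camera frame to a cursor position on the screen is essentially a scaling step. The sketch below is a hedged illustration under assumed frame and screen sizes (640x480 and 1920x1080 are placeholders, not values from the report); the `mirror` flag flips the x axis so the cursor follows the finger like a mirror image, which feels natural when the camera faces the user.

```python
def to_screen(finger_xy, frame_size=(640, 480),
              screen_size=(1920, 1080), mirror=True):
    """Map a fingertip position in the camera frame to cursor coordinates.

    frame_size and screen_size are illustrative defaults, not values
    taken from the report.
    """
    fx, fy = finger_xy
    fw, fh = frame_size
    sw, sh = screen_size
    if mirror:
        fx = fw - 1 - fx  # flip x so motion matches a mirror view
    x = int(fx * sw / fw)
    y = int(fy * sh / fh)
    # Clamp so noisy detections never move the cursor off-screen.
    return min(max(x, 0), sw - 1), min(max(y, 0), sh - 1)
```

In a real pipeline the returned coordinates would be handed to an OS-level cursor API, usually after smoothing (e.g. an exponential moving average) to suppress detection jitter.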
In this report we discuss what human-computer interaction (HCI) is, how this interaction takes place, and the importance of HCI in daily life and in research. We also discuss current research topics such as postmodern phenomena, examining how human-computer interaction influences them and how virtual reality works in the context of HCI.
INTRODUCTION
In the field of technology and image processing, finger tracking is a high-resolution technique employed to determine the consecutive positions of the user's fingers and hence represent objects in 3D.