Junisilver Taij
This report is submitted as partial fulfilment of the requirements for the Honours Programme of the School of Computer Science and Software Engineering, The University of Western Australia, 2005
Abstract
Programming is a complicated skill to master, and learning to program is a complex process. The difficulty for first-year computer science students is that they generally do not already have a substantial understanding of computer programming. This results either in poor student retention or in students carefully selecting units in later years that have less exposure to programming. Associated with this is a desire to spot the potential problems students have as early as possible. A timely warning can help the teacher provide more assistance to students; furthermore, students would be able to change their current methods of study and make more informed choices before it is too late. The goal of the present research is to replicate previous investigations and test whether latency, the delay between certain keystrokes, correlates with an objective measure of programming performance. A controlled experiment was conducted at UWA with a total of 34 participants to test the hypotheses. Complete records of the keys pressed, with millisecond timing, were captured using a logging tool known as the User Action Recorder (UAR). These keystroke data were then grouped into digraphs according to their types, and a Spearman rank correlation test was performed for each digraph type against programming score. The aim of this experiment was to discover whether the previously observed typing patterns and results hold in a new setting. We examine the results from the UWA experiment against the results from the previous two experiments. The results show that the correlation holds more strongly at UWA; however, these results were not as statistically significant as those of the previous experiments.
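The analysis summarised above, extracting inter-key latencies as digraphs and correlating them with a score by Spearman's rank method, can be sketched in Python. This is an illustrative sketch only: the event format, the function names, and the tie-handling details are assumptions, not the actual output format of the UAR tool or the statistical package used in the study.

```python
from statistics import mean

def average_ranks(xs):
    """1-based ranks of xs, with tied values given their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = average_ranks(x), average_ranks(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def digraph_latencies(events):
    """events: list of (key, timestamp_ms) pairs in press order.
    Returns one (first_key, second_key, latency_ms) triple per digraph."""
    return [(k1, k2, t2 - t1)
            for (k1, t1), (k2, t2) in zip(events, events[1:])]
```

For example, given a log `[('i', 0), ('n', 140), ('t', 310)]`, `digraph_latencies` yields the digraphs `('i', 'n', 140)` and `('n', 't', 170)`; mean latencies per digraph type could then be passed to `spearman_rho` alongside the participants' programming scores.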