The paper presents a short literature review comparing online evaluations with paper evaluations. The Economics department at the University of Belgrade, Serbia, conducted a small pilot in a course of 800 students in May 2006. Half the students received paper evaluations in class and half were directed to complete an identical online evaluation. The paper evaluation received a 92.5% response rate, and the online evaluation received a 52% response rate after an incentive was introduced. Nearly twice as many students answered the open-ended question online as in the paper group. On the instructor-related questions, scores between the two methods varied by 0.09 to 0.22 on a 10-point scale. No statistical analysis was done for significance.
Lovric, M. (2006). Traditional and web-based course evaluations: comparison of their response rates and efficiency. Paper presented at the 1st Balkan Summer School on Survey Methodology.
Site viewed December 2012. http://www.balkanprojectoffice.scb.se/Paper%20Miodrag%20Lovrich_University%20of%20Belgrade.pdf
Georgia State University's College of Business ran a voluntary pilot from 2002 to 2003 in the Department of Computer Information Systems, using an online version identical to their paper course evaluation form. Faculty feared an online form would yield lower scores and lower response rates; in particular, that few students would submit online evaluations, poor students would "take revenge" on the faculty, and good students wouldn't bother. The paper form had a 67% response rate and the online form had an 82% response rate, likely because the CIS department had easy access to computer labs where students could complete the evaluations online. Using a question on teacher effectiveness, the study found no significant difference between the methods. Good students participated in similar numbers, while weaker students completed fewer online evaluations.
Liegle, J. O. and D. S. McDonald. Lessons Learned From