I recently encountered a compelling study from 2012 that investigated the viability of screen-capture technology as a method of providing student feedback. The study, published in vol. 64 of Higher Education, involved 119 MBA and undergraduate students. Students received feedback created with screen-capture software coupled with microphone audio, and they were then asked about their perceptions of the system as a viable option for instructor feedback. The results were overwhelmingly positive: the rich-media format seems to offer online instructors a compelling new avenue for providing feedback to students. Specifically, the researchers found that: “(a) this medium has advantages over traditional methods of communicating feedback, (b) that students enjoy this new form of feedback, and (c) that this encourages them to engage with and learn from the tutor assessment of answers, rather than concentrating only on obtaining marks” (Jones et al. 593).
The students in the study were divided into two groups: undergraduates (n = 75) and MBA students (n = 119). Each student was provided with a short, screen-captured video of the tutor’s evaluation of his or her assignment, overlaid with audio comments and mouse movement. The videos were dynamic, with tutors highlighting sections of the student’s assignment and indicating where corrections were necessary, where additional work was needed, and so on. The students were then given a questionnaire about the viability and richness of the feedback, and a series of audio-taped, open-ended discussions was conducted in which students were asked, among other things, what their idea of good feedback entailed, what characterized good feedback, what types of feedback they had received in the past, and how the screen-capture video feedback fit into their experience and expectations (Jones et al. 598). The students were also asked what they normally do after receiving written feedback and what they had done with the video file since receiving it (Jones et al. 598).
The responses were compelling. In both groups, the students unanimously answered “strongly agree” to the statement: “I know what I have to do in order to improve with the information given in the feedback” (Jones et al. 600). Similarly high agreement was seen for items pertaining to the helpfulness and clarity of the feedback, as well as its accuracy and relevance to their work (Jones et al. 600). The taped interview sessions elicited dialog indicating that personalization was important to the students (Jones et al. 601). Consider the following comments:
“Either in audio or video format I find it easier to understand than written”; “Video you can actually see your work in front of you, listen to your tutor and go through it together as if you were sitting in the same room”; “It shows you your own problems as opposed to just general”; “The video feels a bit more, well more personal; it’s usually more or less tailored”; “Actually pointing the cursor saying that is wrong, like you say, it’s a fresh pair of eyes”; “I could stand there and look at it for an hour and still not be able to see the problem, whereas although you’re not telling me the answer you’re saying listen this is the area you need to concentrate on” (Jones et al. 601).
Figure 1: Screenshot of Virtual Learning Environment (VLE) and Steps in the Feedback Process
The videos were encoded with the Windows Media Player 9 video codec, free software built on DirectX and compatible with Windows 7 and XP. File sizes averaged about 6 MB for a six-minute video, and the largest file was 9 MB (Jones et al. 605). By comparison, an equivalent DVD file would be roughly 240 MB, hinting at the codec’s efficiency. In an appendix to the study, commenting on this efficiency, the researchers stated: “It is about 50-100 times more efficient than the commonly-used run length codecs (as used in other similar programs) or MPEG-2 video as delivered on DVD” (Jones et al. 606).
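As a rough sanity check on those figures, here is a back-of-the-envelope calculation of my own (not from the study) showing how modest the implied bitrate is next to DVD-quality MPEG-2 video; only the 6 MB, 9 MB, and 240 MB figures come from Jones et al., and the rest is simple arithmetic:

```python
# My own arithmetic, not taken from the study: the study reports ~6 MB for a
# six-minute video and cites 240 MB as the equivalent DVD file size.

video_seconds = 6 * 60        # a typical six-minute feedback video
screen_capture_mb = 6         # average file size reported in the study
dvd_equivalent_mb = 240       # the authors' figure for an equivalent DVD file

def bitrate_kbps(size_mb: float, seconds: int) -> float:
    """Average bitrate in kilobits per second, treating 1 MB as 1,000 kB."""
    return size_mb * 8 * 1000 / seconds

print(f"Screen capture: ~{bitrate_kbps(screen_capture_mb, video_seconds):.0f} kbps")
print(f"DVD equivalent: ~{bitrate_kbps(dvd_equivalent_mb, video_seconds):.0f} kbps")
print(f"File-size ratio: ~{dvd_equivalent_mb / screen_capture_mb:.0f}x smaller")
# Roughly 133 kbps vs. 5,333 kbps -- a 40x difference, in the same ballpark as
# the 50-100x efficiency the researchers claim for the codec.
```

In other words, a semester’s worth of such feedback videos would occupy trivial storage and upload comfortably over even a slow connection, which is part of what made the approach practical.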
The idea of digital screen-capture video is not a new one; gamers and software developers have used it since the late 1990s. The last two iterations of the Microsoft Xbox have had a screen-capture utility built into the operating system, allowing users to capture in-game footage and share it with others. Indeed, YouTube is replete with such game footage, often edited quite professionally and overlaid with audio commentary. While the realm of the online classroom is, in most respects, a far cry from the realm of online gaming, both exist in a virtual sphere, and both adopt technology where it offers the greatest return on time and effort, although to somewhat different ends. Likewise, both involve a virtual connection between people within a community of discourse. Perhaps the two realms are not so different, after all. Maybe online instructors can learn something from online gamers and incorporate screen-capture digital video into their virtual classrooms. If the results of Jones et al.’s 2012 study are any indicator, our students are eager for feedback delivered this way, and they find it more helpful than traditional written comments.
Works Cited
Bures, Eva Mary, et al. “Student Motivation to Learn via Computer Conferencing.” Research in Higher Education, vol. 41, no. 5, 2000, pp. 593–621, www.jstor.org/stable/40196404.
Jones, Nigel, et al. “Student Feedback via Screen Capture Digital Video: Stimulating Student’s Modified Action.” Higher Education, vol. 64, no. 5, 2012, pp. 593–607, www.jstor.org/stable/23275715.
Kim, Loel. “Online Technologies for Teaching Writing: Students React to Teacher Response in Voice and Written Modalities.” Research in the Teaching of English, vol. 38, no. 3, 2004, pp. 304–337, www.jstor.org/stable/40171616.