REFLECTIVE PRACTICE AND FEED-FORWARD: A TURNITIN PEERMARK PILOT
During the 2017/18 academic session, the School of Law and the E-Learning Team collaborated to explore, and ultimately embed, technology-enhanced peer marking of student work.
The aims of this project were varied, reflecting the needs of the two departments involved. The approach has since been embedded into the design and delivery of the course, and we are about to launch the PeerMark activity again this week.
Assessment need not be passive (Dochy et al., 1999). Where students mark work, they are often accurate and, in doing so, reflect on their own performance – often more than once – with the result of achieving better outcomes in future assignments (Gentle, 1994). Pedagogically, peer assessment improves student learning (Falchikov & Goldfinch, 2000) through “a sense of ownership and responsibility, motivation, and reflection of the students’ own learning” (Saito & Fujita, 2009), and has proven an effective way of giving students the opportunity to ‘feed forward’ (Wimshurst and Manning, 2013), improving participants’ conception of quality and hence the quality of their summative work. From an academic integrity perspective, peer marking activities can also help reduce the opportunity for plagiarism (Davies, 2004).
The second-year Criminology module, Key Perspectives and Debates in Criminology (30 credits), has since 2015 included a peer-marked formative essay. In previous years the assessment was a stand-alone exercise, peer-marked in hard copy and then moderated by the module convenor. For 2017-18, Alex Dymock revalidated the module to incorporate the peer marking exercise into a two-stage summative assessment, with the aim of improving student engagement. Davies (2004) found evidence supporting previous claims that awarding a ‘mark for marking’ rewards the demonstration of higher-order assessment skills, and the peer marking exercise is now rewarded with an automatic 5% towards the final module mark. Alex also introduced a further summative feed-forward activity. Duncan (2007) notes that some students read qualitative comments only when the quantitative mark falls outside their expectations, failing to recognise the comments’ potential value. To mitigate this problem, students submitted a 300-word reflective paragraph with their summative coursework, also rewarded with an automatic 5%. The aim was to push students to reflect on the steps they had taken to improve the quality of their work in light of both peer feedback and feedback from Alex, and on how taking part in peer marking had changed their conception of quality.
While efforts had been made in previous years to streamline the peer marking activity and enhance student engagement with it, for 2017-18 Alex worked intensively with Martin King in the E-Learning Team to pilot the PeerMark facility in Turnitin. This is the first time the facility has been used for assessment anywhere in the college, so Martin produced bespoke materials giving students guidance on how to access essays, how to mark them, and how to submit reviews. We trialled the facility using a dummy assessment, which allowed us to evaluate the tool and develop an appropriate and sustainable workflow. Students were provided with an anonymised sample essay (used with permission) from a previous year on the same topic, to give them a sense of what an outstanding essay might look like. They were also given substantive guidance on completing the reflective paragraph, including some FAQs and guidelines on the benefits of reflective writing.
Part of the Turnitin suite, PeerMark is a peer review assignment tool. Academic staff can create and manage PeerMark assignments that allow students to read, review, and evaluate one or more papers submitted by their classmates.

There is substantial evidence that the use of peer marking technology, together with the feed-forward activity, improved student engagement and added a reflective component to students’ learning:

“Having just submitted our CR2013 Summative, I’d like to pass on my thanks and appreciation for setting the peer mark exercise both for our formative and summative. It really helped me understand explicitly what a marker is looking for in an essay, and I often referred to both my peer’s feedback and your feedback from my last formative when writing the recent summative essay.”

“By marking someone else’s essay, I was able to extensively use the marking criteria as an examiner would. I was able to look for specific criteria in their work, which ultimately guided me for what I should be including in future essays, such as showing excellent as opposed to good understanding of the topic, and to do this by defining key concepts and providing examples/evaluations.”

“I feel that it has been useful having feedback from a peer as I know that they have been through the same process in writing their assignment and so can therefore use their own experience to feedback on my work. Being able to receive praise from another student is very motivational as I feel that they are on the same level as I am within this degree. The fact that Dr Dymock’s feedback stated similar things to the peer feedback was also interesting as it made me realise that when looking at my work and judging its quality, it is possible for me, as a student, to give a similar perspective on whether it is of a good standard or not, as a lecturer would. This has allowed me to further my ability to properly check over my work and make changes after finishing to ensure I can get the best possible mark.”

While we believe the assessment structure of CR2013 was much improved by the use of PeerMark in Turnitin and the inclusion of a feed-forward task, further improvements could be made both within the technology itself and in the structure of the assessment. Although almost all students undertook the peer marking activity, some engaged and contributed significantly more than others, with the result that some recipients of peer reviews received better-quality feedback than others. To improve engagement with the feed-forward task, students who exerted considerable effort in the peer marking exercise were rewarded with small prizes handed out in front of the cohort in a lecture. Where peer feedback was lacking, Alex provided more extensive feedback via Feedback Studio, but in future other strategies could be used to improve the quality of engagement and student involvement, such as:

In future years, green sticker students should be encouraged to note their status in the body of their work, and peer markers should receive guidance on the adjustments they should make to their feedback to accommodate green sticker students. This would not only improve the inclusivity of the exercise itself, but also encourage students across the cohort to recognise and be sensitive to the range of disabilities that might affect their peers.

More informal opportunities for discussion of the peer marking activity could also be provided, such as further preparation on the learning outcomes of the exercise, and post-activity student-led evaluation.
Ashenafi, M.M. (2015) ‘Peer-assessment in higher education – twenty-first century practices, challenges and the way forward’, Assessment & Evaluation in Higher Education, 42(2), pp. 226-251.
Davies, P. (2004) ‘Don’t write, just mark: the validity of assessing student ability via their computerized peer-marking of an essay rather than their creation of an essay’, Research in Learning Technology, 12(3), pp. 261-277 [Online]. Available at: http://repository.alt.ac.uk/611/1/ALT_J_Vol12_No3_2004_Dont%20write%2C%20just%20mark_%20the%20val.pdf (Accessed: 17th October 2017).
Dochy, F., Segers, M., Sluijsmans, D. (1999) ‘The use of self-, peer and co-assessment in higher education: A review’, Studies in Higher Education, 24(3), pp. 331-350.
Duncan, N. (2007) ‘‘Feed-forward’: improving students’ use of tutors’ comments’, Assessment & Evaluation in Higher Education, 32(3), pp. 271-283.
Duret, D. et al. (2018) ‘Collaborative learning with PeerWise’, Research in Learning Technology [Online]. Available at: https://journal.alt.ac.uk/index.php/rlt/article/view/1979 (Accessed: 12th March 2018).
Falchikov, N., Goldfinch, J. (2000) ‘Student Peer Assessment in Higher Education: A Meta-Analysis Comparing Peer and Teacher Marks’, Review of Educational Research, 70(3), pp. 287-322.
Gentle, C.R. (1994) ‘Thesys: an expert system for assessing undergraduate projects’, in Thomas, M. et al. (eds.) Deciding our Future: technological imperatives for education. Austin, TX: University of Texas, pp. 1158-1160.
Johnson, L. et al. (2016) NMC Horizon Report: 2016 Higher Education Edition [Online]. Available at: http://cdn.nmc.org/media/2016-nmc-horizon-report-he-EN.pdf (Accessed: 17th October 2017).
Orsmond, P., Merry, S., Callaghan, A. (2004) ‘Implementation of a formative assessment model incorporating peer and self-assessment’, Innovations in Education and Teaching International, 41(3), pp. 273-290 [Online]. Available at: https://srhe.tandfonline.com/doi/full/10.1080/14703290410001733294 (Accessed: 17th October 2017).
Saito, H., Fujita, T. (2009) ‘Peer-assessing peers’ contribution to EFL group presentations’, RELC Journal, 40(2), pp. 149–171.
Yu, T-C. et al. (2011) ‘Medical students-as-teachers: a systematic review of peer-assisted teaching during medical school’, Advances in Medical Education and Practice, 2, pp. 157-172.