During the 2017/18 academic session, the School of Law and the E-Learning Team collaborated to explore and ultimately embed technology-enhanced peer marking of student work.
The aims of this project were varied and reflected the needs of the two departments involved:
- to enhance, promote and streamline pedagogically sound approaches to assessment
- to provide assessment-based opportunities for reflective practice and to provide feed-forward
- to expand the range of credible assessment models at Royal Holloway, University of London
Update, 5 November 2018: This approach has now been embedded into the design and delivery of the course, and we are about to launch the PeerMark activity again this week.
Development & practice
Assessment need not be passive (Dochy et al, 1999): where students mark work, they are often both accurate and, in doing so, reflect on their own performance – often more than once – achieving better outcomes in future assignments as a result (Gentle, 1994). Pedagogically, peer assessment improves student learning (Falchikov & Goldfinch, 2000) through “a sense of ownership and responsibility, motivation, and reflection of the students’ own learning” (Saito & Fujita, 2009), and has proven an effective way of giving students the opportunity to ‘feed forward’ (Wimhurst and Manning, 2013), improving participants’ conception of quality and hence the quality of their summative work. From an academic integrity perspective, peer marking activities can also help reduce the opportunity for plagiarism (Davies, 2004).
The second-year Criminology module, Key Perspectives and Debates in Criminology (30 credits), has included a peer-marked formative essay since 2015. In previous years the assessment was stand-alone: peer marked in hard copy and then moderated by the module convenor. For 2017-18, Alex Dymock revalidated the module to incorporate the peer marking exercise into a two-stage summative assessment to improve student engagement. Davies (2004) found evidence in support of previous claims that awarding a ‘mark for marking’ rewards the demonstration of higher-order assessment skills, and the peer marking exercise is now rewarded with an automatic 5% towards the final module mark. Alex also introduced a further summative feed-forward activity. Duncan (2007) notes that some students only read qualitative comments if the quantitative mark is outside their expectations, failing to recognise their potential value. To mitigate this, students completed a 300-word reflective paragraph, also rewarded with an automatic 5% and submitted with their summative coursework. The aim was to push students to reflect on what steps they had taken to improve the quality of their work based on both peer feedback and feedback from Alex, and on how taking part in peer marking had changed their conception of quality.
While efforts have been made in previous years to streamline the peer marking activity and enhance student engagement with it, for 2017-18 Alex worked intensively with Martin King in the E-Learning Team to pilot the PeerMark facility in Turnitin. This is the first time across the college that this facility has been used for assessment, so bespoke materials were produced by Martin to give students guidance on how to access essays, how to mark them, and how to submit reviews. We trialled the facility using a dummy assessment, which allowed us to evaluate the tool and develop an appropriate and sustainable workflow. Students were provided with an anonymised sample essay (with permission) from a previous year on the same topic to get a sense of what an outstanding essay might look like. They were also provided with substantive guidance on completing the reflective paragraph, including some FAQs, and guidelines on the benefits of reflective writing.
There is substantial evidence that the use of peer marking technology, and the feed forward activity, improved student engagement and added a reflective component to their learning.
1) The use of the Turnitin PeerMark facility allowed Alex and Martin to set a number of open or closed questions to guide peer feedback, and to set a minimum word count for responses to those questions before feedback could be submitted. Using the School of Law marking rubric, questions were set such as ‘Is there evidence of wide independent reading, beyond lecture content and core texts?’ and ‘Does the essay demonstrate a critical understanding of the topic, supported by relevant evidence, and respond fully to the question? How could the writer improve this?’ The use of PeerMark ensured that students left substantial feedback and were required to engage directly with the marking rubric.
2) Essays can be distributed anonymously, and Turnitin allows tutors to automate the assignment of one essay to each student. Students who did not submit formative work (only one or two students – a great improvement on previous years) were not given the opportunity to peer mark and earn the automatic 5%. That using technology to streamline and partially automate the previously manual, paper-based process coincided with increased engagement is perhaps explained by Ashenafi (2015), who highlights the challenges that have hampered the advance of peer marking – a concern with this exercise in previous years. Most of these challenges arise from the manual nature of peer assessment practices, which prove intractable as the number of students involved increases.
3) The use of technology responds to changing student expectations and needs by streamlining the assessment process from submission through providing and subsequently receiving peer feedback, to marks and feedback from the convenor, with all associated activities consolidated in the Moodle course. That students can use their own device at a location and time of their choosing (within the due dates set by the convenor) is a better fit with current student expectations and lifestyles (Johnson et al, 2016). This contrasts with the first iteration of this peer marking exercise, in which students completed the activity in a guided seminar.
4) Essays can also be marked by the convenor, as usual, using Turnitin Feedback Studio (formerly known as GradeMark). This gave Alex the opportunity to add any correctives to the peer feedback, assign essays a final mark, and write additional feedback. Students therefore receive two sets of feedback on their formative work, which they are able to feed forward into their summative work.
5) The use of the Turnitin PeerMark facility improves the inclusivity of peer marking activities. The anonymity of the technology creates a ‘safe space’ for students to provide peer feedback. The open and closed questions balance prescriptive and autonomous elements to the assessment. Using technology also accommodates students with a broad range of learning needs. For example, students who experience visual impairment are able to more easily participate when essays are not distributed in hard copy. The prescriptive element also assists students who experience social/communication impairment.
6) The use of PeerMark considerably reduced the volume of administrative work required in ensuring essays and feedback were returned to students, and ensuring the exercise remained entirely anonymous.
7) Qualitative evidence produced by Wimhurst and Manning (2013) suggests combining peer marking with a feed-forward activity does enhance student engagement. Our findings are congruent with this. The reflective paragraphs, and other feedback collected from students, confirm that the opportunity to review the work of a peer and reflect on their own learning and skills development offered students clarity and a better understanding of what constitutes essay quality.
8) The benefits of peer-teaching and assessment for enhancing cognitive and social congruence between students have been recognised (Yu et al, 2011), and were the basis for introducing the peer marking exercise on the module. However, our findings suggest that PeerMark’s prescriptive, rubric-based questions further enhanced this, because they require students to look for the same qualities in work across the cohort.
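Turnitin performs the anonymised one-to-one allocation described in point 2 internally, and its actual algorithm is not published. Purely as an illustrative sketch of that kind of allocation – random assignment among submitters only, with nobody marking their own script – the idea can be expressed in a few lines of Python:

```python
import random

def assign_peer_reviews(submitters, rng=None):
    """Randomly give each submitter exactly one peer's script,
    ensuring nobody reviews their own work (a derangement).
    Only students who submitted are eligible, mirroring the rule
    that non-submitters cannot take part in the exercise.
    Illustrative only -- not Turnitin's actual implementation."""
    rng = rng or random.Random()
    if len(submitters) < 2:
        raise ValueError("Need at least two submissions to peer mark")
    while True:
        shuffled = submitters[:]
        rng.shuffle(shuffled)
        # Accept the shuffle only if no one is paired with themselves
        if all(a != b for a, b in zip(submitters, shuffled)):
            return dict(zip(submitters, shuffled))

# Anonymised candidate numbers rather than names preserve anonymity
allocation = assign_peer_reviews(["S001", "S002", "S003", "S004"])
```

Because a derangement always exists for two or more submitters, the retry loop terminates quickly in practice; each reviewer ends up with exactly one script and each script with exactly one reviewer.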
What is PeerMark?
Part of the Turnitin suite, PeerMark is a peer review assignment tool. Academic staff can create and manage PeerMark assignments that allow students to read, review, and evaluate one or many papers submitted by their classmates.
- Peer review activities sit alongside Turnitin assignments, and participation occurs between the Due and Post Dates
- Review questions can be free text or numeric scale
- Minimum word counts can be enforced
- Anonymity is maintained
- Students can be automatically assigned one or more scripts to review
- Students can select scripts to review
- Students can, if required, review their own submission
- Marks can be awarded for participation
- Lecturers can add to reviews
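Minimum word counts are enforced by PeerMark itself on the Turnitin server. As a sketch of the check described in the list above – and not of any real Turnitin API – a minimal validation of a set of free-text answers might look like:

```python
def review_meets_minimum(answers, min_words=50):
    """Return True only if every free-text answer meets the minimum
    word count, so a review cannot be submitted with token responses.
    Illustrative only -- PeerMark enforces this inside Turnitin."""
    return all(len(text.split()) >= min_words for text in answers.values())

# Hypothetical review: keys are the rubric-based questions set by the tutor
draft = {"Is there evidence of wide independent reading?": "Yes, but..."}
ok_to_submit = review_meets_minimum(draft, min_words=50)  # False here
```

A cap on length (as suggested later under lessons learned) would be a symmetrical `max_words` check.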
What we learned from the pilot
While we believe the assessment structure of CR2013 was much improved by the use of PeerMark on Turnitin and the inclusion of a feed-forward task, further improvements could be made both within the technology itself and in the structure of the assessment. Almost all students undertook the peer marking activity, but some engaged and contributed significantly more than others, with the result that some recipients of peer reviews received better quality feedback than others. To improve engagement with the feed-forward task, students who exerted considerable effort in the peer marking exercise were rewarded with small prizes handed out in front of the cohort in a lecture. Where peer feedback was lacking, more extensive feedback was provided by Alex via Feedback Studio, but in future other strategies could be used to improve the quality of engagement and student involvement, such as:
- Inserting maximum as well as minimum word counts in response to rubric-based feedback questions, to ensure the volume of feedback is more consistent across the cohort
- Providing guidance on using QuickMark comments. Although this was not part of their brief, some students figured out how to insert in-text comments and corrections without assistance. That students left such extensive feedback not only in general comments but on the text of the essays was not anticipated. Further student-led initiatives around the design of the task, or feedback questions, should be considered in future years, drawing on the rationale of Duret et al (2018).
- More staff should receive targeted training on the use of PeerMark on Turnitin where peer marking is an assessment component, so that the exercise can be replicated by other staff members and the use of the facility remains sustainable. Currently, only Alex is trained to use the facility in the School of Law, and will be on sabbatical leave over the period in which the exercise is next due to take place.
- Some students found that the facility lagged, and the E-Learning Team noted that the interface is dated: its development has not kept pace with that of Turnitin Feedback Studio. Martin King is engaged in discussions about the roadmap for PeerMark and has been assured that the service will be updated and maintained as part of the Turnitin toolset.
- A more substantive reward for the exertion of effort in undertaking the peer marking exercise could be considered, such as Passport points to recognise participation.
- Currently, information about DDS students is provided only to the course convenor, so adjustments to marks and feedback could only be made via Feedback Studio and not PeerMark. In future years, green sticker students should be encouraged to note their status in the body of their work, and peer markers should receive guidance on the adjustments they should make to feedback to accommodate green sticker students. This would not only improve the inclusivity of the exercise itself, but encourage students across the cohort to recognise and be sensitive to a range of disabilities that might affect their peers.
- More informal opportunities for discussion of the peer marking activity could be provided, such as further preparation on the learning outcomes of the exercise, and post-activity student-led evaluation.
What the students thought
“Having just submitted our CR2013 Summative, I’d like to pass on my thanks and appreciation for setting the peer mark exercise both for our formative and summative. It really helped me understand explicitly what a marker is looking for in an essay, and I often referred to both my peer’s feedback and your feedback from my last formative when writing the recent summative essay.”
“By marking someone else’s essay, I was able to extensively use the marking criteria as an examiner would. I was able to look for specific criteria in their work, which ultimately guided me for what I should be including in future essays, such as showing excellent as opposed to good understanding of the topic, and to do this by defining key concepts and providing examples/evaluations.”
“I feel that it has been useful having feedback from a peer as I know that they have been through the same process in writing their assignment and so can therefore use their own experience to feedback on my work. Being able to receive praise from another student is very motivational as I feel that they are on the same level as I am within this degree. The fact that Dr Dymock’s feedback stated similar things to the peer feedback was also interesting as it made me realise that when looking at my work and judging its quality, it is possible for me, as a student, to give a similar perspective on whether it is of a good standard or not, as a lecturer would. This has allowed me to further my ability to properly check over my work and make changes after finishing to ensure I can get the best possible mark.”
References
Ashenafi, M.M. (2015) ‘Peer-assessment in higher education – twenty-first century practices, challenges and the way forward’, Assessment & Evaluation in Higher Education, 42(2), pp. 226-251.
Davies, P. (2004) ‘Don’t write, just mark: the validity of assessing student ability via their computerized peer-marking of an essay rather than their creation of an essay’, Research in Learning Technology, 12(3), pp. 261-277 [Online]. Available at: http://repository.alt.ac.uk/611/1/ALT_J_Vol12_No3_2004_Dont%20write%2C%20just%20mark_%20the%20val.pdf (Accessed: 17th October 2017).
Dochy, F., Segers, M., Sluijsmans, D. (1999) ‘The use of self-, peer and co- assessment in higher education: A review’, Studies in Higher Education, 24(3), pp. 331-350.
Duncan, N. (2007) ‘‘Feed-forward’: improving students’ use of tutors’ comments’, Assessment & Evaluation in Higher Education, 32(3), pp.271-283.
Duret, D. et al (2018) ‘Collaborative learning with PeerWise’ [Online]. Available at: https://journal.alt.ac.uk/index.php/rlt/article/view/1979 (Accessed: 12th March 2018).
Falchikov, N., Goldfinch, J. (2000) ‘Student Peer Assessment in Higher Education: A Meta-Analysis Comparing Peer and Teacher Marks’, Review of Educational Research, 70(3), pp. 287-322.
Gentle, C.R. (1994) ‘Thesys: an expert system for assessing undergraduate projects’, in Thomas, M. et al (ed.) Deciding our Future: technological imperatives for education. Austin, TX: University of Texas, pp. 1158-1160.
Johnson, L. et al (2016) NMC Horizon Report: 2016 Higher Education Edition [Online]. Available at: http://cdn.nmc.org/media/2016-nmc-horizon-report-he-EN.pdf (Accessed: 17th October 2017).
Orsmond, P., Merry, S., Callaghan, A. (2007) ‘Implementation of a formative assessment model incorporating peer and self‐assessment’, Innovations in Education and Teaching International, 41(3), pp. 273-290 [Online]. Available at: https://srhe.tandfonline.com/doi/full/10.1080/14703290410001733294 (Accessed: 17th October 2017).
Saito, H., Fujita, T. (2009) ‘Peer-assessing peers’ contribution to EFL group presentations’, RELC Journal, 40(2), pp. 149–171.
Yu, T-C. et al (2011) ‘Medical students-as-teachers: a systematic review of peer-assisted teaching during medical school’, Advances in Medical Education and Practice, 2, pp. 157-172.