Peering into large lectures: examining peer and expert mark agreement using peerScholar, an online peer assessment tool
ABSTRACT: As class sizes increase, methods of assessment shift from costly traditional approaches (e.g. expert-graded writing assignments) to more economical and logistically feasible methods (e.g. multiple-choice testing, computer-automated scoring, or peer assessment). While each method of assessment has its merits, it is peer assessment in particular, especially when made available online through a Web-based interface (e.g. our peerScholar system), that has the potential to allow the reintegration of open-ended writing assignments into classes of any size – and in a manner that is pedagogically superior to traditional approaches. Many benefits are associated with peer assessment, but it was concerns about its validity that prompted two experimental studies (n = 120 in each) using peerScholar to examine mark agreement between and within groups of expert (graduate teaching assistant) and peer (undergraduate student) markers. Overall, using peerScholar accomplished the goal of returning writing to a large class while producing grades similar in level and rank order to those provided by expert graders, especially when a grade accountability feature was used.