Abstract
Peer assessment is often used for formative learning, but few studies have examined the validity of group-based peer assessment for the summative evaluation of course assignments. The present study contributes to the literature by using online technology (the course management system Moodle™) to implement structured, summative peer review based on an anchored rubric in an ecological statistics course taught to graduate students. We found that grade discrepancies between students and the instructor were fairly common (60% of assignments), relatively small in magnitude (mean = 3.3 ± 2.5% on assignments that had discrepancies), and proportionally higher for criteria related to the interpretation of statistical results and to code quality and organisation than for criteria related to the successful completion of analysis or instructional tasks (e.g. fitting particular statistical methods, de-identification of one's submission). Students reported that the peer assessment process increased their exposure to alternative ways of approaching statistical and computational problem-solving, but concerns were raised about the fairness of the process and the effectiveness of the group component. We conclude with recommendations for implementing peer assessment to maximise student learning and satisfaction.
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1208-1220 |
| Number of pages | 13 |
| Journal | Assessment and Evaluation in Higher Education |
| Volume | 42 |
| Issue number | 8 |
| DOIs | |
| State | Published - Nov 17 2017 |
Bibliographical note
Publisher Copyright: © 2016 Informa UK Limited, trading as Taylor & Francis Group.
Keywords
- Moodle
- Peer assessment
- Rubrics
- Statistics education