PeerWise: Students creating questions for their peers

Claire Ketnor, Amanda Shaker, Karishma Raghupati, Van Tien Pham

Abstract


PeerWise (https://peerwise.cs.auckland.ac.nz/) is a system that allows students to create multiple-choice questions for other students, answer questions posed by their peers, and then provide feedback (Denny et al., 2008). There is evidence in the literature that this method of assessment has a positive impact (e.g., Guilding et al., 2021; Fergus et al., 2021; Feeley and Parris, 2012), particularly on students’ attainment and engagement. In a funded project, we introduced PeerWise into the assessment for a module at Sheffield Hallam University (U.K.) and another at La Trobe University (Australia). In this case study, we give an overview of PeerWise and the activities within the platform, present the results of our evaluation of the activity, and share advice for implementation collected within the project from practitioners around the world who have experience of using PeerWise. Cohesive themes arising from our evaluation and the advice collected are summarised into recommendations for practitioners implementing PeerWise in future, aimed at improving student experience and outcomes.


Keywords


PeerWise; student-generated questions; problem posing; peer feedback; assessment.


References


Bloom, B.S., 1956. Taxonomy of educational objectives. Handbook 1: Cognitive domain. New York: McKay.

Denny, P., Luxton-Reilly, A. and Hamer, J., 2008. The PeerWise system of student contributed assessment questions. In Proceedings of the tenth conference on Australasian computing education - Volume 78 (ACE ’08), pp.69-74. https://doi.org/10.1145/1384271.1384293

Feeley, M. and Parris, J., 2012. An Assessment of the PeerWise Student-Contributed Question System's Impact on Learning Outcomes: Evidence from a Large Enrollment Political Science Course. SSRN, pp.1-30. https://doi.org/10.2139/ssrn.2144375

Fergus, S., Hirani, E., Parkar, N. and Kirton, K., 2021. Strategic Engagement: Exploring Student Buy-in across a Formative and Summative Online Assessment. All Ireland Journal of Higher Education, 13(1), pp.1-24. Available at: https://ojs.aishe.org/index.php/aishe-j/article/view/441 [Accessed 14 October 2021].

Guilding, C., Pye, R.E., Butler, S., Atkinson, M. and Field, E., 2021. Answering questions in a co-created formative exam question bank improves summative exam performance, while students perceive benefits from answering, authoring, and peer discussion: A mixed methods analysis of PeerWise. Pharmacology Research & Perspectives, 9(4), pp.1-12. https://doi.org/10.1002/prp2.833

Li, L., Liu, X. and Steckelberg, A.L., 2010. Assessor or assessee: How student learning improves by giving and receiving peer feedback. British Journal of Educational Technology, 41(3), pp.525-536. https://doi.org/10.1111/j.1467-8535.2009.00968.x

Scully, D., 2017. Constructing multiple-choice items to measure higher-order thinking. Practical Assessment, Research, and Evaluation, 22(1), p.4. https://doi.org/10.7275/swgt-rj52

Tarrant, M. and Ware, J., 2008. Impact of item-writing flaws in multiple-choice questions on student achievement in high-stakes nursing assessments. Medical Education, 42(2), pp.198-206. https://doi.org/10.1111/j.1365-2923.2007.02957.x




DOI: https://doi.org/10.21100/msor.v20i2.1309
