Proceedings of the Pacific-Rim Objective Measurement Symposium (PROMS 2023)

Evaluating the Accuracy of Peer Assessment of ESL Argumentative Writing Using a Mixed-Methods Approach

Authors
Xiao Xie1, *, Vahid Nimehchisalem1, Mei Fung Yong1, Ngee Thai Yap1
1Universiti Putra Malaysia, Persiaran Universiti 1, Serdang, Malaysia
*Corresponding author: Xiao Xie
Available Online 22 August 2024.
DOI
10.2991/978-94-6463-494-5_16
Keywords
Mixed-Methods Approach; Partial Credit Model; Peer Assessment; Rater Accuracy; Rater Perception
Abstract

In the context of ESL writing, the prevalent approach of using inter-rater reliability measures, particularly Pearson's r coefficient, to scrutinise peer assessment has inherent constraints. Rasch models have emerged as an alternative to conventional correlation analysis for assessing rater accuracy: they show the absolute match between peer ratings and expert ratings, and they compute individual-level statistics for each element of each assessment facet. This study aims to evaluate the accuracy of peer assessment of ESL argumentative writing and to explore why some writing domains are difficult for peer raters to score accurately. Peer assessment training was conducted over a five-week period with 24 undergraduate students enrolled in an ESL argumentative writing course at a Malaysian university. A mixed-methods approach was used to examine the relationship between peer raters' quantitative ratings and their judgemental processes. The quantitative data were analysed using the Rasch Partial Credit Model (PCM), and the qualitative data were examined using the constant comparative method and thematic analysis. The quantitative analyses reveal that the domain of Relevance and Adequacy of Content (RAC) was the most likely to be peer assessed accurately, while the other two domains, Compositional Organisation (CO) and Cohesion (C), were the most difficult for this cohort of peer raters to assess accurately. The qualitative analyses suggest that peer raters' justifications for their scores were not consistent with the reasoning of expert raters, which partially explains the inaccurate ratings in certain domains. This comprehensive information has the potential to improve the accuracy and consistency of peer assessment and to better organise peer assessment in tertiary ESL writing training programmes.
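For readers unfamiliar with the Rasch Partial Credit Model named in the abstract, the sketch below shows the standard PCM category-probability function: the probability that a person of ability theta earns score category k on an item with step difficulties delta_1..delta_m. This is the textbook formulation of the model, not the authors' analysis code; the function name and parameterisation here are illustrative.

```python
import math

def pcm_probabilities(theta, deltas):
    """Category probabilities under the Rasch Partial Credit Model.

    theta  : person ability in logits
    deltas : step difficulties delta_1..delta_m in logits
    Returns [P(X = 0), ..., P(X = m)].
    """
    # Cumulative sums of (theta - delta_j); the empty sum for k = 0 is 0.
    cumulative = [0.0]
    for d in deltas:
        cumulative.append(cumulative[-1] + (theta - d))
    numerators = [math.exp(c) for c in cumulative]
    total = sum(numerators)
    return [n / total for n in numerators]
```

For example, with a single step whose difficulty equals the person's ability, the two adjacent categories are equally likely (each 0.5), which is the defining property of a Rasch step threshold.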

Copyright
© 2024 The Author(s)
Open Access
This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.


Volume Title
Proceedings of the Pacific-Rim Objective Measurement Symposium (PROMS 2023)
Series
Atlantis Highlights in Social Sciences, Education and Humanities
Publication Date
22 August 2024
ISBN
978-94-6463-494-5
ISSN
2667-128X
DOI
10.2991/978-94-6463-494-5_16

Cite this article

TY  - CONF
AU  - Xiao Xie
AU  - Vahid Nimehchisalem
AU  - Mei Fung Yong
AU  - Ngee Thai Yap
PY  - 2024
DA  - 2024/08/22
TI  - Evaluating the Accuracy of Peer Assessment of ESL Argumentative Writing Using a Mixed-Methods Approach
BT  - Proceedings of the Pacific-Rim Objective Measurement Symposium (PROMS 2023)
PB  - Atlantis Press
SP  - 251
EP  - 268
SN  - 2667-128X
UR  - https://doi.org/10.2991/978-94-6463-494-5_16
DO  - 10.2991/978-94-6463-494-5_16
ID  - Xie2024
ER  -