ONSCREEN MARKING FOR ELECTRONICALLY SUBMITTED ESSAY TYPE QUESTIONS
North West University (SOUTH AFRICA)
About this paper:
Conference name: 9th International Conference on Education and New Learning Technologies
Dates: 3-5 July, 2017
Location: Barcelona, Spain
Abstract:
This paper, written at the start of a project, elaborates on how an Onscreen Marking Tool was developed and describes how the tool is proposed to work. Presenting at an international conference provides an opportunity to gather valuable input that can contribute to improvements of the Onscreen Marking Tool. We believe this tool can offer a solution that will save time and improve accuracy and consistency when essay-type assignments are graded.
Assessment of students is an ongoing practice for educators worldwide. Large class sizes at South African universities, together with the demand for increased research output and community involvement, add to the growing workload of faculty members. Different subject areas favour a wide variety of assessment techniques. Formative assessment methods (observation, multiple-choice tests, portfolios, classwork assignments, etc.) should include objective feedback that in turn provides valuable learning opportunities. However, large class sizes and heavy workloads make consistent, individualised feedback problematic and reduce the educational impact of formative assessment to a futile exercise.
Essay-type assessments remain an important form of assessment. In an environment where large classes are the norm, grading essays often involves the appointment of assessment teams (tutors, multiple lecturers, externally contracted assistants), which raises a recurring question: how can one ensure quality and consistency, and provide constructive, personalised feedback to large numbers of students when assessment teams grade the same assessment? Another challenge relates to students' propensity to compare feedback and marks among themselves, and to relate the feedback to the marks they received. When different assessors are employed, the risk of low inter-rater reliability is evident, especially when their work is not moderated. This often leads to an unsustainable number of student consultations and regular changes in grades.
An established grading strategy for overcoming these challenges is the use of rubrics and assessor calibration, in conjunction with an agreed-upon set of feedback prompts, as this enhances inter-rater reliability and raises the quality of feedback. Assessors apply different techniques when they use rubrics, such as ticking the relevant boxes and writing additional individualised comments on the rubric. The rubric is then attached to the graded assignment.
In the digital age, electronic submission of assignments via the learning management system (LMS) has become the norm. Electronic grading, or e-grading (as opposed to printing and then marking paper-based artefacts), through the use of rubrics has brought a new set of technical difficulties, which requires revisiting the way rubrics are used to grade essay-type assessments. These difficulties include the inconsistent techniques used by assessors, as well as the inconsistent file formats submitted by students and assessors alike. If the assessment was submitted as a .doc file, some markers prefer to insert comments via the review function, whereas others prefer to use track changes in a word processor. If the assignment was submitted in portable document format (.pdf) or as a scanned handwritten assignment (PDF or image files), it becomes even more problematic to grade the work and to ensure consistency in the manner in which feedback is given. These issues make grading, commenting and personalised feedback on essay-type assessments a great challenge for many lecturers.
To address these needs, and to safeguard consistency and quality in the way essay-type assessments are marked and feedback is given, an onscreen marking solution was envisaged. The marking tool needs to take care of repetitive tasks, such as adding up marks, improve the use of rubrics with standardised feedback, and offer the option of personalised feedback for individual students. The Onscreen Marking Tool should not add to the workload of assessors, but should improve the quality of marking and feedback. The solution was implemented at North West University in January 2017.
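To make these requirements concrete, the sketch below shows, in Python, one possible data model for such a tool: rubric criteria that carry the assessment team's agreed-upon feedback prompts, automatic totalling of marks, and a field for a personalised comment per student. This is a minimal sketch under stated assumptions; the names used here (Criterion, Score, GradedEssay) are illustrative and do not reflect the actual implementation of the Onscreen Marking Tool.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Criterion:
    """One rubric row: a description, a maximum mark, and the
    standardised feedback prompts agreed on by the assessment team."""
    description: str
    max_mark: float
    prompts: List[str] = field(default_factory=list)

@dataclass
class Score:
    """An assessor's judgement on one criterion: the mark awarded
    and the indices of the standard prompts that were ticked."""
    criterion: Criterion
    mark: float
    ticked_prompts: List[int] = field(default_factory=list)

@dataclass
class GradedEssay:
    student_id: str
    scores: List[Score] = field(default_factory=list)
    personal_comment: str = ""  # individualised feedback for this student

    def total(self) -> float:
        """A repetitive task handled by the tool: adding up marks."""
        return sum(s.mark for s in self.scores)

    def feedback(self) -> str:
        """Assemble the standardised prompts plus the personal comment."""
        lines = []
        for s in self.scores:
            lines.append(f"{s.criterion.description}: {s.mark}/{s.criterion.max_mark}")
            lines.extend(f"  - {s.criterion.prompts[i]}" for i in s.ticked_prompts)
        if self.personal_comment:
            lines.append(f"Note: {self.personal_comment}")
        return "\n".join(lines)

# Hypothetical example: one criterion, one student
argument = Criterion("Quality of argument", 10,
                     ["Well-structured argument.", "Claims need supporting evidence."])
essay = GradedEssay("NWU-001",
                    scores=[Score(argument, 7, ticked_prompts=[1])],
                    personal_comment="Strong introduction; expand the conclusion.")
print(essay.total())     # 7
print(essay.feedback())
```

Storing the prompts on the criterion itself, rather than having each assessor retype them per script, is what would allow different members of an assessment team to produce identically worded standard feedback, with the free-text comment reserved for genuinely individual remarks.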
Keywords: Marking assignments, electronic marking, marking rubric, assessment.