TOOLS TO IMPROVE EFFICIENCY AND CONSISTENCY IN ASSESSMENT PRACTICES WHILST DELIVERING MEANINGFUL FEEDBACK
Solent University (UNITED KINGDOM)
About this paper:
Appears in: ICERI2022 Proceedings
Publication year: 2022
Pages: 1069-1078
ISBN: 978-84-09-45476-1
ISSN: 2340-1095
doi: 10.21125/iceri.2022.0296
Conference name: 15th annual International Conference of Education, Research and Innovation
Dates: 7-9 November, 2022
Location: Seville, Spain
Abstract:
In some STEM subjects in higher education, both theoretical knowledge and technical skills are important. When assessing students, it is important to assess understanding rather than the ability to recount facts or follow instructions, and this should be emphasised in the learning outcomes. To facilitate understanding, it is useful to apply knowledge to real-world problems, in which students may be encouraged to develop solutions to requirements defined by a case study. It is the ability to apply solutions successfully that can be used in the assessment of understanding, where the solutions may be expressed in the application of both theory and practice. Students may be asked to write about their solutions in a report and also to implement them in practical exercises that may be conducted in a laboratory. Assessment criteria will need to be established that address the knowledge and technical skills reflected in the learning outcomes. Rubrics are useful for mapping assessment criteria against expectations, but they provide little in the way of feedback to students, whereas individually targeted comments are more useful. The assessment process requires reasoned academic judgement and careful feedback, but this can be time-consuming to achieve. If the process could be made more efficient, then academic time could be saved. An application has been developed to facilitate efficiency improvements in the assessment process, and especially in the generation of meaningful individualised feedback involving limited automation. This is well suited to assessment methods that require substantial academic judgement, for example in the case of written reports. Nevertheless, there is more scope for automation in the assessment of technical skills, and a second application has been developed to address this need using automation practices.
In a computer network security module, two assessments were used to assess both theory and practice: the first in the form of a written report and the second in the form of a time-constrained practical assignment. A case study involving a fictitious company was used as the focus of the module. The first application was used to assess the report and generate feedback, whilst the second application was used to automate the assessment of the practical assignment. The feedback from the second application was limited, but it could be translated into more descriptive feedback once imported into the first application. In the practical assignment, students were asked about the feedback that they had received. Whilst the sample size was quite small, some initial feedback from students gave an insight into their perceptions of the nature of the feedback that they had received.
Keywords:
Assessment, case study, laboratory, automation, computer networks, feedback.