1 Open University (UNITED KINGDOM)
2 University of Southampton (UNITED KINGDOM)
3 King's College London (UNITED KINGDOM)
4 Ontario Institute for Cancer Research (CANADA)
About this paper:
Appears in: INTED2012 Proceedings
Publication year: 2012
Pages: 3354-3359
ISBN: 978-84-615-5563-5
ISSN: 2340-1079
Conference name: 6th International Technology, Education and Development Conference
Dates: 5-7 March, 2012
Location: Valencia, Spain
Assessment has been identified as one of the major challenges faced by Higher Education Institutions (Whitelock et al., 2007). In response to this challenge, Open Mentor (OM) was developed, in a project funded by the Joint Information Systems Committee (JISC), as a learning support tool that helps tutors reflect on the quality of the feedback they give students on electronically submitted assignments. It was built on the finding that there is convincing evidence of systematic connections between different types of tutor comments and the level of attainment in an assignment (Whitelock et al., 2004). OM analyses, filters, and classifies tutor comments using an algorithm based on Bales' Interaction Process Analysis. As a result, tutors' feedback comments are classified into four categories: positive reactions, teaching points, questions, and negative reactions. The feedback provided is then compared against the ideal number of feedback comments expected for an assignment awarded a mark in a given band. OM provides reports that support tutors in reflecting on the structure and style of their feedback.
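To make the classification step concrete, the following is a minimal sketch, in Python, of a keyword-matching classifier in the spirit of the one described above. The category keyword lists and the aggregation function are invented placeholders for illustration; they are not OM's actual rules or data.

```python
# Illustrative sketch only: the keyword lists below are assumptions,
# not OM's real classification rules.
BALES_CATEGORIES = {
    "positive": ["well done", "good", "excellent"],
    "teaching": ["consider", "you could", "note that"],
    "question": ["why", "how", "what", "?"],
    "negative": ["incorrect", "unclear", "weak"],
}

def classify_comment(comment: str) -> str:
    """Assign a tutor comment to the first category whose keywords match."""
    text = comment.lower()
    for category, keywords in BALES_CATEGORIES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "teaching"  # default bucket for unmatched comments

def summarise(comments):
    """Count comments per category, as OM's reports aggregate them."""
    counts = {category: 0 for category in BALES_CATEGORIES}
    for comment in comments:
        counts[classify_comment(comment)] += 1
    return counts
```

A report could then compare the counts returned by `summarise` with the expected distribution for the assignment's mark band, highlighting, for example, a shortage of positive reactions on a high-scoring script.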

The JISC-funded Open Mentor technology transfer (OMtetra) project continues the work initiated by the Open University at the University of Southampton and King's College London. OMtetra aims to take up OM and extend its use by developing the system further, ultimately offering better support to tutors and students in the assessment process. A group of tutors from the University of Southampton and King's College London are currently using OM in their teaching and assessment. In this paper, we explore improvements to OM in three areas: user interface, technology implementation and algorithm construction.

From the user-experience perspective, suggested additions to OM include a simple entry form through which tutors can expand the content used by the algorithm when analysing feedback comments. In addition, enhancements to OM will make it easier to upload student and module information into the system.

Presently, OM uses a built-in database of users that must be maintained separately from institutional systems. Improvements from the technology perspective include a more flexible authentication module, which would simplify deployment of the system in new environments and thus promote uptake by a larger number of institutions. To reach this goal, the system will be migrated to an open-source framework that provides out-of-the-box integration with various authentication systems.
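One way to picture the proposed flexibility is a small pluggable authentication interface, sketched below in Python. The provider names and method signatures are illustrative assumptions, not the framework OMtetra will actually adopt.

```python
# Hypothetical sketch of a pluggable authentication layer; the class and
# method names are assumptions for illustration only.
from abc import ABC, abstractmethod

class AuthProvider(ABC):
    """Common interface, so OM need not maintain its own user database."""

    @abstractmethod
    def authenticate(self, username: str, password: str) -> bool:
        ...

class BuiltInProvider(AuthProvider):
    """Current behaviour: users held in OM's own database."""

    def __init__(self, users):
        self._users = users  # e.g. {username: password}

    def authenticate(self, username, password):
        return self._users.get(username) == password

class InstitutionalProvider(AuthProvider):
    """Delegation to an institutional system (e.g. LDAP or single sign-on)."""

    def __init__(self, verify_fn):
        self._verify = verify_fn  # callback into the external system

    def authenticate(self, username, password):
        return self._verify(username, password)
```

Under such a design, switching an institution from the built-in database to its own directory service would mean swapping the provider, without changes to the rest of OM.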

The final area of improvement is the analysis algorithm. Currently, OM classifies tutors' comments into the four categories by applying an underlying text-matching algorithm. This method could be improved by allowing tutors to add keywords through the OM interface. As the number of users grows, so will the algorithm and analysis process, becoming more comprehensive and intelligent as the keyword set is dynamically expanded.
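The proposed tutor-extensible matching could be sketched as a keyword store that grows at runtime, as below in Python. The class name, structure, and seed data are assumptions for illustration, not the OMtetra implementation.

```python
# Hypothetical sketch of a tutor-extensible keyword store; names and
# structure are assumptions, not OMtetra's implementation.
from collections import defaultdict

class KeywordStore:
    """Per-category keyword sets that tutors can extend through the UI."""

    def __init__(self, seed=None):
        self.keywords = defaultdict(set)
        for category, words in (seed or {}).items():
            self.keywords[category].update(words)

    def add(self, category: str, keyword: str) -> None:
        """Store a tutor-supplied keyword, normalised before saving."""
        self.keywords[category].add(keyword.strip().lower())

    def classify(self, comment: str) -> list:
        """Return every category whose keywords appear in the comment."""
        text = comment.lower()
        return [category for category, words in self.keywords.items()
                if any(word in text for word in words)]
```

Because every tutor addition enlarges the shared keyword sets, comments that previously fell through the matching would be caught on later runs, which is the sense in which the analysis becomes more comprehensive as usage grows.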

OMtetra is an ongoing project with considerable potential. We believe that the outcomes of the development and trial implementations of OM will contribute significantly to the area of assessment in higher education.
E-assessment, electronic assignments, feedback evaluation, feedback analysis.