DIGITAL LIBRARY
METHODOLOGY TO ASSESS HIGH SCHOOL STUDENTS LEARNING TO EVALUATE INFORMATION SYSTEM USABILITY
University of Applied Sciences (LITHUANIA)
About this paper:
Appears in: INTED2023 Proceedings
Publication year: 2023
Pages: 3460-3468
ISBN: 978-84-09-49026-4
ISSN: 2340-1079
doi: 10.21125/inted.2023.0938
Conference name: 17th International Technology, Education and Development Conference
Dates: 6-8 March, 2023
Location: Valencia, Spain
Abstract:
The usability of information systems determines their success in the market; therefore, it is important to assess the usability of a system during its development. In higher education, we in turn have to assess how well students have learned to evaluate the usability of information systems. This paper presents a methodology for assessing that learning.

Students were involved in the development of genuine information systems for the market. This approach improves the quality of their learning, motivates them to seek new knowledge, and greatly enhances study efficiency. The methodology is based on combining expert-based methods (namely, Heuristic Evaluation) with participant methods (namely, event diaries).
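The abstract does not specify the data format of an event diary entry; the following is a minimal sketch of what one entry might look like in Python, assuming Nielsen's ten heuristics as the defect categories (the DiaryEvent class and its fields are hypothetical, not the authors' actual instrument):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Nielsen's ten usability heuristics, used here as candidate defect categories.
NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

@dataclass
class DiaryEvent:
    """One entry in a student's event diary (hypothetical schema)."""
    task: str                  # decomposed task the student was performing
    timestamp: datetime        # when the event was observed
    description: str           # free-text account of the defect or observation
    heuristic: Optional[str]   # violated heuristic, or None for a positive remark
    is_positive: bool = False  # True when the entry praises the tool
```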

Since 2020, student achievement has been evaluated by involving students in genuine projects carried out at Vilnius University of Applied Sciences. Students had to evaluate the usability of the information system being developed.

Between October and December 2020, first- and second-year students from three study programmes, forming 10 groups, participated in usability testing of the CVSite tool by applying participant methods. In total, 85 students conducted usability testing. During testing with participant methods, the "Event Diary" was applied, and after the test an "Impressions after testing" survey was carried out. Students had to record system defects in an event log structured around decomposed tasks. The final report comprised 456 properly described events, and all 85 respondents participated in the post-test impressions survey.
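To illustrate what an event log "structured around decomposed tasks" might look like, here is a hypothetical decomposition of one CVSite scenario with a single recorded defect, reusing the DiaryEvent sketch above (the task names and the defect text are invented for illustration):

```python
from datetime import datetime

# Hypothetical decomposition of one CVSite scenario into steps.
tasks = [
    "Register a new account",
    "Fill in personal details",
    "Upload a CV document",
    "Publish the CV and verify it is visible",
]

# A student records a defect observed during the third step.
event = DiaryEvent(
    task=tasks[2],
    timestamp=datetime(2020, 11, 5, 14, 12),
    description="No feedback after clicking Upload; the file list did not refresh.",
    heuristic="Visibility of system status",
)
```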

In April 2021, the second testing cycle of the CVSite tool was carried out by three student groups following the same approach. By this time, the students had additionally completed a Human-Computer Interaction Design course and participated in the testing as usability experts. The evaluation of the tool accounted for 50% of the course exam grade.

In the combined testing approach, the same "Event Diary" was applied during testing: students who recorded a defect indicated which of the heuristics formulated by the researchers during the first testing cycle it violated. A total of 941 events were recorded, but not all of them were properly described. 39 events were incorrectly recorded, representing 4.14% of the total, and 252 entries recorded positive aspects of the tool under test. The majority of the events, 650 (69.1%), were correctly recorded and described: the defects noted in these events could be analyzed and the corresponding heuristic (defect type) determined. The list of defect types was extended through the item "Other", under which students selected the Nielsen heuristic describing the defect. In total, 640 properly described events were taken for further analysis.
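The reported percentages follow directly from the raw counts; a few lines of Python reproduce them (all counts taken from the abstract):

```python
total = 941                             # all events recorded in the second cycle
incorrect = 39                          # incorrectly recorded events
positive = 252                          # positive aspects of the tool
correct = total - incorrect - positive  # 650 correctly recorded defect events

print(f"{incorrect / total:.2%}")  # 4.14%
print(f"{correct / total:.2%}")    # 69.08%, reported as 69.1%
```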

In April 2022, the combined testing approach was reapplied in a usability evaluation of the Knowledge Alliance Business Idea Assessment Tool. 33 first-year students participated in this test, recording 583 events. The article analyzes and presents the results of the students' evaluations. This combination of expert-based and participant methods made it possible to assess how involving students in genuine projects can improve the quality of learning and motivate students.
Keywords:
Learning assessment, usability, usability evaluation methods, methodology.