J. Bartolomé1, P. Garaizar2

1Tecnalia Research & Innovation (SPAIN)
2University of Deusto, Faculty of Engineering (SPAIN)
An increasing number of recruiting processes, in both the public and private sectors, require assessing the digital skills of job applicants. With this aim in mind, the Basque Government created "IT Txartela" (IT Card), a public digital skills certification system. So far, more than 111,000 citizens have certified over 332,000 digital skills through IT Txartela. This online e-assessment service comprises several modules with different types of items (multiple-choice, matching, simulations, etc.). In some cases, applicants report items as confusing or misleading. Considering this, some of these items have been improved or removed from the modules in recent years. However, this feedback is provided mostly by participants who did not pass the certification exams. With the aim of better understanding the clarity of the IT Txartela items, we gathered a fine-grained log of 6,730 IT Txartela participants (470,000 distinct user-question interactions). We then asked IT experts to analyze the clarity and difficulty of the items. Experts identified some items that were confusing and/or tagged with a difficulty level higher or lower than expected. At the same time, we used assessment analytics techniques to identify items that required more time and more views than others, or that had extremely low success rates. Surprisingly, we found a whole set of items that should be revised but remained unnoticed after the experts' analysis. These results show that assessment analytics techniques can identify unforeseen problems in certification systems and help to improve the item bank periodically.
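The analytics approach described above can be illustrated with a minimal sketch. This is not the authors' actual pipeline; the thresholds (`time_factor`, `min_success`) and the tuple-based log format are assumptions introduced for illustration. It flags items whose median response time is far above the bank-wide median, or whose success rate is extremely low:

```python
# Illustrative sketch (not the study's actual method): flag items whose
# median response time or success rate is an outlier within the item bank.
from statistics import median

def flag_items(interactions, time_factor=2.0, min_success=0.2):
    """interactions: list of (item_id, seconds_spent, passed) tuples.

    Returns a dict mapping flagged item ids to the reasons they were flagged.
    The thresholds are hypothetical defaults, not values from the study.
    """
    by_item = {}
    for item_id, seconds, passed in interactions:
        by_item.setdefault(item_id, []).append((seconds, passed))

    # Bank-wide median response time serves as the baseline.
    overall_median = median(s for _, s, _ in interactions)

    flagged = {}
    for item_id, records in by_item.items():
        med_time = median(s for s, _ in records)
        success = sum(1 for _, p in records if p) / len(records)
        reasons = []
        if med_time > time_factor * overall_median:
            reasons.append("slow")
        if success < min_success:
            reasons.append("low success")
        if reasons:
            flagged[item_id] = reasons
    return flagged
```

In practice, items flagged this way would then be sent to content experts for revision, complementing (rather than replacing) their manual review.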