Accademia Navale di Livorno (ITALY)
About this paper:
Appears in: ICERI2010 Proceedings
Publication year: 2010
Pages: 3819-3828
ISBN: 978-84-614-2439-9
ISSN: 2340-1095
Conference name: 3rd International Conference of Education, Research and Innovation
Dates: 15-17 November, 2010
Location: Madrid, Spain
In two recent papers we exploited the distinction between solutions and results to obtain computer-based problem-solving environments in which the student is gently guided to find his own solution instead of being forced to reproduce a teacher's solution. In particular, we designed two such environments, one for teaching problem-solving with spreadsheets and one for database querying. In a later paper we analyzed the methodology in detail, showing how to generalize it to obtain problem-solving teaching environments for other topics such as physics, computer programming and interactive geometry. For physics and computer programming, this is now work in progress.
In this paper we report on a new development of the problem-solving environment for database querying. By allowing a deep automatic analysis of the teacher's solution, this environment can be turned into a kind of computer-based tutor, able to suggest hints and to assess the student's solving efforts.
Regarding the hints, many of them are obtained automatically by thoroughly parsing the SQL code of the teacher's solution. They range from simple aspects, such as "which tables are involved in the query", to more sophisticated ones, such as "is a table used more than once". The teacher can adjust such automatic hints, add his own, or even create "structured hints" such as partial solutions given as graphical Query-By-Example layouts.
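As an illustration of the idea (not the paper's actual parser), hints of this kind can be derived even from a naive scan of the solution's FROM/JOIN clauses; the function name and the regex-based approach below are our own assumptions, standing in for a full SQL parse:

```python
import re

def extract_hints(sql):
    """Rough sketch: derive simple hints from a teacher's SQL solution.

    Hypothetical helper; a naive regex scan of FROM/JOIN clauses stands
    in for the thorough SQL parsing described in the paper.
    """
    # Collect table names appearing right after a FROM or JOIN keyword.
    tables = [t.lower() for t in
              re.findall(r'\b(?:FROM|JOIN)\s+([A-Za-z_][A-Za-z0-9_]*)',
                         sql, flags=re.IGNORECASE)]
    return {
        "tables involved in the query": sorted(set(tables)),
        "a table is used more than once": len(tables) != len(set(tables)),
    }

# A self-join query: the Employee table appears twice.
example = ("SELECT e.name FROM Employee e "
           "JOIN Employee m ON e.boss = m.id "
           "WHERE e.salary > m.salary")
print(extract_hints(example))
```

On this example the sketch reports that only the Employee table is involved, but that it is used more than once, which is exactly the kind of hint mentioned above.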
Concerning the automatic assessment, this follows the result-driven approach: the student's results are automatically compared with the expected ones coming from the teacher's solution. An evaluation module assigns a score to the student's solution by comparing results over a range of significant input data. Hints can naturally be exploited to refine the automatic assessment. Indeed, each hint is given a "cost", so when the student asks for one, the score is diminished by the corresponding cost. In this way the student gets a fair and impartial evaluation of his overall performance and is able to tailor the problem to his own skill and competency.
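The scoring scheme just described can be sketched as follows; this is an illustrative reconstruction under our own assumptions (function name, equal weighting of datasets, rows compared as multisets), not the paper's actual evaluation module:

```python
from collections import Counter

def score_solution(student_results, teacher_results, hint_costs_taken,
                   max_score=100):
    """Illustrative result-driven scoring sketch.

    student_results / teacher_results: one list of rows (hashable
    tuples) per test dataset. hint_costs_taken: costs of the hints
    the student requested.
    """
    # A dataset counts as passed when the row multisets match exactly
    # (same rows, same multiplicities, order ignored).
    matches = sum(Counter(s) == Counter(t)
                  for s, t in zip(student_results, teacher_results))
    base = max_score * matches / len(teacher_results)
    # Each hint taken diminishes the score by its declared cost.
    return max(0.0, base - sum(hint_costs_taken))

# Student's query matches the teacher's on 2 of 3 datasets,
# and the student asked for one hint costing 5 points.
student = [[(1, 'a'), (2, 'b')], [(3, 'c')], []]
teacher = [[(2, 'b'), (1, 'a')], [(3, 'c')], [(4, 'd')]]
print(score_solution(student, teacher, [5]))
```

Comparing rows as multisets rather than ordered lists reflects that, absent an ORDER BY, two SQL queries producing the same rows in different orders should be considered equivalent.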
We report the results of this work and discuss how it may enhance both lab classes and self-study.
Keywords: technology, educational software, learning and teaching methodologies.