THE COGNITIVE COMPLEXITY ESTIMATION OF THE BASIC STATEMENTS OF THE SCHOOL MATH COURSE

The Glazov Korolenko State Pedagogical Institute (RUSSIAN FEDERATION)

Appears in: ICERI2020 Proceedings

Publication year: 2020

Pages: 162-169

ISBN: 978-84-09-24232-0

ISSN: 2340-1095

doi: 10.21125/iceri.2020.0057

Conference name: 13th annual International Conference of Education, Research and Innovation

Dates: 9-10 November, 2020

Location: Online Conference

The aims of the research are: 1) to develop objective methods for assessing the cognitive complexity of educational texts by measuring the amount of semantic information they contain; 2) to determine the information density of theoretical statements (definitions, theorems, conclusions, etc.) presented in school mathematics textbooks. Here, information density is taken to equal the average knowledge-folding coefficient, i.e. the ratio of the amount of information to the volume of the text.
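The knowledge-folding coefficient defined above can be expressed as a one-line helper; this is an illustrative sketch, and the function name and units (information in complexity-weighted term counts, volume in words) are assumptions, not the authors' notation.

```python
def knowledge_folding(information_amount: float, text_volume: float) -> float:
    """Information density: amount of semantic information per unit of text volume.

    Hypothetical units: information_amount in complexity-weighted term
    occurrences, text_volume in words of the analyzed text.
    """
    return information_amount / text_volume
```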

The methodological basis of this research is the work of E. G. Gel'fman and M. A. Kholodnaya (psychodidactics); Ya. A. Mikk (textbook theory); A. V. Gidlevsky, T. A. Zdrikovskaya, I. S. Naumov, and V. S. Vykhovanets (difficulty and complexity of educational texts); N. K. Krioni, A. D. Nikin, A. V. Fillipova, Ch. Ch. Chang, and S. M. Silalahi (content analysis of texts); N. V. Lukashevich, Val. A. Lukov, and Vl. A. Lukov (the thesaurus approach); and N. B. Samsonov, E. V. Chmyhova, D. G. Davydov, and Yu. A. Tomina (cognitive complexity of scientific and educational texts). The applied method requires creating a sample of typical textual and mathematical statements that characterize the study of mathematics in the given grades and counting the terms that occur in them, taking their complexity into account. A dictionary-thesaurus of the terms used is created; their complexity is determined by decomposing complex concepts into simple ones and by the pair-comparison method. A special computer program is used to evaluate text complexity. It reads the file dictionary.txt, which contains a list of more than 200 mathematical terms with their assigned complexities, and the file F.txt, which contains the text being analyzed. The program takes each term from the dictionary and scans the text file line by line, counting the occurrences of that term.
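The counting step described above can be sketched as follows. This is a minimal reconstruction, not the authors' program: the file names dictionary.txt and F.txt come from the abstract, while the dictionary format (one "term;complexity" pair per line) and the whole-word matching rule are assumptions.

```python
import re

def load_dictionary(path="dictionary.txt"):
    """Read (term, complexity) pairs, assumed one per line as 'term;complexity'."""
    terms = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            if ";" in line:
                term, weight = line.strip().split(";", 1)
                terms[term.lower()] = float(weight)
    return terms

def text_information(terms, text_path="F.txt"):
    """Scan the text line by line, counting whole-word occurrences of each term.

    Returns the complexity-weighted term count (information amount), the text
    volume in words, and their ratio (the knowledge-folding coefficient).
    """
    info = 0.0
    volume = 0
    with open(text_path, encoding="utf-8") as f:
        for line in f:
            low = line.lower()
            volume += len(line.split())
            for term, weight in terms.items():
                # whole-word matches only, so "angle" is not counted inside "triangle"
                info += len(re.findall(r"\b%s\b" % re.escape(term), low)) * weight
    density = info / volume if volume else 0.0
    return info, volume, density
```

With a two-term dictionary such as "triangle;2.0" and "angle;1.0", the sentence "The angle sum of a triangle is 180 degrees." yields an information amount of 3.0 over 9 words.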

The difficulty of understanding a text depends on the average complexity of the sentences that form it. Using automated counting of the terms in a text, with their complexity taken into account, text files are analyzed, their information content is measured, and average values of the information density are estimated for the various grades. More than 15 different math textbooks are analyzed, from which the most typical theoretical statements and formulas are selected.

Evaluating the density of theoretical information in the school courses of mathematics (grades 5-6) and algebra (grades 7-11) showed that cognitive complexity increases slowly in grades 1-9 and grows rapidly in grades 10-11. Given estimates of the theoretical information density and the information volume for grades 1-11, we can determine the total amount of educational information that students receive in math lessons.
