CASE STUDY OF PERFORMANCE IMPORTANCE ANALYSIS IN ASSURANCE OF LEARNING
George Mason University (UNITED STATES)
About this paper:
Appears in: EDULEARN16 Proceedings
Publication year: 2016
Pages: 293-298
ISBN: 978-84-608-8860-4
ISSN: 2340-1117
doi: 10.21125/edulearn.2016.1054
Conference name: 8th International Conference on Education and New Learning Technologies
Dates: 4-6 July, 2016
Location: Barcelona, Spain
Abstract:
Avery, McWhorter, Lirely, and Doty (2014) summarize the decade-plus history of assurance of learning (AOL) standards for continuous improvement in business schools. “As part of this addition, schools seeking to earn or maintain AACSB accreditation must develop a set of defined learning goals and subsequently collect relevant assessment data to determine direct educational achievement to develop assessment tools that measure the effectiveness of their curriculum.” The challenge of assessing assurance of learning is not limited to business schools; virtually all specialties, levels, and disciplines of education face a similar task.

To develop a schema of discipline competence that addresses this course and the populations it serves, eighteen elements were developed that reflect the key categories of study and stem from the central topics of the course. We labelled this analysis declarative knowledge competence analysis. To create metrics for each element, test items from a total of 240 questions were grouped by topic. Next, whether each item was answered correctly was recorded in a data file for statistical analysis. The percentage of items in each topic answered correctly was then computed. Last, students were rated on their performance across each of the eighteen domains of discipline knowledge and categorized against a set of performance standards. As an additional guide to the extent to which learning was taking place as a result of class activities, the students completed a series of online pre- and post-test quizzes.
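As a rough illustration of this tabulation, the following is a minimal sketch in Python/pandas, not the authors' actual code. The file name and column names (student_id, topic, correct) are illustrative assumptions, and the cut points for the performance bands are placeholders because the paper's actual standards are not reproduced in this abstract.

import pandas as pd

# One row per student per test item; 'correct' is 1 if answered correctly, else 0.
responses = pd.read_csv("item_responses.csv")  # hypothetical file

# Percentage of items answered correctly, by student and by topic
# (one column per discipline topic; eighteen expected).
pct_correct = (
    responses
    .groupby(["student_id", "topic"])["correct"]
    .mean()
    .mul(100)
    .unstack("topic")
)

# Placeholder performance bands for categorizing students on each topic.
bands = pd.cut(
    pct_correct.stack(),
    bins=[0, 60, 75, 90, 100],
    labels=["below expectations", "developing", "meets", "exceeds"],
    include_lowest=True,
)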

The nine areas in which the performance analysis results suggest more teaching effort is required are [1 = top priority]:
(1) Pricing Concepts,
(2) Personal Selling,
(3) Advertising,
(4) New Product Development,
(5) Integrated Marketing Communications,
(6) Retailing,
(7) Pricing Strategy,
(8) Supply Chain and
(9) Services Marketing.

The results of adding importance analysis (Martilla and James, 1977; LaFleur, Babin and Burnthorne Lopez, 2009) to the performance analysis presented a much different view for continuous teaching improvement and curriculum design. The “keep up the good work” topics (high importance, high performance) included [1 = top performance]:
(1) Overview of marketing,
(2) Global marketing,
(3) Consumer behavior analysis,
(4) B2B marketing,
(5) Marketing research,
(6) New product development,
(7) Segmentation targeting and positioning, and
(8) Marketing strategy.

Adding the importance factor to the performance results moves Segmentation targeting and positioning from priority 7 to priority 1, moves Overview of marketing from priority 1 to priority 2, moves Marketing strategy from priority 8 to priority 3, and so forth. Furthermore, some of the high-performance concept areas may have to receive less emphasis so that effort can shift to more important areas.
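To make the quadrant logic concrete, here is a minimal sketch of a mean-split importance-performance grid in the spirit of Martilla and James (1977). The three topics and their scores are placeholders rather than the study's data, and a mean split on each axis is only one common thresholding convention.

import pandas as pd

# Placeholder scores: performance = share of items answered correctly,
# importance = rated importance of the topic (both on a 0-1 scale here).
ipa = pd.DataFrame(
    {
        "topic": ["Overview of marketing", "Pricing Concepts", "Marketing strategy"],
        "performance": [0.82, 0.55, 0.74],
        "importance": [0.90, 0.60, 0.85],
    }
).set_index("topic")

# Split each axis at its mean to form the four classic IPA quadrants.
perf_hi = ipa["performance"] >= ipa["performance"].mean()
imp_hi = ipa["importance"] >= ipa["importance"].mean()

ipa["quadrant"] = "possible overkill"  # cells left after the overwrites below: high performance, low importance
ipa.loc[perf_hi & imp_hi, "quadrant"] = "keep up the good work"
ipa.loc[~perf_hi & imp_hi, "quadrant"] = "concentrate here"
ipa.loc[~perf_hi & ~imp_hi, "quadrant"] = "low priority"

# Ranking within a quadrant by importance reorders the teaching priorities
# relative to a performance-only ranking.
print(ipa.sort_values(["quadrant", "importance"], ascending=[True, False]))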

Depending on the mission of the academic unit, some topics may have to receive less instructional effort to address more important discipline competencies. In this example, the topics that could receive less emphasis include global marketing, consumer behavior analysis, B2B marketing, marketing research and new product development.
Keywords:
Assurance, learning, importance-performance analysis, discipline declarative knowledge.