WHAT CAN WE LEARN BY TEACHING MACHINES TO UNDERSTAND TEAM-ORIENTED PLAY?
Columbia College Chicago (UNITED STATES)
About this paper:
Appears in:
ICERI2009 Proceedings
Publication year: 2009
Pages: 6363-6369
ISBN: 978-84-613-2953-3
ISSN: 2340-1095
Conference name: 2nd International Conference of Education, Research and Innovation
Dates: 16-18 November, 2009
Location: Madrid, Spain
Abstract:
Much has been made of learning cohorts and teamwork within the classroom. We propose that near-term and future technology may pave the way for software-driven team facilitation in classrooms, conference rooms, and anywhere else that small teams gather to build, create, learn and decide. Leveraging sensor-laden environments and machine learning technologies, could we soon bring small-team "best practices" to broader audiences, including the classroom, using smart software agents that listen, watch and, when appropriate, respond? Indeed, just how "smart" could the smart classroom become?
This presentation will discuss the various supporting technologies that raise the plausibility of behavioral "facilitators in a box," from low-cost sensors to low-cost high-performance computing, data streaming and machine learning mechanisms. If we can build software systems that recognize individual and even team behaviors from established typologies at or near real time in classroom settings, it follows that these data could trigger appropriate interventions and machine-generated suggestions for improved outcomes. For example: an instructor's monitor might make timely suggestions about which student would benefit from a targeted question, while a student's support-system screen might nudge the student to contribute something relevant from their previous homework.
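To make the idea concrete, the sketch below (in Python, using entirely hypothetical names such as Event, classifier and notify, none of which come from the Construct codebase) shows one way a real-time loop might classify utterances against a behavior typology and surface suggestions to an instructor's display. It is a sketch of the concept under those assumptions, not a description of any implemented system.

from dataclasses import dataclass

@dataclass
class Event:
    speaker_id: str
    transcript: str   # speech-to-text output for one utterance
    timestamp: float

def facilitation_loop(event_stream, classifier, notify):
    """Classify each utterance and surface instructor-facing suggestions."""
    contributions = {}
    for event in event_stream:
        # Hypothetical classifier mapping an utterance to a behavior label,
        # e.g. "coordination" or "monitoring".
        label = classifier.predict(event.transcript)
        contributions[event.speaker_id] = contributions.get(event.speaker_id, 0) + 1
        if label == "coordination" and contributions[event.speaker_id] == 1:
            notify(f"{event.speaker_id} just joined the coordination discussion.")
    # Students with few contributions might benefit from a targeted question.
    quiet = [s for s, n in contributions.items() if n < 2]
    if quiet:
        notify(f"Consider directing a question to: {', '.join(quiet)}")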
This forward-looking presentation will be grounded in an overview of Construct, a related two-year software development research agenda that concluded in June 2009 and has received additional funding to extend the research through 2010.
The Construct project involved the development of an original multiplayer, collaborative videogame that served as a data streaming source; these game data, along with real-time speech recordings and speech-to-text data, were fed into a supporting software architecture. The Construct project is an effort to establish a software suite and research paradigm that may eventually support the lofty ambitions of "team facilitator bots," and it provides a compelling context in which to examine the software and research requirements to move technology toward that outcome.
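As an illustration only (this is not the Construct schema), the following Python sketch shows one way timestamped game telemetry and speech-to-text utterances could be merged into a single time-ordered stream before being handed to downstream coding and analysis components.

import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Record:
    timestamp: float
    source: str = field(compare=False)   # e.g. "game" or "speech" (illustrative)
    payload: dict = field(compare=False)

def merge_streams(game_events, speech_events):
    """Yield records from two individually time-ordered streams in timestamp order."""
    return heapq.merge(game_events, speech_events, key=lambda r: r.timestamp)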
The Construct project collected over 130 million rows of discrete data about subject interaction and game play, as well as 60 hours of participant interaction recorded as audio and video. These data were then coded against an established team behavior taxonomy (Marks, Zachary, et al., 2000). We will present our results on the machine learning side of the architecture, including our ability to train classifiers from manually coded game sessions and apply those classifiers to automatically code new sessions. This presentation will also describe aspects of the new round of research, which moves a group of the subjects out of the virtual world and into the real one: four subjects will interact around a small, sensor-laden (audio, video, gaze-tracking) tabletop as they supervise cadres of teams working to complete the game-based task.
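As a minimal sketch of the general train-then-auto-code approach (not the actual Construct pipeline, and using scikit-learn purely as an illustrative library), a text classifier could be fit on utterances from the manually coded sessions and then applied to label new, uncoded sessions:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def train_coder(coded_utterances, taxonomy_labels):
    # Fit a simple bag-of-words classifier on manually coded utterances.
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                          LogisticRegression(max_iter=1000))
    model.fit(coded_utterances, taxonomy_labels)
    return model

def auto_code(model, new_utterances):
    # Apply the trained classifier to code an unlabeled session.
    return model.predict(new_utterances)

# Hypothetical usage with made-up data:
# model = train_coder(["let's split the tasks", "nice work everyone"],
#                     ["coordination", "motivation"])
# print(auto_code(model, ["who takes the left side?"]))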
Finally, the presentation will survey the state of the art in technologies that may support the creation of AI-driven team facilitation, with particular attention to speech-to-text recognition and emotion detection in speech.
Keywords:
smart classrooms, sensors, smart environment, small teams, sensor fusion.