LEARNING ACTIVITIES WITH AI ROBOTS IN THE METAVERSE FOR PRIMARY EDUCATION
University of Rijeka (CROATIA)
About this paper:
Conference name: 20th International Technology, Education and Development Conference
Dates: 2-4 March, 2026
Location: Valencia, Spain
Abstract:
Emerging technologies such as the metaverse, educational robotics, and artificial intelligence offer opportunities to create interactive, context-rich learning experiences. They allow students to experiment and iterate within safe, motivating environments while engaging in hands-on, authentic tasks across rich digital and physical contexts. However, their integration in primary education remains limited due to infrastructure constraints, insufficient teacher preparation, time pressures, and privacy concerns. In particular, there are very few examples of classroom activities in which primary learners actively work with dedicated “AI commands” and directly interact with AI-driven robot behaviour, creating a notable gap in current practice. This paper addresses these challenges by presenting classroom-ready activities designed for primary learners.
The activities foster computational thinking, collaboration, and knowledge acquisition, and they also build foundational AI literacy by allowing students to observe and work with AI-driven behaviours in both simulated and physical environments. Rotating team roles (Planner, Checker, Coder, Tester) support equitable participation and maintain alignment with clearly defined learning objectives. Students can engage through a natural-language interface, block-based programming, or Python, depending on their prior experience. They also use dedicated AI commands that enable the robot to interpret sensor input, recognise objects, and respond autonomously across environments. The learning environment enables students to develop, test, and refine their solutions in simulation before transferring them to the physical robot, providing a safe and efficient space for experimentation. The learning scenarios that structure the activities are grounded in real-life problems and are organised to progress from simple to increasingly complex challenges.
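As a rough illustration of how such dedicated AI commands could look at the Python level, the sketch below stubs a robot whose "AI command" returns recognised objects one at a time. The `Robot` class and its method names (`ai_detect_object`, `display`) are hypothetical stand-ins invented for this sketch, since the abstract does not name a specific API; a real robot would return live detections rather than a scripted list.

```python
class Robot:
    """Hypothetical stand-in for a robot API with dedicated AI commands;
    real method names and behaviour will differ."""

    def __init__(self, detections):
        # Pre-scripted detections simulate on-board object recognition,
        # so the same loop runs identically in simulation and on hardware.
        self._detections = list(detections)
        self.log = []

    def ai_detect_object(self):
        # Dedicated "AI command": next recognised object, or None when done.
        return self._detections.pop(0) if self._detections else None

    def display(self, text):
        # Show text on the robot's display (here: record it for inspection).
        self.log.append(text)


robot = Robot(detections=["cat", "dog"])
seen = []
while (obj := robot.ai_detect_object()) is not None:
    robot.display(obj)
    seen.append(obj)
```

Keeping the student-facing loop free of any simulation-specific calls is what would let the same program transfer between the metaverse environment and the physical robot.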
Two activities that illustrate the approach are presented. In the delivery activity, students design a route and define robot states so the robot can use real-time computer vision to recognise traffic signs, such as STOP, and wait the required time before continuing. In the animal-recognition activity, the robot uses on-board object detection to identify animals, display their names, and store each detection. Because the same code executes in both the metaverse simulation and the physical robot with only minor adjustments, the activities support reliable and seamless transfer between environments. Both activities offer opportunities for extension and differentiation. In the delivery activity, additional constraints such as one-way streets, speed limits, restricted zones, or dynamic obstacles can be introduced to increase complexity and promote strategic planning. In the animal-recognition activity, new challenges can include unfamiliar species, distractor objects, or other variations in conditions, encouraging students to refine their detection logic and adapt their solutions to more realistic scenarios.
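The delivery activity's "define robot states" step can be pictured as a small state machine: the robot drives until its vision reports a STOP sign, waits the required time, then resumes. The sketch below is an assumed simplification for illustration; the state names, the tick-based timing, and the string labels returned by the vision system are all invented here, not taken from the actual activity materials.

```python
from enum import Enum, auto


class State(Enum):
    DRIVING = auto()   # following the planned route
    WAITING = auto()   # stopped at a STOP sign


STOP_WAIT_TICKS = 3  # illustrative: control ticks to wait at a STOP sign


def step(state, wait_left, sign):
    """One control tick: return (new state, remaining wait ticks).

    `sign` is the traffic sign reported by computer vision, or None.
    """
    if state is State.DRIVING:
        if sign == "STOP":
            return State.WAITING, STOP_WAIT_TICKS
        return State.DRIVING, 0
    # WAITING: count down the required time, then resume driving.
    wait_left -= 1
    if wait_left == 0:
        return State.DRIVING, 0
    return State.WAITING, wait_left


# Simulated camera feed: a STOP sign appears on the second tick.
signs = [None, "STOP", None, None, None, None]
state, wait = State.DRIVING, 0
trace = []
for sign in signs:
    state, wait = step(state, wait, sign)
    trace.append(state.name)
```

Because the controller is a pure function of (state, timer, detection), the suggested extensions such as speed limits or restricted zones could be added as further states without restructuring the loop.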
The proposed learning activities demonstrate how AI-supported metaverse robotics can be meaningfully incorporated into primary education by combining clear instructional structure, alignment with learning objectives, and seamless transitions between virtual and physical implementation.
Keywords:
Metaverse, educational robotics, artificial intelligence, primary education, collaborative learning, problem-based learning.