DISTRIBUTED DOCUMENT AUTHORING FOR LOCATION-INDEPENDENT COLLABORATIVE LEARNING
University of Luxembourg (LUXEMBOURG)
About this paper:
Appears in: ICERI2017 Proceedings
Publication year: 2017
Pages: 4399-4408
ISBN: 978-84-697-6957-7
ISSN: 2340-1095
doi: 10.21125/iceri.2017.1175
Conference name: 10th annual International Conference of Education, Research and Innovation
Dates: 16-18 November, 2017
Location: Seville, Spain
Abstract:
Collaborative learning entails a group of people actively working together to comprehend a new concept or solve a given problem. While this teaching strategy is regularly applied when all participants are in the same location, e.g., a classroom, settings with geographically distributed collaborators are becoming increasingly common. Computer-supported collaborative learning (CSCL) generally aims at supporting both scenarios by providing a shared collaboration environment which can transparently integrate alternating phases of joint and separated work. However, most of these solutions have requirements and restrictions that directly affect the work of the collaborators. One of them is the reliance on a central server, which limits ad-hoc group collaboration in locations without the necessary server connection. Specific types of data, e.g., multimedia files, often require dedicated applications for editing outside the collaboration environment, inherently disconnecting participants from the group when working with this data. The real-time representation of the other collaborators' actions is potentially further restricted by the use of differential synchronization: sharing work with the group is mostly triggered manually, which both isolates participants from the group and requires work outside of the main task.

In this paper, we present a distributed document authoring environment developed to tackle these issues and actively support the collaborative learning effort of the individual participant. This collaboration environment is based on an advanced Compound Document System (aCDS) and concurrency-based command application and consistency management models. A supervisor is provided with real-time management and assessment tools for both group and individual performance. These tools provide element-specific access permissions to limit the availability of distinct document parts and the ability to focus on the current actions of a particular collaborator. The user-centric approach limits the impact of issues specific to distributed systems such as manual synchronization and conflict resolution processes.
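To illustrate what element-specific access permissions of this kind could look like in practice, the following minimal Python sketch models a supervisor granting per-element edit rights to individual collaborators. The names (Element, Permission, apply_edit) and the structure are hypothetical and are not taken from the paper.

from dataclasses import dataclass, field
from enum import Enum, auto


class Permission(Enum):
    READ = auto()
    EDIT = auto()


@dataclass
class Element:
    """One addressable part of the shared compound document."""
    element_id: str
    content: str = ""
    # Supervisor-managed, element-specific permissions per collaborator.
    permissions: dict[str, set[Permission]] = field(default_factory=dict)

    def can_edit(self, user: str) -> bool:
        return Permission.EDIT in self.permissions.get(user, set())


def apply_edit(element: Element, user: str, new_content: str) -> bool:
    """Apply an edit only if EDIT was granted for this element."""
    if not element.can_edit(user):
        return False  # element stays read-only for this user
    element.content = new_content
    return True


# Example: one exercise section is writable only for a single learner.
section = Element(
    "task-2",
    permissions={"alice": {Permission.READ, Permission.EDIT},
                 "bob": {Permission.READ}},
)
assert apply_edit(section, "alice", "Alice's solution")
assert not apply_edit(section, "bob", "Bob's solution")

Because permissions attach to individual elements rather than to the whole document, a supervisor can open or close distinct document parts for specific participants without interrupting the rest of the group.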

The aCDS provides the basis for a transparent integration of different, potentially incompatible data types into a single, shared document. This eliminates the need to disconnect individual users from the group to perform editing operations that would otherwise only be available in dedicated applications outside of the collaboration environment. Constant workflow interruptions caused by transitions between such applications are effectively prevented, and the concurrent operations of each collaborator are available to all others in real time. Compared to other CSCL approaches employing a file-based data representation, the aCDS utilizes a finer-grained data structure. This allows for the precise and concurrent attribution of editing operations to distinct elements within a document. In particular when collaborating closely, undesired interference with the work of others and the risk of synchronization-related conflicts are effectively limited. Furthermore, the applied command distribution and application model does not require a central server, thus enabling ad-hoc collaboration without an external network connection. As a result, the participants in a collaborative learning setting are provided with a responsive and feature-rich, yet unobtrusive work environment.
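The following Python sketch illustrates the general idea of element-level command attribution combined with serverless, peer-to-peer command distribution. It is only a conceptual illustration under assumed names (EditCommand, Peer); it is not the paper's actual consistency management model.

from dataclasses import dataclass, field
import itertools


@dataclass(frozen=True)
class EditCommand:
    """An editing operation attributed to one distinct document element."""
    seq: int          # per-author sequence number
    author: str
    element_id: str   # fine-grained target, e.g. a paragraph or image element
    new_content: str


@dataclass
class Peer:
    """One collaborator's replica; peers exchange commands directly."""
    name: str
    document: dict[str, str] = field(default_factory=dict)
    peers: list["Peer"] = field(default_factory=list)
    _seq: itertools.count = field(default_factory=itertools.count)

    def edit(self, element_id: str, new_content: str) -> None:
        cmd = EditCommand(next(self._seq), self.name, element_id, new_content)
        self.apply(cmd)
        for peer in self.peers:      # direct distribution, no central server
            peer.apply(cmd)

    def apply(self, cmd: EditCommand) -> None:
        # Commands address distinct elements, so concurrent edits to
        # different elements never overwrite each other.
        self.document[cmd.element_id] = cmd.new_content


alice, bob = Peer("alice"), Peer("bob")
alice.peers, bob.peers = [bob], [alice]
alice.edit("intro", "Shared introduction")
bob.edit("figure-1", "Sketch of the experiment")
assert alice.document == bob.document == {
    "intro": "Shared introduction",
    "figure-1": "Sketch of the experiment",
}

In this reading, the fine granularity is what keeps concurrent work from colliding: as long as collaborators edit different elements, their commands commute, and no manual synchronization or conflict resolution step is required.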
Keywords:
Collaborative learning, real-time collaboration, collaborative working environment, group and individual assessment, distributed authoring environment.