TO ANSWER, OR NOT TO ANSWER, THAT IS THE QUESTION: EXPERTS’ CONTRIBUTION TO QUESTION-ANSWERING PLATFORMS
1 HAN University of Applied Sciences (NETHERLANDS)
2 University of Patras (GREECE)
About this paper:
Conference name: 13th annual International Conference of Education, Research and Innovation
Dates: 9-10 November 2020
Location: Online Conference
Abstract:
The purpose of this research is to explore the factors that determine whether experts will answer learners’ questions on Community Question-Answering (CQA) platforms. While the pertinent literature mostly focuses on large-scale CQA platforms, such as Yahoo! Answers and Quora, we examine the medium-scale platform “100mentors” in order to explore experts’ behavior more efficiently. We examine the technological environment, the featured questions, and expert identity as the three categories that affect experts’ activity on such platforms.
CQA platforms are online information resources that dynamically develop their content through the interaction between users’ questions and answers. For a CQA platform to be effective as a source of information, it is crucial that the question response rate is high. Yet, regardless of the degree of expert participation in CQA platforms, many questions remain unanswered. First, with regard to the technological environment, the particular characteristics of each platform affect experts’ question-answering behavior; the most important among them are experts’ habit of participating, their membership status, and the platform’s reward system. Second, an important issue is the nature and content of the questions themselves, that is, whether experts perceive them as relevant, feasible, and worth answering. Finally, expert identity qualities, such as providing a service to the learning community (external regulation) and confirming their knowledge self-efficacy (internal motivation), are strong indicators of experts’ participation in CQA platforms.
Existing research lacks a coherent overview of the factors that determine experts’ activity in CQA platforms. Our research objective is to investigate which factors within the above-mentioned categories (technological environment, learners’ questions, and identity construction) can explain why and how much experts contribute to CQA platforms. For a more in-depth analysis of this issue, we focus on the “100mentors” mobile app, a medium-scale CQA platform that can provide both qualitative and quantitative data for this research.
To pursue our research objective, a research instrument needed to be developed; therefore, in this design study we designed, implemented, and evaluated a pilot survey. First, we developed the theoretical framework and hypothesis for the research instrument, informed by the literature review, user feedback, and user data from the platform. Then, on the basis of this framework, we developed the constructs and items for the pilot survey. Subsequently, we tested the pilot survey with a limited number of experts on the 100mentors platform (N=9). To evaluate the instrument, we conducted qualitative and quantitative analyses. Finally, we redesigned the pilot survey and developed the final survey.
The pilot survey data analysis led to a reshuffling of the items and the creation of new constructs for the survey, but it also offered some insights into the factors that affect experts’ participation in the 100mentors platform. With regard to the use of the platform, experts are especially concerned with the ease of finding questions they would like to answer. As for the questions themselves, they appreciate questions that are relevant to their expertise or experience and challenging either for themselves or for the learners. Finally, with respect to their personal identity as experts, they value their impact on young learners and the expansion of their own knowledge.
Keywords:
Community question-answering platforms, experts, research instrument, survey.