About this paper

Appears in: ICERI2010 Proceedings
Pages: 2102-2111
Publication year: 2010
ISBN: 978-84-614-2439-9
ISSN: 2340-1095

Conference name: 3rd International Conference of Education, Research and Innovation
Dates: 15-17 November, 2010
Location: Madrid, Spain

USING THEORY BASED EVALUATION TO DISTINGUISH AND INTEGRATE RESEARCH AND EVALUATION FUNCTIONS IN LARGE-SCALE PROFESSIONAL DEVELOPMENT PROJECTS

C. Basile (1), R.B. Cobb (2), L. Sample McMeeking (3)

(1) University of Colorado Denver (UNITED STATES)
(2) Colorado State University (UNITED STATES)
(3) Edge Hill University (UNITED KINGDOM)
Introduction

Theory-based evaluation (TBE) has been explicated over the past two decades as an evaluation model for assessing how a program will work under certain conditions to solve identified problems (cf. Chen, 1990; Weiss, 1998), and has more recently been supported as a viable, evidence-based option in cases where randomized trials or high-quality quasi-experiments are not feasible (Weiss, 2002). Despite the model’s widely accepted theoretical appeal, there are far too few examples of its well-implemented use.

In this paper, we describe the development of a theory-based evaluation design in a Math and Science Partnership (MSP) research project funded by the U.S. National Science Foundation (NSF). Briefly, this initiative is a comprehensive attempt to “build capacity and integrate the work of higher education, especially its science, technology, engineering and mathematics (STEM) disciplinary faculty, with that of K–12 to strengthen and reform mathematics and science education” (NSF, 2010, p. 3). As of 2010, there were over 140 MSP projects, typically funded for five or more years and at multi-million-dollar levels.

Our initial assumptions about project efficacy led us to form a very simple theory of action for the teacher professional learning component of the project. In hindsight, because our initial theory of action was linear and simplistic, our evaluation and research questions were not particularly representative of the project as a whole and focused only on a very basic understanding of how teacher practice might impact student learning. In addition, the questions were theoretically disconnected: they did not illustrate a developmental and cohesive framework for the project, nor did they take into account the nuances, complexity, or limitations of the project as it was proposed. Finally, evaluation and research were not aligned, and their respective teams were not working well together.

The Findings

Subsequently, through several iterations, which are described in detail in our paper, we developed a graphical theory of action (a logic model, in current lexicon) with a set of characteristics that we believe conference participants will find interesting:

• We found that implementation theory assessment in the logic model aligned well with expected program evaluation activities, and program theory assessment aligned well with research activities.

• Because the NSF requires external evaluations of its funded projects, we were able to establish separate research and evaluation questions by separating implementation and program theory assessments, each with independent budgeting, staffing, and implementation activities.

• We modeled the strength and direction of hypothesized connections between constructs in the logic model based largely on a combination of existing theory and craft knowledge/intuition of project leadership. These hypothesized connections became a set of research and evaluation questions whose answers were derived through a range of descriptive, qualitative, and quasi-experimental designs.

• Despite separate implementation and program theory assessment processes, we were able to combine datasets across these activities to allow us to assess the integrity of the theory of action, not just the hypothesized connections within it.

Our presentation and paper will take the audience through the steps by which we came to these findings and found other applications for this work.
@InProceedings{BASILE2010USI,
author = {Basile, C. and Cobb, R.B. and Sample McMeeking, L.},
title = {USING THEORY BASED EVALUATION TO DISTINGUISH AND INTEGRATE RESEARCH AND EVALUATION FUNCTIONS IN LARGE-SCALE PROFESSIONAL DEVELOPMENT PROJECTS},
series = {3rd International Conference of Education, Research and Innovation},
booktitle = {ICERI2010 Proceedings},
isbn = {978-84-614-2439-9},
issn = {2340-1095},
publisher = {IATED},
location = {Madrid, Spain},
month = {15-17 November, 2010},
year = {2010},
pages = {2102-2111}}