Relationships between Involvement and Use in the Context of Multi-site Evaluation
American Evaluation Association Conference, November 12, 2009


Transcript

  • Slide 1
  • Relationships between Involvement and Use in the Context of Multi-site Evaluation, American Evaluation Association Conference, November 12, 2009
  • Slide 2
  • Beyond Evaluation Use: a four-year NSF grant to study the relationships between involvement in program evaluation and use/influence. Research team (2 co-PIs and 8 graduate students) based at the University of Minnesota. Context: four NSF-funded multi-site programs. Focus on involvement and use by users who were not directly intended (unintended users).
  • Slide 3
  • Framework for Involvement: Cousins and Whitmore's (1998) systematic collaborative inquiry (control of the evaluation, stakeholder selection, depth of participation) and Burke's (1998) key decision points (evaluation stages, activities, levels of control).
  • Slide 4
  • Framework for Use (type: use for; definition, the use of knowledge...)
    Instrumental: Action; the use of knowledge for making decisions
    Conceptual or Enlightenment: Understanding; the use of knowledge to better understand a program or policy
    Political, Persuasive, or Symbolic: Justification; the use of knowledge to support a decision someone has already made or to persuade others to hold a specific opinion
  • Slide 5
  • Framework for Use and Influence (term: definition)
    Evaluation use: the purposeful application of evaluation processes, findings, or knowledge to produce an effect
    Influence ON evaluation: the capacity of an individual to produce effects on an evaluation by direct or indirect means
    Influence OF evaluation (from Kirkhart, 2000): the capacity or power of evaluation to produce effects on others by intangible or indirect means
  • Slide 6
  • More Recent Developments: Kirkhart (2000) defines evaluation influence as the capacity of persons or things to produce effects on others by intangible or indirect means, and maps influence along three dimensions: source, intention, and time. Mark and Henry (2003) and Henry and Mark (2004) examine intangible influence on individuals, programs, and communities, arguing that a focus on direct use of evaluation results or processes alone is not adequate.
  • Slide 7
  • Beyond Evaluation Use: NSF Programs (name of program: years of evaluations)
    Local Systemic Change through Teacher Enhancement (LSC): 1995 - present
    Advanced Technological Education (ATE): 1998 - 2005
    Collaboratives for Excellence in Teacher Preparation (CETP): 1999 - 2005
    Building Evaluation Capacity of STEM Projects: Math Science Partnership Research Evaluation and Technical Assistance Project (MSP-RETA): 2002 - present
  • Slide 8
  • Four Programs and Their Evaluations. ATE (Advanced Technological Education): mainly community-college-level projects to enhance the workforce; evaluation included site visits and a yearly survey. LSC (Local Systemic Change): professional development for STEM in K-12 school districts; evaluation included observations, interviews, and surveys.
  • Slide 9
  • Four Programs and Their Evaluations (cont.). CETP (Collaboratives for Excellence in Teacher Preparation): projects to improve STEM teacher education; evaluation included surveys and observations. MSP-RETA (Math Science Partnership Research Evaluation and Technical Assistance): evaluation technical assistance included national meetings and provision of consultants.
  • Slide 10
  • Methods: surveys of project PIs and evaluators in the four programs (645 respondents, 46%); document review; interviews with key informant project personnel (29); citation analysis (246 documents, 376 citations); survey of NSF PIs (191 respondents, 54.7%); in-depth analytic case studies.
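    A minimal back-of-the-envelope sketch, assuming the percentages above are survey response rates: the approximate number of people invited to each survey can be recovered from the respondent counts. The survey labels and variable names below are illustrative only, not taken from the study's instruments.

```python
# Hedged sketch: back-calculate the approximate number of people invited to each
# survey, assuming the reported percentages are response rates (respondents / invited).
surveys = {
    "Project PIs and evaluators": {"respondents": 645, "response_rate": 0.46},
    "NSF PIs": {"respondents": 191, "response_rate": 0.547},
}

for name, s in surveys.items():
    invited = s["respondents"] / s["response_rate"]
    print(f"{name}: ~{invited:.0f} invited, {s['respondents']} responded "
          f"({s['response_rate']:.1%})")
```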
  • Slide 11
  • Results: Perception of Evaluation Quality (ability to conduct high-quality evaluation; being recognized as capable). Interface with NSF (evaluators as brokers and negotiators; NSF leveraging involvement and use; importance of dissemination). Life Cycles (program, projects, individuals).
  • Slide 12
  • Results: Project Control (complete choice versus required involvement; the balance affects use). Community and Networking (outreach; development of a community of practice; mutual respect; skill sharing; process use).
  • Slide 13
  • Results: Tensions (where best to spend time and money; balancing local and national evaluation; balancing project and evaluation goals). Uniqueness (complex contexts; individual responses).
  • Slide 14
  • Implications: Participants were differentially affected by the depth and breadth of involvement in evaluation activities. Neither breadth nor depth was consistently predictive of perceived level of involvement. The lack of consistency between perceived involvement and use makes measuring involvement challenging. Any investigation is likely to be substantially affected by the nature of the evaluation and the characteristics of the individual.
  • Slide 15
  • Limitations: Only four instances of large, multi-site NSF evaluations were studied, so generalizations to other settings are not possible, although potentialities can be suggested. The case studies are based on self-report data along with some archival records. The numbers of people surveyed and interviewed are small but appear to be at least representative of the groups included. The instruments used for data gathering were developed as part of the project and therefore might not be valid as measures of involvement and use in other contexts.
  • Slide 16
  • Future Research: Research on the causal nature of the relationship between involvement and evaluation use. The themes presented here provide fruitful areas for further investigation. The cross-case analysis provides a strong baseline for more positivistic research. Examine the issues raised here through quantitative path analytic procedures. Develop strong theories about the relationship between involvement and use that could form the basis for hypothesis formulation.
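    Slide 16 points toward quantitative path analytic procedures; the sketch below shows one way such a model could be estimated as a pair of ordinary least squares regressions. The variable names (involvement, process_use, instrumental_use) and the synthetic data are assumptions for illustration, not measures or results from this study.

```python
# Hedged sketch: a simple path model (involvement -> process use -> instrumental use)
# fitted as two OLS regressions on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
involvement = rng.normal(size=n)
process_use = 0.5 * involvement + rng.normal(size=n)
instrumental_use = 0.4 * process_use + 0.2 * involvement + rng.normal(size=n)
df = pd.DataFrame({"involvement": involvement,
                   "process_use": process_use,
                   "instrumental_use": instrumental_use})

# Path a: involvement -> process use
m1 = smf.ols("process_use ~ involvement", data=df).fit()
# Paths b and c': process use and involvement -> instrumental use
m2 = smf.ols("instrumental_use ~ process_use + involvement", data=df).fit()

# The indirect effect of involvement on instrumental use via process use is a * b.
indirect = m1.params["involvement"] * m2.params["process_use"]
print(m1.params, m2.params, sep="\n")
print("indirect effect:", round(indirect, 3))
```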
  • Slide 17
  • Note This material is based upon work supported by the National Science Foundation under Grant No. REC 0438545. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
  • Slide 18
  • For Further Information: Online - http://cehd.umn.edu/projects/beu/default.html. E-mail: [email protected]. Research Team: Dr. Frances Lawrenz, Dr. Jean A. King, Dr. Stacie Toal, Kelli Johnson, Denise Roseland.