NASA Postdoctoral Program Evaluation of the NASA Innovations in Climate Education Portfolio
Ann M. Martin, [email protected]
NASA Postdoctoral Program • NASA Innovations in Climate Education (NICE) • Minority University Research & Education Program (MUREP)
Acknowledgments
• Ann was supported by an appointment to the NASA Postdoctoral Program at NASA Langley Research Center, administered by Oak Ridge Associated Universities through a contract with NASA.
• The NICE Management Team (Advisors: Margaret Pippin & Lin Chambers; Project Managers: Kate Spruill & Monica Barnes; Susan Givens, Dan Oostra, Andrea Geyer, Mary Jo Leber, Denice Dublin, Cassandra Small)
• NICE Awardees & Evaluators
• The Tri-Agency Evaluation Working Group
• Ruth Anderson, John Baek, Elisabeth Barnett, Rachel Becker-Klein, Bob Bleicher, John Burgette, Beth Cady, Hilarie Davis, John Fraser, Ellen Guisti, Carol Haden, Jim Hammerman, Kathy Haynie, Sue Henderson, Kimberle Kelly, Joan LaFrance, Shirley Lee, Teresa Lloro-Bidart, Carole Mandryk, Eugene Maurakis, Gerry Meisels, Jim Minstrell, Laura Munski, Mike Odell, Frank Rack, Texas Gail Raymond, Christa Smith, Martin Storksdieck, Sarah Yue, Dan Zalles, & others
75 Projects, 2008–2014
Goals of Evaluation: NASA & NICE
• To describe and understand the program
• To drive program improvement
• To promote and disseminate findings from funded projects
• To capture program stories through qualitative analysis that can’t be captured through simple quantitative metrics
• To make evaluation a more deeply embedded part of NASA-funded education and public outreach
NICE Evaluation Model
NICE Funded Project • Evaluators • NPP Postdoc • NICE Mgmt
• Webinars
• Telecons
• Conference meet-ups
• Online evaluator workspace
• Evaluation support & capacity building
• Rolling project findings up to the program level
• Tri-agency logic model
Example Result: Tri-Agency Logic Model
Outline: Evaluation Questions
1. What and how are projects evaluating?
2. To what extent are NICE program-level integration activities successful?
3. To what extent do NICE-funded projects partner with each other and with other institutions?
4. What challenges are encountered, and what lessons are learned, during project implementation?
5. What overall outcomes for the project portfolio can be assessed using available data? What promising practices are emerging from the NICE portfolio?
Outline: Major Findings
1. Meta-evaluation suggests opportunities for more varied evaluation practices across the portfolio. Evaluation literacy/capacity can be built within NASA to better support awardees and programs in conducting evaluations.
Outline: Major Findings
2. Analysis of PI experience surveys demonstrates that the unique program model of NICE, including the tri-agency collaboration, is highly valued and successful. Awardees turn to the NICE team as both an “insider friend” and a “customer service representative.”
Outline: Major Findings
3. Social network analysis of the NICE partnership network demonstrates that NICE reaches far beyond the 75 projects it has funded, reaching 377 unique institutions. NICE is driving a complex national network of institutions working on climate change education, including a range of types of institutions and a range of minority-serving institutions.
Outline: Major Findings
4. Projects encounter challenges with project timelines, participant recruitment, baseline quantitative skills among participants, and using technology in classroom settings. They learned lessons that suggest the importance of flexibility, communication, organization, and strategic planning.
Outline: Major Findings
5. NICE’s funded projects have contributed to the overall goals of the program, and to the goals articulated by a tri-agency collaboration engaged in climate change education, demonstrated through outcomes data and evaluative evidence.
Question 1: What and how are projects evaluating?
• Meta-Evaluation of the NICE Portfolio
• Sample language from the 2010 GCCE solicitation
Methods & Designs
Instruments & Protocols
• A lack of standardized climate change concept inventories, along with specific content needs for each project, led most of them to develop their own custom instruments.
Summary
• Strong focus on:
  • Summative evaluation
  • Nonexperimental designs
  • Self-report data
  • Match between project goals and evaluation questions
• Weaker focus on:
  • Formative or front-end evaluation
  • Comparison groups
  • Cause-and-effect relationships
  • Direct measurement
  • Details of analysis
Meta-evaluation: Recommendations
• As program officers:
  • Build basic evaluation "literacy" or evaluation capacity on teams
  • Know what you are looking for (objectives, outcomes, audiences, etc.)
• As writers of solicitations:
  • Provide some guidance for PIs and for evaluators
  • Model what you are looking for – e.g., logic models; clear, concise statements of outcomes and impacts
  • Recognize evaluation expertise as critical expertise
Question 2: To what extent are NICE program-level integration activities successful?
• Evaluation of the novel NICE program model
NICE as a Community of Practice
• Awardees highly value their relationships with NICE, with tri-agency projects, and with each other.
Relationships with NICE

Relationship Mode        Role for NASA: Expert Colleague         Role for NASA: Customer Service Representative
One-Way Communication    constantly available presence           staff serve as informants
Two-Way Engagement       active participant in project success   staff serve as helpers and troubleshooters
Awardee Reflections on Relationships with NICE
• "I found communication from the management team to be exceptional. From day 1 . . . it was made clear that my colleagues and I are members of a learning community facilitated by an active management team focused on providing as many opportunities as possible for learning from each other."
• "I had far more involvement with the program director on this grant than from any other grant . . . Keep that up!"
• "I . . . never felt 'lost in the crowd,' as can happen with some large programs."
• "This [tri-agency PI] meeting is a showcase of best practices . . . and should be observed and replicated."
Program Model Evaluation: Recommendations
• As service providers for awardees:
  • Continue pushing for personal, one-on-one relationships
  • Focus on stability & usability
  • NASA reporting is the biggest burden, and requires the most energy to flow smoothly!
• As facilitators of a community:
  • Remain engaged in broader (national) conversations and efforts related to the climate change education community
  • Use this expertise to proactively "match" awardees with desired resources
Question 3: To what extent do NICE-funded projects partner with each other and with other institutions?
• Social Network Analysis of the NICE Partnership Network
[Network diagram. Legend – Project: bright orange; Non-project: charcoal]
MSIs in the NICE Network
[Network diagram. Legend – HBCU: royal blue; Tribal Serving Institution: pale blue; Minority Serving School District: bright orange; HSI: pale orange; "Other" MSI: black]
MSIs in the NICE Network (MSI Category: Count)
• Historically Black Colleges and Universities (HBCUs): 26
• Hispanic Serving Institutions: 17 (6 community colleges; 11 4-year institutions)
• Tribal Serving Institutions (includes Tribal Colleges and Universities [TCUs] and others): 19 (7 community colleges; 4 4-year institutions; 2 school districts; 2 nonprofits; 4 tribal government bodies)
• Other Minority Serving Institutions: 13 (2 community colleges; 6 4-year institutions; 5 nonprofits)
• Minority Serving School Districts: 33
• In terms of number of relationships and embeddedness in the overall network, no statistical difference is observed between the total population of projects and the projects hosted at MSIs, nor between the full network and the nodes at MSIs.
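The MSI-vs-network comparison described above can be sketched as a permutation test on node degree (number of relationships). This is an illustrative sketch, not the evaluation's actual method or data: the edge list and institution names below are invented stand-ins for the real 377-institution NICE network.

```python
import random

def degrees(edges):
    """Count relationships (degree) per node in an undirected edge list."""
    deg = {}
    for a, b in edges:
        deg[a] = deg.get(a, 0) + 1
        deg[b] = deg.get(b, 0) + 1
    return deg

def permutation_test(deg, group, n_iter=5000, seed=0):
    """Two-sided permutation test for a difference in mean degree between
    `group` nodes and the rest of the network. Returns a p-value; a large
    p-value is consistent with 'no statistical difference'."""
    rng = random.Random(seed)
    nodes = list(deg)
    k = sum(1 for n in nodes if n in group)

    def mean_diff(selected):
        inside = [deg[n] for n in nodes if n in selected]
        outside = [deg[n] for n in nodes if n not in selected]
        return abs(sum(inside) / len(inside) - sum(outside) / len(outside))

    observed = mean_diff(group)
    # Re-draw "MSI-sized" subsets at random and see how often a difference
    # at least as large as the observed one arises by chance.
    hits = sum(1 for _ in range(n_iter)
               if mean_diff(set(rng.sample(nodes, k))) >= observed)
    return hits / n_iter

# Hypothetical stand-in for the NICE partnership network.
edges = [("Proj-1", "MSI-A"), ("Proj-1", "Univ-1"), ("Proj-2", "MSI-A"),
         ("Proj-2", "Univ-2"), ("Proj-3", "MSI-B"), ("Proj-3", "Univ-1"),
         ("Proj-1", "Proj-2")]
msis = {"MSI-A", "MSI-B"}
p_value = permutation_test(degrees(edges), msis)
```

A permutation test is a natural fit here because network degree distributions are rarely normal, so a nonparametric comparison avoids distributional assumptions.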
Network Analysis: Recommendations
• Shaping NICE through activities & selection:
  • Efforts to provide integration activities and networking opportunities have shaped a true national network; continue to focus on partnership (Ask NICE, PI meetings, etc.)
  • Identify "touchpoints" in the network, around which partnerships cluster.
• Facilitating partnership & collaboration:
  • Strive to keep the 4 new NICE-T projects tightly connected to colleagues as previous projects leave (37 projects have sunsetted)
Question 4: Challenges & Lessons Learned
• Or: “What challenges are encountered, and what lessons are learned, during project implementation?”
Key Challenges: Timelines, Technology, & Baseline Knowledge
(Some!) Lessons Learned
• Flexibility
• Assembling a team
• Tradeoff between more content and deeper content
• Resources
• Other climate change education resources
• Teachers!
• New standards
• Evaluation
Challenges & Lessons Learned: Recommendations
• Award timelines – lengthen or change?
  • Current idea: develop, pilot, revise, implement, evaluate, disseminate
  • What about breaking that down? "Develop/Pilot/Revise" vs. "Implement and Evaluate" vs. "Disseminate & Scale Up."
• Recognize technology as both an opportunity and a barrier
  • Recognize the demands on time, expertise, and budget
  • Allocate specific time and resources to content with these aspects
  • Keep the level of innovation manageable
Question 5: Outcomes & Promising Practices
• Or: “What overall outcomes for the project portfolio can be assessed using available data? What promising practices are emerging from the NICE portfolio?”
• Synthesis of research & evaluation findings
Outcome Synthesis Summary

Audience category (# of projects targeting it, of 38; total outcomes reported):
• K-12 Teachers: 35 projects; 82 outcomes
• K-12 Students: 21 projects; 23 outcomes
• Higher Education Faculty: 6 projects; 9 outcomes
• Higher Education Students: 12 projects; 31 outcomes

Outcome categories with reported evidence:
• Awareness: all four audience categories
• Knowledge (content/pedagogy): all four audience categories
• Interest: three audience categories
• Attitudes: three audience categories
• Confidence: two audience categories
• Behaviors: three audience categories
• Skills/Competencies: all four audience categories
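Rolling project-level findings up to a portfolio summary like the one above can be sketched as a simple aggregation over per-project evaluation records. The records below are invented examples for illustration, not actual NICE data, and the field names are hypothetical.

```python
from collections import defaultdict

# Hypothetical project-level evaluation records: each record lists the
# outcomes one project reported for one audience category.
reports = [
    {"project": "Proj-1", "audience": "K-12 Teachers",
     "outcomes": ["Awareness", "Knowledge", "Confidence"]},
    {"project": "Proj-2", "audience": "K-12 Teachers",
     "outcomes": ["Awareness", "Skills/Competencies"]},
    {"project": "Proj-2", "audience": "Higher Education Students",
     "outcomes": ["Interest"]},
]

projects_per_audience = defaultdict(set)  # audience -> projects targeting it
outcomes_per_audience = defaultdict(int)  # audience -> total outcomes reported
evidence = defaultdict(set)               # audience -> outcome categories with
                                          #   evidence (the table's checkmarks)

for r in reports:
    projects_per_audience[r["audience"]].add(r["project"])
    outcomes_per_audience[r["audience"]] += len(r["outcomes"])
    evidence[r["audience"]].update(r["outcomes"])
```

Tracking projects in a set (rather than a count) matters because one project can report against the same audience in multiple records; the set deduplicates before the "# of projects targeting" column is filled in.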
Evidence of Meeting Logic Model Outcomes
NICE Logic Model: Short-Term and Intermediate Outcomes

Short-term outcome → intermediate outcome:
• Educators have content knowledge & confidence; participants are more climate literate. → Educators can better engage their students; students and educators can critically evaluate information.
• Students are more interested in STEM education and careers. → More students pursue and complete STEM degrees (including teaching) and the workforce pipeline is more diverse.
• MSIs are aware of resources and opportunities. → Increased participation of MSIs in NASA education.

Further outcomes in the logic model:
• Increased (and sustained) collaborations, partnerships, and alignment in CCE investments.
• Products & resources are used by other programs.
• Co-developed community understanding of CCE, with advances in the teaching and learning of CCE.
• Sustained community of CC educators & evaluators.
• Impact & value of NICE is documented and communicated.
• Increased effectiveness of NICE projects (via NICE management team support).
• Advances in the understanding of how to effectively teach climate change concepts.
(Some!) Promising Practices
• Professional development for higher education faculty
• Incorporation of implementation planning time into professional development
• Connecting climate change content to factors that motivate students (social awareness, career interest, multicultural or family-oriented approaches to learning and knowing)
• Targeting development of the quantitative skills that underpin scientific research skills
Outcomes & Promising Practices: Recommendations
• NASA project monitoring:
  • Better output tracking, to capitalize on all of the resources & educational tools produced.
  • Making better use of evaluations (first step: getting our hands on them!)
• Further research into key practices:
  • Opportunities for longer-term evaluation in classrooms (a realistic timeline is longer than a NICE award)
  • Higher education students desire pipelines and course sequences that make climate change relevant to their degree programs and careers, but projects to engage faculty were successful when they "plugged in" climate change to existing offerings.
NICE Evaluation Summary
• NICE has developed an innovative model for NASA-funded grant/cooperative agreement management
• Awardee experiences are significantly strengthened by constant contact with NASA (expert colleague/customer service)
• Awardees and their projects benefit from involvement in a national network of climate change educators
• NICE's partnership focus has extended its reach well beyond the 75 funded projects
• Evaluation continues to be a frontier for NASA education
Evaluation Within NASA
• Coordinated strategy
  • e.g., Paperwork Reduction Act: duplication of effort, with too many small projects trying to climb over a large barrier
• Evaluation capacity & support
  • e.g., logic models: a useful tool for communicating the program, keeping on task, and enabling evaluation
• Use of project evaluations
  • How can we ensure that programs use this critical piece of what they fund?