
Page 1

NSF Broadening Participation in Computing Alliances (BPC-A)

Introduction to Common Core Indicators of Breadth and Depth of Participation

Daryl E. Chubin, Ph.D., Senior Advisor

American Association for the Advancement of Science

CE21—Portland, Oregon

January 14, 2013

Page 2

Evaluation Technical Assistance Team

Daryl E. Chubin, American Ass’n for the Advancement of Science

Betsy Bizot, Computing Research Association

Tom McKlin, The Findings Group

Alan Peterfreund, SageFox Consulting Group

In collaboration with BPC-A project evaluators . . .

This has been a community-driven undertaking.

Page 3

Broadening Participation in Computing Alliances

History

• Program launched in 2005-2006 with funding of 10 Alliances

• Multi-institutional interventions in any segment of the education pathway, precollege (formal and informal) to workforce

• Most Alliances focused on the undergraduate segment

• In 2009, the Alliance evaluators undertook to develop, in addition to annual project report information, a set of Common Core Indicators that would allow reporting of progress at the program level

• In 2011-2012, 10 of 13 Alliances submitted a Common Core report to a review team

• The review team assembled a program-level report (data + narrative), which was submitted to NSF in August 2012

Sources

• Chubin and Johnson, Telling the Stories of the BPC Alliances, June 2010, http://php.aaas.org/programs/centers/capacity/documents/BPC%20Stories.pdf

• Chubin et al., NSF’s BPC Alliances Program—A Report on Common Core Indicators of Breadth and Depth of Participation, August 2012 (available on request)

Page 4

BPC Alliances—Common Core Indicators (2009-2012)

• The review team identified three indicators that, by consensus among BPC Alliance (BPC-A) evaluators, cover the “common core” at the heart of all the NSF-funded Alliances.

• Detailed descriptions of and measurement strategies for each indicator were developed.

• The three indicators are:

1. Individual Participation and Outcomes

2. Organization Capacity Development

3. Alliance Impact

Page 5

CCI Methodology and Review Process

• Each Alliance is different, so each report was reviewed individually to contextualize its metrics and terminology. Each indicator was then applied across all Alliances.

• Most alliances were captured in their latter stages (typically year 4 or 5). The composite yielded a picture of program performance.

• The team notes the difference between:
  - “broadening participation” and measuring broadening participation
  - what was designed (by intention) and what was implemented (through project evolution)

Page 6

Indicator 1: Individual Participation and Outcomes

• The review team used a single year, academic year 2010-2011, to capture both in-school and summer activities for all projects.

• Over 90% of K-12 students participate in what BPC-A evaluations report as a level of “deeper engagement.” Evidence is strongest for participants who are more deeply engaged, with the least evidence found for undergraduate students merely “touched” by the program.

• The majority of faculty participation consists of limited engagement among K-12 teachers. Faculty who were more deeply engaged in the BPC projects were more likely to effect positive change in their undergraduate students’ attitudes toward, and persistence in, computing.

Page 7

Indicator 2: Organization Capacity Development

• For this indicator, all active years of the project were used, i.e., as a cumulative measure.

• All Alliances except one intentionally build faculty communities; 8 of 10 Alliances intentionally build student communities. Beyond academic institutions, BPC projects have an impact, through collaboration, on youth- and community-based organizations, professional associations, industry partners, and public sector agencies.

• All 10 projects present evidence of changes in knowledge/skill development, generating/disseminating tools, and expanding stakeholder awareness.

Page 8

Indicator 3: Alliance Impact

• Without the alliances fostering identity with the broader community, no claim of cultural change as a national phenomenon in computing can be made.

• Six of the 10 Alliances affirmed a transition in developing ties with entities not originally designated as partners, with expansion and progress toward sustainability observed. Alliances have impacted a wide variety of organizations outside their initial partners, most notably other institutions of higher education, other K-12 institutions, and the broader CS community.

Page 9

Community

• The idea of building community cuts across all common core indicators for BPC Alliances. Community is intertwined with the activities an Alliance conducts and with the experiences of its participants.

• The Alliances share no common framework for tabulating the value of community participation.

• Within the Alliances, community appears to serve three key functions:
  - a strategy for deepening engagement,
  - a way of leveraging or increasing the impact of formal instruction or interaction during an activity, and
  - a way of sustaining the effect of a short-term activity, such as a workshop, by providing an avenue for continuing support and engagement.

Page 10

Recommendations

• Indicators should be assembled at the outset of new programs, rather than developed concurrently with them.

• NSF expectations for new programs should be made clear in the solicitation, with the caveat that overly rigid expectations can stifle innovation.

• A common language is needed for describing both the processes and outcomes around community-building. Words like “collaborator” and “partner” have different meanings for different practitioners, yet are used interchangeably.

• The story of the BPC Alliances cannot be told with numbers alone. While vigilant measurement is essential, so too is attention to the unique characteristics of each Alliance and the circumstances in which it is expected to thrive.