20th Year Evaluation
Space Grant 20th Year Evaluation
Program Performance and Results Report Reviewer Training
Atlanta, GA
October 27th, 2008
Reviewer Role | Scoring Rubric | Special Considerations | Summary
Agenda
• Reviewer Role
• Scoring Rubric
– Guiding Principles
– Rubric Areas
– Scoring
– Strengths/Weaknesses
• Special Considerations
• Summary
Reviewers
• Reviewers are invited or selected by NASA Headquarters because of their ability to make an expert judgment based on available data.
• Reviewers are...
– Space Grant Directors
– NASA Headquarters Personnel
– Field Center Personnel
– Former Space Grant Directors
– Other individuals invited by NASA
Reviewers
• The Reviewer's role is to apply knowledge of the Space Grant program to make an independent, unbiased assessment of the assigned consortia.
Reviewers
“In my PPR…”
Reviewers
• Develop a working understanding of the NASA Education Outcomes:
– Contribute to the development of the Science, Technology, Engineering, and Mathematics (STEM) workforce in disciplines needed to achieve NASA's strategic goals (Employ and Educate).
– Attract and retain students in STEM disciplines through a progression of educational opportunities for students, teachers, and faculty (Educate and Engage).
– Build strategic partnerships and linkages between STEM formal and informal education providers that promote STEM literacy and awareness of NASA’s mission (Engage and Inspire).
What is a Rubric?
– A tool that defines and communicates criteria to assess performance.
– Standardizes assessment in areas where a great deal of subjective judgment is required.
The reviewer makes a judgment based on the outlined criteria.
Scoring Rubric: Definition, Categories, Methodology, Sample Rubric, Guiding Principles, Rubric Types/Areas, Scoring Process
Methodology
– An expert panel was identified to develop the rubric. The base panel included three Space Grant Program content experts and one measurement professional.
– The scoring rubrics are based on and directly aligned with the guidelines.
– Consensus was reached among all panel members on the final rubric.
Scoring Categories
Categories (Qualitative Judgment)    Scale (Quantitative Judgment)
• Missing                            0
• Poor                               2, 1
• Good                               5, 4, 3
• Excellent                          7, 6
• Not Rated                          NR
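As a minimal sketch, this category-to-scale mapping can be expressed as a lookup table. This is an illustration only; the names `SCALE` and `category_for` are our own inventions, not part of any official review tool.

```python
# Hypothetical sketch of the PPR qualitative-to-quantitative scale
# (illustrative only; not part of the official review site).
SCALE = {
    "Missing": [0],
    "Poor": [1, 2],
    "Good": [3, 4, 5],
    "Excellent": [6, 7],
    "Not Rated": ["NR"],
}

def category_for(score):
    """Return the qualitative category that contains a given score."""
    for category, values in SCALE.items():
        if score in values:
            return category
    raise ValueError(f"{score!r} is outside the 0-7 / NR scale")

print(category_for(4))     # Good
print(category_for("NR"))  # Not Rated
```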
Sample Rubric
Evaluation Topic: (e.g., Consortium Management, Higher Education, Research Infrastructure)
Associated CMIS Data: [list of specific CMIS data table(s), as appropriate]

0    Missing    The consortium did not address this required element.
1-2  Poor       There is inconclusive evidence indicating that the consortium is meeting the goals of the evaluation topic, and/or evidence is inconclusive because of contradictions between the data sources.
3-5  Good       Evidence indicates that the consortium is meeting the goals of the evaluation topic. There is consistency between the data sources, or there are minor inconsistencies.
6-7  Excellent  There is conclusive evidence indicating that the consortium is excelling at meeting the goals of the evaluation topic. The evidence is conclusive because of the consistency between all data sources.

**In the consortium-specific rubrics, the option "NR" is available and represents "No Rating." This means that there were no consortium-specific elements.
Guiding Principles
Five Guiding Principles:
– Alignment
– Rigor
– Context
– Consistency
– Results
Guiding Principles: Alignment
– The PPR Report and data demonstrate alignment with the Legislation, Program Objectives, and NASA programmatic guidance.
– The Reviewer judges how well the consortium delineates the state needs and aligns its programs with the Space Grant legislation, national program objectives, and NASA programmatic guidance.
Guiding Principles: Rigor
– The PPR Report articulates its purpose, SMART goals and objectives. It articulates a clear understanding of what the consortium was trying to accomplish and how its activities will be assessed.
– The Reviewer judges how well the consortium articulates its purpose, goals and objectives, and its assessment and evaluation plans.
Guiding Principles: Context
– Context refers to having an understanding of the resources the consortium dedicates to an area.
– Context also refers to understanding the level of resources a consortium has based on its grant type (pages 20 and 21 of the PPR Guidelines).
– The Reviewer judges how well the consortium justifies the portion of its resources allocated to each program element.
Guiding Principles: Consistency
– The CMIS data, where appropriate, validate the results reported in the PPR Report. Significant inconsistencies might indicate that PPR Report statements are questionable.
– The Reviewer judges the degree of consistency between the PPR Report analysis and the CMIS data.
Guiding Principles: Results
– The PPR Report and CMIS data give evidence that the consortium is making important achievements. The consortium is able to demonstrate tangible results.
– The Reviewer judges the results achieved relative to the resources allocated to each program element.
Guiding Principles
The Guiding Principles create a foundation for each reviewer. This foundation enables the reviewer to make consortium-specific judgments that are independent of other consortia.
Guiding Principles
Rating Guided by Principles: Alignment, Rigor, Context, Consistency, Results
Rubric Areas
The Rubric is designed with the same format as the Program Performance and Results Report.
– Each element of the PPR Report is unique. Because of this uniqueness, a rubric is customized for each element.
Rubric Types
• Each programmatic element has three rubric types:
– Description
– Core Criteria (The number of criteria varies by outcome)
– Impact/Results or Evidence of Success
Rubric Areas
• Executive Summary and Consortium Impact
• Foreword
• Consortium Management
– Description
– Core Criteria
• Strategic Plan, Consortium Structure/Network (Internal), Diversity, Consortium Operations, Resource Management, Collaborations and Partnerships Outside the Consortium
– Impact/Results
Rubric Areas
NASA Education Outcome 1
– Fellowship/Scholarship Program
– Research Infrastructure
– Higher Education
NASA Education Outcome 1: National Program Emphases
– Diversity
– Workforce Development
– Longitudinal Tracking
– Minority Serving Institutions
Rubric Areas
NASA Education Outcome 2
– Precollege Programs
NASA Education Outcome 3
– Public Service Program
Scoring Process
• Review the rubric for the section of the PPR Report being assessed.
• Read the PPR Report section being assessed.
• Consider CMIS data and other data sources associated with the section being assessed.
• Using the rubric, make a qualitative judgment on whether the consortium is "excellent," "good," or "poor."
• After a qualitative judgment is made on the level of the consortium, make a quantitative judgment on what integer score to assign to the consortium within the level.
• "Close the loop" by re-assessing your rating considering the qualitative and quantitative judgments. This is the italicized statement within each rubric qualitative area.
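The three judgment steps above can be sketched as a simple consistency check. This is purely illustrative; `LEVELS` and `close_the_loop` are invented names, and the real process is expert judgment, not code.

```python
# Illustrative sketch of the qualitative -> quantitative -> "close the loop"
# sequence; the integer ranges mirror the rubric's scoring scale.
LEVELS = {"Poor": (1, 2), "Good": (3, 5), "Excellent": (6, 7)}

def close_the_loop(level, score):
    """Confirm that the integer score falls within the chosen qualitative level."""
    low, high = LEVELS[level]
    if not low <= score <= high:
        raise ValueError(f"score {score} is outside the {level} range {low}-{high}")
    return (level, score)

print(close_the_loop("Good", 4))  # ('Good', 4)
```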
Scoring Process
[CONSORTIUM MANAGEMENT] RESOURCE MANAGEMENT
Provide an analysis of trends of matching fund sources and amounts, distribution of funds among affiliates, management and administrative costs, as well as allocation of funds across project elements. For each of these areas, discuss your strategy and rationale, as well as the associated strengths and weaknesses. Describe how staff resources are allocated in terms of management and administrative tasks, resource development, and/or project implementation.
ASSOCIATED CMIS DATA: Consortium Management: Match and Other Federal Funds Table, and all three Program Allocation Tables

1-2  Poor       There is a cursory discussion of the resource management strategy and rationale. There is inconclusive evidence that the resource management strategy contributes to the attainment of the SMART goals across project elements. There is no critical analysis of the strengths and weaknesses of the strategy and rationale. Staff resource allocation is not aligned, or is poorly aligned, with the overall consortium strategy and rationale. Evidence is inconclusive because of contradictions between the data sources. Overall, resource management is not consistent with the ability of the consortium to meet its goals.
3-5  Good       There is a good discussion of the resource management strategy and rationale. There is evidence that the resource management strategy contributes to the attainment of the SMART goals across project elements. There is an analysis of the strengths and weaknesses of the strategy and rationale. Staff resource allocation is aligned with the overall consortium strategy and rationale. There is consistency between the data sources, or there are minor inconsistencies. Overall, resource management is consistent with the ability of the consortium to meet its goals.
6-7  Excellent  There is an in-depth discussion of the resource management strategy and rationale. There is conclusive evidence that the resource management strategy contributes to the attainment of the SMART goals across project elements. There is a critical analysis of the strengths and weaknesses of the strategy and rationale. Staff resource allocation is directly aligned with the overall consortium strategy and rationale. Evidence is conclusive because of the consistency between all data sources. Overall, resource management enables the consortium to meet its goals.
1. Qualitative Judgment
2. Quantitative Judgment
3. Close the loop
Comments
• Statement Guidelines
– Maintain Self-Anonymity
– Avoid Referencing Individuals by Name
– State Complete Thoughts
– Make Specific, Concise Comments
– Maintain Objectivity in Positive and Negative Comments
Special Considerations: Comments, Expertise, Data, Not Rated, Demographics, Grant Types, Concurrence
Data
CMIS Data May Be a Starting Point
– The CMIS Data may not be representative of all data that are presented in the PPR Report.
A consortium may cite data that are outside the realm of the variables included in the CMIS database. These data should be considered in addition to any available CMIS data.
Reviewer Expertise
Poor or Good? Good or Excellent?
– It is possible that a consortium, in any PPR Report area being judged, has characteristics of poor, good, and/or excellent performance. The expertise of the reviewer is the deciding factor in these cases. The reviewer judges, based on the preponderance of the available evidence, whether the consortium is excellent, good, or poor.
Not Rated
NR?
– It is possible that the consortium-specific elements were not a focus of the consortium. As noted in the PPR Report Guidelines, the consortium is to state specifically in the description if an element was not applicable. If the Description provides an explicit statement that an element was not a focus, the consortium-specific rubric will be rated as "NR".
Not Rated
Is a Consortium Evaluation Harmed by NRs?
– No. NRs will not be included in the assessment compilations of criteria and impact/results.
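The neutral effect of NRs can be illustrated with a short sketch; `compile_scores` is a hypothetical name, and the actual compilation method is defined by the evaluation team.

```python
def compile_scores(scores):
    """Average the numeric rubric scores, skipping "NR" entries so that
    a Not Rated element neither raises nor lowers the compilation."""
    numeric = [s for s in scores if s != "NR"]
    return sum(numeric) / len(numeric) if numeric else None

print(compile_scores([6, 7, "NR", 5]))  # 6.0 -- the NR is simply ignored
```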
Demographics
Impacts Can Differ Based on State Demographics
– The demographics of the state may make it appear that the impact a consortium is having is insufficient based on the amount of resources dedicated to the area.
– Refer to the PPR Report Foreword to review the described consortium landscape.
– If a reviewer is from a state with demographics much different from those of the consortium being reviewed, the reviewer should utilize his/her expertise but not apply an unfair bias against the consortium. (This refers to the "Context" guiding principle.)
Grant Types
• The PPR Report Guidelines (pages 20-21) outline the Space Grant grant types:
– Designated
– Program Grant
– Capability Enhancement
• An in-depth understanding of the grant types is required so that a consortium's PPR receives a fair review. (This refers to the "Context" guiding principle.)
Consortium Concurrence
• The reviewer provides no rating related to concurrence.
• The Executive Panel will review this requirement.
Summary
• The Guiding Principles create a foundation for the reviewers.
• Use of the rubric standardizes scoring for the reviewers.
• Scoring
– Qualitative (Excellent, Good, Poor, Missing)
– Quantitative [7-6 (Excellent), 5-3 (Good), 2-1 (Poor), 0 (Missing)]
• Reviewers are the experts invited or selected to use their knowledge as a basis to make judgments.
Summary: Summary, Application, Comments, Site Review
Activity: Comment Evaluation
• The following slides contain actual reviewer comments from the 15th Year Evaluation.
• Consider the guidelines reviewed earlier and judge whether the comments are appropriate or inappropriate.
Activity: Comment Evaluation
• Effective Comments:
– The translation of science.nasa.gov into Spanish provides on-going impact to the Hispanic community in STATE and around the world. Excellent examples of collaboration with NASA Center. Very impressive impact through pre-college efforts -- not only bringing the Program to STATE, but the design and oversight of statewide professional development. This clearly demonstrates alignment and coordination with the state systemic reform efforts.
– While the purpose is clear, the description was lacking a discussion of measurable objectives with clearly defined metrics. The description was lacking a discussion of assessment and evaluation plan. According to the CMIS data, there has not been an underrepresented minority student award since 1998. In fact, according to CMIS, that’s the only underrepresented minority student in five years. Student participation research and mentoring with field centers and industry is not as conclusive as it could be. The discussion is a bit too general and appears to center around outreach activities.
Activity: Comment Evaluation
• There is not much analysis of what the needs are and how the consortium is organizing its resources to best address those. I would recommend that the director should convene a planning group in his state, including the principals and one or two outside persons, and go through the planning process.
Activity: Comment Evaluation
• There is not much analysis of what the needs are and how the consortium is organizing its resources to best address those. (Appropriate comment) I would recommend that the director should convene a planning group in his state, including the principals and one or two outside persons, and go through the planning process. (Inappropriate comment: it is not the reviewer's role to make recommendations)
Activity: Comment Evaluation
• The strategic implementation plan is clearly derived from the National program's strategic plan… promotes a variety of activities and is effectively working to meet the needs of its citizens.
• Strategic objectives clearly derived from National priorities. Evidence of analysis of state needs.
• Very complete.
Activity: Comment Evaluation
• The strategic implementation plan is clearly derived from the National program's strategic plan… promotes a variety of activities and is effectively working to meet the needs of its citizens. (Appropriate comment: states why the area is a strength)
• Strategic objectives clearly derived from National priorities. (Appropriate comment) Evidence of analysis of state needs. (Inappropriate comment: does not provide a qualitative assessment)
• Very complete. (Inappropriate comment: that’s it?)
Activity: Comment Evaluation
• Both are appropriate and comprehensive comments:
– The translation of science.nasa.gov into Spanish provides on-going impact to the Hispanic community in STATE and around the world. Excellent examples of collaboration with NASA Center. Very impressive impact through pre-college efforts -- not only bringing the Program to STATE, but the design and oversight of statewide professional development. This clearly demonstrates alignment and coordination with the state systemic reform efforts.
– While the purpose is clear, the description was lacking a discussion of measurable objectives with clearly defined metrics. The description was lacking a discussion of assessment and evaluation plan. According to the CMIS data, there has not been an underrepresented minority student award since 1998. In fact, according to CMIS, that’s the only underrepresented minority student in five years. Student participation research and mentoring with field centers and industry is not as conclusive as it could be. The discussion is a bit too general and appears to center around outreach activities.
Activity: Rubric Application
• NASA Ties rubric from the 15th Year Evaluation
NASA TIES: Relationships that have been established with NASA Centers and Enterprises for the purposes of implementation, coordination, communication, or dissemination.
ASSOCIATED CMIS DATA: Program Summary Statistics Reports – 5 Year Averages and 5 Year Cumulative. Also Fellowship and Scholarship: Award Recipient Demographics; Research: Participants; and Higher Education: Participants.

1-2  Poor       There is inconclusive evidence of existing relationships with NASA Centers and Enterprises. Plans to create relationships are not evident. If there is evidence of existing relationships, these relationships are disjointed or inconsistent and have no apparent goals. Evidence does not indicate a synthesis of the five-year evaluation period. Evidence is inconclusive because of contradictions between the data sources.
3-5  Good       There is evidence of existing relationships with NASA Centers and Enterprises. There is evidence that the relationships were established to assist the consortium in meeting program goals. Evidence indicates a synthesis of the five-year evaluation period. There is consistency between the data sources, or there are minor inconsistencies.
6-7  Excellent  There is conclusive evidence of formalized, existing relationships with NASA Centers and Enterprises. There is evidence that the relationships have developed into a partnership between the consortium and the NASA Centers and Enterprises that facilitates meeting consortium goals. There is evidence of products, processes, publications, or other accomplishments as a result of these relationships. Evidence indicates a synthesis of the five-year evaluation period that analyzes trends of the consortium. Evidence is conclusive because of the consistency between all data sources.
Activity: Rubric Application
• This consortium was rated as Excellent by all reviewers for this submission. (Potential identifying information removed.)
– NASA Ties: Strong ties exist between … NASA Centers. The Consortium works with NASA … through the Undergraduate Student Research Program (USRP). An employee from each Center, generally in the University Affairs Office, is assigned to work with … staff as Center Coordinator for USRP. This relationship is strengthened throughout the program cycle as… staff work closely with Center Coordinators on the application review and selection process, program marketing efforts, student placement and evaluation process… formally became a.. member in July 2003 and … serve on our Advisory Council. A … partnership exists with… In this effort, we also work… The Consortia has funded a… position… We continue our relationship… by funding one or two students each year. Additionally, we work with … experiments through two … universities. NASA… supports Consortium … projects and supported an … project. We manage the … Program for NASA … and the…Enterprise. Our ties to… are strengthened through … other joint educational projects. … provides a… administrative coordinator slot for NASA …. was a supporter of the… Experiment Program for which we sponsored … educators. Our working network with NASA Centers continues to expand as our program grows.
Activity: Rubric Application
• Why Excellent?
– NASA Ties Strong ties exist between … NASA Centers. The Consortium works with NASA … through the Undergraduate Student Research Program (USRP). An employee from each Center, generally in the University Affairs Office, is assigned to work with … staff as Center Coordinator for USRP. This relationship is strengthened throughout the program cycle as… staff work closely with Center Coordinators on the application review and selection process, program marketing efforts, student placement and evaluation process… formally became a.. member in July 2003 and … serve on our Advisory Council. A … partnership exists with… In this effort, we also work… The Consortia has funded a… position… We continue our relationship… by funding one or two students each year. Additionally, we work with … experiments through two … universities. NASA… supports Consortium … projects and supported an … project. We manage the … Program for NASA … and the…Enterprise. Our ties to… are strengthened through … other joint educational projects. … provides a… administrative coordinator slot for NASA …. was a supporter of the… Experiment Program for which we sponsored … educators. Our working network with NASA Centers continues to expand as our program grows.
6-7  Excellent  There is conclusive evidence of formalized, existing relationships with NASA Centers and Enterprises. There is evidence that the relationships have developed into a partnership between the consortium and the NASA Centers and Enterprises that facilitates meeting consortium goals. There is evidence of products, processes, publications, or other accomplishments as a result of these relationships. Evidence indicates a synthesis of the five-year evaluation period that analyzes trends of the consortium. Evidence is conclusive because of the consistency between all data sources.
Award Levels and Amounts

Awardees                 1998      1999      2000      2001      2002      Total     5 Yr. Average
Total Awards               65        69        65        66        53       318            64
Average Award Amount  $10,312   $12,195   $12,764   $13,156   $14,375   $62,802       $12,560
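The Total and 5 Yr. Average columns can be checked with simple arithmetic (a verification sketch; the variable names are ours). Note that the table's "Total" for Average Award Amount is the sum of the yearly averages, and 318/5 = 63.6 rounds to 64.

```python
# Verify the totals and five-year averages in the award table.
total_awards_by_year = [65, 69, 65, 66, 53]
avg_award_by_year = [10312, 12195, 12764, 13156, 14375]

assert sum(total_awards_by_year) == 318              # Total awards
assert round(sum(total_awards_by_year) / 5) == 64    # 5 Yr. Average (63.6 -> 64)
assert sum(avg_award_by_year) == 62802               # sum of yearly average amounts
assert round(sum(avg_award_by_year) / 5) == 12560    # 5 Yr. Average amount
print("table checks out")
```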
Site Review
Log into the review site at https://secure.spacegrant.org/20th/review/
Enter your email and password here
Site Review
Logging in brings you to the review summary page.
This page displays the consortia you will review. Click on a consortium name to enter your scores and comments.
Scores you have entered and saved will be displayed. Scores you still need to enter will be grayed out.
Site Review
These links allow the reviewer to advance to other rubric sections.
Select "Review Summary" to return to the summary page.
Enter your rating by selecting the radio button.
Site Review
You must press "Save Page" to save your data. If you return to the Review Summary or move to the next or previous page without pressing it, your data will be lost.
Site Review
The "Submit All Program Performance and Results Reviews" button is at the bottom of the Review Summary page. Select it only when you have completed all reviews; selecting it closes your review process.
Questions?
• Content-related Questions: [email protected]
• Technical Questions: