Assessing Research-Doctorate Programs: A Methodology Study.


Transcript of Assessing Research-Doctorate Programs: A Methodology Study.

Page 1

Assessing Research-Doctorate Programs: A Methodology Study

Page 2

Committee Task

• Review and revise the methodology used to assess the quality and effectiveness of research doctoral programs.

• Explore new approaches and new sources of information about doctoral programs and new ways of disseminating these data.

• Recommend whether to conduct a full assessment using the methodology developed by the committee.

Page 3

History of NRC Assessments

• 1982 “Assessment of Research-Doctorate Programs in the United States”

Lyle V. Jones (Co-Chair)
Gardner Lindzey (Co-Chair)

• 1995 “Research-Doctorate Programs in the United States: Continuity and Change”

Marvin L. Goldberger (Co-Chair)
Brendan Maher (Co-Chair)

Page 4

Perceived Strengths of Prior NRC Assessments

• Authoritative source

• Comprehensive

• Clearly stated methodology

• Temporal continuity

• Widely quoted and utilized

Page 5

Perceived Weaknesses of Prior NRC Assessments

• Spurious precision of program rankings

• Confounding of research reputation and educational quality

• Soft criteria for assessments of programs

• Ratings based on old data

Page 6

Weaknesses continued…

• Poor dissemination of results for some audiences

• Taxonomy categories out of date

• Validation of data inadequate

Page 7

Design of the Methodology Study

• Formation of a committee; definition of tasks.

• Panel meetings to define questions and discuss methodology. Panels:

– Taxonomy and interdisciplinarity
– Quantitative measures
– Student processes and outcomes
– Reputation and data presentation

• Pilot trials of questionnaires and taxonomy.

Page 8

Recommendations

• Spurious precision issue:

The committee recommends a new statistical methodology to make clear the probable range of rankings for each assessed academic unit.

Page 9

Alternative Approach to Rankings to Convey Rating Variability

• Draw ratings at random.

• Calculate rating for that draw.

• Repeat process enough times to reach statistical reliability.

• Present distribution of ratings from all the draws.
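The procedure above can be sketched as a simple bootstrap over rater scores. Everything in the sketch below — the data layout, the function names, and the choice to resample each program's ratings with replacement — is an illustrative assumption, not the committee's actual implementation:

```python
import random
import statistics

def rating_distribution(ratings_by_program, n_draws=1000, seed=0):
    """For each of n_draws, resample every program's rater scores at
    random (with replacement), rank programs by mean rating on that
    draw, and collect the resulting rank for each program."""
    rng = random.Random(seed)
    ranks = {name: [] for name in ratings_by_program}
    for _ in range(n_draws):
        # Calculate a mean rating for this random draw of raters.
        means = {
            name: statistics.mean(rng.choices(scores, k=len(scores)))
            for name, scores in ratings_by_program.items()
        }
        # Rank programs by mean rating for this draw (1 = best).
        ordered = sorted(means, key=means.get, reverse=True)
        for rank, name in enumerate(ordered, start=1):
            ranks[name].append(rank)
    return ranks

def quartile_range(rank_list):
    """Summarize one program's ranks as (first quartile, third quartile)."""
    qs = statistics.quantiles(sorted(rank_list), n=4)
    return qs[0], qs[2]
```

Presenting, say, the interquartile range of each program's ranks across all draws conveys the point of the recommendation: a program nominally ranked 4th might plausibly fall anywhere from 2nd to 7th.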

Page 10

Page 11

Recommendations continued…

• Research versus education issue:

– Drop the reputational estimate of education quality, as it is not independent of the reputational estimate of program quality.

– Add quantitative indicators of educational offerings and outcomes.

Page 12

Program Measures and a Student Questionnaire

• Questions to programs

– Size
– Student characteristics and financing
– Attrition and time to degree
– Competing programs

Page 13

Program Measures and a Student Questionnaire continued…

• Questions to students in selected fields

– Employment plans
– Professional development
– Program environment
– Infrastructure
– Research productivity

Page 14

Recommendations continued…

• Soft criteria issue:

Add quantitative measures concerning research output, citations, student support, time to degree, etc.

Page 15

Examples of Indicators

• Publications per faculty member

• Citations per faculty member

• Grant support and distribution

• Library resources (separating out electronic media)

• Laboratory space

• Interdisciplinary centers

Page 16

Recommendations continued…

• Poor dissemination issue:

– Add analytic essays to archival book output.
– Add updateable current web output.
– Add electronic assessment tools.
– Add links from professional societies.

Page 17

Recommendations continued…

• Taxonomy issue:

– Update the 1995 taxonomy.

– State clear criteria.

– Consult professional societies, administrators, and faculty.

– Allow for two academic categories (rated programs and emerging fields).

– Name subfields to help universities classify their programs.

– Allow faculty to be in more than one program.

– Include two sub-threshold humanities fields (classics and German) to maintain continuity.

Page 18

Recommendations continued…

• Validation issue:

Conduct pilot studies and institute checks, both by institutional respondents and by external societies.

Page 19

Pilot Institutions

• University of Maryland

• Michigan State University

• Florida State University

• University of Southern California

• Yale University

• University of Wisconsin-Milwaukee

• University of California, San Francisco

• Rensselaer Polytechnic Institute

Page 20

What’s next

• Obtain financing for the full study from both federal and foundation sponsors.

• If funding is obtained:

– Full study would begin in spring 2004.
– Data collection in 2004/2005 for the previous academic year.
– Final report in summer 2006.

Page 21

Conclusion

The study that the Committee recommends is a BIG undertaking in terms of survey cost and the time of graduate programs and their faculty. Why is it worth it?

It will provide faculty, students, and those involved with public policy with an in-depth look at the quality and characteristics of the programs that produce our future scientists, engineers, and those who help us understand the human condition.

Page 22

Committee

Jeremiah Ostriker, Princeton (Astrophysics), Chair

Elton Aberle, U. of Wisconsin (Ag)
John Brauman, Stanford U. (Chem)
George Bugliarello, PolyNY (Eng)
Walter Cohen, Cornell U. (Hum)
Jonathan Cole, Columbia U. (Soc Sci)
Ronald Graham, UCSD (Math)
Paul Holland, ETS (Stat)
Earl Lewis, U. of Michigan (History)
Joan Lorden, U. of Alabama-Birmingham (Bio)
Louis Maheu, U. de Montréal (Soc)
Lawrence Martin, SUNY-Stony Brook (Anthro.)
Maresi Nerad, U. Wash (Sociology & Education)
Frank Solomon, MIT (Bioscience)
Catherine Stimpson, NYU (Hum)

Page 23

Sub-Committee Panels

• STUDENT PROCESSES AND OUTCOMES

• QUANTITATIVE MEASURES

• TAXONOMY AND INTERDISCIPLINARITY

• REPUTATIONAL MEASURES AND DATA PRESENTATION

Joan Lorden (Chair), University of Alabama-Birmingham
Catherine Stimpson (Chair), New York University
Walter Cohen (Co-Chair), Cornell University
Frank Solomon (Co-Chair), Massachusetts Institute of Technology
Jonathan Cole (Co-Chair), Columbia University
Paul Holland (Co-Chair), Educational Testing Service

Page 24

Additional Panel Members

STUDENT PROCESSES AND OUTCOMES

• Adam Fagen, Harvard Univ. (Bioscience, grad. student)

• George Kuh, Indiana Univ. (Education)

• Brenda Russell, Univ. of Illinois-Chicago (Bioscience)

• Susanna Ryan, Indiana U. (English, Woodrow Wilson Fellow)

QUANTITATIVE MEASURES

• Marsha Moss, Univ. of Texas (Institutional Research)

• Charles E. Phelps, Univ. of Rochester (Provost & Econ.)

• Peter D. Syverson, Council of Graduate Schools

Page 25

Additional Panel Members

TAXONOMY AND INTERDISCIPLINARITY

• Richard Attiyeh, UCSD (Econ.)

• Robert F. Jones, AAMC (Bioscience)

• Leonard K. Peters, VPI (Computer Science)

REPUTATIONAL MEASURES AND DATA PRESENTATION

• David Schmidly, Texas Tech (President & Bioscience)

• Donald Rubin, Harvard (Statistics)

Page 26

Project website

http://www7.nationalacademies.org/resdoc/index.html