Research Reviews in the Netherlands A Practical Approach


Page 1: Research Reviews in the Netherlands A Practical Approach


Research Reviews in the Netherlands

A Practical Approach

Roel Bennink, coordinator research reviews
Quality Assurance Netherlands Universities

www.qanu.nl

Page 2: Research Reviews in the Netherlands A Practical Approach


Contents

1. What? The Dutch System of Research Reviews

2. Why? Aims and Owners of the System

3. How? 1993-2003 and 2003-2009

Page 3: Research Reviews in the Netherlands A Practical Approach


Universities in the Netherlands

• Fourteen research-based universities (incl. OU)
• €4.8 billion; 50,000 staff; 160,000 students

[Bar chart: staff, students and publications per university (UU, RUG, TUD, UvA, LEI, VU, RU, UM, TUE, WU, UT, EUR, UvT); vertical axis 0 to 25,000]

Page 4: Research Reviews in the Netherlands A Practical Approach


Funding of Research

Three funding sources:

1. Direct funding (60%)
• Main source, stable, plus extra for dissertations and research schools (each about 10%)

2. Competitive funding (10%)

3. Contracts (30%)
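
A minimal sketch of how these shares combine, assuming a hypothetical total research budget and assuming that the dissertation and research-school extras are each about 10% of the direct-funding stream (both assumptions are for illustration only, not figures from the slides):

# Illustrative split of a hypothetical research budget over the three
# Dutch funding streams named above. The 100 M EUR total and the reading
# that the dissertation/research-school extras are ~10% of direct funding
# are assumptions for this sketch only.

TOTAL = 100.0  # hypothetical budget in millions of euros

shares = {
    "direct funding": 0.60,       # first stream: block grant
    "competitive funding": 0.10,  # second stream: research councils
    "contracts": 0.30,            # third stream: contract research
}

budget = {source: TOTAL * share for source, share in shares.items()}

# Within direct funding, earmark ~10% each for dissertations and
# research schools (assumed interpretation of "each about 10%").
direct = budget["direct funding"]
earmarks = {"dissertations": 0.10 * direct, "research schools": 0.10 * direct}

for source, amount in budget.items():
    print(f"{source:>20}: {amount:5.1f} M EUR")
print("direct-funding earmarks:", earmarks)
assert abs(sum(shares.values()) - 1.0) < 1e-9  # the three shares cover the whole budget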

Page 5: Research Reviews in the Netherlands A Practical Approach


Research Reviews in NL

• All publicly funded research must be submitted for external review every 6 years.

• System started in 1993, organised by the universities collectively (VSNU)
– Nationwide, per discipline (± 35)

• New Protocol in 2003: SEP
– Not always nationwide
– (Individual) Boards are responsible

Page 6: Research Reviews in the Netherlands A Practical Approach


Standard Evaluation Protocol

• Internal and external objectives combined:
– Improving quality of research
– Improving research management & leadership
– Accountability to government and society

• Object of the assessment:
– Research quality by international standards
– Depending on the mission of each Institute/Programme:
• Social or economic objectives
• Technical or infrastructural objectives

Page 7: Research Reviews in the Netherlands A Practical Approach


Standard Evaluation Protocol

• Four main aspects:

– Quality
• International recognition and innovative potential

– Productivity
• Scientific output

– Relevance
• Scientific and socio-economic impact

– Prospects
• Flexibility, management, leadership

Page 8: Research Reviews in the Netherlands A Practical Approach


Five-point scale

5. Excellent:
- internationally leading
- important and substantial impact

4. Very good:
- internationally competitive, national leader
- significant contribution

3. Good:
- internationally visible, nationally competitive
- valuable contribution

2. Satisfactory:
- nationally visible
- adds to understanding

1. Unsatisfactory:
- flawed, not worth pursuing
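
Purely as an illustration of how this scale might be handled in practice (for instance when tabulating committee scores), the snippet below encodes it as a simple lookup table; the structure and function name are assumptions for the sketch, only the labels and descriptors come from the slide.

# Minimal sketch: the SEP five-point scale as a lookup table.
SEP_SCALE = {
    5: ("Excellent", "internationally leading; important and substantial impact"),
    4: ("Very good", "internationally competitive, national leader; significant contribution"),
    3: ("Good", "internationally visible, nationally competitive; valuable contribution"),
    2: ("Satisfactory", "nationally visible; adds to understanding"),
    1: ("Unsatisfactory", "flawed, not worth pursuing"),
}

def describe(score: int) -> str:
    """Return the label and descriptor for a SEP score from 1 to 5."""
    label, descriptor = SEP_SCALE[score]
    return f"{score} - {label}: {descriptor}"

print(describe(4))  # 4 - Very good: internationally competitive, ...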

Page 9: Research Reviews in the Netherlands A Practical Approach


Method: Self-analysis and Peer review

Close link between research management, quality control and accountability to higher levels

• Multi-purpose data collection
• Uniform criteria (all universities use SEP)
• Citation analyses (sometimes)
• Simultaneous and comparative (often)
• Public reports (always)

Page 10: Research Reviews in the Netherlands A Practical Approach


Descriptive elements:

1. Mission, leadership, strategy, policies
2. Research processes (teamwork, supervision, quality control)
3. Reputation (reviews, awards, citations)
4. Internal evaluation (management, culture)
5. External validation (spin-offs, stakeholder survey)
6. SWOT analysis
7. Key publications (list of 5, copies of 3)

Page 11: Research Reviews in the Netherlands A Practical Approach


Quantitative elements:

1. Research staff (tenured, non-tenured, PhD, support) per year

2. Funding (ministry; research councils; contracts) per year

3. Spending (personnel; other) per year

4. Results (publications)
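
As an illustration of how such programme-level input could be organised per year (for example when compiling a self-evaluation), the sketch below uses a small Python record; the field names and units are assumptions, not prescribed by the SEP.

# Illustrative record for one programme-year of SEP quantitative input.
# Field names and units (FTE, k EUR) are assumptions for this sketch.
from dataclasses import dataclass, field

@dataclass
class ProgrammeYear:
    year: int
    # 1. Research staff in FTE, per category
    staff_fte: dict = field(default_factory=lambda: {
        "tenured": 0.0, "non_tenured": 0.0, "phd": 0.0, "support": 0.0})
    # 2. Funding in k EUR, per source
    funding: dict = field(default_factory=lambda: {
        "ministry": 0.0, "research_councils": 0.0, "contracts": 0.0})
    # 3. Spending in k EUR
    spending: dict = field(default_factory=lambda: {
        "personnel": 0.0, "other": 0.0})
    # 4. Results: publication counts per type
    publications: dict = field(default_factory=lambda: {
        "refereed_articles": 0, "books": 0, "phd_theses": 0})

record = ProgrammeYear(year=2008)
record.staff_fte["tenured"] = 6.5
record.publications["refereed_articles"] = 42
print(record)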

Page 12: Research Reviews in the Netherlands A Practical Approach


Evaluative questions

1. Documentation
• What is missing, what is not needed?
• How much effort is spent in producing the self-studies?

2. Committees
• How are committee members selected?
• What is a good site visit? What can go wrong?
• What determines the quality of the committee reports?

3. Consequences
• What are the effects of the reviews? What measures are taken by faculties/universities/funding agencies/ministries?

4. Lessons learned
• What mistakes can be avoided?
• What are the critical success factors for research reviews?

Page 13: Research Reviews in the Netherlands A Practical Approach


Documentation

What is missing, what is not needed?
• It is always too much and never enough

• The SEP set is what every Institute or Programme should have anyway

• Bibliometrics are expensive, time-consuming and only useful in some disciplines

How much effort is spent in producing the self-studies?

• That depends on what you already have

• “It’s a lot of work, but very useful”

Page 14: Research Reviews in the Netherlands A Practical Approach


Committees

How are committee members selected?
• Proposals by Faculties, approval by Boards (and QANU)
• Members must be independent and unbiased, internationally acknowledged experts

What is a good site visit?
• Honest and open discussions, well-prepared peers
• Critical and constructive questions; good teamwork

What determines the quality of the reports?
• Public nature, feedback loop
• Comparative overview; general chapters per subfield
• Combination of scores and text (including recommendations)
• Directed at management, not ministry

Page 15: Research Reviews in the Netherlands A Practical Approach


Consequences

What are the effects and measures?
• Visibility is increased
• Management dialogues are enhanced
• Management information improves
• Publishing in high-impact international journals is stimulated
• Groups are merged, extended, redirected or stopped
• High marks are (sometimes) financially rewarded
• Low marks lead to critical questions
• Recommendations are taken seriously
• The Ministry is kept at a distance

Page 16: Research Reviews in the Netherlands A Practical Approach


Lessons learned in NL

What mistakes can be avoided?
• Don’t assess individuals, only groups
• Don’t focus too much on the scores alone, or on rankings
• Don’t ask too far-reaching (‘strategic’) questions

What are the critical success factors?
• Keep it simple
• Stay close to real organisational structures and management processes
• Agree on uniform definitions and data
• Cooperate with other universities
• Build/buy and maintain research information systems (METIS, RU)

Page 17: Research Reviews in the Netherlands A Practical Approach


Weaknesses

• Committees find it difficult to assess the Institute level

• Scoring the ‘management’ was abolished

• Central scheduling was abolished

• Costs (time & money) remain an issue

• Tools for measuring “exchange of knowledge” need further development

• Managers (and journalists and politicians) attach too much value to rankings

Page 18: Research Reviews in the Netherlands A Practical Approach


Strengths

• SEP works as a basic and practical tool
• Peer review is authoritative about programmes (main strength)
• External reviews are a useful addition to quality assurance
• External reviews shift power to faculty and university
• Policy decisions at the responsible level are supported
• Reviews are used to look ahead
• Peers from abroad add an international dimension to quality
• Cooperation between universities facilitates benchmarking

Page 19: Research Reviews in the Netherlands A Practical Approach


Any questions?