Le politiche della ricerca al tempo dei rankings

Post on 10-May-2015


Presentation by Giuseppe De Nicolao at the II Roars Conference: “Higher Education and Research Policies in Europe: Challenges for Italy”, 21 February 2014, CNR, Piazzale A. Moro 7, Rome

Transcript of Le politiche della ricerca al tempo dei rankings

Le politiche della ricerca al tempo dei rankings

Giuseppe De Nicolao, Università di Pavia

OUTLINE

Prologue: “Mamma li Turchi!” (“the Turks are coming!”)

1. From Egypt with fury

2. Should you believe in the Shanghai ranking?

3. Numb3rs!

4. No Rankings? No party.

5. The power of numbers

6. Where are we going?

Prologue: “Mamma li Turchi!” (“the Turks are coming!”)

«We are at the bottom of the world rankings. This is why in November we will present the university reform, [...] I hope – he concludes – never again to see the top Italian university in 174th place»

Who is a “highly skilled migrant”?

It depends on the rankings

Highly skilled migrants

Can I become a highly skilled migrant in the Netherlands – even if I haven't got a job yet?

To be eligible, you must be in possession of one of the following diplomas or certificates:

• a master's degree or doctorate from a recognised Dutch institution of higher education, or

• a master's degree or doctorate from a non-Dutch institution of higher education which is ranked in the top 150 establishments in either the Times Higher Education 2007 list or the Academic Ranking of World Universities 2007 issued by Jiao Tong Shanghai University in 2007

Can I get a scholarship for a master's or a PhD?

It depends on the rankings

A foreign professor on the doctoral committee?

It depends on the rankings

Chapter 1: From Egypt with fury

rewind ...

16 September 2010

New York Times, November 14, 2010

Alexandria's surprising prominence was actually due to “the high output from one scholar in one journal” – soon identified on various blogs as Mohamed El Naschie, an Egyptian academic who published over 320 of his own articles in a scientific journal of which he was also the editor.

rewind ...

26 November 2008

• ... of the 400 papers by El Naschie indexed in Web of Science, 307 were published in Chaos, Solitons and Fractals alone while he was editor-in-chief.

• El Naschie's papers in CSF have received 4992 citations, about 2000 of which are to papers published in CSF, largely his own.

never again?

THE ranking 2012: oops, I did it again!

• Only a few of the more than 100 co-authors of the 2008 and 2010 reviews were from MEPhI

• The 2008 particle physics review received nearly 300 times as many citations in the year after publication as the mean for that journal

• Citations were averaged over the relatively small number of MEPhI's publications, yielding a very high citation rate

• Further, if citations are generally low in their countries, then institutions get some more value added (regional modification)

THE ranking 2012: oops, I did it again!

what about other rankings?

the top 10 most spectacular errors of ...

reviewed  by  University  Ranking  Watch    

QS greatest hits: international students and faculty in Malaysian universities

• In 2004 Universiti Malaya (UM) in Malaysia reached 89th place in the THES-QS world rankings.

• In 2005 came disaster: UM crashed 100 places

• Political opposition: shame on the university leadership!

• Real explanation: many Malaysian citizens of Indian and Chinese descent were erroneously counted as “foreigners”.

QS greatest hits: 500 wrong student–faculty ratios in the 2007 QS Guide

• Someone slipped three rows when copying and pasting student–faculty ratios: Dublin Institute of Technology was given Duke's ratio, Pretoria got Pune's, Aachen RWTH got Aberystwyth's (Wales). And so on. Altogether over 500 errors.
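The mechanics of that slip are easy to reproduce. A minimal sketch, with invented university names and ratio values; only the three-row offset comes from the report:

```python
# Hypothetical illustration of the 2007 QS copy-paste slip: pasting a column
# of student-faculty ratios three rows out of alignment silently assigns each
# university the ratio of the institution three places further down the list.
universities = ["Univ A", "Univ B", "Univ C", "Univ D", "Univ E", "Univ F"]
true_ratios = [14.2, 12.9, 16.8, 11.5, 18.1, 13.4]  # invented numbers

shift = 3  # the reported three-row offset
published = {u: true_ratios[(i + shift) % len(true_ratios)]
             for i, u in enumerate(universities)}

# "Univ A" is published with "Univ D"'s ratio instead of its own:
print(published["Univ A"])  # 11.5, while the true value is 14.2
```

Nothing in the published table looks anomalous at a glance, which is why over 500 errors slipped through.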

Let’s go technical ...

Chapter 2: Should you believe in the Shanghai ranking?

Shanghai: criteria and weights (%)

The “normalization trap” 1/2

The “normalization trap” 2/2

Should you believe in the Shanghai ranking? An MCDM view – J.-C. Billaut, D. Bouyssou, P. Vincke

• all criteria used are only loosely connected with what they intended to capture.

• several arbitrary parameters and many micro-decisions that are not documented.

• flawed and nonsensical aggregation method

• «the Shanghai ranking is a poorly conceived quick and dirty exercise»

«any of our MCDM student that would have proposed such a methodology in her Master's Thesis would have surely failed according to our own standards»

[Chart: Shanghai ranking – how reliable? Score vs. rank: hundreds of universities with similar scores]

Twenty Ways to Rise in the Rankings (1/3) by Richard Holmes http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html

1. Get rid of students. The university will therefore do better in the faculty–student ratio indicators.

2. Kick out the old and bring in the young. Get rid of ageing professors, especially if unproductive and expensive, and hire lots of temporary teachers and researchers.

5. Get a medical school. Medical research produces a disproportionate number of papers and citations, which is good for the QS citations-per-faculty indicator and the ARWU publications indicator. Remember this strategy may not help with THE, who use field normalisation.

7. Amalgamate. What about a new mega-university formed by merging LSE, University College London and Imperial College? Or a très grande école from all those little grandes écoles around Paris?

9. The wisdom of crowds. Focus on research projects in those fields that have huge multi-“author” publications – particle physics, astronomy and medicine, for example. Such publications often have very large numbers of citations.

10. Do not produce too much. If your researchers are producing five thousand papers a year, then those five hundred citations from a five-hundred-“author” report on the latest discovery in particle physics will not have much impact.

Twenty Ways to Rise in the Rankings (2/3) by Richard Holmes http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html

13. The importance of names. Make sure that your researchers know which university they are affiliated to and that they know its correct name. Keep an eye on Scopus and ISI and make sure they know what you are called.

18. Support your local independence movement. Increasing the number of international students and faculty is good for both the THE and QS rankings. If it is difficult to move students across borders, why not create new borders?

20. Get Thee to an Island. Leiden Ranking has a little-known ranking that measures the distance between collaborators. At the moment the first place goes to the Australian National University.

Twenty Ways to Rise in the Rankings (3/3) by Richard Holmes http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html

Chapter 3: Numb3rs!

Let’s  open  the  box  

[Diagram: RAW DATA feeding into a UNIVERSITY RANKING]

Source: “Malata e Denigrata”, ed. M. Regini, Donzelli 2009

INTERNATIONALISATION

Let’s open the box

1. “Scientific excellence”
2. Student–faculty ratio
3. Job market
4. Funding

1. “Scientific excellence”

University rankings: scientific value highly dubious.

How can one measure a nation's weight in the international scientific landscape?

By counting the scientific articles it produces and the citations they receive

Italy: 8th for scientific articles. Source: SCImago, based on Scopus data 1996-2012

[Chart: publications per year (WoS), 1985-2010: United Kingdom, Germany, Japan, France, Canada, Italy, Spain, Netherlands, Switzerland, Sweden]

[Chart: publications 2004-2010, average annual growth (%)]

Source: VQR 2004-2010 – Rapporto Finale ANVUR, June 2013 (Tab. 3.2) (ISI Web of Knowledge data, Thomson Reuters) http://www.anvur.org/rapporto/files/VQR2004-2010_RapportoFinale_parteterza_ConfrontiInternazionali.pdf

[Chart: publications 2004-2010, number of citations]

Source: VQR 2004-2010 – Rapporto Finale ANVUR, June 2013 (Tab. 4.1) (ISI Web of Knowledge data, Thomson Reuters) http://www.anvur.org/rapporto/files/VQR2004-2010_RapportoFinale_parteterza_ConfrontiInternazionali.pdf

Efficiency: Italy beats Germany, France and Japan

OCTOBER 2009

How do these Italians manage to produce so much research with so few resources?

2. Student–faculty ratio

[Chart: student–faculty ratio across 26 countries]

Student–faculty ratio: out of 26 countries, only 5 are worse off than we are

3. Job market

4. Funding

How elitist is the “top 500”?

[Chart: number of universities vs. performance – the top 500 vs. the other 16,500 universities]

... and what does it cost to stay at the top?

[Chart: Harvard's operating expenses vs. Italy's 2012 ordinary university funding (Fondo di Finanziamento Ordinario), in billions of euros]

HARVARD'S “OPERATING EXPENSES” AMOUNT TO 44% OF THE FUNDING OF THE ENTIRE ITALIAN STATE UNIVERSITY SYSTEM

E. Hazelkorn: “Estimated yearly budget of €1.5 billion to be ranked in the world's top 100”

Spending on universities (% of GDP): Italy is 30th out of 33 (source: OECD 2013)

and despite this ...

% of universities entering the “top 500” (Leiden: top 250)

Data source: “Malata e denigrata: l'università italiana a confronto con l'Europa” (ed. M. Regini, Rome, Donzelli 2009)

RANKING:

No ranking? No party.

Chapter 4

No ranking? No evaluation.

“Every evaluation must result in a ranking. This is the logic of evaluation. If there is no ranking, there is no real evaluation either” – Giulio Tremonti, “Il passato e il buon senso”, CdS 22-08-08

To answer, let us go to the sources

(the “English VQR”)

but what do the English really do?

NO RANKINGS PLEASE! WE’RE ENGLISH!!

“RAE2008 results are in the form of a quality profile for each submission made by an HEI [Higher Education Institution]. We have not produced any ranked lists of single scores for institutions or Units of Assessment, and nor do we intend to.”

5 quality levels (absolute)

The keystone: the “quality profiles”

From levels to numbers

9 (since 2011)

The formula

Score = Volume × Cost × (9p4 + 3p3 + p2)

p4 = % of outputs in class 4
p3 = % of outputs in class 3
p2 = % of outputs in class 2
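The formula above rewards a submission's whole quality profile rather than its rank. A minimal sketch; the department sizes, cost weight and profiles below are invented for illustration:

```python
# Sketch of the RAE/REF funding formula from the slide:
# Score = Volume x Cost x (9*p4 + 3*p3 + p2),
# where p4, p3, p2 are the shares of outputs in quality classes 4, 3, 2
# (the top weight of 9 applies from 2011, per the slide).
def rae_score(volume, cost_weight, p4, p3, p2):
    """Quality-weighted funding score for one submission's quality profile."""
    return volume * cost_weight * (9 * p4 + 3 * p3 + p2)

# Two hypothetical departments of equal size and cost weight: a profile
# concentrated in class 4 earns far more than a flat, class-2-heavy one.
top_heavy = rae_score(volume=50, cost_weight=1.0, p4=0.40, p3=0.40, p2=0.20)
flat      = rae_score(volume=50, cost_weight=1.0, p4=0.05, p3=0.25, p2=0.70)
print(top_heavy, flat)
```

Note that funding follows directly from each profile: no league table of institutions is ever needed.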

Chapter 5: The power of numbers

Rankings  

• Fragile scientific grounds
• Incentive to gaming
• Raw data are obscured
• They are not necessary to manage funding (see RAE/REF)

Why, then?

Rankings are based on composite indicators

Science or pseudo-science?

Aggregators vs non-aggregators (1/3)

Aggregators vs non-aggregators (2/3)

Aggregators vs non-aggregators (3/3)

• Aggregators: value in combining indicators: extremely useful in garnering media interest and hence the attention of policy makers

• Non-aggregators: key objection to aggregation: the arbitrary nature of the weighting process by which the variables are combined
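The non-aggregators' objection can be made concrete with a toy computation (the university names, indicator values and weight vectors below are all invented): the same raw indicators produce opposite orderings under two equally defensible weightings.

```python
# Two universities, three normalised indicators (hypothetical, 0-100):
#            research  teaching  internationalisation
scores = {"Univ X": (90, 40, 60),
          "Univ Y": (55, 85, 70)}

def composite(vals, weights):
    """Weighted sum of indicator values - the core of a composite indicator."""
    return sum(v * w for v, w in zip(vals, weights))

research_heavy = (0.6, 0.2, 0.2)  # one arbitrary but defensible weighting
teaching_heavy = (0.2, 0.6, 0.2)  # another, equally defensible

for w in (research_heavy, teaching_heavy):
    ranking = sorted(scores, key=lambda u: composite(scores[u], w), reverse=True)
    print(w, ranking)
# Under research-heavy weights Univ X comes first; under teaching-heavy
# weights the order flips - the arbitrariness the critics point to.
```

Neither weight vector is "wrong"; the ranking is an artifact of the choice.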

Germany  

• “We look back decades and people came to German universities; today they go to US universities.”

• The Exzellenzinitiative (2005): from a traditional emphasis on egalitarianism towards competition and hierarchical stratification

France  

• The Shanghai ranking

“generated considerable embarrassment among the French intelligentsia, academia and government: the first French higher education institution in the ranking came only in 65th position, mostly behind American universities and a few British ones”

Australia  

• The SJTU and QS rankings: at least two Australian universities among the top 100.

• Opposing strategic options:
– fund a small number of top-tier competitive universities
– “creation of a diverse set of high performing, globally-focused institutions, each with its own clear, distinctive mission”.

Japan  

• “The government wants a first-class university for international prestige”

• “in order for Japanese HEIs to compete globally, the government will close down some regional and private universities and direct money to the major universities”

• some institutions will become teaching-only.

Why obsess over the “top 1%”?

[Chart: number of universities vs. performance – the top 1% vs. the other 16,500 universities]

Answer: trickle-down knowledge!

E. Hazelkorn on rankings

• 90 or 95% of our students do not attend elite institutions. Why are we spending so much on what people aren't attending as opposed to what they are attending?

• Estimated yearly budget of €1.5 billion to be ranked in the world's top 100. May detract resources from pensions, health, housing, ...

• Are “elite” institutions really driving national or regional economic and social development?

Does trickle-down work?

“Governments and universities must stop obsessing about global rankings and the top 1% of the world's 15,000 institutions. Instead of simply rewarding the achievements of elites and flagship institutions, policy needs to focus on the quality of the system-as-a-whole.”

There is little evidence that trickle-down works.

Chapter 6

Where are we?

• (Even) Phil Baty (Times Higher Education) admits that there are aspects of academic life where rankings are of little value

• Can we/you afford the ‘reputation race’?

• We will have to live in a world in which extremely poor rankings are regularly published and used.

What can be done then?


• There is no such thing as a “best university” in abstracto.

• Stop talking about these “all-purpose rankings”. They are meaningless.

• Lobby in our own institution so that these rankings are never mentioned in institutional communication

• Produce many alternative rankings that produce vastly different results.

Thank you for your attention!