Presentation: QQML 2014
Results-Oriented Research Performance Evaluation: "Introducing Reliable and Efficient Performance Indicators in the Academic World"
Tayfun BASAL
Regional Director, Elsevier – Turkey, Iran, ME & Central Asia
Gultekin GURDAL
Library Director, Izmir Institute of Technology
QQML 2014: 6th International Conference on Qualitative and Quantitative Methods in Libraries, 27-30 May 2014, Istanbul
Conceptual expectations from scientific production on the national scale
• Being scientifically competent
• A strong obligation to build an information/knowledge-based economy
• Research outputs that qualify at the global level
• Using the potential of scientific research to foster innovation and economic growth
[Diagram] Scientific Production → Technology/Know-How → Economic Development: the direct contribution of science to the social & economic improvement of the country
Major Challenges
• The academic grants system pressures researchers to focus on quantitative production
• Missing focus on the national scale to pay specific attention to research in universities (separating the teaching & research missions)
• Researchers are expected to run both teaching and research activities, and to do administrative work as well, which leaves less focus on research
• Lack of enough funding to do more qualified research
• Lack of tools to showcase success, which also hampers the search for further funding opportunities
• Lack of a competitive benchmark mechanism led by research governors creates unfair competition between researchers
• Lack of enough PhD students
Key steps to create a more competitive research environment
• Strategic planning and support for capacity building
• Enhancing infrastructure for research
• A fundamental setup for the higher-education research world, but with plans customized to the different needs of institutions
• Better financial support and additional funding opportunities, benchmarked against global leaders
• Better compensation opportunities for innovative research projects, and regulation of the relevant innovation frameworks accordingly
• Alignment strategies and support initiatives to encourage better collaboration between the corporate and academic worlds (benchmarking against global success stories)
How to integrate academic researchers into a new world?
• Research administrators should define transparent, competitive and objective performance indicators to encourage research success
• Demographic factors should be taken into account while the performance measurement framework is being designed
• Status-related issues should be factored into the evaluation analysis (additional workloads, subject/discipline-related advantages/disadvantages, experience level)
• Quantitative outputs combined with qualitative insights should be introduced in assessments to give a better understanding of current performance; in addition, benchmark-based tasks/targets should be introduced
Define priorities and measure success accordingly
• Researchers should be informed in advance about expectations at the different scales, and there should be strong alignment between institutional management and researchers
• Global, national and institutional priorities should be defined, and the relevant achievements incentivized accordingly
[Diagram] National science strategies + local competencies → global visibility and competitive success
Questions to answer before defining the framework…
• What is the national higher-education research agenda?
• How can we address regional, national and institutional issues with a multilayered research-policy approach?
• What is the most efficient way of measuring innovative science, and what are the most useful tools to offer in order to better encourage scientists and institutions?
• How can we make successful researchers, institutions and competitive fields more visible on the national & global scale?
Must-have (fundamental) steps…
• Maintaining researcher orientations and trainings
• Providing researchers with the required set of scientific reference data and solutions
• Setting up a successful institutional management system to observe and regulate the process steps
• Leading the way to facilitate funding flows into the institution
What are the first steps of defining performance indicators?
• Extend your assessment to identify institutional competencies, and prioritize the relevant subject areas, disciplines & contributing researchers accordingly
• Set competitive benchmarks to leverage current performance; furthermore, update the benchmarks regularly to keep up with the competition and better understand global trends
• Tailor efficient strategies to support the increase of national and global visibility
Limitations of the discussion and main topic
• This study focuses in particular on better evaluation of research outputs; we will therefore not discuss the other elements that should be included in a performance measurement framework as a whole, such as demographic factors and status-related concerns. These may be a topic of discussion for another article, from the field of sociology.
A sample set of analysis & ideas to materialize thoughts
• One of the biggest challenges in evaluating research outputs is how to address the subject-field or discipline-related variations between different researchers
• Traditional ways of measuring the quality and impact of science are no longer observed to be efficient or objective
• Several studies and discussions around the topic agree that a merge of the following would bring a better solution to these challenges:
Quantitative analysis
Qualitative analysis
Visibility/Impact analysis
Some indicators to focus on
• Country specific indicators
• Institution specific indicators
• Department/project group specific indicators
• Researcher specific indicators
The ambition is not to reinvent the wheel! The point is: who should be compared/benchmarked against whom, what should be the context, and how should we select/define the indicators?
Each case has its own priorities… Therefore the most relevant indicators/metrics can be aggregated from the current pool, defined anew to fit requirements, or readapted from existing ones. This study focuses on the more qualitative indicators, sets benchmarks using the Turkish case as an example, and then tries to redesign some new ones where necessary.
Multiple layers of benchmarks
• Global benchmarks
• Regional (geographic) benchmarks
• National benchmarks
• Institutional benchmarks
• Individual (researcher) benchmarks
Selecting a sample metric to be used for benchmarks
• Citations per publication (CPP), one of the Snowball Metrics and widely agreed to be one of the important metrics for research performance measurement. CPP will be used as the sample metric in the following analysis of the different benchmark scales.
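As a minimal sketch of the metric (Python; the function name is illustrative, not from any specific toolkit), CPP is simply total citations divided by total publications over the chosen time window and field:

```python
def citations_per_publication(total_citations, total_publications):
    """Citations per publication (CPP) for a chosen entity
    (researcher, institution, country, ...) aggregated over a chosen
    time window and subject field -- both choices bound the comparison."""
    if total_publications == 0:
        raise ValueError("CPP is undefined with zero publications")
    return total_citations / total_publications

# e.g. 130 citations over 20 publications
print(citations_per_publication(130, 20))  # 6.5
```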
Global benchmarks
Turkey is underperforming against the world benchmark, but doing much better in comparison to the BRICS group.
Regional Benchmarks
Turkey is underperforming against the Middle East and Asia Pacific, but has been closing the gap with both regions over the last 6-7 years.
Institutional benchmarks (the two most productive Turkish institutions are selected as case examples: Istanbul & Hacettepe)
Both universities are performing better than Turkey, and their individual performances are very close to each other.
Individual (researcher) benchmarks in a selected area (L. Akarun, computer science)
The selected researcher is performing much better than Turkey in the field of computer science, but her performance runs parallel to that of the university with which she is affiliated.
Some findings & proposals
• The measured individual performance is highly dependent on the benchmark scale at which the evaluation is performed.
For example, the total score for the individual with respect to citations per publication may be defined as:
• 20% for institutional performance
• 20% for national performance
• 20% for global performance
• 40% for performance against selected special benchmarks (the best benchmark on the global scale)
The shares above are illustrative only; the specific expectations or priorities of the university's research management may determine which aspect is more important.
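The 20/20/20/40 split above could be combined into a single score roughly as follows (a sketch only; the weights are the illustrative shares from the slide, and the function name is an assumption):

```python
def weighted_total_score(institutional, national, global_, special,
                         weights=(0.20, 0.20, 0.20, 0.40)):
    """Combine per-scale performance scores into one total using the
    illustrative 20/20/20/40 shares. Each argument is the researcher's
    (already normalized) score against that benchmark scale."""
    scores = (institutional, national, global_, special)
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights should sum to 1")
    return sum(w * s for w, s in zip(weights, scores))

# e.g. relative CPP scores of 0.97 (institution), 1.25 (country),
# 1.38 (world) and 0.97 (selected best benchmark)
print(round(weighted_total_score(0.97, 1.25, 1.38, 0.97), 3))  # 1.108
```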
Multiple scale comparison for the same sample researcher in the field of computer science
The graph shows performance against country, institution and the world, as well as the identified best benchmark group, which in this example is the G8 countries.
A custom indicator defined to show relative performance against the benchmarks for the same sample individual
Name                  Citations per Publication (overall, average over years)
L. Akarun             6.5
Bogazici University   6.7
G8                    6.7
Turkey                5.2
World                 4.7
Benchmark                  Relative CPP performance of the individual (overall, average over years)
Against affiliated inst.   0.97
Against selected best      0.97
Against country (TR)       1.25
Against world              1.38
We have identified the researcher's citations per publication relative to the benchmarks. To see the researcher's performance relative to her institution, we divide the individual performance by the institutional performance. A result equal to one means equal performance; greater than one means better performance; less than one means worse performance. She performs about 3% worse than her institution and the selected best benchmark, but 25% better than the country and 38% better than the global scale.
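The calculation above can be sketched in a few lines (the CPP values come from the preceding tables; the code itself is an illustrative assumption, not from the original study):

```python
# Overall CPP values (average over years) from the tables above
cpp = {
    "L. Akarun": 6.5,
    "Bogazici University": 6.7,  # affiliated institution
    "G8": 6.7,                   # selected best benchmark
    "Turkey": 5.2,
    "World": 4.7,
}

def relative_cpp(individual, benchmark):
    """Individual CPP divided by benchmark CPP:
    1.0 = equal, >1.0 = better, <1.0 = worse than the benchmark."""
    return cpp[individual] / cpp[benchmark]

for bench in ("Bogazici University", "G8", "Turkey", "World"):
    print(f"vs {bench}: {relative_cpp('L. Akarun', bench):.2f}")
# vs Bogazici University: 0.97
# vs G8: 0.97
# vs Turkey: 1.25
# vs World: 1.38
```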
Conclusions
1) Qualitative data analysis for individuals is highly dependent on the benchmark scale selected; moreover, any comparison study is bounded by the time interval and the subject field/discipline selected.
2) One can perform perfectly on the local/regional scale yet poorly against selected best-in-class benchmarks; the evaluation body should therefore consider multiple benchmark scales.
3) Several metrics/indicators are now available, so one can select whichever demonstrates the most value against expectations; but it is always possible to take a side route and customize a new one that fits individual expectations better and is more transparent.
Disclaimer: this presentation focuses on sharing some insights on structuring a well-balanced analysis of research outputs, especially the qualitative indicators of production; the original article on which this presentation is based discusses the topic more broadly, from multiple aspects.