DEA 2017
DEA Malmquist Index Application for University Ranking Data: Discovery and Discourse of Questions Regarding Index Data, DMU Selection and DEA Sensitivity
Prague, 27.06.2017
Matthias Klumpp
University of Duisburg-Essen, PIM; FOM University of Applied Sciences, ILD; University of Twente, IEBIS
Agenda
1. Research Question
2. HE Framework and Trends
3. DEA Malmquist Index Method
4. Application Data (Rankings)
5. Calculation Results
6. Discussion
7. Outlook
1. Research Question(s)
RQ1 Can a DEA Malmquist index calculation be applied to a longitudinal efficiency analysis of universities regarding input and ranking output data?
RQ2 Is there an index data problem using university ranking data for longitudinal efficiency analysis?
RQ3 What further questions arise when applying university ranking data to a DEA (Malmquist index) analysis?
2. HE Framework and Trends
Accountability & New Public Management concepts
University rankings as "zero sum game" (Morphew & Swanson 2011, pp. 195-196; Federkeil, van Vught & Westerheijden 2012a, p. 45)
Problem: possibly necessary increasing input volumes for sustaining ranking positions (Hazelkorn 2013, p. 71)
For efficiency analysis methodology, this is also connected to the question of industry or structural efficiency (Farrell 1957, p. 262; Ylvinger 2000, p. 165)
2. HE Framework and Trends
University rankings with high impacts on resource, student and researcher allocation – specifics:
Indexed output evaluation numbers ("100"): Times Higher Education, ARWU, QS Ranking
Not indexed output evaluation numbers: CWTS Leiden, U-Multirank
3. DEA Malmquist Index Method
Data envelopment analysis as a non-parametric technique (Charnes, Cooper & Rhodes 1978; Banker, Charnes & Cooper 1984)
Malmquist index as longitudinal extension (Malmquist 1953; Caves, Christensen & Diewert 1982)
[Figure: Malmquist index (MI) formula]
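The Malmquist index compares a DMU against the efficiency frontiers of two adjacent periods via output distance functions. As a minimal numerical sketch (not the study's calculation, which used Banxia Frontier Analyst on ETER/ranking data), the following computes an output-oriented CRS Malmquist index with synthetic single-input, single-output data; names and figures are placeholders:

```python
# Sketch: output-oriented CRS DEA Malmquist index between two periods,
# in the geometric-mean form. All data below are synthetic illustrations.
import numpy as np
from scipy.optimize import linprog

def output_distance(x0, y0, X, Y):
    """Shephard output distance D(x0, y0) = 1/phi, where phi is the maximal
    radial output expansion of (x0, y0) relative to the CRS frontier of the
    reference set. X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs)."""
    n = X.shape[0]
    c = np.r_[-1.0, np.zeros(n)]                 # variables [phi, lambda...], maximize phi
    A_in = np.c_[np.zeros((X.shape[1], 1)), X.T]           # sum_j lam_j x_ij <= x0_i
    A_out = np.c_[np.asarray(y0).reshape(-1, 1), -Y.T]     # phi*y0_r - sum_j lam_j y_rj <= 0
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[x0, np.zeros(Y.shape[1])],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return 1.0 / res.x[0]

# Synthetic panel: one input (e.g. budget), one output (e.g. indexed score).
X_t  = np.array([[2.0], [4.0], [5.0]]); Y_t  = np.array([[2.0], [6.0], [5.0]])
X_t1 = np.array([[2.0], [4.0], [5.0]]); Y_t1 = np.array([[3.0], [8.0], [7.0]])

o = 0  # evaluate the first DMU
d_t_t   = output_distance(X_t[o],  Y_t[o],  X_t,  Y_t)    # D^t(t)
d_t_t1  = output_distance(X_t1[o], Y_t1[o], X_t,  Y_t)    # D^t(t+1)
d_t1_t1 = output_distance(X_t1[o], Y_t1[o], X_t1, Y_t1)   # D^{t+1}(t+1)
d_t1_t  = output_distance(X_t[o],  Y_t[o],  X_t1, Y_t1)   # D^{t+1}(t)

# MI > 1 indicates productivity growth between the two periods.
mi = np.sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))
print(f"Malmquist index for DMU 1: {mi:.3f}")
```

The same four distance-function values also yield the catch-up / frontier-shift decomposition discussed later in the talk.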
4. Application Data (Rankings)
Inputs: ETER project
Outputs: 5 indicators each from THE and CWTS rankings
Correlation testing
Timeframe 2011-2016
70 universities (DMUs); data example for two universities
[Figure: per-university input-throughput-output model with inputs Budget and Academic Staff and indexed THE outputs (Teaching, International Outlook, Research, Citations, Industry Income); selected correlations shown, e.g. r = 0.86 (Budget-Academic Staff), r = 0.89 (Teaching-Research); n = 420]
Correlations (n = 420)

                         (1)     (2)     (3)     (4)     (5)     (6)     (7)     (8)     (9)     (10)    (11)    (12)
(1) Budget               1.000   0.860   0.533  -0.084   0.386   0.265   0.077   0.678   0.683   0.654   0.594   0.665
(2) Acad. Staff                  1.000   0.490  -0.130   0.378   0.172   0.186   0.681   0.631   0.639   0.579   0.656
(3) THE Teaching                         1.000   0.228   0.890   0.231   0.213   0.690   0.724   0.733   0.746   0.711
(4) THE Internat. Outl.                          1.000   0.173   0.375  -0.142   0.053   0.149   0.138   0.240   0.092
(5) THE Research                                         1.000   0.198   0.273   0.700   0.709   0.733   0.734   0.718
(6) THE Citations                                                1.000  -0.252   0.296   0.423   0.368   0.408   0.334
(7) THE Industry Inc.                                                    1.000   0.167   0.128   0.178   0.173   0.174
(8) CWTS_P                                                                       1.000   0.963   0.980   0.921   0.995
(9) CWTS_TCS                                                                             1.000   0.985   0.959   0.978
(10) CWTS_TNCS                                                                                   1.000   0.974   0.994
(11) CWTS_P_top1                                                                                         1.000   0.949
(12) CWTS_P_top50                                                                                                1.000
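A pooled Pearson correlation matrix of this kind can be computed over the 420 university-year observations along the following lines; the variable names and random data here are placeholders, not the study's ETER/THE data:

```python
# Pearson correlation matrix over a pooled panel of university-year rows.
# Data are random placeholders standing in for inputs and indexed outputs.
import numpy as np

rng = np.random.default_rng(0)
n_obs = 420                                       # 70 universities x 6 years
budget = rng.gamma(5.0, 100.0, n_obs)             # input: budget
staff = 0.9 * budget + rng.normal(0, 60, n_obs)   # input: academic staff (correlated)
teaching = rng.uniform(20, 100, n_obs)            # indexed THE-style output

data = np.vstack([budget, staff, teaching])       # one variable per row
corr = np.corrcoef(data)                          # 3x3 symmetric matrix, diag = 1
print(np.round(corr, 3))
```

High pairwise correlations (as between the CWTS indicators above) signal redundant outputs, which matters for DEA because near-duplicate outputs inflate efficiency scores.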
Example
[Figure: data example for two universities]
5. Calculation Results
DEA Malmquist index calculation 2011-2016 for 70 European universities: Banxia Frontier Analyst, output maximization, BCC/VRS model, three cases: (I) THE, (II) CWTS, (III) THE+CWTS
[Figure: efficiency scores, base year 2011]
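Banxia Frontier Analyst is a commercial package; the output-oriented BCC/VRS envelopment problem it solves for each DMU can be sketched as follows (synthetic data, scores reported as percentages with 100% = efficient, as in the results above):

```python
# Output-oriented BCC (VRS) efficiency via the envelopment LP per DMU:
#   max phi  s.t.  X'lam <= x0,  Y'lam >= phi*y0,  sum(lam) = 1,  lam >= 0
# Score reported as 100/phi (%). All data below are synthetic placeholders.
import numpy as np
from scipy.optimize import linprog

def bcc_output_score(o, X, Y):
    n = X.shape[0]
    c = np.r_[-1.0, np.zeros(n)]                           # maximize phi
    A_ub = np.vstack([np.c_[np.zeros((X.shape[1], 1)), X.T],  # input constraints
                      np.c_[Y[o].reshape(-1, 1), -Y.T]])      # output constraints
    b_ub = np.r_[X[o], np.zeros(Y.shape[1])]
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)           # VRS: sum(lambda) = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return 100.0 / res.x[0]

# Four synthetic DMUs, one input, one output.
X = np.array([[1.0], [2.0], [3.0], [2.0]])
Y = np.array([[1.0], [4.0], [5.0], [2.0]])
scores = [bcc_output_score(o, X, Y) for o in range(4)]
print([round(s, 1) for s in scores])   # DMU 2 is efficient, DMU 4 is not
```

The only difference from the CRS model is the convexity constraint sum(lambda) = 1, which lets the VRS frontier envelop the data more tightly and yields weakly higher efficiency scores.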
6. Discussion
a) Overall range of efficiency scores for 2011 quite small, between 100% (17 institutions) and a minimum of 61.60% (mean 87.21%, run I)
b) Out of 350 dynamic data points, 118 show annual efficiency losses and 232 show increases – efficiency improvement is not a "given" (Alsabawy, Cater-Steel & Soar 2016; Chang 2016; Yanson & Johnson 2016)
6. Discussion
c) Productivity increases (individual and overall) within the range of existing results, e.g. Parteka and Wolszczak-Derlacz (2013, p. 73): average 4.1% annual productivity increase for 266 public universities in 7 European countries, 2001-2005
d) Also in line with existing results: German universities show above-average efficiency improvements compared to other European countries
6. Discussion
e) Interesting results regarding the distinction between general technological progress ("frontier shift") and individual organisational reasons for efficiency changes ("catch-up")
f) Resource and organisational consequences of efficiency development results, as e.g. in the health care or service sector (Tiemann & Schreyögg 2009; Harlacher & Reihlen 2014)
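The frontier-shift / catch-up distinction in (e) is the standard multiplicative decomposition of the output-oriented Malmquist index; a minimal statement, writing $D^t$ for the period-$t$ output distance function:

```latex
M_o\!\left(x^{t+1},y^{t+1},x^{t},y^{t}\right)
  = \underbrace{\frac{D^{t+1}(x^{t+1},y^{t+1})}{D^{t}(x^{t},y^{t})}}_{\text{catch-up (efficiency change)}}
    \times
    \underbrace{\left[
      \frac{D^{t}(x^{t+1},y^{t+1})}{D^{t+1}(x^{t+1},y^{t+1})}\cdot
      \frac{D^{t}(x^{t},y^{t})}{D^{t+1}(x^{t},y^{t})}
    \right]^{1/2}}_{\text{frontier shift (technical change)}}
```

The first factor measures movement of the DMU relative to its contemporaneous frontier; the second measures movement of the frontier itself, averaged geometrically over the two periods.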
7. Outlook
RQ1: Application functional, with results
RQ2: No obvious problem with indexed numbers
RQ3: Further questions arising:
(i) Output type selection
(ii) DMU selection and sequence
(iii) DEA sensitivity problem (relative)
Matthias Klumpp [email protected]
Thank you very much for your kind attention.