
Papers in Innovation Studies

Paper no. 2015/27

The Innovation Union Scoreboard is flawed: The Case of Sweden – not the innovation leader of the EU – updated version

Charles Edquist ([email protected]) CIRCLE, Lund University

Jon Mikel Zabala-Iturriagagoitia ([email protected]) Deusto Business School, Deusto University

This paper is a supplement to and an update of a previous version published in CIRCLE Papers in Innovation Studies 2015/16 in April 2015, with the title “The Innovation Union Scoreboard is flawed: The Case of Sweden – not being the innovation leader of the EU”. The previous version was based on the Innovation Union Scoreboard 2014. However, the Innovation Union Scoreboard 2015 was published two weeks after our previous version, which made this updated paper, based on both scoreboard versions, necessary. In this paper we follow exactly the same structure as in the previous CIRCLE WP 2015/16 (see Edquist and Zabala-Iturriagagoitia, 2015), but here we also introduce the new data provided by the Innovation Union Scoreboard 2015.

This version: August 2015

Centre for Innovation, Research and Competence in the Learning Economy (CIRCLE)

Lund University

P.O. Box 117, Sölvegatan 16, S-221 00 Lund, SWEDEN

http://www.circle.lu.se/publications


WP 2015/27

The Innovation Union Scoreboard is flawed: The Case of Sweden – not the innovation leader of the EU – updated version

Charles Edquist and Jon Mikel Zabala-Iturriagagoitia

Abstract

According to the Innovation Union Scoreboard published by the European Commission, Sweden has been, and still is, an innovation leader within the EU and one of the most innovative countries in Europe. In this paper, the performance of the Swedish national innovation system is analyzed using exactly the same data as those employed by the Innovation Union Scoreboard for the years 2014 and 2015.

We argue that the Summary Innovation Index provided by the Innovation Union Scoreboard is highly misleading. Instead of merely calculating this Summary Innovation Index, the individual indicators that constitute this composite innovation indicator need to be analyzed in much greater depth in order to reach a correct measure of the performance of innovation systems. We argue that input and output indicators need to be considered as two separate types of indicators and that each type should be measured individually. Thereafter the input and output indicators should be compared to one another, as is normally done in productivity and efficiency measurements.

To check whether our approach provides results similar to those of the Innovation Union Scoreboard (or not), we apply it and analyze the relative position of Sweden, appointed the innovation leader of the EU by the EU itself. A theoretical background and reasons for selecting the indicators used are also given, and a new position regarding Sweden’s innovation performance compared to the other EU countries is calculated.

Our conclusion is that Sweden cannot be seen as an innovation leader in the EU. This means in turn that the Innovation Union Scoreboard is flawed and may therefore mislead researchers, policy-makers, politicians as well as the general public – since it is widely reported in the media.

JEL codes: O30, O38, O49, O52

Keywords: Innovation system, innovation policy, innovation performance, Sweden, indicators, input, output

Disclaimer: All the opinions expressed in this paper are the responsibility of the individual author or authors and do not necessarily represent the views of other CIRCLE researchers.


The Innovation Union Scoreboard is flawed: The Case of Sweden – not the innovation leader of the EU – updated version 1

By

Charles Edquist and Jon Mikel Zabala-Iturriagagoitia

Version 2015-08-09

1 This paper is a supplement to and an update of a previous version published in CIRCLE Papers in Innovation Studies 2015/16 in April 2015, with the title “The Innovation Union Scoreboard is flawed: The Case of Sweden – not being the innovation leader of the EU”. The previous version was based on the Innovation Union Scoreboard 2014. However, the Innovation Union Scoreboard 2015 was published two weeks after our previous version, which made this updated paper, based on both scoreboard versions, necessary. In this paper we follow exactly the same structure as in the previous CIRCLE WP 2015/16 (see Edquist and Zabala-Iturriagagoitia, 2015), but here we also introduce the new data provided by the Innovation Union Scoreboard 2015.



Table of Contents

Abstract
1. Introduction
2. Methodology
3. Theoretical background and relevant indicators
4. Analysis
4.1. Output orientation
4.2. Input orientation
4.3. The performance of the Swedish national innovation system
5. Conclusions
Acknowledgements
References
Annex 1: Innovation outputs of the national innovation systems of the EU28 member states
Annex 2: Innovation inputs of the national innovation systems of the EU28 member states
Annex 3: The Efficiency of the EU28 Innovation System


1. Introduction

The European Commission has often highlighted that Europe is one of the most innovation intensive regions in the world (European Commission, 2013a). With the recent strategy “Europe 2020”, the European Commission has outlined its intention to focus on today’s challenges in a changing world and stated its desire to become “a smart, sustainable and inclusive economy” (European Commission, 2013b: 1). The European Union has therefore set ambitious objectives in five areas to be reached by 2020. In addition to climate and energy, education, employment and social inclusion, innovation is one of the five pillars to form a so-called “Innovation Union” (European Commission, 2013a).

To support the establishment of an Innovation Union, the European Commission is using the Innovation Union Scoreboard (IUS) as a tool to monitor its implementation and to examine and illustrate the “innovation performance” of European Member States (European Commission, 2011).2 This suggests that the IUS is meant to have a real impact on the evaluation of the policies of the Member States, on the allocation of resources, and – supposedly – on the design of innovation policies at the European, national and regional levels. In other words, the results provided by the IUS have a significant (political) impact.

One of the key findings of the IUS is that Sweden holds the position of innovation leader in the EU, and that its ranking remains stable over time (European Union, 2013, 2014, 2015). Sweden has the top position (ranked number 1) of all EU28 Member States in what is called “EU Member States’ Innovation Performance” (European Union, 2014: 5). This has been reported in the media and has also reached high-level politicians and policy-makers in Sweden. For example, Sweden’s former foreign minister Carl Bildt tweeted from his official account that it is “Nice to see that Sweden is ranked as the No 1 innovation country in the EU”3, echoing the results of the IUS report. The Swedish Innovation Policy Agency (VINNOVA) also concluded in a newsletter that “Sweden leads the EU innovation league”. The Minister of Industries at that time, Annie Lööf, commented on Sweden’s standing, saying that “the fact that Sweden again tops the innovation league in the EU and draws away from other countries shows that our efforts to increase our innovation power give results”.4 We will show below that these statements are based on a weak and flawed analysis.

2 The IUS is published by the DG for Internal Market, Industry, Entrepreneurship and SMEs, Unit J3 – Innovation Policy for Growth. The 2015 edition was prepared by Hugo Hollanders, Nordine Es-Sadki and Minna Kanerva from the Maastricht Economic and Social Research Institute of Innovation and Technology (UNU-MERIT).

3 https://twitter.com/carlbildt/status/316807766700351488, 27/03/2013, 12:03am.

To assess the “innovation performance” of the Member States, a Summary Innovation Index (SII) is provided by the IUS. The SII includes 25 indicators,5 which are divided into three main categories (i.e. enablers, firm activities and outputs) and eight dimensions (i.e. human resources, excellent research systems, finance and support, firm investments, linkages and entrepreneurship, intellectual assets, innovators, economic effects). However, in its successive reports, the IUS does not provide any conceptual or theoretical discussion of these categories and dimensions, nor of the specific indicators and the relations among them. The reports limit themselves to briefly describing the indicators considered.

The purposes of this paper are the following. We question whether Sweden can be considered to be the top position holder within the EU with regard to “innovation performance” in any meaningful use of this term. In doing so, we employ exclusively the data provided by the IUS 2014 and 2015 to assess the performance of the Swedish innovation system and discuss whether or not Sweden can be regarded as the innovation leader in Europe. On a theoretical basis, we single out a number of input (n=4) and output (n=8) innovation indicators from the 25 employed by the IUS. We then compare Sweden’s position to those of the other EU28 Member States according to the data provided by the IUS for the years 2014 and 2015 (see also Edquist and Zabala-Iturriagagoitia, 2015). Finally, we compare the measures of innovation outputs and inputs of each of the EU28 countries.6 This measure of productivity (or efficiency) of innovation systems (i.e., the relationship between the innovation inputs and outputs) is then also used to compare the performance of Sweden’s innovation system with those of the other EU28 member countries.

With this paper we do not intend to discuss the quality and accuracy of the IUS data; this will be a future endeavor. We merely use exactly the same data provided by the IUS. However, as a consequence of adding some conceptual and theoretically informed considerations to the IUS analysis, we obtain quite different results.

4 In a newsletter from VINNOVA of March 14, 2014.

5 For the definitions of each of these 25 indicators, see European Union (2014, 2015).

6 Our aggregated output indicator measures innovations as such, and not their determinants or consequences.


Our analysis shows not only why and how Sweden’s status as an innovation leader needs to be revised, but also that the IUS analysis is flawed with respect to assessing innovation performance. The IUS mode of measuring innovation performance is outright incorrect and highly misleading, not only for analysts/researchers but also for policy-makers/politicians. Lack of theoretical awareness among EU administrators and their advisors is the probable explanation for this problem.

The paper is organized as follows. Section 2 provides an overview of the research methodology applied. Section 3 presents the rationale and theoretical basis for the selection of certain indicators. The analysis of the relative position of Sweden in the European context is developed in Section 4. There, we use the normalized IUS scores for each of the selected indicators and provide new rankings for both the innovation inputs and the innovation outputs. We also calculate the ranking of the efficiency of the Swedish innovation system by relating the innovation outputs and inputs to each other (thus calculating innovation performance). Finally, Section 5 concludes with a discussion of the main findings of the research and its relevance for the practice of innovation policy making.

2. Methodology

This paper begins with a brief presentation of the 25 indicators included in the IUS (2014, 2015) and a discussion of which of these best measure innovation input and output, respectively. The recent and current literature on the innovation systems approach has demonstrated that not all the indicators included in the IUS are adequate for measuring either input or output. Based on these findings, we use only 12 of the IUS indicators, selecting eight output indicators and four input indicators.7

After selecting the indicators that we deemed most relevant for the purposes of this research, we gathered all the data from the IUS 2014 and 2015, all with normalized scores for each indicator chosen and for all EU28 countries. We then ranked all EU28 countries for each indicator. This provided a basis for making a comprehensive and in-depth analysis of the relative position of Sweden for a diverse set of measures.

By categorizing the indicators as inputs and outputs, we are able to see the extent to which innovation inputs are transformed into or materialize as innovation outputs.

7 The definition of all the indicators considered and the rationale for their selection are provided in section 3.


Innovation performance in terms of efficiency is calculated as the ratio between the eight innovation output indicators and the four innovation input indicators. A high score for the input indicators means that a great deal of effort and a great many resources have been devoted to stimulating innovation. Similarly, a high score for the output indicators shows that a country has a high production of innovations. However, if the input side is, relatively speaking, much larger than the output side, the efficiency of the system as a whole is low.
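As an illustration of this calculation, the following minimal sketch in Python averages the normalized scores of the eight output indicators and the four input indicators per country and takes their ratio; the country names and score values are hypothetical placeholders, and the simple averaging follows the verbal description above rather than any code used by the IUS.

```python
# Sketch of the efficiency calculation described above (hypothetical data).
from statistics import mean

# Hypothetical normalized scores (0-1) for two illustrative countries:
# eight output indicators and four input indicators each.
scores = {
    "Country A": {
        "outputs": [0.73, 0.57, 0.57, 0.78, 0.61, 0.58, 0.51, 0.25],
        "inputs": [0.98, 0.60, 0.99, 0.30],
    },
    "Country B": {
        "outputs": [0.93, 0.60, 0.97, 1.00, 1.00, 0.93, 0.55, 0.65],
        "inputs": [0.75, 0.40, 0.80, 0.50],
    },
}

results = {}
for country, s in scores.items():
    avg_output = mean(s["outputs"])
    avg_input = mean(s["inputs"])
    results[country] = {
        "avg_output": round(avg_output, 3),
        "avg_input": round(avg_input, 3),
        # Efficiency: average output per unit of average input.
        "efficiency": round(avg_output / avg_input, 3),
    }

# Rank countries by efficiency (highest first).
for rank, country in enumerate(
        sorted(results, key=lambda c: results[c]["efficiency"], reverse=True), start=1):
    print(rank, country, results[country])
```

A country with high input scores but comparatively low output scores ends up with a low ratio, which is exactly the situation described in the last sentence above.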

3. Theoretical background and relevant indicators

The IUS (until 2009 called the European Innovation Scoreboard) provides a comparative assessment of the research and innovation performance of the EU Member States (currently 28), as well as Iceland, the Former Yugoslav Republic of Macedonia, Norway, Serbia, Switzerland and Turkey, and of the relative strengths and weaknesses of their research and innovation systems (European Union, 2014: 8). It uses the most recent data available from a variety of sources (e.g. Eurostat, Scopus, Thomson Reuters, OECD, the Office for Harmonization in the Internal Market, the United Nations). In this paper we compare the performance of Sweden with that of the other EU28 member states using only the data provided by the IUS 2014 and 2015. However, as described above, a different approach is applied to analyze the data (see Section 2).

The IUS identifies 25 indicators, which are divided into three categories and eight dimensions (see Figure 1). Unfortunately, there is no conceptual or theoretical discussion of how and why these indicators were selected. The three categories considered consist of Enablers, Firm activities and Outputs. According to the IUS report, the Enablers “capture the main drivers of innovation performance external to the firm” (European Union, 2014: 4) and cover three innovation dimensions: human resources, open, excellent and attractive research systems, and finance and support. Firm activities “capture the innovation efforts at the level of the firm” (ibid) and are also grouped in three innovation dimensions: firm investments, linkages and entrepreneurship, and intellectual assets. Finally, outputs cover “the effects of firms’ innovation activities” (ibid) in two innovation dimensions: innovators and economic effects.


Figure 1. - Measurement framework of the Innovation Union Scoreboard

Source: European Union (2014: 8).

Based on the indicators included in these categories and dimensions, the IUS provides a Summary Innovation Index (SII). For each year, each indicator has a normalized score that varies from a minimum performance of 0 up to a maximum of 1. In the SII all indicators are given the same weight.8

The IUS draws the conclusion that the country with the highest average score for the 25 indicators is also the best “innovation performer,” regardless of whether the indicators used measure the input or output side of innovation or something else. In addition, the IUS provides no explicit definition of “innovation performance,” which is quite surprising since this is the most central concept in the scoreboard reports. However, implicitly the SII score is the IUS definition of “innovation performance”.

Countries are ranked according to the SII in the following groups: innovation leaders (more than 20% above EU average),9 innovation followers (less than 20% above, or more than 90% of the EU average),10 moderate innovators (relative performance rates between 50% and 90% of the EU average)11 and modest innovators (less than 50% of the EU average) (European Union 2014: 11).12

8 For a discussion on the adequacy of weighting indicators when elaborating composite measures see Grupp and Schubert (2010).

9 In the IUS 2014 and in the IUS 2015, the EU28 countries regarded as innovation leaders are: Denmark, Finland, Germany and Sweden.
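To make the SII construction and the grouping rule concrete, here is a minimal sketch in Python; the country names and scores are hypothetical, and the SII is computed as the unweighted mean of the normalized indicator scores, following the description above rather than the IUS’s own procedure.

```python
# Sketch of the SII and the country grouping described above (hypothetical data).
from statistics import mean

# Hypothetical normalized scores (0-1) for the 25 indicators of each country.
normalized_scores = {
    "Country A": [0.7] * 25,
    "Country B": [0.5] * 25,
    "Country C": [0.2] * 25,
}

# SII: unweighted mean of the 25 normalized indicators.
sii = {country: mean(scores) for country, scores in normalized_scores.items()}
eu_average = mean(sii.values())  # stand-in for the EU28 average SII

def group(score, avg):
    """Assign the IUS-style group based on the SII relative to the EU average."""
    if score > 1.20 * avg:
        return "innovation leader"    # more than 20% above the EU average
    if score >= 0.90 * avg:
        return "innovation follower"  # between 90% and 120% of the EU average
    if score >= 0.50 * avg:
        return "moderate innovator"   # between 50% and 90% of the EU average
    return "modest innovator"         # less than 50% of the EU average

for country, score in sorted(sii.items(), key=lambda kv: kv[1], reverse=True):
    print(country, round(score, 3), group(score, eu_average))
```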

One could expect that a category designated “outputs” (see Figure 1) would include mainly indicators of innovation output in the sense of innovations as such. This “outputs” category includes three indicators related to the behavior of “innovators” and five indicators related to the “economic effects” of innovations. The indicators under the “innovators” heading are ‘SMEs introducing product or process innovations as % of SMEs’, ‘SMEs introducing marketing or organizational innovations as % of SMEs’ and ‘Employment in fast-growing firms of innovative sectors’. The “economic effects” category includes five indicators: those related to employment in knowledge-intensive activities, to exports of medium and high-tech industries, to knowledge-intensive services, to sales of new to market and new to firm innovations, and to license and patent revenues. Surprisingly, however, the IUS category of “outputs” includes diverse types of indicators, some of which can be regarded as actual outputs or results of innovation activities, while at the same time it also includes indicators that refer to the consequences (i.e. the impact) of these innovations (see Figure 1).

Productivity is a measure of the efficiency of a person, company, system, country, etc. in converting inputs into outputs. Productivity or efficiency is the ratio between outputs (numerator) and inputs (denominator), or output per unit of input. Hence, (innovation) output is, of course, a part of any productivity measure. When measuring innovation productivity, some sort of input and some sort of output must be compared to determine the performance of a given unit. Therefore it is quite surprising that the IUS estimates “innovation performance” without making any distinction between inputs and outputs. It is simply methodologically incorrect to speak of “performance” in the sense of productivity or efficiency and, at the same time, mix inputs and outputs.
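Using the eight output indicators and four input indicators selected later in this paper, the efficiency measure discussed here can be written as the following ratio of averaged normalized scores; this is our formalization of the verbal definition, not a formula taken from the IUS.

```latex
\[
\text{Efficiency}_{c} \;=\; \frac{\frac{1}{8}\sum_{j=1}^{8} O_{c,j}}{\frac{1}{4}\sum_{k=1}^{4} I_{c,k}}
\]
% O_{c,j}: normalized score of output indicator j for country c
% I_{c,k}: normalized score of input indicator k for country c
```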

10 In the IUS 2014, the EU28 countries regarded as innovation followers are: Austria, Belgium, Cyprus, Estonia, France, Ireland, Luxembourg, Netherlands, Slovenia and the UK. In the IUS 2015, the EU28 countries regarded as innovation followers are: Austria, Belgium, France, Ireland, Luxembourg, Netherlands, Slovenia and the UK.

11 In the IUS 2014, the EU28 countries regarded as moderate innovators are: Croatia, Czech Republic, Greece, Hungary, Italy, Lithuania, Malta, Poland, Portugal, Slovakia and Spain. In the IUS 2015, the EU28 countries regarded as moderate innovators are: Cyprus, Czech Republic, Estonia, Greece, Hungary, Italy, Lithuania, Malta, Poland, Portugal, Slovakia and Spain.

12 In the IUS 2014 and in the IUS 2015, the EU28 countries regarded as modest innovators are: Bulgaria, Latvia, and Romania.


To measure the performance of an innovation system in terms of productivity/efficiency, the indicators must, in some way, be separated into indicators that reflect the input character of innovation (causes, determinants) on the one hand, and measures which reflect the outputs of the innovative action (actual innovations) on the other (Mahroum and Al-Saleh, 2013; Zabala-Iturriagagoitia et al., 2007a). Both sides need to be considered separately, and then related to each other. Neither input nor output indicators by themselves can measure the innovation performance of a country. It is the relation between them which measures innovation performance. To be able to assess which of the indicators provided by the IUS are input and which are output indicators, we thus define inputs and outputs as follows:

Innovation input indicators refer to the resources (human, material and financial; private as well as governmental) which are used to create innovations, including bringing them to the market.

Innovation output indicators refer to new products and processes, new designs and community trademarks as well as marketing and organizational innovations, which are new to the market and/or new to the firm and are adopted by users.

Based on the definitions provided by the IUS for each of the 25 indicators, we identify eight indicators as measuring innovation output and four as measuring innovation input. Table 1 below shows the definition of each of the eight output indicators considered and the data sources according to the IUS.

Table 1. - Indicators proposed as innovation output indicators by us

Indicator: 2.2.1 SMEs innovating in-house (% of SMEs)
Interpretation: This indicator measures the degree to which SMEs that have introduced any new or significantly improved products or production processes have innovated in-house.
Data source: Eurostat (CIS)

Indicator: 2.3.3 Community trademarks per billion GDP (in PPP€)
Interpretation: Trademarks are an important innovation indicator, especially for the service sector. The Community trademark fulfils the three essential functions of a trademark: it identifies the origin of goods and services, guarantees consistent quality through evidence of the company's commitment vis-à-vis the consumer, and is a form of communication, a basis for publicity and advertising.
Data source: Office for Harmonization in the Internal Market and Eurostat

Indicator: 2.3.4 Community designs per billion GDP (in PPP€)
Interpretation: A design is the outward appearance of a product or part of it resulting from the lines, contours, colours, shape, texture, materials and/or its ornamentation.
Data source: Office for Harmonization in the Internal Market and Eurostat

Indicator: 3.1.1 SMEs introducing product or process innovations (% of SMEs)
Interpretation: Technological innovation, as measured by the introduction of new products (goods or services) and processes, is a key ingredient of innovation in manufacturing activities.
Data source: Eurostat (CIS)

Indicator: 3.1.2 SMEs introducing marketing or organizational innovations (% of SMEs)
Interpretation: This indicator tries to capture the extent to which SMEs innovate through non-technological innovation.
Data source: Eurostat (CIS)

Indicator: 3.2.2 Contribution of medium and high-tech product exports to the trade balance
Interpretation: The manufacturing trade balance reveals an economy's structural strengths and weaknesses in terms of technological intensity. It indicates whether an industry performs relatively better (or worse) than total manufacturing and can be interpreted as an indicator of revealed comparative advantage that is based on a country's trade specialisation.
Data source: UN Comtrade

Indicator: 3.2.3 Knowledge-intensive services exports (as % of total service exports)
Interpretation: This indicator measures the competitiveness of the knowledge-intensive services sector.
Data source: Eurostat

Indicator: 3.2.4 Sales of new to market and new to firm innovations (as % of turnover)
Interpretation: This indicator measures the turnover of new or significantly improved products and includes both products which are only new to the firm and products which are also new to the market. The indicator thus captures both the creation of state-of-the-art technologies (new to market products) and the diffusion of these technologies (new to firm products).
Data source: Eurostat (CIS)

Source: European Union (2014: 86-90).

The above eight indicators provided by the IUS13 help identify the outputs of an innovation system. Five of them are also listed as outputs in the IUS (i.e. indicators 3.1.1, 3.1.2, 3.2.2, 3.2.3 and 3.2.4). The three indicators which we, but not the IUS, consider to be output measures are listed in the IUS as 2.2.1, 2.3.3 and 2.3.4. In the following we justify our reasons for considering these three additional indicators to be innovation output indicators. We also argue that three of the indicators that the IUS lists as outputs (3.1.3, 3.2.1 and 3.2.5) should not be considered as measures of innovation output.

13 We acknowledge that other output indicators may also be available and used by other statistical offices and methodologies (e.g. Global Innovation Index). However, in this paper we limit the scope of our analysis to the indicators and data provided by the IUS.

As stated above, we are looking for output indicators that, to the largest extent possible, measure innovations as such. The notion of innovation output, according to the definition that we provided above, is partly different from the IUS category of “outputs,” which is defined as “the effects of firms’ innovation activities” (European Union, 2014: 4). For us, however, the term innovation output indicators refers to new products and processes, new designs and community trademarks as well as marketing and organizational innovations which are either new to the market and/or new to the firm and are adopted by users (see the definition above).

In the IUS, the category of “outputs” thus places more emphasis on the consequences (i.e., impacts or results) of innovations than on the actual production of innovations (i.e., what we call “outputs”). In passing, we want to mention that in this paper we are not at all interested in the consequences of innovations, such as economic growth or employment. Rather, we are interested in actual innovations and the determinants of innovations – which we call “input indicators” and which will be discussed later in this section.

As stated above, we contend that indicators 2.2.1, 2.3.3 and 2.3.4 should be categorized as output indicators, despite the fact that they are classified in the IUS as “firm activities” rather than “outputs”. Let us present our reasons for classifying them as innovation output indicators.

The IUS indicator 2.2.1, ‘SMEs innovating in-house’, is classified under the category firm activities. However, we believe that this indicator needs to be seen as an innovation output indicator. According to the definition provided by the IUS, it refers to the degree to which SMEs have succeeded in the introduction of new or significantly improved products and/or processes which have been innovated inside the company. In other words, it identifies the firms where innovation processes have been completed and have led to an actual new product or process. Therefore, this indicator in fact measures outputs of an innovation system.


Similar arguments hold for indicators 2.3.3, ‘Community trademarks per billion GDP’, and 2.3.4, ‘Community designs per billion GDP’, which are also included in the IUS under the category of firm activities. Community trademarks, as well as community designs, are significant aspects of product innovations, since they help to label a specific brand or design. Since the number of community trademarks and community designs (related to GDP in each country) provides a measure of innovations which are already on the market, these two indicators should be seen as innovation output indicators. It should also be highlighted that the IUS explicitly lists indicator 2.3.3 (trademarks) as an “innovation indicator” (see Table 1).

The five indicators 3.1.1 through 3.2.4 in Table 1 are considered to be “outputs,” both by the IUS and by us. However, a conceptual difference exists between the label “outputs” as used in the IUS and the concept of “innovation output” used in this paper. It is for this reason that we do not classify as “innovation output indicators” the following three indicators referred to in the IUS as “outputs”: 3.1.3, ‘Employment in fast-growing firms of innovative sectors’; 3.2.1, ‘Employment in knowledge-intensive activities’; and 3.2.5, ’License and patent revenues from abroad’.

The rationales for not considering the above three indicators as measures of innovation output are the following. Indicators 3.1.3 and 3.2.1 measure employment. Employment may be an outcome of innovation, but it may also be a result of other forces. Some kinds of innovations (e.g. product innovations) often result in increased employment, while other kinds (e.g. process innovations) normally result in decreasing employment per unit of output. These two indicators can therefore not be considered innovation output indicators (Edquist et al., 2001). Employment should be considered a consequence of innovations rather than an innovation as such – just as in the case of economic growth.

The third indicator listed in the IUS as an output, 3.2.5, licenses and patents, refers to sales of intellectual property rights. Although patents may form the basis for innovations, they are certainly not innovations themselves, although this is a common misunderstanding. As long as a product or a process has not been commercialized and adopted by users (user firms or consumers), it cannot be considered to be an innovation. There are, for example, many inventions which are patented but never reach the market and therefore never become innovations.

Looking further at the measurement framework of the IUS, it also becomes clear that while one of the main categories of indicators is considered to be a measure of innovation output, there is no category explicitly referring to innovation inputs (determinants of innovation) or providing a clear specification of what such inputs would be. Instead, the IUS defines two other main types of innovation indicators: Enablers, innovation drivers which are outside the firm, and Firm activities, indicators which capture the innovation efforts undertaken by firms. However, it seems that some of the Enablers and Firm activities have characteristics which motivate classifying them as innovation input indicators, as we discuss below.

We propose four indicators (see Table 2) that fulfill the requirements for innovation inputs according to the definition presented above. Two of the indicators are categorized in the IUS conceptual structure as “enablers” and two as “firm activities”. Two of those indicators measure R&D expenditures from the public and private sector, both important determinants of innovation.14 Venture capital, which is important “for the relative dynamism of new business creation” (European Union, 2014: 87), is especially needed for risk- and cost-intensive innovation and is also required to enhance innovation by commercialization of R&D results. In addition to R&D-intensive investments, companies need to invest in non-R&D innovation expenditures as well.

Table 2. - Indicators proposed as innovation input indicators by us

Indicator: 1.3.1 R&D expenditure in the public sector (% of GDP)
Interpretation: Trends in the R&D expenditure indicator provide key indications of the future competitiveness and wealth of the EU.
Data source: Eurostat

Indicator: 1.3.2 Venture capital (% of GDP)
Interpretation: The amount of venture capital is a proxy for the relative dynamism of new business creation. In particular for enterprises using or developing new (risky) technologies, venture capital is often the only available means of financing their (expanding) business.
Data source: Eurostat

Indicator: 2.1.1 R&D expenditure in the business sector (% of GDP)
Interpretation: This indicator captures the formal creation of new knowledge within firms.
Data source: Eurostat

Indicator: 2.1.2 Non-R&D innovation expenditures (% of turnover)
Interpretation: This indicator measures non-R&D innovation expenditure as a percentage of total turnover. Several of the components of innovation expenditure, such as investment in equipment and machinery and the acquisition of patents and licenses, measure the diffusion of new production technology and ideas.
Data source: Eurostat (CIS)

Source: European Union (2014: 86-90).

14 Of these two indicators, R&D expenditures in the business sector are certainly to a very large extent directly undertaken to enhance innovation. R&D expenditures in the public sector are to a lesser extent undertaken directly for this purpose, since a substantial proportion is pursued to result in scientific knowledge, part of which may, in turn, result in innovations. In spite of this, we include both of these indicators in the category of input indicators, although a part of the public sector R&D expenditures may result in innovations after a substantial time lag.

The four input indicators proposed above are linked to innovation activities and are typically undertaken to enhance innovation, at least in part. There are, of course, other determinants of innovation processes (Furman et al., 2002). Ideally we should include all such determinants as input indicators. However, we would then need a holistic theory of all determinants of innovation processes and their relative importance (Samara et al., 2012). That we do not have. Admittedly, this is unsatisfactory – but a fact. For example, in the IUS (and in this paper) no account is taken of determinants of innovation processes operating from the demand side.15

In other contexts we do, however, argue in a more holistic way in terms of ten activities in innovation systems (sometimes called functions) that influence innovation processes and cover a wide range of determinants of innovation, if not all (Edquist, 2005, 2011). However, given the purposes of this paper, we have chosen to include only these four, since they are clearly input indicators and, in addition, data for these indicators are also found in the data provided by the IUS.16 As indicated above, we choose here to use only the IUS data, although with a different approach than the IUS.17 In addition, using data from the IUS database makes it possible to compare the two approaches to measuring performance of innovation systems independent of the availability and quality of the data used (Zabala-Iturriagagoitia et al., 2007a).18

15 Demand side determinants are addressed in Edquist and Zabala-Iturriagagoitia (2012) and in Edquist et al. (2015).

16 If all innovation input and all innovation output indicators were included, we would be able to calculate total productivity. As indicated, we will be satisfied here with a limited number of indicators on both sides, i.e., we will only be able to provide a partial measure of productivity (efficiency, innovation performance). At least we make some distinction between input and output indicators – which the IUS does not do when calculating “EU Member States’ innovation performance”.

17 There will, of course, be reasons to include non-IUS indicators if a holistic approach is pursued. But this would be a different exercise.

18 The sources of the IUS data were briefly mentioned in the beginning of Section 3. However, we make no attempt here to evaluate the quality of these data – simply because it is not the purpose of this paper. This does not exclude the fact that we are aware of the discussion of the problems with regard to the quality of, for example, the Community Innovation Survey data. Examples of contributions to this discussion are Evangelista et al. (1998), Mairesse and Mohnen (2002), Laursen and Salter (2006), Arundel et al. (2009) and Heidenreich (2009), to mention a few.


Our main goal here is to begin to consider some indicators as inputs and others as outputs, and in so doing to provide a starting point that could be generalized into a more holistic and less partial approach in future work. Admittedly, the analysis pursued here – partially comparing rankings between output and input indicators – follows a linear logic (as opposed to a systemic or holistic one). This is not supported by the scholars in the innovation systems approach, not even by us (Edquist, 2014a, 2014b, 2014c).

Based on the arguments outlined in this section, the eight output indicators and the four input indicators will next be used to assess the innovation performance of the Swedish national innovation system in relation to the rest of the EU28 Member States and also to analyze the viability of the proposed approach.

4. Analysis

After discussing the rationales for the selection of certain input and output indicators, this section analyzes the performance and relative position of Sweden for each of the proposed indicators within the EU. In this paper, following Frane (2014), we limit the analysis to the data provided by the IUS 2014 and 2015. We make some comparisons between the data provided in these two editions of the IUS. Except for such limited comparisons we do not analyze the development over time, which is a matter for further work.

4.1. Output orientation

We begin the analysis of the Swedish innovation system from the perspective of the production of innovation outputs, first looking at the three indicators that are not explicitly regarded as output indicators by the IUS, but which we consider to be such (see Section 3). Starting with indicator 2.2.1, ‘SMEs innovating in-house’, the data provided by the IUS 2014 and 2015 show that the normalized score for Sweden is higher than the EU28 average. While the EU28 average reaches 0.570 points in 2011 (0.513 in 2012), the latest year for which data are available for this indicator, Sweden has a normalized score of 0.729 in year 2011 (0.779 in year 2012).19 Sweden thereby holds the 8th position for this indicator in the IUS 2014 (4th position in the IUS 2015), the leading countries in the IUS 2014 being Germany (0.933), Cyprus (0.833) and Denmark (0.813) (see Table 3).20

19 Note that most of the indicators provided by the IUS 2014 refer to years 2011 and 2012, while those included in the IUS 2015 refer to years 2012 and 2013.

Indicator 2.3.3, on ‘Community trademarks,’ shows a similar picture for Sweden, with 0.573 points as the normalized score in 2012 (i.e., IUS 2014) and 0.661 points in 2013 (i.e., IUS 2015). Although Sweden is above the EU28 average (0.444 in the IUS 2014 and 0.580 in the IUS 2015), the country is not part of the group of countries leading this indicator, as it holds the 7th position in 2012 and the 8th in 2013. Three countries (Cyprus, Luxembourg, and Malta) reach the highest normalized score of 1.0 in both years, while Sweden achieves just over half of that.21

Taking a closer look at community designs, as measured by indicator 2.3.4, Sweden’s score of 0.574 is almost the same as the average for the EU28 countries for 2012, which is about 0.566. Sweden holds the 8th position in the European context, which is led by Luxembourg and Austria (1.0 normalized score), Denmark being in the third position (0.971). This is one of the indicators where a larger difference is observed between the outputs in Sweden according to the IUS 2014 and the IUS 2015. In year 2013 (i.e., IUS 2015), indicator 2.3.4 on community designs amounts to 0.999, Sweden being ranked 3rd in the context of the EU28.

Hereafter we analyze those indicators which are classified as “outputs” by the IUS and as innovation output indicators by us. According to IUS indicator 3.1.1, ‘SMEs introducing product or process innovations,’ Sweden reached a normalized score of 0.781 in 2011 (4th position) and 0.656 in 2012 (6th position), which is the latest data offered by the IUS. This is significantly higher than the EU average (0.577 in 2011 and 0.432 in 2012), but also below the top ranked countries, which are Germany, Belgium and Luxembourg.

The normalized score for Sweden for indicator 3.1.2, ‘SMEs introducing marketing or organizational innovations’ (0.605 in year 2011 and 0.540 in year 2012), is also above the EU28 average (0.566 in 2011 and 0.495 in 2012). Nevertheless, Sweden is still in the 10th (2011) and 12th (2012) position, respectively, and miles behind the innovation leaders, Germany, Luxembourg, Ireland and Greece.

20 The leading countries in the IUS 2015 are the Netherlands (0.797), Ireland (0.792) and Germany (0.787).

21 It seems quite unclear why especially Cyprus and Malta are top ranked for indicator 2.3.3, well above other European countries such as Germany (0.595 – 6th position in 2012, 0.670 – 6th position in 2013), France (0.308 – 19th in 2012, 0.462 – 20th in 2013), Italy (0.396 – 16th in 2012, 0.545 – 15th in 2013), the United Kingdom (0.419 – 12th in 2012, 0.578 – 13th in 2013) or the Netherlands (0.541 – 9th in 2012, 0.631 – 10th in 2013).

For indicator 3.2.2, ‘Contribution of medium and high-tech product exports to trade balance’, the distance between Sweden and the top-ranked countries is substantial. Sweden has a normalized score of 0.579 for 2012 (0.648 in 2013), which is slightly above the EU average in 2012 (0.553) and slightly below it in 2013 (0.658). However, this should not obscure the fact that Sweden is behind 14 other EU countries in 2012 (in year 2013 Sweden ranks 9th). This means that half of the countries analyzed in the context of the EU28 show a better result than Sweden did in 2012 for this particular measure. Germany (0.930) leads the group once more in 2012, Slovenia (0.802) and Hungary (0.756) being second and third respectively.22

For 3.2.3, ‘Knowledge-intensive services exports’, the EU28 average showed a normalized score of 0.606 in 2011 (0.665 in 2012), while Sweden reached 0.510 points (0.524 in 2012), which is below the EU average and places the country in the 10th position (11th in 2012). Ireland and Luxembourg lead the ranking for indicator 3.2.3, while Denmark holds the third position.

Finally, the result observed in relation to indicator 3.2.4, ‘Sales of new to the market and new to the firm innovations,’ is even worse, as Sweden falls down to position 21 in 2011 (24th in 2012), with a normalized score of 0.248 (0.156 in 2012). Sweden is far behind the European average (0.664 in 2011, 0.488 in 2012) and only a few countries show a poorer result. Greece and Slovakia (1.0) are the best performers among the EU28 in 2011, Spain (0.982) being third.23 Given the fact that this indicator measures the share of the turnover due to the sales of significantly improved products, new to the firm, or new to the market innovations, this indicator is, in our view, one of the most important and basic output indicators of all. Sweden’s poor result for this indicator should therefore be seen as a serious weakness in the Swedish innovation system.

Table 3 summarizes the normalized scores for the eight output indicators and the relative position Sweden holds in relation to the EU28 countries for the latest year for which data are available for each indicator. It also gives an average ranking and normalized score for Sweden for all output indicators.

22 In year 2013 (i.e. IUS 2015), Hungary is the leading country for this particular indicator (0.899), Germany being second (0.892) and Slovakia (0.850) being third.

23 In year 2012 (i.e. IUS 2015), Denmark is the leading country for this particular indicator (1.000), Slovakia being second (0.869) and Spain (0.590) being third.


Table 3. - The innovation output indicators of the Swedish national innovation system24

2.2.1 SMEs innovating in-house as % of SMEs
Score in 2014 [2015]: 0.729 [0.779]
Ranking (out of 28) in 2014: 8; in 2015: 4
EU28 average in 2014 [2015]: 0.570 [0.513]
Leading countries (top 3) in 2014: Germany (0.933), Cyprus (0.833), Denmark (0.813)
Leading countries (top 3) in 2015: Netherlands (0.797), Ireland (0.792), Germany (0.787)

2.3.3 Community trademarks per billion GDP (in PPP-€)
Score in 2014 [2015]: 0.573 [0.661]
Ranking (out of 28) in 2014: 7; in 2015: 8
EU28 average in 2014 [2015]: 0.444 [0.580]
Leading countries (top 3) in 2014: Luxembourg (1.0), Cyprus (1.0), Malta (1.0)
Leading countries (top 3) in 2015: Cyprus (1.000), Luxembourg (1.000), Malta (1.000)

2.3.4 Community designs per billion GDP (in PPP-€)
Score in 2014 [2015]: 0.574 [0.999]
Ranking (out of 28) in 2014: 8; in 2015: 3
EU28 average in 2014 [2015]: 0.566 [0.569]
Leading countries (top 3) in 2014: Luxembourg (1.0), Austria (1.0), Denmark (0.971)
Leading countries (top 3) in 2015: Denmark (1.000), Luxembourg (1.000), Sweden (0.999)

3.1.1 SMEs introducing product or process innovations as % of SMEs
Score in 2014 [2015]: 0.781 [0.656]
Ranking (out of 28) in 2014: 4; in 2015: 6
EU28 average in 2014 [2015]: 0.577 [0.432]
Leading countries (top 3) in 2014: Germany (1.0), Belgium (0.848), Luxembourg (0.792)
Leading countries (top 3) in 2015: Luxembourg (0.732), Germany (0.717), Belgium (0.713)

3.1.2 SMEs introducing marketing or organizational innovations as % of SMEs
Score in 2014 [2015]: 0.605 [0.540]
Ranking (out of 28) in 2014: 10; in 2015: 12
EU28 average in 2014 [2015]: 0.566 [0.495]
Leading countries (top 3) in 2014: Germany (1.0), Luxembourg (0.960), Greece (0.801)
Leading countries (top 3) in 2015: Luxembourg (0.851), Ireland (0.797), Germany (0.720)

3.2.2 Contribution of medium and high-tech product exports to trade balance
Score in 2014 [2015]: 0.579 [0.648]
Ranking (out of 28) in 2014: 15; in 2015: 9
EU28 average in 2014 [2015]: 0.553 [0.658]
Leading countries (top 3) in 2014: Germany (0.930), Slovenia (0.802), Hungary (0.756)
Leading countries (top 3) in 2015: Hungary (0.899), Germany (0.892), Slovakia (0.850)

3.2.3 Knowledge-intensive services exports as % of total service exports
Score in 2014 [2015]: 0.510 [0.524]
Ranking (out of 28) in 2014: 10; in 2015: 11
EU28 average in 2014 [2015]: 0.606 [0.665]
Leading countries (top 3) in 2014: Ireland (1.0), Luxembourg (1.0), Denmark (0.959)
Leading countries (top 3) in 2015: Ireland (1.000), Luxembourg (1.000), Denmark (1.000)

3.2.4 Sales of new to market and new to firm innovations as % of turnover
Score in 2014 [2015]: 0.248 [0.156]
Ranking (out of 28) in 2014: 21; in 2015: 24
EU28 average in 2014 [2015]: 0.664 [0.488]
Leading countries (top 3) in 2014: Greece (1.0), Slovakia (1.0), Spain (0.982)
Leading countries (top 3) in 2015: Denmark (1.000), Slovakia (0.869), Spain (0.590)

Average output result25
Score in 2014 [2015]: 0.575 [0.620]
Ranking (out of 28) in 2014: 10; in 2015: 4
EU28 average in 2014 [2015]: 0.568 [0.550]
Leading countries (top 3) in 2014: Germany (0.859), Luxembourg (0.754), Denmark (0.701)
Leading countries (top 3) in 2015: Luxembourg (0.772), Denmark (0.728), Germany (0.723)

Source: own elaboration based on the European Union (2014 and 2015) data.

24 The data and rankings for the innovation outputs of all EU28 Member Countries are presented in Annex 1.

25 Calculation based on the sum of the average normalized scores for each output indicator, divided by the number of output indicators.


According to the IUS 2014 and 2015, the results for Denmark, Finland, Germany and Sweden are well above those of the EU average. These countries are labelled the ‘innovation leaders’. According to the IUS, “in all dimensions the performance of the innovation leaders, Sweden, Denmark, Germany and Finland is not too different” (European Union, 2014: 4-5). Table 3 gives a sharply different picture, however. Taking into account the normalized values observed for the eight output indicators discussed above, and according to the IUS 2014 data, Sweden has an average normalized score of 0.575 for the innovation output indicators (0.620 for the IUS 2015), which is very close to the EU28 average of 0.568 (0.550 for the IUS 2015). Sweden thereby holds the 10th position among the EU28 (4th in the IUS 2015).26

This result means that nearly a third of all EU countries have higher innovation outputs than Sweden based on the IUS 2014 data. The best performing countries with regard to innovation output, based on the IUS 2014 data, are Germany (0.859), Luxembourg (0.754) and Denmark (0.701). As shown by Table 3, Sweden (ranked 10th) is thus far behind Germany and considerably behind Luxembourg and Denmark. In turn, the best performing countries based on the IUS 2015 are Luxembourg (0.772), Denmark (0.728) and Germany (0.723), Sweden being ranked 4th with a normalized value of 0.620.

The main indicators explaining the change in the relative position of Sweden between the IUS 2014 and IUS 2015 data are 2.2.1 "SMEs innovating in-house" (a change from 0.729 – 8th – in 2010 to 0.779 – 4th – in 2012), 2.3.4 "Community designs" (a change from 0.574 – 8th – in 2012 to 0.999 – 3rd – in 2013), and 3.2.2 "Medium and high-tech product exports" (a change from 0.579 – 15th – in 2012 to 0.648 – 9th – in 2013).[27]

[26] We have also made a limited sensitivity analysis by replicating the above analysis of the innovation outputs including 12 indicators (2.2.1, 2.3.1, 2.3.3, 2.3.4, 3.1.1, 3.1.2, 3.1.3, 3.2.1, 3.2.2, 3.2.3, 3.2.4, 3.2.5). As was the case with the average output results included in Table 3, the calculation of the average output results in this case is based on the sum of the normalized scores for these output indicators, which are then divided by the number of output indicators (n=12). Using this procedure with the IUS 2014 data, the ranking is led by Germany with a normalized score of 0.809, followed by Luxembourg with 0.746 and Denmark with 0.720. Sweden ranks 4th with a normalized value of 0.686. When comparing the average values and rankings obtained with both approaches (12 outputs as compared to 8 outputs), we get a correlation of R² = 0.85. This implies that the average output results and the subsequent rankings with 8 and 12 outputs show very similar values for the EU28 countries. When using the IUS 2015 data, the ranking is led by Luxembourg with a normalized score of 0.765, followed by Germany with 0.725 and Denmark with 0.724. Sweden ranks 5th with a normalized value of 0.690. When comparing the average values and rankings obtained with both approaches (12 outputs as compared to 8 outputs), we get a correlation of R² = 0.92 in 2015. As was the case with the R² value for the IUS 2014, this high correlation implies that the low output performance we find for Sweden does not depend on the number of indicators considered, but rather on the low outputs achieved by Sweden in comparison with the other EU28 countries.
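As an illustration of the averaging-and-correlation check described in footnote 26, a minimal sketch in Python might look as follows. It uses randomly generated placeholder values instead of the actual IUS country scores, assumes the numpy library, and is not the authors' code.

# Illustrative sketch only: average output scores over 8 versus 12 indicators
# and the R^2 between the two sets of country averages, using placeholder data.
import numpy as np

rng = np.random.default_rng(0)
scores = rng.uniform(0.0, 1.0, size=(28, 12))   # rows: 28 countries, cols: 12 output indicators

# columns 0-7 stand in for the 8 output indicators used in Table 3,
# all 12 columns for the extended indicator set of footnote 26
avg_8 = scores[:, :8].mean(axis=1)
avg_12 = scores.mean(axis=1)

ranking_8 = np.argsort(-avg_8)    # country indices ordered from best to worst
ranking_12 = np.argsort(-avg_12)

# footnote 26 reports R^2 = 0.85 (IUS 2014) and 0.92 (IUS 2015) for the real data
r_squared = np.corrcoef(avg_8, avg_12)[0, 1] ** 2
print(ranking_8[:3], ranking_12[:3], round(r_squared, 2))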

The results above should call for a serious reconsideration of who the real European "innovation leaders" may be, and in what sense they are leaders. These findings also call into question the way that the European Commission analyzes the innovation data presented in the IUS. Using the data from the IUS 2014 to assess the production of innovation outputs, we have concluded that Sweden does not possess one of the best performing innovation systems in the EU28, as it ends up ranked number ten out of 28 in our analysis (4th when using the IUS 2015 data). Admittedly, the method described in this subsection is quite partial, only measuring innovation outputs. To make the method less partial, in section 4.3 we will compare these outputs to some inputs that have been available for developing and commercializing innovations. However, we will first discuss our selection of the indicators and data of relevance for innovation input in section 4.2.

4.2. Input orientation

The four IUS indicators which we consider here as being important for the input side of innovation processes are listed in Table 4, where we also summarize Sweden's normalized scores and rankings for the input indicators we selected. A high position and ranking here means that innovation efforts (often investments) to enhance innovation output are high. Thus, when a country has a high normalized score and ranking on the input side but a low one on the output side, the country has a low efficiency in the translation of inputs into outputs, i.e., a low productivity of the innovation system – as we will later discuss in section 4.3.

Looking at the IUS scores for indicator 1.3.1, 'Public R&D expenditures', we see that Sweden had a normalized score of 0.979 in 2012 (0.957 in 2013), which is close to the highest result (Finland, 0.990 in 2012 and Denmark, 0.989 in 2013), while the EU28 average was 0.639 in 2012 (0.641 in 2013).

The score for private R&D expenditures, indicator 2.1.1 (Business R&D expenditures as % of GDP), was even higher (a normalized score of 0.991 in 2012 and 0.956 in 2013), with Sweden again ranked second, only after Finland.

[27] See Edquist and Zabala-Iturriagagoitia (2015) for a more detailed analysis of the IUS 2014 data in the case of Sweden.


Table 4. The innovation input indicators of the Swedish national innovation system [28]

1.3.1 Public R&D expenditures as % of GDP – Score 2014 [2015]: 0.979 [0.957]; Ranking: 2 in 2014, 3 in 2015; EU28 average: 0.639 [0.641]; Leading countries (top 3) in 2014: Finland (0.990), Sweden (0.979), Denmark (0.918); in 2015: Denmark (0.989), Finland (0.957), Sweden (0.957).

1.3.2 Venture Capital investments – Score 2014 [2015]: 0.503 [0.536]; Ranking: 8 in 2014, 7 in 2015; EU28 average: 0.478 [0.472]; Leading countries (top 3) in 2014: Luxembourg (1.0), UK (0.762), Finland (0.544); in 2015: Luxembourg (0.858), UK (0.672), Denmark (0.604).

2.1.1 Business R&D expenditures as % of GDP – Score 2014 [2015]: 0.991 [0.956]; Ranking: 2 in 2014, 2 in 2015; EU28 average: 0.558 [0.559]; Leading countries (top 3) in 2014: Finland (1.0), Sweden (0.991), Slovenia (0.926); in 2015: Finland (1.000), Sweden (0.956), Denmark (0.868).

2.1.2 Non-R&D innovation expenditures as % of turnover – Score 2014 [2015]: 0.319 [0.412]; Ranking: 10 in 2014, 10 in 2015; EU28 average: 0.275 [0.349]; Leading countries (top 3) in 2014: Cyprus (0.936), Lithuania (0.701), Estonia (0.557); in 2015: Estonia (0.871), Latvia (0.764), Germany (0.746).

Average input result [29] – Score 2014 [2015]: 0.698 [0.775]; Ranking: 1 in 2014, 1 in 2015; EU28 average: 0.488 [0.505]; Leading countries (top 3) in 2014: Sweden (0.698), Finland (0.694), Germany (0.631); in 2015: Sweden (0.775), Germany (0.718), Denmark (0.672).

Source: own elaboration based on the European Union (2014 and 2015) data.

[28] The data and rankings for the innovation inputs of all EU28 Member Countries are presented in Annex 2.
[29] Calculated as the sum of the normalized scores for each input indicator, divided by the number of input indicators.


Regarding Venture Capital investments, indicator 1.3.2, Sweden shows a normalized score of 0.503 in 2012 (0.536 in 2013), the 8th position in the EU context in 2012 (7th in 2013), slightly above the EU28 average (0.478 in 2012, 0.472 in 2013).

Finally, for indicator 2.1.2, 'Non-R&D innovation expenditures as % of turnover', Sweden shows a normalized score of 0.319 in 2010 (0.412 in 2012), which positions the country 10th in both 2010 and 2012. Given the IUS definition of the indicator, this means that investments in "equipment and machinery and the acquisition of patents and licenses" (European Union, 2014: 87) are low, and that more than a third of all European countries are investing more in order to spread new production technologies and inventions.

Looking at all four input indicators that we selected as important for innovation, it becomes evident that Sweden is at the very top with regard to its average ranking among the EU28 Member States (ranking number one based on both the IUS 2014 and 2015 data, with average scores of 0.698 and 0.775, respectively).[30] According to the IUS 2014 data, Finland has ranking number 2 (0.694) and Germany has ranking number 3 (0.631).[31] Using the IUS 2015 data, Germany has ranking number 2 (0.718) and Denmark has ranking number 3 (0.672). It should be pointed out that the differences between the normalized input scores of these countries are quite small. In section 4.3, we will now compare output and input indicators with each other in order to discuss the efficiency or performance of the Swedish national innovation system as a whole.

[30] As we discussed in Section 3, a considerable part of public R&D has other objectives than innovation. Therefore, we have also made a limited sensitivity analysis to check whether the average input result for Sweden shows a similar or a different pattern when indicator 1.3.1, 'Public R&D expenditures', is included or excluded. As was the case with the average output results included in Table 3, the calculation of the average input results in this case is based on the sum of the normalized scores for the input indicators considered, which are then divided by the number of input indicators (n=3). When excluding indicator 1.3.1, and using this procedure with the IUS 2014 data, the ranking is also led by Sweden with a normalized score of 0.604, followed by Finland with 0.595 and Germany with 0.556. When comparing the average values and rankings obtained with both approaches (4 inputs as compared to 3 inputs), we get a correlation of R² = 0.85. This implies that the average input results and the subsequent rankings with 3 or 4 inputs show very similar values for the EU28 countries. When using the IUS 2015 data, the ranking is led by Germany with a normalized score of 0.664, followed by Sweden with 0.635 and Finland with 0.573. When comparing the average values and rankings obtained with both approaches (4 inputs as compared to 3 inputs), we get a correlation of R² = 0.87. As was the case with the R² value for the IUS 2014, this high correlation implies that the high input performance we find for Sweden does not depend on whether 'Public R&D expenditures' are included or not.
[31] We have also replicated the above analysis of the innovation inputs considering 7 input indicators (1.1.1, 1.1.2, 1.1.3, 1.3.1, 1.3.2, 2.1.1, 2.1.2). The ranking is still led by Sweden, with a score of 0.771 in 2014 and a score of 0.810 in 2015. When comparing the average values and rankings obtained with both approaches (7 inputs as compared to 4 inputs), we get a correlation of R² = 0.86 in 2014 and R² = 0.83 in 2015.


4.3. The performance of the Swedish national innovation system

So far we have seen that our analysis of the IUS data shows that Sweden is not in a top position on the output side, while the input side shows that there is a high inflow of resources into the national innovation system – both positions in relation to the other EU member states. Together these results indicate that Sweden is not the leading country in terms of innovation performance in the EU – according to our approach to analyzing the IUS data.

In this subsection we focus further on the relation between the input and the output sides, since this relation provides a measure of the innovation performance of the Swedish national innovation system in terms of its efficiency, or productivity. We will also rank the productivity or efficiency of Sweden's innovation system in relation to the other EU28 countries. In what follows, we will discuss – and qualify – the interpretation of this ranking.

What then can be concluded about a country's performance on the basis of the 25 IUS indicators? The IUS calculates a Summary Innovation Index (SII) from them, in which it gives the same weight to all indicators. In addition, it makes no distinction whatsoever as to whether indicators reflect (a) innovations as such, (b) determinants or inputs of innovation processes, or (c) consequences of innovations. After using these indicators to calculate what is called the "EU Member States' innovation performance", an IUS ranking for the EU28 Member States is calculated. Sweden has, for several years, emerged as number one in this ranking. This has often been interpreted as meaning that Sweden is the top ranked country in the EU with regard to innovation performance. That this interpretation is common was documented in section 1.

Behind our choice of analysis below is the well-established fact that the only way to measure the efficiency or productivity of a firm, country or system is to compare outputs with inputs, as we have already argued in Section 2. There must be a numerator and a denominator in a productivity ratio. The efficiency or productivity of an innovation system is thus defined by us as the ratio between innovation output and innovation input. Such a ratio shows how efficiently the countries/systems use their innovation inputs. The IUS, on the other hand, simply calculates the average of all 25 indicators and does not relate them to each other in any other way.

We acknowledge that, as in any process of change, there are time lags between the investments made on the input side and the achievement of certain outputs (Brenner, 2014; Brenner and Broekel, 2009; Makkonen and van der Have, 2013). However, in this paper, we have decided not to include any analysis of time gaps in the discussion of the relations between innovation inputs and innovation outputs.[32] We want to stress, however, that time lags between the inputs and outputs should be considered in further analyses of the efficiency of innovation systems, although this might be difficult to do for all EU28 countries, due to the differences in their innovation profiles, and because, for example, radical innovations may require longer gestation periods than incremental innovations.

The productivity of the Swedish innovation system is shown in Table 5. Applying our method to the data provided by the IUS 2014, Sweden is ranked extremely high with regard to input (ranking number one) and at an intermediate level (ranking number 10) with regard to output. The IUS 2015 data show that Sweden continues to be ranked first with regard to input and 4th with regard to output. As shown in Table 5, this obviously leads to a very low ranking with regard to the productivity or efficiency of the innovation system. Annex 3 provides data showing that Sweden is ranked number 24 among the EU28 Member States with regard to the productivity of the innovation system as we define it here using the IUS 2014 data, and number 25 using the IUS 2015 data.[33] Obviously, the national innovation system in Sweden cannot be said to perform well at all from this productivity point of view.
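The productivity scores in Table 5 are simply the ratios of the average output and input results reported in Tables 3 and 4:

0.575 / 0.698 ≈ 0.82 for the IUS 2014, and 0.620 / 0.775 ≈ 0.80 for the IUS 2015.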

It may be repeated here that Sweden scores very low on output indicator 3.2.4 (Sales of new to the market and new to the firm innovations), where Sweden falls from position 21 in the IUS 2014 to position 24 in the IUS 2015. This indicator is one of the most important output indicators of all – see section 4.1.

[32] The IUS 2014 and 2015 reports do not include such an analysis either, and do not even mention the issue.
[33] We have also calculated the productivity of the Swedish innovation system and its relative ranking in the EU28 context with the indexes elaborated with the 7 inputs (input value of 0.771) and the 12 outputs (output value of 0.686). In the IUS 2014, Sweden (with a productivity of 0.89) then holds the 18th position in the EU28. When comparing the average values and rankings with both approaches (4 inputs and 8 outputs compared with 7 inputs and 12 outputs), a correlation of R² = 0.42 is observed. In the IUS 2015, Sweden (with a productivity of 0.852) then holds the 21st position in the EU28. When comparing the average values and rankings with both approaches (4 inputs and 8 outputs compared with 7 inputs and 12 outputs), a correlation of R² = 0.49 is observed.


Table 5: The efficiency and productivity of Sweden's innovation system [34]

Productivity of the innovation system (= innovation output divided by innovation input) – Score in 2014: 0.82 (0.575/0.698); Score in 2015: 0.80 (0.620/0.775); Ranking (out of 28): 24 in 2014, 25 in 2015; Leading countries (top 3) in 2014: Greece (2.52), Bulgaria (2.19), Italy (1.98); in 2015: Cyprus (4.053), Luxembourg (3.431), Romania (1.976).

Source: own elaboration based on data from the European Union (2014 and 2015).

[34] The data and rankings for all EU28 Member Countries are presented in Annex 3.


Figure 2 shows the distribution of the ranking scores obtained from the above efficiency estimation (using 4 input and 8 output indicators) for both the IUS 2014 and 2015. This ranking is then compared with the one provided by the SII, which, according to the IUS, measures "EU Member States' Innovation Performance" (European Union, 2014: 5). With this we want to check to what extent the two approaches (i.e., the one followed by the IUS with the SII and our productivity/efficiency approach) provide similar results or not. If the two rankings coincided, one would expect the majority of countries to be distributed along a 45° line. However, this is not the case. The negative relation of these indices must result from their different conceptual frameworks and settings, since the indicator data are the same in both cases. As can be observed, Sweden is not the only country where the two rankings are radically different. In fact, this is the case for most countries, including innovation leaders, innovation followers, moderate innovators and modest innovators.
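A minimal sketch of how such a comparison can be drawn is given below (Python with the matplotlib library assumed; it is not the authors' code). The five pairs of rankings are the IUS 2014 values reported in Table 6 and Annex 3; the actual Figure 2 covers all EU28 Member States.

# Illustrative sketch: plotting one ranking against the other, with a 45-degree
# reference line along which the points would fall if the two rankings coincided.
import matplotlib.pyplot as plt

ranks = {               # country: (SII ranking, output/input efficiency ranking), IUS 2014
    "Sweden": (1, 24),
    "Denmark": (2, 17),
    "Germany": (3, 11),
    "Finland": (4, 23),
    "Luxembourg": (5, 7),
}

sii = [r[0] for r in ranks.values()]
efficiency = [r[1] for r in ranks.values()]

fig, ax = plt.subplots()
ax.scatter(sii, efficiency)
for country, (x, y) in ranks.items():
    ax.annotate(country, (x, y))
ax.plot([1, 28], [1, 28], linestyle="--")   # 45-degree line: identical rankings
ax.set_xlabel("Ranking according to the SII (IUS 2014)")
ax.set_ylabel("Ranking according to output/input efficiency (IUS 2014)")
plt.show()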

Figure 2. Comparison of the IUS Summary Innovation Index (SII) and our innovation performance measurement for the EU28 (two panels: IUS 2014 and IUS 2015). Source: own elaboration.

Our results indicate that the efficiency or productivity of the Swedish national innovation system is far from being adequate. Compared to the resources invested, the Swedish innovation system does not manage to produce a large enough innovation output – given the assumption that the IUS data measure inputs and outputs correctly.

To put these results into perspective, we will now compare them to the results of another innovation index, which has been produced for eight years. The latest version has been published as "The Global Innovation Index 2014" and was produced by Cornell University, INSEAD, and WIPO (Cornell University et al., 2014). "The Global Innovation Index" (GII) includes 81 indicators for 143 countries. All indicators are classified as innovation input or innovation output indicators and a sub-index is calculated for each.[35] Sweden is ranked high both with regard to the Innovation Input Sub-Index (6th) and the Innovation Output Sub-Index (3rd) (Cornell University et al., 2014: 16-18). The GII also provides a simple average of the two sub-indexes, i.e., outputs and inputs are not compared in this average. Hence, this GII average is calculated in a way similar to the Summary Innovation Index (SII) of the IUS (2014).

[35] In this paper, we have not evaluated the accuracy of the GII classification into input and output indicators.


According to the GII average index, Sweden was ranked number 3 in 2014, after Switzerland (ranked number 1) and the United Kingdom (ranked number 2). In the GII average indexes for 2011 and 2013, Switzerland was ranked number 1 and Sweden number 2 (Cornell University et al., 2014). Hence the GII Index and the IUS SII index lead to similar results for the case of Sweden.

As mentioned, the IUS does not make any distinction or comparisons at all between innovation input and innovation output indicators. However, in the GII a ratio is calculated between the Innovation Output Sub-Index and the Innovation Input Sub-Index for all countries. This is called the "Innovation Efficiency Ratio". "It shows how much innovation output a given country is receiving from its inputs" (Cornell University et al., 2014: 7). It is therefore, in its basic characteristics, similar to the innovation performance (i.e., efficiency/productivity) measure proposed earlier by us.

The GII "Innovation Efficiency Ratio" shows that, despite Sweden's very high rankings for inputs (6) and outputs (3), it is ranked number 22 with regard to innovation efficiency. Hence our use of IUS data to calculate the efficiency of innovation systems and the calculation of the GII "Innovation Efficiency Ratio" lead to results that point in the same direction: they both indicate that the Swedish innovation system is quite inefficient.

The results presented here should be related to the decades-old discussion of the so-called "Swedish paradox" (Edquist and McKelvey, 1998), a notion still central to innovation policy discussions in Sweden. When first formulated, it reflected a high research and development (R&D) intensity in Sweden coupled with a low share of high-tech (R&D intensive) products in manufacturing as compared to the OECD (Organization for Economic Co-operation and Development) countries.[36] It was seen as a paradox between a high input and a low output, as measured by these specific indicators.

[36] The share of high-tech products was seen as a proxy for innovation output intensity.

In other words, the Swedish paradox pointed to a low productivity for the Swedish national system of innovation in this specific sense, i.e., on the basis of the scarce data that was available in the 1990s. Subsequently, the expression has been used widely, but often formulated as a general relation between inputs and outputs – e.g., that the investments in R&D in Sweden are very large, but that the 'pay-off' (in terms of, for example, growth and competitiveness) is not particularly impressive (e.g., Andersson et al., 2002). Due to varying uses of the concept, and since many formulations have been based on rather partial data, there has been considerable discussion about the extent to which a paradox actually exists.[37]

The analysis presented in this paper indicates that the Swedish paradox, in the original sense of the term, is still in operation, provided that the IUS data correctly measure inputs and outputs and that we are applying them in an appropriate way. The same conclusion is also indicated by the GII data and analysis. The reasonable interpretation is that Sweden invests substantial inputs in the development of innovations, but when it comes to the actual production of innovation outputs, Sweden shows relatively low results.

To understand the current confusion about the innovation performance of Sweden as compared to other EU countries, it is important to note that the meaning of the notion "innovation performance" in the phrase "EU Member States' Innovation Performance" is not explicitly defined or specified in the IUS. No theory-based discussion of innovation performance underpins it, which makes it hard for the reader of the IUS reports to evaluate the meaning and implications of this concept. "Innovation performance" is only specified implicitly and contextually by the way it is used and measured. This is highly surprising, since "innovation performance" is the single most central concept in the IUS reports. If this vagueness is conscious, it should be severely criticized, since it may lead to – and has led to – many different interpretations.

As it is used in the IUS, "innovation performance" implicitly means an average obtained by calculating values for 25 indicators measuring the determinants of innovations, innovations as such, and the consequences of these innovations (i.e., enablers, firm activities and outputs, in the IUS language).[38] In the IUS, there is no conceptual or theoretical discussion that motivates why the term should have this meaning. No criteria are presented for choosing exactly those 25 indicators. Since it includes determinants, innovations as such and consequences, the term is useless from an innovation policy point of view.

[37] The "Swedish Paradox" has been intensively discussed (e.g., Jacobsson and Rickne, 2004; Granberg and Jacobsson, 2006; Audretsch, 2009; Ejermo and Kander, 2009; Ejermo et al., 2011). However, most of these authors define the phenomenon in different ways compared to Edquist and McKelvey (1998). Hence, the different views on whether there is a paradox or not depend on what is meant by the term. This could be analyzed, but it is not within the scope of this paper.
[38] In spite of this very wide use of the concept, the IUS labels it "innovation performance" and not "innovation-related activities", although the latter would be a more sensible label.

The use of the term is even counterproductive from a policy point of view. The main reasons for this are the following:

- The objective of innovation policy must be to influence the development and diffusion of innovations as such. Therefore, the relative innovation output should be known to policy analysts, policy-makers and politicians. If these innovation intensities are not known, it is hard for policy-makers to motivate why the innovation intensities should be improved by means of policy (Edquist, 2011).

- The SII is not a measure of innovation output as such, but includes a considerable number of variables in addition to innovation output indicators. We have shown, for example, that the SII includes indicators of determinants of innovations (section 4.3). Hence the SII score ("innovation performance") for a country will increase if it puts more (input) resources into its innovation system, even if this results in no innovations at all. This confusing "linear logic" means that the "innovation performance" concept of the IUS is more of an obstacle to the development of innovation policy than an asset that facilitates the design and implementation of innovation policy.

- Currently existing innovation intensities have been influenced by a number of forces that affect innovation processes – forces which we call determinants of innovations. Many of these determinants can be influenced by the state. When the state (through its public agencies) influences these determinants in order to increase innovation intensities of certain kinds, it is actually pursuing innovation policy. Therefore, the degree to which public organizations can influence innovations should be known by these organizations. They should be able to know and monitor the evolution of these determinants. The SII does not help them to do so. On the other hand, the group of input indicators that we have singled out from the list of 25 IUS indicators is useful for this purpose (but the number of determinants included in this group should be enlarged; it should ideally include all important determinants of innovation processes and describe them using a holistic approach, see Edquist 2014a, 2014c).

- In this context it might also be useful to briefly address the consequences of innovations – consequences for productivity growth, for employment, for environmental conditions, for social conditions, etc. (Gómez-Uranga et al., 2014). It is important for policy-makers and politicians to know these consequences. They are the factors that politicians are actually interested in. Politicians are not interested in innovations as such, but in their socioeconomic consequences. They know, though, that they have to achieve certain innovation intensities in order to achieve the socioeconomic benefits of innovations.

We now return to the issue of performance. As stated above, in this paper we understand "innovation performance" to mean either of two things: (i) the output of innovations as such, which is here measured by the eight innovation output indicators discussed in section 4.1;[39] and (ii) the productivity or efficiency of innovation systems, which is here measured as the ratio between the eight innovation output indicators and the four innovation input indicators (discussed in section 4.3).

As can be observed in Annex 3, the EU national innovation systems are ranked in the following order in terms of efficiency: Greece, Bulgaria, Italy, Romania, Ireland, Cyprus, Luxembourg, Portugal, Spain, Slovakia, Germany, Austria, Czech Republic, Malta, Belgium, Croatia, Denmark, France, Netherlands, Latvia, Hungary, Estonia, Finland, Sweden, Slovenia, United Kingdom, Poland and Lithuania (European Union, 2014). No matter how unbelievable it may appear, this efficiency ranking is correct to the extent that the IUS data are correct and our approach to dealing with them is reasonable. It is still counterintuitive, however, since many of the top ranked countries are less developed and must be considered to have less developed innovation systems.

One possible partial explanation for this seeming contradiction is that many of these less developed EU countries devote very limited resources to inputs, but still manage to obtain a reasonable number of outputs in relation to the inputs they are able to invest. For example, Annex 2 shows that Sweden invested 7.35 times more than Bulgaria in innovation inputs, according to the IUS 2014. At the same time, Sweden achieved an output figure that is only 2.77 times higher than Bulgaria's. The figures are similar for the other EU countries that rank high on productivity but have low innovation input figures. The above figures simply imply that the less developed countries manage to use their (more limited) resources in a more productive/efficient way.

[39] It should be stressed that our notion of "innovation output" is almost identical to the definition of "innovation" in the OECD "Guidelines for collecting and interpreting innovation data", i.e., the so-called "Oslo Manual" (OECD, 2005).
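In ratio terms, these figures already pin down the relative productivity of the two systems: if Sweden's input is roughly 7.35 times Bulgaria's while its output is only about 2.77 times Bulgaria's, then Sweden's output/input ratio is about 2.77 / 7.35 ≈ 0.38 of Bulgaria's – consistent with the productivity scores of 0.82 for Sweden and 2.19 for Bulgaria reported in Table 5 and Annex 3.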

How can this higher productivity of less developed national innovation systems be explained? As Zabala-Iturriagagoitia et al. (2007a: 667) discuss, countries with more developed and more comprehensive innovation systems are usually more oriented towards radical innovation and the development of new industries, often in knowledge-intensive sectors and in high-tech industries. Such innovation efforts are characterized by uncertainty, high risk, and failures. In contrast, countries with lower innovation capacity and fewer innovation resources tend to absorb and adopt the embodied knowledge and the innovations of others (e.g., from abroad). Such absorption involves lower innovation input costs but may, at the same time, be more efficient, as it avoids much of the inherent risk involved in the development of these innovations, so that the 'new' knowledge is adapted and adopted more rapidly and cheaply than in the country that developed it.

According to Freeman and Soete (1988), the efficient diffusion of innovations is often much more important for development and growth than being the lead innovator. Leading innovation countries may thus be more prone to the creation of new-to-the-world innovations, while follower countries are more prone to the absorption and later diffusion of these innovations (as long as the required levels of absorptive capacity are in place). This is highly relevant from an innovation policy point of view, a partial objective of which may be to catch up with the leading countries by absorbing innovations from abroad. Closing a gap in an existing trajectory of innovation is normally easier and cheaper than opening up a new innovation path.

We have now calculated the differences in the productivities of the EU innovation systems (in one of many possible ways). We have also provided some possible explanations for the fact that many less developed EU countries with fairly underdeveloped national innovation systems score very high with regard to productivity.

Some countries with comprehensive and well-developed innovation systems, and high innovation inputs, score low on the productivity of their innovation systems according to our analysis. Examples are the United Kingdom, Sweden, Finland, the Netherlands, France and Denmark. This may be because these countries are spending too much on innovation inputs, or because these innovation efforts are not well balanced. It might also be because the input resources are not used in an efficient way. A question that may be asked is, for example, whether Sweden (or any of these countries) could or should try to increase the absorption of innovations from abroad. There are therefore reasons for policy analysts in these countries to further analyze this question – with the intention of changing innovation policies if the analysis shows that this is called for.

The notion of optimality is irrelevant in an innovation context, and we cannot specify an optimal or ideal innovation system. The only way to identify problems that should be the subject of innovation policy is therefore to compare innovation systems with each other (Edquist, 2012). Unfortunately, the ranking of the efficiency of the EU28 innovation systems in Annex 3 is practically useless from the point of view of innovation policy development. There are no reasons whatsoever to benchmark Sweden's national innovation system against those of Greece, Bulgaria, Romania, and Cyprus in attempting to develop an analysis to form a basis for innovation policy changes.[40] Such comparisons should, instead, be made between innovation systems that are more similar in a structural sense and at a similar level of development. One could try out such comparisons, however, between countries that are at similar development stages, that are of similar size, that score similarly on innovation output or innovation input, etc.[41]

Let us, very briefly, mention some countries which have similar innovation inputs and that also manage to achieve a higher innovation output than Sweden. All these countries are also ranked above Sweden with regard to the efficiency of their innovation systems (see Annex 3). Such countries are the Netherlands, Belgium, Luxembourg, Denmark, Germany and France. These countries could serve as benchmarks for Sweden in developing its innovation system through innovation policy, since their structures (i.e., industrial, administrative and political systems) are rather similar to those in Sweden (see Table 6). If we focus on inputs, it can be observed that Denmark and Germany have a level of investment similar to that in Sweden. On the output side, Belgium, France and the Netherlands have output levels similar to those of Sweden.

[40] This argument is also valid for the GII "Innovation Efficiency Ratio" discussed in section 4.3.
[41] As Navarro et al. (2009) illustrate, to foster learning in policy-making processes and to derive sensible policy conclusions, countries need to be compared with others with similar characteristics.


Table 6. Possible benchmarks for the Swedish innovation system (values for 2014, with 2015 in brackets; SII values for 2013, with 2014 in brackets)

Luxembourg – Output: 0.754 [0.772]; Input: 0.461 [0.225]; Productivity of innovation system: 1.63 [3.43]; Ranking in terms of productivity: 7 [2]; Summary Innovation Index (SII): 0.646 [0.642]; Ranking according to the SII: 5 [6].
Germany – Output: 0.859 [0.723]; Input: 0.631 [0.718]; Productivity: 1.36 [1.007]; Productivity ranking: 11 [21]; SII: 0.709 [0.676]; SII ranking: 3 [4].
Belgium – Output: 0.603 [0.566]; Input: 0.507 [0.542]; Productivity: 1.19 [1.043]; Productivity ranking: 15 [19]; SII: 0.627 [0.619]; SII ranking: 7 [9].
Denmark – Output: 0.701 [0.728]; Input: 0.630 [0.672]; Productivity: 1.11 [1.084]; Productivity ranking: 17 [18]; SII: 0.728 [0.736]; SII ranking: 2 [2].
France – Output: 0.520 [0.543]; Input: 0.479 [0.487]; Productivity: 1.09 [1.116]; Productivity ranking: 18 [17]; SII: 0.571 [0.591]; SII ranking: 11 [10].
Netherlands – Output: 0.538 [0.570]; Input: 0.543 [0.437]; Productivity: 0.99 [1.304]; Productivity ranking: 19 [11]; SII: 0.629 [0.647]; SII ranking: 6 [5].
Sweden – Output: 0.575 [0.620]; Input: 0.698 [0.775]; Productivity: 0.82 [0.80]; Productivity ranking: 24 [25]; SII: 0.750 [0.74]; SII ranking: 1 [1].

Source: own elaboration from European Union (2014 and 2015).

Of these countries, we believe that it would be particularly interesting for Sweden to be compared to the German national system of innovation. Such a comparison should include a detailed analysis of innovation output as well as of the determinants of this performance, its consequences, and the productivity of the innovation systems. It should include quantitative analyses based on indicators, but also qualitative analyses of institutions and organizations in the innovation systems.[42]

One conclusion of this as regards Swedish innovation policy is that considerable efforts should be made to identify the sources of the inefficiencies in the Swedish national system of innovation. In addition, existing instruments and mechanisms for innovation policy should be used, and new ones created, to overcome the inefficiencies identified. This means breaking with the linear model of innovation that still dominates innovation policy in Sweden, a model that is still applied despite the fact that it has been completely rejected in innovation research (Edquist, 2014a). In its place a holistic innovation policy should be developed – one that takes into account all the determinants (driving forces as well as obstacles) of innovations (Edquist, 2011).[43]

[42] We have previously carried out such comparative analyses of ten small national systems of innovation in Edquist and Hommen (2009).
[43] How such a holistic policy could be developed is outlined in Edquist (2014b, 2014c).

5. Conclusions

The Innovation Union Scoreboard (IUS) has, for many years, highlighted Sweden as one of the innovation leaders in Europe, with a high "innovation performance" (e.g., European Union 2013, 2014, 2015). For several editions of the IUS, Sweden has been ranked number one with regard to "EU Member States' Innovation Performance". However, the IUS does not provide any conceptual or theoretical underpinning for what is meant by "innovation performance", nor is there any discussion of the 25 specific indicators used to calculate such performance or the relations among them. Neither does the IUS provide any discussion about the relations between the inputs and outputs in innovation systems. Hence, it can be concluded that the IUS does not attempt to measure the productivity of innovation systems at all.

In this study we have questioned whether Sweden holds the top position within the EU with regard to "innovation performance" in any meaningful sense of this term. From our point of view, the "innovation performance" of an innovation system should be understood from two perspectives: (i) the production of innovation outputs; and (ii) the productivity/efficiency of the system as a whole.

On this basis, we have first identified a set of indicators that can be used to measure the input and output sides of the national innovation system. Second, we have related the levels of inputs and outputs to each other in order to reach a conclusion about the performance of the Swedish innovation system in terms of its efficiency or productivity in relation to the other EU Member States. Using the data presented in the IUS 2014 and 2015 reports (which, in most cases, refer to the years 2012 and 2013), we divide the analysis into innovation inputs (i.e., four input indicators) and outputs (i.e., eight output indicators). We have exclusively used the data from the IUS 2014 and 2015. It has not been our purpose to evaluate the quality of the IUS data, but rather to assess the methodology used in it by comparing the results of the IUS analysis to those achieved using our approach.

The IUS assigns Sweden the top position (ranked number 1 of the 28 European Member States) in terms of what it calls "EU Member States' Innovation Performance". Our analysis demonstrates that the results based on an alternative methodology that separates inputs and outputs provide a quite different picture. Based on the results obtained applying our approach, it can be questioned whether Sweden can be regarded as the innovation leader of Europe. Based on the data provided by the IUS, our analysis shows that Sweden is number 1 (within the EU28) in terms of innovation input, number 10 in terms of innovation output for year 2014 (number 4 for year 2015), and number 24 out of 28 for year 2014 (number 25 for year 2015) with regard to the efficiency or productivity of its innovation system.

Hence, we have shown that many countries which devote fewer resources to innovation than Sweden achieve outstanding levels of efficiency. We have also shown that a country with a comprehensive innovation system such as Sweden's does not necessarily show efficiency levels commensurate with its innovation efforts (i.e., inputs).

We very much agree with Foray and Hollanders (2015) that the statistical information provided by the IUS needs to be supplemented with other, more contextual and qualitative information regarding the innovation system under study. Following Frane (2014), a goal of this paper was to analyze and discuss the misinterpretations that the IUS is making of the data, and therefore we have not engaged in the elaboration of this more contextual and qualitative information. However, Edquist and Hommen (2009) have studied details of the structural characteristics of the Swedish innovation system (and nine more small innovation systems in Asia and Europe).

In summary, the approach applied in the IUS offers an incorrect interpretation of the actual state of the EU national innovation systems. The lack of conceptual and theoretical discussion in the IUS approach is a major explanation of the flaws in the interpretation of the results and in the Summary Innovation Index included in its reports. Dealing with these flaws is particularly relevant since they could lead to faulty and ineffective (innovation) policy decisions.

From our point of view, the individual indicators that constitute the composite indicator called the Summary Innovation Index need to be analyzed in much more depth in order to reach a correct measure of the performance of an innovation system (Grupp and Mogee, 2004; Archibugi et al., 2009; Zabala-Iturriagagoitia et al., 2007b). In addition, policy makers need to consider the results of different and complementary analyses to obtain a correct picture of their respective innovation systems (Hagedoorn and Cloodt, 2003). In our view, the combination of several partial views will provide a clearer picture than that provided by each in isolation.

Acknowledgements

An earlier version of this paper, only referring to the data from the IUS 2014, was published as a CIRCLE Paper in Innovation Studies 2015/16, under the title "The Innovation Union Scoreboard is Flawed: The case of Sweden – not being the innovation leader of the EU". A version of this paper has also been presented at the EU-SPRI 2015 conference in Helsinki, 10-12 June 2015. The authors are grateful to Jordi Molas, Hannes Toivanen, Jakob Edler, Nick Vonortas, Olof Ejermo, Martin Andersson, Sylvia Schwaag Serger and Lennart Stenberg for their comments on the earlier version. Finally, we are very grateful to Michael Nauruschat for his assistance with the data collection during the early stages of the work with this paper. Jon Mikel Zabala-Iturriagagoitia wishes to acknowledge the financial support from the Basque Government Department of Education, Language Policy and Culture, and Charles Edquist would like to acknowledge support from the Rausing Foundation.

References

Andersson, T., O. Asplund, and M. Henrekson (2002). Betydelsen av innovationssystem: utmaningar för samhället och för politiken. Stockholm: VINNOVA. (In Swedish)

Archibugi, D., Denni, M. and Filipetti, A. (2009). The technological capabilities of nations: The state of the art of synthetic indicators. Technological Forecasting and Social Change, 76(7), 917-931.

Arundel, A. and Hollanders, H. (2008). Innovation scoreboards: indicators and policy use. In Nauwelaers, C. and Wintjes, R. (Eds.), Innovation Policy in Europe: Measurement and Strategy. Edward Elgar, Cheltenham, pp. 29-52.

Arundel, A., Bordoy, C., Mohnen, P. and Smith, K. (2008). Innovation surveys and policy: lessons from the CIS. In Nauwelaers, C. and Wintjes, R. (Eds.), Innovation Policy in Europe: Measurement and Strategy. Edward Elgar, Cheltenham, pp. 3-28.

Audretsch, D.B. (2009). The entrepreneurial society. Journal of Technology Transfer, 34, 245-254.

Baltic Development Forum (2012). State of the Region Report 2012. The Top of Europe Bracing Itself for Difficult Times: Baltic Sea Region-Collaboration to Sustain Growth. Baltic Development Forum 2012.

Brenner, T. (2014). Science, Innovation and National Growth. Working Papers on Innovation and Space 03.14. Philipps Universität Marburg, Germany.

Brenner, T. and Broekel, T. (2009). Methodological issues in measuring innovation performance of spatial units. Papers in Evolutionary Economic Geography 09.04, Utrecht University.

Cornell University, INSEAD, and WIPO (2014). The Global Innovation Index 2014: The Human Factor in Innovation. Fontainebleau, Ithaca and Geneva.

Edquist, C. (2005). Systems of Innovation: Perspectives and Challenges. In Fagerberg, J., Mowery, D., and Nelson, R. (Eds.), Oxford Handbook of Innovation. Oxford University Press, Oxford, pp. 181-208.

Edquist, C. (2011). Design of innovation policy through diagnostic analysis: Identification of systemic problems (or failures). Industrial and Corporate Change, 20(6), 1725-1753.

Edquist, C. (2014a). Striving towards a Holistic Innovation Policy in European countries – But linearity still prevails! STI Policy Review, 5(2), 1-19.

Edquist, C. (2014b). En helhetlig innovationspolitik – varför, vad och hur? (A holistic innovation policy – why, what and how?). In Wanger, M. Ö. (Ed.), Position Sverige – Om innovation, hållbarhet och arbetsmarknad – en debattantologi (Position Sweden – On innovation, sustainability and labour market – A debate anthology – in Swedish) (pp. 59-80). Stockholm, Sweden: Ekerlids Förlag.

Edquist, C. (2014c). Efficiency of Research and Innovation Systems for economic growth and employment – Report for the European Research and Innovation Area Committee (CIRCLE Working Paper 2014/08). Lund University, Sweden: Centre for Innovation, Research and Competence in the Learning Economy.

Edquist, C. and Hommen, L. (Eds.) (2009). Small Country Innovation Systems: Globalization, Change, and Policy in Asia and Europe. Edward Elgar, Cheltenham.

Edquist, C., Hommen, L., and McKelvey, M. (2001). Innovation and Employment: Process versus Product Innovation. Edward Elgar Publishing, Cheltenham.

Edquist, C. and McKelvey, M. (1998). High R&D Intensity Without High Tech Products: A Swedish Paradox? In Nielsen, K. and Johnson, B. (Eds.), Institutions and Economic Change: New Perspectives on Markets, Firms and Technology. Edward Elgar Publishing Ltd, Cheltenham, pp. 131-149.

Edquist, C. and Zabala-Iturriagagoitia, J.M. (2015). The Innovation Union Scoreboard is Flawed: The case of Sweden – not being the innovation leader of the EU. CIRCLE Papers in Innovation Studies 2015/16. CIRCLE, Lund University.

Edquist, C. and Zabala-Iturriagagoitia, J.M. (2012). Public Procurement for Innovation as mission-oriented innovation policy. Research Policy, 41(10), http://dx.doi.org/10.1016/j.respol.2012.04.022.

Edquist, C., Vonortas, N.S., Zabala-Iturriagagoitia, J.M., and Edler, J. (Eds.) (2015). Public Procurement for Innovation. Edward Elgar Publishing, Cheltenham, 304 pp.

Ejermo, O. and Kander, A. (2011). Swedish business research productivity. Industrial and Corporate Change, 20(4), 1081-1118.

Ejermo, O., Kander, A., and Svensson Henning, M. (2011). The R&D-growth paradox arises in fast-growing sectors. Research Policy, 40, 664-672.

European Commission (2011). Innovation Union Scoreboard 2010. The Innovation Union's performance scoreboard for Research and Innovation. 1 February 2011.

European Commission (2013a). State of the Innovation Union 2012 – Accelerating change. European Commission, Directorate General for Research and Innovation, Brussels, 21.3.2013. COM(2013) 149 final.

European Commission (2013b). Research and Innovation Performance in EU Member States and Associated Countries. Innovation Union progress at country level 2013. Publications Office of the European Union, 2013.

European Union (2013). Innovation Union Scoreboard 2013. Brussels: European Commission.

European Union (2014). Innovation Union Scoreboard 2014. Brussels: European Commission.

European Union (2015). Innovation Union Scoreboard 2015. Brussels: European Commission.

Evangelista, R., Sandven, T., Sirilli, G., and Smith, K. (1998). Measuring Innovation in European Industry. International Journal of the Economics of Business, 5(3), 311-333.

Foray, D. and Hollanders, H. (2015). An assessment of the Innovation Union Scoreboard as a tool to analyse national innovation capacities: The case of Switzerland. Research Evaluation, 24, 213-228.

Frane, A. (2014). Measuring National Innovation Performance: The Innovation Union Scoreboard Revisited. Heidelberg, Springer.

Freeman, C. and Soete, L. (1988). Innovation Diffusion and Employment Policies. Ricerche Economiche, 40, 836-854.

Furman, J.L., Porter, M.E. and Stern, S. (2002). The determinants of national innovative capacity. Research Policy, 31, 899-933.

Gómez-Uranga, M., Miguel, J.C., and Zabala-Iturriagagoitia, J.M. (2014). Epigenetic economic dynamics: The evolution of big internet business ecosystems, evidence for patents. Technovation, 34(3), 177-189.

Granberg, A. and Jacobsson, S. (2006). Myths or reality – a scrutiny of dominant beliefs in the Swedish science policy debate. Science and Public Policy, 33(5), 321-340.

Grupp, H. and Mogee, M.E. (2004). Indicators for national science and technology policy: how robust are composite indicators? Research Policy, 33, 1373-1384.

Grupp, H. and Schubert, T. (2010). Review and new evidence on composite innovation indicators for evaluating national performance. Research Policy, 39, 67-78.

Hagedoorn, J. and Cloodt, M. (2003). Measuring innovative performance: is there an advantage in using multiple indicators? Research Policy, 32(8), 1365-1379.

Heidenreich, M. (2009). Innovation patterns and location of European low- and medium-technology industries. Research Policy, 38, 483-494.

Hollanders, H. and Tarantola, S. (2011). Innovation Union Scoreboard 2010 – Methodology report. January 2011.

IUS (2011). Innovation Union Scoreboard 2010 – Methodology report. Written by Hollanders, H. and Tarantola, S. European Commission 2011.

Jacobsson, S. and Rickne, A. (2004). How large is the Swedish academic sector really? A critical analysis of the use of science and technology indicators. Research Policy, 33, 1355-1372.

Laursen, K. and Salter, A. (2006). Open for Innovation: the role of openness in explaining innovation performance among U.K. manufacturing firms. Strategic Management Journal, 27, 131-150.

Mahroum, S. and Al-Saleh, Y. (2013). Towards a functional framework for measuring national innovation efficacy. Technovation, 33(10-11), 320-332.

Mairesse, J. and Mohnen, P. (2002). Accounting for Innovation and Measuring Innovativeness: An Illustrative Framework and an Application. American Economic Review, 92(2), 226-230.

Makkonen, T. and van der Have, R.P. (2013). Benchmarking regional innovative performance: composite measures and direct innovation counts. Scientometrics, 94, 247-262.

Navarro, M., Gibaja, J.J., Bilbao-Osorio, B. and Aguado, R. (2009). Patterns of innovation in EU-25 regions: a typology and policy recommendations. Environment and Planning C: Government and Policy, 27(5), 815-840.

OECD (2005). Oslo Manual. Guidelines for Collecting and Interpreting Innovation Data. Third Edition. OECD/European Commission 2005.

Samara, E., Georgiadis, P. and Bakouros, I. (2012). The impact of innovation policies on the performance of national innovation systems: A system dynamics analysis. Technovation, 32(11), 624-638.

Zabala-Iturriagagoitia, J.M., Voigt, P., Gutiérrez-Gracia, A. and Jiménez-Sáez, F. (2007a). Regional innovation systems: how to assess performance. Regional Studies, 41(5), 661-672.

Zabala-Iturriagagoitia, J.M., Jiménez-Sáez, F., Castro-Martínez, E. and Gutiérrez-Gracia, A. (2007b). What indicators do (or do not) tell us about Regional Innovation Systems. Scientometrics, 70(1), 85-106.


Annex 1: Innovation outputs of the national innovation systems of the EU28 member states

Year 2014. Latest data year per indicator: 2.2.1 – 2010; 2.3.3 – 2012; 2.3.4 – 2012; 3.1.1 – 2011; 3.1.2 – 2010; 3.2.2 – 2012; 3.2.3 – 2011; 3.2.4 – 2010.
Columns (each score is followed by its ranking): Country; SII 2013; 2.2.1; 2.3.3; 2.3.4; 3.1.1; 3.1.2; 3.2.2; 3.2.3; 3.2.4; Innovation Output.

Germany 0,709 3 0,933 1 0,595 6 0,884 4 1,000 1 1,000 1 0,930 1 0,790 5 0,742 4 0,859 1

Luxembourg 0,646 5 0,806 4 1,000 1 1,000 1 0,792 3 0,960 2 0,285 25 1,000 2 0,241 22 0,754 2

Denmark 0,728 2 0,813 3 0,561 8 0,971 3 0,649 11 0,616 8 0,336 24 0,959 3 0,704 7 0,701 3

Cyprus 0,501 14 0,833 2 1,000 2 0,605 6 0,493 14 0,494 15 0,606 12 0,564 8 0,687 10 0,660 4

Austria 0,599 10 0,692 9 0,756 4 1,000 2 0,662 10 0,609 9 0,661 9 0,225 22 0,494 16 0,637 5

Belgium 0,627 7 0,786 5 0,398 14 0,515 12 0,848 2 0,596 11 0,601 13 0,553 9 0,525 14 0,603 6

Italy 0,443 15 0,650 10 0,396 16 0,743 5 0,608 12 0,624 6 0,721 5 0,291 19 0,697 8 0,591 7

Finland 0,684 4 0,607 13 0,497 11 0,569 9 0,721 9 0,535 13 0,552 16 0,421 12 0,727 5 0,579 8

Ireland 0,606 9 0,758 7 0,409 13 0,152 23 0,738 8 0,667 5 0,587 14 1,000 1 0,314 20 0,578 9

Sweden 0,750 1 0,729 8 0,573 7 0,574 8 0,781 4 0,605 10 0,579 15 0,510 10 0,248 21 0,575 10

Portugal 0,410 18 0,632 11 0,364 17 0,600 7 0,739 6 0,717 4 0,481 21 0,336 15 0,659 11 0,566 11

Estonia 0,502 13 0,617 12 0,678 5 0,521 11 0,739 7 0,473 17 0,355 23 0,448 11 0,521 15 0,544 12

Netherlands 0,629 6 0,767 6 0,541 9 0,514 13 0,749 5 0,493 16 0,535 18 0,313 17 0,392 19 0,538 13

France 0,571 11 0,519 15 0,308 19 0,441 15 0,445 16 0,619 7 0,741 4 0,400 13 0,689 9 0,520 14

Greece 0,384 19 0,594 14 0,147 27 0,052 27 0,551 13 0,801 3 0,238 28 0,744 6 1,000 1 0,516 15

Czech Republic 0,422 16 0,445 16 0,290 20 0,486 14 0,453 15 0,583 12 0,672 8 0,320 16 0,725 6 0,497 16

Spain 0,414 17 0,306 19 0,537 10 0,416 17 0,340 20 0,296 21 0,650 11 0,186 24 0,982 3 0,464 17

Slovenia 0,513 12 n/a - 0,312 18 0,423 16 0,443 17 0,509 14 0,802 2 0,181 25 0,406 17 0,440 18


Slovakia 0,328 21 0,300 20 0,196 24 0,183 22 0,293 21 0,286 22 0,677 7 0,194 23 1,000 2 0,391 19

Malta 0,319 22 0,318 18 1,000 3 0,246 21 0,360 19 0,365 19 0,655 10 0,000 28 0,182 25 0,391 20

UK 0,613 8 n/a - 0,419 12 0,352 19 0,184 23 0,358 20 0,694 6 0,889 4 0,174 26 0,384 21

Romania 0,237 26 0,000 26 0,171 25 0,070 26 0,000 28 0,249 24 0,512 20 0,605 7 0,658 12 0,283 22

Croatia 0,306 23 0,388 17 0,035 28 0,000 28 0,393 18 0,385 18 0,542 17 0,109 26 0,398 18 0,281 23

Hungary 0,351 20 0,018 24 0,161 26 0,104 25 0,082 24 0,180 26 0,756 3 0,268 20 0,616 13 0,273 24

Poland 0,279 25 0,016 25 0,238 23 0,567 10 0,027 27 0,129 27 0,521 19 0,304 18 0,223 23 0,253 25

Bulgaria 0,188 28 0,060 23 0,398 15 0,379 18 0,078 25 0,051 28 0,247 27 0,254 21 0,193 24 0,207 26

Lithuania 0,289 24 0,133 21 0,248 22 0,107 24 0,187 22 0,267 23 0,454 22 0,024 27 0,128 27 0,193 27

Latvia 0,221 27 0,100 22 0,261 21 0,260 20 0,059 26 0,187 25 0,263 26 0,385 14 0,000 28 0,190 28

Source: Own elaboration from European Union (2014).
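The innovation output composite in the last column can be reproduced from the eight normalized indicator scores: for countries with complete data it corresponds to the unweighted arithmetic mean of indicators 2.2.1, 2.3.3, 2.3.4, 3.1.1, 3.1.2, 3.2.2, 3.2.3 and 3.2.4. The short Python sketch below is our own illustration of this averaging (the function name is ours, not part of the Scoreboard methodology); it reproduces the published composites for most countries, for example Germany and Denmark in the Year 2014 table.

# Minimal sketch (our own reconstruction): the innovation output composite
# appears to be the unweighted mean of the eight normalized output indicators
# 2.2.1, 2.3.3, 2.3.4, 3.1.1, 3.1.2, 3.2.2, 3.2.3 and 3.2.4.

def output_composite(scores):
    """Unweighted arithmetic mean of the normalized indicator scores."""
    return sum(scores) / len(scores)

# Indicator values as printed in the Year 2014 table above.
germany = [0.933, 0.595, 0.884, 1.000, 1.000, 0.930, 0.790, 0.742]
denmark = [0.813, 0.561, 0.971, 0.649, 0.616, 0.336, 0.959, 0.704]

print(round(output_composite(germany), 3))  # 0.859, matching the table
print(round(output_composite(denmark), 3))  # 0.701, matching the table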

Year 2015

Columns (each value is followed by its EU28 ranking): SII 2014; 2.2.1; 2.3.3; 2.3.4; 3.1.1; 3.1.2; 3.2.2; 3.2.3; 3.2.4; Innovation output and ranking.

Latest data year: 2.2.1: 2012; 2.3.3: 2013; 2.3.4: 2013; 3.1.1: 2012; 3.1.2: 2012; 3.2.2: 2013; 3.2.3: 2012; 3.2.4: 2012.

Luxembourg 0,642 6 0,749 6 1,000 2 1,000 2 0,732 1 0,851 1 0,591 12 1,000 2 0,252 21 0,772 1

Denmark 0,736 2 0,561 11 0,669 7 1,000 1 0,512 11 0,590 10 0,492 18 1,000 3 1,000 1 0,728 2

Germany 0,676 4 0,787 3 0,670 6 0,662 11 0,717 2 0,720 3 0,892 2 0,820 5 0,518 7 0,723 3

Sweden 0,740 1 0,779 4 0,661 8 0,999 3 0,656 6 0,540 12 0,648 9 0,524 11 0,156 24 0,620 4

Ireland 0,628 8 0,792 2 0,581 12 0,251 23 0,554 10 0,797 2 0,549 15 1,000 1 0,326 20 0,606 5

Finland 0,676 3 0,728 8 0,622 11 0,918 4 0,660 5 0,513 13 0,398 22 0,563 8 0,422 13 0,603 6

Austria 0,585 11 0,600 10 0,792 4 0,830 7 0,555 9 0,686 6 0,723 6 0,250 24 0,354 18 0,599 7

Malta 0,397 18 0,521 12 1,000 3 0,868 6 0,467 15 0,653 7 0,705 7 0,124 26 0,371 16 0,589 8

Italy 0,439 16 0,733 7 0,545 15 0,583 13 0,630 7 0,687 5 0,611 11 0,372 18 0,414 14 0,572 9

Netherlands 0,647 5 0,797 1 0,631 10 0,744 10 0,679 4 0,471 17 0,460 21 0,322 20 0,459 9 0,570 10

Belgium 0,619 9 0,753 5 0,542 17 0,516 16 0,713 3 0,505 14 0,529 17 0,546 9 0,426 12 0,566 11

France 0,591 10 0,516 13 0,462 20 0,506 17 0,474 14 0,607 9 0,725 5 0,513 12 0,545 5 0,544 12

Cyprus 0,445 15 0,492 14 1,000 1 0,541 15 0,398 18 0,481 16 0,477 19 0,496 13 0,436 11 0,540 13

Estonia 0,489 13 0,479 15 0,782 5 0,823 8 0,490 12 0,382 18 0,471 20 0,538 10 0,246 22 0,526 14

Slovenia 0,534 12 0,433 18 0,555 14 0,914 5 0,480 13 0,488 15 0,686 8 0,234 25 0,390 15 0,523 15

United Kingdom 0,636 7 n/a - 0,578 13 0,486 18 0,364 19 0,560 11 0,564 14 0,971 4 0,579 4 0,513 16

Portugal 0,403 17 0,654 9 0,529 18 0,439 19 0,617 8 0,643 8 0,335 24 0,374 17 0,489 8 0,510 17

Czech Republic 0,447 14 0,476 16 0,456 21 0,571 14 0,438 16 0,360 20 0,830 4 0,405 15 0,541 6 0,510 18

Greece 0,365 21 0,456 17 0,367 24 0,121 26 0,409 17 0,693 4 0,023 28 0,744 6 0,454 10 0,408 19

Slovakia 0,360 22 0,135 21 0,351 25 0,255 22 0,120 22 0,271 21 0,850 3 0,335 19 0,869 2 0,398 20

Spain 0,385 19 0,149 20 0,653 9 0,398 20 0,139 21 0,188 25 0,532 16 0,312 21 0,590 3 0,370 21

Poland 0,313 24 0,000 27 0,436 22 0,816 9 0,009 26 0,000 28 0,578 13 0,376 16 0,168 23 0,298 22

Hungary 0,369 20 0,012 26 0,340 26 0,169 25 0,003 27 0,251 22 0,899 1 0,290 22 0,348 19 0,289 23

Croatia 0,313 23 0,254 19 0,217 28 0,075 28 0,215 20 0,364 19 0,379 23 0,086 27 0,362 17 0,244 24

Latvia 0,272 26 0,103 22 0,426 23 0,311 21 0,073 24 0,199 24 0,247 26 0,412 14 0,099 26 0,234 25

Bulgaria 0,229 27 0,041 24 0,545 16 0,606 12 0,021 25 0,077 27 0,183 27 0,286 23 0,057 27 0,227 26

Romania 0,204 28 0,012 25 0,285 27 0,098 27 0,000 28 0,088 26 0,616 10 0,658 7 0,029 28 0,223 27

Lithuania 0,283 25 0,102 23 0,473 19 0,234 24 0,082 23 0,247 23 0,261 25 0,026 28 0,123 25 0,194 28

Source: Own elaboration from European Union (2015).

Annex 2: Innovation inputs of the national innovation systems of the EU28 member states

Year 2014

Columns (each value is followed by its EU28 ranking): SII 2013; 1.3.1; 1.3.2; 2.1.1; 2.1.2; Innovation input and ranking.

Latest data year: 1.3.1: 2012; 1.3.2: 2012; 2.1.1: 2012; 2.1.2: 2010.

Sweden 0,750 1 0,979 2 0,503 8 0,991 2 0,319 10 0,698 1

Finland 0,684 4 0,990 1 0,544 3 1,000 1 0,241 18 0,694 2

Germany 0,709 3 0,856 4 0,369 11 0,835 5 0,464 6 0,631 3

Denmark 0,728 2 0,918 3 0,516 7 0,840 4 0,246 17 0,630 4

Estonia 0,502 13 0,794 6 n/a - 0,532 9 0,557 3 0,628 5

UK 0,613 8 0,485 15 0,762 2 0,485 12 n/a - 0,577 6

Slovenia 0,513 12 0,515 13 n/a - 0,926 3 0,272 14 0,571 7

Netherlands 0,629 6 0,825 5 0,523 6 0,519 10 0,306 11 0,543 8

Belgium 0,627 7 0,588 10 0,538 4 0,649 7 0,253 16 0,507 9

Austria 0,599 10 0,773 7 0,192 17 0,835 6 0,150 23 0,488 10

France 0,571 11 0,670 9 0,537 5 0,619 8 0,088 26 0,479 11

Luxembourg 0,646 5 0,371 20 1,000 1 0,424 14 0,050 27 0,461 12

Lithuania 0,289 24 0,546 12 n/a - 0,095 24 0,701 2 0,447 13

Czech Republic 0,422 16 0,763 8 0,037 18 0,429 13 0,350 8 0,395 14

Cyprus 0,501 14 0,216 25 n/a - 0,017 28 0,936 1 0,390 15

Poland 0,279 25 0,443 16 0,392 9 0,134 23 0,551 4 0,380 16

Portugal 0,410 18 0,567 11 0,350 12 0,294 16 0,254 15 0,366 17

Ireland 0,606 9 0,412 17 0,317 13 0,511 11 0,117 24 0,339 18

Spain 0,414 17 0,495 14 0,308 14 0,286 18 0,169 21 0,315 19

Malta 0,319 22 0,206 26 n/a - 0,208 19 0,513 5 0,309 20

Hungary 0,351 20 0,309 23 0,373 10 0,359 15 0,176 20 0,304 21

Italy 0,443 15 0,412 18 0,200 15 0,290 17 0,293 13 0,299 22

Slovakia 0,328 21 0,361 21 n/a - 0,139 21 0,326 9 0,275 23

Croatia 0,306 23 0,289 24 n/a - 0,139 22 0,302 12 0,243 24

Greece 0,384 19 0,330 22 0,014 19 0,095 25 0,379 7 0,205 25

Latvia 0,221 27 0,392 19 n/a - 0,056 26 0,153 22 0,200 26

Romania 0,237 26 0,175 27 0,199 16 0,043 27 0,213 19 0,157 27

Bulgaria 0,188 28 0,113 28 0,000 20 0,160 20 0,106 25 0,095 28

Source: Own elaboration from European Union (2014).
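The innovation input composite in the final column is built in the same way on the input side: the unweighted mean of the four normalized input indicators 1.3.1, 1.3.2, 2.1.1 and 2.1.2, where an indicator reported as n/a is excluded from the average. The Python sketch below is again our own illustration (the function name is ours), using Sweden and Estonia from the Year 2014 table.

# Minimal sketch (our own reconstruction): the innovation input composite
# appears to be the unweighted mean of the available input indicators
# (1.3.1, 1.3.2, 2.1.1, 2.1.2); values reported as n/a are represented
# as None and left out of the average.

def input_composite(scores):
    available = [s for s in scores if s is not None]
    return sum(available) / len(available)

sweden = [0.979, 0.503, 0.991, 0.319]   # all four indicators available
estonia = [0.794, None, 0.532, 0.557]   # 1.3.2 is n/a for Estonia

print(round(input_composite(sweden), 3))   # 0.698, matching the table
print(round(input_composite(estonia), 3))  # 0.628, matching the table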

Year 2015

Columns (each value is followed by its EU28 ranking): SII 2014; 1.3.1; 1.3.2; 2.1.1; 2.1.2; Innovation input and ranking.

Latest data year: 1.3.1: 2013; 1.3.2: 2013; 2.1.1: 2013; 2.1.2: 2012.

Sweden 0,74 1 0,957 3 0,536 7 0,956 2 0,412 10 0,775 1

Germany 0,676 4 0,88 4 0,378 12 0,868 4 0,746 3 0,718 2

Denmark 0,736 2 0,989 1 0,604 3 0,868 3 0,158 23 0,672 3

Finland 0,676 3 0,957 2 0,555 5 1 1 0,163 21 0,669 4

Belgium 0,619 9 0,609 11 0,574 4 0,687 7 0,3 13 0,543 5

Austria 0,585 11 0,793 7 0,229 15 0,841 6 0,212 19 0,519 6

Estonia 0,489 13 0,837 5 n/a - 0,357 14 0,871 1 0,516 7

France 0,591 10 0,674 9 0,548 6 0,626 8 0,161 22 0,487 8

Netherlands 0,647 5 0,772 8 0,497 8 0,493 10 0,047 27 0,437 9

United Kingdom 0,636 7 0,457 15 0,672 2 0,454 11 0,12 25 0,426 10

Czech Republic 0,447 14 0,804 6 0,035 19 0,445 12 0,376 11 0,415 11

Slovenia 0,534 12 0,522 12 n/a - 0,863 5 0,224 18 0,402 12

Malta 0,397 18 0,283 25 n/a - 0,194 19 0,659 4 0,379 13

Hungary 0,369 20 0,304 24 0,395 10 0,423 13 0,357 12 0,37 14

Portugal 0,403 17 0,5 13 0,382 11 0,278 18 0,298 14 0,365 15

Poland 0,313 24 0,38 18 0,349 13 0,159 22 0,559 6 0,362 16

Lithuania 0,283 25 0,63 10 n/a - 0,097 25 0,597 5 0,331 17

Ireland 0,628 8 0,326 22 0,417 9 0,493 9 0,173 20 0,331 18

Croatia 0,313 23 0,304 23 n/a - 0,172 20 0,508 7 0,328 19

Italy 0,439 16 0,446 16 0,21 16 0,286 16 0,279 16 0,305 20

Spain 0,385 19 0,489 14 0,301 14 0,282 17 0,123 24 0,299 21

Latvia 0,272 26 0,326 21 n/a - 0,066 26 0,764 2 0,289 22

Greece 0,365 21 0,402 17 0 20 0,11 24 0,462 8 0,244 23

Slovakia 0,36 22 0,337 20 n/a - 0,159 23 0,415 9 0,228 24

Luxembourg 0,642 6 0,348 19 0,858 1 0,304 15 0,023 28 0,225 25

Cyprus 0,445 15 0,228 26 n/a - 0,022 28 0,283 15 0,133 26

Bulgaria 0,229 27 0,13 28 0,048 18 0,167 21 0,229 17 0,115 27

Romania 0,204 28 0,152 27 0,141 17 0,044 27 0,115 26 0,113 28

Source: Own elaboration from European Union (2015).

Annex 3: The Efficiency of the EU28 Innovation Systems

Year 2014

Columns: Output; Input; Productivity (innovation performance) of the innovation system; Ranking in terms of productivity (innovation performance); Summary Innovation Index (SII) 2013; Ranking according to the SII (2013).

Greece 0.516 0.205 2.52 1 0.384 19

Bulgaria 0.207 0.095 2.19 2 0.188 28

Italy 0.591 0.299 1.98 3 0.443 15

Romania 0.283 0.157 1.80 4 0.237 26

Ireland 0.578 0.339 1.70 5 0.606 9

Cyprus 0.660 0.390 1.69 6 0.501 14

Luxembourg 0.754 0.461 1.63 7 0.646 5

Portugal 0.566 0.366 1.55 8 0.410 18

Spain 0.464 0.315 1.48 9 0.414 17

Slovakia 0.391 0.275 1.42 10 0.318 21

Germany 0.859 0.631 1.36 11 0.709 3

Austria 0.637 0.488 1.31 12 0.599 10

Czech Republic 0.497 0.395 1.26 13 0.422 16

Malta 0.391 0.309 1.26 14 0.319 22

Belgium 0.603 0.507 1.19 15 0.627 7

Croatia 0.281 0.243 1.16 16 0.306 23

Denmark 0.701 0.630 1.11 17 0.728 2

France 0.520 0.479 1.09 18 0.571 11

Netherlands 0.538 0.543 0.99 19 0.629 6

Latvia 0.190 0.200 0.95 20 0.221 27

Hungary 0.273 0.304 0.90 21 0.351 20

Estonia 0.544 0.628 0.87 22 0.502 13

Finland 0.579 0.694 0.83 23 0.684 4

Sweden 0.575 0.698 0.82 24 0.750 1

Slovenia 0.440 0.571 0.77 25 0.513 12

United Kingdom 0.384 0.577 0.67 26 0.613 8

Poland 0.253 0.380 0.67 27 0.279 25

Lithuania 0.193 0.447 0.43 28 0.289 24

Source: Own elaboration from European Union (2014).
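The productivity (innovation performance) measure in this annex is the ratio of the innovation output composite (Annex 1) to the innovation input composite (Annex 2), with member states ranked in descending order of that ratio. The Python sketch below is our own illustration using three countries from the Year 2014 table; dividing the two composites reproduces the published ratios.

# Minimal sketch (our own reconstruction): efficiency = output composite /
# input composite, with countries ranked from the highest to the lowest ratio.
# The output and input values are taken from the Year 2014 table above.

composites = {            # country: (output composite, input composite)
    "Greece": (0.516, 0.205),
    "Germany": (0.859, 0.631),
    "Sweden": (0.575, 0.698),
}

efficiency = {country: out / inp for country, (out, inp) in composites.items()}

for rank, (country, ratio) in enumerate(
        sorted(efficiency.items(), key=lambda item: item[1], reverse=True), start=1):
    print(rank, country, round(ratio, 2))
# 1 Greece 2.52   (1st of the EU28 in the full table)
# 2 Germany 1.36  (11th in the full table)
# 3 Sweden 0.82   (24th in the full table)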

Year 2015

Columns: Output; Input; Productivity (innovation performance) of the innovation system; Ranking in terms of productivity (innovation performance); Summary Innovation Index (SII) 2014; Ranking according to the SII (2014).

Cyprus 0,540 0,133 4,053 1 0,445 15

Luxembourg 0,772 0,225 3,431 2 0,642 6

Romania 0,223 0,113 1,976 3 0,204 28

Bulgaria 0,227 0,115 1,974 4 0,229 27

Italy 0,572 0,305 1,873 5 0,439 16

Ireland 0,606 0,331 1,833 6 0,628 8

Slovakia 0,398 0,228 1,749 7 0,360 22

Greece 0,408 0,244 1,677 8 0,365 21

Malta 0,589 0,379 1,554 9 0,397 18

Portugal 0,510 0,365 1,399 10 0,403 17

Netherlands 0,570 0,437 1,304 11 0,647 5

Slovenia 0,523 0,402 1,299 12 0,534 12

Spain 0,370 0,299 1,239 13 0,385 19

Czech Republic 0,510 0,415 1,228 14 0,447 14

United Kingdom 0,513 0,426 1,204 15 0,636 7

Austria 0,599 0,519 1,154 16 0,585 11

France 0,544 0,487 1,116 17 0,591 10

Denmark 0,728 0,672 1,084 18 0,736 2

Belgium 0,566 0,543 1,044 19 0,619 9

Estonia 0,526 0,516 1,020 20 0,489 13

Germany 0,723 0,718 1,007 21 0,676 4

Finland 0,603 0,669 0,902 22 0,676 3

Poland 0,298 0,362 0,823 23 0,313 24

Latvia 0,234 0,289 0,809 24 0,272 26

Sweden 0,620 0,775 0,800 25 0,740 1

Hungary 0,289 0,370 0,782 26 0,369 20

Croatia 0,244 0,328 0,744 27 0,313 23

Lithuania 0,194 0,331 0,585 28 0,283 25

Source: Own elaboration from European Union (2015).