Transcript of: Introduction to Design Science Methodology

Introduction to Design Science Methodology

Prof. Dr. Roel Wieringa

University of Twente

The Netherlands

10 July 2014 eBISS Summer School 1


Outline

• Design science

• Theories

• Methods

– Empirical research setup

– Patterns of reasoning


• R.J. Wieringa. Design Science Methodology for Information Systems and Software Engineering. Springer, 2014.

• More information at http://wwwhome.ewi.utwente.nl/~roelw/


• Design science is the design and investigation of artifacts in context


Subjects of design science

• Artifact (something to be designed): SW component/system, HW component/system, organization, business process, service, method, technique, conceptual structure, ...

• Problem context (something to be influenced): SW components & systems, HW components & systems, organizations, business processes, services, methods, techniques, conceptual structures, people, values, desires, fears, goals, norms, budgets, ...

• Design science studies the interaction between the artifact and its problem context.


Research problems in design science

• Design problems: to design an artifact to improve a problem context.
  – "Design a DoA estimation system for satellite TV reception in a car."
  – "Design a multi‐agent aircraft taxi‐route planning system for use on airports."
  – "Design an assurance method for data location compliance for CSPs."

• Knowledge questions: to answer knowledge questions about the artifact in context.
  – "Is the DoA estimation accurate enough?"
  – "Is this agent routing algorithm deadlock‐free?"
  – "Is the method usable and useful for cloud service providers?"

• The design researcher iterates over these two activities: design produces problems and artifacts to investigate; investigation produces knowledge and new design problems.


Discussion

• What is the research problem that you are working on?


Design problems


Template for design problems

• Improve <problem context>
• by <treating it with a (re)designed artifact>
• such that <artifact requirements>
• in order to <stakeholder goals>

Example:

• Improve my body / mind health
• by taking a medicine
• such that it relieves my headache
• in order for me to get back to work

In this template, "Improve <problem context>" and "in order to <stakeholder goals>" identify the problem context and the stakeholder goals; "by <treating it with a (re)designed artifact>" and "such that <artifact requirements>" specify the artifact and its desired interactions.


• Design problems are usually not considered to be research problems.

• Instead, they are stated in the form of questions:
  – How to plan aircraft taxi routes dynamically?
  – Is it possible to plan aircraft routes dynamically?
  – Etc.

• And they are called "technical research questions".

• This way, stakeholders, goals, and requirements stay out of the picture!


Discussion

• What is your top‐level design problem?


The engineering cycle


(Checklist for solving design problems)


Engineering cycle

• Problem investigation (= implementation evaluation): Stakeholders? Goals? Conceptual problem framework? Phenomena? Causes? Effects? Do the effects contribute to the goals?

• Treatment design: Specify requirements! Do the requirements contribute to the goals? Available treatments? Design new ones!

• Design validation: Context & Artifact → Effects? Do the effects satisfy the requirements? Trade‐offs for different artifacts? Sensitivity for different contexts?

• Design implementation

Legend: items ending in "?" are knowledge questions; items ending in "!" are actions.


Implementation (transfer to problem context) is not part of research



• Research projects may focus on

– Implementation evaluation

– Problem investigation

– Treatment design and validation


Research problems in design science



Knowledge questions

• Descriptive questions (journalistic questions): What happened? When? Where? What components were involved? Who was involved? Etc.

• Explanatory questions (research questions): Why?
  – What has caused the phenomena?
  – Which mechanisms produced the phenomena?
  – For what reasons did people do this?


Effect questions

• Central effect question: Context × Artifact → Effects?

• Generalizations:
  – Trade‐off question: Context × Alternative artifact → Effects?
  – Sensitivity question: Other context × Artifact → Effects?

• These may be descriptive or explanatory questions.


Contribution questions

• Central contribution question: Do the effects contribute to the stakeholder goals?

• Preliminary questions:
  – Stakeholder question: Who are the stakeholders?
  – Goal question: What are their goals?

• In academic research projects, the answers to these questions may be speculative: projects range from utility‐driven to curiosity‐driven.


Example knowledge questions

• Effect: What is the execution time of the DoA algorithm? What is its accuracy?

• Trade‐off: comparison between algorithms on these two variables; comparison between versions of one algorithm.

• Sensitivity: What are the assumptions about car speed? About the processor?

• Stakeholders: Who are affected by the DoA algorithm?

• Goals: What are their goals?

• Contribution evaluation (after the DoA algorithm is in use): How well does the DoA algorithm contribute to these goals?


Discussion

• Which knowledge questions do you have?

– Effect questions

– Trade‐off

– Sensitivity

– Satisfaction of requirements

– Contribution to stakeholder goals


Outline

• Design science

– Design problems

– Engineering cycle

– Knowledge questions

• Theories

• Methods

– Empirical research setup

– Patterns of reasoning


• To answer a knowledge question, you may have to:
  – read the scientific literature,
  – read the professional literature,
  – ask experts,
  – do original research (if this is scientific research, it is very expensive).


Empirical research

• The goal of empirical research is to develop, test, or refine theories.

• Inputs: knowledge questions and prior beliefs (theories, specifications, experiences, lessons learned). Output: posterior beliefs, i.e. updated theories, specifications, experiences, and lessons learned.


• A theory is a belief that there is a pattern in phenomena. This includes speculations, opinions, ideologies, …

• A scientific theory is a theory that
  – has survived tests against experience, and
  – has survived criticism by critical peers.

• All theories about the real world are fallible.


The structure of scientific theories

1. Conceptual framework. E.g. the concepts of beamforming, of multi‐agent planning, of data location compliance.

2. Generalizations stated in terms of these concepts, that express beliefs about patterns in phenomena. E.g. the relation between angle of incidence and phase difference.

3. Scope of the generalizations: a population, or a similarity relation. These are assumptions about the phenomena to which the relation is applicable: plane waves, narrow bandwidth, etc.


The structure of design theories

1. Conceptual framework to specify the artifact and describe the context.

2. Generalizations of the form Artifact × Context → Effects.

3. The scope:
  – constraints on the artifact design,
  – assumptions about the context.


Variables

• Conceptual frameworks may define variables.

• Variables have data types and scales.

• Generalizations are stated in terms of variables.

• Examples:
  – DoA performance graphs relating noise, angle of incidence, and accuracy of estimation.
  – DoA analytical generalization: a change in angle of incidence causes a change in phase difference.
  – Software engineering empirical generalization: introduction of agile development causes customer satisfaction to increase.
  – Software engineering laboratory generalization: programmer productivity correlates well with conscientiousness.


Architectures

• Conceptual frameworks may define an architecture for phenomena, in terms of components and relationships.

• Components have capabilities.

• Generalizations can be stated in terms of capabilities of components and of interactions among components (mechanisms).

• Examples:
  – DoA mechanistic theory: e.g. the input‐output relation is explained by the components and structure of the algorithm.
  – A mechanism observed in agile development: in agile development for an SME, the SME does not put a customer on‐site; SME resources are limited and the focus is on business.
  – A mechanism observed in requirements engineering: introduction of a change control board reduces requirements creep.


Functions of scientific theories


• To analyze a conceptual structure

• To describe phenomena (descriptive statistics, interpretation)

• To explain phenomena (the classical function of theories)

• To predict phenomena (important for design)

• To design an artifact by which to treat a problem


Causal explanations (cause‐effect relations between variables)

• If Y has been caused by X, then Y changed because X changed earlier in a particular way.

• Examples:
  – The light is on because the switch was turned.
  – Cost increased because the organization had to perform additional tasks.

• Causation may be nondeterministic:
  – Forward nondeterminism: X sometimes causes Y.
  – Backward nondeterminism: Y is sometimes caused by X.

• In the field, the causal influence of X on Y may be swamped by many other causal influences (lab research versus field research).


Architectural explanations (interactions among components)

• If system phenomenon E was produced by the interaction of system components C1, …, Cn, then C1, …, Cn is called a mechanistic explanation of E.

• Examples:
  – The light is on because it is connected to the electricity supply when the switch is turned on.
  – Cost increased because new people had to be hired to perform additional tasks.

• Mechanisms may be nondeterministic.

• They may be interfered with by other mechanisms in the field.


Checklist for empirical research

(the empirical cycle)


Checklist questions about the research context

1. Improvement goal?
2. Knowledge goal?
3. Current knowledge?

(Questions 4–16 form the empirical cycle; the context questions above and below belong to the engineering cycle.)

17. Contribution to knowledge goal?
18. Contribution to improvement goal?


Empirical cycle

• Research problem analysis: 4. Conceptual framework? 5. Research questions? 6. Population?

• Research & inference design: 7. Object of study? 8. Treatment specification? 9. Measurement specification? 10. Inference?

• Design validation: 7. Object of study validation? 8. Treatment specification validation? 9. Measurement specification validation? 10. Inference validation?

• Research execution: 11. What happened?

• Data analysis: 12. Data? 13. Observations? 14. Explanations? 15. Generalizations? 16. Answers?

• The cycle may produce a new research problem.


Outline

• Design science

• Theories

– Structure: Conceptual framework, generalizations

– Functions: explanation etc.

– Empirical cycle

• Methods

– Empirical research setup

– Patterns of reasoning


The empirical research setup

• The researcher wants to answer a question about a population.

• He or she selects a sample of objects of study (OoSs) that represent population elements.

• In experimental research, s/he treats some or all OoSs in the sample.

• S/he measures phenomena in the OoSs.


• Observational versus experimental setup

• Case‐based versus sample‐based research


Observational setup

• The researcher wants to answer a question about a population.
  – E.g. How is the UML used? (about all SE projects that use UML)
  – What are the causes of project failure? (about all IS development projects)

• He or she selects a sample of objects of study (OoSs) that represent population elements:
  – all projects in a company,
  – some projects in some companies,
  – one project in some company.

• S/he measures phenomena in the OoSs:
  – modeling effort, model correctness,
  – using as instruments primary documents, interviews, questionnaires, email logs, UML models, …


Experimental setup

• The researcher wants to answer a question about a population.
  – E.g. What is the effect of using UML? (about all SE projects that use UML)

• He or she selects a sample of objects of study (OoSs) that represent population elements:
  – some projects in some companies,
  – one project in some company.

• S/he treats some or all OoSs in the sample:
  – e.g. by asking some projects to use the UML.

• S/he measures phenomena in the OoSs: modeling effort, model correctness, using similar instruments as before.


Case‐based research

• Observational or experimental.

• Study one OoS at a time: the sample is studied in series, with an analysis in between two case studies.
  – What is the effect of using the UML?
  – How is the UML used?
  – Which architecture does the case have? (e.g. actors, documents, artifacts)
  – Which mechanisms take place? (interactions, communications, coordination)


Sample‐based research

• Observational or experimental.

• Study samples of OoSs as a whole.

• Sample statistics are used to derive estimates of statistical population parameters.
  – What is the effect of using UML?
  – What is the average modelling effort, compared to the modelling effort of other projects of similar size?


Research setups and research methods:

• Observational setup (no treatment):
  – Case‐based: observational case study (study the structure and mechanisms of a single case).
  – Sample‐based: survey (study a large population sample statistically).

• Experimental setup (treatment):
  – Case‐based: single‐case mechanism experiment (testing a prototype, simulating a system, …) and technical action research (experimental use of a novel artifact).
  – Sample‐based: statistical difference‐making experiment (comparison of the difference in statistical outcomes of treatments on two samples).


Outline

• Design science

• Theories

• Methods

– Empirical research setup

• Observational or experimental

• Case‐based or sample‐based

– Patterns of reasoning


Research setup and justification

• The research & inference design questions of the empirical cycle (7. object of study, 8. treatment specification, 9. measurement specification, 10. inference) and their validation together define the research setup and its justification.


• Each of the choices in the design of a research setup has consequences for the kinds of inferences from data that we can make:
  – validity of the research setup with respect to the planned inferences,
  – validity of the inferences with respect to the research setup.


Inferences from data

• Conclusion validity: How well is a statistical inference from a sample to a population supported?

• Internal validity: How well is an explanation supported by an abductive argument?

• External validity: How well is a generalization beyond a population supported by an analogy?

The chain of inferences: description turns data into observations; statistical inference generalizes observations to population generalizations; analogy generalizes beyond the population; abduction infers explanations.


Descriptive inference from raw data

• Removal of outliers, computation of statistics.

• Visualization of data.

• Interpretation of words and images.

• Descriptive validity: descriptive inference should add no information to the data (it is non‐ampliative, hence non‐defeasible).
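The outlier-removal and statistics step can be sketched in Python. This is an illustrative sketch only: the data values are invented, and the interquartile-range (IQR) rule used here is just one common choice of outlier criterion.

```python
import statistics as st

# Invented effort measurements, with one obvious outlier (48.0).
data = [12.1, 11.8, 12.4, 11.9, 12.0, 48.0, 12.2, 11.7]

# Remove outliers with the IQR rule: keep values within
# [Q1 - 1.5*IQR, Q3 + 1.5*IQR].
q1, _, q3 = st.quantiles(data, n=4)
iqr = q3 - q1
kept = [x for x in data if q1 - 1.5 * iqr <= x <= q3 + 1.5 * iqr]

# Compute descriptive statistics on the cleaned data.
print("kept:", kept)
print("mean:", round(st.mean(kept), 2))
print("stdev:", round(st.stdev(kept), 2))
```

Note that the removal step already involves a judgment call (which rule, which threshold), which is why descriptive validity demands that such choices be reported rather than hidden.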


Statistical inference

• Estimation of population parameters 

• Computational explanation of observations by some statistical model

• Validity wrt research setup


Page 60

Examples

• Statistical inference is sample‐based

• From an observational setup:
– Classify the vulnerabilities found in a sample of 20 open-source web applications (OS WAs)
– Find that in this sample, on average 70% of the vulnerabilities in an OS WA are implementation vulnerabilities
– Infer a confidence interval for the average proportion of implementation vulnerabilities in the population of web applications
– Validity: assume that the sample is a random draw from a population, which has a constant probability of implementation vulnerabilities
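The confidence-interval step might be sketched as follows; the per-application proportions are hypothetical, and the normal approximation with z = 1.96 is only one of several interval constructions:

```python
import math
import statistics

def mean_ci(proportions, z=1.96):
    """95% normal-approximation CI for the mean proportion of
    implementation vulnerabilities across sampled applications."""
    n = len(proportions)
    mean = statistics.mean(proportions)
    se = statistics.stdev(proportions) / math.sqrt(n)   # standard error
    return mean - z * se, mean + z * se

# Hypothetical proportions for 20 open-source web applications.
props = [0.55, 0.60, 0.62, 0.65, 0.66, 0.68, 0.69, 0.70, 0.70, 0.71,
         0.71, 0.72, 0.73, 0.74, 0.75, 0.76, 0.78, 0.80, 0.82, 0.85]
low, high = mean_ci(props)
print(f"mean = {statistics.mean(props):.3f}, 95% CI = ({low:.3f}, {high:.3f})")
```

The interval is a statement about the population mean, which is what makes this an ampliative (and hence defeasible) inference, unlike the descriptive step.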


Page 61

• From an experimental setup:
– Teach two programming techniques to two groups of students
– Let them write programs using these techniques, and ask other students to perform maintenance tasks on these programs
– Measure effort (= time to perform a maintenance task)
– Compute the difference in average effort between the two groups
• Two kinds of statistical inference:
– (a) Estimate a confidence interval for the difference in average effort in the population; if 0 is not in this confidence interval, infer that there is a statistically discernible difference in average maintenance effort between the two populations
– (b) Compute the probability of observing at least the measured difference if the population difference were 0; if this probability is small, conclude that there is a difference in the population
• Validity:
– Random sampling & allocation, sufficient sample size, stable probability distribution, assumptions about the distribution (e.g. normality)
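Both styles of inference can be sketched with a normal (large-sample) approximation; the effort values are invented, and a t-based procedure would be more appropriate for samples this small:

```python
import math
import statistics
from statistics import NormalDist

def difference_inference(a, b, z=1.96):
    """Compare mean maintenance effort in two groups.
    Returns the mean difference, (a) a 95% CI for the difference,
    and (b) a two-sided p-value under the null 'difference = 0'."""
    diff = statistics.mean(a) - statistics.mean(b)
    se = math.sqrt(statistics.variance(a) / len(a) +
                   statistics.variance(b) / len(b))
    ci = (diff - z * se, diff + z * se)
    p = 2 * (1 - NormalDist().cdf(abs(diff) / se))
    return diff, ci, p

# Hypothetical effort (minutes) for maintenance tasks on programs
# written with technique A vs technique B.
group_a = [52, 55, 58, 60, 61, 63, 64, 66, 68, 70]
group_b = [45, 47, 49, 50, 52, 53, 55, 56, 58, 60]
diff, (lo, hi), p = difference_inference(group_a, group_b)
print(f"mean difference = {diff:.1f}, CI = ({lo:.1f}, {hi:.1f}), p = {p:.4f}")
```

With these invented data, 0 lies outside the interval and p is small, so both inference styles point to the same conclusion, as the slide describes.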


Page 62

Abductive inference

• Explanations of observations or of population-level generalizations:
– Causal explanations (one variable makes a difference to another)

– Architectural explanations (components, capabilities, mechanisms)

– Rational explanations (desires, goals, motivations)

• Validity wrt research setup


Page 63

Causal explanations
• Single-case causal experiment
– Apply a stimulus to the object of study (OoS), withhold the stimulus, and compare the effects.
– Validity: the effect is transient, and all other conditions remain constant.
• Comparative-case causal experiment
– Apply the stimulus to one OoS, withhold it from the other, and compare the effects.
– Validity: the OoS's are similar, all other conditions constant.
• Randomized controlled trial
– E.g. the maintenance example given earlier.
– In the long run, the only plausible cause of an outcome difference is the difference in treatments.
• Quasi-experiment
– Same, but with non-random sampling/allocation; pre- and posttest of relevant variables.
– Rank all possible causes on plausibility.


Page 64

Architectural explanations

• Explain a phenomenon by the interaction among components that produced the phenomenon

• Components have capabilities/limitations

• The architecture constrains possible interactions

• Mechanism = interaction triggered by stimulus
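As a toy illustration (not from the slides), a switch-and-lamp mechanism shows the ingredients of an architectural explanation: components with capabilities, an architecture constraining their interaction, and a stimulus triggering the mechanism:

```python
class Switch:
    """Component with the capability to open or close a circuit."""
    def __init__(self):
        self.closed = False

    def press(self):
        # Stimulus: pressing toggles the circuit state.
        self.closed = not self.closed

class Lamp:
    """Component whose behavior depends on the circuit state."""
    def __init__(self, switch):
        # Architecture: the lamp is wired to this particular switch,
        # which constrains the possible interactions.
        self.switch = switch

    def lit(self):
        return self.switch.closed

# The phenomenon "the lamp went on" is explained architecturally:
# the stimulus (press) triggers the mechanism (circuit closes),
# and the components' capabilities produce the effect.
switch = Switch()
lamp = Lamp(switch)
switch.press()
print(lamp.lit())   # prints True
```

The explanation refers to the components and their wiring, not merely to a correlation between pressing and light, which is what distinguishes it from a purely causal (difference-making) account.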


Page 65

• Glennan, “Mechanisms and the nature of causation”, 1996


Page 66


• Glennan, “Mechanisms and the nature of causation”, 1996 (figure: a voltage switch)

Page 67

• Bechtel & Abrahamsen, “Explanation: a mechanist alternative”, 2005


Page 68

• Bechtel & Abrahamsen, “Explanation: a mechanist alternative”, 2005


Page 69 (figure only)

Page 70 (figure only)

Page 71

Two kinds of effect questions

• Test the effect of a programming technique on effort:
– Effect of a treatment
– Statistical observation of a difference
– Causal explanation of the outcome difference by the difference in treatments
– This calls for a further architectural explanation
• Test the effect of personality on productivity:
– Difference among capabilities
– Statistical observation of a difference
– Explanation of the difference in outcome by differences in capability
– This too needs a further architectural explanation


Page 72

Rational explanation

• Explain the behavior of actors in terms of their goals:
– Explain project failure by power struggles,
– deviation from business processes by personal goals of people, etc.
• Validity: we know their goals, and they are motivated by their goals
• NB: rational explanations extend architectural explanations.


Page 73

Analogic inference

• Analogy:
– Similar cases / similar populations will exhibit similar observations, produced by the same causes / mechanisms / reasons
– Similarity in variables or similarity in architecture?
– The explanation is generalized by analogy.

• Validity wrt research setup


Page 74

Examples

• Case-based analogies:
– This agile project done for an SME is similar to that one, so probably the SME will not put a client on-site of the project here either
– This machine is of the same type as that one, so it will probably contain the same mechanisms
• Sample-based analogy:
– The elements of this population are architecturally similar to those of that one, so the distribution of X is probably similar too
• Validity:
– Architectural similarity; no other mechanisms that interfere


Page 75

• Variable-based analogy is a weak basis for analogic generalization
• Superficial analogy (similar values of variables) is the logic of sympathetic magic
• The inference is correct only in rare cases
– Benjamin Franklin
• We need similarity in architecture, so that we can assume similarity in mechanisms


Page 76 (figure only)


Page 77

• Case-based research and sample-based research have their own typical patterns of reasoning


Page 78

Case‐based inference

• In case‐based inference, 

– we postulate a mechanism to explain behavior observed in a case, and
– reason by analogy that in architecturally similar cases, these mechanisms will produce similar effects

[Figure: case-based inference: Data (description) → Observations (abduction) → Explanations, generalized by analogy.]

Page 79

Case‐based inference

• Examples:
– Observational case study: studying agile projects in the real world
– Observational case study: studying coordination phenomena in a global software engineering project
– Simulation: testing a software prototype in a simulated context
– Technical action research: applying an experimental risk assessment technique for a client


Page 80

Sample‐based inference

• In sample‐based inference, 

– we statistically infer a property of a population from statistics of a sample,
– postulate one or more possible mechanisms to explain this property,
– speculate about possible generalizations to other populations

[Figure: sample-based inference: Data (description) → Observations (statistical inference) → Population generalizations (abduction) → Explanations, with analogy to similar populations.]

Page 81

• Analogy plays a role twice in sample‐based inference

– Once in the definition of the population

– Once in generalizing to similar populations (external validity)


Page 82

Sample-based inference
• Examples:
– Survey of a sample of agile projects
– Survey of coordination phenomena in global software engineering projects
– Statistical difference-making experiment: comparing two software engineering techniques in two samples of student projects


Page 83

Prediction

• We may stop after description, generalize by analogy, and then use this for prediction
– Must assume a stable architecture in the cases generalized about, even if we do not know it.
• We may stop after statistical inference, and use it to predict statistics of future samples from the same (or a different!) population
– Must assume a stable architecture in the population, even if we do not know it.


Page 84

Patterns of reasoning

Case-based research (case-based inference):
• Observational setup (no treatment): observational case study. Architectural explanation, analogy.
• Experimental setup (treatment):
– Single-case mechanism experiment. Architectural explanation, analogy; causal reasoning too if similarity is high enough.
– Technical action research. Architectural explanation, analogy.

Sample-based research (sample-based inference):
• Observational setup (no treatment): survey. Statistical inference.
• Experimental setup (treatment): statistical difference-making experiment. Statistical inference; causal inference.


Page 85

Take home

• Design science

– Design problems

– Engineering cycle

– Knowledge questions

• Theories

– Structure: Conceptualframework, generalizations

– Functions: explanations etc.

– Empirical cycle

• Methods

– Empirical research setup

• Observational or experimental

• Case‐based or sample‐based

– Patterns of reasoning

• Description, 

• Statistical inference, 

• Abduction (causal, architectural, rational)

• Analogy


Page 86

• Wieringa, R.J. Design Science Methodology for Information Systems and Software Engineering. Springer, 2014.
• Wieringa, R.J. and Daneva, M. “Six Strategies for Generalizing Software Engineering Theories”. Science of Computer Programming, to be published.
• Wieringa, R.J. (2014) “Empirical research methods for technology validation: Scaling up to practice”. Journal of Systems and Software, online first.
• Wieringa, R.J., Condori-Fernández, N., Daneva, M., Mutschler, B. and Pastor, O. (2012) “Lessons learned from evaluating a checklist for reporting experimental and observational research”. In: Proceedings, ESEM 2012, pp. 157-160. ACM.
• Wieringa, R.J. and Morali, A. (2012) “Technical Action Research as a Validation Method in Information Systems Design Science”. In: Design Science Research in Information Systems: Advances in Theory and Practice, 7th International Conference, DESRIST 2012, pp. 220-238. LNCS 7286. Springer.
• Wieringa, R.J. (2010) “Relevance and problem choice in design science”. In: Global Perspectives on Design Science Research (DESRIST), pp. 61-76. LNCS 6105. Springer.
• Wieringa, R.J. (2009) “Design Science as Nested Problem Solving”. In: Proceedings of the 4th International Conference on Design Science Research in Information Systems and Technology, Philadelphia, pp. 1-12. ACM.


Page 87

The big picture

Back to the design cycle


Page 88

Summary of research designs and research goals


Research designs and their research goals:

• Survey
– Evaluation research / problem research: to survey problem owners / implementations
– Treatment survey: to survey possible treatments
• Observational case study
– Evaluation research / problem research: to study a problem / implementation
• Single-case mechanism experiment; expert opinion about an artifact
– Evaluation research / problem research: to diagnose a problem / test an implementation in context
– Validation research: to test an artifact without a context; to validate an artifact in context
• Technical Action Research (TAR)
– Validation research: to validate usability and usefulness of an artifact in practice
• Statistical difference-making experiment
– Evaluation research / problem research: to compare the effect of interventions on random samples
– Validation research: to compare the effect of treatments on random samples
Page 89

• Single-case mechanism experiments: investigate underlying mechanisms (interaction between components) in single cases
– Test a single instance of the artifact in the lab or in the field
– Technical action research: use the artifact to solve a real-world problem
• Statistical difference-making experiments: investigate average effects (the average difference between treating and not treating) in large samples

[Figure: the three designs (single-case mechanism experiments, technical action research, statistical difference-making experiments) plotted against two axes: sample size (small samples, large samples, population: larger generalizations) and conditions (idealized vs. practical: more realistic conditions of practice).]