ISCRAM Impact Evaluation

Designing towards an impact evaluation framework for a collaborative information supply chain KENNY MEESTERS, BARTEL VAN DE WALLE ISCRAM BADEN-BADEN, MAY 2013

Description

Emerging technologies provide opportunities for the humanitarian responders’ community to enhance the effectiveness of their response to crisis situations. Part of this development can be attributed to a new type of information supply chain, driven by collaboration with digital, online communities, enabling organizations to make better informed decisions. However, how exactly and to what extent this collaboration impacts the decision making process is unknown. To improve these new information exchanges and the corresponding systems, an evaluation method is needed to assess the performance of these processes and systems. This paper builds on existing evaluation methods for information systems and design principles to propose such an impact evaluation framework. The proposed framework has been applied in a case study to demonstrate its potential to identify areas for further improvement in the (online) collaboration between information suppliers and users.

Transcript of ISCRAM Impact Evaluation

Page 1: ISCRAM Impact Evaluation

Designing towards an impact evaluation framework for a collaborative information supply chain KENNY MEESTERS, BARTEL VAN DE WALLE ISCRAM BADEN-BADEN, MAY 2013

Page 2: ISCRAM Impact Evaluation

Outline

Domain Processes

Evaluation types

Systems evaluation

Evaluation perspectives

Scope Measurement

Concept Supply Usage

Findings

Objectives

Indicators

Conclusion

Results

V&TC

Design

Approach

Research

Future work

Page 3: ISCRAM Impact Evaluation

V&TC

Data collection: media, geo-location, SMS

Data processing: analysis, verification

Dissemination: information products (maps, reports, etc.)

Information consumers: decision making, monitoring
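Read as an architecture, this supply chain is a simple pipeline from data collection to decision making. The following Python sketch (an illustration added to this transcript, not part of the presentation) composes the four stages; all function names and bodies are hypothetical placeholders.

```python
# Minimal sketch of the V&TC information supply chain as a pipeline:
# collect -> process/verify -> disseminate -> consume. Illustrative only.

def collect(sources):
    # data collection: media reports, geo-located observations, SMS messages
    return [{"source": s, "content": f"raw report from {s}"} for s in sources]

def process(raw_items):
    # data processing: analysis and (here trivially simplified) verification
    return [{**item, "verified": True} for item in raw_items]

def disseminate(processed):
    # dissemination: bundle verified items into an information product
    return {"type": "map/report", "items": [i for i in processed if i["verified"]]}

def consume(product):
    # information consumers: decision making and monitoring
    return f"decision based on {len(product['items'])} verified items"

print(consume(disseminate(process(collect(["media", "geo-location", "SMS"])))))
```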


Page 4: ISCRAM Impact Evaluation

[Figure: roadmap of V&TC-related initiatives (Volunteer Training, Data Scramble, Data licensing, Preparedness, Transition, Impact Evaluation for decision makers, UN OCHA IM, Open*) and the partners involved, including SBTF, UNV, ISCRAM, IMMAP, Google, Mapaction, Woodrow, UN, Harvard, OSM, GISCorp, Munster, ICT4Peace and UvT (some TBC). This presentation concerns the Impact Evaluation track (ISCRAM, Harvard, UvT).]

“The challenge is to improve coordination between the structured humanitarian system and the relatively loosely organized volunteer and technical communities.” (Valerie Amos, UN Under-Secretary-General)

Page 5: ISCRAM Impact Evaluation

What do we need? What do we want to know? What do we know?

[Diagram: impact evaluation positioned between business and the V&TC, and between practice and theory]

Page 6: ISCRAM Impact Evaluation

In general…

(1) Define: indicators, situations

(2) Measure: status quo, new situation

(3) Analyze: comparison, conclusion
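The define / measure / analyze cycle can be read as a small computational loop: define indicators, measure both situations, compare. The sketch below is a hedged illustration added for this transcript; the Indicator class, the turnaround-time indicator and the numbers are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Indicator:
    """Step 1: a defined indicator with a way to measure it in a situation."""
    name: str
    measure: Callable[[Dict[str, float]], float]


def evaluate_impact(indicators, status_quo, new_situation):
    """Steps 2 and 3: measure both situations and compare them per indicator."""
    comparison = {}
    for ind in indicators:
        comparison[ind.name] = ind.measure(new_situation) - ind.measure(status_quo)
    return comparison


# Hypothetical indicator and situations, purely for illustration.
turnaround = Indicator(
    name="turnaround time (hours)",
    measure=lambda situation: situation["turnaround_hours"],
)
print(evaluate_impact(
    [turnaround],
    status_quo={"turnaround_hours": 48},
    new_situation={"turnaround_hours": 12},
))  # {'turnaround time (hours)': -36}: turnaround dropped by 36 hours
```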

Page 7: ISCRAM Impact Evaluation

Applications


Project A

1. Impact evaluation

2. Impact assessment

3. Program evaluation

Project B

Page 8: ISCRAM Impact Evaluation


• Determine how well specific initiatives perform
• Adjust and fine-tune specific decisions/projects
• Determine the ‘best’ response
• Manage provided solutions
• Secure resources
• Advocate the V&TC

• Impact evaluation

• Impact assessment

• Program evaluation

Use for V&TC

Page 9: ISCRAM Impact Evaluation



• Design principles of frameworks: types, measurements, indicators
• V&TC: objective, scope and focus, indicators
• Evaluate the framework: case studies; refinable, usable tools

Next steps

Page 10: ISCRAM Impact Evaluation

Evaluation types

General evaluation types, by perspective and systems view:

• Formative evaluation: resource centered; efficiency oriented
• Summative evaluation: goal centered; effectiveness oriented

Page 11: ISCRAM Impact Evaluation

System evaluation

• Efficiency-oriented perspective: resource investment, production capability, resource consumption
• Effectiveness-oriented perspective (organization): organizational performance, user performance, system performance

Page 12: ISCRAM Impact Evaluation

Evaluation implementation

[Diagram: the evaluation framework spans organizational performance, Department A and Department B, (sub)projects A, B and C, and project management, combining the efficiency-oriented and effectiveness-oriented perspectives]

Page 13: ISCRAM Impact Evaluation

Scope

1. Overall impact of the response to a crisis
2. Impact of the decision making process on the crisis
3. Impact of information products on the decision making process
4. Effect of data processing on information products
5. Impact of data collection on data processing
6. Soft- and hardware impact on the system performance

(the scope levels span the chain from the information consumer to the information supplier)


Page 14: ISCRAM Impact Evaluation

Measurements

Efficiency-oriented perspective (V&TC deployment: system implementation, product generation efficiency):
• level 0: request / definition
• level 1: resource allocation
• level 2: team capability
• level 3: investments

Effectiveness-oriented perspective (impact: response effectiveness):
• level 1: support & information
• level 2: decision making
• level 3: response effectiveness
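As a reading aid, the two measurement ladders above can be written down as plain data; the sketch below restates the level names from the slide, while the dictionary layout and the printing loop are illustrative assumptions only.

```python
# The measurement levels per evaluation perspective, as listed on the slide.
MEASUREMENT_LEVELS = {
    "efficiency-oriented (V&TC deployment)": {
        0: "Request / definition",
        1: "Resource allocation",
        2: "Team capability",
        3: "Investments",
    },
    "effectiveness-oriented (impact)": {
        1: "Support & information",
        2: "Decision making",
        3: "Response effectiveness",
    },
}

for perspective, levels in MEASUREMENT_LEVELS.items():
    print(perspective)
    for number, label in sorted(levels.items()):
        print(f"  level {number}: {label}")
```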

Page 15: ISCRAM Impact Evaluation

Indicators

Each measurement level is operationalized with objectives, performance measures and their application to the V&TC.

Level 1: Resource allocation
• System development: facilities allocation (availability of required technical facilities), schedule compliance (time required to set up required systems), requirements definition (clarity of requested products)
• Operational resources: data collection efforts (time/effort required to analyze data), system maintenance (time/effort required to maintain the system), training/support/communication (efforts for user assistance)

Level 2: Team capability
• Team capacity: productivity rate (level of V&TC body deployment), required man-hours (total amount of hours used)
• Operational capability: throughput (products delivered / users served), utilization rate (hours-to-product ratio), response time (turnaround time on specific requests)

Impact level 1: Support & information
• System quality: usability (ease of use of information products), system features (customization of information products), access / availability (ease of reaching information products)
• Information quality: understandability (presentation of gathered information), consistency (provided information is consistent), importance / relevance (relevance of provided information)

Impact level 2: Decision making
• Individual impact: awareness / recall (better situational awareness), decision effectiveness (enhanced effectiveness of the job), individual productivity (increased personal productivity)
• Organizational impact: cost-effectiveness (information products save resources), increased capacity (increased effectiveness of operations), overall productivity (potentially improved outcomes)
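To make the structure of these indicator tables explicit, the sketch below encodes a subset of them as nested data, from measurement level to objective to performance measure and its V&TC reading. The wording mirrors the tables above; the data layout itself is only an illustrative assumption.

```python
# Subset of the indicator framework as nested data; the remaining objectives
# follow the same pattern (level -> objective -> measure -> V&TC reading).
INDICATORS = {
    "Resource allocation": {
        "System development": {
            "Facilities allocation": "Availability of required (technical) facilities",
            "Schedule compliance": "Time required to set up required systems",
            "Requirements definition": "Clarity of requested products",
        },
    },
    "Team capability": {
        "Operational capability": {
            "Throughput": "Products delivered / users served",
            "Utilization rate": "Hours-to-product ratio",
            "Response time": "Turnaround time on specific requests",
        },
    },
    "Support & information": {
        "Information quality": {
            "Understandability": "Presentation of gathered information",
            "Consistency": "Provided information is consistent",
            "Importance / relevance": "Relevance of provided information",
        },
    },
    "Decision making": {
        "Organizational impact": {
            "Cost-effectiveness": "Information products save resources",
            "Increased capacity": "Increased effectiveness of operations",
            "Overall productivity": "Potentially improved outcomes",
        },
    },
}

# Example lookup: which performance measures operationalize decision making?
for objective, measures in INDICATORS["Decision making"].items():
    print(objective, "->", list(measures))
```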

Page 16: ISCRAM Impact Evaluation

Case Study

Two cases were studied, an NGO project and a DHN deployment, involving 4 suppliers and 7 consumers in one case and 7 suppliers and 12 consumers in the other. The cases compare as follows:

• Developers and entry team / developers vs. data entry: =
• Specific knowledge / expertise available: =
• Time critical / time limited: ≈
• No budget / limited budget: ≈
• Geographically separate / located in one office: ≠
• Users are ‘unknown’ / direct contact with users: ≠

Page 17: ISCRAM Impact Evaluation

Information Supply

[Figure: supply-side indicators scored for the NGO development and the V&TC deployment case]

Level 1, Resources: system development (facilities allocation, schedule compliance, requirements definition) and operational resources (data collection efforts, system maintenance, training/support and communication).

Level 2, Capabilities: team capacity (productivity, required man-hours) and operational capability (throughput, utilization rate, response time).

Page 18: ISCRAM Impact Evaluation

Information use

[Figure: use-side indicators scored for the NGO development and the V&TC deployment case]

Level 1, Information: system quality (usability, system features, availability), information quality (understandability, consistency, importance) and usage.

Level 2, Processes: individual impact (usage, awareness, effectiveness, productivity) and organizational impact (usage, cost-effectiveness, capacity increase, overall productivity).
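One way to operationalize the comparison in these two figures is to score each indicator per case and aggregate per level. The sketch below does exactly that; the 1-5 scores are made up purely for illustration and are not the study's results.

```python
# Hypothetical 1-5 scores per indicator for the two cases; illustration only.
from statistics import mean

scores = {
    "NGO development": {
        "Level 1: Resources":    {"Facilities allocation": 4, "Schedule compliance": 3},
        "Level 2: Capabilities": {"Throughput": 3, "Response time": 4},
        "Level 1: Information":  {"Usability": 4, "Consistency": 4},
        "Level 2: Processes":    {"Awareness": 3, "Cost-effectiveness": 3},
    },
    "V&TC deployment": {
        "Level 1: Resources":    {"Facilities allocation": 3, "Schedule compliance": 4},
        "Level 2: Capabilities": {"Throughput": 4, "Response time": 5},
        "Level 1: Information":  {"Usability": 3, "Consistency": 3},
        "Level 2: Processes":    {"Awareness": 4, "Cost-effectiveness": 2},
    },
}

# Aggregate per level so the two cases can be compared side by side.
for case, levels in scores.items():
    print(case)
    for level, indicators in levels.items():
        print(f"  {level}: mean score {mean(indicators.values()):.1f}")
```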

Page 19: ISCRAM Impact Evaluation

Findings

• Agile vs. waterfall
• Organizational use
• Strong integration
• Requirement analysis
• Sample selection
• Identifying population
• Also for other information supply chains

• Difference in system use
• Increasing impact
• Improving evaluation

Page 20: ISCRAM Impact Evaluation

Future work


• Historical data
• Feedback loops
• Add/remove variables
• Scope of evaluation
• Refinements
• Framework design
• Application

Evaluation approach: for each selected case, the impact evaluation framework is applied: participants and a control group are selected, interviews are conducted, and the results are verified. Verified results feed a statistical analysis and a data store; through the model refinement loop the framework is refined, leading to the impact evaluation outcome.
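The sketch below restates this evaluation approach as a simple workflow loop, in the spirit of the diagram; every function body is a trivial placeholder added for illustration, not the authors' procedure.

```python
# Illustrative workflow: select case -> apply framework (participants +
# control group, interviews) -> verify -> analyze -> store -> refine.

def select_participants(case):
    return case.get("participants", [])

def select_control_group(case):
    return case.get("control_group", [])

def conduct_interview(framework, person):
    # placeholder: score each framework indicator for this person
    return {indicator: 0 for indicator in framework}

def verify_results(interviews):
    # placeholder: keep only non-empty interviews
    return [i for i in interviews if i]

def statistical_analysis(verified):
    # placeholder: summarize the verified interviews
    return {"n_interviews": len(verified)}

def refine_framework(framework, analysis):
    # placeholder for the model refinement loop (add/remove variables)
    return framework

def run_evaluation_cycle(framework, case, data_store):
    people = select_participants(case) + select_control_group(case)
    interviews = [conduct_interview(framework, p) for p in people]
    analysis = statistical_analysis(verify_results(interviews))
    data_store.append(analysis)              # accumulate historical data
    return refine_framework(framework, analysis)

# Example: two cycles feeding the refinement loop.
framework = ["facilities allocation", "throughput", "usability"]
data_store = []
for case in ({"participants": ["a", "b"], "control_group": ["c"]},
             {"participants": ["d"], "control_group": ["e", "f"]}):
    framework = run_evaluation_cycle(framework, case, data_store)
print(data_store)   # [{'n_interviews': 3}, {'n_interviews': 3}]
```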

V&TC
• Feedback: increase deployment impact
• Advocacy: secure resources
• Manage: improve products

Coordination
• Feedback: manage pool of resources
• Advocacy: common understanding of IS impact
• Manage: identify gaps, ensure good fit


Decision Makers
• Feedback: improve effectiveness by IS use
• Advocacy: articulate needs and requirements
• Manage: improve IS use in future responses


Impact Evaluation for the V&TC: Communicate, Learn, Advocate