Introduction to assessment performance

Mikko Pohjola, THL

Contents

• Concepts

• General framework

• Common perspectives (& examples)
  • Quality assurance/quality control
  • Uncertainty
  • Model performance

• Properties of good assessment

• Summary

Concepts & rationale

• Some basic concepts:
  • Performance = goodness!
  • Assessment, Management
  • Model
  • Process (making/using), Product
  • Output, Outcome
  • Assessor, Decision/Policy maker, Stakeholder
  • Participant, User

Concepts & rationale

• Why is evaluation of assessment performance important?

• Efficient use of resources?
• Value of work done?
• Importance/meaning of information?
• Implications of information?
• Actual impacts of information?
• …
• …because a funder, customer, user, boss, peer, stakeholder etc. wants/needs to know!

Roles and interests

• Experts: data quality, analysis procedure, coherence, comprehensiveness, …

• Funders: relevance, efficiency, timeliness, importance, …

• Users (decision makers, DM): understandability, reliability (of source), acceptance, practicality, …

• Interested parties (stakeholders, SH): same criteria as decision makers, but from a different perspective

General RA/RM framework

• Process, product, use

[Diagram: Assessment process → Assessment product → Assessment use → Decision making; in the reverse direction, a Knowledge need in decision making defines a Product requirement for the assessment product, which in turn defines a Process requirement for the assessment process.]
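To make the chain in the diagram concrete, here is a minimal Python sketch of the framework; all class and field names are hypothetical illustrations, not part of any THL tool or API:

```python
from dataclasses import dataclass

@dataclass
class KnowledgeNeed:
    question: str            # what decision making needs to know

@dataclass
class ProductRequirement:
    need: KnowledgeNeed
    criteria: list[str]      # e.g. relevance, reliability, timeliness

@dataclass
class ProcessRequirement:
    product_req: ProductRequirement
    practices: list[str]     # practices the assessment process must follow

# Requirements are derived backwards from the use context ...
need = KnowledgeNeed("What are the health impacts of policy option X?")
product_req = ProductRequirement(need, ["relevance", "reliability", "timeliness"])
process_req = ProcessRequirement(product_req, ["documented data sources", "peer review"])

# ... while the assessment itself runs forwards:
# process -> product -> use -> decision making.
```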

Common perspectives & examples

• Quality assurance/quality control
  • Focus on the assessment process
  • An “engineering” perspective

• Uncertainty
  • Focus on the assessment output
  • A scientist’s perspective?

• Model performance
  • Focus on modelling and the model
  • Combines the QA/QC and uncertainty perspectives
  • A modeller’s perspective

Quality assurance/quality control

• Principle:
  • A good process guarantees good outputs/outcomes!

• Question:
  • How should an assessment process be conducted?

• Examples:
  • Ten steps by Jakeman et al. (2006)
  • IDEA framework (Briggs, 2008)
  • (Over)appreciation of randomized controlled trials (RCTs)

Ten iterative steps in development and evaluation of environmental models

Jakeman et al.: Ten iterative steps in development and evaluation of environmental models. Environmental Modelling & Software 21(5), May 2006, 602–614.

IDEA framework (INTARESE)

Briggs: A framework for integrated environmental health impact assessment of systemic risks. Environmental Health 2008, 7:61.

Uncertainty

• Principle:
  • Performance is an intrinsic property of an information product!

• Question:
  • How good is the answer provided by the assessment?

Uncertainty

• Examples:
  • Statistical uncertainty analysis (see the sketch below)
    • Mean, variance, confidence limits, distributions, …
    • Cf. D. Lindley: The Philosophy of Statistics, 2000
  • Sources of uncertainty
    • E.g. model, parameter & scenario uncertainty (as applied e.g. by the U.S. EPA)
  • Extensive approaches
    • E.g. inclusion of qualitative aspects and sources of uncertainty, as in NUSAP (www.nusap.net)
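As a concrete illustration of the statistical view, the following Python sketch computes a mean, variance, and 95% interval for an uncertain assessment output; the sample data are invented for illustration only:

```python
import numpy as np

# Hypothetical Monte Carlo sample of an assessment output
# (e.g. an exposure estimate); the distribution is made up.
rng = np.random.default_rng(seed=1)
samples = rng.lognormal(mean=1.0, sigma=0.5, size=10_000)

mean = samples.mean()
variance = samples.var(ddof=1)                         # sample variance
ci_low, ci_high = np.percentile(samples, [2.5, 97.5])  # 95% interval

print(f"mean = {mean:.2f}, variance = {variance:.2f}")
print(f"95% interval: [{ci_low:.2f}, {ci_high:.2f}]")
```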

NUSAP

• N: numeral

• U: unit

• S: spread

• A: assessment (qualitative judgment)

• P: pedigree (historical path leading to result)
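To make the notation concrete, here is a minimal Python sketch of a NUSAP-qualified quantity; the example values are invented, and the pedigree criteria and 0–4 scoring scale follow one common convention rather than a fixed standard:

```python
from dataclasses import dataclass, field

@dataclass
class NusapQuantity:
    """A quantity qualified with the five NUSAP categories."""
    numeral: float                 # N: the number itself
    unit: str                      # U: its unit
    spread: tuple[float, float]    # S: e.g. a 95% interval
    assessment: str                # A: qualitative judgment of reliability
    pedigree: dict[str, int] = field(default_factory=dict)  # P: criterion -> score

# Invented example: an emission estimate with pedigree scores on a 0-4 scale.
estimate = NusapQuantity(
    numeral=120.0,
    unit="tonnes/year",
    spread=(80.0, 170.0),
    assessment="moderately reliable",
    pedigree={"proxy": 3, "empirical basis": 2, "method": 3, "validation": 1},
)
```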

NUSAP - pedigree

Jeroen van der Sluijs: NUSAP - some examples. Presentation. Available: http://tinyurl.com/5uwln2r

Model performance

• Principle:
  • The model is the essence of the assessment!

• Question:
  • How good is the model?

• Examples:
  • Verification, validation (reliability, usability, …); see the sketch below
  • Outcome-oriented approach by Matthews et al. (2011)
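As one narrow illustration of validation, the sketch below compares invented model predictions against observations using bias and root-mean-square error; real model evaluation would of course involve much more than two summary statistics:

```python
import numpy as np

# Invented data: paired model predictions and field observations.
predicted = np.array([2.1, 3.4, 4.0, 5.2, 6.1])
observed = np.array([2.0, 3.0, 4.5, 5.0, 6.5])

residuals = predicted - observed
bias = residuals.mean()                  # systematic over/underestimation
rmse = np.sqrt((residuals ** 2).mean())  # overall error magnitude

print(f"bias = {bias:+.2f}, RMSE = {rmse:.2f}")
```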

Outcome-oriented modelling approach

Matthews et al.: Raising the bar? – The challenges of evaluating the outcomes of environmental modelling and software. Environmental Modelling & Software 26(3), March 2011, 247–257.

Summary of common perspectives

• Assessment process and product are addressed in many ways

• Use of results is mostly not considered
  • The link between outputs and outcomes (cf. Matthews et al. 2011)

• Evaluation is often a separate process
  • Expert processes of making assessments and using their results
  • Expert processes of evaluating performance

• Alternative perspectives?

Properties of good assessment

• Ex post (after assessment) evaluation

• Ex ante (before/during assessment) evaluation
  • Guidance of design and execution

• Links process and output with use
  • Thereby also linking them to outcomes

Example: what makes a good hammer?

• How is the hammer made? By whom?

• What properties does the hammer have?

• What do you want to do with the hammer?

• How does the hammer help you do it?

Summary

• Consideration of (intended) use is essential
  • Consideration of process and product in light of use

• Consider the instrumental value of information
  • Cf. absolute value (a common science view)
  • Cf. ad hoc solutions (a common practice view)
  • Contextuality, situatedness, practicality, …
  • In policy support, information is a tool (a means to an end)
  • A model is a tool for producing information

• How does this relate to the previous lectures about DA and the DA study plan exercise?