Uncertainty and quality in scientific policy assessment
- introductory remarks -
Martin Krayer von Krauss, WHO/EEA
INTARESE: Integrated Assessment of Health Risks of Environmental Stressors in Europe
Introductions

Intarese contingent:
– National public institutes
– Academia
– Other: WHO, CEFIC, etc…

EEA contingent and others:
– Marco Martuzzi & David Gee
– Jeroen van der Sluijs & Arthur Petersen
– Jerry Ravetz and Andrea Saltelli
– Jacqueline McGlade
Context
Stakes are high; Values are in dispute; Facts are uncertain.
In a situation where very few of our assessments can truly be “validated”, and where the consequences of error could be far reaching, how do we ensure the quality of our work?
Workshop Objective

Provide you with an understanding of:
– The context within which science for policy is conducted;
– The rationale for the interest in uncertainty and quality in policy assessments;
– Qualitative as well as quantitative conceptions of uncertainty;
– Qualitative and quantitative approaches to managing uncertainty and quality;
– The relationship between uncertainty, quality and stakeholder participation.
Setting the scene: The RIVM credibility crisis of 1999 and the subsequent response
Chair: Gordon McInnes, Deputy Director, EEA

Presentations:
– Arthur Petersen, Director of the Methodology and Modelling Programme, MNP
– Jeannette Beck, Programme Leader on Air Quality, MNP
– Jan Wijmenga, Netherlands Ministry of Housing, Spatial Planning and the Environment

Discussion: Quality and uncertainty management needs in science for policy
– Sigfus Bjarnason, Head of group, EEA
Session 2: The context (13.00-14.00)

Lecturing (30 minutes):
– Introduction by Jerry Ravetz
– Reflexivity and the Reflexive Practitioner (AP)
– Levels of uncertainty (MKvK)

Discussion (30 minutes)
Uncertainty: a 3-dimensional concept
– Location
– Level
– Nature
See: Walker et al. (2003). Journal of Integrated Assessment, 4(1): 5-17.
Location of Uncertainty (1st dimension)
Refers to the location at which the uncertainty manifests itself in the model.

Examples of common assessment models:
– Risk Assessment: Risk = Probability x Consequence
– Environmental & Health assessment of chemicals: Risk = Exposure x Effect

Generic model locations:
– Context (e.g. boundaries, framing)
– Input data
– Model uncertainty
– Calibration data
– Parameter uncertainty
– Model output (conclusion)
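The locations above can be made concrete with a small sketch. This is an illustrative assumption, not real assessment data: it uses the multiplicative model Risk = Exposure x Effect from the slide, with invented numbers, and annotates where each generic location of uncertainty enters.

```python
# Illustrative sketch (invented numbers): Risk = Exposure x Effect,
# annotated with the generic locations where uncertainty can enter.

def risk(exposure, effect):
    # Model uncertainty: the true relationship may not be multiplicative,
    # e.g. effects may have thresholds or saturate at high exposure.
    return exposure * effect

exposure = 0.3   # input-data uncertainty: exposure is measured or estimated
effect = 0.02    # parameter uncertainty: the dose-response coefficient

# Model-output uncertainty: the conclusion inherits all of the above, plus
# context uncertainty (whose exposure? which health endpoint? what boundaries?).
print(risk(exposure, effect))
```

The point of the annotation is that the final number looks precise, while every ingredient that produced it carries its own uncertainty.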
Level of Uncertainty (2nd dimension)

A progression:
– Statistical Uncertainty
– Scenario Uncertainty
– Recognised Ignorance
– Total Ignorance
e.g. see Knight, 1921; Smithson, 1988; Funtowicz and Ravetz, 1990; Faber et al., 1992; Wynne, 1992; Schneider & Turner, 1994; ESTO, 2001.
Statistical Uncertainty
[Figure: normal distribution with mean = 0 and standard deviation = 0.5]
• There exist solid grounds for the assignment of a discrete probability to each of a well-defined set of outcomes.
• We have a well-known functional relationship.
• We have an adequate combination of: (i) the number of parameters and (ii) the amount and character of the data.
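A minimal sketch of what statistical uncertainty permits, assuming the distribution from the slide (normal, mean 0, standard deviation 0.5): because outcomes and probabilities are well characterised, the uncertainty can be summarised quantitatively, for example by an empirical 95% interval from Monte Carlo samples.

```python
# Sketch: statistical uncertainty can be summarised with statistics,
# assuming the slide's normal distribution (mean 0, std dev 0.5).
import random
import statistics

random.seed(1)  # fixed seed so the illustration is reproducible
samples = sorted(random.gauss(0.0, 0.5) for _ in range(100_000))

mean = statistics.fmean(samples)
stdev = statistics.stdev(samples)
lo, hi = samples[2_500], samples[97_500]  # ~2.5th and 97.5th percentiles

print(f"mean={mean:.3f}, stdev={stdev:.3f}, 95% interval=({lo:.2f}, {hi:.2f})")
```

This kind of summary is exactly what becomes unavailable at the higher levels of uncertainty below: there, no distribution can credibly be specified in the first place.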
Scenario Uncertainty

We can describe a set of outcomes to be expected, but we cannot associate probabilities very well: assumptions; various plausible scenarios; unverified "what if?" questions; ambiguous results.
Example of scenario uncertainty: Antibiotics in animal feedstuff
Scenario: resistance to antibiotics
– The widespread use of the antibiotics could lead to the development of resistant bacterial strains;
– In the long run, antibiotics would no longer be effective in the treatment of disease in humans.

• Scientific evidence: the development of bacterial resistance can take place.
• How quickly and to what extent will it become a problem?

The outcome is clear; the probability of it occurring is unknown.
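The structure of the example above can be sketched in code. The scenario names and timescales below are invented for illustration; the point is structural: we can enumerate plausible outcomes, but we deliberately attach no probabilities to them, because none can be justified.

```python
# Hedged sketch of scenario uncertainty for the antibiotics example.
# Scenario names and timescales are illustrative assumptions, not data.

scenarios = {
    "slow emergence": 50,      # years until resistance is widespread
    "moderate emergence": 20,
    "rapid emergence": 5,
}

# Note what is deliberately absent: no probability is assigned to any
# scenario. That absence is what distinguishes scenario uncertainty
# from statistical uncertainty.
for name, years in scenarios.items():
    # Each entry answers an unverified "what if?" question.
    print(f"What if resistance shows {name}? Widespread within ~{years} years.")
```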
Ignorance

We do not know the essential functional relationships. There exist neither grounds for the assignment of probabilities, nor even the basis for the definition of a complete set of outcomes. More information may become known later through research, but little is known for the time being.

Recognized ignorance: we know that we don't know!
Example of ignorance: Mad cow disease & CJD

Consider the case of a scientist asked to assess the risks or the costs of BSE at the time of its discovery in 1986. No historical data on BSE was available, and scientific understanding of how the disease is contracted was limited. The extent of the public outcry that would eventually occur remained unknown, as did the extent of the loss of exports and the drop in domestic demand that ensued. Knowledge of the relationship between BSE and CJD would not become available for another 10 years.

Any assessment would necessarily rely on a large number of assumptions; there would be no credible basis for the assignment of probabilities. There would not even be a credible basis to claim that all of the potential outcomes of the BSE epidemic had been thought of.
Different levels of uncertainty call for different approaches to uncertainty assessment and management!
Session 3: Basic concepts (14.00-15.30)

Lecturing (45 minutes): Jeroen van der Sluijs
– Knowledge quality assessment
– Problem framing and context
– Indicators
– Intro: approaches to uncertainty management

Coffee break (10 minutes)
Discussion (30 minutes)
Comfort break (5 minutes)
Session 4: Approaches to managing uncertainty (15.30-17.00)

Lecturing (45 minutes):
– Quantitative approaches by Andrea Saltelli
– Qualitative approaches (JvdS)
– Procedural approaches (JvdS)

Comfort break (5 minutes)
Discussion (40 minutes)
Session 5: Exercise (09.30-14.20)

Presentation (15 minutes) (JvdS/AP/MKvK)
– Intro to case study
– Intro to assignment

Facilitated group work
12.00-13.00 Lunch
13.00-14.20 Session 5 continued
– Presentation & discussion of results (60 minutes)
– Presentation of assignment for interim period (15 minutes)

Coffee break (10 minutes)