2.6. Characterizing Uncertainty and "Levels of Confidence" in Climate Assessment
Box 2-1. Examples of Sources of Uncertainty
Problems with Data
- Missing components or errors in the data
- "Noise" in data associated with biased or incomplete observations
- Random sampling error and biases (nonrepresentativeness) in a sample
Problems with Models
- Known processes but unknown functional relationships or errors in the structure of the model
- Known structure but unknown or erroneous values of some important parameters (illustrated in the sketch below)
- Known historical data and model structure, but reasons to believe parameters or model structure will change over time
- Uncertainty regarding predictability (e.g., chaotic or stochastic behavior) of the system or effect
- Uncertainties introduced by approximation techniques used to solve a set of equations that characterize the model
Other Sources of Uncertainty
- Ambiguously defined concepts and terminology
- Inappropriate spatial or temporal units
- Inappropriateness of, or lack of confidence in, underlying assumptions
- Uncertainty resulting from projections of human behavior (e.g., future consumption patterns or technological change), as distinct from uncertainty resulting from "natural" sources (e.g., climate sensitivity, chaos)
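Several of the model-related entries above, in particular the case of known structure but uncertain parameter values, are in principle amenable to quantitative treatment by propagating an assumed parameter distribution through the model. The minimal Python sketch below illustrates the idea; the linear response relation, forcing value, and parameter range are hypothetical placeholders introduced only for illustration, not quantities assessed in this chapter.

# Illustrative sketch only: propagating uncertainty in a single model
# parameter ("known structure but unknown parameter value") through a
# hypothetical linear response relation.
import random

random.seed(0)

def response(forcing, sensitivity):
    """Hypothetical model: response scales linearly with forcing."""
    return sensitivity * forcing

FORCING = 1.0   # arbitrary forcing, in unspecified units
N = 10_000      # Monte Carlo sample size

# Represent the uncertain parameter by a distribution rather than a single
# best estimate, then sample the resulting distribution of responses.
samples = sorted(response(FORCING, random.uniform(0.5, 1.5)) for _ in range(N))

median = samples[N // 2]
p05, p95 = samples[int(0.05 * N)], samples[int(0.95 * N)]
print(f"median response: {median:.2f}; 5-95% range: {p05:.2f} to {p95:.2f}")

The resulting percentile range is one conventional way to summarize parameter uncertainty, although it captures none of the structural or linguistic sources of uncertainty listed in this box.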
Uncertainty, or more generally debate about the level
of certainty required to reach a "definitive" conclusion, is a
perennial issue in science. Difficulties in explaining uncertainty have become
increasingly salient as society seeks policy advice to deal with global environmental
change. How can science be useful when evidence is incomplete or ambiguous,
the subjective judgments of experts in the scientific and popular literature
differ, and policymakers seek guidance and justification for courses of action
that could cause, or prevent, significant environmental and societal
changes? How can scientists improve their characterization of uncertainties
so that areas of slight disagreement do not become equated with paradigmatic
disputes, and how can individual subjective judgments be aggregated into group
positions? In short, how can the full spectrum of the scientific content of
public policy debates be fairly and openly assessed?
The term "uncertainty" implies anything from confidence just short
of certainty to informed guesses or speculation. Lack of information obviously
results in uncertainty; often, however, disagreement about what is known or
even knowable is a source of uncertainty. Some categories of uncertainty are
amenable to quantification, whereas other kinds cannot be expressed sensibly
in terms of probabilities (see Schneider et al., 1998, for a survey of
literature on characterizations of uncertainty). Uncertainties arise from factors
such as lack of knowledge of basic scientific relationships, linguistic imprecision,
statistical variation, measurement error, variability, approximation, and subjective
judgment (see Box 2-1). These problems are compounded by the global scale of
climate change coupled with the local scales of many of its impacts, by long
time lags between forcing and response, by low-frequency variability with
characteristic times greater than the length of most instrumental records, and
by the impossibility of before-the-fact experimental controls.
Moreover, it is important to recognize that even good data and thoughtful analysis
may be insufficient to dispel some aspects of uncertainty associated with the
different standards of evidence (Morgan, 1998; Casman et al., 1999).
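To make the distinction concrete, the quantifiable end of this spectrum can be illustrated with a standard resampling calculation. The short Python sketch below applies a bootstrap to a purely synthetic record to characterize sampling error in an estimated mean; the data and numbers are placeholders rather than observations cited in this chapter, and categories such as linguistic imprecision or disagreement over model structure admit no comparable numerical treatment.

# Illustrative sketch only: bootstrap estimate of sampling uncertainty in
# the mean of a short synthetic "record".  The data are random
# placeholders, not real observations.
import random

random.seed(1)

record = [random.gauss(0.3, 0.2) for _ in range(30)]   # synthetic record

def mean(values):
    return sum(values) / len(values)

N_BOOT = 5000
boot_means = sorted(
    mean(random.choices(record, k=len(record))) for _ in range(N_BOOT)
)
lo, hi = boot_means[int(0.025 * N_BOOT)], boot_means[int(0.975 * N_BOOT)]
print(f"sample mean: {mean(record):.3f}; "
      f"95% bootstrap interval: {lo:.3f} to {hi:.3f}")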
This section considers methods for addressing such questions: first by briefly
examining treatments of uncertainty in past IPCC assessments, second by reviewing
recommendations from a guidance paper on uncertainties (Moss and Schneider,
2000) prepared for the TAR, and third by assessing the state of the
science concerning the debate over the quality of human judgments (subjective
confidence) when empirical evidence is insufficient to form clear "objective"
statements of the likelihood that certain events will occur.