
Working Group II: Impacts, Adaptation and Vulnerability



19.5.4. Sensitivity of Aggregate Estimates

At a time when the quality of numerical results still is low, a key benefit of aggregate impact analysis lies in the insight it provides into the sensitivity of impact estimates to underlying assumptions. Sensitivity analysis offers critical information about attributes of the damage function that are likely to be most influential for the choice of policy and, by implication, about where additional climate change impacts research is most needed.

19.5.4.1. Composition of Impact Function


Figure 19-5: Aggregate impact of climate change as a function of global mean temperature. Displayed are hypothetical examples of a linear function, which assumes that impacts are proportional to temperature change since preindustrial times; a cubic function, which assumes that impacts are proportional to temperature change to the power of three; and a hockey-stick function, which assumes that impacts are approximately proportional to temperature change until a critical threshold is approached. Aggregate damage functions used in integrated assessments are mostly illustrative. They should be regarded as "placeholders" that will be replaced by more accurate functional forms as our knowledge of impact dynamics improves.

Most aggregate analysis is based on integrated assessment models (IAMs; see Chapter 2). Impact functions used in IAMs vary greatly with respect to the level of modeling sophistication, the degree of regional aggregation, the choice of numeraire, and other characteristics (see Tol and Fankhauser, 1998). Many models have used monetary terms (e.g., U.S. dollars) to measure impacts. Spatially detailed models (e.g., Alcamo et al., 1998) pay some attention to unique ecosystems. Disruptive climate changes have received little attention, except for a survey of expert opinions (Nordhaus, 1994b) and analytical work (e.g., Gjerde et al., 1999). Some climate change impact studies restrict themselves to sectors and countries that are relatively well studied (e.g., Mendelsohn and Neumann, 1999). Others try to be comprehensive, despite the additional uncertainties (e.g., Hohmeyer and Gaertner, 1992). Some studies rely on an aggregate description of all climate change impacts for the world as a whole (e.g., Nordhaus, 1994a). Other studies disaggregate impacts with substantial spatial and regional detail (e.g., Alcamo et al., 1998). The aggregate approaches tend to point out implications for efficiency and in practice often ignore equity (see Tol, 2001a, for an exception). The detailed approaches tend to identify distributional issues, but working out the equity implications typically is left to the reader.

19.5.4.2. Shape of Damage Function

Most impact studies still look at the equilibrium effect of one particular level of GHG concentration, usually 2xCO2. Full analysis, however, requires impacts to be expressed as a function of change in GHG concentrations. With so little information to estimate this function, studies have to rely on sensitivity analyses. Different damage functions can lead to profoundly different policy recommendations. Compare, for example, the profile of impacts under a linear and a cubic damage function (see Figure 19-5). Relative to the linear specification, a cubic function implies low near-term impacts but rapidly increasing impacts further in the future. Using conventional discounting, this means that early emissions under a cubic damage function will cause less damage over their atmospheric lifetime, compared to a scenario with linear damages. The marginal damage caused by emissions further in the future, on the other hand, is much higher if we assume a cubic damage function (Peck and Teisberg, 1994).
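
The contrast can be made concrete with a stylized calculation. The sketch below (in Python, purely illustrative) evaluates a linear and a cubic damage function along an assumed warming path and discounts the resulting damage streams; the coefficients, the warming path, and the 3% discount rate are placeholders chosen for illustration, not values taken from the studies cited.

    # Illustrative only: discounted damages under a linear and a cubic damage
    # function along an assumed warming path. Coefficients, warming path, and
    # discount rate are placeholders, not estimates from the literature.

    def warming(year, base_year=2000):
        """Assumed warming path: 0.6 degC in 2000, rising 0.025 degC per year."""
        return 0.6 + 0.025 * (year - base_year)

    def damage_linear(t, a=0.5):
        """Damages as % of world GDP, proportional to warming."""
        return a * t

    def damage_cubic(t, b=0.06):
        """Damages as % of world GDP, proportional to warming cubed."""
        return b * t ** 3

    def present_value(damage_fn, years=range(2000, 2101), r=0.03, base_year=2000):
        """Discounted sum of annual damages (in % of GDP-years)."""
        return sum(damage_fn(warming(y)) / (1 + r) ** (y - base_year) for y in years)

    for name, fn in [("linear", damage_linear), ("cubic", damage_cubic)]:
        print(f"{name:6s}: 2010 damage {fn(warming(2010)):.2f}%, "
              f"2100 damage {fn(warming(2100)):.2f}%, "
              f"discounted total {present_value(fn):.0f}")

Under these assumptions the cubic function produces much smaller damages in the near term, so its discounted total is considerably lower even though its end-of-century damages are higher.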

Some studies explore the implications of more nonlinear impact functions. For instance, Manne and Richels (1995) use a "hockey-stick" function that suddenly turns upward at arbitrarily chosen thresholds. Such studies are designed to reflect relatively small impacts before 2xCO2 and rapidly worsening impacts beyond 2xCO2. In this analysis, it is economically efficient to stabilize CO2 concentrations, but the desired level of stabilization depends on the shape of the hockey stick and the location of its kink. Other analyses, which rely on more linear impact functions, have difficulty justifying concentration stabilization at any level.
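
A hockey-stick specification can be sketched as a piecewise function that grows slowly below an assumed threshold and steeply beyond it. The example below is a minimal illustration in the same spirit; the threshold of 2.5 degC and both slopes are arbitrary placeholders, standing in for the arbitrarily chosen thresholds described above.

    # Illustrative hockey-stick damage function: roughly proportional to warming
    # below an assumed threshold, rising steeply once the threshold is passed.
    # The threshold (2.5 degC) and both slopes are arbitrary placeholders.

    def damage_hockey_stick(t, threshold=2.5, slope_low=0.3, slope_high=8.0):
        """Damages as % of world GDP for warming t (degC above pre-industrial)."""
        if t <= threshold:
            return slope_low * t
        return slope_low * threshold + slope_high * (t - threshold)

    for t in (1.0, 2.0, 2.5, 3.0, 3.5):
        print(f"warming {t:.1f} degC -> damages {damage_hockey_stick(t):.2f}% of GDP")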

19.5.4.3. Rate of Change

Although most impact studies focus on the level of climate change, the rate of climate change generally is believed to be an important determinant, in many instances because it affects the time that is available for adaptation. Again, the paucity of underlying impact studies forces integrated assessors to use exploratory modeling. Under most "business-as-usual" scenarios, the rate of climate change is greater in the short run than in the long run because emissions increase faster in the short run; this is even more pronounced in emission reduction policy scenarios. Indeed, in considering the rate of change, tolerable window and safe-landing analyses (Alcamo and Kreileman, 1996; Toth et al., 1997; Petschel-Held et al., 1999) often find the rate of change to be the binding constraint in the first half of the 21st century.
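
The idea that the rate of change can bind before the level does is easy to illustrate. In the sketch below, a warming path is checked against two guardrails, a maximum rate of warming per decade and a maximum total warming; the path and both guardrail values are assumptions chosen for illustration, not results from the tolerable-window or safe-landing studies cited.

    # Illustrative check of a warming path against a rate guardrail and a level
    # guardrail, in the spirit of tolerable-window analysis. The path and both
    # guardrail values (0.15 degC/decade, 2.0 degC total) are assumptions.

    MAX_RATE_PER_DECADE = 0.15   # degC per decade (assumed guardrail)
    MAX_LEVEL = 2.0              # degC above pre-industrial (assumed guardrail)

    # Assumed path: faster warming early in the century, slower later.
    path = {2000: 0.6, 2025: 1.1, 2050: 1.5, 2075: 1.8, 2100: 2.0}

    years = sorted(path)
    for y0, y1 in zip(years, years[1:]):
        rate = (path[y1] - path[y0]) / (y1 - y0) * 10   # degC per decade
        rate_ok = rate <= MAX_RATE_PER_DECADE
        level_ok = path[y1] <= MAX_LEVEL
        print(f"{y0}-{y1}: rate {rate:.2f} degC/decade "
              f"({'ok' if rate_ok else 'VIOLATED'}), "
              f"level {path[y1]:.1f} degC ({'ok' if level_ok else 'VIOLATED'})")

In this invented example the rate guardrail is exceeded early in the century while the level guardrail is not, which is the pattern the tolerable-window studies describe.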

19.5.4.4. Discount Rate and Time Horizon

Aggregate models suggest that the most severe impacts of climate change will occur further in the future. The chance of large-scale discontinuities (thermohaline circulation, West Antarctic ice sheet) also is higher in the future. The outcome of policy analysis therefore is sensitive to the weight afforded to events occurring in the remote future. In other words, estimates are sensitive to the choice of time horizon (Cline, 1992; Azar and Sterner, 1996; Hasselmann et al., 1997) and the discount rate (i.e., the value of future consumption relative to today's value). The literature on discounting is reviewed in Portney and Weyant (1999) and in TAR WGIII Chapter 7. Numerical analysis (e.g., Tol, 1999a) has shown that estimates of marginal damage (i.e., the additional damage caused by an extra ton of emissions) can vary by as much as a factor of 10 for different (and reasonable) assumptions about the discount rate. This makes the discount rate the second-most important parameter determining marginal damage estimates. The most important parameter is the degree of cooperation in reducing emissions (Nordhaus and Yang, 1996; Tol, 1999b).
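
The sensitivity to the discount rate can be shown with a back-of-the-envelope calculation. The sketch below discounts an assumed stream of damages caused by a one-off emission pulse; the shape and size of the stream are placeholders, but the roughly order-of-magnitude spread between a low and a high discount rate is the point being illustrated.

    # Illustrative sensitivity of marginal damage (damage per extra tonne of
    # carbon) to the discount rate. The damage stream is an assumed placeholder:
    # damages that grow for a century and then decay as the pulse is slowly
    # removed from the atmosphere.

    def marginal_damage_stream(years=200):
        """Assumed $/tC damage in each year after the emission pulse."""
        return [0.5 * min(t, 100) * (0.99 ** max(0, t - 100)) for t in range(1, years + 1)]

    def discounted_marginal_damage(rate):
        stream = marginal_damage_stream()
        return sum(d / (1 + rate) ** t for t, d in enumerate(stream, start=1))

    for rate in (0.0, 0.01, 0.03, 0.05, 0.10):
        print(f"discount rate {rate:4.0%}: marginal damage ${discounted_marginal_damage(rate):7.1f}/tC")

With these placeholder numbers, moving from a 1% to a 5% discount rate lowers the marginal damage estimate by roughly an order of magnitude, which is the kind of spread reported in the text.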

19.5.4.5. Welfare Criteria

How impacts are compared (i.e., the relative weight assigned to impacts in different regions and at different times) is one of the most sensitive aspects of aggregate analysis. With the exception of the discount rate, little explicit attention is paid to this aspect of climate change impacts, although studies differ considerably in their implicit assumptions. Fankhauser et al. (1997) and Azar (1999) are among the few studies that make their aggregation assumptions explicit. They find that, in general, the higher the concern about the distribution of the impacts of climate change, the more severe the aggregate impacts. Fankhauser's (1995) estimate of the annual global damage of 2xCO2, for instance, is based on the implicit assumption that people are neutral with respect to distribution (that is, losses to the poor can be compensated by equal gains to the rich) and risk (that is, a 1:1,000,000 chance of losing $1 million is equivalent to losing $1 with certainty). If these assumptions are replaced with standard risk aversion or mild inequity aversion, the global damage estimate increases by about one-third (Fankhauser et al., 1997). Marginal impacts are more sensitive: for the same changes in assumptions, Tol (1999a) finds a three-fold increase in the marginal damage estimate. The sensitivity of aggregate impact estimates is further illustrated in Figure 19-4.
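
The effect of aggregation assumptions can be illustrated with a toy example. The sketch below aggregates regional damages in two ways: as a simple sum, and with equity weights of the form (average per capita income / regional per capita income) raised to an assumed inequality-aversion parameter. The regions, incomes, damages, and aversion parameter are invented for illustration and are not the figures used by Fankhauser et al. (1997), Azar (1999), or Tol (1999a).

    # Illustrative equity-weighted aggregation of regional damages. All numbers
    # (incomes, damages, inequality aversion) are invented for illustration.

    regions = {
        # region: (per capita income, $/yr; damages, % of regional GDP; GDP, $trillion)
        "high-income": (30000, 1.0, 25.0),
        "middle-income": (5000, 1.5, 8.0),
        "low-income": (1000, 3.0, 2.0),
    }

    def aggregate_damage(inequality_aversion):
        """Sum of regional damages, each weighted by (avg income / regional income)^e."""
        # Unweighted average of the three regional per capita incomes, for simplicity.
        avg_income = sum(inc for inc, _, _ in regions.values()) / len(regions)
        total = 0.0
        for income, dmg_pct, gdp in regions.values():
            weight = (avg_income / income) ** inequality_aversion
            total += weight * dmg_pct / 100 * gdp
        return total  # $trillion per year

    for e in (0.0, 0.5, 1.0):
        print(f"inequality aversion {e}: aggregate damage {aggregate_damage(e):.2f} trillion $/yr")

With these invented numbers the aggregate more than doubles as the inequality-aversion parameter moves from 0 to 1, because damages falling on the poorest region receive a much larger weight.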

19.5.4.6. The Treatment of Uncertainty

Sensitivity analysis is the standard approach to deal with impact uncertainty. Some studies, however, have gone one step further and explicitly model uncertainty as a hedging problem. The premise underlying these models is that today's policymakers are not required to make once-and-for-all decisions binding their successors over the next century. There will be opportunities for mid-course adjustments. Climate negotiations are best viewed as an ongoing process of "act, then learn." Today's decisionmakers, in this view, must aim at evolving an acceptable hedging strategy that balances the risks of premature actions against those of waiting too long.
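
The "act, then learn" framing can be made concrete with a two-stage decision sketch. In the example below, a near-term abatement level must be chosen before it is known whether damages turn out low or high; after the uncertainty is resolved, abatement can be adjusted. All costs, damages, and probabilities are invented placeholders, and the example is far simpler than the models used in the Energy Modeling Forum study discussed below.

    # Illustrative two-stage "act, then learn" hedging problem. A first-period
    # abatement level is chosen before learning whether damages are low or high;
    # second-period abatement is then chosen optimally for each outcome.
    # All numbers (costs, damages, probabilities) are invented placeholders.

    ABATEMENT_LEVELS = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]   # fraction of emissions cut
    SCENARIOS = {"low damages": (0.95, 1.0), "high damages": (0.05, 20.0)}  # (probability, damage scale)

    def abatement_cost(a):
        return 50 * a ** 2            # convex abatement cost, $billion

    def damage(a1, a2, scale):
        remaining = (1 - a1) + (1 - a2)   # stylized cumulative emissions over two periods
        return scale * 10 * remaining ** 2

    def expected_cost(a1):
        """First-period abatement a1; second-period abatement chosen after learning."""
        total = 0.0
        for prob, scale in SCENARIOS.values():
            best = min(abatement_cost(a2) + damage(a1, a2, scale) for a2 in ABATEMENT_LEVELS)
            total += prob * (abatement_cost(a1) + best)
        return total

    best_a1 = min(ABATEMENT_LEVELS, key=expected_cost)
    print(f"hedging (expected-cost-minimizing) first-period abatement: {best_a1:.0%}")
    for a1 in ABATEMENT_LEVELS:
        print(f"  a1 = {a1:.0%}: expected cost {expected_cost(a1):6.1f} $billion")

In this stylized setting the low-probability, high-consequence scenario pulls the preferred near-term abatement above what the most likely scenario alone would justify, which is the essence of a hedging strategy.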

The first step, then, is to determine the sensitivity of today's decisions to major uncertainties in the greenhouse debate. How important is it to be able to predict impacts for the second half of this century? Or to know what energy demands will be in 30 years and identify the technologies that will be in place to meet those demands? An exhaustive analysis of these questions has yet to be undertaken, but considerable insight can be gleaned from an Energy Modeling Forum study conducted several years ago (EMF, 1997). In the study, seven modeling teams addressed a key consideration in climate policymaking: concerns about events with low probability but high consequences.

The study assumed that uncertainty would not be resolved until 2020. Two parameters were varied: the mean temperature sensitivity factor and the cost of damages associated with climate change and variability. The unfavorable high-consequence scenario was defined as the top 5% of each of these two distributions. Two surveys of expert opinion were used to choose the distributions of these variables (for climate sensitivity, see Morgan and Keith, 1995; for damages, see Nordhaus, 1994a).

The analysis showed that the degree of hedging depends on the stakes, the odds, and nonimpact parameters such as society's attitude toward risk and the cost of greenhouse insurance. Also critical is the timing of the resolution of key uncertainties. This underscores the importance of scientific research.
