
Working Group II: Impacts, Adaptation and Vulnerability



15.2.3.1.5. Role of changing water resources

Although several studies have examined the potential implications of climate change for streamflows and water delivery reliability from reservoir systems in regions where irrigated agriculture is now important (see Section 15.2.1), there have been few direct analyses of the economic impacts of changes in water availability on irrigated agriculture. Some assessments of the impacts of climate change on agriculture in North America have relied on optimistic assumptions regarding the availability of irrigation water to offset precipitation deficiencies (Mendelsohn et al., 1994). Other studies have attempted to estimate the impacts of projected climate change on potential irrigation water use. A study of potential climate change impacts on irrigation water use in the United States concluded, "The greatest impact of a warmer climate on the agricultural economy will be in the West where irrigators will be hard put to maintain even present levels of irrigation" (Peterson and Keller, 1990). That conclusion is based on first-order impacts of reduced water availability and does not consider possible earnings from the sale or lease of water rights.

Studies of the impacts of drought events may provide useful insights into the impacts of substantial changes in seasonal streamflows that may result from climate warming—particularly in western North America, where mountain snowpacks now sustain streamflows into the summer months (see Section 15.2.1). However, the impacts of short-term droughts are an imperfect analog to long-term impacts of a drier climate because farmers are likely to adjust crop choices and farming practices as they acquire experience with any new climate regime.

Under some scenarios, demand for irrigation water declines (e.g., as a result of more rapid crop maturation and/or increased growing-season precipitation). Scenarios investigated for the U.S. National Assessment (Reilly et al., 2000) suggest that agricultural demand for water resources would decline nationwide on the order of 5-10% by 2030 and 30-40% by 2090, with land under irrigation showing declines of similar magnitude. Crop yield studies generally project more favorable outcomes for rainfed than for irrigated production, along with declines in water demand on irrigated land. Such adaptations could help to relieve some of the stress on regional water resources by freeing water for other uses (Hurd et al., 1999). However, the interplay between changes in irrigation demand and changes in water supplies has not been fully assessed.
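As a rough illustration of how these scenario figures scale between the two reporting dates, the following Python sketch linearly interpolates the nationwide demand declines given for 2030 and 2090. The linear path and the helper function are assumptions made only for illustration; they are not part of the Reilly et al. (2000) analysis.

def demand_decline_range(year, lo_2030=5.0, hi_2030=10.0,
                         lo_2090=30.0, hi_2090=40.0):
    """Return an assumed (low, high) percent decline in nationwide
    agricultural water demand for a year between 2030 and 2090,
    interpolating linearly between the two scenario anchor points."""
    frac = (year - 2030) / (2090 - 2030)
    lo = lo_2030 + frac * (lo_2090 - lo_2030)
    hi = hi_2030 + frac * (hi_2090 - hi_2030)
    return lo, hi

for year in (2030, 2060, 2090):
    lo, hi = demand_decline_range(year)
    print(f"{year}: {lo:.1f}-{hi:.1f}% decline in agricultural water demand")

Under this assumed linear path, a mid-century year such as 2060 would fall roughly midway between the two reported ranges; the actual trajectory between the scenario dates was not reported.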

15.2.3.1.6. Carbon sequestration

North American soils have lost large quantities of carbon since they first were converted to agricultural systems, leaving carbon levels in agricultural soils at about 75% of those in native soils (Bruce et al., 1999c). Because carbon in agricultural soils is a manageable pool, it has been proposed that these soils be managed to sequester carbon from atmospheric CO2.

The rate at which carbon is lost has subsided for most agricultural soils, and carbon levels in some soils have been maintained or even begun to increase as conservation farming practices have been adopted in the past 15-20 years. On cultivated land, these practices include conservation tillage (i.e., reduction or elimination of tillage) and residue management, use of winter cover crops, elimination of summer fallow, and methods to alleviate plant-nutrient and water deficiencies and increase primary production (Lal et al., 1998). Revegetation of marginal lands and modified grazing practices on pastures can be used to increase soil carbon levels. On degraded soils, preventing and controlling erosion and reducing salinization help to maintain or increase soil carbon. Greater adoption of these measures in the United States and Canada could result in agricultural soils more effectively capturing carbon from atmospheric CO2.

However, agricultural practices that are effective in building soil carbon may also result in greater emissions of other GHGs (e.g., N2O). Therefore, research is needed to weigh the positive and negative effects of building up soil carbon against the overall goal of reducing GHG emissions. Moreover, implementation of such mitigation strategies and their effects on adaptation need further evaluation from the perspective of practical economics and land management decisions (see Box 15-1).

Some scenario studies suggest that interactions between soil and atmosphere will form a positive feedback as temperatures increase: higher temperatures will accelerate decomposition of soil carbon, increasing CO2 emissions from soil, which will enhance the greenhouse effect and cause still higher temperatures. However, there is evidence that negative feedback mechanisms also exist. Some experiments indicate that more primary production is allocated to roots as atmospheric CO2 rises (Schapendonk et al., 1997), and these roots decompose more slowly than those grown at ambient CO2 levels (Van Ginkel et al., 1997). Recent comprehensive analyses of field data from forest soils suggest that increased temperature alone will not stimulate decomposition of forest-derived carbon in mineral soil (Giardina and Ryan, 2000).
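The positive feedback described above can be made concrete with a minimal one-pool model in which decomposition is first-order and its rate rises with temperature according to a Q10 factor, a common convention in soil carbon modeling. The sketch below is purely illustrative: the function, parameter values, and step-warming setup are assumptions chosen for clarity, not taken from the studies cited here.

def simulate_soil_carbon(c0=100.0, k_ref=0.02, q10=2.0,
                         warming=3.0, inputs=2.0, years=50):
    """Track a soil carbon stock (arbitrary units) for `years` years
    under a step warming of `warming` degrees C above the reference
    temperature at which the decay rate k_ref applies. All values
    are hypothetical."""
    k = k_ref * q10 ** (warming / 10.0)  # warming speeds decomposition
    c = c0
    for _ in range(years):
        c += inputs - k * c  # litter inputs minus first-order decay
    return c

print(f"No warming:   {simulate_soil_carbon(warming=0.0):.1f}")
print(f"+3 C warming: {simulate_soil_carbon(warming=3.0):.1f}")

In this toy setup the unwarmed pool sits at equilibrium (inputs balance decay), while the warmed pool loses carbon toward a lower equilibrium, releasing CO2 along the way; the negative feedbacks discussed above (greater root allocation, slower root decomposition) would act to offset that loss.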

Analysis of yield trends for 11 major crops over the period 1939-1994 indicates that average rates of yield increase ranged from about 1% to more than 3% yr-1 (Reilly and Fuglie, 1998). Conservative extrapolation of these trends implies average annual yield increases for the 11 crops of 0.7-1.3% yr-1 between 1994 and 2020; more optimistic estimates of growth rates suggest yield increases as high as 3% yr-1. These yield increases could lead to substantial increases in soil carbon if crop residues are retained.
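The extrapolation amounts to compound annual growth. The short Python sketch below works through the arithmetic for the conservative and optimistic rates over the 1994-2020 horizon; only the growth rates come from the text, and annual compounding is a standard assumption rather than the method of Reilly and Fuglie (1998).

YEARS = 2020 - 1994  # 26-year extrapolation horizon from the text

def cumulative_growth(rate_pct, years=YEARS):
    """Total percent yield increase after compounding annually."""
    return ((1.0 + rate_pct / 100.0) ** years - 1.0) * 100.0

for rate in (0.7, 1.3, 3.0):
    print(f"{rate}% per year over {YEARS} years -> "
          f"{cumulative_growth(rate):.0f}% cumulative increase")

Compounded over 26 years, the conservative rates imply cumulative yield gains of roughly 20-40%, and the optimistic 3% yr-1 rate more than doubles yields, which indicates the scale of additional crop residue potentially available for soil carbon buildup.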
