A general circulation model (GCM) aims to describe geophysical flow by integrating a variety of fluid-dynamical, chemical, or even biological equations that are either derived directly from physical laws (e.g. Newton's law) or constructed by more empirical means. There are both atmospheric GCMs (AGCMs) and ocean GCMs (OGCMs). When coupled together (along with other components such as a sea ice model and a land model) an ocean-atmosphere coupled general circulation model (AOGCM) forms the basis for a full climate model.
A recent trend in GCMs is to extend them to become Earth system models, that would include such things as submodels for economic growth and its effect on carbon dioxide emissions.
Nevertheless, for over a decade the major uncertainties in climate prediction have not been reduced at all: progress has been made, but additional uncertainties and unknowns have been found along the way. The models also contain many adjustable parameters, while the observational data available to constrain them are limited, so the models can always be brought into agreement with the data.
Note that many simpler levels of climate model exist; some are of only heuristic interest; others continue to be scientifically relevant.
Structure
Three (or more properly, four) dimensional GCMs discretise the equations for fluid motion and integrate these forward in time. They also contain parametrisations for processes - such as convection - that occur on scales too small to be resolved directly. More sophisticated models may include representations of the carbon and other cycles.
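As a minimal sketch of what "discretise and integrate forward in time" means, the following Python fragment steps a one-dimensional advection equation forward with a simple upwind finite difference; this is an illustration only, not the scheme any real GCM uses, and the grid spacing, wind speed and time step are arbitrary choices.

    import numpy as np

    # One-dimensional advection of a quantity q by a constant wind u,
    # discretised with an upwind difference in space and a forward (Euler)
    # step in time.  Real GCMs solve the full three-dimensional equations
    # with far more sophisticated numerics.
    nx, dx, dt, u = 100, 100e3, 600.0, 10.0            # 100 points, 100 km spacing, 10 m/s wind
    q = np.exp(-((np.arange(nx) - 50.0) ** 2) / 50.0)  # an initial "blob" to be advected

    for step in range(100):                       # integrate forward in time
        q[1:] -= u * dt / dx * (q[1:] - q[:-1])   # upwind finite difference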
A simple general circulation model (SGCM), a minimal GCM, consists of a dynamical core, for example the primitive equations, energy input into the model, and energy dissipation in the form of scale-dependent friction, so that atmospheric waves with the highest wavenumbers are the ones most strongly attenuated. Such models may be used to study atmospheric processes within a simplified framework but are not suitable for future climate projections.
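A sketch of the scale-dependent friction described above is given below; the fourth-power ("hyperdiffusion") damping law and the time-scale parameter are illustrative assumptions, since specific models use different formulations.

    import numpy as np

    def damp_spectrum(coeffs, wavenumbers, dt, tau_min=0.1):
        # Attenuate spectral coefficients at a rate that grows with wavenumber,
        # so the waves with the highest wavenumbers are damped most strongly.
        # tau_min is the e-folding time of the highest wavenumber (illustrative).
        rate = (wavenumbers / wavenumbers.max()) ** 4 / tau_min
        return coeffs * np.exp(-rate * dt)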
Atmospheric GCMs (AGCMs) model the atmosphere (and typically contain a land-surface model as well) and impose sea surface temperatures. A large amount of information, including model documentation, is available from AMIP. They may include atmospheric chemistry.
- AGCMs consist of a dynamical core which integrates the equations of fluid motion, typically for:
- surface pressure
- horizontal components of velocity in layers
- temperature and moisture in layers
- parametrisations handle other processes; these include:
- radiation (solar/short wave and terrestrial/infra-red/long wave)
- convection
- land surface processes and hydrology
The method by which AGCMs discretise the fluid equations may be the familiar finite difference method or the somewhat harder-to-understand spectral method. Typical AGCM resolution is between 1 and 5 degrees in latitude or longitude: the Hadley Centre model HadAM3, for example, uses 2.5 degrees in latitude and 3.75 degrees in longitude, giving a grid of 73 by 96 points, and has 19 levels in the vertical. This results in approximately 500,000 "basic" variables, since each grid point has four variables (u,v,T,Q), though a full count would give more (clouds; soil levels).
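The variable count quoted above can be checked with a back-of-the-envelope calculation (illustrative arithmetic only, using the HadAM3 figures given in the text):

    lat_points = int(180 / 2.5) + 1   # 73 rows of latitude (including both poles)
    lon_points = int(360 / 3.75)      # 96 columns of longitude
    levels = 19                       # vertical layers
    vars_per_point = 4                # u, v, T, Q

    print(lat_points * lon_points * levels * vars_per_point)  # 532608, i.e. roughly 500,000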
A GCM will contain a number of prognostic variables that are directly integrated (e.g. surface pressure) together with a number of diagnostic variables which are deduced from these (e.g. the 1.5 metre temperature, used to compare with observations, is deduced from the surface and lowest-model-layer temperatures).
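As a crude sketch of how such a diagnostic might be formed, the following linearly blends the surface and lowest-layer temperatures; the linear interpolation and the 20 m layer height are illustrative assumptions, not the scheme used by any particular model (which would apply surface-layer similarity theory).

    def screen_temperature(t_surface, t_lowest_layer, z_lowest_layer, z_screen=1.5):
        # Diagnostic 1.5 m ("screen") temperature from a simple linear blend
        # of the surface and lowest-model-layer temperatures (illustrative only).
        w = z_screen / z_lowest_layer
        return (1.0 - w) * t_surface + w * t_lowest_layer

    print(screen_temperature(288.0, 286.5, 20.0))  # about 287.9 K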
Oceanic GCMs (OGCMs) model the ocean (with fluxes from the atmosphere imposed) and may or may not contain a sea ice model. The standard resolution of HadOM3 is 1.25 degrees in latitude and longitude, with 20 vertical levels, leading to approximately 1,500,000 variables.
Coupled atmosphere-ocean GCMs (AOGCMs) (e.g. HadCM3) combine the two models. They thus have the advantage of removing the need to specify fluxes across the interface of the ocean surface. These models are the basis for sophisticated model predictions of future climate, such as are discussed by the IPCC.
AOGCMs represent the pinnacle of complexity in climate models and internalise as many processes as possible. They are the only tools that could provide detailed regional predictions of future climate change. However, they are still under development. The simpler models are generally susceptible to simple analysis and their results are generally easy to understand. AOGCMs, by contrast, are often as hard to analyse as the real climate system.
Flux correction
Early generations of AOGCMs required a somewhat ad hoc process of "flux correction" to achieve a stable climate. More recent models do not require this. "Flux correction" was always felt to be an undesirable feature of the models by those that wrote them (as well as by those that did not like them) and was resorted to out of necessity. One of the principal (valid) objections to flux corrections was that they amounted to a "tuning" of the models towards the current climate and made future projections less reliable. The model improvements that now make flux corrections unnecessary are various, but include improved ocean physics, improved resolution in both atmosphere and ocean, and a better fit between atmosphere and ocean models. Most recent simulations show "plausible" agreement with the measured temperature anomalies over the past 150 years when forced by observed changes in greenhouse gases and aerosols, but better agreement is achieved when natural forcings are also included.
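The general idea of a flux correction can be sketched as follows (an illustration of the concept only, not any particular model's scheme): a fixed correction field, diagnosed once as the mismatch between the surface fluxes needed to maintain a realistic climatology and those the coupled model actually produces, is simply added at the air-sea interface at every coupling step.

    def compute_flux_correction(required_climatological_flux, modelled_climatological_flux):
        # Fixed correction field, diagnosed once from a calibration run.
        return required_climatological_flux - modelled_climatological_flux

    def corrected_surface_flux(modelled_flux, correction):
        # Applied at every coupling step.  Because the correction is independent
        # of the evolving model state, it amounts to a "tuning" towards the
        # current climate - the main objection noted above.
        return modelled_flux + correction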
IPCC scenarios
Coupled ocean-atmosphere GCMs are used to project/predict future temperature changes under various scenarios. The IPCC TAR figure 9.3 shows the global mean response of 19 different coupled models to an idealised experiment in which CO2 is increased at 1% per year. Figure 9.5 shows the response of a smaller number of models to more realistic climate forcing - although the forcings used depend on projections of future CO2 emissions, which are themselves uncertain.
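For orientation, the idealised "1% per year" experiment compounds so that CO2 doubles after roughly 70 years, as a quick calculation shows (illustrative arithmetic only):

    import math

    years_to_double = math.log(2) / math.log(1.01)  # 1% compound growth per year
    print(round(years_to_double, 1))                # about 69.7 years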
Accuracy of models that predict global warming
Whether these models are sufficiently "correct" to be useful or not is a matter of debate. GCMs are capable of reproducing the general features of the observed global temperature over the past century. Thousands of climate researchers around the world use climate models to understand the climate system and publish thousands of papers about this work in peer reviewed journals, and a part of this research is devoted to improving the models. A rather more complete discussion of climate models is provided by the IPCC TAR chapter 8, Model Evaluation (2001); among the findings summarised there:
- The model mean exhibits good agreement with observations.
- The individual models often exhibit worse agreement with observations.
- Many of the non-flux adjusted models suffered from unrealistic climate drift up to about 1°C/century in global mean surface temperature.
- The errors in model-mean surface air temperature rarely exceed 1°C over the oceans and 5°C over the continents; precipitation and sea level pressure errors are relatively greater but the magnitudes and patterns of these quantities are recognisably similar to observations.
- Surface air temperature is particularly well simulated, with nearly all models closely matching the observed magnitude of variance and exhibiting a correlation > 0.95 with the observations.
- Simulated variance of sea level pressure and precipitation is within ±25% of observed.
- All models have shortcomings in their simulations of the present day climate of the stratosphere, which might limit the accuracy of predictions of future climate change.
- There is a tendency for the models to show a global mean cold bias at all levels.
- There is a large scatter in the tropical temperatures.
- The polar night jets in most models are inclined poleward with height, in noticeable contrast to an equatorward inclination of the observed jet.
- There is a differing degree of separation in the models between the winter sub-tropical jet and the polar night jet.
- For nearly all models the r.m.s. error in zonal- and annual-mean surface air temperature is small compared with its natural variability.
- There are problems in simulating natural seasonal variability (2000):
- In flux-adjusted models, seasonal variations are simulated to within 2 K of observed values over the oceans. The corresponding average over non-flux-adjusted models shows errors up to ~6 K in extensive ocean areas.
- Near-surface land temperature errors are substantial in the average over flux-adjusted models, which systematically underestimates (by ~5 K) temperature in areas of elevated terrain. The corresponding average over non-flux-adjusted models forms a similar error pattern (with somewhat increased amplitude) over land.
- In Southern Ocean mid-latitudes, the non-flux-adjusted models overestimate the magnitude of January-minus-July temperature differences by ~5 K due to an overestimate of summer (January) near-surface temperature. This error is common to five of the eight non-flux-adjusted models.
- Over Northern Hemisphere mid-latitude land areas, zonal mean differences between July and January temperatures simulated by the non-flux-adjusted models show a greater spread (positive and negative) about observed values than results from the flux-adjusted models.
- The ability of coupled GCMs to simulate a reasonable seasonal cycle is a necessary condition for confidence in their prediction of long-term climatic changes (such as global warming), but it is not a sufficient condition unless the seasonal cycle and long-term changes involve similar climatic processes.
- Coupled climate models do not simulate clouds and some related hydrological processes (in particular those involving upper tropospheric humidity) with reasonable accuracy. Problems in the simulation of clouds and upper tropospheric humidity remain worrisome because the associated processes account for most of the uncertainty in climate model simulations of anthropogenic change.
There is a debate over how to reconcile climate model predictions that upper air (tropospheric) warming should be greater than surface warming with observations, some of which appear to show otherwise (see also satellite temperature record). Possible explanations include errors in the surface or upper-air records, a problem with the models, or simply the shortness of the observed record. This problem is less acute than it was a few years ago, because (1) all versions of the satellite record now show warming, and (2) there are now multiple versions of the satellite temperature record, some of which show warming greater than that observed at the surface.
The precise magnitude of future changes in climate is still uncertain, with a range of +1.4°C to +5.8°C for the temperature change between 1990 and 2100 (note: this range also includes a slight cooling due to sulphate aerosols; for CO2 alone the numbers would be higher). Much of this uncertainty results from not knowing future CO2 emissions, but there is also uncertainty about the accuracy of climate models, chiefly in the areas of clouds and aerosols, which have the potential to significantly change the net effect of CO2, although it is not known whether these uncertainties cause the models to under- or overpredict future climate change. The paleoclimate data suggest that the impact of CO2 should be lower than the models predict, although the paleo data reflect long-term equilibria and cannot rule out higher temperature increases within the lifespans of humans living today.
The computer models cannot decide among the variable drivers, like solar versus lunar change, or chaos versus ocean circulation versus greenhouse gas increases. Unless they can explain these things, the models cannot be taken seriously as a basis for public policy.
Forecasts of climate change are inevitably uncertain. A basic problem with all such predictions to date has been the difficulty of providing any systematic estimate of uncertainty, a problem that stems from the fact that these models do not necessarily span the full range of known climate system behavior.
Relation to weather forecasting
The global climate models used for climate projections are very similar in structure to (and often share computer code with) numerical models for weather prediction but are nonetheless logically distinct: see climate vs weather for details.
Most weather forecasting is done on the basis of interpreting GCM output. Since forecasts are short - typically a few days or a week - such models do not usually contain an ocean model but rely on imposed SSTs. They also require accurate initial conditions to begin the forecast - typically these are taken from the output of a previous forecast, with observations blended in. Because the results are needed quickly, the predictions must be run in a few hours; but because they only need to cover a week of real time, these predictions can be run at higher resolution than in climate mode. Currently ECMWF runs at 40 km resolution, as opposed to the 100-200 km scale used by typical climate models. Nested models can be run, forced by the global models for boundary conditions, to achieve higher local resolution: the Met Office runs a mesoscale model with an 11 km resolution covering the UK.
Using the output from these models, together with satellite pictures and conventional data, weather forecasting may examine the effects of a single storm front moving in several different ways across part of a continent.
Formerly, weather forecasting also used methods which have less similarity to climate studies. Simply searching past records for situations similar to current conditions upwind of a city can produce a forecast based upon what usually happens in that city, but this "analogues" approach is now less common. More often, an experienced forecaster will use local knowledge to interpret the model output. A river valley or lake may produce local effects that matter for local forecasting but are too small to affect continental-scale patterns.