This paper was assessed as part of the Monash Computer Science honours subject: Epistemology of computer simulation.

Climate Change Simulation
Guiding Policy or Policy Driven?

Kymberly Fergusson
Computer Science & Software Engineering
Monash University, Clayton, Australia
email: kef@csse.monash.edu.au
© 2003, Kymberly Fergusson, Melbourne, Australia

August 3, 2003

Abstract

There are many difficulties associated with modelling and simulation in general, and with the simulation of climate change in particular, causing heated debates worldwide. The impact of even a small global change may be drastic and varied, potentially affecting everything on the planet. Environmental policy is a controversial issue, and often pits politicians and scientists against one another. The global nature of climate change, the delay between cause and effect of human influence on a system of such scale, and the potentially disastrous consequences of climate change make this area of simulation one of the most frequently discussed in scientific, social and political contexts. This paper covers some of the problems associated with climate change modelling and simulation, and discusses some of the implications of basing policy decisions on the results of such simulations.

1 Introduction

Climate change modelling is one of the first truly global uses of modelling and simulation, relying on contributions from a large range of scientific disciplines: geology, physics, systems theory, chemistry, biology and more. Climate models range from simple weather forecasting models to Earth Systems models (ESMs). ESMs incorporate models of the world’s atmospheric processes, typically combined with oceanic models, to form a ‘complete’ global climate model.

An early global world model, not dealing with climate change, attempted to simulate population growth by incorporating models of environmental, social and economic relations. The relationships between these processes in the WORLD3 model were not well understood and were highly abstracted, sparking a huge debate about simulation in general  [MMRBI72, Imh00]. However, simple climate models are currently used to forecast regional weather and small-scale climatic events. This type of modelling has been in use for many years, and is continuing to be refined. Many people rely on these forecasts to go about their work and social lives, despite the continuing debate about climate simulation.

The Intergovernmental Panel on Climate Change (IPCC), consisting of a large group of scientists from around the world (experts on climate, social impacts and ecology), was formed in 1988 by the United Nations (UN). The IPCC is responsible for researching what the climate would have been had industrialisation not occurred, predicting the magnitude of the impact of human influence, discovering which aspects of human activity may affect the climate, and searching for ways to mitigate any negative changes resulting from human interaction. The report released by the IPCC in 1996 stated that “the balance of evidence suggests a discernible human influence on global climate”  [HMFC+96, 5].

As climate change modelling has both political and scientific impact, the possibility of huge social ramifications has caused the debate to migrate into the mass media. Many policy makers attempt to play down the issues surrounding the predicted global climate change. They typically request empirical confirmation of climate model predictions before accepting the IPCC’s recommendations for global environmental policies  [Edw99]. The risks of ignoring these recommendations must be weighed against the cost of mitigating them; however, many are unwilling to make any decisions because of the inherent uncertainties in climate modelling outcomes.

2 Climate change

The global climate is a huge collection of environmental processes with an extremely complex system of interactions. Most of these processes do not display drastic changes over a short period of time; the impact of human influence may only become measurable over years or even centuries. It is difficult to say whether any notable changes are natural or of human origin because there is no control system: we have one world, on which humans are experimenting without fully understanding the possible repercussions  [Sch92].

Without some model of atmospheric processes, the shape of the curve of global climate change could not be projected, and policy makers would have no forecasts at all about the likely direction and nature of global change  [Edw99]. The creation of any policy involves choosing a balance between the amount of risk and the cost of mitigating that risk. Any information that may help with that decision, including forecasts of the likely outcomes of various decisions, is of benefit (even if the forecasts are inaccurate).

The forecasts from simulations that are used when considering policy decisions have typically been shown to be fairly reliable in reproducing past trends (allowing for some inaccuracy); otherwise those simulations would be unlikely to be used. It is reasonable to assume that the forecasts will continue to show ‘reasonable’ trends, that is, trends that are likely to happen, even if the results are somewhat inaccurate at times. Climate change simulation results for past situations are fairly accurate and exhibit the current trend of global temperature rise, although some predictions are a little high. For policy decisions, some idea of future trends is better than none.

While knowing and accepting that climate change science is fraught with uncertainty and with incomplete and conflicting data, the scientists involved with the 1996 IPCC study concluded that humans are influencing the climate in many ways. In that case, it is useful to use simulations to explore how future climate trends may be worsened or mitigated by human action. The results produced by a variety of simulations should therefore be taken into consideration when designing environmental policies.

2.1 Global warming and the ozone layer

One of the more commonly known climate change ‘threats’ is global warming. This is based on the well-established scientific theory of the greenhouse effect, incorporating thermodynamics and chemical reaction theory. There are three main gases contributing to the greenhouse effect whose concentrations have increased since industrialisation: carbon dioxide (CO2), methane (CH4) and nitrous oxide (N2O). All are released mainly by human activities (the burning of fossil fuels, land-use change and agriculture)  [HMFC+96, Gre00]. The interaction of these gases with the atmosphere may force an overall rise in the average temperature (called positive radiative forcing)  [HMFC+96].

The depletion of the ozone layer, caused by chlorofluorocarbons (CFCs), has resulted in a negative radiative forcing. Stratospheric ozone depletion has slowed, as increases in CFC concentrations in the atmosphere are now negligible. This was due to an international policy, established within five years (1985 to 1990), to phase out CFCs  [Edw96]. Halocarbons, which replaced CFCs, warm the atmosphere at lower altitudes, but have a cooling effect when broken down at high altitudes  [Gre00]. However, the concentration of ozone in the troposphere is still increasing, leading to further positive radiative forcing in the lower levels of the atmosphere.

As these gases remain in the atmosphere for an extended period, they would continue to affect the climate long after their emissions were reduced to zero (even if that were possible).

2.2 Risks and policy making

As climate change may have extreme consequences, politicians need to be aware of the risks of not implementing policies based on the recommendations from the IPCC.

The risk of inaction where global warming is concerned is high. Even a change of a few degrees in the average world temperature may destroy habitats and cause the sea level to rise, flooding coastal cities. It may cause the water table to rise, decreasing arable farming land through salinity. Some areas may actually develop a more moderate climate, while others may become more extreme. Longer, higher floods and longer periods of drought may affect regional ecological systems, causing some species of plants and animals to flourish and others to become extinct. The change in temperature may provide a more hospitable environment for some diseases and a less hospitable one for others. Other rapid changes in regional climates (such as those caused by the destruction of the Amazon forests) may have irreversible regional effects, the extinction of local flora and fauna being the most obvious  [Ben92].

Implementing procedures to reduce greenhouse gases will not have an immediate effect, as these gases remain in the atmosphere for hundreds of years, but if implemented globally, such a policy would probably lessen the effect in the future. Ignoring the risks and continuing with the current rate of increase in greenhouse gases would increase the greenhouse effect and global warming in the future. All of this would create a huge cost to global society, both socially and economically.

The direct risk to human life may not be extreme, perhaps a small rise in the mortality rate during the more extreme summer temperatures, but that may be balanced by a reduction in winter mortality rates  [Gre00]. It is uncertain how a rise in temperature may affect natural disasters such as typhoons, cyclones and tornadoes, but this may pose some risk to human life.

The cost of action would be to research and implement alternative fuel sources, as well as more ecologically sensitive farming and industrial development methods. This would have a higher immediate cost than continuing current practices. There would also be social implications for those countries which rely on fossil fuel production for economic viability.

As the complex interaction of climatic processes is not well understood, there is much uncertainty when discussing possible influences and impacts. Scientists are obviously limited when predicting indirect impacts and researching possible procedures to ameliorate the negative consequences of climate change  [Gre00].

However, these risk/benefit analyses are crucial to policy development in all fields, and uncertainty of cause and effect also exists, to some extent, in every field of policy development. Politicians continually deal with uncertainty and make value judgements when planning and making policy decisions. The main difference in the case of global climate change is the time span between cause and effect, so that policies, with their associated risks and costs, need to be considered over several generations  [SRSW99].

Uncertainty should not cause inaction, as the long-term risks of ignoring current observed climate change trends are very great  [Bru96]. Policy initiation does not and should not depend on a clear and accurate prediction, but on a “rational decision process integrating goals, alternatives and policies with the best available scientific projections”  [Bro92a, 19].

Many opponents of environmental policy cry out for ‘proof’ that the climate is changing because of human interference  [Bas03]. However, as with any natural system, validation of a numerical model of that system is impossible, as such systems are not closed  [OSFB94]. Politicians make policy decisions based on simulations, but typically without proof that the projections are certain to occur. This is especially true for policy decisions based on economic forecasts, where simulations are used to explore the possible consequences of a number of policies  [Ben01]. Judgement and decision making are not and should not be affected by the lack of proof  [KG93, Ber91]. Nor should policy making rely on science alone  [Bro92b].

2.3 Measured changes

The IPCC found that the average global temperature had risen by 0.3 to 0.6°C over the past century, and night-time temperatures have shown a larger increase than daytime temperatures  [HMFC+96]. The average global sea level has risen between 10 and 25 cm over this time, which may be attributed to the rise in global temperature  [HMFC+96].

The temperature increase was not uniform, and was more pronounced over land than over the oceans. The higher levels of the atmosphere did not show an increase in temperature, despite many climate simulations predicting such a change  [Bas03, Ben92]. This may be attributed to the decrease in the stratospheric ozone layer or to volcanic activity  [Gre00].

The rise in temperature in the first half of the 20th century was most likely caused by an observed increase in solar activity; however, the warming observed in the remainder of that century is most probably the result of human activity  [Gre00].

The IPCC is one of the first to admit that the understanding and representation of climatic processes is incomplete  [HMFC+96, Bas03]. Much more research in this area needs to be conducted in order to understand the relationships and interactions between natural and human influences. This knowledge would contribute to the creation of more accurate climate models, which may then exhibit the unbalanced warming between the stratosphere and troposphere that has been observed.

3 Climate models

It is only fairly recently that a global view of the climate has become popular, and climate modelling is one of the earliest truly global applications of modelling and simulation which incorporates theories from a range of scientific disciplines.

An early global world model (WORLD3) utilised systems theory to simulate population growth in the real world system, but without modelling any climatic processes  [MMRBI72]. The ‘Limits to Growth’ debate centred on computer simulation and the validity of the assumptions made during the creation of the world system model  [Imh00]. The assumptions made to simplify models and simulations are still one of the most controversial issues in simulation today. This is especially true of climate change modelling, where the number and range of scale of environmental processes is huge, the processes themselves are not yet fully understood, and interprocess interaction is relatively unknown.

Computer models of climate change range from simple programs to ones that strain the capabilities of recent supercomputers. Simple “zero dimensional” models reduce the earth to a single point and compute a single global average temperature. Some two-dimensional models give an indication of the atmosphere’s vertical temperature structure.
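
As an illustration of the “zero dimensional” idea, the following sketch (a minimal toy, not a reconstruction of any published model) balances absorbed sunlight against emitted thermal radiation for a single-point Earth and steps forward in time until the temperature settles. The constants are standard textbook values chosen for illustration.

```python
# A minimal sketch of a "zero dimensional" model: the whole Earth is one point
# with a single average temperature. Constants are standard textbook values.

SOLAR_CONSTANT = 1361.0  # incoming solar radiation at the top of the atmosphere, W/m^2
ALBEDO = 0.3             # fraction of sunlight reflected straight back to space
EMISSIVITY = 0.62        # effective emissivity; crudely stands in for the greenhouse effect
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/m^2/K^4
HEAT_CAPACITY = 4.0e8    # effective heat capacity of the surface layer, J/m^2/K
DT = 86400.0             # time step of one day, in seconds

def step(temp_k: float) -> float:
    """Advance the global average temperature by one time step."""
    absorbed = SOLAR_CONSTANT * (1.0 - ALBEDO) / 4.0  # averaged over the sphere
    emitted = EMISSIVITY * SIGMA * temp_k ** 4        # outgoing longwave radiation
    return temp_k + DT * (absorbed - emitted) / HEAT_CAPACITY

temp = 273.0  # arbitrary cold start, in kelvin
for _ in range(365 * 50):  # 'spin up' for 50 model years
    temp = step(temp)
print(f"settled global average temperature: {temp:.1f} K ({temp - 273.15:.1f} C)")
```

With these values the model settles near 287 K (about 14°C), close to the observed global average, despite representing the whole planet as a single number; the need to ‘spin up’ to equilibrium also mirrors the settling behaviour of GCMs described below.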

Simpler models in use today simulate regional weather patterns and small-scale climatic events. More sophisticated models, Earth Systems models (ESMs), combine many different types of climatic models, such as Oceanic General Circulation models (OGCMs) and Atmospheric General Circulation models (AGCMs)  [SRSW99]. These highly complex models “may also include models of sea ice, snow cover, vegetation, agriculture, and other phenomena with important effects on climate”  [Edw01, 38].

3.1 Global climate models

Global Climate Models (GCMs) use a three-dimensional representation, with a horizontal surface resolution of between 250 and 500 km. The atmosphere is broken up into eight to twenty vertical ‘layers’ of varying depth, up to a height of about 20 km, creating a three-dimensional grid or lattice. There are “more layers at lower altitudes where the atmosphere is denser and more weather occurs”  [Edw01]. This is quite a low resolution compared with those employed in models used to forecast weather, called numerical weather prediction (NWP) models, which typically use a grid scale of under 1 km.

NWP models differ from GCMs in that they are predictive and are initialised with relatively complete recent observational data; they then predict the likely atmospheric activity over a short period (a couple of days to a week). In contrast, GCMs simulate climate and are initialised with a state built from geological data rather than recent observations. They then take some time to reach equilibrium and ‘settle’ into a stable climate.

Limited computing power requires the complexity of the model to be reduced so that the model can be represented on the computer and simulations remain feasible. Thus some less important processes are simply represented as parameters (a process called parameterisation), or the resolution of the model must be decreased.

Several phenomena are simulated in a climate model; one of the most important and most difficult is cloud formation, which occurs on a scale of 0-10 km. This is usually below the grid resolution of a GCM and is often parameterised. If done badly, a grid location could have a cloud formation that ‘blinks’: present in one time step and absent in the next  [Edw99]. Most other sub-grid-scale processes also require parameterisation, including atmospheric absorption of solar radiation.
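
The ‘blinking’ problem is easy to reproduce with a toy parameterisation (the thresholds below are hypothetical, not taken from any real GCM scheme): an all-or-nothing humidity threshold flips a grid cell between fully cloudy and fully clear as the humidity hovers near the threshold, whereas a fractional scheme responds smoothly.

```python
# Toy illustration of why a crude cloud parameterisation 'blinks'. The
# thresholds are hypothetical and do not come from any real GCM scheme.

def cloud_threshold(rel_humidity: float) -> float:
    """All-or-nothing cloud: fully cloudy above a hypothetical 80% threshold."""
    return 1.0 if rel_humidity > 0.80 else 0.0

def cloud_fraction(rel_humidity: float) -> float:
    """Fractional cover ramping smoothly between 60% and 100% humidity."""
    return min(max((rel_humidity - 0.60) / 0.40, 0.0), 1.0)

# Grid-cell humidity hovering around the threshold from one time step to the next.
humidity_series = [0.79, 0.81, 0.79, 0.82, 0.78]
print([cloud_threshold(h) for h in humidity_series])           # blinks: 0, 1, 0, 1, 0
print([round(cloud_fraction(h), 2) for h in humidity_series])  # varies smoothly
```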

A necessary modification, called ‘flux adjustment’, allows for the exchange of heat and water between the ocean and the atmosphere. This exchange occurs below the grid resolution and needs to be incorporated into the model. Many arguments centre on the need in most climate models for these arbitrary adjustments, and on the abstraction of processes to simple parameters  [SRSW99].

Ideally in a climate model, the only fixed conditions would be the layout of the land and oceans. “All other variables would be generated internally by the model itself from the lower-level physical properties of air, water and other basic constituents of the climate system”  [Edw99].

However, as can be seen, climate models are incredibly complex and, even with current computing power, require extreme simplifications to be feasible. Even though those simplifications may be incorrect or inaccurate, the outcomes of the simulation may still be valid. Explicit acknowledgement and acceptance of the uncertainty is important when making any claims or forecasts from such models.

3.2 The epistemology of climate change simulation

Climate change was originally considered a local or regional problem, and was rarely thought about on a global scale until the mid-1980s. It was equated with other regional environmental concerns, such as droughts, smog, earthquakes and cyclones, and was thought to similarly affect some areas more than others  [Edw01]. Even though weather had been understood as a global phenomenon since the early 1900s, it was not until the advent of computer-based regional climate models in the late 1970s that the shift to a global focus began.

Atmospheric science draws on various areas of physics, such as theories about gases (behaviour, radiation absorption, emission), Newton’s laws of motion, the thermodynamic energy equation, mass conservation and the hydrodynamic state equation  [NS01]. The theoretical foundations of meteorology began to be constructed at the end of the nineteenth century, and in the early 1900s Vilhelm Bjerknes, a Norwegian meteorologist, defined seven differential equations derived from existing basic physics. The simultaneous solution of these equations would predict large-scale atmospheric movements, but they remained incalculable until the advent of computers in the 1940s.

In 1946, von Neumann became interested in using computers to model and predict weather  [Asp90]. The ENIAC ran the first weather forecast in 1950, covering North America and a section of the surrounding ocean. In 1954 a separate project in Sweden first used computer models for real-time forecasting.

Weather models remained regional, or limited to a continent, and did not attempt to forecast more than a few days ahead, due to the limitations of the available computing power. In 1955 Norman Phillips created a two-layer computer general circulation model (GCM)  [Phi56].

The International Geophysical Year (1957-58) created the first global data exchange, so desperately needed for developing atmospheric models. The coverage was still sparse and uneven, but it began the shift in focus toward a global atmospheric model.

Models became more complex and contained fewer assumptions as computing power increased. The first global GCM was used by the European Centre for Medium Range Weather Forecasts (ECMWF) in 1979  [Edw01].

The first ESM was created in the mid-1990s, and a second type of global simulator, integrated assessment models (IAMs), began to be used for simulating the impacts of climate change on society at about the same time. IAMs use GCM outputs to simulate impacts and to formulate a cost-benefit assessment of mitigating the impacts of climate change. They forecast trends and compare policy scenarios, but they make only qualitative rather than quantitative predictions  [Edw01].

3.3 Verification and validation of models

For linear models, sensitivity analysis is performed by testing the outputs of one model against another to verify results. In climate science, however, this is not feasible due to model complexity and nonlinearity (changes in parameters produce their effects through complex feedback processes)  [NS01].

Instead, adjoint sensitivity analysis is used, where the simulation is modified after each run to diagnose the dependence of specific effects on individual parameters  [Cac81]. Individual components are isolated and tested against reality. This method cannot guarantee that the individual components are properly treated; for example, “the model may be good at predicting the average cloudiness but bad at representing cloud feedback”  [Sch92, 24].
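
The flavour of such testing can be sketched with a much simpler one-at-a-time perturbation scheme: each parameter is nudged in turn and the response of a chosen output is recorded. This is only a crude stand-in for the adjoint machinery, which obtains the same dependencies far more efficiently, and the two-parameter model below is entirely hypothetical.

```python
# Crude one-at-a-time sensitivity test; a simplified stand-in for adjoint
# analysis. The model and its parameters are hypothetical.

def toy_climate(co2_sensitivity: float, cloud_feedback: float) -> float:
    """Hypothetical model: global average temperature in degrees C."""
    return 14.0 + 1.8 * co2_sensitivity - 0.6 * cloud_feedback

baseline = {"co2_sensitivity": 1.0, "cloud_feedback": 0.5}
base_output = toy_climate(**baseline)

for name, value in baseline.items():
    perturbed = dict(baseline)
    perturbed[name] = value * 1.01  # 1% perturbation of one parameter at a time
    delta_output = toy_climate(**perturbed) - base_output
    print(f"d(output)/d({name}) ~= {delta_output / (0.01 * value):.2f}")
```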

A different method for investigating robustness tracks multiple-variable changes, creating ‘crucial experiments’ that eliminate all plausible alternative interpretations by tightly constraining the solution space (in effect, no plausible alternative model could produce the same signal)  [NS01].

Schneider  [Sch92] notes there are several different verification techniques in use, which, when used in combination, present a strong circumstantial case. The first test is checking the ability of the model to simulate today’s climate, in particular, the seasonal cycle. Most GCMs in use today map this very well.

The ability of the model to reproduce extreme climates should also be tested, using cases such as glacial-interglacial cycles, the Mesozoic Era, and even the climates of other planets (the hot conditions on Venus and the cold conditions on Mars)  [Sch92].
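
The first of Schneider’s tests, reproducing today’s seasonal cycle, amounts to comparing the model’s monthly climatology against observations. The sketch below does this with a root-mean-square error; all numbers are invented purely for illustration, whereas a real test would use gridded observational climatologies.

```python
# Sketch of the seasonal-cycle test: compare modelled monthly means against an
# observed climatology. All numbers are invented purely for illustration.

import math

observed = [12.1, 12.4, 13.0, 14.0, 15.1, 16.0,
            16.4, 16.2, 15.4, 14.3, 13.2, 12.4]  # monthly mean temperature, C
modelled = [11.8, 12.2, 13.1, 14.2, 15.3, 16.3,
            16.6, 16.1, 15.2, 14.0, 12.9, 12.1]

rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(modelled, observed)) / 12)
print(f"seasonal-cycle RMSE: {rmse:.2f} C")  # small relative to the cycle's amplitude
```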

4 Modelling and data

The global and extensive nature of climate modelling obviously requires an equivalently global and extensive collection of data, and a global and reliable collection system. Unfortunately, such a system still does not exist. Moreover, it is beyond unaided human comprehension to analyse such huge data sets (on the order of hundreds of terabytes).

4.1 Data and model filtered data

Data is mainly collected using satellite-based instrumentation, rockets, radiosondes, aircraft and conventional observing stations. Ideally, each point in the atmosphere would be monitored equally over time and compared with the other points, but this does not happen.

Raw climate data has a number of problems that need to be addressed before it is usable, the first and foremost being the uneven nature of the sampling over time and area. The raw data needs to be fed through a model to interpolate intermediate (unmeasured) values from known values and to fit the data to a three-dimensional grid and uniform time steps. This was done by hand until the process was automated in the 1960s, incorporating ‘smoothing’ to remove anomalous data points  [Edw01].
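
The interpolation step can be pictured with a deliberately simple sketch: scattered station readings (invented here) are spread onto regular grid points by inverse-distance weighting. Real ‘objective analysis’ schemes are far more sophisticated than this, but the shape of the problem is the same.

```python
# Deliberately simple sketch of the gridding step: scattered station readings
# are interpolated onto regular grid points by inverse-distance weighting.

stations = [  # (latitude, longitude, temperature in C) - invented readings
    (10.0, 20.0, 25.3), (12.0, 24.0, 24.1), (15.0, 21.0, 22.8),
]

def interpolate(lat: float, lon: float) -> float:
    """Estimate the value at an unmeasured grid point from nearby stations."""
    weights = weighted_sum = 0.0
    for s_lat, s_lon, temp in stations:
        dist = ((lat - s_lat) ** 2 + (lon - s_lon) ** 2) ** 0.5
        if dist < 1e-9:
            return temp  # the grid point coincides with a station
        weights += 1.0 / dist
        weighted_sum += temp / dist
    return weighted_sum / weights

# Fill a coarse 3x3 grid from the irregular station data.
for lat in (10.0, 12.5, 15.0):
    row = [interpolate(lat, lon) for lon in (20.0, 22.0, 24.0)]
    print([round(t, 1) for t in row])
```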

Other errors include urban heat bias as well as changes in measuring instrumentation and techniques  [Edw99, KKC93]. Therefore further models and data manipulation are required to correct for these errors and inconsistencies.

Opponents claim on this basis that predictions are based not on historical data, but on computer models of that data  [Bas03, Gie99]. Yet even where empirical data is inadequate, models can be used for sensitivity analysis and to guide further study  [OSFB94].

As can be seen, to get the data into a remotely usable state, it has to be fed through various intermediate models. This process occurs in any physical experiment, not only in computer simulation. Errors in instruments need to be corrected with some kind of filtering model, and artifacts caused by experimental controls need to be allowed for and the data altered accordingly; this is commonly referred to as data reduction. Calibration of measuring equipment is required to correct some measurement errors, so even experimental data has been corrected via one or more mathematical models. All experimental and simulation data has been reduced at some point. The main aims of “experiment control and data reduction are to produce data about uncontaminated real effects by removing artifacts introduced by instrumentation, observational environment and extraneous influences”  [NS01, 72].

Most natural systems are analogue, and measurements of these systems require discretisation, introducing round-off errors. Many of these errors may cause undetected artifacts in simulation results  [Win01]. As data correction and measurement discretisation are used in the majority of scientific, economic and political disciplines (mostly to account for human, measurement and equipment errors), using model-filtered data does not and should not invalidate that data, or models based on that data.
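
A tiny example makes the round-off point concrete: the decimal 0.1 has no exact binary representation, so even ten additions miss the exact answer, and over the billions of arithmetic operations in a long simulation such errors can accumulate into artifacts unless they are controlled.

```python
# The decimal 0.1 cannot be represented exactly in binary floating point, so
# repeated addition quietly drifts away from the exact answer.
total = sum([0.1] * 10)
print(total == 1.0)      # False
print(f"{total:.17f}")   # 0.99999999999999989
```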

5 Why is climate modelling so controversial?

Many opponents of climate modelling cite parameterisation, low resolution and uncertainty as the main reasons to object to the IPCC’s claims, specifically attacking the models and modelling techniques used.

5.1 Resolution

Theoretically, climate modelling should perform better at the resolutions commonly used by NWP models; however, the computation required to calculate the subsequent states of the GCM increases steeply with resolution. A trade-off is therefore necessary between a model’s resolution and its complexity (the number of phenomena simulated and the level of detail of the model)  [Edw99].
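
The rough arithmetic behind this trade-off can be sketched as follows, assuming (as a common rule of thumb, not a statement about any particular model) that halving the horizontal grid spacing quadruples the number of grid cells and forces a proportionally shorter time step for numerical stability.

```python
# Back-of-envelope cost of refining a GCM grid, assuming cost scales with the
# number of horizontal cells times the number of time steps. Exact scaling
# varies between models; this is a rule-of-thumb illustration only.

def relative_cost(grid_km: float, baseline_km: float = 300.0) -> float:
    refinement = baseline_km / grid_km
    horizontal_cells = refinement ** 2  # latitude x longitude
    time_steps = refinement             # stability limit shrinks the time step
    return horizontal_cells * time_steps

for grid_km in (300.0, 150.0, 75.0, 1.0):
    print(f"{grid_km:>5.0f} km grid: ~{relative_cost(grid_km):,.0f}x the computation")
```

On this crude accounting, halving the grid spacing costs roughly eight times the computation, which is why GCM resolutions lag so far behind those of short-range weather models.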

Unfortunately, one of the main problems with climate modelling is that many processes occur beneath the grid scale, cloud formation being the obvious example: clouds tend to span only a few kilometres, while current GCM resolutions are in the realm of hundreds of kilometres  [Sch92].

5.2 Parameterisation and tuning

All sub-grid-scale phenomena must be represented parametrically, meaning that “virtually all physical processes operating in the atmosphere require parameterisation in models”  [Kie92, 336]. Climate modellers decide how to parameterise a process by reviewing the expert literature and making educated ‘guesses’. If a physical relation is found, these parameters are described as physically based; otherwise, ad hoc schemes are often used.

Even the ad hoc schemes are validated by isolating individual parameters, or groups of parameters, and performing sensitivity tests of the system against reality. This is done to ensure that the system and its parameters behave as closely to reality as possible.

Parameterised models require calibration, just as physical instruments do; for models this process is called ‘tuning’. It is a very difficult process, as many of the parameters interact with many others in complex ways. Tuning should bring the model’s output closer to the actual data, making it a more accurate model. Tuning is often done during verification to ensure that a model’s parameters cause the system to behave as closely to reality as possible.
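
In miniature, tuning can be pictured as searching for the parameter value that minimises the mismatch between model output and observations. The sketch below does this for a single hypothetical ‘feedback’ parameter with a simple grid search; real tuning juggles many interacting parameters at once and involves a great deal of expert judgement.

```python
# Miniature picture of 'tuning': grid-search one hypothetical parameter so the
# model output best matches observations. All numbers are invented.

observed = [13.8, 14.0, 14.1, 14.3]  # invented decadal mean temperatures, C

def model(decade: int, feedback: float) -> float:
    """Hypothetical one-parameter model of decadal mean temperature."""
    return 13.8 + 0.15 * feedback * decade

def mismatch(feedback: float) -> float:
    """Sum of squared differences between model output and observations."""
    return sum((model(i, feedback) - obs) ** 2 for i, obs in enumerate(observed))

candidates = [0.5 + 0.1 * k for k in range(16)]  # feedback values 0.5 .. 2.0
best = min(candidates, key=mismatch)
print(f"best-fitting feedback parameter: {best:.1f} (error {mismatch(best):.4f})")
```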

Even though parameterisation of cloud formation is an unrealistic representation of this process, the model may still exhibit the same average cloud cover and temperature interaction, providing the parameters are properly tuned. This supports Bedau’s view that unrealistic models may accurately portray complex systems  [Bed99]. Norton also supports the view that “simulations may embody false or unsubstantiated simplifying assumptions and still detect real effects”  [NS01, 91].

5.3 Flux adjustments and tuning

Flux adjustment is another method of ‘tweaking’ which most GCMs require, as the exchange of heat, momentum and water between the atmosphere and ocean must be allowed for  [Edw99]. When atmospheric GCMs and oceanic GCMs are coupled, the climate of the coupled model tends to drift into a substantially different state than when the component models are run separately  [SRSW99]. The drift may appear as a steady drop in ocean surface temperature or salinity. This is problematic, as there are feedback processes that depend on those characteristics, and sometimes the size of the drift is larger than the greenhouse climate change response  [SRSW99]. It is therefore important to test both coupled and un-coupled systems to ensure consistent results  [KKC93].
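
A toy one-variable sketch (with invented numbers, standing in for no real model) shows the shape of the problem: a small spurious heat loss introduced by the coupling drags the ocean surface temperature away from its observed value, and a constant flux adjustment cancels the drift.

```python
# Toy sketch of coupling drift: an imperfect atmosphere-ocean coupling acts as
# a small spurious heat loss, and a constant flux adjustment cancels it out.
# All numbers are invented and stand in for no real model.

def run(years: int, flux_adjustment: float) -> float:
    sst = 15.0                           # ocean surface temperature, observed value 15 C
    for _ in range(years):
        coupling_bias = -0.02            # spurious heat loss from the coupling, C per year
        restoring = 0.05 * (15.0 - sst)  # weak natural feedback toward the observed state
        sst += coupling_bias + restoring + flux_adjustment
    return sst

print(f"no adjustment, 100 years: {run(100, 0.00):.2f} C")  # drifts ~0.4 C cold
print(f"with flux adjustment:     {run(100, 0.02):.2f} C")  # held at 15.00 C
```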

Flux adjustments in coupled climate models violate thermodynamics and thus have no physical basis  [SRSW99], and they can introduce additional errors into the model  [Sch96]. This encouraged the National Centre for Atmospheric Research (NCAR) to develop a model which does not require flux adjustments. This model performs on a par with other models by incorporating the effects into the parameterisation and passing them to the larger model scale. It has been suggested that this is a more realistic method of mixing heat through the ocean than any earlier model employed  [Ker97].

5.4 Summary

Many of the reasons these skeptics give for rejecting climate modelling could equally be applied to all scientific experimentation and knowledge, rendering all existing scientific knowledge ‘invalid’.

Uncertainty is present in any measurement or experiment, and consequently nearly all theoretical and scientific knowledge is uncertain to some extent. In almost all areas of climate change simulation there are obvious causes of uncertainty and, in some cases, incorrect representations. Instrumental errors, anomalous readings, inconsistent data, data correction models, extreme simplification of atmospheric processes, the partitioning of the atmosphere into grid squares and layers, and the difficulty of empirically validating any simulation outcome all contribute to the skeptics’ arsenal of complaints.

However, in all science there is uncertainty of some magnitude, and that does not invalidate the scientific knowledge. Care must be taken by modellers to accept and make explicit the uncertainty inherent in climate models, and to limit claims to those effects that are well above the grid scale and that are robust real effects (ones that do not change with different model scenarios)  [Edw99, NS01]. Claims should be suitably qualified and vague, and should be limited to constrained solution spaces  [Ros75].

6 Conclusion

Climate modelling and simulation is one of the most complex areas of simulation, requiring global data sets and enormous computing power to generate climate change projections that humans can interpret. As with any area that depends on extensive data filtering and on models that are highly dependent on abstractions, there is a level of uncertainty in simulation outcomes.

Computational limits currently constrain the complexity of GCMs in a number of ways. Grid resolution is quite coarse, which necessitates the parameterisation of sub-grid-scale phenomena. Flux adjustments are commonly used to represent the heat transfer between the atmosphere and ocean, although these adjustments have no physical basis. Both parameterisation and flux adjustments require tuning, and verification that the system conforms to reality over a wide range of possible climates and does not drift into implausible representations.

As seen in the previous section, climate data is incomplete and riddled with errors, which is why the data requires ‘massaging’ to be useful.

However, although more is needed, the coverage of data gathering has increased tremendously, and the data provided is of a much higher standard, partly due to more reliable measuring equipment. The need for flux adjustments is declining as methods for parameterising the heat transfer process are developed. As computing power increases, the resolution can become finer, and more processes can be modelled completely.

Verification and validation of the models will remain a problem, but if several independent or different techniques are used to ‘match’ the performance of the model against reality, a compelling circumstantial case can be established. When many different models conform to the same outcomes when environmental forcing is applied (such as increases in greenhouse gases), and are all comprehensively verified, climate change simulation may become more respected.

Any area requiring inductive reasoning, measurement or experimentation is subject to uncertainty, but that does not mean that the knowledge gained from those areas is invalid; if it did, all scientific knowledge would be invalid. Proper verification techniques can and should be used to constrain model solutions and to highlight robust, non-artifactual outcomes.

However, these simulations do not provide ‘proof’ that human interaction is affecting the climate, or in fact that the climate is changing. Policy decisions are rarely made on the basis of absolute facts, but instead on likely outcomes, which climate change simulation can currently provide. Those who are holding out for high degrees of empirical confirmation are supported by those policy makers who want to delay action  [Edw99].

Decisions can be made, and are made all the time, despite uncertainty. Politicians continuously make value judgements, weighing the risks and costs of action against those of inaction. It is common sense to allow climate change scientists to advise the policy makers who are creating (or avoiding) environmental policy decisions that may have a minor or drastic effect on future generations. An informed policy decision is obviously better than one made without any understanding of global atmospheric processes and possible future trends in climate change.

References

[Asp90]   W. Aspray. John von Neumann and the Origins of Modern Computing. MIT Press, Cambridge, MA, 1990.

[Bas03]   J. Bast. Eight reasons why ‘global warming’ is a scam. Heartland Institute, 2003. http://www.heartland.org/Article.cfm?artId=11548 accessed 19/05/2003.

[Bed99]   M. Bedau. Can unrealistic computer models illuminate theoretical biology? In A. Wu, editor, Proceedings of the 1999 Genetic and Evolutionary Computation Conference Workshop Program, pages 20-23, Orlando, Florida, July 1999.

[Ben92]   L. Bengtsson. Climate system modeling prospects. In K. Trenberth, editor, Climate System Modeling, pages 705-725. Cambridge University Press, 1992.

[Ben01]   K. Benoit. Simulation methodologies for political scientists. The Political Methodologist, 10(1):12-16, 2001.

[Ber91]   J. Bernabo. Letter to the editor. Science, 251:1475, June 1991.

[Bro92a]   G. E. Brown Jr. Global change and the new definition of progress. Geotimes, pages 19-21, June 1992.

[Bro92b]   G. E. Brown Jr. The objectivity crisis. American Journal of Physics, 60:779-781, September 1992.

[Bru96]   R. Brunner. Policy and global change research. Climatic Change, 32:121-147, 1996.

[Cac81]   D. G. Cacuci. Sensitivity theory for nonlinear systems: I. Nonlinear functional analysis approach. Journal of Mathematical Physics, 22:2784-2802, 1981.

[Edw96]   P. Edwards. Global comprehensive models in politics and policymaking. Climatic Change, 32:149-161, 1996.

[Edw99]   P. Edwards. Global climate science, uncertainty and politics: Data-laden models, model-filtered data. Science as Culture, 8(4):437-472, 1999.

[Edw01]   P. Edwards. Representing the global atmosphere: Computer models, data and knowledge about climate change. In C. Miller and P. Edwards, editors, Changing the Atmosphere: Expert Knowledge and Environmental Governance, pages 31-65. MIT Press, Cambridge, MA, 2001.

[Gie99]   R. Giere. Using models to represent reality. In L. Magnani, N. Nersessian, and P. Thagard, editors, Model-Based Reasoning in Scientific Discovery, pages 41-57. Kluwer Academic/Plenum Publishers, New York, 1999.

[Gre00]   K. Green. Exploring the Science of Climate Change. Reason Public Policy Institute, 2000.

[HMFC+96]   J. T. Houghton, L. G. Meira-Filho, A. Callander, N. Harris, A. Kattenberg, and K. Maskell, editors. Climate Change 1995: The Science of Climate Change, volume 1 of 3. Cambridge University Press, Cambridge, England, 1996.

[Imh00]   P. Imhof. Computer simulation in the controversy over limits to growth. Technical report, Technische Universitaet Hamburg-Harburg, 2000.

[Ker97]   R. Kerr. Model gets it right - without fudge factors. Science, 276(5315):1041, 1997.

[KG93]   G. Kleindorfer and R. Ganeshan. The philosophy of science and validation in simulation. In G. Evans, M. Mollaghasemi, E. Russell, and W. Biles, editors, Proceedings of 1993 Winter Simulation Conference, pages 50-57, 1993.

[Kie92]   J. T. Kiehl. Atmospheric general circulation modeling. In K. E. Trenberth, editor, Climate System Modeling. Cambridge University Press, Cambridge, 1992.

[KKC93]   T. Karl, R. Knight, and J. Christy. Global and hemispheric temperature trends: Uncertainties related to inadequate spatial sampling. Journal of Climate, 7:1144-1163, 1993.

[MMRBI72]   D. Meadows, D. Meadows, J. Randers, and W. Behrens III. The Limits to Growth. Universe Books, New York, 1972.

[NS01]   S. Norton and F. Suppe. Why atmospheric modeling is good science. In C. Miller and P. Edwards, editors, Changing the Atmosphere: Expert Knowledge and Environmental Governance, pages 88-133. MIT Press, Cambridge, MA, 2001.

[OSFB94]   N. Oreskes, K. Shrader-Frechette, and K. Belitz. Verification, validation, and confirmation of numerical models in the earth sciences. Science, 263(5147):641-646, 1994.

[Phi56]   N. A. Phillips. The general circulation of the atmosphere: A numerical experiment. Quarterly Journal of the Royal Meteorological Society, 82(352):123-164, 1956.

[Ros75]   A. Rosenberg. The virtues of vagueness in the languages of science. Dialogue: Canadian Philosophical Review, 14:281-305, 1975.

[Sch92]   S. Schneider. Introduction to climate modeling. In K. Trenberth, editor, Climate System Modeling, pages 3-26. Cambridge University Press, 1992.

[Sch96]   E. Schneider. Flux correction and the simulation of changing climate. Annales Geophysicae, 14:336-341, 1996.

[SRSW99]   S. Shackley, J. Risbey, P. Stone, and B. Wynne. Adjusting the policy expectations in climate change modeling: An interdisciplinary study of flux adjustments in coupled atmosphere-ocean general circulation models. Report 48, MIT, May 1999.

[Win01]   E. Winsberg. Simulated Experiments: Methodology for a Virtual World. PhD thesis, University of South Florida, 2001.