Predicting changes: Testing climate model accuracy
Scientists use climate models to investigate the behaviour of the climate system and to simulate how the climate may respond to changes, including the increase in atmospheric greenhouse gases caused by human activities. To enable scientists to assess their accuracy, computer models are also used to simulate historical climate periods. Scientists examine data from model simulations of the present and 20th-century climate and compare them with observations from the same period. The models aren’t perfect, but their general performance at simulating climate patterns and trends gives scientists confidence in the models’ ability to represent the climate.
Computer models are used in many fields, from biology to aeroplane design. Some models rely on statistics and extrapolations from past behaviour, such as those used to predict the behaviour of financial markets. Other models are based on knowledge of the underlying principles governing the way a system works, such as car safety tests or simulations of the human body. Climate models are built on scientific understanding of the physical laws governing the climate’s behaviour. Scientists test their knowledge by comparing their expectations of how certain processes work with real-world measurements of those processes. Scientific understanding isn’t perfect, but climate models incorporate the best representations they can of the complex interactions between our planet’s atmosphere, oceans, land, ice and vegetation.
Despite their many limitations, climate models have demonstrated the ability to simulate what scientists consider to be the important features of the climate system. Prompted by the daily and seasonal sunlight cycles, models reproduce the corresponding daily and seasonal cycles of temperature, air pressure, rainfall and snow and ice cover observed in the real world. The models also exhibit the dominant spatial patterns found in our climate, such as the differences in temperature and rainfall at different latitudes, and the large-scale circulation of the atmosphere and oceans. Natural climate variability – such as the El Niño/La Niña cycle – also arises spontaneously in climate models.
Driving climate models with the measured changes in natural and human influences on 20th-century climate helps scientists assess the models’ ability to simulate the climate’s response to changes such as increasing greenhouse gases and variations in solar output. Comparing model output with surface temperature measurements over the same period, scientists found that the models reproduced – within the uncertainties – the observed long-term global warming trend. The models also correctly simulated the observed warming trends over each individual continent, though there are larger uncertainties at regional scales. Some models have also been tested by feeding in the aerosols emitted by large volcanic eruptions; these runs correctly simulated the 0.5°C drop in global surface temperature that followed the Mount Pinatubo eruption in 1991.
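The kind of comparison described above – checking whether a model’s simulated warming trend matches the observed one within uncertainty – can be sketched very simply. The snippet below fits a least-squares trend to two annual temperature-anomaly series and tests whether they agree within an assumed tolerance. Both series and the ±20% tolerance are synthetic illustrations, not real observational or model data.

```python
# Illustrative comparison of modelled vs observed warming trends.
# The anomaly series below are synthetic placeholders, not real records.

def linear_trend(years, values):
    """Least-squares slope, in degrees C per year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    var = sum((x - mean_x) ** 2 for x in years)
    return cov / var

years = list(range(1901, 2001))
# Synthetic "observed" anomalies: ~0.7 C warming over the century plus wiggles.
observed = [0.007 * (y - 1901) + 0.05 * ((y % 7) - 3) / 3 for y in years]
# Synthetic "model" anomalies: a slightly different underlying trend.
modelled = [0.0065 * (y - 1901) + 0.05 * ((y % 5) - 2) / 2 for y in years]

obs_trend = linear_trend(years, observed)   # deg C per year
mod_trend = linear_trend(years, modelled)

# Judge the model consistent if its trend lies within an assumed
# observational uncertainty band (here +/-20%, purely illustrative).
consistent = abs(mod_trend - obs_trend) <= 0.2 * abs(obs_trend)
print(f"observed trend: {obs_trend * 100:.2f} C/century")
print(f"modelled trend: {mod_trend * 100:.2f} C/century")
print("consistent within assumed uncertainty:", consistent)
```

Real model evaluation compares ensembles of simulations against gridded observational datasets, but the underlying question is the same: does the simulated trend fall within the observational uncertainty?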
Surface air temperature strongly influences many features of our climate. Scientists think the global warming trend observed over the 20th century also led to some changes in rainfall. It’s more difficult to measure global rainfall than global temperature, because rainfall has more variability and requires a greater number of weather stations to build an accurate record. Nevertheless, measurements show increases in average rainfall at mid to high northern latitudes and decreases at some tropical latitudes. Most climate models show these same patterns of rainfall change. Measurements also show an increase in average rainfall intensity, which is consistent with what scientists expect in a warming world. Model simulations show a similar intensification of the water cycle.
All global climate models have been tested by simulating the climate over the 20th century, but some have also simulated the climate of the past 1000 years. This requires more time and computer resources than the standard 20th-century simulation, but gives insights into longer-term model performance. Scientists must rely on proxy data to reconstruct global temperatures from times before direct measurements began, resulting in larger uncertainties. But many independent analyses have indicated that the pre-1900 global climate was mostly stable, with a slight warming in medieval times followed by a slight cooling. Model simulations produce global variations that are broadly consistent with these long-term temperature records, giving scientists further confidence in the models’ ability to represent the climate system.
Climate models are the most sophisticated tools available to predict how our climate could change. But scientists recognise the models’ limitations and are careful to use not only a range of different models but also a range of different prediction methods. Climate models predict that the increase in global average surface temperature following a doubling of carbon dioxide – known as the ‘climate sensitivity’ – lies somewhere in the range 1.5–4.5 °C. This is in broad agreement with estimates derived from basic physics or from examining past climate changes. There are many uncertainties, but the fact that these very different methods produce such similar predictions gives scientists confidence in the models’ ability to predict how the climate could respond to increasing greenhouse gas levels.
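The “basic physics” estimate mentioned above can be sketched in a few lines. A standard simplified expression gives the radiative forcing from a change in CO₂ concentration as 5.35 × ln(C/C₀) W/m², so a doubling yields roughly 3.7 W/m²; multiplying by a climate sensitivity parameter (in °C per W/m²) gives the equilibrium warming. The parameter values used below are assumptions chosen to bracket the quoted 1.5–4.5 °C range, not a definitive calculation.

```python
import math

def co2_forcing(concentration_ratio):
    """Simplified logarithmic forcing expression for CO2, in W/m^2."""
    return 5.35 * math.log(concentration_ratio)

forcing_2x = co2_forcing(2.0)  # forcing from a CO2 doubling, ~3.7 W/m^2

# Equilibrium warming = sensitivity parameter (C per W/m^2) * forcing.
# The parameter range below (0.4 to 1.2) is an assumed illustration
# chosen to span the 1.5-4.5 C climate-sensitivity range quoted above.
for lam in (0.4, 0.8, 1.2):
    print(f"lambda = {lam:.1f} C/(W/m^2) -> warming = {lam * forcing_2x:.1f} C")
```

The wide spread in the sensitivity parameter, rather than in the forcing itself, is where most of the uncertainty lies: it reflects feedbacks such as water vapour, clouds and ice cover, which the models attempt to represent explicitly.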