New data visualizations from the NASA Center for Climate Simulation and NASA’s Scientific Visualization Studio at Goddard Space Flight Center, Greenbelt, Md., show how climate models used in the new report from the United Nations’ Intergovernmental Panel on Climate Change (IPCC) estimate possible temperature and precipitation pattern changes throughout the 21st century.
For the IPCC’s Physical Science Basis and Summary for Policymakers reports, scientists referenced an international climate modeling effort to study how the Earth might respond to four different scenarios of carbon dioxide and other greenhouse gas emissions throughout the 21st century. The Summary for Policymakers, the first official piece of the group’s Fifth Assessment Report, was released Fri., Sept. 27.
This modeling effort, called the Coupled Model Intercomparison Project Phase 5 (CMIP5), includes dozens of climate models from institutions around the world, including from NASA’s Goddard Institute for Space Studies.
The model has not changed today. It has not changed in 35 years. It has not changed in 100 years. Better data, same model.
Science is all about creating models of the natural world. Better data leads to better models that more accurately reflect the reality of the natural world.
Science works by endlessly trying to construct a better model. It does this through competition between multiple models. The model that best explains all the data, that best predicts results in advance, and that best survives attempts to falsify it usually wins.
That does not mean it is a perfect representation of reality. Few things humans do can claim that.
It is simply the best model we frail, imperfect humans have now.
Sometimes better data leads to a major shift in the scientific model. We saw this with Copernicus, Galileo, Boyle, Darwin, Einstein and Feynman.
Most times, though, better data simply leads to a greater understanding of the best scientific model, a minor shift. A good model is resilient.
We now have exponentially better data for climate change models than we had 35 years ago, which in turn was exponentially better than the data available 100 years ago.
The basic scientific model, presented by the great scientist Svante Arrhenius, was proposed over 100 years ago. As I wrote 3 years ago:
I have read the Arrhenius paper. It provides an estimate of global temperatures based on increasing or decreasing carbon dioxide levels. No one has shown that his basic premise is wrong: increasing carbon dioxide levels increase global temperatures.
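The relationship Arrhenius proposed can be sketched in a few lines of code. This is a minimal illustration, not his original calculation: the forcing coefficient (5.35 W/m² per e-fold of CO2, from a commonly used modern approximation) and the climate sensitivity value (0.8 °C per W/m², a representative mid-range figure) are illustrative assumptions, not numbers from this article.

```python
import math

# Illustrative values, not from the original Arrhenius paper:
FORCING_COEFF = 5.35   # radiative forcing, W/m^2 per e-fold of CO2 (assumed)
SENSITIVITY = 0.8      # warming in C per W/m^2 of forcing (assumed)

def warming_estimate(co2_ppm, baseline_ppm=280.0):
    """Estimated equilibrium warming (C) relative to a pre-industrial baseline."""
    forcing = FORCING_COEFF * math.log(co2_ppm / baseline_ppm)
    return SENSITIVITY * forcing

# Because the relation is logarithmic, each doubling of CO2 adds the
# same increment of warming, regardless of the starting concentration.
print(round(warming_estimate(560), 2))  # 560 ppm = double pre-industrial -> 2.97
```

The key property, which Arrhenius himself noted, is the logarithmic shape: doubling CO2 from any starting point produces roughly the same temperature increase.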
Over 100 years ago the first scientific model of the effect of carbon dioxide on global temperatures was proposed. In the 1970s, increased data gathering and the use of computers allowed researchers to take another look at this proposal.
Here is a graph based on a model built 35 years ago, from the data then available, of what temperatures would do with increasing carbon dioxide. It is a primitive model based on very primitive data. We can also see just how well this primitive model predicted its future and our past:
Not bad at all.
None of the data discussed 35 years ago changed the basics of the model proposed 100 years ago. The data just provided greater granularity, allowing us to gauge the extent of change over time. Not too bad a job for the computing equivalent of an abacus.
The primitive data we had 35 years ago actually did a pretty remarkable job of predicting the future. If we had made policy bets based on that model, we would have been correct.
Unfortunately we did not make many policy bets. We waited for better data. Then we waited again. And again.
None of that data has shifted the model that is over 100 years old. Exponentially better data have not shifted it. Exponentially greater data have not shifted it.
The current model is the best at fitting all the data. It has shown a tremendous ability to predict the future. And it has stood up to decades of falsification attempts.
It is based on a model that was first proposed before airplanes, before the Model T, before the NFL, before women could vote. Before the 20th century had even begun.
There simply is no better model. More data will not substantially alter this model.
It is not only the best we have right now. It is the best we have had for 100 years. Nothing suggests that waiting another 10, 35, or 100 years will alter its predictions.
The chance of it being totally wrong is vanishingly low, and getting lower every year. The chance of it being mostly correct is tremendously high, and getting higher every year.
We have the tools to act on its predictions. We have the ability to shift the path of the future. With all our oars in the water, humanity can find a solution that works.
But waiting for better data accomplishes nothing. Denying the model accomplishes nothing. Just as it did 35 years ago. Just as it did 100 years ago.