Climate Scientists Get an A For Their Warming Predictions

The key metric in all models of the earth’s climate is sensitivity. That is, how much will the globe warm as we keep dumping greenhouse gases into the atmosphere? Conventionally it’s quoted as the amount of warming you get if the CO2 in the air doubles. If sensitivity is low, we have little to worry about. If sensitivity is high, we’re well on our way to broiling ourselves to death.
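If you want a feel for what that number means in practice, here’s a minimal Python sketch. It uses the standard rule of thumb that CO2’s warming effect scales roughly with the logarithm of its concentration; the function name, the 420 ppm and 280 ppm concentrations, and the 1.5–4.5 degree sensitivity range are all just illustrative, not anything taken from the paper.

```python
import math

def projected_warming(sensitivity_c, co2_ppm, baseline_ppm=280.0):
    """Equilibrium warming implied by a given climate sensitivity.

    Sensitivity is quoted as degrees C of warming per doubling of CO2,
    and CO2 forcing grows roughly logarithmically with concentration,
    so the warming scales with log2(current / baseline).
    """
    return sensitivity_c * math.log2(co2_ppm / baseline_ppm)

# Illustrative numbers only: roughly 420 ppm today against a ~280 ppm
# preindustrial baseline, and a commonly cited 1.5-4.5 C sensitivity range.
for s in (1.5, 3.0, 4.5):
    print(f"sensitivity {s:.1f} C/doubling -> {projected_warming(s, 420):.2f} C of warming")
```

Run it and you can see why the number matters so much: the same atmosphere gives you under a degree of warming at the low end of the range and well over two and a half degrees at the high end.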
Naturally, then, it’s important to get this right. Today, a new paper was released that reviews how accurate climate scientists have been at determining this, and the answer is that they’ve been remarkably good at it. Here’s the original chart from the paper, which covers 15 models that have been published since 1970:
This is a little hard to follow, so I’ve created an unauthorized version that shows how far off each model has been in percentage terms:
As you can see, once you get past the very earliest crude models, the climate community has done pretty well. With only a couple of exceptions, their models have predicted sensitivity to within ±20 percent or so. The average error across the modern models is -11 percent, which means (a) the models have been very close to reality, and (b) if anything, the models have run a little low. The earth is actually warming faster than they predicted.
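For anyone who wants to check my arithmetic, here’s a rough Python sketch of how the percentage figures in my chart are computed. The model names and warming trends below are made up purely for illustration; they are not the numbers from the paper.

```python
def percent_error(modeled, observed):
    """How far a model's projected warming is from the observed warming,
    in percent. Negative means the model ran cooler than reality."""
    return 100.0 * (modeled - observed) / observed

# Hypothetical warming trends in degrees C per decade, purely for
# illustration; these are not the figures from the paper.
models = {"Model A": 0.15, "Model B": 0.20, "Model C": 0.17}
observed_trend = 0.18

errors = {name: percent_error(trend, observed_trend) for name, trend in models.items()}
for name, err in errors.items():
    print(f"{name}: {err:+.0f}%")
print(f"average error: {sum(errors.values()) / len(errors):+.0f}%")
```

A negative average, like the -11 percent across the modern models, means the typical model slightly underestimated how fast the planet would warm.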
Moral of the story: listen to the climate scientists. Their models are pretty good, and there’s little reason to think they’ve missed anything important. Keep this in mind when your skeptic friends start going on about urban heat islands or solar cycles or whatnot. Because guess what? Climate scientists know about all these things too! Some of them don’t matter, and the ones that do have already been incorporated into current models. Climate change is real.