It was before 5:00 AM PDT on Thursday morning when I heard from The Weather Channel’s chirpy Stephanie Abrams that the debate was settled: “global warming is real.” The way we know this is that 300 scientists have come out with a new report that there has been a 1 degree Celsius increase in the global average temperature over the last 50 years.
That is, in fact, what she said: 1 degree Celsius. Somebody’s bound to have it recorded somewhere. I don’t remember how many times she said it, but it was more than once. I’m sure it was simply a burble, an inadvertent misstatement, because as I confirmed later on the web, the claimed temperature increase is 1 degree Fahrenheit, not 1 degree Celsius. This is a significant difference in the scope of the claim, of course, degrees Fahrenheit representing smaller increments than degrees Celsius (1 degree F being about 0.56 degrees C). The increase in global average temperatures thus sounds a whole lot like the 0.6C advertised to us over the last decade, until discrepancies in NOAA’s data processing and the ClimateGate revelations gave us the parody-ready expression “hide the decline.”
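The conversion at issue is simple arithmetic. For a temperature *change* (as opposed to an absolute reading), the 32-degree offset between the scales cancels out, and only the 5/9 scale factor remains. A minimal sketch:

```python
# Converting a temperature *difference* between Fahrenheit and Celsius
# uses only the scale factor 5/9 -- the 32-degree offset that applies to
# absolute readings cancels when you subtract two readings.
def f_delta_to_c(delta_f):
    """Convert a Fahrenheit temperature change to a Celsius change."""
    return delta_f * 5.0 / 9.0

def c_delta_to_f(delta_c):
    """Convert a Celsius temperature change to a Fahrenheit change."""
    return delta_c * 9.0 / 5.0

print(round(f_delta_to_c(1.0), 2))  # 0.56 -- a 1 F rise is about a 0.56 C rise
print(round(c_delta_to_f(1.0), 2))  # 1.8  -- a 1 C rise would be a 1.8 F rise
```

In other words, reporting the claimed 1-degree-Fahrenheit rise as "1 degree Celsius" nearly doubles the size of the claim.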
But like all the unequivocal assertions periodically made about “global warming,” this newest one reverts automatically, on the slightest examination, to either banal meaninglessness or probable falsehood. The verbal slip of “Celsius” versus “Fahrenheit” just seems emblematic of the eager, relentless carelessness with which the case is regularly made to the public.
For one thing, the statement “the earth is warming up” is not the same thing as saying “human activity is causing the earth to warm,” nor does it have any meaning at all outside of some climatologically relevant context. (The earth warms up – and then cools down – virtually every day of the year over most of its surface.)
I suspect a good 7 out of 10 skeptics of anthropogenic global warming would stipulate that there does seem to be fairly good evidence of a slight increase in average recorded global temperatures over the last 50 years. But we could say that about multiple 50-year periods in the last several thousand years – and indeed, we need only extend our data set back another 50 years to discover that the last 50 years have not seen “warming” more extensive than the previous 50, when measured by recorded temperature data.
Trumpeting the findings of the 300 scientists as incontrovertible evidence of “global warming” amounts to chopping logic and burying premises for rhetorical effect. Unfortunately for the narrow substance of the claim, there are also good reasons to doubt its particulars. For one thing, as this post outlines, much of the evidence it offers is either cherry-picked or anecdotal. Its principal observations don’t show a linear trend (e.g., temperatures don’t show a linear upward trend), which raises the indispensable question of what temperatures were doing before the last 50 years. The memory of the “hockey stick graph” is fresh enough in our minds that we shouldn’t blindly accept a 50-year cut-off without asking what the longer-term trend looks like.
But the methodology behind the NOAA data used for the new study also remains questionable. This more critical piece provides an outline of concerns, citing (among several) one analysis that came to this conclusion:
NOAA “systematically eliminated 75% of the world’s stations with a clear bias towards removing higher latitude, high altitude and rural locations, all of which had a tendency to be cooler,” explained climate researchers Joseph D’Aleo and Michael Smith in a study published by the Science and Public Policy Institute. “The thermometers in a sense, marched towards the tropics, the sea, and to airport tarmacs.”
Newman also links to this eye-opening post by Anthony Watts – at whose blog it was famously demonstrated that the CRU climatologists were, in fact, using data-norming to “hide the decline” – which shows the instances of literal data interpolation that ought to make us take to the streets with torches and pitchforks. Where there aren’t local sensors to receive data from – e.g., in parts of Africa, Canada, and Greenland – NOAA has been basically making up temperature data as if there were sensors in place. Surprise, surprise: where the data points are interpolated, the trend of “observations” is toward higher temperatures. As Watts drily notes:
There seems to be an inverse correlation between the number of [actual] stations and warming – more stations in a 5×5 degree grid and less warming is observed.
Interpolation is not, per se, a criminal practice in scientific procedures. But it should certainly be clarified and highlighted when advocates propose to make public policy based on the conclusions drawn from it. The fact is that we don’t know what the temperatures were in a number of locations where NOAA has arbitrarily assigned values. To call a data set that includes this form of input “incontrovertible evidence” is to turn empiricism and even sanity itself on their heads.
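The mechanics of the problem are easy to illustrate. The following toy sketch is not NOAA’s actual algorithm – it is a deliberately simplified stand-in – but it shows how a grid cell with no station at all can be assigned a “value” inherited from neighboring cells, along with whatever biases those neighbors carry:

```python
# Toy illustration (NOT NOAA's actual method): fill a missing grid cell
# with the average of its measured orthogonal neighbors. The infilled
# number is not an observation -- it inherits the character of whichever
# real stations happen to sit nearby (coastal, airport, low-latitude...).
def infill(grid):
    """Return a copy of `grid` with None cells replaced by the mean of
    their non-None orthogonal neighbors (left as None if none exist)."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] is None:
                neighbors = [grid[nr][nc]
                             for nr, nc in ((r - 1, c), (r + 1, c),
                                            (r, c - 1), (r, c + 1))
                             if 0 <= nr < rows and 0 <= nc < cols
                             and grid[nr][nc] is not None]
                if neighbors:
                    out[r][c] = sum(neighbors) / len(neighbors)
    return out

# A cell with no station (None), surrounded by warmer stations, acquires
# a warm "observation" out of thin air:
grid = [[14.0, None],
        [15.0, 16.0]]
print(infill(grid))  # the empty cell becomes (14.0 + 16.0) / 2 = 15.0
```

The filled-in 15.0 looks exactly like a measurement in the final data set, which is precisely the objection: the reader of the averaged product cannot tell observation from inference.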
No part of the argument seems to remain intact under scrutiny. The loss of Antarctic ice, for example, is by no means an established long-term trend. The Antarctic seems, in fact, to be adding ice, even as some of its edges, in some areas, recede. But that isn’t the whole story either: apparently, the introduction of space-based magnetic sensors is confusing the picture, and may be exaggerating the amount of icepack recession in these edge areas.
Meanwhile, as pointed out here, the margin of error in the Antarctic temperature data used by NASA – 2-3 degrees Celsius – is too great for a temperature increase of less than that margin to be validated. With a margin of error that big, any depiction of a temperature trend that’s based on the reported observations is an arbitrary product of scientist bias. Visit the reader comments at this post for a good discussion of why icepack recession is not directly correlated with temperature in this part of the world anyway. Where temperatures never rise above freezing, ice loss is not due to the melting we traditionally think of – just as, in the Himalayas, icepack loss correlates much more strongly to solar radiation than to ambient air temperature.
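The margin-of-error point reduces to a one-line comparison. This is a deliberately simplified sketch of the logic (real uncertainty analysis is more involved), using the figures cited above as illustrative inputs:

```python
# Simplified sketch of the uncertainty argument: a claimed change that is
# smaller than the measurement margin of error cannot be distinguished
# from zero by that data alone. Figures below are illustrative.
def distinguishable(claimed_change, margin_of_error):
    """A change is only resolvable if it exceeds the margin of error."""
    return abs(claimed_change) > margin_of_error

# A claimed ~0.6 C rise against the 2-3 C margin of error cited above:
print(distinguishable(0.6, 2.0))  # False -- the trend sits inside the noise
print(distinguishable(0.6, 0.1))  # True  -- the signal now exceeds the error
```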
Whenever you do the smallest amount of research, you discover that the most repetitive claims being touted as AGW/CC “science” are worse than junk science: they are like fairy tales for toddlers. The Bugs Bunny-Roadrunner Hour I used to watch as a kid offered more empiricism and integrity. Given the numerous holes and guesses infesting our current temperature databases, I assess that the best we can truthfully say is that it looks like there’s been a slight rise in the globally-averaged temperature in the past half-century, if we consider solely the numbers offered from those databases – but ultimately, we can’t be absolutely sure.
There are too many places on earth that still haven’t been measured comprehensively over time – and there’s been too little rigor in ensuring identical measuring conditions over time, even where we do have comprehensive observations. Meanwhile, this slight rise in temperature, if it is one, is neither unique nor obviously a unidirectional trend, because we’ve seen temperatures go up and down in a cyclical manner before. Insisting that our average temperatures are being influenced by anthropogenic carbon emissions further ignores the fact that we don’t even have a means of accurately detecting and recording the “greenhouse gas feedback process” on which the whole AGW/CC theory depends. Still less can we identify the precise impact, on this posited process, of the slight amount of terrestrial carbon emissions for which humans are responsible.
We really don’t know enough for certain to start ordering each other around, as with Cap-and-Trade legislation, EPA regulations, and judicial rulings. That’s the bottom line.
Cross-posted at Hot Air.