From C3Headlines: “The IPCC reports global warming to have increased from +0.7°C to +0.8°C over the past century. But a new peer reviewed study determines that real global warming was closer to +0.4°C, with the remaining IPCC amount claimed to be a result of man-made adjustments.”
The errors occurred because of the process of homogenization used by the IPCC to average out individual station temperature data. Anthony Watts explains the process: “In homogenization the data is weighted against the nearby neighbors within a radius. And so a station might start out as a ‘1’ data-wise, might end up getting polluted with the data of nearby stations and end up as a new value, say weighted at ‘2.5’.” See his graphics here.
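To make the idea concrete, here is a toy sketch of the neighbor averaging Watts describes. This is not the actual GHCN/USHCN pairwise algorithm; the 100 km radius and the equal weighting are assumptions chosen purely for illustration, to show how a cooler rural reading gets pulled toward warmer urban neighbors.

```python
# Toy version of the neighbor averaging described above (not the actual GHCN
# pairwise homogenization algorithm): a station's reading is blended with all
# neighbors inside a search radius, so an outlying rural '1' surrounded by
# warmer urban stations drifts toward their values.
import numpy as np

def blend_with_neighbors(station_value, neighbor_values, neighbor_km, radius_km=100.0):
    """Replace a reading with the mean of itself and neighbors within radius_km
    (equal weights; the radius and weighting scheme are assumptions)."""
    values = np.asarray(neighbor_values, dtype=float)
    dists = np.asarray(neighbor_km, dtype=float)
    nearby = values[dists <= radius_km]           # neighbors inside the radius
    return float(np.mean(np.append(nearby, station_value)))

# A rural station reading 1.0 with three warmer neighbors at 20, 45 and 80 km:
print(blend_with_neighbors(1.0, [2.8, 3.0, 2.6], [20, 45, 80]))   # -> 2.35
```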
The paper is: Steirou, E., and D. Koutsoyiannis, Investigation of methods for hydroclimatic data homogenization, European Geosciences Union General Assembly 2012, Geophysical Research Abstracts, Vol. 14, Vienna, 956-1, European Geosciences Union, 2012.
We investigate the methods used for the adjustment of inhomogeneities of temperature time series covering the last 100 years. Based on a systematic study of scientific literature, we classify and evaluate the observed inhomogeneities in historical and modern time series, as well as their adjustment methods. It turns out that these methods are mainly statistical, not well justified by experiments and are rarely supported by metadata. In many of the cases studied the proposed corrections are not even statistically significant.
From the global database GHCN-Monthly Version 2, we examine all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. In the United States of America, because of the large number of available stations, stations were chosen after a suitable sampling. In total we analyzed 181 stations globally. For these stations we calculated the differences between the adjusted and non-adjusted linear 100-year trends. It was found that in two thirds of the cases, the homogenization procedure increased the positive or decreased the negative temperature trends.
One of the most common homogenization methods, ‘SNHT for single shifts’, was applied to synthetic time series with selected statistical characteristics, occasionally with offsets. The method was satisfactory when applied to independent, normally distributed data, but not to data with long-term persistence.
The above results cast some doubt on the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, where these two values are the estimates derived from raw and adjusted data, respectively.
Full presentation here.
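For readers who want to see what the ‘SNHT for single shifts’ test actually computes, here is a minimal sketch of the standard single-break statistic. The AR(1) series standing in for long-term persistence and the chosen parameters are my own illustrative assumptions, not the paper’s experimental setup.

```python
# Sketch of the 'SNHT for single shifts' statistic: standardize the series,
# then for each candidate break point k compute
#     T(k) = k * mean(z[:k])**2 + (n - k) * mean(z[k:])**2
# and flag an inhomogeneity if max_k T(k) exceeds a critical value.
import numpy as np

def snht_single_shift(x):
    x = np.asarray(x, dtype=float)
    n = x.size
    z = (x - x.mean()) / x.std(ddof=1)
    t = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                  for k in range(1, n)])
    return t.max(), int(t.argmax()) + 1       # statistic and candidate break point

rng = np.random.default_rng(0)
n = 100
white = rng.normal(size=n)                    # independent, normally distributed

# Crude stand-in for long-term persistence: a strongly autocorrelated AR(1)
# series (illustrative only; not the persistence model used in the paper).
persistent = np.zeros(n)
for i in range(1, n):
    persistent[i] = 0.9 * persistent[i - 1] + 0.4 * rng.normal()

for name, series in (("white noise", white), ("persistent", persistent)):
    t_max, k = snht_single_shift(series)
    print(f"{name:12s} T_max = {t_max:5.1f} at k = {k}")
# Series with persistence but no real shift tend to yield much larger T_max,
# i.e. more spurious 'breaks' than independent data, matching the abstract.
```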
The authors give an example of station temperature data corruption in the graphic below; compare the raw station data with the “adjusted” data:
Among the paper’s conclusions:
Homogenization practices used to date are mainly statistical, not well justified by experiments, and are rarely supported by metadata. It can be argued that they often lead to false results: natural features of hydroclimatic time series are regarded as errors and adjusted.
While homogenization is expected to increase or decrease the existing multiyear trends in equal proportions, the fact is that in 2/3 of the cases the trends increased after homogenization.
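That 2/3 split is easy to put in perspective with a simple sign test: if homogenization were direction-neutral, the number of stations whose 100-year trend increases after adjustment should behave like 181 tosses of a fair coin. The sketch below uses a generic least-squares trend for the per-station comparison (not necessarily the paper’s exact fitting procedure); the station count and the two-thirds fraction come from the abstract.

```python
# If adjustments were direction-neutral, trend increases vs decreases across
# stations should split roughly 50/50.  Sketch: classify each station by
# comparing least-squares 100-year trends from raw and adjusted data, then
# test the reported outcome (~2/3 of 181 stations increased) against a fair coin.
import numpy as np
from scipy.stats import binomtest, linregress

def century_trend(years, temps):
    """Least-squares trend in degrees C per century (generic fit, assumed)."""
    return linregress(years, temps).slope * 100.0

def trend_increased(years, raw, adjusted):
    """True if homogenization made the station's 100-year trend more positive."""
    return century_trend(years, adjusted) > century_trend(years, raw)

# Toy per-station check with made-up numbers (0.4 C/century raw trend,
# adjustments that add a further 0.2 C/century):
years = np.arange(1900, 2000)
raw = 0.004 * (years - 1900) + np.random.default_rng(1).normal(0, 0.3, years.size)
adjusted = raw + 0.002 * (years - 1900)
print(trend_increased(years, raw, adjusted))        # -> True

# Outcome reported in the abstract: about two thirds of 181 stations increased.
n_stations = 181
n_increased = round(2 * n_stations / 3)             # ~121 stations
test = binomtest(n_increased, n_stations, p=0.5)
print(f"{n_increased}/{n_stations} trends increased; two-sided p = {test.pvalue:.1e}")
```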
I surmise that much of the false warming incorporated in the revised temperature data is due to the urban heat island effect of large population centers. Stations within population centers show higher temperature readings than nearby rural stations, and the artificially high temperatures of cities are homogenized into regional data sets. If the findings of this paper hold up, it means that all government policy to control carbon dioxide emissions is futile, unnecessary, and a great waste of resources. Not only is there no physical evidence that carbon dioxide emissions are the major cause of warming; there is now also good evidence that the amount of warming has been greatly exaggerated.
The old (IPCC) method of station homogenization is discussed by Steve McIntyre at Climate Audit. He notes that the method used by the IPCC “is another homemade statistical method developed by climate scientists introduced without peer review in the statistical literature. As a result, its properties are poorly known.” In studying the USHCN data in 2007 and 2008, McIntyre “observed the apparent tendency of the predecessor homogenization algorithm to spread warming from ‘bad’ stations (in UHI sense) to ‘good’ stations, thereby increasing the overall trend.”
Examples of the urban heat island effect: