New study shows that 50% of warming claimed by IPCC is fake

From C3Headlines: “The IPCC reports global warming to have increased from +0.7°C to +0.8°C over the past century. But a new peer reviewed study determines that real global warming was closer to +0.4°C, with the remaining IPCC amount claimed to be a result of man-made adjustments.”

The errors occurred because of the process of homogenization used by the IPCC to average out individual station temperature data. Anthony Watts explains the process: “In homogenization the data is weighted against the nearby neighbors within a radius. And so a station might start out as a ‘1’ data-wise, might end up getting polluted with the data of nearby stations and end up as a new value, say weighted at ‘2.5’.” See his graphics here.
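
As a rough illustration of the kind of neighbor weighting Watts describes, here is a minimal sketch. It is not the actual GHCN/NOAA homogenization algorithm; the stations, coordinates, search radius, and inverse-distance weights are invented purely for illustration.

```python
import math

# Hypothetical station records (latitude, longitude, annual mean temperature in °C).
# The numbers are invented purely to illustrate neighbor weighting.
stations = {
    "rural_A": {"lat": 33.0, "lon": -111.0, "temp": 18.2},
    "urban_B": {"lat": 33.2, "lon": -111.1, "temp": 19.6},  # warmer, e.g. an urban site
    "rural_C": {"lat": 32.8, "lon": -110.9, "temp": 18.0},
}

def distance_km(a, b):
    """Great-circle distance between two stations (haversine formula)."""
    r = 6371.0
    phi1, phi2 = math.radians(a["lat"]), math.radians(b["lat"])
    dphi = math.radians(b["lat"] - a["lat"])
    dlam = math.radians(b["lon"] - a["lon"])
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

def neighbor_weighted_value(target, radius_km=100.0):
    """Blend a station's reading with inverse-distance-weighted neighbors within a radius."""
    t = stations[target]
    num, den = t["temp"], 1.0            # the station itself gets weight 1
    for name, s in stations.items():
        if name == target:
            continue
        d = distance_km(t, s)
        if d <= radius_km:
            w = 1.0 / max(d, 1.0)        # closer neighbors get more weight
            num += w * s["temp"]
            den += w
    return num / den

# rural_A's value is pulled toward the warmer urban_B reading.
print(round(neighbor_weighted_value("rural_A"), 2))
```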

The paper is: Steirou, E., and D. Koutsoyiannis, Investigation of methods for hydroclimatic data homogenization, European Geosciences Union General Assembly 2012, Geophysical Research Abstracts, Vol. 14, Vienna, 956-1, European Geosciences Union, 2012.

Abstract:

We investigate the methods used for the adjustment of inhomogeneities of temperature time series covering the last 100 years. Based on a systematic study of scientific literature, we classify and evaluate the observed inhomogeneities in historical and modern time series, as well as their adjustment methods. It turns out that these methods are mainly statistical, not well justified by experiments and are rarely supported by metadata. In many of the cases studied the proposed corrections are not even statistically significant.

From the global database GHCN-Monthly Version 2, we examine all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. In the United States of America, because of the large number of available stations, stations were chosen after a suitable sampling. In total we analyzed 181 stations globally. For these stations we calculated the differences between the adjusted and non-adjusted linear 100-year trends. It was found that in two thirds of the cases, the homogenization procedure increased the positive or decreased the negative temperature trends.

One of the most common homogenization methods, ‘SNHT for single shifts’, was applied to synthetic time series with selected statistical characteristics, occasionally with offsets. The method was satisfactory when applied to independent data normally distributed, but not in data with long-term persistence.

The above results cast some doubts in the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, where these two values are the estimates derived from raw and adjusted data, respectively.

Full presentation here.
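
The abstract’s “SNHT for single shifts” refers to Alexandersson’s standard normal homogeneity test, whose single-break statistic has a simple closed form. The sketch below computes that statistic for independent normal noise and for a crude persistence surrogate (an AR(1) series, used here only as a stand-in for true long-term persistence); it illustrates the test in general, not the authors’ implementation.

```python
import numpy as np

def snht_statistic(x):
    """Maximum of the single-shift SNHT statistic T(k) = k*z1^2 + (n-k)*z2^2 over break points k."""
    z = (x - x.mean()) / x.std(ddof=1)
    n = z.size
    t_max = 0.0
    for k in range(1, n):
        z1, z2 = z[:k].mean(), z[k:].mean()
        t_max = max(t_max, k * z1**2 + (n - k) * z2**2)
    return t_max

rng = np.random.default_rng(1)
n = 100

# Independent, normally distributed series: the case where the test is well behaved.
white = rng.normal(0.0, 1.0, n)

# Crude persistence surrogate: an AR(1) process with strong autocorrelation
# (not true long-term persistence, just a stand-in for illustration).
persistent = np.zeros(n)
for i in range(1, n):
    persistent[i] = 0.9 * persistent[i - 1] + rng.normal(0.0, 1.0)

print("SNHT max, independent noise:", round(snht_statistic(white), 1))
print("SNHT max, persistent series:", round(snht_statistic(persistent), 1))
# Persistent series tend to yield inflated statistics, i.e. spurious detected 'shifts'.
```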

The authors give an example of station temperature data corruption, shown in the graphic below, comparing the raw station data with the “adjusted” data:

[Figure: example station, raw vs. homogenized temperature data]

Among the paper’s conclusions:

Homogenization practices used until today are mainly statistical, not well justified by experiments and are rarely supported by metadata. It can be argued that they often lead to false results: natural features of hydroclimatic time series are regarded as errors and are adjusted.

While homogenization is expected to increase or decrease the existing multiyear trends in equal proportions, the fact is that in 2/3 of the cases the trends increased after homogenization.
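
The comparison behind that two-thirds figure, 100-year linear trends computed from raw and from adjusted data and then differenced, can be sketched roughly as follows. This is a minimal illustration rather than the authors’ code: the synthetic station series and the least-squares trend fit are assumptions standing in for real GHCN records and for whatever estimator the authors used.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1912, 2012)                   # a 100-year window

# Synthetic stand-ins for one station's raw and homogenized annual anomalies (°C);
# the built-in trends are arbitrary values chosen only for illustration.
raw = 0.004 * (years - years[0]) + rng.normal(0.0, 0.3, years.size)
adjusted = raw + 0.003 * (years - years[0])     # an adjustment that steepens the trend

def century_trend(series):
    """Least-squares linear trend over the record, in °C per 100 years."""
    return np.polyfit(years, series, 1)[0] * 100.0

t_raw, t_adj = century_trend(raw), century_trend(adjusted)
print(f"raw trend:      {t_raw:+.2f} °C/century")
print(f"adjusted trend: {t_adj:+.2f} °C/century")
print(f"difference:     {t_adj - t_raw:+.2f} °C/century")
```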

I surmise that much of the false warming incorporated in the revised temperature data is due to the urban heat island effect of large population centers. Stations within population centers show higher temperature readings than nearby rural stations, and the artificially high temperatures of cities are homogenized into regional data sets. If the findings of this paper hold up, it means that all government policy to control carbon dioxide emissions is futile, unnecessary, and a great waste of resources. Not only is there no physical evidence that carbon dioxide emissions are the major cause of warming; there is now also good evidence that the amount of warming has been greatly exaggerated.

The old (IPCC) method of station homogenization is discussed by Steve McIntyre at Climate Audit. He notes that the method used by the IPCC “is another homemade statistical method developed by climate scientists introduced without peer review in the statistical literature. As a result, its properties are poorly known.” In studying the USHCN data in 2007 and 2008, McIntyre “observed the apparent tendency of the predecessor homogenization algorithm to spread warming from ‘bad’ stations (in UHI sense) to ‘good’ stations, thereby increasing the overall trend.”
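
McIntyre’s point, that adjustment can spread warming from UHI-contaminated stations to “good” stations, can be shown with a toy blend of two synthetic series. The 50/50 blend below is purely an assumption standing in for the real pairwise adjustment algorithm; the station data are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1962, 2012)

# Synthetic "good" rural station: no trend, just noise.
good = rng.normal(0.0, 0.2, years.size)
# Synthetic "bad" urban neighbor: same noise plus a spurious UHI-driven warming trend.
bad = rng.normal(0.0, 0.2, years.size) + 0.03 * (years - years[0])

# Toy "homogenization": blend the good station half-and-half with its urban neighbor.
blended_good = 0.5 * good + 0.5 * bad

def trend_per_decade(y):
    """Least-squares linear trend, in °C per decade."""
    return np.polyfit(years, y, 1)[0] * 10.0

print(f"good station trend:       {trend_per_decade(good):+.3f} °C/decade")
print(f"after blending with UHI:  {trend_per_decade(blended_good):+.3f} °C/decade")
```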

See also:

The Case Against the IPCC and Proponents of Dangerous Anthropological Global Warming

The Assumed Authority

IPCC Admits Its Past Reports Were Junk

Examples of the urban heat island effect:

Warmer nights no proof of global warming


2 comments

  1. Anyone reading this article should note that what Koutsoyiannis et al. actually say about the surface temp record in the paper is this: “The above results cast some doubts in the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is smaller than 0.7-0.8°C.”

    It should also be noted that the authors are not experts in data homogenization, nor is this a “paper.” It is the conference presentation of a graduate thesis. Bizarrely, this thesis was submitted to the “Climate, Hydrology and Water Infrastructure” session, rather than the session on homogenization.

    Also, the lower-troposphere temperature trend (from UAH) for the last 33 years is +0.46°C, and the troposphere is slightly cooler than the surface.

    In the interest of critical thinking, any self-labeled “skeptic” should read the other side of the story:
    http://variable-variability.blogspot.de/2012/07/investigation-of-methods-for.html

  2. To: David S. Leaton 18Jul12

    First, David, for the record, you and other readers should review Koutsoyiannis’ credentials, as set forth on Google Scholar: http://scholar.google.com/citations?user=OPA_BScAAAAJ. He heads the Department of Environmental Engineering at the National Technical University of Athens. His highly cited work includes 600+ publications, including 90+ “peer-reviewed” journal papers involving rather heavy statistical work. For years he has been critical of sloppy IPCC research, which might be why he is unpopular in some IPCC circles. Second, it is worth noting that traditional “peer review” is antiquated. (See: http://www.forbes.com/sites/patrickmichaels/2011/06/16/peer-review-and-pal-review-in-climate-science/). Many if not most serious scientists jokingly refer to the current “peer-review” process as “pal review”. It is fast falling into disrepute as internet-driven professional “crowd review” comes into vogue. The newly evolving standard is double-blind review by anonymous professionals with no affiliation with the anonymous author. Ideally it starts in the draft stage so that bad or inadequate science gets filtered out up front.
    In contrast, the current ‘pal-review’ process typically involves a journal editor picking two peers from a field that includes co-workers who have had their own papers reviewed by the author. Sadly, the current review process is most often a simple reading without any challenge to the work at all.
    Third, this whole article is about an issue that will not exist in the future. The best way to track “global temperature” is with satellite instrumentation, NOT with readings from land-based recording stations plagued by inconsistent, poorly sited, and evolving equipment, untrained technical help, and inadequate record-keeping. Note that there has been no global warming since 1997, even though CO2 has increased significantly. See: http://en.wikipedia.org/wiki/Satellite_temperature_measurements.
