Main temperature database used by IPCC found to contain multiple errors

An audit of the HadCRUT4 dataset, the primary global temperature database used by the Intergovernmental Panel on Climate Change (IPCC), has found multiple errors.

HadCRUT4, managed by the Climatic Research Unit (CRU) at the University of East Anglia, is also the dataset at the center of the 2009 “ClimateGate” affair.

The paper, An Audit of the Creation and Content of the HadCRUT4 Temperature Dataset by Dr John McLean, was first published as a PhD thesis and is now available as a book. Get the book for $8 here. Read the original thesis here (free download).

The audit found more than 70 areas of concern about data quality and accuracy.

Australian researcher John McLean says that HadCRUT4 is far too sloppy to be taken seriously even by climate scientists, let alone a body as influential as the IPCC or by the governments of the world.

Main points:

The Hadley data is one of the most cited, most important databases for climate modeling, and thus for policies involving billions of dollars.

McLean found freakishly improbable data, and systematic adjustment errors, large gaps where there is no data, location errors, Fahrenheit temperatures reported as Celsius, and spelling errors.

[The improper conversion of Fahrenheit temperatures to Celsius is serious. A reading of 40 Fahrenheit is cool, but 40 Celsius is equivalent to 104 Fahrenheit. This erroneous conversion is real “man-made global warming.”]
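The arithmetic behind that point is easy to verify; a minimal sketch (the function names are my own):

```python
def f_to_c(f):
    """Convert a Fahrenheit reading to Celsius."""
    return (f - 32) * 5 / 9

def c_to_f(c):
    """Convert a Celsius reading to Fahrenheit."""
    return c * 9 / 5 + 32

# A reading of 40 on the Fahrenheit scale is cool (about 4.4 C),
# but the same number mislabeled as Celsius means a scorching 104 F.
print(round(f_to_c(40), 1))  # 4.4
print(c_to_f(40))            # 104.0
```

So a single mislabeled unit turns a chilly day into an extreme heat record, which is why unit errors of this kind matter for a climate dataset.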

Almost no quality control checks have been done: outliers that are obvious mistakes have not been corrected. For instance, one town in Colombia spent three months in 1978 at an average daily temperature of over 80 degrees C (176 F). One town in Romania stepped out of the summer of 1953 straight into a month of spring at minus 46°C. These are supposedly “average” temperatures for a full month at a time. St Kitts, a Caribbean island, was recorded at 0°C for a whole month, and not just once but twice!

Temperatures for the entire Southern Hemisphere in 1850 and for the next three years are calculated from just one site in Indonesia and some random ships.

Sea surface temperatures cover 70% of the Earth’s surface, but some measurements come from ships logged at locations 100 km inland. Others were taken in harbors, which are hardly representative of the open ocean.

When a thermometer is relocated to a new site, the adjustment assumes that the old site was always built up and “heated” by concrete and buildings. In reality, the artificial warming probably crept in slowly. By correcting for buildings that likely didn’t exist in 1880, old records are artificially cooled. Adjustments for a few site changes can create a whole century of artificial warming trends.
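The effect described above can be illustrated with synthetic numbers (invented purely for illustration, not taken from the audit): if urban contamination grew gradually, but the adjustment subtracts the full step from every pre-move record, the earliest records are over-cooled and a spurious warming trend appears in a station with no real trend at all.

```python
# Illustrative sketch with made-up numbers: a station whose true
# temperature is flat, but whose site slowly accumulates 0.5 C of
# urban warming over the century before a relocation.
years = list(range(1880, 1981))
true_temp = [15.0] * len(years)                             # no real trend
contamination = [0.5 * i / 100 for i in range(len(years))]  # creeps in slowly
raw = [t + c for t, c in zip(true_temp, contamination)]

# The adjustment assumes the full 0.5 C was present from the start,
# so it subtracts 0.5 from every pre-move value, even the 1880 ones.
adjusted = [r - 0.5 for r in raw]

def trend_per_century(xs, ys):
    """Ordinary least-squares slope, scaled to degrees C per 100 years."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
    return slope * 100

print(round(trend_per_century(years, true_temp), 2))  # 0.0 -- reality
print(round(trend_per_century(years, adjusted), 2))   # 0.5 -- artificial warming
```

Subtracting a constant does not change a slope, so the adjusted series inherits the contamination’s gradual rise as an apparent 0.5°C-per-century warming trend, while the old records end up cooler than anything that was ever measured.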

Details of the worst outliers:

For April, June and July of 1978 Apto Uto, Colombia had an average monthly temperature of 81.5°C, 83.4°C and 83.4°C respectively. (178 to 182 Fahrenheit)

The monthly mean temperature in September 1953 at Paltinis, Romania is reported as -46.4 °C (in other years the September average was about 11.5°C).

At Golden Rock Airport, on the island of St Kitts in the Caribbean, mean monthly temperatures for December in 1981 and 1984 are reported as 0.0°C. But from 1971 to 1990 the average in all the other years was 26.0°C.
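A basic quality-control pass that would have caught outliers like these is not complicated; a sketch (the station series here is invented to resemble the Paltinis case, not real HadCRUT4 data):

```python
def flag_outliers(monthly_means, threshold=2.0):
    """Flag values more than `threshold` standard deviations from the
    station's own mean -- a crude but effective sanity check."""
    n = len(monthly_means)
    mean = sum(monthly_means) / n
    sd = (sum((x - mean) ** 2 for x in monthly_means) / n) ** 0.5
    return [x for x in monthly_means if sd and abs(x - mean) / sd > threshold]

# Invented September series: about 11.5 C in normal years,
# plus one physically impossible -46.4 C monthly mean.
septembers = [11.2, 11.8, 11.4, 11.6, -46.4, 11.5, 11.3, 11.7]
print(flag_outliers(septembers))  # [-46.4]
```

Even this naive check flags the impossible value instantly; checking each reading against the station’s own climatology, or against hard physical limits, would catch the Colombian and St Kitts cases just as easily.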

Bad data and bad modeling assumptions make IPCC temperature simulations diverge widely from reality. That’s why we should not believe the IPCC when they cry “wolf” and say it’s the end of the world unless we give them billions of dollars and get rid of fossil fuels.

The primary conclusion of the audit (as noted by Anthony Watts) is that the dataset shows exaggerated warming and that global averages are far less certain than have been claimed.

One implication of the audit is that climate models have been tuned to match incorrect data, which would render incorrect their predictions of future temperatures and estimates of the human influence on temperatures.

Another implication is that the proposal that the Paris Climate Agreement adopt 1850-1899 averages as “indicative” of pre-industrial temperatures is fatally flawed. During that period global coverage is low – it averages 30% across that time – and many land-based temperatures are very likely to be excessively adjusted and therefore incorrect.


Why is it that a PhD student working from home can find mistakes that the British Met Office, a £226 million institute with 2,100 employees, could not? Significantly, the Met Office, in a statement, said they do not disagree with any of his claims.

Maybe, as President Dwight D. Eisenhower said in his farewell address:

Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.

The prospect of domination of the nation’s scholars by Federal employment, project allocations, and the power of money is ever present – and is gravely to be regarded.

Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite.

See also:

Evidence that CO2 emissions do not intensify the greenhouse effect

The fake two degree political limit on global warming

Climate change in perspective – a tutorial for policy makers
