Winter Weather Predictions from NOAA and Farmers Almanac 2018-2019

“It’s tough to make predictions, especially about the future.” -attributed to Yogi Berra.

A tale of two predictions: Last fall NOAA (National Oceanic and Atmospheric Administration) and the Farmers’ Almanac issued weather predictions for the winter of 2018-2019. Let’s see how they did.

NOAA, using state-of-the-art computer models, predicted that the U.S. would have “another mild winter.” Here is NOAA’s map of its prediction for the winter of 2018-2019. It shows warmer-than-normal temperatures for most of the country.

NOAA did somewhat hedge their bet: “Both the temperature and precipitation outlooks depend to a certain extent on typical El Niño impacts, but forecasters think a weak El Niño event is most likely. This means that despite the potential for El Niño, confidence in this outlook is less than we had during recent strong events like in the winter of 2015/16.” In fact, El Niño fizzled this year. NOAA got the precipitation forecast wrong also. (Read the NOAA report)

The Farmers’ Almanac won’t say exactly how it makes its predictions and issues only this statement:

The editors of the Farmers’ Almanac firmly deny using any type of computer satellite tracking equipment, weather lore or groundhogs. What they will admit to is using a specific and reliable set of rules that were developed back in 1818 by David Young, the Almanac’s first editor. These rules have been altered slightly and turned into a formula that is both mathematical and astronomical. The formula takes things like sunspot activity, tidal action of the Moon, the position of the planets, and a variety of other factors into consideration.

Farmers’ Almanac predicted “teeth-chattering cold ahead” for the winter of 2018-2019.

So, whose modeling is better, NOAA or Farmers’ Almanac?

Remember, NOAA is the agency that issues the Climate Assessment reports upon which our climate policy decisions are supposed to be made. Recent NOAA reports rely more and more upon the output of the most extreme climate models, the results of which diverge widely from actual measurements.

Here are my comments on their recent reports:

National Climate Assessment = science fiction and politics

Fourth National Climate Assessment is junk science

Fourth National Climate Assessment, Part 2 – no science, just scaremongering


It looks like Yogi Berra was right.

NOAA caught manipulating temperature data – again

Dr John Bates, a recently retired senior scientist at the National Oceanic and Atmospheric Administration (NOAA), alleges that a NOAA paper written before the historic climate conference in Paris in 2015 breached NOAA’s own rules and was based on misleading and unverified data. That, to many, looks like the paper was designed to stoke up hysteria over global warming in the run-up to the conference. (Source)

NOAA has often been accused of manipulating data for political purposes. See, for instance, my ADI article The past is getting cooler, which documents a curiosity of published government temperature records: the 1930s get cooler and cooler with each update of the record. The more recent scandal derives from NOAA’s attempt to erase the 18-year “pause” in global warming. Even though atmospheric carbon dioxide has been rising, global temperature has failed to respond as the climate models say it should. (See El Nino to El Nino – no warming of global temperature.) This recent scandal was exposed by David Rose in an article in the British paper Daily Mail.

Global temperatures published by NOAA compared to global temperatures published by the British MET office shows that NOAA temperatures are consistently higher. In the graph below (source), the red line shows the current NOAA world temperature graph, which relies on the ‘adjusted’ and unreliable sea temperature data cited in the flawed ‘Pausebuster’ paper. The blue line is the UK Met Office’s independently tested and verified ‘HadCRUT4’ record, showing lower monthly readings and a shallower recent warming trend.


David Rose notes: NOAA’s 2015 ‘Pausebuster’ paper was based on two new temperature sets of data – one containing measurements of temperatures at the planet’s surface on land, the other at the surface of the seas. Both datasets were flawed. This newspaper has learnt that NOAA has now decided that the sea dataset will have to be replaced and substantially revised just 18 months after it was issued, because it used unreliable methods which overstated the speed of warming. The revised data will show both lower temperatures and a slower rate in the recent warming trend. The land temperature dataset used by the study was afflicted by devastating bugs in its software that rendered its findings ‘unstable’.

To add to the confusion, NOAA also changed the computer programs it uses to compile temperature data, and guess what? The new program creates global warming where there had been none before. These changes are documented in a post by Rud Istvan.

“A 2011 paper announced that NOAA would be transitioning to updated and improved CONUS software around the end of 2013. The program used until the upgrade was called Drd964x. The upgrade was launched from late 2013 into 2014 in two tranches. Late in 2013 came the new graphical interfaces, which are an improvement. Then about February 2014 came the new data output, which includes revised station selection, homogenization, and gridding. The new version is called nClimDiv.” The graphs below show some of the results for temperatures from 1900 to 2010. Left shows old system results versus new system results on right.




Another way NOAA influences the official temperature is by the removal of thousands of land-based weather-station thermometers from remote, high-altitude, and/or non-urban locations since the 1970s. These are stations which do not show the warming trends predicted by models, as they are not affected by proximity to artificial or non-climatic heat sources (pavements, buildings, machinery, industry, etc.) like urban weather stations are. (Thermometers near urban heat sources can cause warming biases of between 0.1 and 0.4°C per decade.) This inflates the average temperature reported. Read more

Perhaps the Trump administration can get NOAA out of politics and back to science.

New study suggests that U.S. temperature trends published by NOAA are 50% too high

A new study presented by meteorologist Anthony Watts (proprietor of the Watts Up With That blog) at the 2015 Fall Meeting of the American Geophysical Union “suggests that the 30-year trend of temperatures for the Continental United States (CONUS) since 1979 are about two thirds as strong as officially [published] NOAA temperature trends.” That means official temperatures publicized by the National Oceanic and Atmospheric Administration (NOAA) are 50% too high. (See full post)
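The arithmetic connecting “two thirds as strong” to “50% too high” is simple ratio algebra; a quick sketch using the 2/3 figure quoted above:

```python
# If the measured trend is two thirds of the official NOAA trend,
# then official / measured = 3/2, i.e. the official trend is 50%
# higher than the measured one.
measured_fraction = 2 / 3
overstatement = 1 / measured_fraction - 1
print(f"official trend overstated by {overstatement:.0%}")  # 50%
```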

Using NOAA’s U.S. Historical Climatology Network, which comprises 1218 weather stations in the CONUS, the researchers were able to identify a 410-station subset of “unperturbed” stations that have not been moved or had equipment or time-of-observation changes, and thus require no “adjustments” to their temperature record to account for these problems. The study focuses on finding trend differences between well sited and poorly sited weather stations, based on a WMO approved metric… for classification and assessment of the quality of the measurements based on proximity to artificial heat sources and heat sinks which affect temperature measurement.

This new study is a follow-up to a study published in 2010 entitled “Analysis of the impacts of station exposure on the U.S. Historical Climatology Network temperatures and temperature trends” [link to paper]. That paper concluded:

Temperature trend estimates vary according to site classification, with poor siting leading to an overestimate of minimum temperature trends and an underestimate of maximum temperature trends, resulting in particular in a substantial difference in estimates of the diurnal temperature range trends.

The map below, from the new study, compares mean temperature trends from compliant class 1&2 weather stations (top left) with non-compliant class 3, 4, and 5 stations (bottom left) and with the map NOAA published (right). You will notice that on the non-compliant and NOAA maps, the Southwest shows strongly increasing temperature trends. Maybe that is why certain UofA professors keep claiming that the Southwest is “ground zero” for climate change.

USHCN comparison Watts

Results from the compliant stations are similar to results from NOAA’s other network of modern stations, the U.S. Climate Reference Network which has been in operation for about 11 years. NOAA doesn’t mention this network and does not distinguish its separate results in the compilation shown above. See my ADI story: New NOAA data show cooling trend for last 10 years. When a certain year is proclaimed the hottest ever, remember it depends on who is doing the measurements.

For a local example of results from a non-compliant station versus a compliant station, compare the temperature trends of the Tucson USHCN station (now non-compliant) with that of Tombstone. Both graphs show the temperature trends from 1890 through 2010. Tucson’s increasing temperature is due to the Urban Heat Island effect, i.e., our concrete and asphalt absorb heat during the day and make both days and nights warmer.

Tucson-Tombstone temp

It appears to me that NOAA scientists (and those in other federal agencies) are purposely manipulating the data to conform to a political agenda. Such is the sad state of “climate science.”

We have heard and probably will hear again that 2015 is the hottest year ever, but real data such as that compiled by Tony Heller (see here) shows that “2015 Was One Of The Coolest Years On Record In The US.”


The Bankruptcy of Climate Science

The past is getting cooler

NOAA Scientists Leave out Inconvenient Data from Congressional Testimony

NOAA can’t find link between global warming and extreme weather

New NOAA data show cooling trend for last 10 years

Newer NOAA data show an 18-year cooling trend in US

NOAA accused of fabricating temperature data

NOAA temperature record “adjustments” could account for almost all “warming” since 1973

NOAA experiment shows US temperatures not as warm as reported

Evidence that CO2 emissions do not intensify the greenhouse effect

The “lost” pause of global warming

Since the AGW (anthropogenic global warming) hypothesis doesn’t fit the data, the National Climate Data Center (NCDC) has changed the data to fit the hypothesis.

After about 60 journal articles failed to explain the lack of warming over the past 18 years, NCDC now claims there was no pause in global warming; it was all a bookkeeping error.

Observational data from two satellite systems and balloon-borne radiosondes show no net global warming for at least the past 18 years even though atmospheric carbon dioxide content has been rising. The lack of warming is very inconvenient for government policy. Something had to be done.

18 year temp

Ignoring physical evidence and all the journal articles about the pause, the National Climate Data Center just published a paper (Karl et al “Possible artifacts of data biases in the recent global surface warming hiatus”) in Science Magazine which claims that the widely reported and accepted temperature hiatus is an illusion – just an artifact of data analysis – and that the global climate never really stopped warming.

This propaganda is probably designed to bolster the next round of UN-IPCC meetings in Paris in December, where the IPCC will try to convince countries to spend billions of dollars fighting climate change and reducing carbon dioxide emissions.

The claim is ironic because NCDC, a division of the National Oceanic and Atmospheric Administration (NOAA), is deep into data manipulation itself. A new post from the NoTricksZone shows that “Comprehensive Analysis Reveals NOAA Wrongfully Applying ‘Master Algorithm’ To Whitewash Temperature History.” The author of that post says “I caught NOAA purposefully using computer code (algorithms) to lower historic temperatures to promote present day temperatures as the warmest on record.”

That’s not the first time. In my article “The past is getting cooler” I demonstrate that published government temperature records show the 1930s getting cooler and cooler with each update of the record. This phenomenon is due to government data manipulation designed to make the present look warmer in relation to the past.

Dr. S. Fred Singer has an American Thinker article on this latest gambit by NCDC. Singer notes that “There are at least two rival data centers that may dispute the NCDC analysis: the Hadley Centre in England and the NASA-Goddard Institute for Space Studies (GISS). In fact, Hadley’s partner, the Climate Research Unit at the University of East Anglia, was the first to announce, on the BBC, the existence of a pause in global warming. Then there are also dozens of scientists who have published research papers, purporting to provide an explanation for the reported pause.”

NCDC is basing its claim on the surface temperature record which, itself, has many problems. Singer goes on to write, “Not only that, but a look at the detailed NCDC evidence shows that much depends on polar temperatures — which are mostly guessed at, for lack of good observations. If one uses the (truly global) satellite data, analyzed either by UAH or by RSS, the pause is still there, starting around 2003.” And, “the same satellite data show no warming trend from 1979 to 2000 – ignoring, of course, the exceptional super-El-Nino year of 1998.”

A long post by Bob Tisdale and Anthony Watts demonstrates that all the claimed warming is due to NCDC manipulation of the data. In the same post, Dr. Judith Curry notes that NOAA uses considerable gap filling of temperatures in the Arctic, which “introduces substantial error into their analysis.”

A separate article by Patrick J. Michaels, Richard S. Lindzen, and Paul C. Knappenberger notes that NOAA inappropriately adjusted ARGO buoy temperature data upwards:

“…the authors’ treatment of buoy sea-surface temperature (SST) data was guaranteed to create a warming trend. The data were adjusted upward by 0.12°C to make them “homogeneous” with the longer-running temperature records taken from engine intake channels in marine vessels. As has been acknowledged by numerous scientists, the engine intake data are clearly contaminated by heat conduction from the structure, and as such, never intended for scientific use. On the other hand, environmental monitoring is the specific purpose of the buoys. Adjusting good data upward to match bad data seems questionable, and the fact that the buoy network becomes increasingly dense in the last two decades means that this adjustment must put a warming trend in the data.”
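The quoted argument, that adding +0.12°C to an increasingly dense buoy network must tilt the blended record upward even if the ocean itself is not warming, can be illustrated with toy numbers (these values are hypothetical, not the actual Karl et al. data):

```python
# Toy blend of ship and buoy sea-surface temperatures. The buoy
# share of observations grows over time, and each buoy reading is
# adjusted upward by 0.12 C to match the ship engine-intake data.
# Even with a perfectly FLAT true temperature, the blended record
# acquires a warming trend. Hypothetical numbers for illustration.
ADJUSTMENT = 0.12
true_sst = 15.0                           # constant "true" SST, deg C
buoy_share = [0.1, 0.3, 0.5, 0.7, 0.9]    # buoy fraction per epoch

blended = [true_sst + f * ADJUSTMENT for f in buoy_share]
print([round(t, 3) for t in blended])
# [15.012, 15.036, 15.06, 15.084, 15.108] -- spurious warming
```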

In a long technical post, Bob Tisdale shows that the British Night Marine Air Temperature dataset, which is used by NOAA, does not support NOAA’s claims of no slowdown in global surface warming.

Physicist Lubos Motl opines that “A whole discipline of pseudoscience – one pretending to be science, like most pseudosciences – has been created. It is the ‘climate change science’ whose preachers – pretending to be scientists – shout that the sky is falling. The ‘hiatus’ is an inconvenient truth for these ‘researchers’ so as of mid May 2015, they have proposed 63 explanations of the hiatus.” Motl then goes on to discuss how NCDC manipulated sea surface temperature data and provides a more general discussion of measuring and creating uncertainty in datasets.

Doug L. Hoffman at Resilient Earth also has a long post on problems with the NCDC paper. He concludes his post with this:

“According to Karl et al the pause was not, is not, real. It is only an artifact of decades of crappy temperature data, the same data that has fed the grossly inaccurate climate models that are at the heart of the global warming scam. And that’s the real bottom line—for this paper to be correct all the historical data, all the work of climate scientists around the world over the past 40-50 years, has been in error. If this paper is correct climate science has lost all credibility.”

We see that government “climate science” is nothing more than political science applied to support policy rather than an honest assessment of conditions. Essentially, what Karl et al. have done is revise the data to match a particular hypothesis. That’s your tax dollars at work. Are we paying for climate whores?

See also:

Evidence that CO2 emissions do not intensify the greenhouse effect

Failure of climate models shows that carbon dioxide does not drive global temperature

Climate change in perspective

Mystery of the missing heat

2014 the warmest? NASA now says “never mind”

A joint press release by NOAA and NASA on January 16 claimed that “The year 2014 ranks as Earth’s warmest since 1880…” The mainstream press ran with the story as if it portended doom. Now, we find that the NOAA/NASA press release left out several vital pieces of information.

The press release failed to mention that the calculated global temperature for 2014 was only 0.02 deg C (two-hundredths of a degree) higher than the formerly hottest year, 2010; nothing to get excited about. The press release also failed to mention that the error range in the measurements was at least ±0.1 deg C, an order of magnitude greater than the difference touted. And it failed to mention that NOAA/NASA did not include data from satellites, which show that 2014 was nowhere near the warmest. The touted temperature high was based only on surface measurements, which are subject to a great many biases and do not cover the whole globe as satellites do.
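The significance argument is simple arithmetic; a minimal check using the two figures cited above:

```python
# Is a 0.02 C year-to-year margin meaningful when the stated
# measurement uncertainty is at least +/-0.1 C?
margin = 0.02       # 2014 minus 2010, deg C
uncertainty = 0.10  # stated error range, deg C

# The claimed record margin is only a fifth of the uncertainty,
# so the two years are statistically indistinguishable.
print(f"margin is {margin / uncertainty:.0%} of the uncertainty")  # 20%
```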

Former Harvard physicist Luboš Motl says they did it on purpose and the press release provides “A direct proof that the professional alarmists are intentionally lying.” (Source)

Reporter David Rose of the Daily Mail in England interviewed Gavin Schmidt, head of NASA’s Goddard Institute for Space Studies (GISS). During that interview, Schmidt conceded that because of the uncertainties in temperature data, NASA is only 38% confident that 2014 was the warmest since 1880. The press release failed to mention that fact as well. (Source)

Marc Morano, publisher of Climate Depot wrote, “The Feds are conning the public on 2014 being the ‘hottest year.’ We now know that both NASA and NOAA knew their ‘hottest year’ claims would not hold up to scientific scrutiny. But both agencies chose instead to loudly push the global warming narrative to a willing and compliant news media. The ‘hottest year’ claims had already been exposed as statistically meaningless…”

Dr. Roy Spencer of UAH (one of the two keepers of NASA satellite data) comments:

“In the three decades I’ve been in the climate research business, it’s been clear that politics have been driving the global warming movement.

“We still don’t understand what causes natural climate change to occur, so we simply assume it doesn’t exist. This despite abundant evidence that it was just as warm 1,000 and 2,000 years ago as it is today. Forty years ago, “climate change” necessarily implied natural causation; now it only implies human causation.

“What changed? Not the science…our estimates of climate sensitivity are about the same as they were 40 years ago.

“What changed is the politics. And not just among the politicians. At AMS [American Meteorological Society] or AGU [American Geophysical Union] scientific conferences, political correctness and advocacy are now just as pervasive as they have become in journalism school. Many (mostly older) scientists no longer participate and many have even resigned in protest.

“Science as a methodology for getting closer to the truth has been all but abandoned. It is now just one more tool to achieve political ends.

“In what universe does a temperature change that is too small for anyone to feel over a 50 year period become globally significant? Where we don’t know if the global average temperature is 58 or 59 or 60 deg. F, but we are sure that if it increases by 1 or 2 deg. F, that would be a catastrophe?”




NOAA data show an 18-year cooling trend in US

NOAA’s National Climatic Data Center maintains a website called “Climate at a glance” which allows you to construct graphs of temperature data using various criteria. The folks at C3Headlines tried it and came up with some interesting results. I went to the site and was able to reproduce the results shown at C3Headlines. Here are the results:

US temp cooling

The starting year of 1997 was picked because the winter of 1997/1998 featured a super El Niño that warmed the US. The so-called “pause” in global warming is aptly demonstrated using NOAA data. 2014 was definitely not the warmest year (see my article: 2014 was the third or sixth or 34th or 8000th warmest year).

According to this plot of NOAA data, the US has experienced a cooling trend of -0.19 degrees F per decade for the period 1997 to 2014. This occurred in spite of an increasing concentration of carbon dioxide in the atmosphere. The cooling trend for the 21st Century alone is -0.27 degrees F per decade according to NOAA data. By the way, NOAA says that for the US, 2014 was only the 34th warmest year (source).
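A trend figure like -0.19 degrees F per decade is just the slope of an ordinary least-squares line through the annual values, multiplied by ten. A minimal sketch, using a synthetic straight-line series (not the actual NOAA data) to verify the arithmetic:

```python
# Ordinary least-squares slope, reported in deg F per decade.
# The series is SYNTHETIC: a straight line built with the slope
# quoted in the article, used only to check the formula.
years = list(range(1997, 2015))                   # 1997..2014
anoms = [1.0 - 0.019 * (y - 1997) for y in years]

n = len(years)
mx, my = sum(years) / n, sum(anoms) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(years, anoms))
         / sum((x - mx) ** 2 for x in years))
print(f"trend: {slope * 10:+.3f} deg F per decade")  # trend: -0.190 deg F per decade
```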

None of the sophisticated climate models approved by “97% of climate scientists” predicted this result. The models are based on the assumption that carbon dioxide has a significant influence on global temperature. Observations show that this assumption is wrong, yet our political policy on climate is also based on that assumption. It looks like NOAA will have to “adjust” its data record to reflect the political climate.

And speaking of the political climate, there is a great editorial by H. Sterling Burnett titled “How Many Of World’s Poor Will Climate Alarmists Let Die?” It begins:

“‘How many people do you want to kill or let die?’ That’s how I’m going to respond from now on to anyone who argues we should end or sharply restrict fossil fuel use to prevent global warming.

Arguing the science has no effect on global warming alarmists. They are immune to facts and stick to models and fallacious arguments from biased, unscientific authorities.”

Read full article at Investor’s Business Daily.

And, surprise, surprise, even the Arizona Daily Star has a good editorial piece, “Carbon dioxide, despite word games, is profoundly Earth-friendly,” which lists the facts about carbon dioxide and climate. Judging by the comments to that post, it really upset members of the Carbon Cult.

See also:

Evidence that CO2 emissions do not intensify the greenhouse effect

Climate change in perspective

Ocean Acidification by Carbon Dioxide

Ozone theory has holes

Sea Level Rising?

New NOAA data show cooling trend for last 10 years

2014 was the third or sixth or 34th or 8000th warmest year

The propaganda press has been speculating on whether or not 2014 would be the warmest year ever. 2014 was certainly the warmest year in the last two years. The data are now in so we no longer have to speculate.

Some temperature data are based on surface stations, but those data are subject to many confounding problems that tend toward a warming bias, e.g., the Urban Heat Island effect. See my Wryheat post, The State of our Surface Temperature Records, which lists some problems with our temperature records.

There are also two satellite systems which measure a calculated temperature for the lower troposphere. These are regarded as a more accurate measure of global temperature.

The Remote Sensing Systems (RSS) satellite record put 2014 as tied for sixth warmest since 1979. See graphs and commentary by Paul Homewood here. He writes, “RSS satellite data is now published for December, and confirms that global atmospheric temperatures for 2014 are nowhere near the record being touted by NOAA and NASA.”

The other satellite system is run by the University of Alabama in Huntsville (UAH). They put 2014 as “the third warmest year in the 36-year global satellite temperature record, but by such a small margin (0.01 C) as to be statistically similar to other recent years, according to Dr. John Christy, a professor of atmospheric science and director of the Earth System Science Center at The University of Alabama in Huntsville. 2014 was warm, but not special. The 0.01 C difference between 2014 and 2005, or the 0.02 difference with 2013 are not statistically different from zero. That might not be a very satisfying conclusion, but it is at least accurate.”

Dr. Roy Spencer, of UAH, explains why the satellite systems may come up with different numbers here. The basic reason is that “there is not a single satellite which has been operating continuously, in a stable orbit, measuring a constant layer of the atmosphere, at the same local time every day, with no instrumental calibration drifts.”

Over at Real Science, Steven Goddard proclaims “2014 Was The Least Hot Year On Record In The US” because 2014 “recorded the smallest percent of stations to reach 90 degrees in US history. Only 84% of US HCN stations reached 90 degrees. This was the first year when more than 15% of stations failed to reach 90 degrees any time during the year.”

A few days ago, Tony Davis of the Arizona Daily Star got into the propaganda business with this article “Tucson’s record warmth ‘consistent’ with climate-change forecast.” The term “consistent with” is, in my opinion, a weasel word that implies much but fails to prove anything. Late in the article Davis admits that the high temperature average for 2014 was due mainly to hotter minimum temperatures at night: “But hotter nights were the bigger drivers of 2014’s record warmth.” This is a signature of the Urban Heat Island effect. Our asphalt and concrete are warmed during the day and radiate heat at night. (For more on UHI see Urban heat island effect on temperatures, a tale of two cities)

NOAA says that a warm December made 2014 the 34th warmest year in the US. (Source)

Finally, let’s put things in a longer perspective and look at global temperatures for the past 10,000 years. Given that perspective, it looks like 2014 was about the 8000th warmest year, give or take 1,000 years.

Cuffey and Clow

UPDATE: Dr. Tim Ball has an interesting essay: 2014: Among the 3 percent Coldest Years in 10,000 years?

New NOAA data show cooling trend for last 10 years

The National Oceanic and Atmospheric Administration (NOAA) maintains an official temperature record for the United States through its network of weather stations called the U.S. Historical Climatology Network (USHCN). There are many problems with this network including instrumental errors and siting in or near urban areas which subject the stations to the artificial warming of the urban heat island effect.

A local example of change in site conditions is illustrated by the USHCN station on the campus of the University of Arizona. The first photo below shows how it looked in 1923 – in a relatively open area. The more recent second photo shows how the station is now surrounded by buildings, asphalt streets and parking lots. The site change is the main reason why the station reports warmer temperatures than in the past.



The official temperatures reported by the National Weather Service currently come from a station at Tucson airport which is, itself, surrounded by concrete and asphalt.

NOAA has also established a parallel set of weather stations, operating for about 10 years now, that address the many problems of the USHCN. That network is the United States Climate Reference Network (USCRN). These are modern stations, sited well away from urban influence, that use state of the art instrumentation and are therefore not subject to many of the problems associated with the old USHCN network.

NOAA has just released the data from the 114 stations of the new USCRN. The difference in temperatures recorded by the two networks shows that the old USHCN has been overstating the temperature anywhere from +0.5°C on average, up to almost +4.0°C (+0.9°F to +7.2°F) in some locations during the summer months. Remember that when you see headlines blaring that a certain day, week, month, year was the warmest since…. whenever. The new USCRN data is more in line with the satellite temperature record.
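A network-level bias figure like the +0.5°C average is, at bottom, just the mean of paired differences between co-located legacy and reference readings. A minimal sketch with hypothetical numbers:

```python
# Paired same-day readings (deg C) from a legacy station and a
# nearby reference station. Values are hypothetical.
legacy    = [21.4, 22.0, 25.3, 30.1, 28.7]
reference = [20.8, 21.5, 24.9, 29.4, 28.1]

bias = sum(a - b for a, b in zip(legacy, reference)) / len(legacy)
print(f"legacy network reads {bias:+.2f} C warmer on average")  # +0.56
```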

The following graph is a temperature anomaly plot from the new network (Fahrenheit on left scale, Celsius on right). There have been heat waves and cold spells, but the overall trend is one of cooling for the past 10 years.

USCRN temps US

Plots of maximum temperatures and minimum temperatures also show a cooling trend.

Anthony Watts has a long post with detailed explanations and nice graphics on his WUWT blog.

The point here is that much of the “global warming” reported in the media is an artifact of spurious temperatures from an obsolete network of weather stations.

P.S. Do you think you will ever see this story in the Arizona Daily Star?

See also more articles about the old USHCN network and “adjustments” made to its data:

NOAA Temperature Record “Adjustments” Could Account for Almost All “Warming” since 1973

US Temperature Trends Show a Spurious Doubling Due to NOAA Station Siting Problems and Post-Measurement Adjustments, Says a New Study

The past is getting cooler

In George Orwell’s dystopian novel “1984,” the “Ministry of Truth” was constantly changing historical records to conform with political policy of the present. The same thing seems to be happening with official temperature records.

With each new publication of the official temperature record, the past, especially the 1930s, is shown as cooler than in earlier versions, and more recent temperatures are shown as warmer. This phenomenon is due to government data manipulation designed to make the present look warmer in relation to the past. This falls in line with the global warming scam that demands we reduce our carbon dioxide emissions and use less reliable wind and solar energy rather than burn our abundant fossil fuels, or dire consequences will follow.

Chief temperature tamperers in the U.S. are the National Climatic Data Center, part of NOAA, which maintains the Global Historical Climatology Network (GHCN), and the Goddard Institute for Space Studies (GISS) which is part of NASA.

Steven Goddard (no relation), proprietor of the blog Real Science, has a special page which documents the data tampering with many comparative graphs. Here is how GISS plotted U.S. temperatures in 1999:

US Temp 1999

And here is how GISS plotted U.S. temperatures in 2011:

US Temp 2011

Notice how the past got cooler? Note that the vertical scales are different between the graphs. However, look particularly at the year 1999 on both graphs. On the earlier graph, the 1999 temperature anomaly was less than 1 degree C and cooler than the 1930s. In the more recent graph, however, the 1999 temperature anomaly was more than 1 degree C and warmer than the 1930s.

GISS isn’t content to mess with just U.S. temperatures; it also revises world temperatures. For instance, Paul Homewood notes in his post “Cooling The Past In Iceland” that GISS made some “homogeneity adjustments” for Reykjavik, resulting in these before-and-after temperature graphs:

Iceland temp before

Iceland temp after

Steven Goddard notes that the New Zealand government is into data tampering also, and he compares before and after “adjustments.” See his post here:

New Zealand temp adjustment

In the realm of climate, it seems, government science is mostly political science. Good science has been corrupted by ideology and politically correct funding. Big Brother is in charge of government climate science.

See also:

Climate data, fact or fiction

Cooking the Books – Was 2012 Really the Hottest Ever in the US?

NOAA Accused of Fabricating Temperature Data

NOAA Temperature Record “Adjustments” Could Account for Almost All “Warming” since 1973

NOAA Experiment Shows US Temperatures Not as Warm as Reported

The State of Our Surface Temperature Records

US Temperature Trends Show a Spurious Doubling

Failure of climate models shows that carbon dioxide does not drive global temperature

Government climate science versus reality

2010 the 9000th Warmest Year (2010 was touted as the warmest year in the past 150 years.)

This article was originally published in the Arizona Daily Independent.

P.S.: Dr. Tim Ball has an interesting article on the IPCC’s promotion of climate alarmism and how they altered the records to conform with their agenda:


P.P.S. The National Climatic Data Center adjusts temperatures every month. Here is a good explanation of why this always produces a warming bias (see full post here):

“The reason why station values in the distant past end up getting adjusted is due to a choice by NCDC to assume that current values are the ‘true’ values. Each month, as new station data come in, NCDC runs their pairwise homogenization algorithm which looks for non-climatic breakpoints by comparing each station to its surrounding stations. When these breakpoints are detected, they are removed. If a small step change is detected in a 100-year station record in the year 2006, for example, removing that step change will move all the values for that station prior to 2006 up or down by the amount of the breakpoint removed. As long as new data leads to new breakpoint detection, the past station temperatures will be raised or lowered by the size of the breakpoint.”
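The mechanism described in the quote can be sketched in a few lines of code. This is an illustrative toy, not NCDC’s actual pairwise homogenization algorithm: the function name, the sample record, and the step size are all hypothetical. It shows only the key consequence the quote describes: because the most recent values are anchored as the “true” values, removing a detected step change shifts every value *before* the breakpoint, so past temperatures move each time a new breakpoint is found.

```python
# Toy sketch (NOT NCDC's actual code) of how breakpoint removal
# shifts past values while leaving current values anchored in place.

def remove_breakpoint(series, breakpoint_index, step_size):
    """Remove a detected step change of `step_size` occurring at
    `breakpoint_index` by shifting all EARLIER values by that amount.
    Values from the breakpoint onward are treated as 'true' and
    left unchanged."""
    adjusted_past = [v + step_size for v in series[:breakpoint_index]]
    return adjusted_past + series[breakpoint_index:]

# Hypothetical 10-year station record with an apparent -0.3 C step
# (e.g. from an instrument change) beginning at index 6:
record = [14.1, 14.0, 14.2, 14.1, 14.0, 14.2, 13.8, 13.9, 13.7, 13.9]

# Removing that breakpoint lowers the six earlier values by 0.3 C,
# cooling the station's past relative to its present:
adjusted = remove_breakpoint(record, 6, -0.3)
```

Under this scheme, each newly detected breakpoint re-adjusts the entire earlier portion of the record, which is why historical station values keep changing with every monthly run.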

Government winter weather forecasts botched again

Supercomputers can be great machines, but their use by NOAA in the U.S. and by the British Met Office demonstrates the old saying “garbage in – garbage out.”

Weather and climate are complex. Even with supercomputers, if the wrong assumptions are input, then the results are often wrong. Let’s see how NOAA and the Met Office did with winter forecasts this year.

The graphic below was made last fall by NOAA, showing its predictions for winter temperatures in the U.S. The orange areas marked “A” are where NOAA predicted above-average temperatures, the blue areas marked “B” are for below-normal temperatures, and the white areas marked “EC” have equal chances of being above or below normal.

NOAA winter forecast 2013

So much for predictions. The conditions that actually occurred are shown on the graphic below (Source: NoTricksZone). The blue area is colder than normal; green is much colder than normal.

2013 US winter actual

Remember, these are the people who claim they can predict climate change 10 to 100 years into the future.

Back in November, the British Met Office predicted that this winter would be colder and drier than normal (Source). However, this winter the UK has had record rainfalls (Source and here).

And, (from NoTricksZone): “This year western Europe has experienced a mild winter as a parade of low pressure systems coming in from off the Atlantic has fed the continent with a steady supply of mild southerly winds. For Germany this winter will be the first mild one in 6 years after a record 5 consecutive winters of colder than normal winters.”

This year was not the first time the Met Office got a forecast very wrong. Back in 2012, it touted its new supercomputer, capable of 1,000 billion calculations per second and consuming 1.2 megawatts of energy – enough to power a small town. The head of the Met Office claimed that this new computer “will enable the Met Office to deliver more accurate forecasts, from hours to a century ahead.” (See my post “British supercomputer botches weather forecasts.”) As it turned out, spring 2012 was one of the wettest on record in the UK.

I pick on 2012 because my wife and I happened to be traveling in the UK that June. We rather enjoyed the cool and wet weather as a contrast to an Arizona June, which is hot (35–45 °C) and dry.

The results show that even with supercomputers, predicting the weather is a tricky business. Government agencies seem to make assumptions based on political science rather than real science. Perhaps they should consult the Old Farmer’s Almanac more often; that publication claims to get its long-range forecasts right about 80% of the time.