Climate change

“Climategate” comes back to bite the University of Arizona

“The University of Arizona has been ordered to surrender emails by two UA scientists that a group claims will help prove that theories about human-caused climate change are false and part of a conspiracy.” (Arizona Daily Star) The professors involved are Malcolm Hughes, who is still with the UA, and Jonathan Overpeck, who left earlier this year.

The backstory begins in 2009:

In 2009, it was revealed that someone hacked into the files of the Climatic Research Unit (CRU) based at the University of East Anglia, in England. The CRU has been a major proponent of anthropogenic global warming and a principal in report preparation for the Intergovernmental Panel on Climate Change (IPCC).

More than 1,000 internal emails and several reports from CRU were posted on the internet, and the blogosphere went wild with the implications of the revealed messages. Dr. Phil Jones, head of CRU, confirmed that his organization had been hacked and that the emails are accurate. This disclosure did not include any emails at other institutions such as Penn State or the University of Arizona.

The emails reveal a concerted effort on the part of a small group of scientists to manipulate data, suppress dissent, and foil the dissemination of the information by “losing” data and skirting Britain’s Freedom of Information Act. The emails reveal that the contention of dangerous human-induced global warming is not supported by the data, that those supporting that contention knew it, and sought to control the discussion so as to hide the unreliable nature of what they were claiming.

Part of the controversy involved the infamous “hockey stick” graph devised by Michael Mann of Penn State and subsequently adopted by the IPCC.

In the “battle of the graphs” the bottom panel shows temperatures based on proxy data and measurements. It shows that the Medieval Warm Period of 1,000 years ago was much warmer than now. Mann’s hockey stick did away with the Medieval Warm Period and showed only a large spike of recent warming – hence the name “hockey stick”. The “hockey stick” made its debut in the journal Geophysical Research Letters in 1999 in a paper by Michael Mann, Raymond Bradley, and Malcolm Hughes that built upon a 1998 paper by the same authors in the journal Nature which detailed the methodology for creating a proxy temperature reconstruction.

There are problems with the Hockey Stick according to Canadian researchers Steve McIntyre and Ross McKitrick. “The first mistake made by Mann et al. and copied by the UN in 2001 lay in the choice of proxy data. The UN’s 1996 report had recommended against reliance upon bristlecone pines as proxies for reconstructing temperature because 20th-century carbon-dioxide fertilization accelerated annual growth and caused a false appearance of exceptional recent warming. Notwithstanding the warning against reliance upon bristlecones in UN 1996, Mann et al. had relied chiefly upon a series of bristlecone-pine datasets for their reconstruction of medieval temperatures. Worse, their statistical model had given the bristlecone-pine data sets 390 times more prominence than the other datasets they had used.

Furthermore, the statistical algorithms in Mann et al. were shown to be flawed. McIntyre ran Mann’s algorithm 10,000 times, having replaced all palaeoclimatological data with randomly generated, electronic “red noise”. They found that, even with this entirely random data, altogether unconnected with the temperature record, the model nearly always constructed a “hockey stick” curve similar to that in the UN’s 2001 report.” (See their detailed report)
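The red-noise test described above can be sketched in a few lines of code. This is a simplified illustration, not McIntyre’s actual program: the series count, record length, calibration window, and AR(1) coefficient below are hypothetical stand-ins, and Mann’s full procedure involved additional standardization steps. The point it demonstrates is that “short centering” (centering each series on the recent calibration period only, rather than on the full record) tends to manufacture hockey-stick-shaped leading components even from pure, signal-free noise:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical sizes for illustration only (not MBH98's actual proxy network).
n_series, n_years, calib = 50, 600, 100
phi = 0.9        # AR(1) coefficient: strongly persistent "red" noise
n_trials = 100

def red_noise():
    """Generate AR(1) proxy series containing no climate signal at all."""
    x = np.zeros((n_series, n_years))
    eps = rng.standard_normal((n_series, n_years))
    for t in range(1, n_years):
        x[:, t] = phi * x[:, t - 1] + eps[:, t]
    return x

def pc1(data, short_center):
    """Leading principal component over time, under either centering choice."""
    if short_center:   # Mann-style: center on the calibration period only
        c = data - data[:, -calib:].mean(axis=1, keepdims=True)
    else:              # conventional: center on the full record
        c = data - data.mean(axis=1, keepdims=True)
    return np.linalg.svd(c, full_matrices=False)[2][0]

def hockey_stick_index(pc):
    """|blade mean - shaft mean| in units of the shaft's standard deviation."""
    shaft, blade = pc[:-calib], pc[-calib:]
    return abs(blade.mean() - shaft.mean()) / shaft.std()

count_short = count_full = 0
for _ in range(n_trials):
    proxies = red_noise()
    count_short += hockey_stick_index(pc1(proxies, True)) > 1
    count_full += hockey_stick_index(pc1(proxies, False)) > 1

print(f"hockey sticks from pure noise: short-centered {count_short}/{n_trials}, "
      f"full-centered {count_full}/{n_trials}")
```

Even with these toy parameters, the short-centered runs produce “hockey stick” first components far more often than conventionally centered runs do, which is the essence of the McIntyre–McKitrick critique.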

Mann and his co-authors had another problem. Their proxy data began to rise, but then took a plunge into cooler temperatures. They hid this decline by truncating the proxy data and substituting rising measured temperatures without telling anyone. This became known as “Mike’s Nature Trick”. (Read more)

One other incident: In my article A Simple Question for Climate Alarmists I posed this question: “What physical evidence supports the contention that carbon dioxide emissions from burning fossil fuels are the principal cause of global warming since 1970?” In a public forum, I had the opportunity to pose this question to then UofA professor Jonathan Overpeck. He could not cite any supporting physical evidence.



Effects of global warming on human health

The EPA’s “endangerment finding” classified carbon dioxide as a pollutant and claimed that global warming will have adverse effects on human health. Real research says the opposite: cold is deadlier.  The scientific evidence shows that warming is good for health. This is discussed in detail in chapter 7 of Climate Change Reconsidered II: Biological Impacts published by the Heartland Institute. See links to the entire publication at:

Here are the key findings based on extensive review of the scientific literature:

• Warmer temperatures lead to a net decrease in temperature-related mortality, including deaths associated with cardiovascular disease, respiratory disease, and strokes. The evidence of this benefit comes from research conducted in every major country of the world.

• In the United States the average person who died because of cold temperature exposure lost in excess of 10 years of potential life, whereas the average person who died because of hot temperature exposure likely lost no more than a few days or weeks of life.

• Some 4,600 deaths are delayed each year as people in the U.S. move from cold northeastern states to warm southwestern states. Between 3 and 7% of the gains in longevity experienced by the U.S. population over the past three decades are due simply to people moving to warmer states.

• Cold-related deaths are far more numerous than heat-related deaths in the United States, Europe, and almost all countries outside the tropics. Coronary and cerebral thrombosis account for about half of all cold-related mortality.

• Global warming is reducing the incidence of cardiovascular diseases related to low temperatures and wintry weather by a much greater degree than it increases the incidence of cardiovascular diseases associated with high temperatures and summer heat waves.

• The adverse health impacts of cold temperatures, especially with respect to respiratory health, are more significant than those of high temperatures in many parts of the world, including Spain, Canada, Shanghai, and Taiwan. In the subtropical island of Taiwan, for example, researchers found low minimum temperatures were the strongest risk factor associated with outpatient visits for respiratory diseases.

• A vast body of scientific examination and research contradicts the claim that malaria will expand across the globe and intensify as a result of CO2-induced warming.

• Concerns over large increases in vector-borne diseases such as dengue as a result of rising temperatures are unfounded and unsupported by the scientific literature, as climatic indices are poor predictors for dengue disease.

• While climatic factors largely determine the geographical distribution of ticks, temperature and climate change are not among the significant factors determining the incidence of tick-borne diseases.

• The ongoing rise in the air’s CO2 content is not only raising the productivity of Earth’s common food plants but also significantly increasing the quantity and potency of the many health-promoting substances found in their tissues, which are the ultimate sources of sustenance for essentially all animals and humans.

• Atmospheric CO2 enrichment positively impacts the production of numerous health-promoting substances found in medicinal or “health food” plants, and this phenomenon may have contributed to the increase in human life span that has occurred over the past century or so.

• There appears to be little reason to expect any significant CO2-induced increases in human health-harming substances produced by plants as the atmosphere’s CO2 concentration continues to rise.

Read the full report for details and supporting references.

For more background, see papers linked to in How Climate Has Affected Human History.

These papers show that humanity prospered during warm times (e.g. the Renaissance period) and suffered during cold times (e.g. the “Dark Ages”).

See also:

Dump EPA endangerment finding

Fourth National Climate Assessment is junk science

Fourth National Climate Assessment is junk science

The U.S. Global Change Research Program (USGCRP) has just released the final version of its Fourth National Climate Assessment report, one that many were claiming the Trump administration would suppress because, like its predecessors, it is mainly a political document rather than a true scientific assessment. You can read the full 477-page report here:

The main conclusion is: “This assessment concludes, based on extensive evidence, that it is extremely likely that human activities, especially emissions of greenhouse gases, are the dominant cause of the observed warming since the mid-20th century.”

The “extensive evidence” is based entirely on climate modeling rather than on observations. The results produced by models diverge widely from reality. The new report makes the same claims and invokes the same junk science as the previous 2014 report which I analyzed here: National Climate Assessment Lacks Physical Evidence.

As an example of unfounded claims made in the new report we see this statement in the executive summary: “Heatwaves have become more frequent in the United States since the 1960s, while extreme cold temperatures and cold waves are less frequent.”

But plots from the EPA and NOAA show that the most intense heat waves occurred in the 1930s.


Another example:

Claim in the report: “The incidence of large forest fires in the western United States and Alaska has increased since the early 1980s and is projected to further increase in those regions as the climate changes, with profound changes to regional ecosystems.” This statement is technically correct but it represents cherry-picking and lying by omission.

The National Interagency Fire Center has a table listing the number of fires and acreage burned from 1960 through 2016 (see: ).

We see from the table that there were 18,229 fires reported in 1983, which increased to 67,743 fires reported in 2016. What the report doesn’t mention is that the number of fires each year from 1960 to 1982 was in the six-figure range, e.g., in 1960 there were 103,387 fires and in 1981 there were 249,370 fires. The number dropped to 174,755 fires in 1982.
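The cherry-picking can be checked with simple arithmetic on the figures quoted above. This sketch includes only the years the text cites, not the full NIFC table:

```python
# Annual U.S. wildfire counts quoted above (National Interagency Fire Center).
fires = {1960: 103_387, 1981: 249_370, 1982: 174_755, 1983: 18_229, 2016: 67_743}

# Start the comparison in 1983 and fires appear to have nearly quadrupled...
ratio_from_1983 = fires[2016] / fires[1983]   # about 3.7

# ...but start in 1960 and the same 2016 endpoint is a substantial decrease.
ratio_from_1960 = fires[2016] / fires[1960]   # about 0.66

print(f"2016 vs 1983: {ratio_from_1983:.2f}x;  2016 vs 1960: {ratio_from_1960:.2f}x")
```

Same endpoint, same data source; only the chosen start year flips the apparent direction of the trend.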

Fire frequency does not necessarily increase with warming. In many parts of the world, fire frequency decreases with warming. See my post “Wildfires And Warming – relationship not so clear.”

A third example of unfounded claims:

Section 2.6.1 of the report discusses the “greenhouse effect.” They claim: “As increasing GHG [greenhouse gases] concentrations warm the atmosphere, tropospheric water vapor concentrations increase, thereby amplifying the warming effect.” Climate models depend on this assumption. But NOAA’s own data show that global humidity has been decreasing with warming.

Comments by others:

Theoretical physicist Steve Koonin has an op-ed in the Wall Street Journal entitled “A Deceptive New Report on Climate.”

Koonin was undersecretary of energy for science during President Obama’s first term and is director of the Center for Urban Science and Progress at New York University. The WSJ article is pay-walled but you can read extensive excerpts here.

Among his comments:

One notable example of alarm-raising is the description of sea-level rise, one of the greatest climate concerns. The report ominously notes that while global sea level rose an average 0.05 inch a year during most of the 20th century, it has risen at about twice that rate since 1993. But it fails to mention that the rate fluctuated by comparable amounts several times during the 20th century. The same research papers the report cites show that recent rates are statistically indistinguishable from peak rates earlier in the 20th century, when human influences on the climate were much smaller. The report thus misleads by omission.

Note: The rate of sea level rise and fall tends to be cyclical on decadal and bi-decadal periods. See my article: The Sea Level Scam.

Koonin also comments on heat waves: The report’s executive summary declares that U.S. heat waves have become more common since the mid-1960s, although acknowledging the 1930s Dust Bowl as the peak period for extreme heat. Yet buried deep in the report is a figure [6.3] showing that heat waves are no more frequent today than in 1900.

Comments by Dr. Patrick J. Michaels, director of the Center for the Study of Science at the Cato Institute, past president of the American Association of State Climatologists, and former program chair for the Committee on Applied Climatology of the American Meteorological Society. He was a research professor of Environmental Sciences at the University of Virginia for 30 years. Read full post “What You Won’t Find in the New National Climate Assessment.”

Under the U.S. Global Change Research Act of 1990, the federal government has been charged with producing large National Climate Assessments (NCA), and today the most recent iteration has arrived. It is typical of these sorts of documents – much about how the future of mankind is doomed to suffer through increasingly erratic weather and other tribulations. It’s also missing a few tidbits of information that convincingly argue that everything in it with regard to upcoming 21st century climate needs to be taken with a mountain of salt.

The projections in the NCA are all based upon climate models. If there is something big that is systematically wrong with them, then the projections aren’t worth making or believing.

The report does not tell you that:

1) Climate model predictions of global temperature diverge widely from observations.

2) No hot spot over tropics: The models predict that there should have been a huge “hot spot” over the entire tropics, which is a bit less than 40% of the globe’s surface. Halfway up through the atmosphere (by pressure), or at 500 hPa, the predicted warming is also twice what is being observed, and further up, the prediction is for seven times more warming than is being observed.

The importance of this is paramount. The vertical distribution of temperature in the tropics is central to the formation of precipitation.

Missing the tropical hot spot provokes an additional cascade of errors. A vast amount of the moisture that forms precipitation here originates in the tropics. Getting that wrong trashes the precipitation forecast, with additional downstream consequences, this time for temperature.

When the sun shines over a wet surface, the vast majority of its incoming energy is shunted towards the evaporation of water rather than direct heating of the surface. This is why in the hottest month in Manaus, Brazil, in the middle of the tropical rainforest and only three degrees from the equator, high temperatures average only 91 F (not appreciably different than humid Washington, DC’s 88 F). To appreciate the effect of water on surface heating of land areas, high temperatures in July in bone-dry Death Valley average 117 F.

Getting the surface temperature wrong will have additional consequences for vegetation and agriculture. In general, a wetter U.S. is one of bumper crops and good water supplies out west from winter snows, hardly the picture painted in the National Assessment.

If the government is going to spend time and our money on producing another assessment report, that report should be based on empirical evidence, not climate models. Note that USGCRP is a conglomeration of 13 federal agencies that had a 2016 budget of $2.6 billion for the climate assessment project. Did you get your money’s worth?

Climate modelers make some outlandish predictions, but occasionally there is a glimmer of honesty:

“The forcings that drive long-term climate change are not known with an accuracy sufficient to define future climate change.” — James Hansen, “Climate forcings in the Industrial era”, PNAS, Vol. 95, Issue 22, 12753-12758, October 27, 1998.

“In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the prediction of a specific future climate state is not possible.” — Final chapter, Draft TAR 2000 (Third Assessment Report), IPCC.

And remember: “The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.” – H. L. Mencken

One other point:

Temperatures recorded by the US Climate Reference Network (USCRN) show no statistically significant trend since this network was established in 2004. These data come from state-of-the-art, ultra-reliable, triple-redundant weather stations placed in pristine environments. As a result, these temperature data need none of the adjustments that plague the older surface temperature networks, such as USHCN and GHCN, which have been heavily adjusted to attempt corrections for a wide variety of biases. Using NOAA’s own USCRN data, which eliminates all of the squabbles over the accuracy of and the adjustment of temperature data, we can get a clear plot of pristine surface data.
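For readers who want to check a “no statistically significant trend” claim for themselves, here is a minimal sketch of the standard test: fit an ordinary least-squares line and examine the p-value of its slope. The monthly anomaly series below is synthetic stand-in data with zero true trend, not the actual USCRN record, and a rigorous test would also correct for autocorrelation in monthly data (which inflates apparent significance):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic stand-in for a monthly anomaly series (deg C), 2004-2017:
# zero true trend plus weather noise. The real USCRN series would be loaded here.
months = np.arange(14 * 12)
anomalies = rng.normal(0.0, 0.5, size=months.size)

res = stats.linregress(months, anomalies)
trend_per_decade = res.slope * 120   # per-month slope -> per-decade trend

print(f"trend: {trend_per_decade:+.3f} C/decade, p-value: {res.pvalue:.2f}")
print("statistically significant at the 95% level?", res.pvalue < 0.05)
```

If the p-value exceeds 0.05, the fitted slope cannot be distinguished from zero at the conventional 95% confidence level, which is what “no statistically significant trend” means.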



By Ken Haapala, President, Science and Environmental Policy Project (SEPP)

USGCRP Science?
What is now called the USGCRP has a murky, politicized past. It was established in 1989 and mandated by Congress in 1990 to “assist the Nation and the world to understand, assess, predict, and respond to human-induced and natural processes of global change.” It is to produce a National Climate Assessment every four years. Since 1990, it has produced four reports. The last full report, the 3rd National Climate Assessment, was released in May 2014. Apparently, after the election of Mr. Trump, the USGCRP decided on the CSSR, released last week. As with prior USGCRP reports, it ignores the “natural processes of global change”, which is part of its Congressional mandate.

Such political games are part of USGCRP’s established history. After the election of Mr. Bush in 2000, under a prior name, the USGCRP released the 2000 U.S. National Assessment of Climate Change report. As shown in the 2008 report of the Nongovernmental International Panel on Climate Change (NIPCC) (Fig 16 & pp 14 to 16), the government report had projections / predictions that were nonsense. The government entity had used two different climate models for climate change to 2090, which produced dramatically different results for precipitation, by region. The worst example was for the Red River watershed in the Dakotas and Minnesota. One model had a precipitation drop of about 80%, turning the region into a desert; the second model had a precipitation increase of about 80%, resulting in dramatic flooding. The disparity between the two models is but one example of how inadequately tested global climate models may be used to project / predict almost anything. The federal courts found that the 2000 report did not meet the standards of the Data Quality Act, also called the Information Quality Act.

The recent reports of the UN Intergovernmental Panel on Climate Change (IPCC) and the USGCRP have tried to cover up the disparities in the results of their global climate models by blending them into an ensemble. Usually, there are too few runs of any model to establish realistic forecasts for that model. The forecasts change with each run.

Further, the major problem remains: the models are not adequately tested to be used to form government policies on global warming / climate change. As comments by Patrick Michaels carried in last week’s TWTW illustrate, USGCRP ignores the important discrepancy between the atmospheric temperature trends forecast by the global climate models and the actual atmospheric temperature trends. The USGCRP ignores physical science.


For background reading:

A Simple Question for Climate Alarmists

Evidence that CO2 emissions do not intensify the greenhouse effect

An examination of the relationship between temperature and carbon dioxide

Analysis of the National Climate Assessment Report 2014

Trump, the National Climate Assessment report, and fake news 2017

Winter Weather forecast – NOAA vs Old Farmer’s Almanac

The U.S. National Oceanic and Atmospheric Administration (NOAA) has just issued its prediction for temperature and precipitation for the winter of 2017-2018. You can see NOAA maps and a video here.

In general, NOAA predicts a colder and wetter winter in the northwest, and a warmer and drier winter in the southwest. Here are the NOAA maps:

The Old Farmer’s Almanac predicts the opposite. Go to and click on a region on the map for detailed predictions.


For the southwest (region 14). OFA predicts:

Winter will be colder than normal, with above-normal precipitation. The coldest periods will be from late November into early December and in late December and mid-January. Snowfall will be above normal in the east and near to below normal in the west, with the snowiest periods in late December, early and mid-January, and early February. April and May will be slightly rainier than normal, with temperatures below normal in the east and near normal in the west. Summer will be slightly hotter than normal, with the hottest periods in mid- and late June and early August. Rainfall will be below normal in the northwest and above normal in the southeast. September and October will be cooler and drier than normal.


For the deep south (region 8) OFA predicts “Winter will be rainier and slightly cooler than normal, with near- or above-normal snowfall.” NOAA is predicting warmer and drier.


NOAA predictions are based on observation and computer modeling (sometimes with false assumptions as to what drives climate). The Old Farmer’s Almanac forms its predictions by comparing solar patterns and historical weather conditions with current solar activity. (Read more)

Print out this post and check back at the end of February to see which organization came closer to reality. Much depends upon whether or not we see a La Niña develop this winter.

To see how some previous NOAA predictions turned out see:

Will global warming weaken the North American Monsoon?

Arizona gets most of its rain from thunderstorms during the summer, a period called the North American monsoon (see Arizona Monsoon for background and the anatomy of thunderstorms). By government decree, the monsoon season lasts from June 15 through September 30. In actuality, rains usually start in early July following the rain-dance ceremony of the Tohono O’odham people. In 2017, there were unusually heavy rains in July and below normal rain in August and September.

Researchers from Princeton University, using a new precipitation model, claim that global warming will decrease the rain of the monsoon. From the abstract of their paper published in Nature:

Future changes in the North American monsoon, a circulation system that brings abundant summer rains to vast areas of the North American Southwest, could have significant consequences for regional water resources. How this monsoon will change with increasing greenhouse gases, however, remains unclear, not least because coarse horizontal resolution and systematic sea-surface temperature biases limit the reliability of its numerical model simulations. Here we investigate the monsoon response to increased atmospheric carbon dioxide (CO2) concentrations using a 50-km-resolution global climate model which features a realistic representation of the monsoon climatology and its synoptic-scale variability. It is found that the monsoon response to CO2 doubling is sensitive to sea-surface temperature biases. When minimizing these biases, the model projects a robust reduction in monsoonal precipitation over the southwestern United States, contrasting with previous multi-model assessments.

Let’s see how this model premise has worked so far:

The graph below, from NOAA data, shows that year-to-year precipitation varies quite a bit. The overall trend is for increasing precipitation with global warming, not a decrease.

A plot of annual precipitation reflects the high temperatures and drought conditions of the first half of the 20th Century, but there is no apparent trend for more recent warming.

This new model, like all climate models, assumes that carbon dioxide is the major forcing of global temperature, an assumption for which there is no physical evidence.


A Simple Question for Climate Alarmists

An examination of the relationship between temperature and carbon dioxide

Why Hurricanes Can’t Be Blamed On Global Warming

The leftish press and Hollywood climate experts have been claiming that the recent rash of dangerous hurricanes is due to global warming. Dr. Roy Spencer, U.S. Science Team leader for the Advanced Microwave Scanning Radiometer flying on NASA’s Aqua satellite, takes exception to these claims in a short blog post and in a new E-book available from Amazon for $2.99. The E-book is about 11,000 words long and contains 17 illustrations. I recommend you read it.

In the book, Spencer explains the origin of hurricanes and gives a history of U.S. hurricanes from colonial times to the present, including comments on hurricanes Harvey and Irma.

Spencer notes that geological studies of sediments in coastal lakes in Texas and Florida show that “catastrophic hurricane strikes were more frequent 1,000 to 2,000 years ago than in the most recent 1,000 years.” Hurricanes making landfall in Florida show a downward trend in both number and intensity (that trend includes hurricane Irma). Spencer says that hurricanes in the tropical Atlantic, Caribbean, and Gulf of Mexico are not limited by sea surface temperatures.

He also notes that “two major hurricane strikes endured by the Massachusetts Bay Colony, in 1635 and in 1675, have yet to be rivaled in more modern times.”

“…Most Atlantic hurricanes can be traced back to African easterly waves [of low wind shear].  These waves draw their energy from the temperature contrast between the hot air over the Sahara Desert and the cooler air over the Sahel, and as they leave the west coast of Africa they ‘kick start’ the organization of rain shower activity over the tropical eastern Atlantic Ocean.”

You will have to read the E-book to delve more deeply into the mechanics of hurricanes. Here is an excerpt:

If you were to go up inside the eye at the altitude where jets fly, you would find the air temperature there is 10 or 20 deg. F warmer than normal for that altitude. This warmth is caused by air being forced to sink in response to rising air in the showers and thunderstorms surrounding the eye. This ‘subsidence warming’ is a universal feature of all precipitation systems, but only in hurricanes is it highly concentrated into one relatively small area. All of the warm rising air in billowing rain clouds must be exactly matched by sinking air elsewhere, and in the case of hurricanes, that sinking air is most concentrated and intense in the eye of the storm.  For more common rain systems, the warming is much weaker as it is spread over huge areas hundreds or even thousands of miles in diameter. Only a few miles away from the eye is the heavily raining eyewall of the hurricane; this is where the strongest surface winds occur.

Spencer also has a chapter on “The Effect of Sea Level Rise on Hurricane Storm Surge” in which he shows that sea level rise has been mostly if not entirely natural, with no convincing evidence that it has accelerated from human-caused global warming.

Separate from Spencer’s data, Dr. Chris Landsea of the NOAA Hurricane Research Division presents a table of Atlantic hurricanes beginning in 1851. You will see that there is no sign of influence by global warming. Landsea has this caveat about the data: “The Atlantic hurricane database (or HURDAT) extends back to 1851. However, because tropical storms and hurricanes spend much of their lifetime over the open ocean – some never hitting land – many systems were “missed” during the late 19th and early 20th Centuries (Vecchi and Knutson 2008). Starting in 1944, systematic aircraft reconnaissance was commenced for monitoring both tropical cyclones and disturbances that had the potential to develop into tropical storms and hurricanes. This did provide much improved monitoring, but still about half of the Atlantic basin was not covered (Sheets 1990). Beginning in 1966, daily satellite imagery became available at the National Hurricane Center, and thus statistics from this time forward are most complete (McAdie et al. 2009).” See data

Back in 1999, Landsea et al. published a paper which found “that multidecadal variability is more characteristic of the region. Various environmental factors including Caribbean sea level pressures and 200mb zonal winds, the stratospheric Quasi-Biennial Oscillation, the El Niño-Southern Oscillation, African West Sahel rainfall and Atlantic sea surface temperatures … show significant, concurrent relationships to the frequency, intensity and duration of Atlantic hurricanes.” (Source)

Dr. Neil Frank, former Director National Hurricane Center:

“Over the past several weeks numerous articles suggest Harvey and Irma were the result of global warming. The concept is a warmer earth will generate stronger and wetter hurricanes. A number of people have said Irma was the most intense hurricane in the history of the Atlantic while Harvey was the wettest and both were good examples of what we can expect in the future because of global warming. What does a fact check reveal about these two hurricanes?”

Frank shows that neither of the above contentions is true, read more.

See also:

Houston’s long history of flooding

Evidence that CO2 emissions do not intensify the greenhouse effect

An examination of the relationship between temperature and carbon dioxide

A Simple Question for Climate Alarmists

Houston’s long history of flooding

Houston, Texas, seat of Harris County, has a long history of flooding because the city was built on a flood plain. The deluge generated by hurricane Harvey in August, 2017, is only the latest episode.

Houston lies within a coastal plain about 50 miles northwest of Galveston. The area has very flat topography which is cut by four major bayous that pass through the city: Buffalo Bayou, which runs into downtown and the Houston Ship Channel; and three of its tributaries: Brays Bayou, which runs along the Texas Medical Center; White Oak Bayou, which runs through the Heights and near the northwest area; and Sims Bayou, which runs through the south of Houston and downtown Houston. The ship channel goes past Galveston and into the Gulf of Mexico.

The land around Houston consists of sand, silt, and clay deposited by local rivers.

The sedimentary layers underneath Houston ultimately extend down some 60,000 feet, with the oldest beds deposited during the Cretaceous. Between 30,000 feet and 40,000 feet below the surface is a layer of salt, the primary source of salt domes which dot the metropolitan area. Since salt is more buoyant than other sediments, it rises to the surface, creating domes and anticlines and causing subsidence due to its removal from its original strata. These structures manage to capture oil and gas as it percolates through the subsurface. [source]

Groundwater pumping also causes subsidence in parts of the city. (See: Geologists find parts of Northwest Houston, Texas sinking rapidly )

Hurricane damage in Houston:

As described by the Harris County Flood Control District (HCFCD) [link]:

When the Allen brothers founded Houston in 1836, they established the town at the confluence of Buffalo and White Oak Bayous. Shortly thereafter, every structure in the new settlement flooded. Early settlers documented that after heavy rains, their wagon trips west through the prairie involved days of walking through knee-deep water. Harris County suffered through 16 major floods from 1836 to 1936, some of which crested at more than 40 feet, turning downtown Houston streets into raging rivers.

Houston was flooded during the September 1900 hurricane that wiped out Galveston.

In December 1935 a massive flood occurred in downtown Houston when the water level measured at Buffalo Bayou topped out at 54.4 feet, higher than during Harvey. There have been 30 major floods in the Houston area since 1937, when the flood control district was established, in spite of the construction of flood control measures.

In June 2001, Harris County suffered widespread flooding due to Tropical Storm Allison. According to HCFCD, before leaving the area, Allison would dump as much as 80 percent of the area’s average annual rainfall over much of Harris County, simultaneously affecting more than 2 million people. When the rains finally eased, Allison had left Harris County, Texas, with 22 fatalities, 95,000 damaged automobiles and trucks, 73,000 damaged residences, 30,000 stranded residents in shelters, and over $5 billion in property damage in its wake.

Some climate alarmists are claiming that global warming played a part in the flooding produced by Hurricane Harvey. Dr. Roy Spencer debunks that notion here and here. Storms of Harvey’s magnitude or greater have happened before, and storm damage is not due entirely to weather; some is due to local infrastructure.

It all boils down to the luck of the draw: if you choose to inhabit a flood plain, you will get wet from time to time.

P.S. Prior to Harvey, which made landfall as a Category 4 storm, the U.S. had gone a remarkable 12 years without being hit by a hurricane of Category 3 strength or stronger. Since 1970 the U.S. has only seen four hurricanes of Category 4 or 5 strength. In the previous 47 years, the country was struck by 14 such storms.

Dr. Roy Spencer takes on Al Gore

This is a reblog from Dr. Roy Spencer.

See the original post here.

An Inconvenient Deception: How Al Gore Distorts Climate Science and Energy Policy

August 19th, 2017

Al Gore has provided a target-rich environment of deceptions in his new movie.

After viewing Gore’s most recent movie, An Inconvenient Sequel: Truth to Power, and after reading the book version of the movie, I was more than a little astounded. The new movie and book are chock-full of bad science, bad policy, and factual errors.

So, I was inspired to do something about it. I’d like to announce my new e-book, entitled An Inconvenient Deception: How Al Gore Distorts Climate Science and Energy Policy, now available on

After reviewing some of Gore’s history in the environmental movement, I go through the movie, point by point.

One of Gore’s favorite tactics is to show something that happens naturally, then claim (or have you infer) that it is due to humanity’s greenhouse gas emissions. As I discuss in the book, this is what he did in his first movie (An Inconvenient Truth), too.

For example, sea level rise. Gore is seen surveying flooded streets in Miami Beach.

That flooding is mostly a combination of (1) natural sea level rise (I show there has been no acceleration of sea level rise beyond what was already happening since the 1800s), and (2) satellite-measured sinking of the reclaimed swamps that have been built upon for over 100 years in Miami Beach.

In other words, Miami Beach was going to have to deal with the increasing flooding from their “king tides”, with or without carbon dioxide emissions.

Gore is also shown jumping across meltwater streams on the Greenland ice sheet. No mention is made that this happens naturally every year. Sure, 2012 was exceptional for its warmth and snow melt (which he mentioned), but then 2017 came along and did just the opposite with record snow accumulation, little melt, and the coldest temperature ever recorded in the Northern Hemisphere for a July.

The fact that receding glaciers in Alaska are revealing stumps from ancient forests that grew 1,000 to 2,000 years ago proves that climate varies naturally, and glaciers advance and recede without any help from humans.

So, why is your SUV suddenly being blamed when it happens today?

The list goes on and on.

Some of what Gore claims is just outright false. He says that wheat and corn yields in China are down by 5% in recent decades. Wrong. They have been steadily climbing, just like almost everywhere else in the world. Here’s the situation for all grain crops in China:

And that lack of rainfall in Syria that supposedly caused conflict and war? It didn’t happen. Poor farmers could no longer afford diesel fuel to pump groundwater because Assad tripled the price. Semi-arid Syria is no place to grow enough crops for a rapidly growing population, anyway.

I also address Gore’s views on alternative energy, mainly wind and solar. It is obvious that Gore does not consider government subsidies when he talks about the “cost” of renewable energy sometimes being cheaper than fossil fuels. Apparently, he hasn’t heard that the citizens pay the taxes that then support the alternative energy industries which Gore, Elon Musk and others financially benefit from. If and when renewable energy becomes cost-competitive, it won’t need politicians and pundits like Mr. Gore campaigning for it.

To counter what is in movie theaters now, I had to whip up this book in only 2 weeks, and I didn’t have a marching army of well-funded people like Gore has had. (Too bad he didn’t have someone doing fact-checking.) Despite my disadvantage, I think I present a powerful case that most of what he presents is, at the very least, very deceptive.

Wryheat comments:

I did buy and read Spencer’s book. It’s about 81 Kindle pages long. Spencer succinctly debunks all of Gore’s claims with real evidence. Here is the table of contents:

Introduction
Global Warming Background
Al Gore’s History in the Climate Debate
Not All Science is Created Equal
The Deceptions in An Inconvenient Truth (AIT)
Al Gore and Bill Nye’s Faked Science Experiment
An Inconvenient Sequel: Truth to Power
Warming Temperatures
Greenland Melting
Sea Level Rise and Street Flooding in Miami
Storm Damage
Flooding of the 9/11 Memorial from Hurricane Sandy
Earthrise: The Big Blue Marble
Solar Power, Solar City, and Elon Musk
Pressure on India to use Solar Energy
Air Pollution in China
Typhoon Haiyan Hits the Philippines
Conflict in Syria and the Role of Drought
Crop Yields Are Increasing, Not Decreasing
Is the Atmosphere an Open Sewer for CO2?
The Bataclan Terrorist Attack in Paris
The Paris Climate Conference
Truth to Power
Final Comments

Trump, the National Climate Assessment report, and fake news

The New York Times recently obtained a draft of the upcoming National Climate Assessment report. NYT is worried that the Trump administration will suppress the report. However, according to scientists who worked on the report, it has been available online since last January. (See Daily Caller story) You can download the 545-page 3rd draft report here, but don’t bother.

Besides the “fake news” story in the New York Times, we have a “fake news” story from the Associated Press printed by the Arizona Daily Star. Within that story is this sentence: Contradicting Trump’s claims that climate change is a “hoax,” the draft report representing the consensus of 13 federal agencies concludes that the evidence global warming is being driven by human activities is “unambiguous.”

Definition of unambiguous: “Admitting of no doubt or misunderstanding; having only one meaning or interpretation and leading to only one conclusion.”

Because of that statement, and because the IPCC itself has written that “In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the prediction of a specific future climate state is not possible” (final chapter, Draft TAR 2000, Third Assessment Report, IPCC), I downloaded the report to see just how unambiguous the evidence is. Here is what I found.

1) All their evidence consists of computer modeling. There is no physical evidence. That’s just like the previous National Climate Assessment report. They are, in essence, claiming that evidence of warming is evidence of the cause of warming.

2) On page 139, they discuss how they attribute causes:

Detection and attribution of climate change involves assessing the causes of observed changes in the climate system through systematic comparison of climate models and observations using various statistical methods. An attributable change refers to a change in which the relative contribution of causal factors has been evaluated along with an assignment of statistical confidence.

3) Beginning on page 144, they discuss “major uncertainties.” Oops, not so “unambiguous.”

The transient climate response (TCR) is defined as the global mean surface temperature change at the time of CO2 doubling in a 1%/year CO2 transient increase experiment. The TCR of the climate system to greenhouse gas increases remains uncertain, with ranges of 0.9° to 2.0°C (1.6° to 3.6°F) and 0.9° to 2.5°C (1.6° to 4.5°F) in two recent assessments. The climate system response to aerosol forcing (direct and indirect effects combined) remains highly uncertain, because although more of the relevant processes are being included in models, confidence in these representations remains low. Therefore, there is considerable uncertainty in quantifying the attributable warming contributions of greenhouse gases and aerosols separately. There is uncertainty in the possible levels of internal climate variability, but current estimates (a likely range of +/- 0.1°C, or 0.2°F, over 60 years) would have to be too low by more than a factor of two or three for the observed trend to be explainable by internal variability.

Does that sound like the evidence is unambiguous?
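As an aside, the “1%/year CO2 transient increase experiment” in the quoted definition fixes the time horizon at which TCR is evaluated: a concentration compounding at 1 percent per year doubles in roughly 70 years. A quick sketch of that arithmetic (the variable name is mine, not the report’s):

```python
import math

# TCR is evaluated at the time of CO2 doubling under a 1%/year
# compound increase: (1.01)^t = 2, so t = ln 2 / ln 1.01.
doubling_years = math.log(2) / math.log(1.01)
print(round(doubling_years, 1))  # roughly 70 years
```

So the TCR ranges quoted above are estimates of warming about seven decades after such an experiment begins.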

“There is something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment of fact.” – Mark Twain, Life on the Mississippi

UPDATE: The material above refers to the third draft of the report. The fifth draft has just become available. One analyst noticed “that the latest draft climate report, published in June, had seemingly left out a rather embarrassing table from the Executive Summary, one that had previously been written into the Third Draft, published last December.” What has been omitted is the fact “that the hottest temperatures, (averaged over the US), were not only much, much higher in the 1930s. They were also higher during the 1920s. Indeed there have been many other years with higher temperatures than most of the recent ones.” (Source)

I would not call it a hoax, as does President Trump; I’d call it a scam. The National Climate Assessment itself is fake news: a political rather than a scientific document.

“It is difficult to get a man to understand something, when his salary depends on his not understanding it.” – Upton Sinclair.

Additional reading:

Alan Carlin, a former senior EPA analyst, says computer models fail because: The bottom-up GCM was a bad approach from the start and should never have been paid for by the taxpayers. All that we have are computer models that were designed and then tuned to lead to the IPCC’s desired answers and have had a difficult time even doing that.

So not only are the results claiming that global temperatures are largely determined by atmospheric CO2 wrong, but the basic methodology is useless. Climate is a coupled, non-linear chaotic system, and the IPCC agrees that this is the case. It cannot be usefully modeled by using necessarily limited models which assume the opposite. Read more

Dr. Tim Ball: Uncovered: decades-old government report showing climate data was bad, unfit for purpose. In 1999, the National Research Council, the research arm of the National Academy of Sciences, released a study expressing concern about the accuracy of the data used in the debate over climate change. They said there are,

“Deficiencies in the accuracy, quality and continuity of the records,” that “place serious limitations on the confidence that can be placed in the research results.”

See also:

A Simple Question for Climate Alarmists – where is the physical evidence

Evidence that CO2 emissions do not intensify the greenhouse effect

My comments on the previous National Climate Assessment: