
Nitrogen in rocks identified as major plant fertilizer not considered by climate models

Nitrogen compounds such as ammonia (NH3) act as plant fertilizers. Robust plant growth consumes more atmospheric carbon dioxide during photosynthesis. Atmospheric nitrogen (N2), however, is relatively inert; it is converted to usable nitrogen compounds by bacteria in the upper soil layers (see nitrogen fixation). Climate models have assumed that the atmosphere is the only source of nitrogen and have therefore underestimated both its fertilization effect and the capability of plants to remove carbon dioxide from the atmosphere. New studies show that much nitrogen comes from rocks, some of it already in usable organic form, and that weathering of rocks releases this nitrogen.

“A considerable amount of the nitrogen in igneous and sedimentary rocks exists as ammonium ions held within the lattice structures of silicate minerals. In sedimentary rocks, the ammonium is held by secondary silicate minerals; in igneous rocks, the ammonium is contained largely within potassium-bearing primary minerals. Analyses indicated that most of the nitrogen in igneous rocks, and from one-tenth to two-thirds of that in sedimentary rocks (shales) occurred as fixed ammonium.” (Source)

Nitrate deposits in arid and semi-arid regions provide another source of nitrogen.

“Nitrogen bearing rocks are globally distributed and comprise a potentially large pool of nitrogen in nutrient cycling that is frequently neglected because of a lack of routine analytical methods for quantification. Nitrogen in rock originates as organically bound nitrogen associated with sediment, or in thermal waters representing a mixture of sedimentary, mantle, and meteoric sources of nitrogen.” (Source)

A new study, reported by Science Daily, describes research conducted at the University of California, Davis, and published April 6, 2018.

“For centuries, the prevailing science has indicated that all of the nitrogen on Earth available to plants comes from the atmosphere. But a study from the University of California, Davis, indicates that more than a quarter comes from Earth’s bedrock.”

“The discovery could greatly improve climate change projections, which rely on understanding the carbon cycle. This newly identified source of nitrogen could also feed the carbon cycle on land, allowing ecosystems to pull more emissions out of the atmosphere, the authors said.”

“Geology might have a huge control over which systems can take up carbon dioxide and which ones don’t.”

“While there were hints that plants could use rock-derived nitrogen, this discovery shatters the paradigm that the ultimate source of available nitrogen is the atmosphere. Nitrogen is both the most important limiting nutrient on Earth and a dangerous pollutant, so it is important to understand the natural controls on its supply and demand. Humanity currently depends on atmospheric nitrogen to produce enough fertilizer to maintain world food supply. A discovery of this magnitude will open up a new era of research on this essential nutrient.”

Study citation: B. Z. Houlton, S. L. Morford, R. A. Dahlgren. Convergent evidence for widespread rock nitrogen sources in Earth’s surface environment. Science, 2018; 360 (6384): 58 DOI: 10.1126/science.aan4399.

Looks like “climate science” is still not settled. For instance, a 2003 study published in the same Science journal claimed, “there will not be enough nitrogen available to sustain the high carbon uptake scenarios.” Investor’s Business Daily opines: “with more nitrogen available, plant life might be able to absorb more CO2 than climate scientists have been estimating, which means the planet won’t warm as much, despite mankind’s pumping CO2 into the atmosphere.”

 

See also:

Evidence that CO2 emissions do not intensify the greenhouse effect

An examination of the relationship between temperature and carbon dioxide

A Simple Question for Climate Alarmists


Tuvalu and other Pacific islands resist sea level rise and add land area

Climate alarmists have long been predicting that global warming induced sea level rise would make low-lying Pacific islands disappear and cause thousands of “climate refugees” to seek new homes. Here are some examples:

Smithsonian.com, August 2004: Will Tuvalu Disappear Beneath the Sea? Global warming threatens to swamp a small island nation.

Mother Jones, December 2009: What Happens When Your Country Drowns?

Washington Post, August 2014: Has the era of the ‘climate change refugee’ begun?

Bloomberg, November 2017: A Tiny Island Prepares the World for a Climate Refugee Crisis.

The University of Arizona has been complicit in this hype; see my Wryheat post: University of Arizona dances with sea level.

These alarmist claims have not come to pass because of the geologic processes that build these islands.

A new paper published in Nature Communications on Feb. 9, 2018, shows that despite sea level rise, most islands are increasing in land area.

A University of Auckland study (Patterns of island change and persistence offer alternate adaptation pathways for atoll nations, by Paul S. Kench, Murray R. Ford & Susan D. Owen) examined changes in the geography of Tuvalu’s nine atolls and 101 reef islands between 1971 and 2014, using aerial photographs and satellite imagery. The paper reports that local sea level rose at twice the global average (~3.90 ± 0.4 mm/yr), which translates to about six and a half inches over the 43-year period. Nevertheless, the study found that eight of the nine atolls and almost three-quarters of the islands grew during the study period, increasing Tuvalu’s total land area by 2.9 percent. (Read the full paper in Nature Communications.)
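As a quick check of that conversion (a back-of-the-envelope calculation, not a figure from the paper):

```latex
3.90\ \tfrac{\text{mm}}{\text{yr}} \times 43\ \text{yr} \approx 168\ \text{mm} \approx 6.6\ \text{inches}
```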

Here is figure 3 from that paper followed by its caption:

Caption for Tuvalu fig 3 (ha = hectares): Examples of island change and dynamics in Tuvalu from 1971 to 2014.

A Nanumaga reef platform island (301 ha) increased in area 4.7 ha (1.6%) and remained stable on its reef platform.

B Fangaia island (22.4 ha), Nukulaelae atoll, increased in area 3.1 ha (13.7%) and remained stable on reef rim.

C Fenualango island (14.1 ha), Nukulaelae atoll rim, increased in area 2.3 ha (16%). Note smaller island on left Teafuafatu (0.29 ha), which reduced in area 0.15 ha (49%) and had significant lagoonward movement.

D Two smaller reef islands on Nukulaelae reef rim. Tapuaelani island, (0.19 ha) top left, increased in area 0.21 ha (113%) and migrated lagoonward. Kalilaia island, (0.52 ha) bottom right, reduced in area 0.45 ha (85%) migrating substantially lagoonward.

E Teafuone island (1.37 ha) Nukufetau atoll, increased in area 0.04 ha (3%). Note lateral migration of island along reef platform. Yellow lines represent the 1971 shoreline, blue lines represent the 1984 shoreline, green lines represent the 2006 shoreline and red lines represent the 2014 shoreline.

 

The reason that these islands are gaining area is that as the sea rises, coral reefs grow higher and trap coral debris and sand to build up the island. The science of coral reef atolls is not new. This process was first described by Charles Darwin in 1842: The structure and distribution of coral reefs. Being the first part of the geology of the voyage of the Beagle, under the command of Capt. Fitzroy, R.N. during the years 1832 to 1836. London: Smith Elder and Co. (Link to Darwin’s full description).

This figure from Darwin’s paper shows that coral atolls originate around a volcanic island or seamount. As sea level rises (or the land sinks), the corals grow upward to remain in shallow water, and coral debris and sand accumulate to form an atoll island. Overcoming a recent six-inch rise in sea level may not seem like much of a feat, but remember that these islands have been around a long time and have dealt with a roughly 400-foot rise in sea level since the depths of the last glacial epoch.

The findings of the new paper cited above support previous studies. For instance:

Kench et al., 2015, Coral islands defy sea-level rise over the past century: Records from a central Pacific atoll, Geology (Geological Society of America), March 2015. (Source)

“Funafuti Atoll, in the tropical Pacific Ocean, has experienced some of the highest rates of sea-level rise (~5.1 ± 0.7 mm/yr), totaling ~0.30 ± 0.04 m over the past 60 yr. We analyzed six time slices of shoreline position over the past 118 yr at 29 islands of Funafuti Atoll to determine their physical response to recent sea-level rise. Despite the magnitude of this rise, no islands have been lost, the majority have enlarged, and there has been a 7.3% increase in net island area over the past century (A.D. 1897–2013). There is no evidence of heightened erosion over the past half-century as sea-level rise accelerated. Reef islands in Funafuti continually adjust their size, shape, and position in response to variations in boundary conditions, including storms, sediment supply, as well as sea level. Results suggest a more optimistic prognosis for the habitability of atoll nations and demonstrate the importance of resolving recent rates and styles of island change to inform adaptation strategies.”

See also:

The Sea Level Scam


Climate Craziness, Politics, and Hypocrisy

In my opinion, the greatest danger we face from global warming is that politicians think they can stop it. Politicians decree that we must reduce carbon dioxide emissions from the use of fossil fuels even though there is no physical evidence that those emissions play a significant role in controlling global temperature. (See: Evidence that CO2 emissions do not intensify the greenhouse effect)

The policy of reducing CO2 emissions is costing billions, even trillions, of dollars that could be put to better use. For instance, Germany will have to spend more than 1 trillion euros ($1.2 trillion) to meet even the lower end of the European Union’s 2050 target to reduce carbon dioxide emissions, according to a draft of a study commissioned by the BDI German industry group. (Source).

Several counties and municipalities in California, as well as New York City, have filed lawsuits against energy companies. These suits seek to force oil and gas companies to pay reparations for severe weather and for infrastructure improvements to guard against future storms and rising sea levels. (Read more) However, as noted by Valerie Richardson in The Washington Times, the risks posed by human-caused climate change were apparently alarming enough to prompt seven California municipalities last year to sue ExxonMobil, but not serious enough to disclose in full to their investors. “Notwithstanding their claims of imminent, allegedly near-certain harm, none of the municipalities disclosed to investors such risks in their respective bond offerings, which collectively netted over $8 billion for these local governments over the last 27 years,” said ExxonMobil in its petition in Texas District Court. (Read more)

Back in the year 2000, Dr. David Viner, a senior research scientist at the Climatic Research Unit (CRU) of the University of East Anglia, predicted that within a few years winter snowfall would become “a very rare and exciting event. Children just aren’t going to know what snow is. Snowfalls are now just a thing of the past.” (Source) Residents of the northeastern U.S., after this winter’s very cold weather and heavy snowfalls, would disagree.

Also, at the World Economic Forum in Davos, Switzerland (January 2018), climate activists set up a base camp to educate world leaders about man-made global warming. Mother Nature didn’t cooperate, however, as the “Gore Effect” kicked in and dumped about six feet of snow on their little stunt over six days. The weather at Davos did not deter the arrival of 1,000 private jets owned or chartered by elites who lecture the rest of us about limiting our “carbon footprint.” The hosting organization for the Davos forum has a formal sustainability policy that vows to “limit our environmental impact” and addresses such issues as climate change and deforestation.

Just plain crazy:

Researchers at The University of Manchester have carried out the first ever study looking at the carbon footprint of sandwiches, both home-made and pre-packaged. They considered the whole life cycle of sandwiches, including the production of ingredients, sandwiches and their packaging, as well as food waste discarded at home and elsewhere in the supply chain. Of the recipes considered, the most carbon-intensive variety is a ready-made ‘all-day breakfast’ sandwich which includes egg, bacon and sausage. Read more

Researchers at the University of Arizona set out to learn more about how people’s perception of the threat of global climate change affects their mental health. They found that while some people have little anxiety about the Earth’s changing climate, others are experiencing high levels of stress, and even depression, based on their perception of the threat of global climate change. Read more

Alarmist scientists have found a terrifying new ‘climate change’ threat: mutant transgender turtles. Their study, titled Environmental Warming and Feminization of One of the Largest Sea Turtle Populations in the World, warns that global warming could turn the world’s sea turtle populations female, possibly leading to their extinction. Read more.

And even this: A Canadian government website claims Santa Claus signed an international agreement to relocate his workshop to the South Pole to escape the effects of man-made global warming in the Arctic. Read more.

See also:

Climate Madness 1

Climate Madness 2

Climate Madness 3

Climate Madness 4  

Climate Madness 5

Climate Madness 6

Climate Madness 7

Climate Madness 8

Climate Madness 9

Climate Madness 10

“Climategate” comes back to bite the University of Arizona

“The University of Arizona has been ordered to surrender emails by two UA scientists that a group claims will help prove that theories about human-caused climate change are false and part of a conspiracy.” (Arizona Daily Star) The professors involved are Malcolm Hughes, who is still with the UA, and Jonathan Overpeck, who left earlier this year.

The backstory begins in 2009:

In 2009, it was revealed that someone had hacked into the files of the Climatic Research Unit (CRU), based at the University of East Anglia in England. The CRU has been a major proponent of anthropogenic global warming and a principal in report preparation for the Intergovernmental Panel on Climate Change (IPCC).

More than 1,000 internal emails and several reports from CRU were posted on the internet, and the blogosphere went wild over the implications of the revealed messages. Dr. Phil Jones, head of CRU, confirmed that his organization had been hacked and that the emails were accurate. This disclosure did not include any emails from other institutions such as Penn State or the University of Arizona.

The emails reveal a concerted effort on the part of a small group of scientists to manipulate data, suppress dissent, and foil the dissemination of information by “losing” data and skirting Britain’s Freedom of Information Act. They also reveal that the contention of dangerous human-induced global warming is not supported by the data, that those supporting that contention knew it, and that they sought to control the discussion so as to hide the unreliable nature of what they were claiming.

Part of the controversy involved the infamous “hockey stick” graph devised by Michael Mann of Penn State and subsequently adopted by the IPCC.

In the “battle of the graphs” the bottom panel shows temperatures based on proxy data and measurements. It shows that the Medieval Warm Period of 1,000 years ago was much warmer than now. Mann’s hockey stick did away with the Medieval Warm Period and showed only a large spike of recent warming – hence the name “hockey stick”. The “hockey stick” made its debut in the journal Geophysical Research Letters in 1999 in a paper by Michael Mann, Raymond Bradley, and Malcolm Hughes that built upon a 1998 paper by the same authors in the journal Nature which detailed the methodology for creating a proxy temperature reconstruction.

There are problems with the Hockey Stick according to Canadian researchers Steve McIntyre and Ross McKitrick. “The first mistake made by Mann et al. and copied by the UN in 2001 lay in the choice of proxy data. The UN’s 1996 report had recommended against reliance upon bristlecone pines as proxies for reconstructing temperature because 20th-century carbon-dioxide fertilization accelerated annual growth and caused a false appearance of exceptional recent warming. Notwithstanding the warning against reliance upon bristlecones in UN 1996, Mann et al. had relied chiefly upon a series of bristlecone-pine datasets for their reconstruction of medieval temperatures. Worse, their statistical model had given the bristlecone-pine data sets 390 times more prominence than the other datasets they had used.

Furthermore, the statistical algorithms in Mann et al. were shown to be flawed. McIntyre ran Mann’s algorithm 10,000 times, having replaced all palaeoclimatological data with randomly generated, electronic “red noise”. They found that, even with this entirely random data, altogether unconnected with the temperature record, the model nearly always constructed a “hockey stick” curve similar to that in the UN’s 2001 report.” (See their detailed report)
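To illustrate the kind of Monte Carlo test described above, here is a minimal sketch, not McIntyre and McKitrick’s actual code: the proxy count, series length, AR(1) persistence, calibration-window length, and the “hockey-stick index” below are all illustrative assumptions. It shows how “short-centered” principal components analysis of persistent red noise tends to inflate the hockey-stick shape of the leading PC compared with conventional full-period centering.

```python
# Illustrative sketch only: all parameters are assumptions, not MBH or MM values.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_proxies, calib = 581, 70, 79   # pseudo-proxies ~1400-1980; 79-year "calibration" window

def ar1_noise(n, rho=0.9):
    """Generate a strongly persistent AR(1) 'red noise' series of length n."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.standard_normal()
    return x

def leading_pc(data, short_center):
    """Leading principal component of the proxy matrix (columns = proxies)."""
    if short_center:
        centered = data - data[-calib:].mean(axis=0)  # center on the calibration window only
    else:
        centered = data - data.mean(axis=0)           # conventional full-period centering
    u, s, _ = np.linalg.svd(centered, full_matrices=False)
    return u[:, 0] * s[0]                             # PC1 as a time series

def hockey_index(pc):
    """Departure of the calibration-era mean from the earlier mean,
    in units of the earlier standard deviation."""
    return abs(pc[-calib:].mean() - pc[:-calib].mean()) / pc[:-calib].std()

trials = 100
short_scores, full_scores = [], []
for _ in range(trials):
    proxies = np.column_stack([ar1_noise(n_years) for _ in range(n_proxies)])
    short_scores.append(hockey_index(leading_pc(proxies, short_center=True)))
    full_scores.append(hockey_index(leading_pc(proxies, short_center=False)))

print(f"mean hockey-stick index, short centering: {np.mean(short_scores):.2f}")
print(f"mean hockey-stick index, full centering:  {np.mean(full_scores):.2f}")
```

On random red noise, the short-centered runs should show a noticeably larger index on average, which is the essence of the bias McIntyre and McKitrick reported.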

Mann had another problem. His proxy data began to rise but then took a plunge into cooler temperatures. He hid this decline by truncating the proxy data and substituting rising measured temperatures without telling anyone. This became known as “Mike’s Nature Trick”. (Read more)

One other incident: In my article A Simple Question for Climate Alarmists I posed this question: “What physical evidence supports the contention that carbon dioxide emissions from burning fossil fuels are the principal cause of global warming since 1970?” In a public forum, I had the opportunity to pose this question to then-UA professor Jonathan Overpeck. He could not cite any supporting physical evidence.

 

Effects of global warming on human health

The EPA’s “endangerment finding” classified carbon dioxide as a pollutant and claimed that global warming will have adverse effects on human health. Real research says the opposite: cold is deadlier.  The scientific evidence shows that warming is good for health. This is discussed in detail in chapter 7 of Climate Change Reconsidered II: Biological Impacts published by the Heartland Institute. See links to the entire publication at: http://climatechangereconsidered.org/climate-change-reconsidered-ii-biological-impacts/

Here are the key findings based on extensive review of the scientific literature:

• Warmer temperatures lead to a net decrease in temperature-related mortality, including deaths associated with cardiovascular disease, respiratory disease, and strokes. The evidence of this benefit comes from research conducted in every major country of the world.

• In the United States the average person who died because of cold temperature exposure lost in excess of 10 years of potential life, whereas the average person who died because of hot temperature exposure likely lost no more than a few days or weeks of life.

• Some 4,600 deaths are delayed each year as people in the U.S. move from cold northeastern states to warm southwestern states. Between 3 and 7% of the gains in longevity experienced by the U.S. population over the past three decades is due simply to people moving to warmer states.

• Cold-related deaths are far more numerous than heat-related deaths in the United States, Europe, and almost all countries outside the tropics. Coronary and cerebral thrombosis account for about half of all cold-related mortality.

• Global warming is reducing the incidence of cardiovascular diseases related to low temperatures and wintry weather by a much greater degree than it increases the incidence of cardiovascular diseases associated with high temperatures and summer heat waves.

• The adverse health impacts of cold temperatures, especially with respect to respiratory health, are more significant than those of high temperatures in many parts of the world, including Spain, Canada, Shanghai, and Taiwan. In the subtropical island of Taiwan, for example, researchers found low minimum temperatures were the strongest risk factor associated with outpatient visits for respiratory diseases.

• A vast body of scientific examination and research contradict the claim that malaria will expand across the globe and intensify as a result of CO2-induced warming.

• Concerns over large increases in vector-borne diseases such as dengue as a result of rising temperatures are unfounded and unsupported by the scientific literature, as climatic indices are poor predictors for dengue disease.

• While climatic factors largely determine the geographical distribution of ticks, temperature and climate change are not among the significant factors determining the incidence of tick-borne diseases.

• The ongoing rise in the air’s CO2 content is not only raising the productivity of Earth’s common food plants but also significantly increasing the quantity and potency of the many health-promoting substances found in their tissues, which are the ultimate sources of sustenance for essentially all animals and humans.

• Atmospheric CO2 enrichment positively impacts the production of numerous health-promoting substances found in medicinal or “health food” plants, and this phenomenon may have contributed to the increase in human life span that has occurred over the past century or so.

• There appears to be little reason to expect any significant CO2-induced increases in human health-harming substances produced by plants as the atmosphere’s CO2 concentration continues to rise.

Read the full report for details and supporting references.

For more background, see papers linked to in How Climate Has Affected Human History.

These papers show that humanity prospered during warm times (e.g. the Renaissance period) and suffered during cold times (e.g. the “Dark Ages”).

See also:

Dump EPA endangerment finding

Fourth National Climate Assessment is junk science

Fourth National Climate Assessment is junk science

The U.S. Global Change Research Program (USGCRP) has just released the final version of its Fourth National Climate Assessment report, one that many claimed the Trump administration would suppress because, like its predecessors, it is mainly a political document rather than a true scientific assessment. You can read the full 477-page report here: https://science2017.globalchange.gov/

The main conclusion is: “This assessment concludes, based on extensive evidence, that it is extremely likely that human activities, especially emissions of greenhouse gases, are the dominant cause of the observed warming since the mid-20th century.”

The “extensive evidence” is based entirely on climate modeling rather than on observations. The results produced by models diverge widely from reality. The new report makes the same claims and invokes the same junk science as the previous 2014 report which I analyzed here: National Climate Assessment Lacks Physical Evidence.

As an example of unfounded claims made in the new report we see this statement in the executive summary: “Heatwaves have become more frequent in the United States since the 1960s, while extreme cold temperatures and cold waves are less frequent.”

But plots from the EPA and NOAA show that the most intense heat waves occurred in the 1930s.

 

Another example:

Claim in the report: “The incidence of large forest fires in the western United States and Alaska has increased since the early 1980s and is projected to further increase in those regions as the climate changes, with profound changes to regional ecosystems.” This statement is technically correct but it represents cherry-picking and lying by omission.

The National Interagency Fire Center has a table listing the number of fires and acreage burned from 1960 through 2016 (see: https://www.nifc.gov/fireInfo/fireInfo_stats_totalFires.html ).

We see from the table that 18,229 fires were reported in 1983, increasing to 67,743 in 2016. What the report doesn’t mention is that the annual number of fires from 1960 through 1982 was in the six-figure range every year; for example, there were 103,387 fires in 1960 and 249,370 in 1981. The number dropped to 174,755 in 1982.
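To see how much the choice of start year matters, here is a minimal sketch using only the counts quoted above (the full record is in the NIFC table linked earlier; the selection of years is mine, for illustration):

```python
# Annual U.S. wildfire counts as quoted in this post (source: NIFC table).
fires = {1960: 103_387, 1981: 249_370, 1982: 174_755, 1983: 18_229, 2016: 67_743}

def pct_change(start, end):
    """Percent change in the reported number of fires between two years."""
    return 100 * (fires[end] - fires[start]) / fires[start]

print(f"1983 -> 2016: {pct_change(1983, 2016):+.0f}%")  # the report's baseline: a large increase
print(f"1960 -> 2016: {pct_change(1960, 2016):+.0f}%")  # a longer baseline: a decrease
```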

Fire frequency does not necessarily increase with warming. In many parts of the world, fire frequency decreases with warming. See my post “Wildfires And Warming – relationship not so clear.”

A third example of unfounded claims:

Section 2.6.1 of the report discusses the “greenhouse effect.” They claim: “As increasing GHG [greenhouse gases] concentrations warm the atmosphere, tropospheric water vapor concentrations increase, thereby amplifying the warming effect.” Climate models depend on this assumption. But NOAA’s own data show that global humidity has been decreasing with warming.

Comments by others:

Theoretical physicist Steve Koonin has an op-ed in the Wall Street Journal entitled “A Deceptive New Report on Climate.”

Koonin was undersecretary of energy for science during President Obama’s first term and is director of the Center for Urban Science and Progress at New York University. The WSJ article is pay-walled but you can read extensive excerpts here.

Among his comments:

One notable example of alarm-raising is the description of sea-level rise, one of the greatest climate concerns. The report ominously notes that while global sea level rose an average 0.05 inch a year during most of the 20th century, it has risen at about twice that rate since 1993. But it fails to mention that the rate fluctuated by comparable amounts several times during the 20th century. The same research papers the report cites show that recent rates are statistically indistinguishable from peak rates earlier in the 20th century, when human influences on the climate were much smaller. The report thus misleads by omission.

Note: The rate of sea level rise and fall tends to be cyclical on decadal and bi-decadal periods. See my article: The Sea Level Scam.

Koonin also comments on heat waves: The report’s executive summary declares that U.S. heat waves have become more common since the mid-1960s, although acknowledging the 1930s Dust Bowl as the peak period for extreme heat. Yet buried deep in the report is a figure [6.3] showing that heat waves are no more frequent today than in 1900.

Comments by Dr. Patrick J. Michaels, director of the Center for the Study of Science at the Cato Institute, past president of the American Association of State Climatologists, and former program chair for the Committee on Applied Climatology of the American Meteorological Society. He was a research professor of environmental sciences at the University of Virginia for 30 years. Read his full post, “What You Won’t Find in the New National Climate Assessment.”

Under the U.S. Global Change Research Act of 1990, the federal government has been charged with producing large National Climate Assessments (NCA), and today the most recent iteration has arrived. It is typical of these sorts of documents–much about how the future of mankind is doomed to suffer through increasingly erratic weather and other tribulations. It’s also missing a few tidbits of information that convincingly argue that everything in it with regard to upcoming 21st century climate needs to be taken with a mountain of salt.

The projections in the NCA are all based upon climate models. If there is something big that is systematically wrong with them, then the projections aren’t worth making or believing.

The report does not tell you that:

1) Climate model predictions of global temperature diverge widely from observations.

2) No hot spot over tropics: The models predict that there should have been a huge “hot spot” over the entire tropics, which is a bit less than 40% of the globe’s surface. Halfway up through the atmosphere (by pressure), or at 500 hPa, the predicted warming is also twice what is being observed, and further up, the prediction is for seven times more warming than is being observed.

The importance of this is paramount. The vertical distribution of temperature in the tropics is central to the formation of precipitation.

Missing the tropical hot spot provokes an additional cascade of errors. A vast amount of the moisture that forms precipitation here originates in the tropics. Getting that wrong trashes the precipitation forecast, with additional downstream consequences, this time for temperature.

When the sun shines over a wet surface, the vast majority of its incoming energy is shunted towards the evaporation of water rather than direct heating of the surface. This is why in the hottest month in Manaus, Brazil, in the middle of the tropical rainforest and only three degrees from the equator, high temperatures average only 91 F (not appreciably different than humid Washington, DC’s 88 F). To appreciate the effect of water on surface heating of land areas, high temperatures in July in bone-dry Death Valley average 117 F.

Getting the surface temperature wrong will have additional consequences for vegetation and agriculture. In general, a wetter U.S. is one of bumper crops and good water supplies out west from winter snows, hardly the picture painted in the National Assessment.

If the government is going to spend time and our money on producing another assessment report, that report should be based on empirical evidence, not climate models. Note that USGCRP is a conglomeration of 13 federal agencies that had a 2016 budget of $2.6 billion for the climate assessment project. Did you get your money’s worth?

Climate modelers make some outlandish predictions, but occasionally there is a glimmer of honesty:

“The forcings that drive long-term climate change are not known with an accuracy sufficient to define future climate change.” — James Hansen, “Climate forcings in the Industrial era”, PNAS, Vol. 95, Issue 22, 12753-12758, October 27, 1998.

“In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the prediction of a specific future climate state is not possible.” — Final chapter, Draft TAR 2000 (Third Assessment Report), IPCC.

And remember: “The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.” – H. L. Mencken

One other point:

Temperatures recorded by the U.S. Climate Reference Network (USCRN) show no statistically significant trend since the network was established in 2004. These data come from state-of-the-art, ultra-reliable, triple-redundant weather stations placed in pristine environments. As a result, the temperature data need none of the adjustments that plague the older surface temperature networks, such as USHCN and GHCN, which have been heavily adjusted in attempts to correct for a wide variety of biases. Using NOAA’s own USCRN data, which eliminates all of the squabbles over the accuracy and adjustment of temperature data, we can get a clear plot of pristine surface data.
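For readers who want to check this kind of claim themselves, here is a minimal sketch of a simple trend-significance test. It assumes you have already saved a monthly USCRN national average temperature series as a single column of numbers; the file name is hypothetical, and the simple test ignores autocorrelation, which would widen the true uncertainty.

```python
import numpy as np
from scipy import stats

# Hypothetical input: one anomaly value per month since 2004, one number per line.
anomalies = np.loadtxt("uscrn_monthly_anomalies.txt")
months = np.arange(anomalies.size)

result = stats.linregress(months, anomalies)
trend_per_decade = result.slope * 120          # 120 months per decade
ci95 = 1.96 * result.stderr * 120              # rough 95% confidence interval on the trend

print(f"Trend: {trend_per_decade:+.2f} ± {ci95:.2f} °C/decade, p = {result.pvalue:.3f}")
# If the interval straddles zero (p > 0.05), the trend is not statistically
# significant at the usual 95% level.
```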

 

BACKGROUND:

By Ken Haapala, President, Science and Environmental Policy Project (SEPP)

USGCRP Science?
What is now called the USGCRP has a murky, politicized past. It was established in 1989 and mandated by Congress in 1990 to “assist the Nation and the world to understand, assess, predict, and respond to human-induced and natural processes of global change.” It is to produce a National Climate Assessment every four years. Since 1990, it has produced four reports. The last full report, the 3rd National Climate Assessment, appeared in May 2014. Apparently, after the election of Mr. Trump, the USGCRP decided on the Climate Science Special Report (CSSR), released last week. As with prior USGCRP reports, it ignores the “natural processes of global change” that are part of its Congressional mandate.

Such political games are part of USGCRP’s established history. After the election of Mr. Bush in 2000, under a prior name, the USGCRP released the 2000 U.S. National Assessment of Climate Change report. As shown in the 2008 report of the Nongovernmental International Panel on Climate Change (NIPCC) (Fig. 16 and pp. 14 to 16), the government report had projections/predictions that were nonsense. The government entity used two different climate models for climate change to 2090, which produced dramatically different results for precipitation, by region. The worst example was the Red River watershed in the Dakotas and Minnesota: one model had a precipitation drop of about 80%, turning the region into a desert, while the second model had a precipitation increase of about 80%, resulting in dramatic flooding. The disparity between the two models is but one example of how inadequately tested global climate models may be used to project/predict almost anything. The federal courts found that the 2000 report did not meet the standards of the Data Quality Act, also called the Information Quality Act. The recent reports of the UN Intergovernmental Panel on Climate Change (IPCC) and the USGCRP have tried to cover up the disparities in the results of their global climate models by blending them into an ensemble. Usually, there are too few runs of any model to establish realistic forecasts for that model; the forecasts change with each run.

Further, the major problem remains: the models are not adequately tested to be used to form government policies on global warming/climate change. As comments by Patrick Michaels carried in last week’s TWTW illustrate, the USGCRP ignores the important discrepancy between the atmospheric temperature trends forecast by global climate models and actual atmospheric temperature trends. The USGCRP ignores physical science.

 

For background reading:

A Simple Question for Climate Alarmists

Evidence that CO2 emissions do not intensify the greenhouse effect

An examination of the relationship between temperature and carbon dioxide

Analysis of the National Climate Assessment Report 2014

Trump, the National Climate Assessment report, and fake news 2017

Will global warming weaken the North American Monsoon?

Arizona gets most of its rain from thunderstorms during the summer, a period called the North American monsoon (see Arizona Monsoon for background and the anatomy of thunderstorms). By government decree, the monsoon season lasts from June 15 through September 30. In actuality, rains usually start in early July following the rain-dance ceremony of the Tohono O’odham people. In 2017, there were unusually heavy rains in July and below normal rain in August and September.

Researchers from Princeton University, using a new precipitation model, claim that global warming will decrease the rain of the monsoon. From the abstract of their paper published in Nature:

Future changes in the North American monsoon, a circulation system that brings abundant summer rains to vast areas of the North American Southwest, could have significant consequences for regional water resources. How this monsoon will change with increasing greenhouse gases, however, remains unclear, not least because coarse horizontal resolution and systematic sea-surface temperature biases limit the reliability of its numerical model simulations. Here we investigate the monsoon response to increased atmospheric carbon dioxide (CO2) concentrations using a 50-km-resolution global climate model which features a realistic representation of the monsoon climatology and its synoptic-scale variability. It is found that the monsoon response to CO2 doubling is sensitive to sea-surface temperature biases. When minimizing these biases, the model projects a robust reduction in monsoonal precipitation over the southwestern United States, contrasting with previous multi-model assessments.

Let’s see how this model premise has worked so far:

The graph below, from NOAA data, shows that year-to-year precipitation varies quite a bit. The overall trend is for increasing precipitation with global warming, not a decrease.


A plot of annual precipitation reflects the high temperatures and drought conditions of the first half of the 20th Century, but there is no apparent trend for more recent warming.

This new model, like all climate models, assumes that carbon dioxide is the major forcing of global temperature, an assumption for which there is no physical evidence.

See:

A Simple Question for Climate Alarmists

An examination of the relationship between temperature and carbon dioxide

Why Hurricanes Can’t Be Blamed On Global Warming

The leftish press and Hollywood climate experts have been claiming that the recent rash of dangerous hurricanes is due to global warming. Dr. Roy Spencer, U.S. Science Team leader for the Advanced Microwave Scanning Radiometer flying on NASA’s Aqua satellite, takes exception to these claims in a short blog post and in a new E-book available from Amazon for $2.99. The E-book is about 11,000 words long and contains 17 illustrations. I recommend you read it.

In the book, Spencer explains the origin of hurricanes and gives a history of U.S. hurricanes from colonial times to present time, including comments on hurricanes Harvey and Irma.

Spencer notes that geological studies of sediments in coastal lakes in Texas and Florida show that “catastrophic hurricane strikes were more frequent 1,000 to 2,000 years ago than in the most recent 1,000 years.” Hurricanes making landfall in Florida show a downward trend in both number and intensity (a trend that includes hurricane Irma). Spencer says that hurricanes in the tropical Atlantic, Caribbean, and Gulf of Mexico are not limited by sea surface temperatures.

He also notes that “two major hurricane strikes endured by the Massachusetts Bay Colony, in 1635 and in 1675, have yet to be rivaled in more modern times.”

“…Most Atlantic hurricanes can be traced back to African easterly waves [of low wind shear].  These waves draw their energy from the temperature contrast between the hot air over the Sahara Desert and the cooler air over the Sahel, and as they leave the west coast of Africa they ‘kick start’ the organization of rain shower activity over the tropical eastern Atlantic Ocean.”

You will have to read the E-book to delve more deeply into the mechanics of hurricanes. Here is an excerpt:

If you were to go up inside the eye at the altitude where jets fly, you would find the air temperature there is 10 or 20 deg. F warmer than normal for that altitude. This warmth is caused by air being forced to sink in response to rising air in the showers and thunderstorms surrounding the eye. This ‘subsidence warming’ is a universal feature of all precipitation systems, but only in hurricanes is it highly concentrated into one relatively small area. All of the warm rising air in billowing rain clouds must be exactly matched by sinking air elsewhere, and in the case of hurricanes, that sinking air is most concentrated and intense in the eye of the storm.  For more common rain systems, the warming is much weaker as it is spread over huge areas hundreds or even thousands of miles in diameter. Only a few miles away from the eye is the heavily raining eyewall of the hurricane; this is where the strongest surface winds occur.

Spencer also has a chapter on “The Effect of Sea Level Rise on Hurricane Storm Surge” in which he shows that sea level rise has been mostly if not entirely natural, with no convincing evidence that it has accelerated from human-caused global warming.

Separate from Spencer’s data, Dr. Chris Landsea of the NOAA Hurricane Research Division presents a table of Atlantic hurricanes beginning in 1851. You will see that there is no sign of influence by global warming. Landsea has this caveat about the data: “The Atlantic hurricane database (or HURDAT) extends back to 1851. However, because tropical storms and hurricanes spend much of their lifetime over the open ocean – some never hitting land – many systems were “missed” during the late 19th and early 20th Centuries (Vecchi and Knutson 2008). Starting in 1944, systematic aircraft reconnaissance was commenced for monitoring both tropical cyclones and disturbances that had the potential to develop into tropical storms and hurricanes. This did provide much improved monitoring, but still about half of the Atlantic basin was not covered (Sheets 1990). Beginning in 1966, daily satellite imagery became available at the National Hurricane Center, and thus statistics from this time forward are most complete (McAdie et al. 2009).” See data

Back in 1999, Landsea et al. published a paper which found “that multidecadal variability is more characteristic of the region. Various environmental factors including Caribbean sea level pressures and 200mb zonal winds, the stratospheric Quasi-Biennial Oscillation, the El Niño-Southern Oscillation, African West Sahel rainfall and Atlantic sea surface temperatures … show significant, concurrent relationships to the frequency, intensity and duration of Atlantic hurricanes.” (Source)

Dr. Neil Frank, former Director National Hurricane Center:

“Over the past several weeks numerous articles suggest Harvey and Irma were the result of global warming. The concept is a warmer earth will generate stronger and wetter hurricanes. A number of people have said Irma was the most intense hurricane in the history of the Atlantic while Harvey was the wettest and both were good examples of what we can expect in the future because of global warming. What does a fact check reveal about these two hurricanes?”

Frank shows that neither of the above contentions is true. Read more.

See also:

Houston’s long history of flooding

Evidence that CO2 emissions do not intensify the greenhouse effect

An examination of the relationship between temperature and carbon dioxide

A Simple Question for Climate Alarmists

Trump, the National Climate Assessment report, and fake news

The New York Times recently obtained a draft of the upcoming National Climate Assessment report. The NYT is worried that the Trump administration will suppress the report. However, according to scientists who worked on the report, it has been available online since last January. (See Daily Caller story) You can download the 545-page 3rd draft report here, but don’t bother.

Besides the “fake news” story in the New York Times, we have a “fake news” story from the Associated Press printed by the Arizona Daily Star. Within that story is this sentence: Contradicting Trump’s claims that climate change is a “hoax,” the draft report representing the consensus of 13 federal agencies concludes that the evidence global warming is being driven by human activities is “unambiguous.”

Definition of unambiguous: “Admitting of no doubt or misunderstanding; having only one meaning or interpretation and leading to only one conclusion.”

Because of that statement, and because the IPCC itself once wrote, “In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the prediction of a specific future climate state is not possible” (final chapter, Draft TAR 2000, Third Assessment Report), I downloaded the report to see just how unambiguous the evidence is. Here is what I found.

1) All their evidence consists of computer modeling. There is no physical evidence. That’s just like the previous National Climate Assessment report. They are, in essence, claiming that evidence of warming is evidence of the cause of warming.

2) On page 139, they discuss how they attribute causes:

Detection and attribution of climate change involves assessing the causes of observed changes in the climate system through systematic comparison of climate models and observations using various statistical methods. An attributable change refers to a change in which the relative contribution of causal factors has been evaluated along with an assignment of statistical confidence.

3) Beginning on page 144, they discuss “major uncertainties.” Oops, not so “unambiguous.”

The transient climate response (TCR) is defined as the global mean surface temperature change at the time of CO2 doubling in a 1%/year CO2 transient increase experiment. The TCR of the climate system to greenhouse gas increases remains uncertain, with ranges of 0.9° to 2.0°C (1.6° to 3.6°F) and 0.9° to 2.5°C (1.6° to 4.5°F) in two recent assessments. The climate system response to aerosol forcing (direct and indirect effects combined) remains highly uncertain, because although more of the relevant processes are being included in models, confidence in these representations remains low. Therefore, there is considerable uncertainty in quantifying the attributable warming contributions of greenhouse gases and aerosols separately. There is uncertainty in the possible levels of internal climate variability, but current estimates (a likely range of +/- 0.1°C, or 0.2°F, over 60 years) would have to be too low by more than a factor of two or three for the observed trend to be explainable by internal variability.
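For context on the “1%/year CO2 transient increase experiment” in that definition, note the implied doubling time (a back-of-the-envelope calculation, not from the report):

```latex
1.01^{t} = 2 \quad\Longrightarrow\quad t = \frac{\ln 2}{\ln 1.01} \approx 70\ \text{years}
```

So the quoted TCR ranges describe the warming reached after roughly 70 years of compounded 1%-per-year CO2 growth, at the moment the concentration doubles.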

Does that sound like the evidence is unambiguous?

“There is something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment of fact.” – Mark Twain, Life on the Mississippi

UPDATE: The material above refers to the third draft of the report. The fifth draft has just become available. One analyst noticed “that the latest draft climate report, published in June, had seemingly left out a rather embarrassing table from the Executive Summary, one that had previously been written into the Third Draft, published last December.” What has been omitted is the fact “that the hottest temperatures, (averaged over the US), were not only much, much higher in the 1930s. They were also higher during the 1920s. Indeed there have been many other years with higher temperatures than most of the recent ones.” (Source)

I would not call it a hoax, as does President Trump; I’d call it a scam. The National Climate Assessment itself is fake news: a political, rather than a scientific, document.

“It is difficult to get a man to understand something, when his salary depends on his not understanding it.” – Upton Sinclair.

Additional reading:

Alan Carlin, a former senior EPA analyst, says computer models fail because: The bottom-up GCM was a bad approach from the start and should never have been paid for by the taxpayers. All that we have are computer models that were designed and then tuned to lead to the IPCC’s desired answers and have had a difficult time even doing that.

So not only are the results claiming that global temperatures are largely determined by atmospheric CO2 wrong, but the basic methodology is useless. Climate is a coupled, non-linear chaotic system, and the IPCC agrees that this is the case. It cannot be usefully modeled by using necessarily limited models which assume the opposite. Read more

Dr. Tim Ball: Uncovered: decades-old government report showing climate data was bad, unfit for purpose. In 1999, the National Research Council, the research arm of the National Academy of Sciences, released a study expressing concern about the accuracy of the data used in the debate over climate change. They said there are,

“Deficiencies in the accuracy, quality and continuity of the records,” that “place serious limitations on the confidence that can be placed in the research results.”

See also:

A Simple Question for Climate Alarmists – where is the physical evidence

Evidence that CO2 emissions do not intensify the greenhouse effect

My comments on the previous National Climate Assessment:

https://wryheat.wordpress.com/2014/11/15/national-climate-assessment-lacks-physical-evidence/