Author: wryheat2

Fourth National Climate Assessment is junk science

The U.S. Global Change Research Program (USGCRP) has just released the final version of its Fourth National Climate Assessment report, one that many claimed the Trump administration would suppress because, like its predecessors, it is mainly a political document rather than a true scientific assessment. You can read the full 477-page report here: https://science2017.globalchange.gov/

The main conclusion is: “This assessment concludes, based on extensive evidence, that it is extremely likely that human activities, especially emissions of greenhouse gases, are the dominant cause of the observed warming since the mid-20th century.”

The “extensive evidence” is based entirely on climate modeling rather than on observations. The results produced by models diverge widely from reality. The new report makes the same claims and invokes the same junk science as the previous 2014 report which I analyzed here: National Climate Assessment Lacks Physical Evidence.

As an example of unfounded claims made in the new report we see this statement in the executive summary: “Heatwaves have become more frequent in the United States since the 1960s, while extreme cold temperatures and cold waves are less frequent.”

But plots from the EPA and NOAA show that the most intense heat waves occurred in the 1930s.


Another example:

Claim in the report: “The incidence of large forest fires in the western United States and Alaska has increased since the early 1980s and is projected to further increase in those regions as the climate changes, with profound changes to regional ecosystems.” This statement is technically correct but it represents cherry-picking and lying by omission.

The National Interagency Fire Center has a table listing the number of fires and acreage burned from 1960 through 2016 (see: https://www.nifc.gov/fireInfo/fireInfo_stats_totalFires.html ).

We see from the table that 18,229 fires were reported in 1983, rising to 67,743 fires reported in 2016. What the report doesn’t mention is that the annual fire counts from 1960 through 1982 were all in the six-figure range: in 1960 there were 103,387 fires, in 1981 there were 249,370 fires, and the number dropped to 174,755 fires in 1982.
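For readers who want to check this for themselves, here is a minimal sketch (in Python) using only the year-by-year counts quoted above; the rest of the NIFC table would have to be transcribed from the link to fill in the intervening years.

```python
# Minimal sketch: compare the NIFC annual wildfire counts cited above.
# Only the years quoted in the text are entered; the full 1960-2016
# table is at the NIFC link and would need to be transcribed.

fires = {
    1960: 103_387,
    1981: 249_370,
    1982: 174_755,
    1983: 18_229,
    2016: 67_743,
}

baseline = fires[1983]
for year, count in sorted(fires.items()):
    pct_of_1983 = 100 * count / baseline
    print(f"{year}: {count:>7,} fires ({pct_of_1983:,.0f}% of the 1983 count)")

# Starting a trend line in 1983, the lowest year shown, makes any later
# value look like an increase, which is the cherry-picking point made above.
```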

Fire frequency does not necessarily increase with warming. In many parts of the world, fire frequency decreases with warming. See my post “Wildfires And Warming – relationship not so clear.”

A third example of unfounded claims:

Section 2.6.1 of the report discusses the “greenhouse effect.” They claim: “As increasing GHG [greenhouse gases] concentrations warm the atmosphere, tropospheric water vapor concentrations increase, thereby amplifying the warming effect.” Climate models depend on this assumption. But NOAA’s own data show that global humidity has been decreasing with warming.

Comments by others:

Theoretical physicist Steve Koonin has an op-ed in the Wall Street Journal entitled “A Deceptive New Report on Climate.”

Koonin was undersecretary of energy for science during President Obama’s first term and is director of the Center for Urban Science and Progress at New York University. The WSJ article is pay-walled but you can read extensive excerpts here.

Among his comments:

One notable example of alarm-raising is the description of sea-level rise, one of the greatest climate concerns. The report ominously notes that while global sea level rose an average 0.05 inch a year during most of the 20th century, it has risen at about twice that rate since 1993. But it fails to mention that the rate fluctuated by comparable amounts several times during the 20th century. The same research papers the report cites show that recent rates are statistically indistinguishable from peak rates earlier in the 20th century, when human influences on the climate were much smaller. The report thus misleads by omission.

Note: The rate of sea level rise and fall tends to be cyclical on decadal and bi-decadal periods. See my article: The Sea Level Scam.
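As a quick bit of arithmetic on the rates Koonin quotes (nothing more than a unit conversion, using only the figures in the op-ed), 0.05 inch per year works out to roughly 1.3 mm per year, and "about twice that rate" is roughly 2.5 mm per year:

```python
# Minimal sketch: convert the sea-level-rise rates quoted above to mm/yr.
IN_TO_MM = 25.4

rate_20th_century_in = 0.05                     # inches per year, as quoted
rate_since_1993_in = 2 * rate_20th_century_in   # "about twice that rate"

print(f"20th-century average: {rate_20th_century_in * IN_TO_MM:.2f} mm/yr")
print(f"Since 1993:           {rate_since_1993_in * IN_TO_MM:.2f} mm/yr")
```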

Koonin also comments on heat waves: The report’s executive summary declares that U.S. heat waves have become more common since the mid-1960s, although acknowledging the 1930s Dust Bowl as the peak period for extreme heat. Yet buried deep in the report is a figure [6.3] showing that heat waves are no more frequent today than in 1900.

Comments by Dr. Patrick J. Michaels, director of the Center for the Study of Science at the Cato Institute, past president of the American Association of State Climatologists, and former program chair for the Committee on Applied Climatology of the American Meteorological Society. He was a research professor of Environmental Sciences at the University of Virginia for 30 years. Read the full post “What You Won’t Find in the New National Climate Assessment.”

Under the U.S. Global Change Research Act of 1990, the federal government has been charged with producing large National Climate Assessments (NCA), and today the most recent iteration has arrived. It is typical of these sorts of documents–much about how the future of mankind is doomed to suffer through increasingly erratic weather and other tribulations. It’s also missing a few tidbits of information that convincingly argue that everything in it with regard to upcoming 21st century climate needs to be taken with a mountain of salt.

The projections in the NCA are all based upon climate models. If there is something big that is systematically wrong with them, then the projections aren’t worth making or believing.

The report does not tell you that:

1) Climate model predictions of global temperature diverge widely from observations.

2) No hot spot over tropics: The models predict that there should have been a huge “hot spot” over the entire tropics, which is a bit less than 40% of the globe’s surface. Halfway up through the atmosphere (by pressure), or at 500 hPa, the predicted warming is also twice what is being observed, and further up, the prediction is for seven times more warming than is being observed.

The importance of this is paramount. The vertical distribution of temperature in the tropics is central to the formation of precipitation.

Missing the tropical hot spot provokes an additional cascade of errors. A vast amount of the moisture that forms precipitation here originates in the tropics. Getting that wrong trashes the precipitation forecast, with additional downstream consequences, this time for temperature.

When the sun shines over a wet surface, the vast majority of its incoming energy is shunted toward the evaporation of water rather than direct heating of the surface. This is why, in the hottest month in Manaus, Brazil, in the middle of the tropical rainforest and only three degrees from the equator, high temperatures average only 91 F (not appreciably different from humid Washington, DC’s 88 F). To appreciate the effect of water on surface heating of land areas, note that high temperatures in July in bone-dry Death Valley average 117 F.
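The energy-partition argument can be illustrated with a toy surface energy balance. The Bowen ratios and the 500 W/m2 of net radiation below are illustrative assumptions, not measured values for Manaus or Death Valley:

```python
# Toy illustration of the energy-partition point above: over a wet surface
# most net radiation goes into evaporating water (latent heat), while over
# a dry surface most goes into heating the air (sensible heat).

def partition(net_radiation_wm2, bowen_ratio):
    """Split net radiation into sensible (heating) and latent (evaporation) flux."""
    sensible = net_radiation_wm2 * bowen_ratio / (1 + bowen_ratio)
    latent = net_radiation_wm2 - sensible
    return sensible, latent

for label, bowen in [("wet rainforest surface (B ~ 0.2)", 0.2),
                     ("dry desert surface     (B ~ 5.0)", 5.0)]:
    heating, evaporation = partition(500.0, bowen)   # 500 W/m^2 assumed
    print(f"{label}: {heating:5.0f} W/m^2 heats the air, "
          f"{evaporation:5.0f} W/m^2 evaporates water")
```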

Getting the surface temperature wrong will have additional consequences for vegetation and agriculture. In general, a wetter U.S. is one of bumper crops and good water supplies out west from winter snows, hardly the picture painted in the National Assessment.

If the government is going to spend time and our money on producing another assessment report, that report should be based on empirical evidence, not climate models. Note that USGCRP is a conglomeration of 13 federal agencies that had a 2016 budget of $2.6 billion for the climate assessment project. Did you get your money’s worth?

Climate modelers make some outlandish predictions, but occasionally there is a glimmer of honesty:

“The forcings that drive long-term climate change are not known with an accuracy sufficient to define future climate change.” — James Hansen, “Climate forcings in the Industrial era”, PNAS, Vol. 95, Issue 22, 12753-12758, October 27, 1998.

“In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the prediction of a specific future climate state is not possible.” — Final chapter, Draft TAR 2000 (Third Assessment Report), IPCC.

And remember: “The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.” – H. L. Mencken

One other point:

Temperatures recorded by the U.S. Climate Reference Network (USCRN) show no statistically significant trend since the network was established in 2004. These data come from state-of-the-art, ultra-reliable, triple-redundant weather stations placed in pristine environments. As a result, they need none of the adjustments that plague the older surface temperature networks, such as USHCN and GHCN, which have been heavily adjusted in attempts to correct a wide variety of biases. Using NOAA’s own USCRN data thus eliminates the squabbles over the accuracy and adjustment of temperature records and gives a clean plot of pristine surface data.
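For anyone who wants to test the “no statistically significant trend” claim on the USCRN numbers themselves, here is a minimal sketch of the kind of calculation involved. The file name and column names are hypothetical placeholders for however you download the monthly anomalies:

```python
# Minimal sketch (not the author's analysis): fit a linear trend to a
# monthly temperature-anomaly series and check whether it differs from zero.
# Assumes a local CSV "uscrn_monthly.csv" with columns "year_month" and
# "anomaly_c"; the file name and columns are hypothetical placeholders.

import numpy as np
import pandas as pd
from scipy import stats

df = pd.read_csv("uscrn_monthly.csv")
t_years = np.arange(len(df)) / 12.0          # time in years since start
res = stats.linregress(t_years, df["anomaly_c"])

print(f"trend: {res.slope * 10:+.2f} C per decade, p-value = {res.pvalue:.3f}")

# Note: monthly anomalies are autocorrelated, so this simple p-value
# overstates significance; a careful test would adjust for that.
```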


BACKGROUND:

By Ken Haapala, President, Science and Environmental Policy Project (SEPP)

USGCRP Science?
What is now called the USGCRP has a murky, politicized past. It was established in 1989 and mandated by Congress in 1990 to “assist the Nation and the world to understand, assess, predict, and respond to human-induced and natural processes of global change.” It is to produce a National Climate Assessment every four years. Since 1990, it has produced four reports. The last full report, the 3rd National Climate Assessment, was in May 2014. Apparently, after the election of Mr. Trump, the USGCRP decided on the CSSR (Climate Science Special Report), released last week. As with prior USGCRP reports, it ignores the “natural processes of global change” that are part of its Congressional mandate.

Such political games are part of USGCRP’s established history. After the election of Mr. Bush in 2000, under a prior name, the USGCRP released the 2000 U.S. National Assessment of Climate Change report. As shown in the 2008 report of the Nongovernmental International Panel on Climate Change (NIPCC) (Fig 16 & pp 14 to 16), the government report had projections / predictions that were nonsense. The government entity used two different climate models for climate change to 2090, which produced dramatically different results for precipitation, by region. The worst example was for the Red River watershed in the Dakotas and Minnesota. One model had a precipitation drop of about 80%, turning the region into a desert; the second model had a precipitation increase of about 80%, resulting in dramatic flooding. The disparity between the two models is but one example of how inadequately tested global climate models may be used to project / predict almost anything. The federal courts found that the 2000 report did not meet the standards of the Data Quality Act, also called the Information Quality Act. The recent reports of the UN Intergovernmental Panel on Climate Change (IPCC) and the USGCRP have tried to cover up the disparities in the results of their global climate models by blending them into an ensemble. Usually, there are too few runs of any model to establish realistic forecasts for that model. The forecasts change with each run.

Further, the major problem remains: the models are not adequately tested to be used to form government policies on global warming / climate change. As comments by Patrick Michaels carried in last week’s TWTW illustrate, the USGCRP ignores the important discrepancy between the atmospheric temperature trends forecast by the global climate models and the atmospheric temperature trends actually observed. The USGCRP ignores physical science.

 

For background reading:

A Simple Question for Climate Alarmists

Evidence that CO2 emissions do not intensify the greenhouse effect

An examination of the relationship between temperature and carbon dioxide

Analysis of the National Climate Assessment Report 2014

Trump, the National Climate Assessment report, and fake news 2017


It’s time to dump the EPA “endangerment finding” which classified carbon dioxide as a pollutant

In 2009, the EPA ruled, under the Clean Air Act, that “the current and projected concentrations of the six key well-mixed greenhouse gases—carbon dioxide (CO2), methane (CH4), nitrous oxide (N2O), hydrofluorocarbons (HFCs), perfluorocarbons (PFCs), and sulfur hexafluoride (SF6)—in the atmosphere threaten the public health and welfare of current and future generations.” In essence, the EPA classified carbon dioxide as a pollutant even though carbon dioxide is necessary for life on Earth.

For some perspective, note that the current atmospheric concentration of carbon dioxide is about 400 ppm (parts per million), while the air we exhale with every breath contains about 40,000 ppm carbon dioxide. Is breathing causing air pollution?

This EPA ruling in effect allowed the EPA to regulate everything from automobile exhaust to power plants to refrigerators. In order to overturn the finding, one would have to successfully show that the underlying scientific basis is wrong – and it is. Another tactic would be to have Congress amend the Clean Air Act, something that is very unlikely in the current contentious Congress.

The EPA’s scientific basis is derived from climate models, predictions of which diverge widely from reality. See my ADI articles:

Evidence that CO2 emissions do not intensify the greenhouse effect

Failure of climate models shows that carbon dioxide does not drive global temperature

Additional reading on the “Endangerment Finding” if you want to get into the details:

 

The EPA CO2 endangerment finding endangers the USA by Dennis Avery.

“In science, if your theory doesn’t take account of all the relevant data, you need a new theory.” Avery shows how the climate models fail to explain observations and notes that thousands of new coal-fired power plants are being built around the world – even in Europe. Avery is a former U.S. State Department senior analyst and co-author with astrophysicist Fred Singer of Unstoppable Global Warming: Every 1,500 Years.

 

Why Revoking the EPA GHG Endangerment Finding Is the Most Urgent Climate Action Needed

by Alan Carlin. Carlin is a scientist and economist who worked for the RAND Corp. and the EPA.

“Revoking the EF is the only way to bring the climate alarmism scam to the untimely end it so richly deserves in the US and hopefully indirectly elsewhere. Until that happens the CIC [climate industrial complex] will continue to pursue its bad science through reports such as the National Climate Assessment with the recommended disastrous policies that would seriously damage the environment, impoverish the less wealthy, and bring economic disaster for our Nation by raising the prices and decreasing the availability and reliability of fossil fuel energy which is so central to our way of life and economy.”

 

In a separate post, Carlin also said that “EPA never engaged in a robust, meaningful discussion. Rather, there was a pro forma review after a decision had already been made which met many but not all of the legal requirements.” He lists “six crucial scientific issues that EPA did not actively discuss despite my best efforts to bring a few of them to their attention in early 2009.”

 

Dr. Pat Michaels on the ‘voluminous science that the USGCRP either ignored or slanted’ for the EPA endangerment finding

Patrick J. Michaels is the director of the Center for the Study of Science at the Cato Institute. Michaels is a past president of the American Association of State Climatologists and was program chair for the Committee on Applied Climatology of the American Meteorological Society. He was a research professor of Environmental Sciences at the University of Virginia for 30 years. Michaels recounts his testimony before the EPA. USGCRP is the U.S. Global Change Research Program.

 

60 scientists call for EPA endangerment finding to be reversed

“We the undersigned are individuals who have technical skills and knowledge relevant to climate science and the GHG Endangerment Finding. We each are convinced that the 2009 GHG Endangerment Finding is fundamentally flawed and that an honest, unbiased reconsideration is in order.”

Ice Age Mammals of the San Pedro River Valley, Southeastern Arizona

If you had been in Southeastern Arizona eleven or twelve thousand years ago, it would have looked much different than it does today. The climate was cooler and wetter, and the rivers actually flowed. You would also have encountered a suite of large mammals that later became extinct in North America. These animals included horses, camels, mastodons, mammoths, long-horned bison, tapirs, shrub oxen, and ground sloths, which were preyed upon by dire wolves, jaguars, cougars, bears, the American lion, and man. (Horses and camels were later re-introduced from Europe and Asia.)

We know this because remains of all these animals were found in several sites along the San Pedro River between Tombstone and Bisbee and at other sites in southern Arizona.

At the end of the last glacial epoch, climate became very unstable with the result that many of these megafauna became extinct in North America and the human Clovis culture dispersed. I go into greater detail on extinction hypotheses in my article “Cold case: What Killed the Mammoths?” linked below.

The Arizona Geological Survey published a paper about these animals in 1998 which has recently become available for free download:

http://repository.azgs.az.gov/uri_gin/azgs/dlio/1682

Within this 32-page publication are drawings and brief descriptions of the animals and information about Clovis culture humans who hunted them. The paper describes how people hunted and speculates on causes of extinction.

According to AZGS:

Popular literature and illustrations often depict Clovis hunters using stone-tipped spears to attack full-grown mammoth. Archaeological evidence indicates, however, that they more often concentrated their efforts on calves and young adults, sometimes ambushing them near or at watering places. At the Lehner Mammoth Site bones of nine mammoths, all juveniles, were recovered. They were apparently trapped and killed in the stream bed where archaeologists uncovered their bones thousands of years later. The mammoth killed at the Naco Site was also a young adult.

Bison meat appears to have been popular among the Clovis people. At Murray Springs bones of eleven young bison were found along with bones of one mammoth. Both the mammoth and the bison were likely ambushed when they came to water.

Being so large and cumbersome to transport, a mammoth carcass was butchered where it fell. The presence of hearths at kill sites, such as Murray Springs and the Lehner Site, suggests that the hunters also ate some of the meat on the spot, perhaps roasting it as they proceeded with the butchering. Cut marks on bone surfaces, and broken cutting tools indicate that the meat was stripped from the carcass and transported to a nearby camp, where more of it could have been eaten or dried for future consumption.

See also:

Cold Case: What Killed the Mammoths?

A Very Brief History of Climate Change in the Sonoran Desert

Where the Glyptodonts roamed


Winter Weather forecast – NOAA vs Old Farmer’s Almanac

The U.S. National Oceanic and Atmospheric Administration (NOAA) has just issued its prediction for temperature and precipitation for the winter of 2017-2018. You can see NOAA maps and a video here.

In general, NOAA predicts a colder and wetter winter in the northwest, and a warmer and drier winter in the southwest (see the NOAA maps at the link above).

The Old Farmer’s Almanac predicts the opposite. Go to https://www.almanac.com/weather/longrange and click on a region on the map for detailed predictions.

 

For the southwest (region 14), OFA predicts:

Winter will be colder than normal, with above-normal precipitation. The coldest periods will be from late November into early December and in late December and mid-January. Snowfall will be above normal in the east and near to below normal in the west, with the snowiest periods in late December, early and mid-January, and early February. April and May will be slightly rainier than normal, with temperatures below normal in the east and near normal in the west. Summer will be slightly hotter than normal, with the hottest periods in mid- and late June and early August. Rainfall will be below normal in the northwest and above normal in the southeast. September and October will be cooler and drier than normal.

 

For the deep south (region 8), OFA predicts “Winter will be rainier and slightly cooler than normal, with near- or above-normal snowfall.” NOAA is predicting warmer and drier.

 

NOAA predictions are based on observation and computer modeling (sometimes with false assumptions as to what drives climate). The Old Farmer’s Almanac forms its predictions by comparing solar patterns and historical weather conditions with current solar activity. (Read more)

Print out this post and check back at the end of February to see which organization came closer to reality. Much depends upon whether or not we see a La Niña develop this winter.
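If you do check back, a simple way to score the two outlooks is by the direction of the predicted anomaly in each region. The entries below are hypothetical placeholders to be filled in with the observed result after the winter:

```python
# Minimal sketch of the "check back in February" comparison: score each
# outlook by whether the predicted direction (warmer "+" or cooler "-")
# matched what was observed. Observed values are placeholders (None)
# until the season is over.

forecasts = {
    # region: (NOAA temperature call, OFA temperature call, observed)
    "southwest": ("+", "-", None),
    "deep south": ("+", "-", None),
}

def score(predicted, observed):
    if observed is None:
        return "pending"
    return "hit" if predicted == observed else "miss"

for region, (noaa, ofa, observed) in forecasts.items():
    print(f"{region}: NOAA {score(noaa, observed)}, OFA {score(ofa, observed)}")
```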

To see how some previous NOAA predictions turned out see:

https://wryheat.wordpress.com/2014/02/10/government-winter-weather-forecasts-botched-again/

Will global warming weaken the North American Monsoon?

Arizona gets most of its rain from thunderstorms during the summer, a period called the North American monsoon (see Arizona Monsoon for background and the anatomy of thunderstorms). By government decree, the monsoon season lasts from June 15 through September 30. In actuality, rains usually start in early July following the rain-dance ceremony of the Tohono O’odham people. In 2017, there were unusually heavy rains in July and below normal rain in August and September.

Researchers from Princeton University, using a new precipitation model, claim that global warming will decrease the rain of the monsoon. From the abstract of their paper published in Nature:

Future changes in the North American monsoon, a circulation system that brings abundant summer rains to vast areas of the North American Southwest, could have significant consequences for regional water resources. How this monsoon will change with increasing greenhouse gases, however, remains unclear, not least because coarse horizontal resolution and systematic sea-surface temperature biases limit the reliability of its numerical model simulations. Here we investigate the monsoon response to increased atmospheric carbon dioxide (CO2) concentrations using a 50-km-resolution global climate model which features a realistic representation of the monsoon climatology and its synoptic-scale variability. It is found that the monsoon response to CO2 doubling is sensitive to sea-surface temperature biases. When minimizing these biases, the model projects a robust reduction in monsoonal precipitation over the southwestern United States, contrasting with previous multi-model assessments.

Let’s see how this model premise has worked so far:

The graph below, from NOAA data, shows that year-to-year precipitation varies quite a bit. The overall trend is for increasing precipitation with global warming, not a decrease.


A plot of annual precipitation reflects the high temperatures and drought conditions of the first half of the 20th Century, but there is no apparent trend for more recent warming.
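Here is a minimal sketch of the trend check described above, assuming the July-through-September precipitation totals have been exported to a local CSV (the file name and column names are placeholders, not an actual NOAA product):

```python
# Minimal sketch: fit a straight line to monsoon-season (Jul-Sep)
# precipitation totals. Assumes a local CSV "az_jul_sep_precip.csv" with
# columns "year" and "precip_in"; the file name and columns are
# hypothetical placeholders for data exported from NOAA.

import pandas as pd
from scipy import stats

df = pd.read_csv("az_jul_sep_precip.csv")
res = stats.linregress(df["year"], df["precip_in"])

direction = "increasing" if res.slope > 0 else "decreasing"
print(f"Jul-Sep precipitation trend: {res.slope * 100:+.2f} inches per century "
      f"({direction}), p = {res.pvalue:.3f}")
```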

This new model, as all climate models, assumes that carbon dioxide is the major forcing of global temperature, an assumption for which there is no physical evidence.

See:

A Simple Question for Climate Alarmists

An examination of the relationship between temperature and carbon dioxide

Rocks in the Chiricahua National Monument and Fort Bowie National Historic Site

The Arizona Geological Survey has made available for free download a 48-page booklet which explains geologic features of two areas of southeastern Arizona. This well-illustrated booklet is intended for the layman but is also interesting to experienced geologists.

The Chiricahua mountains are known for their thousands of rock pinnacles formed in volcanic rocks by weathering during the last glacial epoch. In a previous article, I explain “The Explosive Geology Of The Chiricahua Mountains.”


The new booklet provides photos and descriptions that allow you to find many volcanic features along major roads and trails in the monument. None of these features are designated by markers along the trail.

The geology of Fort Bowie is quite different and consists of sediments and granite intrusive rocks.

As described in the booklet:

“Fort Bowie was built to guard Apache Pass, a natural passage between the Dos Cabezas and Chiricahua Mountains that connects the San Simon and Sulphur Springs Valleys. The dependable springs, including Apache Spring, that have attracted humans to this narrow passage for thousands of years are also the result of geology, specifically the Apache Pass fault.”

The booklet starts out by describing the past 30 million years of geologic history which includes plate tectonics and volcanism – the processes which gave rise to the features visible today.

The features of the Chiricahua Mountains are put in context as follows:

“The landscape of Chiricahua National Monument, like that of much of the Earth’s surface, is a complex mosaic of large and small geologic features. Some of these features were produced by processes that were more active in past geological time but have now slowed or ceased. Other features are the result of past and currently active processes; only a few owe their origin solely to recently active processes. Welded tuff, fiamme, surge beds, fossil fumaroles, and the dacite caprock, for example, were all produced during the eruption of the Turkey Creek caldera, about 27 million years ago. Some joints and spherulites formed as the ash sheet cooled. Other joints and the region’s mountain ranges and intervening basins are the result of Basin and Range faulting during the period 25 to 5 million years ago. Willcox Playa, talus cones, pinnacles, and slot canyons were produced by processes that were more active during the wetter, cooler climate of the glacial epochs from 1.6 million to 10,000 years ago. Tafoni, rock varnish, lichens, case-hardened surfaces, chicken heads, exfoliation shingles, horizontal ribs, solution ponds and the rounded form of the columns are all forming today. Unraveling the evolution of such complex landscapes makes geology a particularly challenging science.” If you are unfamiliar with some of the names of features, read the booklet.

URL to the booklet: http://repository.azgs.az.gov/sites/default/files/dlio/files/nid1731/dte-11chirichua_mtns.pdf

 

Genetics of Mexican wolves – assessment of possible hybridization with other canids

A new study, commissioned by the Pima Natural Resource Conservation District*, examined the genetics of Mexican wolves (Canis lupus baileyi) and assessed the possibility of hybridization with dogs of Native American origin and/or coyotes. You can read the entire study here.

The basic findings from this research, and other research cited within the report, are that all North American wolves are hybrids with coyotes, and a few are hybrids with dogs. The current captive-bred population of Mexican wolves shows no hybridization with coyotes or dogs, but some previous research did detect some Mexican wolf-coyote hybrids.

Here are some highlights from the report:

The study concluded “living Mexican wolves are not derived from hybridization with Native American dogs. The results also did not indicate recent hybridization between Mexican wolves and coyotes. However, one wolf-dog hybrid was detected in wolves from Idaho. Our study used captive-reared Mexican wolves, therefor future analyses of wild-born wolves and dogs living in the same areas are needed to determine if hybridization is occurring in the wild population of Mexican wolves in Mexico, New Mexico and Arizona.”

The report notes that other studies have found wolf-dog hybrids in northern wolves.

“A second hybridization concern involves wolves and coyotes. Wolves and coyotes share a recent common ancestor during the Pleistocene (Ice Ages) in North America and their subsequent occupation of the same ranges may result in some level of hybridization. Indeed, evidence of historic and recent hybridization comes from mitochondrial DNA (mtDNA) sequences, y-chromosome, SNPs and whole genome sequences (WGS).”

“… land use changes following European colonization of North America have favored the spread of coyotes while wolf populations have declined, resulting in substantial levels of hybridization between these two species in some areas (e.g. Eastern North America). This same process also resulted in hybridization with domestic dogs, contributing to three species hybrids in some populations…”

“..all North American wolves …have significant amounts of coyote ancestry. In addition, we detect a strong geographic cline in the proportion of coyote ancestry across North American canids: Alaskan and Yellowstone wolves have 8 to 8.5% coyote ancestry, Great Lakes wolves have 21.7 to 23.9% coyote ancestry, Algonquin wolves have at least 32.5 to 35.5% coyote ancestry, and Quebec sequences have more than 50% coyote ancestry. [A] Mexican wolf… had a coyote ancestry of approximately 11%. The significance of these results, as well as those of previous authors, is that wolf-coyote hybridization occurs naturally, and the process can be accelerated in human-dominated landscapes that favor coyotes.”

“The captive Mexican wolf samples were divergent from other wolves as well as coyotes and dogs of European, East Asian, and North American descent.”

“Additionally, the remnant Mexican wolf population was subject to, and has the genetic signal of, one of the most severe, recent genetic bottlenecks in conservation history. It was founded from just seven remaining individuals separated into three lineages, subsequently inbred in captivity, and then lineages cross-bred to attempt a genetic rescue.”

We see from this study that the science is not settled. There are still several outstanding questions regarding Mexican wolves in the wild.

Question for readers: Should an animal group that is variously hybridized with other animals qualify for protection under the Endangered Species Act?

*About the Pima Natural Resource Conservation District (link)

The Pima NRCD is a State-authorized local unit of government that has been given a broad mandate to provide for the restoration and conservation of lands and soil resources, the preservation of water rights and the control and prevention of soil erosion, and thereby to conserve natural resources, conserve wildlife, protect the tax base, protect public lands and protect and restore this state’s rivers and streams and associated riparian habitats, including fish and wildlife resources that are dependent on those habitats, and in such manner to protect and promote the public health, safety and general welfare of the people.

Arizona’s 42 Conservation Districts cover the entire state of Arizona, and parts of New Mexico and Utah on the Navajo Nation. Arizona’s Conservation Districts are in a unique position to lead local conservation partnership efforts that achieve landscape level results across all land ownerships in Arizona. They have authority to enter into agreements with private landowners, state and federal agencies, tribes, and others to implement a local conservation program in their District. The Conservation District model has proven itself over the last 75 years to be the most effective approach to achieving sound management of Arizona’s natural resources.

Pima NRCD believes private lands provide the tax base that supports most county and state services. Additionally, private lands are the underlying lands for historic federal and state grazing leases, as these lands are the basis for economic productivity.

Disclosure: I am a board member of Pima NRCD.

 

See also:

Wolf attacks on humans in North America

Are Mexican wolves in Arizona actually wolf-dog hybrids?

Why Hurricanes Can’t Be Blamed On Global Warming

The leftish press and Hollywood climate experts have been claiming that the recent rash of dangerous hurricanes is due to global warming. Dr. Roy Spencer, U.S. Science Team leader for the Advanced Microwave Scanning Radiometer flying on NASA’s Aqua satellite, takes exception to these claims in a short blog post and in a new E-book available from Amazon for $2.99. The E-book is about 11,000 words long and contains 17 illustrations. I recommend you read it.

In the book, Spencer explains the origin of hurricanes and gives a history of U.S. hurricanes from colonial times to present time, including comments on hurricanes Harvey and Irma.

Spencer notes that geological studies of sediments in coastal lakes in Texas and Florida show that “catastrophic hurricane strikes were more frequent 1,000 to 2,000 years ago than in the most recent 1,000 years.” Hurricanes making landfall in Florida show a downward trend in both number and intensity (that trend includes hurricane Irma). Spencer says that hurricanes in the tropical Atlantic, Caribbean, and Gulf of Mexico are not limited by sea surface temperatures.

He also notes that “two major hurricane strikes endured by the Massachusetts Bay Colony, in 1635 and in 1675, have yet to be rivaled in more modern times.”

“…Most Atlantic hurricanes can be traced back to African easterly waves [of low wind shear].  These waves draw their energy from the temperature contrast between the hot air over the Sahara Desert and the cooler air over the Sahel, and as they leave the west coast of Africa they ‘kick start’ the organization of rain shower activity over the tropical eastern Atlantic Ocean.”

You will have to read the E-book to delve more deeply into the mechanics of hurricanes. Here is an excerpt:

If you were to go up inside the eye at the altitude where jets fly, you would find the air temperature there is 10 or 20 deg. F warmer than normal for that altitude. This warmth is caused by air being forced to sink in response to rising air in the showers and thunderstorms surrounding the eye. This ‘subsidence warming’ is a universal feature of all precipitation systems, but only in hurricanes is it highly concentrated into one relatively small area. All of the warm rising air in billowing rain clouds must be exactly matched by sinking air elsewhere, and in the case of hurricanes, that sinking air is most concentrated and intense in the eye of the storm.  For more common rain systems, the warming is much weaker as it is spread over huge areas hundreds or even thousands of miles in diameter. Only a few miles away from the eye is the heavily raining eyewall of the hurricane; this is where the strongest surface winds occur.
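A rough back-of-the-envelope calculation (mine, not Spencer’s) shows why subsidence can produce warmth of that magnitude: descending air warms at roughly the dry adiabatic lapse rate of about 9.8 C per kilometer, so a net descent of a kilometer or so yields warming on the order of 10 to 20 F. The descent depths below are assumed round numbers:

```python
# Back-of-envelope illustration of subsidence warming (illustrative, not
# from Spencer's book): air descending in the eye warms at roughly the
# dry adiabatic lapse rate. Descent depths are assumed round numbers.

DRY_ADIABATIC_C_PER_KM = 9.8

for descent_km in (0.5, 1.0, 1.5):
    warming_c = DRY_ADIABATIC_C_PER_KM * descent_km
    warming_f = warming_c * 9 / 5
    print(f"net descent of {descent_km:.1f} km -> about {warming_f:.0f} F of warming")
```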

Spencer also has a chapter on “The Effect of Sea Level Rise on Hurricane Storm Surge” in which he shows that sea level rise has been mostly if not entirely natural, with no convincing evidence that it has accelerated from human-caused global warming.

Separate from Spencer’s data, Dr. Chris Landsea of the NOAA Hurricane Research Division presents a table of Atlantic hurricanes beginning in 1851. You will see that there is no sign of influence by global warming. Landsea has this caveat about the data: “The Atlantic hurricane database (or HURDAT) extends back to 1851. However, because tropical storms and hurricanes spend much of their lifetime over the open ocean – some never hitting land – many systems were “missed” during the late 19th and early 20th Centuries (Vecchi and Knutson 2008). Starting in 1944, systematic aircraft reconnaissance was commenced for monitoring both tropical cyclones and disturbances that had the potential to develop into tropical storms and hurricanes. This did provide much improved monitoring, but still about half of the Atlantic basin was not covered (Sheets 1990). Beginning in 1966, daily satellite imagery became available at the National Hurricane Center, and thus statistics from this time forward are most complete (McAdie et al. 2009).” See data
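Here is a minimal sketch of the kind of decade-by-decade count Landsea’s table allows, assuming the landfall records have been saved to a simple local CSV (the file name and columns are hypothetical stand-ins for NOAA’s published table):

```python
# Minimal sketch: count landfalling U.S. hurricanes per decade and see
# whether recent decades stand out. Assumes a local CSV
# "us_hurricane_landfalls.csv" with columns "year" and "category";
# the file name and layout are hypothetical placeholders.

from collections import Counter
import csv

counts = Counter()
with open("us_hurricane_landfalls.csv", newline="") as f:
    for row in csv.DictReader(f):
        decade = (int(row["year"]) // 10) * 10
        counts[decade] += 1

for decade in sorted(counts):
    print(f"{decade}s: {counts[decade]} landfalling hurricanes")

# Landsea's caveat mainly concerns storms over the open ocean; landfall
# counts are more complete in the early record than basin-wide counts.
```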

Back in 1999, Landsea et al. published a paper which found “that multidecadal variability is more characteristic of the region. Various environmental factors including Caribbean sea level pressures and 200mb zonal winds, the stratospheric Quasi-Biennial Oscillation, the El Niño-Southern Oscillation, African West Sahel rainfall and Atlantic sea surface temperatures … show significant, concurrent relationships to the frequency, intensity and duration of Atlantic hurricanes.” (Source)

Dr. Neil Frank, former Director of the National Hurricane Center:

“Over the past several weeks numerous articles suggest Harvey and Irma were the result of global warming. The concept is a warmer earth will generate stronger and wetter hurricanes. A number of people have said Irma was the most intense hurricane in the history of the Atlantic while Harvey was the wettest and both were good examples of what we can expect in the future because of global warming. What does a fact check reveal about these two hurricanes?”

Frank shows that neither of the above contentions is true; read more.

See also:

Houston’s long history of flooding

Evidence that CO2 emissions do not intensify the greenhouse effect

An examination of the relationship between temperature and carbon dioxide

A Simple Question for Climate Alarmists