The New Madrid, Missouri Earthquakes, 1811-1812

 

When we think of earthquakes in the U.S., we often think of the West Coast. But on a U.S. earthquake hazards map, there is a big bull’s-eye in the Midwest along the Mississippi River, centered on the town of New Madrid, Missouri. According to the U.S. Geological Survey (USGS), this area, the New Madrid seismic zone, has “repeatedly produced sequences of major earthquakes, including several of magnitude 7 to 8, over the past 4,500 years.”

The most famous New Madrid earthquakes occurred from December 16, 1811, through February 7, 1812. The three main shocks are estimated at magnitude 7.3 to 7.5. Aftershocks persisted through 1813.

 

According to the USGS:

1811, December 16, 08:15 UTC Northeast Arkansas – the first main shock

2:15 am local time

Magnitude ~7.5

This powerful earthquake was felt widely over the entire eastern United States. People were awakened by the shaking in New York City, Washington, D.C., and Charleston, South Carolina. Perceptible ground shaking lasted from one to three minutes depending upon the observer’s location. The ground motions were described as most alarming and frightening in places like Nashville, Tennessee, and Louisville, Kentucky. Reports also describe houses and other structures being severely shaken, with many chimneys knocked down. In the epicentral area the ground surface was described as in great convulsion, with sand and water ejected tens of feet into the air (liquefaction).

 

During the February 7 earthquake, “Large waves (seiches) were generated on the Mississippi River by seismically-induced ground motions deforming the riverbed. Local uplifts of the ground and water waves moving upstream gave the illusion that the river was flowing upstream. Ponds of water also were agitated noticeably.”

The New Madrid seismic zone is underlain by the Reelfoot Rift, a large fault zone with mainly horizontal movement. It is speculated that this rift formed about 750 million years ago during the breakup of the supercontinent Rodinia. The Reelfoot Rift failed to split the continent, but it remains a weak area in Earth’s crust. From time to time, pressure from the movement of tectonic plates causes movement along this weak area, resulting in earthquakes.

The USGS “concludes that the New Madrid Seismic zone is at significant risk for damaging earthquakes that must be accounted for in urban planning and development. A fundamental problem is the lack of knowledge concerning the physical processes that govern earthquake recurrence in the Central US, and whether large earthquakes will continue to occur at the same intervals as the previous three clusters of events.”

To read more, including eyewitness accounts and a summary of the 1811-1812 New Madrid earthquake sequence, go to:

https://earthquake.usgs.gov/earthquakes/events/1811-1812newmadrid/

 

Related stories:

Where the Next Big American Earthquake and Tsunami Might Occur

The Great Arizona-Sonora Earthquake of 1887

 


Rebuttals to climate alarmist claims

This series of articles was originally published by ICECAP:

http://icecap.us/index.php/go/political-climate/alarmist_claim_rebuttal_update/ 

Below are a series of rebuttals of the most common climate alarmists’ claims such as those made in the recently released Fourth National Climate Assessment Report.  The authors of these rebuttals are all recognized experts in the relevant scientific fields. The rebuttals demonstrate the falsity of EPA’s claims merely by citing the most credible empirical data on the topic.

For each alarmist claim, a summary of the relevant rebuttal is provided below along with a link to the full text of the rebuttal, which includes the names and the credentials of the authors of each rebuttal.

Claim: Heat Waves are increasing at an alarming rate and heat kills.
Summary of Rebuttal

There has been no detectable long-term increase in heat waves in the United States or elsewhere in the world. Most all-time record highs here in the U.S. happened many years ago, long before mankind was using much fossil fuel. Thirty-eight states set their all-time record highs before 1960 (23 in the 1930s!).  Here in the United States, the number of 100F, 95F and 90F days per year has been steadily declining since the 1930s. The Environmental Protection Agency Heat Wave Index confirms the 1930s as the hottest decade.

James Hansen, while at NASA in 1999, said of the U.S. temperature record: “In the U.S. the warmest decade was the 1930s and the warmest year was 1934.”

When NASA was challenged on the declining heat records in the U.S., the reply was that the U.S. is just 2% of the world. However, all the continents recorded their all-time record highs before 1980.

Interestingly, while the media gives a great deal of coverage to even minor heat waves to support the case that man-made global warming is occurring, it tends to ignore deadly cold waves. In actual fact, worldwide cold kills 20 times as many people as heat. This is documented in “Excess Winter Mortality” statistics, which show that the number of deaths in the 4 coldest winter months is much higher than in the other 8 months of the year. The USA death rate in January and February is more than 1,000 deaths per day greater than it is in July and August.

Clearly, there is no problem with increased heat waves due to Climate Change.

Detailed Rebuttal and Authors: AC Rebuttal Heat Waves

————–

Claim: Global warming is causing more hurricanes and stronger hurricanes.

Summary of Rebuttal
The long-term linear trend in the number and intensity of global hurricane activity has remained flat. Hurricane activity does vary year-to-year and over longer periods as short-term ocean cycles like El Nino/La Nina and multidecadal cycles in the Pacific (PDO) and Atlantic (AMO) ocean temperature regimes favor changes in activity levels and some basins over others.

Credible data show this is true despite much better open ocean detection than before the 1960s when many short-lived storms at sea would have been missed as there were no satellites, no aircraft reconnaissance, no radar, no buoys and no automated weather stations.

Landfall counts are more reliable. This data shows that the number of U.S. landfalling hurricanes and major hurricanes has been on the decline since the late 1800s.

However, the impacts on the United States have varied considerably with time, with very active seasons giving way to long lulls during which the public forgets the lessons of past storms and the risks of settling in vulnerable areas. The regions targeted vary too. The period from 1926 to 1935 was very active in the Gulf area. After decades of no-impact storms, there were 8 major devastating storms on the East Coast from 1938 to 1960, then a 25-year lull until Gloria and then Hugo began another active era.

This century, Isabel in 2003; Charley, Frances, Ivan and Jeanne in 2004; and Dennis, Katrina, Rita and Wilma in 2005 all made landfall on the mainland. 2005 holds the record, with 5 Category 4 and 4 Category 5 impact storms. At the time, some speculated this was the new norm for the Atlantic due to climate change. However, after the active 2005 season and before the landfall of two major storms on the U.S. in 2017, the U.S. had gone 4324 days (just short of 12 years) without a major hurricane landfall, exceeding the prior record 8-year lull in the 1860s.

Harvey in 2017 was the first hurricane to make landfall in Texas since Ike in 2008 and the first Category 4 hurricane in Texas since Hurricane Carla in 1961. Note that there has been no increase in Texas in either hurricanes or major hurricanes. In 2017, Irma was the first landfalling hurricane and major hurricane in Florida since Wilma in 2005. This was also after a record lull – 4439 days. The previous record lull back to 1851 was 2191 days from 1979 to 1985.
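As a quick sanity check, the lull lengths quoted above convert from days to years as follows (a minimal sketch; the day counts are the figures given in the text):

```python
# Convert the hurricane-lull day counts quoted in the text to years.
DAYS_PER_YEAR = 365.25  # average length of a year, including leap days

lulls = {
    "U.S. major-hurricane lull (2005-2017)": 4324,
    "Florida hurricane/major-hurricane lull (2005-2017)": 4439,
    "Prior record Florida lull (1979-1985)": 2191,
}

for label, days in lulls.items():
    print(f"{label}: {days} days = {days / DAYS_PER_YEAR:.1f} years")
```

The 4324-day figure works out to about 11.8 years, consistent with “just short of 12 years,” and the 2191-day figure to almost exactly 6 years.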

Michael, whose tight core winds did major damage to a portion of the Florida Panhandle in 2018, had the 20th-lowest pressure for an Atlantic storm and the third-lowest for a storm making landfall, behind the Labor Day Hurricane of 1935 and Hurricane Camille in 1969.

In short, there is nothing unique or unprecedented about recent hurricane seasons or hurricanes. Active Atlantic seasons like 2004, 2005 and 2017 were similar to 1893, 1926, 1933, 1950 and 1995. 1893 had 5 major hurricanes, two of which each caused over 2,000 deaths, making that year the deadliest on record at the time. Seven years later, in 1900, the Great Galveston hurricane killed up to 12,000, making it the deadliest in U.S. history.

Strong hurricanes like Maria in 2017 with devastation on the Caribbean islands are not unique. The Great Hurricane of 1780 killed 27,500 while ravaging the Caribbean islands with winds estimated over 200 mph. It was one of three hurricanes that year with death tolls over 1000.

The heavy rains associated with slow-moving Harvey and Florence led to claims that slow movement was related to climate change. Careful analysis of the data shows a flat linear trend in storm motion over land over the last half century.

The most recent (2018) U.S. Government analysis of the 36 most costly hurricane disasters in U.S. history, showed that increasing damages are due to increasing population density and infrastructure vulnerability, not due to storm intensity.

Chris Landsea (NOAA) in 2011 noted “instead of a dramatically increasing trend of hurricane damages, destruction from these storms varies on a decade-to-decade timescale with more damages in the early 1900s, low damages during the late 1900s to early 1920s, much higher destruction in late 1920s to the early 1960s, and reduced damages from the late 1960s to early 1990s. Certainly, the U.S. hurricane damages from 1996 to 2005 were quite high, but now it is evident that these were quite similar to the decade of 1926 to 1935. So, after straightforward consideration of the non-meteorological factors of inflation, wealth increases, and population change, there remains no indication that there has been a long-term pick up of U.S. hurricane losses that could be related to global warming today. There have been no peer-reviewed studies published anywhere that refute this.”

Detailed Rebuttal and Authors: AC Rebuttal Hurricanes
————

Claim: Global warming is causing more and stronger tornadoes.
Summary of Rebuttal

Tornadoes are failing to follow “global warming” predictions. Strong tornadoes have seen a decline in frequency since the 1950s. The years 2012 through 2016 all saw below-average to near-record-low tornado counts in the U.S. (records began in 1954). 2017 rebounded only to the long-term mean, and 2018, as of the end of May, is ranking below the 25th percentile.

This lull followed the very active and deadly strong La Nina of 2010/11, which, like the strong La Nina of 1973/74, produced record-setting and very deadly outbreaks of tornadoes. Population growth and expansion outside urban areas have exposed more people to the tornadoes that once roamed through open fields.

Tornado detection has improved with the addition of NEXRAD, the growth of trained spotter networks, storm chasers armed with cellular data and imagery, and the proliferation of cell phone cameras and social media. This shows up most in the weak EF0 tornado count, but for storms from moderate EF1 to strong EF3+ intensity, the trend slope has been flat to down despite improved detection.

Detailed Rebuttal and Authors: AC Rebuttal Tornadoes
————-

Claim: Global warming is increasing the magnitude and frequency of droughts and floods.
Summary of Rebuttal

Our use of fossil fuels to power our civilization is not causing droughts or floods. NOAA found there is no evidence that floods and droughts are increasing because of climate change. The number, extent or severity of these events does increase dramatically for brief periods at some locations from time to time, but conditions then return to more normal. This is simply the long-established variation of weather resulting from a confluence of natural factors.

In testimony before Congress Professor Roger Pielke, Jr. said: “It is misleading, and just plain incorrect, to claim that disasters associated with hurricanes, tornadoes, floods, or droughts have increased on climate timescales either in the United States or globally. Droughts have, for the most part, become shorter, less frequent, and cover a smaller portion of the U.S. over the last century.”

“The good news is U.S. flood damage is sharply down over 70 years,” Roger Pielke Jr. said. “Remember, disasters can happen any time…”. “But it is also good to understand long-term trends based on data, not hype.”

Detailed Rebuttal and Authors: AC Rebuttal Droughts and Floods
———-

Claim: Global Warming has increased U.S. Wildfires.
Summary of Rebuttal

Wildfires are in the news almost every late summer and fall. The National Interagency Fire Center has recorded the number of fires and acreage affected since 1985. These data show the number of fires trending down slightly, though the acreage burned had increased before leveling off over the last 20 years. The NWS tracks the number of days when conditions are conducive to wildfires by issuing red-flag warnings. That number is little changed.

Weather and normal seasonal and year-to-year variations bring a varying number and extent of wildfires to the West every year, and to other areas from time to time. The 2016/17 winter was a very wet one in the mountains of the West (in parts of the northern Sierra, the wettest/snowiest on record). Wet winters cause more spring growth, which dries up in the summer heat and becomes tinder for late-summer and early-fall fires before the seasonal rains return.

2017 was an active fire year in the U.S., but by no means a record. The U.S. had 64,610 fires, the 7th most in 11 years and the most since 2012. The 9,574,533 acres burned was the 4th most in 11 years and the most since 2015. The fires burned in the Northwest, including Montana, during a very dry summer; the action then shifted south seasonally with the start of wind events like the Diablo in northern California and the Santa Ana to the south.

Fires spread to northern California in October with an episode of the dry Diablo wind that blows from the east and then in December as strong and persistent Santa Ana winds and dry air triggered a round of large fires in Ventura County.

According to the California Department of Forestry and Fire Protection, the 2017 California wildfire season was the most destructive one on record with a total of 8,987 fires that burned 1,241,158 acres. It included five of the 20 most destructive wildland-urban interface fires in the state’s history.

When it comes to considering the number of deaths and structures destroyed, the seven-fold increase in population in California from 1930 to 2017 must be noted. Not only does this increase mean more people and homes in the path of fires, it also means more fires. Lightning and campfires caused most historic fires; today most are the result of power lines igniting trees. The power lines have increased proportionately with the population, so it can be reasoned that most of the damage from wildfires in California is a result of increased population, not global warming. The danger is also greatly aggravated by poor government forest-management choices. The explosive failure of power lines and other electrical equipment has regularly ranked among the top three sources of California wildfires for the last several years. In 2015, the last year of reported data, electrical power problems sparked the burning of 149,241 acres – more than twice the amount from any other cause.

Detailed Rebuttal and Authors: AC Rebuttal Wildfires
————

Claim: Global warming is causing snow to disappear.
Summary of Rebuttal

This is one claim that has been repeated for decades even as nature showed very much the opposite trend with unprecedented snows even in the big coastal cities. Every time they repeated the claim, it seems nature upped the ante more.

Alarmists eventually evolved to crediting warming with producing greater snowfall because of increased moisture, but the snow events of recent years have usually occurred in colder winters, with high snow-to-water-equivalent ratios in frigid arctic air.

The eastern United States as an example has had 29 high impact winter snowstorms in the last 10 years. No prior ten-year period had more than 10.

Snowcover in the Northern Hemisphere, North America and Eurasia has been increasing since the 1960s in the fall and winter but declining in the spring and summer. However, as NOAA advised might be the case, snowcover measurement methodology changes at the turn of this century may be responsible for part of the warm season differences.

Detailed Rebuttal and Authors: AC Rebuttal Snow
———–

Claim: Global warming is resulting in rising sea levels as seen in both tide gauge and satellite technology.
Summary of Rebuttal

This claim is demonstrably false.  It really hinges on this statement: “Tide gauges and satellites agree with the model projections.” The models project a rapid acceleration of sea level rise over the next 30 to 70 years.  However, while the models may project acceleration, the tide gauges clearly do not.

All data from tide gauges in areas where land is not rising or sinking show instead a steady linear and unchanging sea level rate of rise from 4 up to 6 inches/century, with variations due to gravitational factors.  It is true that where the land is sinking as it is in the Tidewater area of Virginia and the Mississippi Delta region, sea levels will appear to rise faster but no changes in CO2 emissions would change that.

The implication that measured, validated, and verified tide gauge data support this conclusion remains simply false.  All such references rely on “semi-empirical” information, which merges, concatenates, combines, and joins actual tide gauge data with various models of the reference author’s choosing.  Nowhere on this planet can a tide gauge be found that shows even half of the claimed 3.3 mm/yr sea level rise rate in “tectonically inert” coastal zones.  These are areas that lie between regions of geological uplift and subsidence.  They are essentially neutral with respect to vertical land motion, and tide gauges located therein show between 1 mm/yr (3.9 inches/century) and 1.5 mm/yr (6 inches/century) of rise. The great Swedish oceanographer Nils-Axel Mörner has commented on this extensively, and his latest papers confirm this ‘inconvenient truth’.
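The unit conversions quoted above are easy to verify. A minimal sketch (the mm/yr rates are the figures given in the text):

```python
# Convert sea-level-rise rates from mm/yr to inches/century.
MM_PER_INCH = 25.4

def mm_per_yr_to_inches_per_century(rate_mm_per_yr):
    """100 years of accumulation at rate_mm_per_yr, expressed in inches."""
    return rate_mm_per_yr * 100 / MM_PER_INCH

for rate in (1.0, 1.5, 3.3):
    print(f"{rate} mm/yr = {mm_per_yr_to_inches_per_century(rate):.1f} inches/century")
```

This reproduces the quoted figures: 1 mm/yr is about 3.9 inches/century, 1.5 mm/yr about 5.9 (roughly 6) inches/century, and the claimed satellite-era 3.3 mm/yr would be about 13 inches/century.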

Furthermore, alarmist claims that “Satellites agree with the model projection” are false.  Satellite technology was introduced to provide a more objective measurement of sea level rise because properly adjusted tide gauge data were not fitting Alarmists’ claims.  However, the new satellite and radar altimeter data lacked the resolution to accurately measure sea levels down to the mm level. Moreover, the raw data from this technology also conflicted with Alarmists’ claims. As a result, adjustments to these data were also made – most notably a Glacial Isostatic Adjustment (GIA). GIA assumes that essentially all land is rebounding from long-ago glaciations and that oceanic basins are deepening. The assumption is that this rebounding masks the true sea level rise. Alarmists continue to proclaim that their models project a rapid acceleration of sea level rise over the next 30 to 70 years, when those same models have failed to even come close to accurately predicting the past 25 years.

Detailed Rebuttal and Authors: AC Rebuttal – Sea Level
————

Claim:  Arctic, Antarctic and Greenland ice loss is accelerating due to global warming.
Summary of Rebuttal

Satellite and surface temperature records and sea surface temperatures show that both the East Antarctic Ice Sheet and the West Antarctic Ice Sheet are cooling, not warming, and glacial ice is increasing, not melting. Satellite and surface temperature measurements of the southern polar area show no warming over the past 37 years. Growth of the Antarctic ice sheets means sea level rise is not being caused by melting of polar ice and, in fact, is slightly lowering the rate of rise. Satellite Antarctic temperature records show 0.02C/decade cooling since 1979. The Southern Ocean around Antarctica has been getting sharply colder since 2006. Antarctic sea ice is increasing, reaching all-time highs. Surface temperatures at 13 stations show the Antarctic Peninsula has been sharply cooling since 2000.

The Arctic includes the Arctic Ocean, Greenland, Iceland, and part of Siberia and northern Alaska. Because of the absence of any land mass in the Arctic Ocean, most of the area lacks glaciers, which require a land mass. Thus, most of the Arctic contains only floating sea ice. Greenland, Iceland, northern Alaska, and northern Siberia contain the only glaciers in the general Arctic region.

Because the Arctic ice is floating, it is subject to intrusions of warmer water under the ice during the natural multidecadal warm cycles, especially from the North Atlantic, which thins the ice and reduces its extent in summer, along with the accompanying warmer air temperatures. Increased ice and colder temperatures are observed during cold-water ocean cycles.

Arctic temperature records show that the 1920s and 1930s were warmer than 2000. Official historical fluctuations of Arctic sea ice begin with the first satellite images in 1979. That happens to coincide with the end of the recent 1945-1977 global cold period and the resulting maximum extent of Arctic sea ice. During the warm period from 1978 until recently, the extent of sea ice has diminished, but increased in the past several years. The Greenland ice sheet has also grown recently.

Detailed Rebuttal and Authors: AC Rebuttal Arctic, Antarctic and Greenland
————-

Claim: Rising atmospheric CO2 concentrations are causing ocean acidification, which is catastrophically harming marine life.
Summary of Rebuttal

As the air’s CO2 content rises in response to ever-increasing anthropogenic CO2 emissions, more and more carbon dioxide is expected to dissolve into the surface waters of the world’s oceans, and this dissolution is projected to cause a 0.3 to 0.7 pH unit decline in the planet’s oceanic waters by the year 2300. A potential pH reduction of this magnitude has provoked concern and led to predictions that, if it occurs, marine life will be severely harmed – with some species potentially driven to extinction – as they experience negative impacts in growth, development, fertility and survival.

This ocean acidification hypothesis, as it has come to be known, has gained great momentum in recent years, because it offers a second independent reason to regulate fossil fuel emissions in addition to that provided by concerns over traditional global warming. For even if the climate models are proven to be wrong with respect to their predictions of atmospheric warming, extreme weather, glacial melt, sea level rise, or any other attendant catastrophe, those who seek to regulate and reduce CO2 emissions have a fall-back position, claiming that no matter what happens to the climate, the nations of the Earth must reduce their greenhouse gas emissions because of projected direct negative impacts on marine organisms via ocean acidification.

The ocean chemistry aspect of the ocean acidification hypothesis is rather straightforward, but it is not as solid as it is often claimed to be. For one thing, the work of a number of respected scientists suggests that the drop in oceanic pH will not be nearly as great as the IPCC and others predict. And, as with all phenomena involving living organisms, the introduction of life into the analysis greatly complicates things. When a number of interrelated biological phenomena are considered, it becomes much more difficult, if not impossible, to draw such sweeping negative conclusions about the reaction of marine organisms to ocean acidification. Quite to the contrary, when life is considered, ocean acidification is often found to be a non-problem, or even a benefit. And in this regard, numerous scientific studies have demonstrated the robustness of multiple marine plant and animal species to ocean acidification – when they are properly performed under realistic experimental conditions.

Detailed Rebuttal and Author: AC Rebuttal – Ocean Acidification

————

Claim: Carbon pollution is a health hazard.
Summary of Rebuttal

The term “carbon pollution” is a deliberate, ambiguous, disingenuous term, designed to mislead people into thinking carbon dioxide is pollution. It is used by environmentalists to confuse the environmental impacts of CO2 emissions with the impact of the emissions of unwanted waste products of combustion. The burning of carbon-based fuels (fossil fuels – coal, oil, natural gas – and biofuels and biomass) converts the carbon in the fuels to carbon dioxide (CO2), an odorless, invisible gas that is plant food and essential to life on the planet.

Because the burning of the fuel is never 100% efficient, trace amounts of pollutants including unburnt carbon are produced in the form of fine particulates (soot), hydrocarbon gases and carbon monoxide.  In addition, trace amounts of sulfur oxides, nitrogen oxides and other pollutant constituents can be produced.  In the US, all mobile and industrial stationary combustion sources must have emission control systems that remove the particulates and gaseous pollutants so that the emissions are in compliance with EPA’s emission standards.  The ambient air pollutant concentrations have been decreasing for decades and are going to keep decreasing for the foreseeable future because of existing non-GHG-related regulations.

Detailed Rebuttal and Authors: AC Rebuttal Health
————-

Conclusion

The well-documented invalidation of the “three lines of evidence” upon which EPA attributes global warming to human-caused CO2 emissions breaks the causal link between such CO2 emissions and global warming. {See here and here}

This in turn necessarily breaks the causal chain between CO2 emissions and the alleged knock-on effects of global warming, such as loss of Arctic ice, increased sea level, and increased heat waves, floods, droughts, hurricanes, tornadoes, etc. These alleged downstream effects are constantly cited to whip up alarm and create demands for ever tighter CO2 regulation. EPA explicitly relied on predicted increases in such events to justify the Endangerment Finding supporting its Clean Power Plan. But as shown above, there is no evidence to support such claims, and copious empirical evidence that refutes them.

The enormous cost and essentially limitless scope of the government’s regulatory authority over GHG/CO2 emissions cannot lawfully rest upon a collection of scary stories that are conclusively disproven by readily available empirical data.

The legal criteria for reconsidering the Endangerment Finding are clearly present in this case. The scientific foundation of the Endangerment Finding has been invalidated. The parade of horrible calamities that the Endangerment Finding predicts, and that a vast program of regulation seeks to prevent, has been comprehensively and conclusively refuted by empirical data. The Petition for Reconsideration should be granted.

A Review of the state of Climate Science

The purpose of this article is to provide a quick reference to some of my articles dealing with climate so that you can cite facts to counter the ongoing scam.

Climate has been constantly changing for billions of years and will continue to do so no matter what humans do or don’t do. The major current controversy is whether carbon dioxide emissions from burning fossil fuels will adversely affect global climate. However, there is no physical evidence to support that claim.

The climate system consists of the sun acting upon two turbulent fluids, the atmosphere and the oceans. This is a coupled, non-linear chaotic system consisting of many variables. The notion that just one variable, carbon dioxide, which comprises just 0.04% of the atmosphere, is the major controlling factor, is absurd.

About evidence:

Computer modeling is speculation, not physical evidence. Output from computer modeling of the climate diverges widely from observations because input assumptions are wrong.

Correlation does not prove causation, but it may be suggestive. Still, correlation is not physical evidence.

Consensus is merely opinion, not physical evidence. Remember back in the 1970s the scientific consensus was that Earth was about to enter another glacial epoch.

Here are some reference articles:

A Simple Question for Climate Alarmists

“What physical evidence supports the contention that carbon dioxide emissions from burning fossil fuels are the principal cause of global warming since 1970?”

Evidence that CO2 emissions do not intensify the greenhouse effect

The “greenhouse” hypothesis fails on four major predictions probably because it ignores convective heat transfer.

The Broken Greenhouse – why CO2 is a minor player in global climate

The “greenhouse effect” does exist but water vapor is the major greenhouse gas.

What keeps Earth warm – the greenhouse effect or something else?

It’s the gravity of the planet and density of the atmosphere. We have a practical demonstration of this in the Grand Canyon.

An examination of the relationship between temperature and carbon dioxide

This shows that carbon dioxide has never been a controlling factor no matter what time scale is considered.

The 97 percent consensus of human caused climate change debunked again

On Consensus in Science

Consensus is “the first refuge of scoundrels.”

The Sea Level Scam

 

Tuvalu and other Pacific islands resist sea level rise and add land area

In spite of rising sea level, islands are increasing in land area. It’s all about geology.

Carbon dioxide is necessary for life on Earth

See the Article Index for more

 

The Broken Greenhouse – why CO2 is a minor player in global climate

Climate has been changing for about four billion years in cycles large and small. Climate will continue to change no matter what humans do or don’t do.

Carbon dioxide emissions from burning fossil fuels are the major bogeyman of our time. As H.L. Mencken wrote: “the whole point of practical politics is to keep the populace alarmed and hence clamorous to be led to safety by menacing it with an endless series of hobgoblins, all of them imaginary.” As we will see below, neither increasing carbon dioxide emissions nor reducing such emissions will have a significant effect on global warming.

Even the UN IPCC admits that the climate change bogeyman is about money and power, not the environment. The real goal of UN climate propaganda: “We require deep transformations of our economies and societies.” – UN climate chief Patricia Espinosa. “One has to free oneself from the illusion that international climate policy is environmental policy. Instead, climate change policy is about how we redistribute de facto the world’s wealth” — Ottmar Edenhofer, International Panel on Climate Change (IPCC). The real goal is one-world government.

Let’s review the “greenhouse effect” to see if carbon dioxide is really a major factor in controlling global climate.

We begin with a very simplified review of what the greenhouse effect is. Solar radiation, mostly short-wave radiation, passes through the atmosphere and warms the surface. In turn, the heated surface re-radiates energy as long-wave infrared radiation back to the atmosphere and eventually, back to space.

Greenhouse gases in the atmosphere intercept some of the long-wave infrared radiation and transfer some of the energy to excite (warm) other molecules in the atmosphere, some of the radiation goes back to the surface, and some of the radiation is radiated into space.

The major greenhouse gas is water vapor, which absorbs almost all wavelengths of infrared radiation. Carbon dioxide absorbs four specific wavelengths of infrared radiation, three of which are also absorbed by water vapor. Other minor greenhouse gases include ozone, methane, and nitrous oxide.

Once a particular wavelength becomes saturated, i.e., almost completely absorbed, additional quantities of greenhouse gases have no effect.

Even the IPCC agrees that the hypothetical capacity of carbon dioxide to change temperature is given by the formula ΔTc = α ln(C2/C1), where ΔTc is the change in temperature in degrees Celsius and ln(C2/C1) is the natural logarithm of the CO2 concentration at time two divided by the concentration at time one. The constant α (alpha) is sometimes called the sensitivity, and its value is subject to debate. This relationship was proposed by Svante August Arrhenius, the Swedish physicist and chemist, around 1896. The logarithmic formula produces a graph in the form shown below. This shows that as the concentration of carbon dioxide increases, it has less and less influence. This graph is the pure theoretical capacity of carbon dioxide to warm the atmosphere in the absence of any confounding feedbacks. The different curves represent different values of sensitivity.

 

Carbon dioxide is currently about 400 parts per million (0.04%) of the atmosphere. Yet this nearly negligible amount is touted as the main driver of global temperature. Notice that even at the highest sensitivity on the chart, doubling carbon dioxide from 400 ppm to 800 ppm results in a theoretical rise in temperature of only slightly more than 1°C – nothing to worry about.
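The doubling claim follows directly from the logarithmic formula. A minimal Python sketch (the α values chosen here are purely illustrative assumptions, since the sensitivity is disputed):

```python
import math

def delta_t(alpha, c2, c1):
    """Arrhenius-style temperature change: dT = alpha * ln(C2/C1)."""
    return alpha * math.log(c2 / c1)

# Doubling CO2 from 400 ppm to 800 ppm for a few illustrative sensitivities.
for alpha in (0.5, 1.0, 1.6):
    print(f"alpha = {alpha}: dT = {delta_t(alpha, 800, 400):.2f} C")
```

Because ln(800/400) = ln(2) ≈ 0.69, even a sensitivity of 1.6 yields only about 1.1°C for a doubling, and each further doubling adds the same increment, not more.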

The climate system consists of two turbulent fluids (the atmosphere and the oceans) interacting with each other. As the IPCC rightly says in its Third Assessment Report: “In climate research and modeling, we should recognize that we are dealing with a coupled, non-linear chaotic system, and therefore that the prediction of a specific future climate state is not possible.” The claim that one minor variable acts as the major control knob is absurd.

In the graph, the numbers shown in parentheses are the estimated temperature increase from quadrupling carbon dioxide concentration. Many climate models use much higher values for the sensitivity. That’s why most climate models run much hotter than measured temperatures. Recent research suggests that sensitivity could be as low as -0.03°C, i.e., cooling. (Source)

The term “greenhouse effect” with respect to the atmosphere is an unfortunate analogy because it is misleading. The interior of a real greenhouse (or your automobile parked with windows closed and left in the sun) heats up because there is a physical barrier to convective heat loss. There is no such physical barrier in the atmosphere. The greenhouse hypothesis deals only with heat transfer by radiation and completely ignores convective heat transfer. Convective heat transfer (weather) puts many holes in the “blanket” of carbon dioxide. The “greenhouse” is effectively broken.

I have often heard it claimed that without the “greenhouse effect” Earth would be an iceball. Well, it ain’t necessarily so. There is an alternate hypothesis of what warms the atmosphere and this alternative is supported by physical evidence.

Scottish physicist James Clerk Maxwell proposed in his 1871 book “Theory of Heat” that the temperature of a planet depends only on gravity, the mass of the atmosphere, and the heat capacity of the atmosphere. This happens regardless of atmospheric composition. Greenhouse gases have nothing to do with it. Physical evidence supports this hypothesis. See more of this story here: What keeps Earth warm – the greenhouse effect or something else?

The “greenhouse” hypothesis of global warming is not supported by physical evidence, see:

A simple question for climate alarmists – where is the evidence?

On the other hand, there are several lines of physical evidence showing that carbon dioxide emissions do not intensify the “greenhouse effect”; see: Evidence that CO2 emissions do not intensify the greenhouse effect

The global push for renewable energy generation of electricity is based on the false premise that we need to reduce carbon dioxide emissions to forestall dangerous warming. How much warming is dangerous? The IPCC says 2°C is dangerous. They are ignoring the Cretaceous Period, when global temperature was at least 10°C warmer, and the Paleocene-Eocene, when temperatures were up to 19°C warmer. (link) The IPCC’s arbitrary 2°C (3.6°F) “tipping point” has no basis in science. In fact, during the last 10,000 years, the temperature has cycled several times through warm and cool periods of 2°C or more.

See also:

Analysis of US and State-by-State Carbon Dioxide Emissions and Potential “Savings” in Future Global Temperature and Global Sea Level Rise (link)

This paper shows that if Arizona stops all carbon dioxide emissions it could possibly prevent a rise in temperature of 0.0029°C by 2100. If the entire U.S. stopped all carbon dioxide emissions it could prevent a temperature rise of 0.172°C by 2100.

More Evidence Water Vapor Is Dominant Influence on Temperatures (link)

This article by meteorologist Joe Bastardi explains how water vapor moderates temperature.

Much of the climate scaremongering is based on climate models. Climate models are complex mathematical constructs, not physical evidence. But the atmosphere is even more complex, so modelers must ignore many variables, such as Sun-Earth relationships and clouds, in favor of a few basic parameters. The fundamental assumption of climate models is that changes in CO2 concentration drive temperature change, but evidence from geology and astronomy shows that the relationship is just the opposite. Temperature drives atmospheric CO2 concentration because temperature controls CO2 solubility in the oceans.

CO2 is Not a Greenhouse Gas 

Article by Dr. Tim Ball: The most important assumption behind the AGW theory is that an increase in global atmospheric CO2 will cause an increase in the average annual global temperature. The problem is that in every record of temperature and CO2, the temperature changes first. Think about what I am saying. The basic assumption on which the entire theory that human activity is causing global warming or climate change rests is wrong. The question is: how did the false assumption develop and persist? (Water vapor comprises 95% of greenhouse gas.)

For Arizona voters – Let’s Finish the Job and Repeal Arizona’s Existing Renewable Energy Mandate

In the November election, Arizona voters rightly and overwhelmingly rejected Proposition 127 which would have established an amendment to the Arizona Constitution requiring that 50 percent of electricity be generated from renewable energy sources such as solar and wind. We need to finish the job and get the Arizona legislature to repeal the existing 15 percent renewable energy mandate imposed upon us by the Arizona Corporation Commission (ACC) in 2006.

In 2013, I wrote an article outlining why this mandate is very bad policy, see:

Five reasons Arizona should repeal its renewable energy standards mandate.

Here is a brief summary of those reasons:

1) Electricity generated from solar and wind is much more expensive than conventional generation. That expense is reflected in higher electricity bills. My current bill from Tucson Electric Power shows “surcharges” directly attributable to the mandate totaling an extra $230 per year. I expect those charges to double as we transition from the current 7 percent renewables to the mandated 15 percent. The ACC itself estimated that, through 2025, the mandate would cost consumers $1.2 billion more than they would have paid for conventional energy sources.

2) Renewable energy sources such as wind and solar are intermittent, unpredictable, and unreliable. The electric grid is the lifeblood of modern civilization. Solar and wind generation can make the grid unstable and unreliable.

3) Because generation from renewable energy sources is intermittent and unpredictable, these sources require backup generation, usually provided by burning fossil fuels. Experience in Europe shows that backup generators actually use more fuel and produce more carbon dioxide emissions and pollutants such as sulfur dioxide than they would if they were run efficiently for primary generation.

4) Use of renewable energy will not impact climate. If Arizona stopped all carbon dioxide emissions it could theoretically prevent a temperature rise of 0.0014°C by 2050. (source)

5) Finally, renewable energy is not as green as advertised.

The manufacturing and disposal processes for solar panels put several dangerous chemicals into the environment. Wind turbines chop up birds and bats. Wind turbines also have deleterious effects on human health, see: Health Hazards of Wind Turbines

Petition the Arizona legislature to end the mandate.

Article 15 of the Arizona Constitution deals with the ACC. Perhaps section 6 of that article provides a means for the legislature to rescind the mandate. It reads:

Section 6. The law-making power may enlarge the powers and extend the duties of the corporation commission, and may prescribe rules and regulations to govern proceedings instituted by and before it; but, until such rules and regulations are provided by law, the commission may make rules and regulations to govern such proceedings. [my emphasis]

Perhaps the legislature could pass a law that says: The ACC shall not mandate the method by which electricity is generated in Arizona. Any and all existing mandates are hereby rescinded and declared null and void.

 

Such a law does not mean that electric companies can’t use renewable energy. It just means that government bureaucrats can’t tell them they must.

The Arizona legislature reconvenes in mid January. Between now and then, please contact your state senator and two state representatives and urge them to repeal the ACC mandate.

To find contact information for your state legislators:

First, find your Arizona legislative district: https://azredistricting.org/districtlocator/

You will have to type in your address or zip code. This site will show both your federal congressional district number and your Arizona legislative district number.

Find an alphabetical list of members of the legislature (with phone numbers and email addresses):

https://www.azleg.gov/MemberRoster/

Scroll down the list until you find the legislators in your district.

To send a message from the roster:

You can click on the name in the 4th column to get to a message form. Or to send an email directly, use the name in the 4th column on the list and add: @azleg.gov

For regular mail, use this address: Legislator name, Arizona State Senate (or House of Representatives), 1700 W. Washington, Phoenix, AZ 85007.

We should rely upon the free market and let utility companies generate electricity by the method they see as most efficient, cost effective, and reliable. Most renewable energy sources are none of those things.

 

Fourth National Climate Assessment, Part 2 – no science, just scaremongering

On November 23, 2018, the U.S. Global Change Research Program (USGCRP) released Part 2 of the Fourth National Climate Assessment as required by law. [link to report] You may have read in the always credulous “mainstream” media about all the doom and gloom prophecies in the new report. Part 1 was released last November.

Both reports are based on computer modeling rather than on physical observations. Please read my comments on Part 1 here:

Fourth National Climate Assessment is junk science

Much of the latest USGCRP report is vague and unsubstantiated. It is really a political report rather than a science report. It offers no hard evidence, just vague assertions, and claims that past climate change is no evidence about future climate change. It does not meet the standards of the Information Quality Act, and each page should be stamped: “Based on speculation, not hard evidence.” Part 2 is based almost entirely on one extreme emissions scenario, Representative Concentration Pathway 8.5 (RCP8.5), which is an outlier among the scenarios. Even the UN’s IPCC is phasing out that scenario.

The scaremongers have a problem. Since the first National Climate Assessment in 2000, U.S. temperatures show no net change. Nature is not cooperating with the political narrative.

 

“The problem with these sorts of ‘studies’ is the main conclusion is already made before the actual work begins. These academics aren’t studying to see if the changing climate is caused by man or nature, it’s simply accepted as faith that it’s man’s fault. So these studies are done to reinforce preconceived notions and justify jobs. These academics who conduct them have to justify their jobs and bring in grant money, government grant money; our money.” – Derek Hunter, Townhall (link)

 

4 Problems With the New Climate Change Report

1. It wildly exaggerates economic costs.

One statistic that media outlets have seized upon is that the worst climate scenario could cost the U.S. 10 percent of its gross domestic product by 2100. The 10 percent loss projection is more than twice the percentage that was lost during the Great Recession.

The study, funded in part by climate warrior Tom Steyer’s organization, calculates these costs on the assumption that the world will be 15 degrees Fahrenheit warmer. That temperature projection is even higher than the worst-case scenario predicted by the United Nations Intergovernmental Panel on Climate Change. In other words, it is completely unrealistic.

2. It assumes the most extreme (and least likely) climate scenario.

The scary projections in the National Climate Assessment rely on a theoretical climate trajectory known as Representative Concentration Pathway 8.5. In estimating the impacts of climate change, climatologists use four such representative trajectories to project different greenhouse gas concentrations.

To put it plainly, Representative Concentration Pathway 8.5 assumes a combination of bad factors that are not likely to all coincide. It assumes “the fastest population growth (a doubling of Earth’s population to 12 billion), the lowest rate of technology development, slow GDP growth, a massive increase in world poverty, plus high energy use and emissions.”

3. It cherry-picks science on extreme weather and misrepresents timelines and causality.

4. Energy taxes are a costly non-solution.

The National Climate Assessment stresses that this report “was created to inform policy-makers and makes no specific recommendations on how to remedy the problem.” Yet the takeaway was clear: the costs of inaction (10 percent of America’s GDP) dwarf the costs of any climate policy.

The reality, however, is that policies endorsed to combat climate change would carry significant costs and would do nothing to mitigate warming, even if there were a looming catastrophe like the National Climate Assessment says.

Just last month, the Intergovernmental Panel on Climate Change proposed a carbon tax of between $135 and $5,500 by the year 2030. An energy tax of that magnitude would bankrupt families and businesses, and undoubtedly catapult the world into economic despair.

These policies would simply divert resources away from more valuable uses, such as investing in more robust infrastructure to protect against natural disasters or investing in new technologies that make Representative Concentration Pathway 8.5 even more of an afterthought than it already should be. – The Heritage Foundation

More Comments

“The scientists who wrote the National Climate Assessment used unreliable information that exaggerates the risks global warming poses.” – University of Colorado Prof. Roger Pielke Jr.

“This report from the climate alarmist Deep State in our government is even more hysterical than some United Nations reports. The idea that global temperatures could rise as much as 12 degrees in the next 80 years is absurd and not a shred of actual data and observation supports that. And as noted in Climate Change Reconsidered, sea levels have not been rising at an accelerated rate, and global temperatures have stayed largely the same for much of the last 20 years.” – Tim Huelskamp, Ph.D., President & CEO, The Heartland Institute

“I have never seen such blatantly absurd conclusions drawn entirely from mathematical models that use only a limited number of variables. Of course, this shoddy science by Obama-era appointees serves its real purpose: producing a preordained political outcome that puts more power and money in the hands of the United Nations.

“The physical evidence proves conclusively that sea level is not rising at increased levels. The frequency and strength of hurricanes has been declining for years, not increasing. The same goes for tornados, floods, and forest fires. In fact, there is no evidence that further increases in carbon dioxide emissions will have any deleterious effect on the planet or its temperature.

“This report is a scientific embarrassment. Not only does it rely on computer models to predict the climate through the end of the century, it relies on computer models from five years ago that have been laughably wrong, failing to get even close to reality since 2013. Happily, President Trump has on his advisory staff Dr. William Happer, who knows how flawed these models are and will advise the president to not base a single aspect of U.S. policy upon them.” – Jay Lehr, Ph.D., Science Director, The Heartland Institute

According to the Center for the Study of Carbon Dioxide and Global Change (http://www.co2science.org/):

“Real-world observations fail to confirm essentially all of the alarming predictions of significant increases in the frequency and severity of droughts, floods and hurricanes that climate models suggest should occur in response to a global warming of the magnitude that was experienced by the earth over the past two centuries as it gradually recovered from the much-lower-than-present temperatures characteristic of the depths of the Little Ice Age. And other observations have shown that the rising atmospheric CO2 concentrations associated with the development of the Industrial Revolution have actually been good for the planet, as they have significantly enhanced the plant productivity and vegetative water use efficiency of earth’s natural and agro-ecosystems, leading to a significant ‘greening of the earth.’” Read 168-page report

Comment from the Science and Environmental Policy Project (http://www.sepp.org/):

“Humanity evolved in the tropics about 200,000 years ago during periods of extreme climate change. The current warm period, the Holocene Epoch, started about 11,700 years ago. According to the International Commission on Stratigraphy, the earth has experienced three periods of climate change since emerging from the depths of the last Ice Age into the Holocene Epoch. Agriculture began during the Greenlandian Age, the warmest time of the Holocene Epoch. Civilization began during the Northgrippian Age, warmer than today, about 8200 to 4200 years ago. During the subsequent cooling, about 4200 years ago, humanity suffered and cultures disappeared. These changes appear to be unrelated to carbon dioxide (CO2). Yet the USGCRP declares that climate has been stable for 12,000 years and humanity is threatened by global warming from CO2?”

Humans adapted to Younger Dryas 

Climate change is real; climate has changed throughout the Earth’s history and will change in the future. Many times in human history climate has changed more rapidly than it is changing today; these changes are documented here and here. Probably the best example is from the end of the last glacial period, 11,700 years ago, after the Younger Dryas cold period, when temperatures in the Northern Hemisphere rose 5-10°C in just a few decades. That is an astounding 9°F to 18°F in much less than 100 years. Humans adapted and even thrived during this change, which occurred at the dawn of human civilization. Despite this evidence, NCA4 insists that recent warming is unprecedented; this is a clear error in the report. (Source)

By the way: According to the U.S. Energy Information Administration, between 2005 and 2017, U.S. energy-related emissions of carbon dioxide plunged by 861 million metric tons, a 14% drop due mainly to the fracking revolution. During the same period, global emissions rose by 21%, due mostly to economic development in China and India.

Related articles:

Making climate predictions by S. Fred Singer

Reducing or eliminating carbon dioxide emissions will have no significant effect on global temperatures. See why:

Evidence that CO2 emissions do not intensify the greenhouse effect

Climate change in perspective

Devil’s Trumpet, another pretty but poisonous plant

Devil’s Trumpet (Datura fastuosa), also called Datura metel, is native to India and southeast Asia but now grows all over the world in warm climates. It is in the Nightshade family. I took the photo for this article near the butterfly garden at the Arizona-Sonora Desert Museum. Other common names for this plant include: Horn of Plenty, Downy Thorn-Apple, Hoary Thorn-Apple, Purple Thorn-Apple, and Thorn-Apple. See more photos here.

The plant can grow as an annual or a perennial and reaches three to 12 feet high. The flowers, which are up to eight inches long, come in a variety of colors including white, yellow, cream, red, and violet.

According to Wikipedia:

All parts of Datura plants contain dangerous levels of highly poisonous tropane alkaloids and may be fatal if ingested by humans or other animals, including livestock and pets.

Datura metel may be toxic if ingested even in a tiny quantity, symptomatically expressed as flushed skin, headaches, hallucinations, and possibly convulsions or even a coma. The principal toxic elements are tropane alkaloids. Ingesting even a single leaf can lead to severe side effects.

The plant is cultivated as an ornamental and for its medicinal characteristics. It is widely used in traditional Chinese medicine.

An article in the Journal of Pharmacology goes into great detail about the medical uses of this plant. In summary, “The dried leaves, flowers and roots were used as narcotic, antispasmodic, antitussive, bronchodilator, anti-asthmatic and as hallucinogenic. The plant was also used in diarrhea, skin diseases, epilepsy, hysteria, rheumatic pains, hemorrhoids, painful menstruation, skin ulcers, wounds and burns. In Ayurveda [an ancient medical treatise summarizing the Hindu art of healing and prolonging life], the plant was considered bitter, acrid, astringent, germicide, anodyne, antiseptic, antiphlogistic, narcotic and sedative.”

An article at Entheology.com goes into detail about traditional uses of this plant, most of which involve inebriation.

 

Related article:

Sacred Datura – pretty, poisonous, and hallucinogenic

 

Lindzen explains the climate system

The following are excerpts from a lecture presented by Dr. Richard Lindzen to the Global Warming Policy Foundation in October 2018. Dr. Lindzen was Alfred P. Sloan Professor of Meteorology at the Massachusetts Institute of Technology until his retirement in 2013. He is the author of over 200 papers on meteorology and climatology and is a member of the US National Academy of Sciences and of the Academic Advisory Council of GWPF.

Each of the following sections has more to it. Read the entire lecture here:
https://www.thegwpf.org/content/uploads/2018/10/Lindzen-AnnualGWPF-lecture.pdf

The climate system

The following description of the climate system contains nothing that is in the least controversial, and I expect that anyone with a scientific background will readily follow the description. I will also try to make the description intelligible to the non-scientist.

The system we are looking at consists of two turbulent fluids (the atmosphere and the oceans) interacting with each other. By ‘turbulent,’ I simply mean that it is characterized by irregular circulations like those found in a gurgling brook or boiling water, but on the planetary scale of the oceans and the atmosphere. The opposite of turbulent is called laminar, but any fluid forced to move fast enough becomes turbulent, and turbulence obviously limits predictability. By interaction, I simply mean that they exert stress on each other and exchange heat with each other.

These fluids are on a rotating planet that is unevenly heated by the sun. The motions in the atmosphere (and to a lesser extent in the oceans) are generated by the uneven influence of the sun. The sun, itself, can be steady, but it shines directly on the tropics while barely skimming the Earth at the poles. The drivers of the oceans are more complex and include forcing by wind as well as the sinking of cold and salty water. The rotation of the Earth has many consequences too, but for the present, we may simply note that it leads to radiation being distributed around a latitude circle.

The oceans have circulations and currents operating on time scales ranging from years to millennia, and these systems carry heat to and from the surface. Because of the scale and density of the oceans, the flow speeds are generally much smaller than in the atmosphere and are associated with much longer time scales. The fact that these circulations carry heat to and from the surface means that the surface, itself, is never in equilibrium with space. That is to say, there is never an exact balance between incoming heat from the sun and outgoing radiation generated by the Earth because heat is always being stored in and released from the oceans and surface temperature is always, therefore, varying somewhat.

In addition to the oceans, the atmosphere is interacting with a hugely irregular land surface. As air passes over mountain ranges, the flow is greatly distorted. Topography therefore plays a major role in modifying regional climate. These distorted air-flows even generate fluid waves that can alter climate at distant locations. Computer simulations of the climate generally fail to adequately describe these effects.

A vital constituent of the atmospheric component is water in the liquid, solid and vapor phases, and the changes in phase have vast impacts on energy flows. Each component also has important radiative impacts. You all know that it takes heat to melt ice, and it takes further heat for the resulting water to become vapor or, as it is sometimes referred to, steam. The term humidity refers to the amount of vapor in the atmosphere. The flow of heat is reversed when the phase changes are reversed; that is, when vapor condenses into water, and when water freezes. The release of heat when water vapor condenses drives thunder clouds (known as cumulonimbus), and the energy in a thundercloud is comparable to that released in an H-bomb. I say this simply to illustrate that these energy transformations are very substantial. Clouds consist of water in the form of fine droplets and ice in the form of fine crystals. Normally, these fine droplets and crystals are suspended by rising air currents, but when these grow large enough they fall through the rising air as rain and snow. Not only are the energies involved in phase transformations important, so is the fact that both water vapor and clouds (both ice- and water-based) strongly affect radiation. Although I haven’t discussed the greenhouse effect yet, I’m sure all of you have heard that carbon dioxide is a greenhouse gas and that this explains its warming effect. You should, therefore, understand that the two most important greenhouse substances by far are water vapor and clouds. Clouds are also important reflectors of sunlight.

The unit for describing energy flows is watts per square meter. The energy budget of this system involves the absorption and re-emission of about 200 watts per square meter. Doubling CO2 involves a 2% perturbation to this budget. So do minor changes in clouds and other features, and such changes are common. The Earth receives about 340 watts per square meter from the sun, but about 140 watts per square meter is simply reflected back to space, by both the Earth’s surface and, more importantly, by clouds. This leaves about 200 watts per square meter that the Earth would have to emit in order to establish balance.
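The budget figures quoted above can be tallied directly. A minimal sketch using only the numbers from the lecture (the 3.7 W/m² forcing for a CO2 doubling is the conventional estimate cited further below):

```python
solar_in = 340.0   # W/m^2, average incoming solar radiation
reflected = 140.0  # W/m^2, reflected back to space by surface and clouds

# What the Earth must emit to balance the absorbed sunlight.
absorbed = solar_in - reflected  # ~200 W/m^2

co2_doubling_forcing = 3.7  # W/m^2, conventional estimate for doubled CO2
perturbation = co2_doubling_forcing / absorbed

print(f"absorbed = {absorbed} W/m^2")
print(f"CO2 doubling perturbation = {perturbation:.1%}")  # just under 2%
```

The point of the arithmetic is the scale: the doubling forcing is about a 2% perturbation to a ~200 W/m² budget, the same order as routine changes in clouds and other features.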

The sun radiates in the visible portion of the radiation spectrum because its temperature is about 6000K. ‘K’ refers to Kelvins, which are simply degrees Centigrade plus 273. Zero K is the lowest possible temperature (-273°C). Temperature determines the spectrum of the emitted radiation. If the Earth had no atmosphere at all (but for purposes of argument still was reflecting 140 watts per square meter), it would have to radiate at a temperature of about 255K, and, at this temperature, the radiation is mostly in the infrared.
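The 255K figure follows from the Stefan-Boltzmann law, T = (F/σ)^(1/4). A sketch using the standard constant σ (not stated in the lecture); note that the lecture’s round 200 W/m² yields a somewhat lower temperature, while the commonly quoted 255K corresponds to an absorbed flux of roughly 240 W/m²:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def effective_temp(flux):
    """Temperature (K) of a blackbody emitting `flux` W/m^2."""
    return (flux / SIGMA) ** 0.25

print(f"{effective_temp(240):.0f} K")  # ~255 K, the commonly quoted figure
print(f"{effective_temp(200):.0f} K")  # ~244 K, from the lecture's round 200
```

Either way, a body at these temperatures radiates mostly in the infrared, which is the point of the argument.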

Of course, the Earth does have an atmosphere and oceans, and this introduces a host of complications. So be warned, what follows will require a certain amount of concentration. Evaporation from the oceans gives rise to water vapor in the atmosphere, and water vapor very strongly absorbs and emits radiation in the infrared. This is what we mean when we call water vapor a greenhouse gas. The water vapor essentially blocks infrared radiation from leaving the surface, causing the surface and (via conduction) the air adjacent to the surface to heat, and, as in a heated pot of water, convection sets in. Because the density of air decreases with height, the buoyant elements expand as they rise. This causes the buoyant elements to cool as they rise, and the mixing results in decreasing temperature with height rather than a constant temperature. To make matters more complicated, the amount of water vapor that the air can hold decreases rapidly as the temperature decreases. At some height there is so little water vapor above this height that radiation from this level can now escape to space. It is at this elevated level (around 5 km) that the temperature must be about 255K in order to balance incoming radiation. However, because convection causes temperature to decrease with height, the surface now has to actually be warmer than 255K. It turns out that it has to be about 288K (which is the average temperature of the Earth’s surface).
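The step from 255K at the emission level to 288K at the surface is just the temperature decrease with height run in reverse. A sketch, assuming a typical environmental lapse rate of 6.5 K per km (a standard textbook value, not quoted in the lecture):

```python
emission_temp = 255.0       # K, required at the effective emission level
emission_height_km = 5.0    # km, the elevated level cited in the lecture
lapse_rate = 6.5            # K per km; typical environmental value (assumed)

# Walking back down from the emission level to the surface.
surface_temp = emission_temp + lapse_rate * emission_height_km
print(f"surface temperature ~ {surface_temp:.0f} K")  # ~288 K
```

This is why raising the emission level (by adding greenhouse gases) warms the surface: the same lapse rate now extends over a greater height.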

This is what is known as the greenhouse effect. It is an interesting curiosity that had convection produced a uniform temperature, there wouldn’t be a greenhouse effect. In reality, the situation is still more complicated. Among other things, the existence of upper-level cirrus clouds, which are very strong absorbers and emitters of infrared radiation, effectively block infrared radiation from below. Thus, when such clouds are present above about 5 km, their tops rather than the height of 5 km determine the level from which infrared reaches space. Now the addition of other greenhouse gases (like carbon dioxide) elevates the emission level, and because of the convective mixing, the new level will be colder. This reduces the outgoing infrared flux, and, in order to restore balance, the atmosphere would have to warm. Doubling carbon dioxide concentration is estimated to be equivalent to a forcing of about 3.7 watts per square meter, which is little less than 2% of the net incoming 200 watts per square meter. Many factors, including cloud area and height, snow cover, and ocean circulations, commonly cause changes of comparable magnitude.

It is important to note that such a system will fluctuate with time scales ranging from seconds to millennia, even in the absence of an explicit forcing other than a steady sun. Much of the popular literature (on both sides of the climate debate) assumes that all changes must be driven by some external factor. Of course, the climate system is driven by the sun, but even if the solar forcing were constant, the climate would still vary. This is actually something that all of you have long known – even if you don’t realize it. After all, you have no difficulty recognizing that the steady stroking of a violin string by a bow causes the string to vibrate and generate sound waves. In a similar way, the atmosphere–ocean system responds to steady forcing with its own modes of variation (which, admittedly, are often more complex than the modes of a violin string). Moreover, given the massive nature of the oceans, such variations can involve time scales of millennia rather than milliseconds. El Niño is a relatively short example, involving years, but most of these internal time variations are too long to even be identified in our relatively short instrumental record. Nature has numerous examples of autonomous variability, including the approximately 11-year sunspot cycle and the reversals of the Earth’s magnetic field every couple of hundred thousand years or so. In this respect, the climate system is no different from other natural systems.

Of course, such systems also respond to external forcing, but such forcing is not needed for them to exhibit variability. While the above is totally uncontroversial, please think about it for a moment. Keep in mind the massive heterogeneity and complexity of the system, and the variety of its mechanisms of variability, as we consider the current narrative that is commonly presented as ‘settled science.’

The popular narrative and its political origins

Now here is the currently popular narrative concerning this system. The climate, a complex multifactor system, can be summarized in just one variable, the globally averaged temperature change, and is primarily controlled by the 1-2% perturbation in the energy budget due to a single variable – carbon dioxide – among many variables of comparable importance. This is an extraordinary claim based on reasoning that borders on magical thinking. It is, however, the narrative that has been widely accepted, even among many sceptics.

Many politicians and learned societies go even further: They endorse carbon dioxide as the controlling variable, and although mankind’s CO2 contributions are small compared to the much larger but uncertain natural exchanges with both the oceans and the biosphere, they are confident that they know precisely what policies to implement in order to control carbon dioxide levels.

The evidence

At this point, some of you might be wondering about all the so-called evidence for dangerous climate change. What about the disappearing Arctic ice, the rising sea level, the weather extremes, starving polar bears, the Syrian Civil War, and all the rest of it? The vast variety of the claims makes it impossible to point to any particular fault that applies to all of them. Of course, citing the existence of changes – even if these observations are correct (although surprisingly often they are not) – would not implicate greenhouse warming per se. Nor would it point to danger. Note that most of the so-called evidence refers to matters of which you have no personal experience. Some of the claims, such as those relating to weather extremes, contradict what both physical theory and empirical data show. The purpose of these claims is obviously to frighten and befuddle the public, and to make it seem like there is evidence where, in fact, there is none.

Conclusion

So there you have it. An implausible conjecture backed by false evidence and repeated incessantly has become politically correct ‘knowledge,’ and is used to promote the overturn of industrial civilization. What we will be leaving our grandchildren is not a planet damaged by industrial progress, but a record of unfathomable silliness as well as a landscape degraded by rusting wind farms and decaying solar panel arrays. False claims about 97% agreement will not spare us, but the willingness of scientists to keep mum is likely to greatly reduce trust in and support for science. Perhaps this won’t be such a bad thing after all – certainly as concerns ‘official’ science.

There is at least one positive aspect to the present situation. None of the proposed policies will have much impact on greenhouse gases. Thus we will continue to benefit from the one thing that can be clearly attributed to elevated carbon dioxide: namely, its effective role as a plant fertilizer, and reducer of the drought vulnerability of plants.

See also:
Evidence that CO2 emissions do not intensify the greenhouse effect
An examination of the relationship between temperature and carbon dioxide

A Hidden Tucson Treasure – WomanKraft Art Center

Nearly hidden by surrounding trees, a 1918 Queen Anne Victorian house at 388 South Stone Avenue is home to the WomanKraft Art Gallery and School of the Arts. They call it “The Castle.” Look for the colorful front gate just across the street from the Downtown Motor Hotel.

WomanKraft is a non-profit arts organization open to both women and men. The WomanKraft mission: “To claim, validate, and empower women artists and other under-represented groups.” Local artists are encouraged to submit artwork to all upcoming shows, with no submission fee. Besides the gallery and school, there is an all-natural beauty salon and individual studios for rent. And it’s not all art; they have fun too. There are bingo nights, karaoke nights, and rummage sales. Regular gallery hours are Wednesday through Saturday, 1 to 5 pm. Go take a look.

My wife Lonni and I are both members. Lonni, besides being a novelist, is also a fine artist and displays (and sometimes sells) her work in most of the shows. One of my favorites of Lonni’s paintings is “Feathers & Pearls” painted for a “black & white & shades of grey” show. I produce the WomanKraft Gallery newsletter which is called “The Castle Voice.” See the latest issue here. The Castle Voice contains a list of all shows, and descriptions for WomanKraft’s extensive schedule of classes. Most shows have receptions on two Saturday nights with free admission, snacks, and beverages. Check the schedule in the Castle Voice.

Artworks in the shows vary in price and are unlimited in their creativity – metal, oils, watercolor, sculpture, collage, jewelry, and pure imagination. If you love a piece of art, chances are that you can afford it.

In the current show, running from November 3 through December 22, all works are priced between $1.00 and $100.00. It’s a great place to find that special and unusual holiday treasure.

To learn more, read The Castle Voice, linked above, and/or visit the website and Facebook page.

Website: http://womankraft.org/

Facebook: https://www.facebook.com/womankraft

WomanKraft is located just four blocks south of downtown Tucson at 388 South Stone.

Note to readers:

Main temperature database used by IPCC found to contain multiple errors

An audit of the HadCRUT4 dataset, the primary global temperature database used by the Intergovernmental Panel on Climate Change (IPCC), has found multiple errors.

HadCRUT4 is also the dataset at the center of the 2009 “ClimateGate” affair; it is managed by the Climatic Research Unit (CRU) at the University of East Anglia.

The paper, An Audit of the Creation and Content of the HadCRUT4 Temperature Dataset by John McLean (PhD), was first published as a PhD thesis and now as a book. Get the book for $8 here. Read the original thesis here (free download).

The audit found more than 70 areas of concern about data quality and accuracy.

Australian researcher John McLean says that HadCRUT4 is far too sloppy to be taken seriously even by climate scientists, let alone a body as influential as the IPCC or by the governments of the world.

Main points:

The Hadley data is one of the most cited, most important databases for climate modeling, and thus for policies involving billions of dollars.

McLean found freakishly improbable data, and systematic adjustment errors, large gaps where there is no data, location errors, Fahrenheit temperatures reported as Celsius, and spelling errors.

[The improper transposition of Fahrenheit temperatures to Celsius is serious. Fahrenheit 40 is a cool temperature but Celsius 40 is equivalent to 104 Fahrenheit. This erroneous transposition is real “man-made global warming.”]
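The bracketed note is easy to verify with the standard conversion formulas (a minimal sketch; the 40-degree figures are the ones quoted above):

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32.0) * 5.0 / 9.0

def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9.0 / 5.0 + 32.0

print(round(f_to_c(40.0), 1))  # 40 F is about 4.4 C -- a cool day
print(c_to_f(40.0))            # 40 C is 104.0 F -- a very hot day
```

So a value of 40 recorded in Fahrenheit but entered as Celsius injects roughly 35 degrees Celsius of spurious warmth into the record.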

Almost no quality control checks have been done: outliers that are obvious mistakes have not been corrected. For instance, one town in Colombia spent three months in 1978 at an average daily temperature of over 80 degrees C (176 F). One town in Romania stepped straight out of summer in 1953 into a month of spring at minus 46°C. These are supposedly “average” temperatures for a full month at a time. St Kitts, a Caribbean island, was recorded at 0°C for a whole month – twice!

Temperatures for the entire Southern Hemisphere in 1850 and for the next three years are calculated from just one site in Indonesia and some random ships.

Sea surface temperatures represent 70% of the Earth’s surface, but some measurements come from ships logged at locations 100 km inland. Others were taken in harbors, which are hardly representative of the open ocean.

When a thermometer is relocated to a new site, the adjustment assumes that the old site was always built up and “heated” by concrete and buildings. In reality, the artificial warming probably crept in slowly. By correcting for buildings that likely didn’t exist in 1880, old records are artificially cooled. Adjustments for a few site changes can create a whole century of artificial warming trends.
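The mechanism described above can be illustrated with made-up numbers (these are not actual HadCRUT4 adjustments). Suppose a station with no real climate trend accumulated 1°C of urban heat contamination gradually over a century. A step adjustment that assumes the site was fully built up from the start fails to remove the spurious trend, while over-cooling the oldest records:

```python
import numpy as np

years = np.arange(1880, 1981)
urban_heat = np.linspace(0.0, 1.0, years.size)  # heat island grows slowly
measured = urban_heat                           # true climate trend is zero

# Step adjustment: assume the site was always fully built up,
# so subtract the final 1.0 degC offset from the entire record.
step_adjusted = measured - 1.0

# A correct adjustment would remove the time-varying contamination.
proper_adjusted = measured - urban_heat

step_trend = np.polyfit(years, step_adjusted, 1)[0] * 100    # degC/century
proper_trend = np.polyfit(years, proper_adjusted, 1)[0] * 100

print(f"step-adjusted trend:     {step_trend:.2f} degC/century")    # 1.00
print(f"properly adjusted trend: {proper_trend:.2f} degC/century")  # 0.00
```

The step-adjusted series keeps the full 1°C/century of artificial warming even though the true climate trend was zero, because the 1880 records were cooled by a full degree of urban heat that did not yet exist.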

Details of the worst outliers:

For April, June and July of 1978 Apto Uto, Colombia had an average monthly temperature of 81.5°C, 83.4°C and 83.4°C respectively. (178 to 182 Fahrenheit)

The monthly mean temperature in September 1953 at Paltinis, Romania is reported as -46.4 °C (in other years the September average was about 11.5°C).

At Golden Rock Airport, on the island of St Kitts in the Caribbean, mean monthly temperatures for December in 1981 and 1984 are reported as 0.0°C. But from 1971 to 1990 the average in all the other years was 26.0°C.
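None of these three values would survive even the crudest climatological sanity check. A minimal sketch (the 20°C tolerance and the ~28°C long-term average for Apto Uto are my illustrative assumptions, not figures from the audit; the other long-term averages are the ones quoted above):

```python
def flag_outlier(temp_c, station_mean_c, tolerance_c=20.0):
    """Flag a monthly mean that departs from the station's long-term
    average by more than tolerance_c degrees (illustrative threshold)."""
    return abs(temp_c - station_mean_c) > tolerance_c

# (reported monthly mean, long-term average) for the cases above
cases = [("Apto Uto, Colombia, Jun 1978",      83.4, 28.0),
         ("Paltinis, Romania, Sep 1953",      -46.4, 11.5),
         ("Golden Rock, St Kitts, Dec 1981",    0.0, 26.0)]

for name, temp, normal in cases:
    status = "FLAGGED" if flag_outlier(temp, normal) else "ok"
    print(f"{name}: {status}")  # all three print FLAGGED
```

Note that a simple absolute-range check would miss the St Kitts case (0°C is a plausible monthly mean somewhere on Earth); comparing each station against its own climatology catches all three.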

Bad data and bad modeling assumptions make IPCC temperature simulations diverge widely from reality. That’s why we should not believe the IPCC when they cry “wolf” and say it’s the end of the world unless we give them billions of dollars and get rid of fossil fuels.

The primary conclusion of the audit (as noted by Anthony Watts) is that the dataset shows exaggerated warming and that global averages are far less certain than have been claimed.

One implication of the audit is that climate models have been tuned to match incorrect data, which would render incorrect their predictions of future temperatures and estimates of the human influence on temperatures.

Another implication is that the proposal that the Paris Climate Agreement adopt 1850-1899 averages as “indicative” of pre-industrial temperatures is fatally flawed. During that period global coverage is low – it averages 30% across that time – and many land-based temperatures are very likely to be excessively adjusted and therefore incorrect.

 

Why is it that a PhD student working from home can find mistakes that the British Met Office, a £226 million institute with 2,100 employees, could not? Significantly, the Met Office, in a statement, said they do not disagree with any of his claims.

Maybe, as President Dwight D. Eisenhower said in his farewell address:

Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.

The prospect of domination of the nation’s scholars by Federal employment, project allocations, and the power of money is ever present – and is gravely to be regarded.

Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite.

See also:

Evidence that CO2 emissions do not intensify the greenhouse effect

The fake two degree political limit on global warming

Climate change in perspective – a tutorial for policy makers