
Fourth National Climate Assessment is junk science

The U.S. Global Change Research Program (USGCRP) has just released the final version of its Fourth National Climate Assessment report, one that many claimed the Trump administration would suppress because, like its predecessors, it is mainly a political document rather than a true scientific assessment. You can read the full 477-page report here: https://science2017.globalchange.gov/

The main conclusion is: “This assessment concludes, based on extensive evidence, that it is extremely likely that human activities, especially emissions of greenhouse gases, are the dominant cause of the observed warming since the mid-20th century.”

The “extensive evidence” is based entirely on climate modeling rather than on observations. The results produced by models diverge widely from reality. The new report makes the same claims and invokes the same junk science as the previous 2014 report which I analyzed here: National Climate Assessment Lacks Physical Evidence.

As an example of unfounded claims made in the new report we see this statement in the executive summary: “Heatwaves have become more frequent in the United States since the 1960s, while extreme cold temperatures and cold waves are less frequent.”

But plots from the EPA and NOAA show that the most intense heat waves occurred in the 1930s.


Another example:

Claim in the report: “The incidence of large forest fires in the western United States and Alaska has increased since the early 1980s and is projected to further increase in those regions as the climate changes, with profound changes to regional ecosystems.” This statement is technically correct but it represents cherry-picking and lying by omission.

The National Interagency Fire Center has a table listing the number of fires and acreage burned from 1960 through 2016 (see: https://www.nifc.gov/fireInfo/fireInfo_stats_totalFires.html ).

We see from the table that 18,229 fires were reported in 1983, rising to 67,743 fires reported in 2016. What the report doesn’t mention is that the numbers of fires reported from 1960 through 1982 were all in the six-figure range; for example, in 1960 there were 103,387 fires and in 1981 there were 249,370 fires. The number dropped to 174,755 fires in 1982.
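The omission is easy to check against the NIFC table itself. Here is a minimal sketch, assuming the table has been saved locally as a CSV; the file name "nifc_fires.csv" and the column names "year" and "fires" are hypothetical, not NIFC's format.

```python
# Minimal sketch (illustrative, not NIFC's or the report's method): compare average
# annual fire counts before and after 1983 using the NIFC totals table, assumed to
# be saved locally as "nifc_fires.csv" with columns "year" and "fires".
import csv
from statistics import mean

with open("nifc_fires.csv", newline="") as f:
    rows = [(int(r["year"]), int(r["fires"])) for r in csv.DictReader(f)]

early = [n for y, n in rows if 1960 <= y <= 1982]   # the omitted six-figure years
later = [n for y, n in rows if 1983 <= y <= 2016]   # the period cited by the report

print(f"1960-1982 mean fires per year: {mean(early):,.0f}")
print(f"1983-2016 mean fires per year: {mean(later):,.0f}")
```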

Fire frequency does not necessarily increase with warming. In many parts of the world, fire frequency decreases with warming. See my post “Wildfires And Warming – relationship not so clear.”

A third example of unfounded claims:

Section 2.6.1 of the report discusses the “greenhouse effect.” They claim: “As increasing GHG [greenhouse gases] concentrations warm the atmosphere, tropospheric water vapor concentrations increase, thereby amplifying the warming effect.” Climate models depend on this assumption. But NOAA’s own data show that global humidity has been decreasing with warming.

Comments by others:

Theoretical physicist Steve Koonin has an op-ed in the Wall Street Journal entitled “A Deceptive New Report on Climate.”

Koonin was undersecretary of energy for science during President Obama’s first term and is director of the Center for Urban Science and Progress at New York University. The WSJ article is pay-walled but you can read extensive excerpts here.

Among his comments:

One notable example of alarm-raising is the description of sea-level rise, one of the greatest climate concerns. The report ominously notes that while global sea level rose an average 0.05 inch a year during most of the 20th century, it has risen at about twice that rate since 1993. But it fails to mention that the rate fluctuated by comparable amounts several times during the 20th century. The same research papers the report cites show that recent rates are statistically indistinguishable from peak rates earlier in the 20th century, when human influences on the climate were much smaller. The report thus misleads by omission.

Note: The rate of sea level rise and fall tends to be cyclical on decadal and bi-decadal periods. See my article: The Sea Level Scam.

Koonin also comments on heat waves: The report’s executive summary declares that U.S. heat waves have become more common since the mid-1960s, although acknowledging the 1930s Dust Bowl as the peak period for extreme heat. Yet buried deep in the report is a figure [6.3] showing that heat waves are no more frequent today than in 1900.

Comments by Dr. Patrick J. Michaels, director of the Center for the Study of Science at the Cato Institute, past president of the American Association of State Climatologists, and former program chair for the Committee on Applied Climatology of the American Meteorological Society. He was a research professor of Environmental Sciences at University of Virginia for 30 years. Read full post “What You Won’t Find in the New National Climate Assessment.”

Under the U.S. Global Change Research Act of 1990, the federal government has been charged with producing large National Climate Assessments (NCA), and today the most recent iteration has arrived. It is typical of these sorts of documents–much about how the future of mankind is doomed to suffer through increasingly erratic weather and other tribulations. It’s also missing a few tidbits of information that convincingly argue that everything in it with regard to upcoming 21st century climate needs to be taken with a mountain of salt.

The projections in the NCA are all based upon climate models. If there is something big that is systematically wrong with them, then the projections aren’t worth making or believing.

The report does not tell you that:

1) Climate model predictions of global temperature diverge widely from observations.

2) No hot spot over tropics: The models predict that there should have been a huge “hot spot” over the entire tropics, which is a bit less than 40% of the globe’s surface. Halfway up through the atmosphere (by pressure), or at 500 hPa, the predicted warming is also twice what is being observed, and further up, the prediction is for seven times more warming than is being observed.

The importance of this is paramount. The vertical distribution of temperature in the tropics is central to the formation of precipitation.

Missing the tropical hot spot provokes an additional cascade of errors. A vast amount of the moisture that forms precipitation here originates in the tropics. Getting that wrong trashes the precipitation forecast, with additional downstream consequences, this time for temperature.

When the sun shines over a wet surface, the vast majority of its incoming energy is shunted towards the evaporation of water rather than direct heating of the surface. This is why in the hottest month in Manaus, Brazil, in the middle of the tropical rainforest and only three degrees from the equator, high temperatures average only 91 F (not appreciably different than humid Washington, DC’s 88 F). To appreciate the effect of water on surface heating of land areas, high temperatures in July in bone-dry Death Valley average 117 F.

Getting the surface temperature wrong will have additional consequences for vegetation and agriculture. In general, a wetter U.S. is one of bumper crops and good water supplies out west from winter snows, hardly the picture painted in the National Assessment.

If the government is going to spend time and our money on producing another assessment report, that report should be based on empirical evidence, not climate models. Note that USGCRP is a conglomeration of 13 federal agencies that had a 2016 budget of $2.6 billion for the climate assessment project. Did you get your money’s worth?

Climate modelers make some outlandish predictions, but occasionally there is a glimmer of honesty:

“The forcings that drive long-term climate change are not known with an accuracy sufficient to define future climate change.” — James Hansen, “Climate forcings in the Industrial era”, PNAS, Vol. 95, Issue 22, 12753-12758, October 27, 1998.

“In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the prediction of a specific future climate state is not possible.” — Final chapter, Draft TAR 2000 (Third Assessment Report), IPCC.

And remember: “The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.” – H. L. Mencken

One other point:

Temperatures recorded by the US Climate Reference Network (USCRN) show no statistically significant trend since the network was established in 2004. These data come from state-of-the-art, ultra-reliable, triple-redundant weather stations placed in pristine environments. As a result, they need none of the adjustments that plague the older surface temperature networks, such as USHCN and GHCN, which have been heavily adjusted in attempts to correct for a wide variety of biases. Using NOAA’s own USCRN data thus sidesteps the squabbles over the accuracy and adjustment of temperature records and gives a clean plot of pristine surface data.
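Readers who want to check the “no statistically significant trend” claim for themselves can fit a simple least-squares trend to the USCRN record. A minimal sketch under stated assumptions: it expects a locally saved CSV of national monthly anomalies with a column named "anomaly_c" (file and column names are hypothetical), and it ignores autocorrelation, which if anything makes it easier to find “significance.”

```python
# Minimal sketch, not NOAA's analysis: ordinary least-squares trend on USCRN national
# monthly temperature anomalies, assumed saved as "uscrn_anomalies.csv" with a column
# "anomaly_c" ordered by month. Serial correlation is ignored for simplicity.
import csv, math

with open("uscrn_anomalies.csv", newline="") as f:
    y = [float(r["anomaly_c"]) for r in csv.DictReader(f)]

n = len(y)
x = list(range(n))                      # time index in months
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
resid = [yi - (ybar + slope * (xi - xbar)) for xi, yi in zip(x, y)]
se = math.sqrt(sum(e * e for e in resid) / (n - 2) / sxx)

# A trend is conventionally called "significant" only if zero lies outside
# roughly slope +/- 2 standard errors; 120 converts per-month to per-decade.
print(f"trend: {slope * 120:+.3f} ± {2 * se * 120:.3f} °C per decade (~95% interval)")
```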


BACKGROUND:

By Ken Haapala, President, Science and Environmental Policy Project (SEPP)

USGCRP Science?
What is now called the USGCRP has a murky, politicized past. It was established in 1989 and mandated by Congress in 1990 to “assist the Nation and the world to understand, assess, predict, and respond to human-induced and natural processes of global change.” It is to produce a National Climate Assessment every four years. Since 1990, it has produced four reports. The last full report, the 3rd National Climate Assessment, was released in May 2014. Apparently, after the election of Mr. Trump, the USGCRP decided to release the CSSR (Climate Science Special Report) last week. As with prior USGCRP reports, it ignores the “natural processes of global change” that are part of its Congressional mandate.

Such political games are part of USGCRP’s established history. After the election of Mr. Bush in 2000, under a prior name, the USGCRP released the 2000 U.S. National Assessment of Climate Change report. As shown in the 2008 report of the Nongovernmental International Panel on Climate Change (NIPCC) (Fig. 16 and pp. 14-16), the government report had projections / predictions that were nonsense. The government entity used two different climate models to project climate change to 2090, and they produced dramatically different regional results for precipitation. The worst example was for the Red River watershed in the Dakotas and Minnesota: one model had precipitation dropping by about 80%, turning the region into a desert, while the second model had precipitation increasing by about 80%, resulting in dramatic flooding. The disparity between the two models is but one example of how inadequately tested global climate models may be used to project / predict almost anything. The federal courts found that the 2000 report did not meet the standards of the Data Quality Act, also called the Information Quality Act. The recent reports of the UN Intergovernmental Panel on Climate Change (IPCC) and the USGCRP have tried to cover up the disparities in the results of their global climate models by blending them into an ensemble. Usually, there are too few runs of any one model to establish realistic forecasts for that model; the forecasts change with each run.

Further, the major problem remains: the models are not adequately tested for use in forming government policies on global warming / climate change. As comments by Patrick Michaels carried in last week’s TWTW illustrate, the USGCRP ignores the large discrepancy between the atmospheric temperature trends forecast by global climate models and the atmospheric temperature trends actually observed. The USGCRP ignores physical science.

 

For background reading:

A Simple Question for Climate Alarmists

Evidence that CO2 emissions do not intensify the greenhouse effect

An examination of the relationship between temperature and carbon dioxide

Analysis of the National Climate Assessment Report 2014

Trump, the National Climate Assessment report, and fake news 2017


It’s time to dump the EPA “endangerment finding” which classified carbon dioxide as a pollutant

In 2009, the EPA ruled, under the Clean Air Act, that “the current and projected concentrations of the six key well-mixed greenhouse gases—carbon dioxide (CO2), methane (CH4), nitrous oxide (N2O), hydrofluorocarbons (HFCs), perfluorocarbons (PFCs), and sulfur hexafluoride (SF6)—in the atmosphere threaten the public health and welfare of current and future generations.” In essence, the EPA classified carbon dioxide as a pollutant even though carbon dioxide is necessary for life on Earth.

For some perspective, note that the current atmospheric concentration of carbon dioxide is about 400ppm (parts per million), while the air we exhale with every breath contains about 40,000ppm carbon dioxide. Is breathing causing air pollution?

This EPA ruling in effect allowed the EPA to regulate everything from automobile exhaust to power plants to refrigerators. In order to overturn the finding, one would have to successfully show that the underlying scientific basis is wrong – and it is. Another tactic would be to have Congress amend the Clean Air Act, something that is very unlikely in the current contentious Congress.

The EPA’s scientific basis is derived from climate models, predictions of which diverge widely from reality. See my ADI articles:

Evidence that CO2 emissions do not intensify the greenhouse effect

Failure of climate models shows that carbon dioxide does not drive global temperature

Additional reading on the “Endangerment Finding” if you want to get into the details:

 

The EPA CO2 endangerment finding endangers the USA by Dennis Avery.

“In science, if your theory doesn’t take account of all the relevant data, you need a new theory.” Avery shows how the climate models fail to explain observations and notes that thousands of new coal-fired power plants are being built around the world – even in Europe. Avery is a former U.S. State Department senior analyst and co-author with astrophysicist Fred Singer of Unstoppable Global Warming: Every 1,500 Years.

 

Why Revoking the EPA GHG Endangerment Finding Is the Most Urgent Climate Action Needed

by Alan Carlin. Carlin is a scientist and economist who worked for the RAND Corp. and the EPA.

“Revoking the EF is the only way to bring the climate alarmism scam to the untimely end it so richly deserves in the US and hopefully indirectly elsewhere. Until that happens the CIC [climate industrial complex] will continue to pursue its bad science through reports such as the National Climate Assessment with the recommended disastrous policies that would seriously damage the environment, impoverish the less wealthy, and bring economic disaster for our Nation by raising the prices and decreasing the availability and reliability of fossil fuel energy which is so central to our way of life and economy.”

 

In a separate post, Carlin also said that “EPA never engaged in a robust, meaningful discussion. Rather, there was a pro forma review after a decision had already been made which met many but not all of the legal requirements.” He lists “six crucial scientific issues that EPA did not actively discuss despite my best efforts to bring a few of them to their attention in early 2009.”

 

Dr. Pat Michaels on the ‘voluminous science that the USGCRP either ignored or slanted’ for the EPA endangerment finding

Patrick J. Michaels is the director of the Center for the Study of Science at the Cato Institute. Michaels is a past president of the American Association of State Climatologists and was program chair for the Committee on Applied Climatology of the American Meteorological Society. He was a research professor of Environmental Sciences at University of Virginia for 30 years. Michaels recounts his testimony before the EPA. USGCRP is U.S. Global Change Research Program.

 

60 scientists call for EPA endangerment finding to be reversed

“We the undersigned are individuals who have technical skills and knowledge relevant to climate science and the GHG Endangerment Finding. We each are convinced that the 2009 GHG Endangerment Finding is fundamentally flawed and that an honest, unbiased reconsideration is in order.”

Impact of Paris Climate Accord and why Trump was right to dump it

The much touted Paris Climate Accord aims at worldwide reduction of carbon dioxide emissions in order to keep global temperatures from rising more than 2°C above pre-industrial levels. This goal is purely arbitrary and based not upon any physical evidence, but upon the unproven assumption that carbon dioxide emissions play a significant role in global warming. What the Paris Accord really does is to transfer trillions of dollars from industrialized countries, mainly the US, to the sticky-fingered United Nations and to developing nations. It has a very minimal effect on global warming.

Several studies estimate the actual effects of the Accord. The most recent is from Bjorn Lomborg, published in the peer-reviewed journal, Global Policy (read full paper). Here is the paper abstract:

This article investigates the temperature reduction impact of major climate policy proposals implemented by 2030, using the standard MAGICC climate model [developed at the National Center for Atmospheric Research, Boulder, US, and University of Adelaide, Australia].

Even optimistically assuming that promised emission cuts are maintained throughout the century, the impacts are generally small.

The impact of the US Clean Power Plan (USCPP) is a reduction in temperature rise by 0.013°C by 2100.

The full US promise for the COP21 climate conference in Paris, its so-called Intended Nationally Determined Contribution (INDC) will reduce temperature rise by 0.031°C.

The EU 20-20 policy has an impact of 0.026°C, the EU INDC 0.053°C, and China INDC 0.048°C.

All climate policies by the US, China, the EU and the rest of the world, implemented from the early 2000s to 2030 and sustained through the century will likely reduce global temperature rise about 0.17°C in 2100.

The estimated cost of this scam:

REPORT: $12.7 Trillion Needed To Meet Paris Climate Accord’s Goal

by Michael Bastasch, Daily Caller

A whopping $7.4 trillion will be spent globally on new green energy facilities in the coming decades, but another $5.3 trillion is needed to meet the goals of the Paris climate accord, according to a new report.

Bloomberg New Energy Finance (BNEF) is out with a new long-term energy outlook report, this time projecting a total of $12.7 trillion to keep projected global warming below 2 degrees Celsius by the end of the century — a goal of the Paris accord. Read more

“The current focus on CO2 emissions reductions risks having a massively expensive global solution that is more damaging to societies than the problem of climate change.” – Dr. Judith Curry

But the Accord will harm poor people in developing countries:

While the plan’s costs may range as high as $1 trillion annually, none of it would have any meaningful impact on the roughly three billion people in the developing world who currently have no real access to energy.

Much of the developing world still burns dung as their chief means of cooking and heating. Realistically, the most effective means of saving their lives and improving living conditions would be to provide the steady electricity generation needed for water and sewage treatment as well as lighting and cooking.

The Paris Accord, in contrast, essentially ends any chance to help them. While natural gas and coal power plants could provide reliable, affordable electricity for these populations, the Accord aims to steadily reduce fossil fuel usage. Read more

Estimates of the Accord’s effectiveness in reducing global warming as stated above are based on analysis of surface temperatures. However, “For the past 38 years, satellites have continually tracked global temperatures. And what they’ve recorded in that time is a temperature increase averaging 0.136 degrees Celsius per decade. That means on its current trajectory the Earth could see a potential surface temperature increase of 1.36 degrees Celsius over the entire 21st century.

Noting the current warming trajectory, it appears that by simply doing nothing, the world could accomplish the main goal of the Accord.” (IBID.)
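Spelling out the arithmetic in that excerpt (a straight-line extrapolation of the quoted satellite trend, nothing more):

```latex
% Straight-line extrapolation of the quoted satellite-era trend over ten decades
% (an illustration of the excerpt's arithmetic, not a forecast):
\[
  0.136\ \tfrac{^{\circ}\mathrm{C}}{\text{decade}} \times 10\ \text{decades}
  \approx 1.36\ ^{\circ}\mathrm{C} \;<\; 2\ ^{\circ}\mathrm{C}\ \text{(the Paris Accord target)}.
\]
```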

See also:

Evidence that CO2 emissions do not intensify the greenhouse effect

Failure of climate models shows that carbon dioxide does not drive global temperature

An examination of the relationship between temperature and carbon dioxide

 

An examination of the relationship between temperature and carbon dioxide

Natural variation trumps CO2

Many climate scientists claim that our carbon dioxide emissions are the principal driver of global warming. I have asked several University of Arizona professors, who make such a claim, to provide supporting physical evidence. So far, none have been able to justify the claim with physical evidence.

In this article, we will examine the Earth’s temperature and the carbon dioxide (CO2) content of the atmosphere at several time scales to see if there is any relationship. I stipulate that the greenhouse effect does exist. I maintain, however, that the ability of CO2 emissions to cause global warming is tiny and overwhelmed by natural forces. The main effect of our “greenhouse” is to slow cooling.

There is an axiom in science which says: “correlation does not prove causation.” Correlation, however, is very suggestive of a relationship. Conversely, lack of correlation proves that there is no cause-and-effect relationship.

Phanerozoic time – the past 500 million years:


Estimates of global temperature and atmospheric CO2 content based on geological and isotope evidence show little correlation between the two. Earth experienced a major ice age in the Ordovician Period when atmospheric CO2 was 4,000ppm, 10 times higher than now. Temperatures during the Cretaceous Period were rising and steamy, but atmospheric CO2 was declining.

Notice also that, for most of that time, Earth’s temperature was much warmer than now and life flourished. There were some major extinction periods, all associated with ice ages.

Sources:

Berner, R.A. and Kothavala, Z., 2001. GEOCARB III: A revised model of atmospheric CO2 over Phanerozoic time. American Journal of Science 301: 182-204.

Scotese, C.R., http://www.geocraft.com/WVFossils/Carboniferous_climate.html

Our current ice age – the past 420,000 years:

During the latter part of our current ice age, glacial-interglacial cycles occurred with a periodicity of about 100,000 years, which correlates with changes in Earth’s orbit around the sun as it shifts from nearly circular to elliptical with an eccentricity of about 9%. Here we see an apparent correlation between temperature and CO2. The data are from ice cores collected at the Vostok station in Antarctica. The scientists working on the Vostok core noticed that temperature changes PRECEDED changes in CO2 concentration by about 800 years. Again, we see that CO2 doesn’t have much influence on temperature, but temperature has great influence on CO2 concentration because temperature controls CO2 solubility in the ocean.
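A lead-lag relationship like the roughly 800-year offset seen in the Vostok core is typically estimated with a lagged cross-correlation. A minimal sketch, assuming the temperature and CO2 records have already been interpolated onto a common 100-year time step (the function and variable names are hypothetical):

```python
# Minimal sketch (illustrative, not the Vostok teams' analysis): find the lag at
# which two evenly sampled series correlate best. A positive result means the CO2
# changes trail the temperature changes.
def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def best_lag(temp, co2, step_years=100, max_lag_steps=20):
    scores = {}
    for lag in range(-max_lag_steps, max_lag_steps + 1):
        if lag >= 0:                       # pair temp(t) with co2(t + lag steps)
            t, c = temp[:len(temp) - lag], co2[lag:]
        else:                              # pair temp(t) with co2(t - |lag| steps)
            t, c = temp[-lag:], co2[:len(co2) + lag]
        scores[lag * step_years] = pearson(t, c)
    return max(scores, key=scores.get)

# Example: best_lag(vostok_temp, vostok_co2) returning about +800 would indicate
# that CO2 lags temperature by roughly 800 years in these records.
```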

Sources:

Petit, J.R., et al., 1999. Climate and atmospheric history of the past 420,000 years from the Vostok ice core, Antarctica. Nature 399: 429-436.

Mudelsee, M., 2001. The phase relations among atmospheric CO2 content, temperature and global ice volume over the past 420 ka. Quaternary Science Reviews 20: 583-589.

Siegenthaler, U., et al., 2005. Stable carbon cycle-climate relationship during the late Pleistocene. Science 310: 1313-1317.

The Holocene – the past 10,000 years:

The Holocene represents the current interglacial period. For most of the past 10,000 years, temperature was higher than now. CO2 was fairly steady below 300ppm (versus over 400ppm now). There were cycles of warm and cool periods with a periodicity of 1,200 to 1,500 years. This periodicity correlates with the interplay of several solar cycles. The sun itself goes through cycles of solar intensity and magnetic flux. When the cycles are in a strong phase, fewer cosmic rays enter the atmosphere, fewer clouds form to block the sun, and it is warmer. When the solar cycles wane, as is beginning to happen now, more cosmic rays enter the atmosphere and produce more clouds that block the sun, so it becomes cooler. The number of sunspots (and hence magnetic flux) varies on an average cycle of 11 years. There are also 87-year (Gleissberg) and 210-year (de Vries-Suess) cycles in the amplitude of the 11-year sunspot cycle, which combine to form an approximately 1,500-year cycle of warming and cooling, as the arithmetic below illustrates.
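The arithmetic usually offered for the ~1,500-year figure is a near-commensurability of the two longer cycles; this is an illustration of that reasoning, not a solar model:

```latex
% Near-commensurability of the two longer amplitude cycles (an arithmetic
% illustration of the reasoning above, not a solar model):
\[
  7 \times 210\ \text{yr} = 1470\ \text{yr},
  \qquad
  17 \times 87\ \text{yr} = 1479\ \text{yr},
\]
% so the Gleissberg and de Vries-Suess modulations come back into near-alignment
% roughly every 1,470-1,480 years, which is the basis for the quoted ~1,500-year
% warm/cool cycle.
```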

The 20th Century:


The first part of the 20th Century experienced warming in the 1920s and 1930s comparable to current temperatures. According to NASA, atmospheric CO2 rose from 295ppm in 1900 to 311ppm in 1940. Major emissions from burning fossil fuels, however, commenced after WWII in the mid-1940s. The period 1940-1970 saw CO2 rise from 311ppm to 325ppm, yet that period showed global cooling to such an extent that climate scientists were predicting a return to glacial conditions. From about 1980 to 2000, CO2 rose from 339ppm to 370ppm, and we had warming during that period until the super El Nino of 1997/1998. Some of these data have been “corrected” by NOAA.

Source: NOAA Climate at a glance

The 21st Century so far 

Microwave data from satellites converted to temperature.

Between the El Nino of 1997 and that of 2016, there have been temperature fluctuations but no net warming. Atmospheric CO2 rose from 363ppm to 407ppm today. It seems that there is no correlation between global temperature and CO2.

As I said at the beginning, while the CO2-induced greenhouse effect has some hypothetical warming potential, that warming is tiny and overwhelmed by the forces of natural variation. So far, I have seen no physical evidence to contradict my contention.

Source: http://www.drroyspencer.com/2017/05/uah-global-temperature-update-for-april-2017-0-27-deg-c/

See also: Evidence that CO2 emissions do not intensify the greenhouse effect

The Oman geoengineering scheme to save the planet

A story in the Arizona Daily Star, 4-14-17 (the great march for science issue) shows how some scientists create the most tenuous links between their research and climate change as a plea for funding.

This story is “Oman’s mountains may hold clues for reversing climate change.” (Link) The lede: “Deep in the jagged red mountains of Oman, geologists are searching for an efficient and cheap way to remove carbon dioxide from the air and oceans — and perhaps begin to reverse climate change. They are coring samples from one of the world’s only exposed sections of the Earth’s mantle to uncover how a spontaneous natural process millions of years ago transformed carbon dioxide into limestone and marble.”

The researchers are excited because the exposed mantle rock is mostly peridotite, a coarse-grained igneous rock made up of the minerals olivine and pyroxene, both magnesium silicates. “They hope to answer the question of how the rocks managed to capture so much carbon over the course of 90 million years — and to see if there’s a way to speed up the timetable.” A researcher goes on to say, “Every single magnesium atom in these rocks has made friends with the carbon dioxide to form solid limestone, magnesium carbonate, plus quartz.”

A couple of nitpicks: Limestone is calcium carbonate, not magnesium carbonate (calcium and magnesium together with carbonate form a rock called dolomite). Marble is a metamorphic rock, which requires heat and/or pressure to form. That “spontaneous natural process” did not happen only millions of years ago; it continues today in the ocean, where calcium ions derived from the weathering of surface rocks combine with carbonate ions. Basaltic ocean crustal rocks also act as a buffer, continuously removing CO2 from the ocean by combining carbonate with calcium derived from surface weathering of rocks.
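For readers who want the chemistry behind both the Oman rocks and the ocean buffering described above, the idealized net reactions can be written as follows (a simplified illustration; the natural process runs through dissolved bicarbonate intermediates):

```latex
% Idealized net reactions (simplified; real weathering proceeds through dissolved
% bicarbonate intermediates):
\begin{align*}
  \underbrace{\mathrm{Mg_2SiO_4}}_{\text{olivine}} + 2\,\mathrm{CO_2}
    &\longrightarrow 2\,\underbrace{\mathrm{MgCO_3}}_{\text{magnesite}}
      + \underbrace{\mathrm{SiO_2}}_{\text{quartz}}
    && \text{(peridotite carbonation, as in Oman)}\\[4pt]
  \underbrace{\mathrm{CaSiO_3}}_{\text{Ca-silicate rock}} + \mathrm{CO_2}
    &\longrightarrow \underbrace{\mathrm{CaCO_3}}_{\text{limestone}} + \mathrm{SiO_2}
    && \text{(silicate weathering and marine carbonate burial)}
\end{align*}
```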

Their great scheme is this: “a drilling operation could cycle carbon-rich water into the newly formed seabed on oceanic ridges far below the surface. Just like in Oman’s mountains, the submerged rock would chemically absorb carbon from the water. The water could then be cycled back to the surface to absorb more carbon from the atmosphere, in a sort of conveyor belt.”

Perhaps the researchers made the climate change link to their research just to suck up grant money so they can continue studying. The geology is interesting, but their idea sounds like another crazy, expensive, and totally unnecessary geoengineering scheme. (See Wacky Geoengineering Schemes to Control Climate)

See also:

Evidence that CO2 emissions do not intensify the greenhouse effect

Carbon dioxide is necessary for life on Earth

Earth’s climate has been changing for at least four billion years in cycles large and small. Few in the climate debate understand those changes and their causes. Many are fixated on carbon dioxide (CO2), a minor constituent of the atmosphere, but one absolutely necessary for life as we know it. Perhaps this fixation derives from ulterior political motives for controlling the global economy. For others, the true believers, perhaps this fixation derives from ignorance.

Greenpeace co-founder Dr. Patrick Moore has written an excellent summary of the history of carbon dioxide on Earth titled, “The Positive Impact of Human CO2 Emissions On the Survival of Life on Earth.” In this 24-page paper, Moore notes that we came dangerously close to losing plant life on Earth about 18,000 years ago, when CO2 levels approached 150 ppm, below which plant life can’t sustain photosynthesis. Currently, atmospheric CO2 stands at about 400 ppm which is about one-third the level for optimum plant growth.

[Figure: Phanerozoic temperature and CO2, from Moore’s paper]

Here is the executive summary of Moore’s paper:

This study looks at the positive environmental effects of carbon dioxide (CO2) emissions, a topic which has been well established in the scientific literature but which is far too often ignored in the current discussions about climate change policy. All life is carbon based and the primary source of this carbon is the CO2 in the global atmosphere. As recently as 18,000 years ago, at the height of the most recent major glaciation, CO2 dipped to its lowest level in recorded history at 180 ppm, low enough to stunt plant growth.

This is only 30 ppm above a level that would result in the death of plants due to CO2 starvation. It is calculated that if the decline in CO2 levels were to continue at the same rate as it has over the past 140 million years, life on Earth would begin to die as soon as two million years from now and would slowly perish almost entirely as carbon continued to be lost to the deep ocean sediments. The combustion of fossil fuels for energy to power human civilization has reversed the downward trend in CO2 and promises to bring it back to levels that are likely to foster a considerable increase in the growth rate and biomass of plants, including food crops and trees. Human emissions of CO2 have restored a balance to the global carbon cycle, thereby ensuring the long-term continuation of life on Earth.

Moore presents a concise history of CO2 beginning in the Cambrian Period 540 million years ago when CO2 was about 7,000 ppm. He follows that with a discussion of how carbon is distributed today between the atmosphere, oceans, plant life, and rocks.

In his concluding remarks, Moore briefly discusses the politics of CO2 and notes: “Lost in all these machinations is the indisputable fact that the most important thing about CO2 is that it is essential for all life on Earth and that before humans began to burn fossil fuels, the atmospheric concentration of CO2 was heading in a very dangerous direction for a very long time.”

There is no physical evidence that our CO2 emissions from burning fossil fuels will produce catastrophic climate change. Moore asks, “Given that the optimum CO2 level for plant growth is above 1,000 ppm and that CO2 has been above that level for most of the history of life, what sense does it make to call for a reduction in the level of CO2 in the absence of evidence of catastrophic climate change?”

You can read Moore’s full paper here.

Also of interest is a 40-page paper published by the Canadian Friends of Science Society entitled “A Confluence of Carbonbaggers.” – great title. Here is part of the executive summary:

Given that in most industries, margins of error are required to be small, this paper reviews the substantial failings of the IPCC in everything from the original premise that human activity producing greenhouse gases was the primary cause of recent warming, to vast statistical errors in climate models that are 500 and 600% off trend.

Few average citizens have any knowledge of these goings on – despite the fact that billions of their tax dollars are vanishing every year on causes proclaimed by this august organization and echoed around the world by enthusiastic Environmental Non-Governmental Organizations (ENGOs) who find climate catastrophe predictions the easiest way to raise money to ‘save the planet’ – all the while demonizing traditional industries that have provided jobs, energy and resources that have created our modern, industrialized world.

This report is a compilation of errors, false and wildly exaggerated predictions, and the IPCC’s claim that it in fact ‘makes no recommendation of any kind on any topic’ – effectively washing its hands of responsibility for the damage its reports and meetings have done in the Western world. National economies have been ruined, investment markets distorted, industries devastated, thousands have died prematurely due to sharp rise in power prices across the UK and Europe as the poor and middle class have been pushed into ‘heat-or-eat’ poverty – and yet governments still persist in designing faulty climate and economic policies based on flawed documents from the IPCC – cited as ‘the authority on climate change.’

See also:

Evidence that CO2 emissions do not intensify the greenhouse effect

A Modest Proposal: Triple Your Carbon Footprint

Patrick Moore asks – Should We Celebrate Carbon Dioxide?

At the annual meeting of the Global Warming Policy Foundation, Dr. Patrick Moore, a founder of Greenpeace, put carbon dioxide in perspective and showed why policies and efforts to reduce atmospheric carbon dioxide are not only futile, but also dangerous to life on Earth. You can read the full transcript of the speech here. Dr. Moore left Greenpeace when that organization lost its strong humanitarian orientation and became radicalized with the belief that humans are the enemies of the earth.

Dr. Moore believes that the current concentration of carbon dioxide in the atmosphere, 400ppm, is dangerously low and much lower than it has been for most of Earth’s history. He says we and the planet would be better off with a concentration closer to 2,000ppm.

Here are some excerpts from his speech:

Let’s begin with our knowledge of the long-term history of the Earth’s temperature and of CO2 in the Earth’s atmosphere. I will focus on the past 540 million years since modern life forms evolved. It is glaringly obvious that temperature and CO2 are in an inverse correlation at least as often as they are in any semblance of correlation. Two clear examples of reverse correlation occurred 150 million years and 50 million years ago. At the end of the Jurassic temperature fell dramatically while CO2 spiked. During the Eocene Thermal Maximum, temperature was likely higher than any time in the past 550 million years while CO2 had been on a downward track for 100 million years. This evidence alone is sufficient to warrant deep speculation of any claimed lock-step causal relationship between CO2 and temperature.

The Devonian Period beginning 400 million years ago marked the culmination of the invasion of life onto the land. Plants evolved to produce lignin, which in combination with cellulose, created wood which in turn for the first time allowed plants to grow tall, in competition with each other for sunlight. As vast forests spread across the land living biomass increased by orders of magnitude, pulling down carbon as CO2 from the atmosphere to make wood. Lignin is very difficult to break down and no decomposer species possessed the enzymes to digest it. Trees died atop one another until they were 100 metres or more in depth. This was the making of the great coal beds around the world as this huge store of sequestered carbon continued to build for 90 million years. Then, fortunately for the future of life, white rot fungi evolved to produce the enzymes that can digest lignin and coincident with that the coal-making era came to an end.

A well-documented record of global temperature over the past 65 million years shows that we have been in a major cooling period since the Eocene Thermal Maximum 50 million years ago. The Earth was an average 16C warmer then, with most of the increased warmth at the higher latitudes. The entire planet, including the Arctic and Antarctica were ice-free and the land there was covered in forest.

The ancestors of every species on Earth today survived through what may have been the warmest time in the history of life. It makes one wonder about dire predictions that even a 2C rise in temperature from pre-industrial times would cause mass extinctions and the destruction of civilization. Glaciers began to form in Antarctica 30 million years ago and in the northern hemisphere 3 million years ago. Today, even in this interglacial period of the Pleistocene Ice Age, we are experiencing one of the coldest climates in the Earth’s history.

Coming closer to the present we have learned from Antarctic ice cores that for the past 800,000 years there have been regular periods of major glaciation followed by interglacial periods in 100,000 year-cycles. These cycles coincide with the Milankovitch cycles that are tied to the eccentricity of the Earth’s orbit and its axial tilt. It is highly plausible that these cycles are related to solar intensity and the seasonal distribution of solar heat on the Earth’s surface. There is a strong correlation between temperature and the level of atmospheric CO2 during these successive glaciations, indicating a possible cause-effect relationship between the two. CO2 lags temperature by an average of 800 years during the most recent 400,000-year period, indicating that temperature is the cause, as the cause never comes after the effect.

Looking at the past 50,000 years of temperature and CO2 we can see that changes in CO2 follow changes in temperature. This is as one could expect, as the Milankovitch cycles are far more likely to cause a change in temperature than a change in CO2. And a change in the temperature is far more likely to cause a change in CO2 due to out-gassing of CO2 from the oceans during warmer times and an in-gassing (absorption) of CO2 during colder periods. Yet climate alarmists persist in insisting that CO2 is causing the change in temperature, despite the illogical nature of that assertion.

Coming back to the relationship between temperature and CO2 in the modern era we can see that temperature has risen at a steady slow rate in Central England since 1700 while human CO2 emissions were not relevant until 1850 and then began an exponential rise after 1950. This is not indicative of a direct causal relationship between the two. After freezing over regularly during the Little Ice Age the River Thames froze for the last time in 1814, as the Earth moved into what might be called the Modern Warm Period.

There was a 30-year period of warming from 1910-1940, then a cooling from 1940 to 1970, just as CO2 emissions began to rise exponentially, and then a 30-year warming from 1970-2000 that was very similar in duration and temperature rise to the rise from 1910-1940. One may then ask “what caused the increase in temperature from 1910-1940 if it was not human emissions? And if it was natural factors how do we know that the same natural factors were not responsible for the rise between 1970-2000.” You don’t need to go back millions of years to find the logical fallacy in the IPCC’s certainty that we are the villains in the piece.

Coming to the core of my presentation, CO2 is the currency of life and the most important building block for all life on Earth. All life is carbon-based, including our own. Surely the carbon cycle and its central role in the creation of life should be taught to our children rather than the demonization of CO2, that “carbon” is a “pollutant” that threatens the continuation of life. We know for a fact that CO2 is essential for life and that it must be at a certain level in the atmosphere for the survival of plants, which are the primary food for all the other species alive today. Should we not encourage our citizens, students, teachers, politicians, scientists, and other leaders to celebrate CO2 as the giver of life that it is?

It is a proven fact that plants, including trees and all our food crops, are capable of growing much faster at higher levels of CO2 than present in the atmosphere today. Even at today’s concentration of 400 ppm plants are relatively starved for nutrition. The optimum level of CO2 for plant growth is about 5 times higher, 2000 ppm, yet the alarmists warn it is already too high. They must be challenged every day by every person who knows the truth in this matter. CO2 is the giver of life and we should celebrate CO2 rather than denigrate it as is the fashion today.

Let’s look at where all the carbon is in the world, and how it is moving around.

Today, at just over 400 ppm CO2 there are 850 billion tons of CO2 in the atmosphere. By comparison, when modern life-forms evolved over 500 million years ago there was nearly 15,000 billion tons of CO2 in the atmosphere, 17 times today’s level. Plants and soils combined contain more than 2,000 billion tons of carbon, more than twice as much as the entire global atmosphere. The oceans contain 38,000 billion tons of dissolved CO2, 45 times as much as in the atmosphere. Fossil fuels, which were made from plants that pulled CO2 from the atmosphere account for 5,000 – 10,000 billion tons of carbon, 6 – 12 times as much carbon as is in the atmosphere.

But the truly stunning number is the amount of carbon that has been sequestered from the atmosphere and turned into carbonaceous rocks. 100,000,000 billion tons, that’s one quadrillion tons of carbon, have been turned into stone by marine species that learned to make armour-plating for themselves by combining calcium and carbon into calcium carbonate. Limestone, chalk, and marble are all of life origin and amount to 99.9% of all the carbon ever present in the global atmosphere. The white cliffs of Dover are made of the calcium carbonate skeletons of coccolithophores, tiny marine phytoplankton.

The vast majority of the carbon dioxide that originated in the atmosphere has been sequestered and stored quite permanently in carbonaceous rocks where it cannot be used as food by plants.

Beginning 540 million years ago at the beginning of the Cambrian Period many marine species of invertebrates evolved the ability to control calcification and to build armour plating to protect their soft bodies. Shellfish such as clams and snails, corals, coccolithophores (phytoplankton) and foraminifera (zooplankton) began to combine carbon dioxide with calcium and thus to remove carbon from the life cycle as the shells sank into sediments; 100,000,000 billion tons of carbonaceous sediment. It is ironic that life itself, by devising a protective suit of armour, determined its own eventual demise by continuously removing CO2 from the atmosphere. This is carbon sequestration and storage writ large. These are the carbonaceous sediments that form the shale deposits from which we are fracking gas and oil today. And I add my support to those who say, “OK UK, get fracking”.

The past 150 million years has seen a steady drawing down of CO2 from the atmosphere. There are many components to this but what matters is the net effect, a removal on average of 37,000 tons of carbon from the atmosphere every year for 150 million years. The amount of CO2 in the atmosphere was reduced by about 90% during this period. This means that volcanic emissions of CO2 have been outweighed by the loss of carbon to calcium carbonate sediments on a multi-million year basis.

If this trend continues CO2 will inevitably fall to levels that threaten the survival of plants, which require a minimum of 150 ppm to survive. If plants die all the animals, insects, and other invertebrates that depend on plants for their survival will also die.

How long will it be at the present level of CO2 depletion until most or all of life on Earth is threatened with extinction by lack of CO2 in the atmosphere?

During this Pleistocene Ice Age, CO2 tends to reach a minimum level when the successive glaciations reach their peak. During the last glaciation, which peaked 18,000 years ago, CO2 bottomed out at 180 ppm, extremely likely the lowest level CO2 has been in the history of the Earth. This is only 30 ppm above the level that plants begin to die. Paleontological research has demonstrated that even at 180 ppm there was a severe restriction of growth as plants began to starve. With the onset of the warmer interglacial period CO2 rebounded to 280 ppm. But even today, with human emissions causing CO2 to reach 400 ppm plants are still restricted in their growth rate, which would be much higher if CO2 were at 1000-2000 ppm.

Here is the shocking news. If humans had not begun to unlock some of the carbon stored as fossil fuels, all of which had been in the atmosphere as CO2 before sequestration by plants and animals, life on Earth would have soon been starved of this essential nutrient and would begin to die. Given the present trends of glaciations and interglacial periods this would likely have occurred less than 2 million years from today, a blink in nature’s eye, 0.05% of the 3.5 billion-year history of life.

There is much more in Moore’s speech; follow the link above to read the whole thing.

For more information on climate change, see the Climate in Perspective page to link to my 28-page essay on climate myths and reality.

EPA emission standards for trucks: heavy cost, no benefit

At a cost of only $30 billion, new EPA regulations may save us from 0.0026°C of global warming by the year 2100.

According to an EPA report (971 pages) [link]: “The Environmental Protection Agency (EPA) and the National Highway Traffic Safety Administration (NHTSA), on behalf of the Department of Transportation, are each proposing changes to our comprehensive Heavy-Duty National Program that would further reduce greenhouse gas emissions (GHG) and increase fuel efficiency for on-road heavy-duty vehicles,…”

The National Center for Policy Analysis estimates that “The Environmental Protection Agency’s second round of heavy-duty truck efficiency standards could cost more than $30 billion.” – costs that will be passed on to consumers.

“Auto manufacturers and the freight and long-haul transportation industry already understand the importance of fuel efficiency. Nearly 3 million heavy-duty Class 8 trucks carry approximately 70 percent of America’s freight, consuming more than 50 billion gallons in fuel and spending more than $140 billion in diesel costs. The industry operates on razor-thin margins and plans its driving routes down to the tenth of a mile to save on fuel costs.” – Nicolas Loris, The Daily Signal

The Obama administration says these new regulations are necessary to meet Obama’s carbon dioxide reduction goals.

The EPA claims “The proposed standards are expected to lower CO2 emissions by approximately 1 billion metric tons…”

So, what benefit will we get for $30 billion? EPA’s own figures show no benefit to the environment and no effect on global climate.

According to the EPA report linked above, the new regulations will accomplish the following (page 6-45): “As a result of the proposal’s emissions reductions from the proposed alternative relative to the baseline case, by 2100 the concentration of atmospheric CO2 is projected to be reduced by approximately 1.1 to 1.2 parts per million by volume (ppmv), the global mean temperature is projected to be reduced by approximately 0.0026 to 0.0065°C, and global mean sea level rise is projected to be reduced by approximately 0.023 to 0.057 cm.” Wow!
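Dividing the cost estimate by the EPA’s own temperature figures gives a rough sense of the price per degree of avoided warming. This is a back-of-the-envelope illustration using only the numbers quoted above, not an official cost-benefit analysis:

```latex
% Back-of-the-envelope cost per degree of avoided warming, using the NCPA cost
% estimate and EPA's own 0.0026-0.0065 C range quoted above (illustrative only):
\[
  \frac{\$30\ \text{billion}}{0.0065\ ^{\circ}\mathrm{C}} \approx \$4.6\ \text{trillion per }^{\circ}\mathrm{C}
  \qquad\text{to}\qquad
  \frac{\$30\ \text{billion}}{0.0026\ ^{\circ}\mathrm{C}} \approx \$11.5\ \text{trillion per }^{\circ}\mathrm{C}.
\]
```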

Among the first things the next President should do is to issue an executive order forbidding federal agencies from regulating carbon dioxide emissions and rescind all regulations that do so.

The U.S. Supreme Court recently ruled against the EPA on certain power plant emissions (see here). It remains to be seen whether the principles cited in that case will be extended to motor vehicles.

See more articles on EPA stupidity:

EPA versus Arizona on regional haze issue

EPA war on coal threatens Tucson water supply

EPA fuel standards costly and ineffective

EPA targets wrong cause of haze in Grand Canyon

Impact of new EPA ozone rule

EPA experiments on humans debunk their ozone and particulate matter health claims

EPA conducted illegal and potentially lethal experiments on children

The EPA is destroying America

EPA Clean Power Plan is Junk Science

Replace EPA

But the EPA did get one right: EPA says fracking does not harm drinking water supply

END

Berkeley scientists claim to have directly measured carbon dioxide warming the Earth – So what?

The scientific press is hyping research by Lawrence Berkeley National Laboratory that claims to have, for the first time, measured radiative forcing by carbon dioxide. Even if they have, it’s no big deal because such forcing is assumed from basic physics. They also claim that their measurement provides proof of anthropogenic global warming. But, as we shall see, they may be putting effect before cause.

Seth Borenstein, the Associated Press’ chief climate alarmist, writes “Scientists have witnessed carbon dioxide trapping heat in the atmosphere above the United States, chronicling human-made climate change in action.”

The Berkeley press release is titled “First direct observation of carbon dioxide’s increasing greenhouse effect at the Earth’s surface.” The paper, published in Nature, is somewhat more modest in its claim: “Observational determination of surface radiative forcing by CO2 from 2000 to 2010.”

Within the press release, one of the scientists is quoted as saying, “We see, for the first time in the field, the amplification of the greenhouse effect because there’s more CO2 in the atmosphere to absorb what the Earth emits in response to incoming solar radiation.”

First, some background on what was done.

The scientists measured down-welling infrared radiation using Atmospheric Emitted Radiance Interferometer spectra from two stations located in Oklahoma and Alaska over the period from 2000 to 2010 during which atmospheric carbon dioxide increased by 22 parts per million. They claim to have 3300 measurements from Alaska and 8300 measurements from Oklahoma. Keep those numbers in mind.

They found that the down-welling radiation increased during that period and attribute that increase to the rise of carbon dioxide.

Some possible problems:

I wonder why this specific time period was chosen. In 2000 a strong La Nina produced a relatively cool tropospheric temperature, while in 2010 a strong El Nino produced a relatively warm tropospheric temperature. The difference was about half a degree Centigrade. A warmer atmosphere will intrinsically produce more down-welling infrared radiation regardless of its composition. So, was the increased down-welling radiation due to increased CO2 or increased temperature? I think they may be confusing cause and effect.
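A crude blackbody estimate shows why a half-degree of atmospheric warming matters here. This is an illustration under simple assumptions, treating the radiating layer as a blackbody near 280 K; it does not reproduce the paper’s spectral analysis, but it indicates the size of the temperature effect in question:

```latex
% Sensitivity of thermal emission to temperature via the Stefan-Boltzmann law
% F = \sigma T^4, evaluated near T = 280 K for \Delta T = 0.5 K (a rough blackbody
% illustration, not the paper's spectral attribution):
\[
  \Delta F \approx 4\sigma T^{3}\,\Delta T
  = 4 \times 5.67\times 10^{-8}\ \tfrac{\mathrm{W}}{\mathrm{m^{2}\,K^{4}}}
    \times (280\ \mathrm{K})^{3} \times 0.5\ \mathrm{K}
  \approx 2.5\ \tfrac{\mathrm{W}}{\mathrm{m^{2}}},
\]
% an order of magnitude larger than the 0.2 W/m^2-per-decade change attributed to CO2.
```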

Also curious is that another study (See Evidence that CO2 emissions do not intensify the greenhouse effect ), using the same type of instruments, made 800,000 measurements during the period 1996 to 2010 and found a significant decrease in down-welling infrared radiation.

The Berkeley researchers claim that only about 10 percent of their increased down-welling radiation came from carbon dioxide. As far as I can tell, however, the emissions from carbon dioxide fall within the spectra emitted by water vapor, but the researchers claim a mathematical manipulation allows them to distinguish the 10 percent of radiation from carbon dioxide versus the 90 percent from water vapor and other gases in the atmosphere.

The Berkeley researchers claim to have found a radiation increase of “0.2 Watts per square meter per decade.” How much is that? German physical chemist Dr. Siegfried Dittrich notes:

“The number for the increase in CO2-dependent back radiation given by Nature of 0.2 watt/m2 per decade is indeed in reality nothing more than a trifle. Why would the earth be shocked when 1367 watts per square meter strikes the surface at noon along the equator? The ever-changing deviations from this so-called solar constant mean value are in fact considerably greater than the above given 0.2 watts/m2.” An additional complication is that the Berkeley researchers’ measurements were made only for cloud-free areas.

Another paper in Geophysical Research Letters: “On the Incident Solar Radiation in CMIP5 Models” finds sampling errors much larger than the 0.2 Watts per square meter that the Berkeley researchers claim to have measured. Here is the paper abstract:

“Annual incident solar radiation at the top of atmosphere (TOA) should be independent of longitudes. However, in many Coupled Model Intercomparison Project phase 5 (CMIP5) models, we find that the incident radiation exhibited zonal oscillations, with up to 30 W/m2 of spurious variations. This feature can affect the interpretation of regional climate and diurnal variation of CMIP5 results. This oscillation is also found in the Community Earth System Model (CESM). We show that this feature is caused by temporal sampling errors in the calculation of the solar zenith angle. The sampling error can cause zonal oscillations of surface clear-sky net shortwave radiation of about 3 W/m2 when an hourly radiation time step is used, and 24 W/m2 when a 3-hour radiation time step is used.”

The alleged measurement could easily be instrument error.

There is still a question about whether they saw what they claimed to have seen. This may be an example of confirmation bias: they saw what they wanted to see based on equivocal evidence.

But in the end, to paraphrase a prominent politician: what difference does it make at this point in time?

END