
Earth Hour: A Dissent

Reblogged from WUWT


by Ross McKitrick


Ross McKitrick, Professor of Economics, University of Guelph, Canada. (Photo credit: Wikipedia)


In 2009 I was asked by a journalist for my thoughts on the importance of Earth Hour.

Here is my response.

I abhor Earth Hour. Abundant, cheap electricity has been the greatest source of human liberation in the 20th century. Every material social advance in the 20th century depended on the proliferation of inexpensive and reliable electricity.

Giving women the freedom to work outside the home depended on the availability of electrical appliances that free up time from domestic chores. Getting children out of menial labour and into schools depended on the same thing, as well as the ability to provide safe indoor lighting for reading.

Development and provision of modern health care without electricity is absolutely impossible. The expansion of our food supply, and the promotion of hygiene and nutrition, depended on being able to irrigate fields, cook and refrigerate foods, and have a steady indoor supply of hot water.

Many of the world’s poor suffer brutal environmental conditions in their own homes because of the necessity of cooking over indoor fires that burn twigs and dung. This causes local deforestation and the proliferation of smoke- and parasite-related lung diseases.

Anyone who wants to see local conditions improve in the third world should realize the importance of access to cheap electricity from fossil-fuel based power generating stations. After all, that’s how the west developed.

The whole mentality around Earth Hour demonizes electricity. I cannot do that; instead, I celebrate it and all that it has provided for humanity.

Earth Hour celebrates ignorance, poverty and backwardness. By repudiating the greatest engine of liberation it becomes an hour devoted to anti-humanism. It encourages the sanctimonious gesture of turning off trivial appliances for a trivial amount of time, in deference to some ill-defined abstraction called “the Earth,” all the while hypocritically retaining the real benefits of continuous, reliable electricity.

People who see virtue in doing without electricity should shut off their fridge, stove, microwave, computer, water heater, lights, TV and all other appliances for a month, not an hour. And pop down to the cardiac unit at the hospital and shut the power off there too.

I don’t want to go back to nature. Travel to a zone hit by earthquakes, floods and hurricanes to see what it’s like to go back to nature. For humans, living in “nature” meant a short life span marked by violence, disease and ignorance. People who work for the end of poverty and relief from disease are fighting against nature. I hope they leave their lights on.

Here in Ontario, through the use of pollution control technology and advanced engineering, our air quality has dramatically improved since the 1960s, despite the expansion of industry and the power supply.

If, after all this, we are going to take the view that the remaining air emissions outweigh all the benefits of electricity, and that we ought to be shamed into sitting in darkness for an hour, like naughty children who have been caught doing something bad, then we are setting up unspoiled nature as an absolute, transcendent ideal that obliterates all other ethical and humane obligations.

No thanks.

I like visiting nature but I don’t want to live there, and I refuse to accept the idea that civilization with all its tradeoffs is something to be ashamed of.

Ross McKitrick
Professor of Economics
University of Guelph

Humans caused 84% of US wildfires from 1992 to 2012

Although climate change has been blamed for an increase in wildfires in the United States, a new paper published in the Proceedings of the National Academy of Sciences concluded that humans ignited 84% of wildfires, and that human ignitions tripled the length of the fire season.

Here is the paper abstract:

The economic and ecological costs of wildfire in the United States have risen substantially in recent decades. Although climate change has likely enabled a portion of the increase in wildfire activity, the direct role of people in increasing wildfire activity has been largely overlooked. We evaluate over 1.5 million government records of wildfires that had to be extinguished or managed by state or federal agencies from 1992 to 2012, and examined geographic and seasonal extents of human-ignited wildfires relative to lightning-ignited wildfires. Humans have vastly expanded the spatial and seasonal “fire niche” in the coterminous United States, accounting for 84% of all wildfires and 44% of total area burned. During the 21-y time period, the human-caused fire season was three times longer than the lightning-caused fire season and added an average of 40,000 wildfires per year across the United States. Human-started wildfires disproportionally occurred where fuel moisture was higher than lightning-started fires, thereby helping expand the geographic and seasonal niche of wildfire. Human-started wildfires were dominant (>80% of ignitions) in over 5.1 million km2, the vast majority of the United States, whereas lightning-started fires were dominant in only 0.7 million km2, primarily in sparsely populated areas of the mountainous western United States. Ignitions caused by human activities are a substantial driver of overall fire risk to ecosystems and economies. Actions to raise awareness and increase management in regions prone to human-started wildfires should be a focus of United States policy to reduce fire risk and associated hazards.

Read the full paper here:

Arizona State University researchers want to deploy 100 million ice-making machines to the Arctic

Fourteen researchers from Arizona State University want to save Arctic sea ice by deploying up to 100 million ice-making machines at a cost of about $5 trillion over the next 10 years. Essentially, wind-powered pumps would spread ocean water over the ice, where it would freeze and thicken the sea ice. Their proposal was published January 24, 2017, in Earth’s Future, an open-access journal of the American Geophysical Union. You can read their full paper here:

The researchers claim that loss of Arctic sea ice is due to global warming caused by human release of CO2 (they don’t provide any evidence). Thus, there is an “urgent need to deal with climate change.” Within the paper they invoke all the usual boogeymen of dangerous global warming alarmism.

The paper abstract begins: “As the Earth’s climate has changed, Arctic sea ice extent has decreased drastically. It is likely that the late-summer Arctic will be ice-free as soon as the 2030s. This loss of sea ice represents one of the most severe positive feedbacks in the climate system, as sunlight that would otherwise be reflected by sea ice is absorbed by open ocean. It is unlikely that CO2 levels and mean temperatures can be decreased in time to prevent this loss, so restoring sea ice artificially is an imperative.”

Their ice-making machine:

“We propose that a wind pump, mounted on a large buoy, could perform the function of capturing wind energy to pump seawater to the surface. The basic components of such a device would include: a large buoy; a wind turbine and pump, drawing up seawater from below the ice; a tank for storing the water; and a delivery system that takes the water periodically flushed from the tank and distributes it over a large area. The goal is to raise enough water over the Arctic winter to cover an area approximately 0.1 km2 with approximately 1 m of ice. A system of such devices would have to be manufactured and delivered to the Arctic Ocean, probably repositioned each season, and would need to be maintained.”

The researchers recognize that “it is a challenge to prevent the water inside the device (tank, delivery system) from freezing.” But they provide no solution. Where will they get the energy to heat the water and prevent a freeze? They also say the buoy-turbine contraption must be sturdy enough to keep it from tipping over in the fickle Arctic environment.

The researchers propose starting small with only 10 million pumps at a cost of $500 billion. They say we would need 100 million devices costing $5 trillion to cover the entire Arctic.
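As a back-of-the-envelope check on the proposal's scaling, the figures quoted above fit together as follows. (A rough sketch only: the per-device cost and per-device water volume are derived from the quoted numbers, not stated separately in the paper.)

```python
# Rough arithmetic check of the ASU proposal's quoted figures.
devices_small = 10_000_000    # initial deployment of pumps
cost_small = 500e9            # $500 billion for the initial deployment
devices_full = 100_000_000    # full Arctic deployment
cost_full = 5e12              # $5 trillion for the full deployment

# Implied cost per device (derived, not stated in the paper)
cost_per_device = cost_small / devices_small
print(f"Implied cost per device: ${cost_per_device:,.0f}")   # $50,000

# Each device is meant to cover ~0.1 km^2 with ~1 m of ice per winter
area_per_device_km2 = 0.1
total_area_km2 = devices_full * area_per_device_km2
print(f"Area covered by full deployment: {total_area_km2:,.0f} km^2")  # 10,000,000 km^2

# Water pumped per device per winter: 0.1 km^2 = 1e5 m^2, times 1 m of ice
water_m3 = 0.1 * 1e6 * 1.0
print(f"Water pumped per device per winter: {water_m3:,.0f} m^3")      # 100,000 m^3
```

At the quoted scale, the full fleet would cover about 10 million square kilometers, which is why the authors describe it as covering the entire Arctic.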

In my opinion, this is just another wacky and completely unnecessary geo-engineering scheme. It is also a complete waste of money and resources. Within the paper is a discussion of the need for multinational governance of the Arctic ice. This seems to me to be a plea for more bureaucracy and future funding. Why 14 authors for this paper? Maybe the group wants to get “publish or perish” credit, which is vital in academia, before President Trump pulls the plug. Or it could be a class project with professors and students. By the way, a note in the paper says: “The authors received no funding to carry out this work.” That probably means they had no special grant funding. I presume that the university pays the professors a salary (with taxpayers’ money).

I saw no mention in the paper of an unintended consequence of freezing ocean water: it will increase the amount of CO2 released into the atmosphere. “When sea water freezes, all of the CO2 that is bound up in that water is forced out. Not only is the dissolved gaseous CO2 released, but all of the CO2 held in the carbonate form is released as well.” (Source)


See also:

Predictions of an ice-free Arctic Ocean

Wacky Geoengineering Schemes to Control Climate

The Arctic-Antarctic seesaw

Climate models for the layman

The Global Warming Policy Foundation, a British think tank, has just published an excellent review of climate models, their problems and uncertainties, all of which show that they are inadequate for policy formulation. The paper is written by Dr. Judith Curry, the author of over 180 scientific papers on weather and climate. She recently retired from the Georgia Institute of Technology, where she held the positions of Professor and Chair of the School of Earth and Atmospheric Sciences. She is currently President of Climate Forecast Applications Network.

You can read the 30-page paper here:

Here is the executive summary:

There is considerable debate over the fidelity and utility of global climate models (GCMs). This debate occurs within the community of climate scientists, who disagree about the amount of weight to give to climate models relative to observational analyses. GCM outputs are also used by economists, regulatory agencies and policy makers, so GCMs have received considerable scrutiny from a broader community of scientists, engineers, software experts, and philosophers of science. This report attempts to describe the debate surrounding GCMs to an educated but nontechnical audience.

Key summary points

• GCMs have not been subject to the rigorous verification and validation that is the norm for engineering and regulatory science.

• There are valid concerns about a fundamental lack of predictability in the complex nonlinear climate system.

• There are numerous arguments supporting the conclusion that climate models are not fit for the purpose of identifying with high confidence the proportion of the 20th century warming that was human-caused as opposed to natural.

• There is growing evidence that climate models predict too much warming from increased atmospheric carbon dioxide.

• The climate model simulation results for the 21st century reported by the Intergovernmental Panel on Climate Change (IPCC) do not include key elements of climate variability, and hence are not useful as projections for how the 21st century climate will actually evolve.

Climate models are useful tools for conducting scientific research to understand the climate system. However, the above points support the conclusion that current GCMs are not fit for the purpose of attributing the causes of 20th century warming or for predicting global or regional climate change on timescales of decades to centuries, with any high level of confidence. By extension, GCMs are not fit for the purpose of justifying political policies to fundamentally alter world social, economic and energy systems. It is this application of climate model results that fuels the vociferousness of the debate surrounding climate models.

NOAA caught manipulating temperature data – again

Dr John Bates, a recently retired senior scientist at the National Oceanic and Atmospheric Administration (NOAA), alleges that a NOAA paper written before the historic climate conference in Paris in 2015 breached NOAA’s own rules and was based on misleading and unverified data. That, to many, looks like the paper was designed to stoke up hysteria over global warming in the run-up to the conference. (Source)

NOAA has often been accused of manipulating data for political purposes. See, for instance, my ADI article The past is getting cooler, which documents a curiosity of the published government temperature record: the 1930s get cooler and cooler with each update of the record. The more recent scandal stems from NOAA’s attempt to erase the 18-year “pause” in global warming. Even though atmospheric carbon dioxide has been rising, global temperature has failed to respond as the climate models say it should. (See El Nino to El Nino – no warming of global temperature) This recent scandal was exposed by David Rose in an article in the British paper Daily Mail.

A comparison of global temperatures published by NOAA with those published by the British Met Office shows that the NOAA temperatures are consistently higher. In the graph below (source), the red line shows the current NOAA world temperature graph, which relies on the ‘adjusted’ and unreliable sea temperature data cited in the flawed ‘Pausebuster’ paper. The blue line is the UK Met Office’s independently tested and verified ‘HadCRUT4’ record, showing lower monthly readings and a shallower recent warming trend.


David Rose notes: NOAA’s 2015 ‘Pausebuster’ paper was based on two new temperature sets of data – one containing measurements of temperatures at the planet’s surface on land, the other at the surface of the seas. Both datasets were flawed. This newspaper has learnt that NOAA has now decided that the sea dataset will have to be replaced and substantially revised just 18 months after it was issued, because it used unreliable methods which overstated the speed of warming. The revised data will show both lower temperatures and a slower rate in the recent warming trend. The land temperature dataset used by the study was afflicted by devastating bugs in its software that rendered its findings ‘unstable’.

To add to the confusion, NOAA also changed the computer programs it uses to compile temperature data, and guess what? The new program creates global warming where there had been none before. These changes are documented in a post by Rud Istvan.

“A 2011 paper announced that NOAA would be transitioning to updated and improved CONUS software around the end of 2013. The program used until the upgrade was called Drd964x. The upgrade was launched from late 2013 into 2014 in two tranches. Late in 2013 came the new graphical interfaces, which are an improvement. Then about February 2014 came the new data output, which includes revised station selection, homogenization, and gridding. The new version is called nClimDiv.” The graphs below show some of the results for temperatures from 1900 to 2010; the left panel shows results from the old system, the right panel from the new.




Another way NOAA influences the official temperature record is by the removal, since the 1970s, of thousands of land-based thermometers at remote, high-altitude, and/or non-urban weather stations. These stations do not show the warming trends predicted by models, because they are not affected by proximity to artificial, non-climatic heat sources (pavement, buildings, machinery, industry, etc.) the way urban stations are. (Thermometers near urban heat sources can introduce warming biases of between 0.1 and 0.4°C per decade.) Removing them inflates the reported average temperature. Read more

Perhaps the Trump administration can get NOAA out of politics and back to science.

El Nino to El Nino – no net global warming


The Earth experienced two super El Ninos recently: 1997/1998 and 2015/2016. It was expected that 2016 would be the hottest year in the satellite record, which begins in 1979. It was, but by only 0.02°C over 1998. That is not statistically significant, according to Dr. Roy Spencer, keeper of the UAH satellite system data. (The margin of error is 0.1°C, much larger than the difference between the El Nino years.) The graph above shows the UAH results. A separate satellite analysis by Remote Sensing Systems (RSS) came to the same conclusion.
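The significance claim above comes down to a simple comparison: the 0.02°C difference between the two El Nino years sits well inside the stated ±0.1°C margin of error. A minimal sketch, using only the figures quoted above:

```python
# Is the 2016-vs-1998 difference distinguishable, given the stated margin of error?
diff_c = 0.02    # 2016 minus 1998, deg C (UAH figure quoted above)
margin_c = 0.1   # stated margin of error, deg C

significant = abs(diff_c) > margin_c
print(f"Difference {diff_c} °C vs margin ±{margin_c} °C -> significant: {significant}")
# -> significant: False
```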

Satellites measure the temperature of the lower troposphere, the portion of the atmosphere where weather takes place. These measurements give a more realistic picture of global temperature than do surface measurements. Essentially, global temperature now is the same as it was nearly 18 years ago.

The earlier El Nino was followed by a sharp drop-off as a strong La Nina cooling took effect. The 2016/2017 La Nina appears to have started in mid-December 2016, and we can expect more cooling during the first half of 2017, though the current La Nina is expected to be weaker.

The media may still proclaim 2016 the hottest year ever (in a cherry-picked time frame). For some perspective, let’s take a longer view.


One thing the media may not mention is that our carbon dioxide emissions seem to have had no effect on global temperature. This was recently noted by Australian Jo Nova in her article “Since 2000 humans have put out 30% of their total CO2 but there is nothing to show for it.” There has been an 18-year “pause” in global warming.

If CO2 is supposed to be the principal cause of global warming, why hasn’t this great outpouring of CO2 had a noticeable effect? According to the Department of Energy, “Since 1751 approximately 337 billion metric tonnes of carbon have been released to the atmosphere from the consumption of fossil fuels and cement production. Half of these emissions have occurred since the mid 1970s.” And 30% have occurred since the 1997/1998 El Nino. There is no indication that all this CO2 is producing global warming.
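Taking the Department of Energy figures quoted above at face value, the timing of the emissions works out as follows. (A rough sketch: the 30% figure is Jo Nova’s, applied here to the same DOE total.)

```python
# Cumulative fossil-fuel carbon emissions, per the DOE figure quoted above.
total_gt_c = 337.0   # billion metric tonnes of carbon since 1751

since_mid_1970s = 0.5 * total_gt_c   # "half of these emissions"
since_1997_98 = 0.30 * total_gt_c    # ~30% since the 1997/1998 El Nino (Jo Nova's figure)

print(f"Emitted since the mid-1970s: ~{since_mid_1970s:.0f} Gt C")
print(f"Emitted since 1997/1998:     ~{since_1997_98:.0f} Gt C")
```

In other words, on these figures roughly 100 billion tonnes of carbon, nearly a third of everything emitted since 1751, went into the atmosphere during the very period in which global temperature showed no net change.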

Both North America and Europe are experiencing record cold weather. The North Atlantic Ocean has been cooling rapidly since the mid-2000s. (Source) Also, solar activity is now at a low point as the current cycle winds down. Many scientists are confident the next cycle will also be a weak one. Periods of weak solar cycles are associated with periods of global cooling.

It seems that any alleged warming effect that CO2 may have is overwhelmed by natural variation in climate.

See also:

An Illustrated Guide to El Nino and La Nina


Climate Madness 9

The climate madness highlight in November was the UN’s Climate Change Conference in Marrakech, Morocco, held 7-18 November. The bureaucratically official designation of this meeting is: The 22nd session of the Conference of the Parties of the UN Framework Convention on Climate Change (COP 22), the twelfth session of the Conference of the Parties serving as the meeting of the Parties to the Kyoto Protocol (CMP 12), and the first session of the Conference of the Parties serving as the meeting of the Parties to the Paris Agreement (CMA 1).

It seems that the UN delegates are terrified of Trump because his election could mean the end of their cash cow. (“My only worry is the money,” said Tosi Mpanu Mpanu of the Democratic Republic of Congo, who heads a group of the 48 least developed nations. “It’s worrying when you know that Trump is a climate change skeptic,” he told Reuters.)

The COP22 climate conference has now ended, and green groups are just waking up to the fact that, without US financial support, nobody has committed any money to anything. Read more

Marc Morano, who publishes the Climate Depot website and co-wrote and hosted the new skeptical film ‘Climate Hustle,’ demonstrated outside the meeting by literally shredding the UN Paris agreement. Morano was removed by UN guards (See videos). Morano also attempted to present a 43-page report on the state of the climate (Read full 43-page report).

This is what the meeting accomplished:

UN Climate Talks Agree to Delay Paris Rules until 2018

by Alister Doyle and Megan Rowling, Reuters

At the end of two-week talks on global warming in Marrakesh, which were extended an extra day, many nations appealed to Trump, who has called climate change a hoax, to reconsider his threat to tear up the Paris Agreement for cutting greenhouse gas emissions. Showing determination to keep the Paris Agreement on track, the conference agreed to work out a rule book at the latest by December 2018. A rule book is needed because the Paris Agreement left many details vague, such as how countries will report and monitor their national pledges to curb greenhouse gas emissions. Read more

Also you can:

Get Your Gender Climate Tracker

by Rupert Darwall

An event of such magnitude struck the latest round of the climate conference – talks which have been going on in various forms since the early 1990s – that the response of many participants and NGOs is to pretend nothing’s happened and carry on as before. Today is gender and education day at the COP22 in Marrakech. Gender equality and the empowerment of women is written into the preamble of last December’s Paris Agreement, the climate treaty that President Obama ratified without sending to the Senate for its advice and consent. ‘Gender justice is climate justice,’ as one feminist NGO puts it.

There are Feminists for a Fossil Fuel Free Future. You can download a Gender Climate Tracker app for iPhone and Android. ‘Our existing economies are based on gender exploitative relationships,’ one speaker told a side meeting. ‘The first ecology is my body,’ another declared. Sexual and reproductive rights require climate justice. ‘Sixty percent of my body is water. What I’m drinking takes me to my city and to the health of the planet.’ Read more (What is she drinking?)

COP22 also had to deal with an inconvenient fact: a dramatic decline in global temperature (1.2°C drop) since early 2016; and the fact that satellites show very different temperatures than “adjusted” land based thermometers. See: Hottest Year?! NOAA claimed ‘record heat’ in numerous locations that don’t have any actual thermometers. Maybe this was the “Gore Effect.” (see ADI explanation)

Other climate madness news:

There Is A Major Climate Issue Hiding In Your Closet: Fast Fashion

by Maxine Bédat and Michael Shank

Disposable clothes, often made from oil, in factories powered by coal, and shipped around the world, mean that the apparel industry contributes 10% of global emissions. Today, more than 150 billion new articles of clothing are produced annually. People don’t keep their clothing anymore; it is no longer owned, it is just consumed. They wear and discard it quickly. That’s fast fashion and it’s ruining our planet. Read more

UK Researchers: Tax Food to Reduce Climate Change

by Eric Worrall

A group of researchers at Oxford University, England, has suggested that imposing a massive tax on carbon-intensive foods – specifically protein-rich foods like meat and dairy – could help combat climate change. Pricing food according to its climate impacts could save half a million lives and one billion tonnes of greenhouse gas emissions. Taxing greenhouse gas emissions from food production could save more emissions than are currently generated by global aviation, and lead to half a million fewer deaths from chronic diseases, according to a new study published in Nature Climate Change. Read more

Children win right to sue US government for climate change inaction

You may not have realized we have the right to a perfect climate. A bunch of kids age 8 to 19 have won the right to take the US government to trial for not protecting the atmosphere. It’s being called the “biggest case on the planet”. Read more

New study quantifies your personal contribution and guilt over Arctic sea ice melt

by Anthony Watts

From the Max-Planck-Gesellschaft and the department of “it’s all YOUR fault and it’s worse than we thought” comes this guilt trip over Arctic sea ice from Greenpeace activist and NSIDC scientist (now just a person because she stopped being a scientist when she started accepting Greenpeace assistance, IMO) Julienne Stroeve. Of course, Stroeve has no explanation of what caused dramatic sea ice melt in 1922, but she’s certain you caused it today.

For each tonne of carbon dioxide (CO2) that any person on our planet emits, three square meters of Arctic summer sea ice disappear. This is the finding of a study published in the journal Science this week by Dirk Notz, leader of a Max Planck Research Group at the Max Planck Institute for Meteorology, and Julienne Stroeve from the US National Snow and Ice Data Centre. These figures enable us for the first time to grasp the individual contribution to global climate change. Read more

Feds Join Conference on ‘Psychosocial Resilience’ to Climate Change – Causes Depression, PTSD, Suicide, and Spiritual Problems

by Penny Starr

Several federal officials spoke on Friday at a conference in Washington, D.C., organized by The Resource Innovation Group, an Oregon-based organization that promotes the idea that climate change can cause a range of human health problems, including PTSD, depression and suicide, and that human behavior should be changed to avoid these problems.

The website said conference attendees would learn:

The personal mental health, spiritual, and psychosocial impacts of climate change on youth, adolescents, adults, and why major preventative human resilience-building policies and programs are urgently needed to address the risks.

Methods, policies, and benefits of building personal resilience for climate change-enhanced traumas and toxic stresses.

Methods, policies, and benefits of building psychosocial resilience within all types of groups and organizations for climate change-enhanced traumas and toxic stresses.

Methods, policies, and benefits of building psychosocial resilience within communities for climate change-enhanced traumas and toxic stresses. Read more

Green heads to explode: ‘elimination of GMO crops would cause hike in greenhouse gas emissions’

by Anthony Watts

From Purdue University and the “better living through genetics” department comes this press release that is sure to set up an impossible quandary in the minds of some anti-GMO zealots who also happen to be climate proponents…

Planting GMO crops is an effective way for agriculture to lower its carbon footprint.

A global ban on genetically modified crops would raise food prices and add the equivalent of nearly a billion tons of carbon dioxide to the atmosphere, a study by researchers from Purdue University shows. Using a model to assess the economic and environmental value of GMO crops, agricultural economists found that replacing GMO corn, soybeans and cotton with conventionally bred varieties worldwide would cause a 0.27 to 2.2 percent increase in food costs, depending on the region, with poorer countries hit hardest. According to the study, published Oct. 27 in the Journal of Environmental Protection, a ban on GMOs would also trigger negative environmental consequences: The conversion of pastures and forests to cropland – to compensate for conventional crops’ lower productivity – would release substantial amounts of stored carbon to the atmosphere. Read more

The Latest Global Warming Threat: Trick Or Treating

by Andrew Follett

Environmentalists have decided that letting kids trick or treat on Halloween is increasing carbon dioxide (CO2) emissions, and the only solution is for the activists to get more money to fight it.

Environmentalists suspect that candy eaten by trick-or-treating kids probably generates a lot of CO2 and therefore isn’t sustainable. The environmental website TerraPass even encourages parents to “start a new trend and skip the candy handouts, opting for more sustainable treats as a greener way of participating in the festivities. Instead of candy coated, sugary bites, offer up little storybooks, crayons, playing cards or toys.” Read more (That sounds like the “safe spaces” offered to college students traumatized by Trump’s election.)

Explosive coolant being put into cars to fight global warming

By Ed Straker

A new kind of explosive coolant called HFO-1234yf is being put into cars to fight global warming.

HFO-1234yf is already becoming standard in many new cars sold in the European Union and the United States by all the major automakers, in large part because its developers, Honeywell and Chemours, have automakers over a barrel. Their refrigerant is one of the few options that automakers have to comply with new regulations and the Kigali agreement.

It has its detractors. The new refrigerant is at least 10 times as costly as the one it replaces.

Daimler began raising red flags in 2012. A video the company made public was stark. It showed a Mercedes-Benz hatchback catching fire under the hood after 1234yf refrigerant leaked during a company simulation.

Daimler eventually relented and went along with the rest of the industry, installing 1234yf in many of its new cars.

“None of the people in the car industry I know want to use it,” said Axel Friedrich, the former head of the transportation and noise division at the Umweltbundesamt, the German equivalent of the Environmental Protection Agency. He added that he opposed having another “product in the front of the car which is flammable.”

While cars, obviously, contain other flammable materials, he was specifically worried that at high temperatures 1234yf emitted hydrogen fluoride, which is dangerous if inhaled or touched.

In its impact on global warming, the new coolant is superior to the HFC it replaces.

Man-made global warming is a myth, a fantasy; there has never been a workable theory to prove it. (The current theory, that man-made carbon dioxide causes global warming, doesn’t work because most CO2 is produced naturally in the environment, not by industrial output.) And yet our lives are risked, again and again, to protect us against this fantasy.

More and more people are dying because cars are getting lighter and lighter – the left’s human sacrifices to appease their global warming gods. The left won’t be satisfied until we are driving around in vehicles loaded with explosives with the crash-worthiness of papier-mâché. (Source)


Climate Madness 1

Climate Madness 2

Climate Madness 3

Climate Madness 4  

Climate Madness 5

Climate Madness 6

Climate Madness 7

Climate Madness 8

New study shows Antarctic sea ice is the same as it was 100 years ago

The following is from an article in the London Telegraph by Science Editor Sarah Knapton.

Antarctic sea ice had barely changed from where it was 100 years ago, scientists have discovered, after poring over the logbooks of great polar explorers such as Robert Falcon Scott and Ernest Shackleton.

The study was based on the ice observations recorded in the logbooks of 11 voyages between 1897 and 1917, including three expeditions led by Captain Scott and two by Shackleton, as well as sea-ice records from Belgian, German and French missions.

Experts were concerned that ice at the South Pole had declined significantly since the 1950s, which they feared was driven by man-made climate change.

But new analysis suggests that conditions are now virtually identical to when the Terra Nova and Endurance sailed to the continent in the early 1900s, indicating that declines are part of a natural cycle and not the result of global warming.

“We know that sea ice in the Antarctic has increased slightly over the past 30 years, since satellite observations began. Scientists have been grappling to understand this trend in the context of global warming, but these new findings suggest it may not be anything new.”

Read full article

Read press release from the European Geosciences Union

Study: Forest Fires in Sierra Nevada Driven by Past Land Use not Climate Change

Researchers from the University of Arizona and Penn State studied fire regimes in the Sierra Nevada of California for the period 1600 to 2015 and found that changes in land use, not climate, were the principal controlling factor.

This result was apparently a surprise to the researchers since they set out to correlate climate with the fires.

“Initially, we did work to see if we could develop long-lead forecasts for fire in the area — six to 18 months in the future — using climate patterns such as El Niño,” said Alan H. Taylor, professor of geography, Penn State. “This would be a significant help because we could place resources in the west if forecasts indicated it would be dry and the southeast would be wet. However, the climate relationships with fire did not consistently track.”

“We were expecting to find climatic drivers,” said lead co-author Valerie Trouet, a UA associate professor of dendrochronology. “We didn’t find them.”

The researchers used tree ring data from 29 sites, historical documents, and 20th Century records of areas burned.

From the UofA press release:

For the years 1600 to 2015, the team found four periods, each lasting at least 55 years, where the frequency and extent of forest fires clearly differed from the time period before or after. The team found the fire regimes corresponded to different types of human occupation and use of the land: the pre-settlement period to the Spanish colonial period; the colonial period to the California Gold Rush; the Gold Rush to the Smokey Bear/fire suppression period; and the Smokey Bear/fire suppression era to present. Finding that fire activity and human land use are closely linked means people can affect the severity and frequency of future forest fires through managing the fuel buildup and other land management practices — even in the face of rising temperatures from climate change.

From the Penn State press release:

Early fires, because they were more frequent, with less fuel build-up, were “good” fires. They burned through the forest, consumed understory fuels and left the majority of trees unharmed. The Native American mosaic of burned and unburned areas prevented fires from continuously spreading.

From 1776 to 1865 the second fire regime, characterized by Spanish colonialism and the depopulation of Native Americans in the area, shows more land burned. European settlers brought diseases against which Native Americans had no immunity and the population suffered. The Spanish built a string of missions in California beginning in 1769 and relocated remaining Native Americans to the mission areas. In 1793, there was a ban on burning to preserve forage, disrupting the pre-colonial Native American burning practices. The incidence of fires became more sensitive to drought and the fire regime changed, creating the time when fires were largest and most closely coupled with climate.

The third fire period ran from 1866 to 1903 and was initiated by the California Gold Rush, when thousands of people poured into the area. Settlement by large numbers of new immigrants began to break up the forest fuel, and large herds of animals, especially sheep, removed much of the understory and changed the fire regime.

The fourth fire period began in 1904 and is linked to the federal government’s policy of fire suppression on government lands. The reason pre-colonial and Spanish colonial fire levels were so much higher than today is that the current fire regime is one of suppression, with an extremely low incidence of fires compared to the past. However, suppression over the last century has allowed fuel to build up on the forest floor and opened the door for “bad” fires that destroy the forest canopy and burn large areas of land.

(UofA press release, Penn State press release, paper abstract)

This finding contradicts an alarmist story printed in the Arizona Daily Star this past October (see third reference below).


See also:

Wildfires and Warming – Relationship not so clear
Claim: “Worsening Wildfires Linked to Temp Rise”
Media hype about forest fires and global warming
Mega-fires in Southwest due to forest mismanagement

New Study: Cement is a carbon sink

A new study from the University of California, Irvine shows that cement is a net carbon dioxide sink.

Cement manufacturing is among the most carbon-intensive industrial processes, but an international team of researchers has found that over time, the widely used building material reabsorbs much of the CO2 emitted when it was made.

“It sounds counterintuitive, but it’s true,” said Steven Davis, associate professor of Earth system science at the University of California, Irvine. “The cement poured around the world since 1930 has taken up a substantial portion of the CO2 released when it was initially produced.”

Cement manufacturing is considered doubly carbon-intensive because emissions come from two sources. CO2 molecules are released into the air when limestone (calcium carbonate) is converted to lime (calcium oxide), the key ingredient in cement. And to generate the heat necessary to break up limestone, factories also burn large quantities of natural gas, coal and other fossil fuels.
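The scale of the process emissions described above follows directly from the stoichiometry of calcination (CaCO3 → CaO + CO2). A minimal back-of-the-envelope sketch, using standard molar masses (not figures from the study itself):

```python
# Stoichiometry of the calcination step: CaCO3 -> CaO + CO2.
# Molar masses in g/mol (standard values, not from the cited study).
M_CACO3 = 100.09  # limestone (calcium carbonate)
M_CAO = 56.08     # lime (calcium oxide)
M_CO2 = 44.01     # carbon dioxide

def co2_per_tonne_limestone():
    """Tonnes of CO2 released per tonne of CaCO3 calcined (process emissions only,
    excluding the fuel burned to heat the kiln)."""
    return M_CO2 / M_CACO3

def co2_per_tonne_lime():
    """Tonnes of CO2 released per tonne of CaO produced."""
    return M_CO2 / M_CAO

print(round(co2_per_tonne_limestone(), 2))  # ~0.44 t CO2 per t limestone
print(round(co2_per_tonne_lime(), 2))       # ~0.78 t CO2 per t lime
```

Roughly 44% of the limestone’s mass leaves as CO2 before any fuel is even burned, which is why the text calls the process “doubly carbon-intensive.”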

Davis and his fellow researchers looked at the problem from a different angle. They investigated how much of the gas is removed from the environment over time by buildings, roads and other kinds of infrastructure. Through a process called carbonation, CO2 is drawn into the pores of cement-based materials, such as concrete and mortar. This starts at the surface and moves progressively inward, pulling in more and more carbon dioxide as years pass.
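Chemically, the carbonation the paragraph describes is the reverse of calcination. In hardened concrete, the best-known reaction is atmospheric CO2 combining with calcium hydroxide (portlandite) in the cement paste; a textbook sketch of the reaction (other cement hydrates carbonate as well):

```latex
\mathrm{Ca(OH)_2 + CO_2 \longrightarrow CaCO_3 + H_2O}
```

The carbonate product locks the CO2 back into mineral form, which is why old concrete gradually recovers part of the gas released when its cement was made.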

Read more at WUWT