“The University of Arizona has been ordered to surrender emails by two UA scientists that a group claims will help prove that theories about human-caused climate change are false and part of a conspiracy.” (Arizona Daily Star) The professors involved are Malcolm Hughes, who is still with the UA, and Jonathan Overpeck, who left earlier this year.
The backstory begins in 2009:
In 2009, it was revealed that someone had hacked into the files of the Climatic Research Unit (CRU) based at the University of East Anglia, in England. The CRU has been a major proponent of anthropogenic global warming and a principal in report preparation for the Intergovernmental Panel on Climate Change (IPCC).
More than 1,000 internal emails and several reports from CRU were posted on the internet, and the blogosphere went wild with the implications of the revealed messages. Dr. Phil Jones, head of CRU, confirmed that his organization had been hacked and that the emails were accurate. This disclosure did not include any emails at other institutions such as Penn State or the University of Arizona.
The emails reveal a concerted effort on the part of a small group of scientists to manipulate data, suppress dissent, and foil the dissemination of information by “losing” data and skirting Britain’s Freedom of Information Act. The emails reveal that the contention of dangerous human-induced global warming is not supported by the data, that those supporting the contention knew it, and that they sought to control the discussion so as to hide the unreliable nature of what they were claiming.
Part of the controversy involved the infamous “hockey stick” graph devised by Michael Mann of Penn State and subsequently adopted by the IPCC.
In the “battle of the graphs,” the bottom panel shows temperatures based on proxy data and measurements; it shows that the Medieval Warm Period of 1,000 years ago was much warmer than now. Mann’s graph did away with the Medieval Warm Period and showed only a large spike of recent warming – hence the name “hockey stick.” The “hockey stick” made its debut in the journal Geophysical Research Letters in 1999, in a paper by Michael Mann, Raymond Bradley, and Malcolm Hughes that built upon a 1998 paper by the same authors in the journal Nature, which detailed the methodology for creating a proxy temperature reconstruction.
There are problems with the Hockey Stick according to Canadian researchers Steve McIntyre and Ross McKitrick. “The first mistake made by Mann et al. and copied by the UN in 2001 lay in the choice of proxy data. The UN’s 1996 report had recommended against reliance upon bristlecone pines as proxies for reconstructing temperature because 20th-century carbon-dioxide fertilization accelerated annual growth and caused a false appearance of exceptional recent warming. Notwithstanding the warning against reliance upon bristlecones in UN 1996, Mann et al. had relied chiefly upon a series of bristlecone-pine datasets for their reconstruction of medieval temperatures. Worse, their statistical model had given the bristlecone-pine data sets 390 times more prominence than the other datasets they had used.
Furthermore, the statistical algorithms in Mann et al. were shown to be flawed. McIntyre ran Mann’s algorithm 10,000 times, having replaced all palaeoclimatological data with randomly-generated, electronic “red noise”. He found that, even with this entirely random data, altogether unconnected with the temperature record, the model nearly always constructed a “hockey stick” curve similar to that in the UN’s 2001 report.” (See their detailed report)
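The mechanism McIntyre and McKitrick described can be illustrated with a short simulation. The sketch below is illustrative only: it assumes AR(1) “red noise” pseudo-proxies and the “short-centering” step attributed to the Mann et al. algorithm, with hypothetical sizes and persistence; it is not a reproduction of their actual benchmark, which used noise fitted to the real proxy network.

```python
import numpy as np

rng = np.random.default_rng(42)
n_years, n_series = 581, 50   # illustrative sizes, not the actual proxy network
rho = 0.9                     # AR(1) persistence of the "red noise"
calib = 79                    # length of the calibration window at the end

# Generate persistent red-noise pseudo-proxies: x[t] = rho*x[t-1] + e[t]
e = rng.standard_normal((n_years, n_series))
proxies = np.zeros((n_years, n_series))
for t in range(1, n_years):
    proxies[t] = rho * proxies[t - 1] + e[t]

# "Short-centering": subtract the mean of the calibration window only,
# instead of the mean of the full record (conventional centering).
short_centered = proxies - proxies[-calib:].mean(axis=0)

# Leading principal component of the short-centered data
u, s, vt = np.linalg.svd(short_centered, full_matrices=False)
pc1 = u[:, 0] * s[0]
if pc1[-calib:].mean() < 0:   # SVD sign is arbitrary; orient the "blade" upward
    pc1 = -pc1

# "Hockey stick index": calibration-window mean relative to overall spread.
# Short-centering favors series with large calibration-window departures,
# so pc1 tends toward a hockey-stick shape even for pure noise.
hsi = pc1[-calib:].mean() / pc1.std()
print(f"hockey stick index: {hsi:.2f}")
```

Because the centering removes the calibration-window mean rather than the full-record mean, series that drift during the calibration window dominate the leading component, which is the bias at issue.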
Mann and his coauthors had another problem. Their proxy data began to rise but then took a plunge into cooler temperatures. They hid this decline by truncating the proxy data and substituting rising measured temperatures, without telling anyone. This became known as “Mike’s Nature Trick”. (Read more)
One other incident: In my article A Simple Question for Climate Alarmists I posed this question: “What physical evidence supports the contention that carbon dioxide emissions from burning fossil fuels are the principal cause of global warming since 1970?” In a public forum, I had the opportunity to pose this question to then-UA professor Jonathan Overpeck. He could not cite any supporting physical evidence.
There has been much controversy over President Trump’s proposed budget and the revision of health care. Much of the proposed spending in Trump’s budget and previous budgets is not supported by the Constitution.
The 2016 federal budget, submitted by Barack Obama, was $4.147 trillion, which was 21.5% of GDP and resulted in a deficit for the year of $503 billion. The total federal debt is almost $20 trillion. Although the President submits or suggests budgets, it is the duty of Congress to appropriate the money. In my opinion, a large part of federal spending is unconstitutional.
The Constitution of the United States grants certain powers to Congress and the Executive Branch. Over the years, Congress has greatly exceeded its Constitutional authority. Federal agencies have created thousands of regulations and spent trillions of dollars of taxpayers’ money on things for which they had no authority to do so. These regulations have the force of law, but only Congress can make law. There is a movement to amend the Constitution with a balanced-budget amendment. Such an amendment would be unnecessary if only Congress and the President would enforce the Constitution.
Below are the Constitutionally enumerated powers of Congress. Nowhere in this enumeration can I find the authority for the federal government to have Departments of Education, Labor, or Energy. I see no authority for the Environmental Protection Agency, nor the requirement that citizens buy health insurance. Some may also argue that our whole welfare and medical care systems are unconstitutional. And, as Benjamin Franklin once said, “I am for doing good to the poor, but I differ in opinion of the means. I think the best way of doing good to the poor, is not making them easy in poverty, but leading or driving them out of it.”
The principal authority of Congress is specified in Article I of the Constitution.
Article I, Section 8: The powers of Congress:
The Congress shall have Power To lay and collect Taxes, Duties, Imposts and Excises, to pay the Debts and provide for the common Defence and general Welfare of the United States; but all Duties, Imposts and Excises shall be uniform throughout the United States;
To borrow Money on the credit of the United States;
To regulate Commerce with foreign Nations, and among the several States, and with the Indian Tribes;
To establish an uniform Rule of Naturalization, and uniform Laws on the subject of Bankruptcies throughout the United States;
To coin Money, regulate the Value thereof, and of foreign Coin, and fix the Standard of Weights and Measures;
To provide for the Punishment of counterfeiting the Securities and current Coin of the United States;
To establish Post Offices and post Roads;
To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries;
To constitute Tribunals inferior to the supreme Court;
To define and punish Piracies and Felonies committed on the high Seas, and Offences against the Law of Nations;
To declare War, grant Letters of Marque and Reprisal, and make Rules concerning Captures on Land and Water;
To raise and support Armies, but no Appropriation of Money to that Use shall be for a longer Term than two Years;
To provide and maintain a Navy;
To make Rules for the Government and Regulation of the land and naval Forces;
To provide for calling forth the Militia to execute the Laws of the Union, suppress Insurrections and repel Invasions;
To provide for organizing, arming, and disciplining, the Militia, and for governing such Part of them as may be employed in the Service of the United States, reserving to the States respectively, the Appointment of the Officers, and the Authority of training the Militia according to the discipline prescribed by Congress;
To exercise exclusive Legislation in all Cases whatsoever, over such District (not exceeding ten Miles square) as may, by Cession of particular States, and the Acceptance of Congress, become the Seat of the Government of the United States, and to exercise like Authority over all Places purchased by the Consent of the Legislature of the State in which the Same shall be, for the Erection of Forts, Magazines, Arsenals, dock-Yards, and other needful Buildings; — And
To make all Laws which shall be necessary and proper for carrying into Execution the foregoing Powers, and all other Powers vested by this Constitution in the Government of the United States, or in any Department or Officer thereof.
Other Authority granted to Congress by the Constitution:
Article IV, Section 3, clause 2: “The Congress shall have Power to dispose of and make all needful Rules and Regulations respecting the Territory or other Property belonging to the United States; and nothing in this Constitution shall be so construed as to Prejudice any Claims of the United States, or of any particular State.”
The 16th Amendment: “The Congress shall have power to lay and collect taxes on incomes, from whatever source derived, without apportionment among the several States, and without regard to any census or enumeration.”
See any justification for Departments of Education, Labor, Energy, or Environmental Protection Agency etc. there? Of course, strictly speaking, there is no justification for Social Security or Medicare either. “If Congress can do whatever in their discretion can be done by money, and will promote the General Welfare, the Government is no longer a limited one, possessing enumerated powers, but an indefinite one, subject to particular exceptions.” —James Madison (1792)
The 10th Amendment also limits the powers of Congress: “The powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people.”
The Heritage Foundation opines: Those who claim the Department of Education is constitutional say that it promotes the general welfare of the United States. However, this phrase in the preamble of the Constitution does not grant or prohibit power to Congress; that is not its purpose. The preamble simply describes the Constitution and what the document itself was designed to do, and is not actually a binding decree of the Constitution.
The Department of Education was founded using the preamble as the basis for its constitutionality, but for the reasons stated above, it is clear that the preamble provides no such basis. Thomas Jefferson considered the federal government’s involvement in education to be unconstitutional. In 1859, vetoing a land-grant college bill, James Buchanan warned that giving education to Congress would create a vast and irresponsible authority. Both he and Jefferson were right. (Source)
Another type of unconstitutional spending occurs when agencies make unauthorized payments. Article I, Section 9 of the Constitution says in part: “No Money shall be drawn from the Treasury, but in Consequence of Appropriations made by Law; and a regular Statement and Account of the Receipts and Expenditures of all public Money shall be published from time to time.” That provision was invoked in a lawsuit House of Representatives v. Burwell, which involved reimbursements the Department of Health and Human Services (HHS) had been paying to insurers to keep out-of-pocket costs artificially low for patients with incomes up to 250 percent of the federal poverty line. Congress refused to appropriate the funds for this scheme, but HHS reimbursed the insurers anyway, whereupon the House sued the Obama administration. The judge ruled that the payment of such reimbursements without congressional authorization “violates the Constitution.” (Source)
The essential idea of the Constitution is that the federal government has limited powers, as stated in the 10th Amendment. It’s time to return to the original meaning of the Constitution and downsize the federal government where it is politically possible to do so. Let each State decide how to handle its own business. States might be more circumspect and accountable to their citizens than is a far-off federal government. (Or, they might become California.)
I’m sure you can think of other instances where the federal government is spending taxpayer money on things not authorized by the Constitution.
Geoscientists agree: there is no such thing as an earthquake season. The tectonic forces producing earthquakes are unaffected by changes in meteorological or astronomical conditions; the latter involve fluctuations in gravitational forces due to the position of Earth’s Moon.
Arizona does, however, have an earth fissure season – a season when earth fissures are more likely to first appear or undergo renewed activity. Central and southeastern Arizona’s earth fissure season accompanies the onset of torrential rainfall of the summer monsoon, from mid-June to late September, with most precipitation occurring from mid-July to mid-August.
In southern and western Arizona, Cochise, La Paz, Maricopa, Pima and Pinal Counties all host earth fissures. In these five counties, we’ve identified nearly 30 discrete earth fissure study areas, each with its own history, and comprising a collective 170 miles of mapped fissures and an additional 180 miles of reported but unconfirmed fissures. (Why unconfirmed? Three principal reasons: 1) ground disturbance has effectively masked the fissure; 2) built infrastructure now covers the fissure; 3) the feature was incorrectly identified as a fissure initially.)
Release of 4 revised earth fissure maps
Each monsoon season finds the AZGS’ mapping team in the field addressing new leads and revisiting fissures with a history of activity. Mapping results are compiled on existing earth fissure study area maps, which are then revised, re-versioned, and released online at the interactive Natural Hazards in Arizona site. At the same time, we release an updated ESRI spatial data file (.shp) and Google Earth KMZ file, ‘Locations of Mapped Earth Fissure Traces in Arizona’, versioned to the date of release, in this case 06 Nov. 2017.
This year we are releasing revised maps for four earth fissure study areas (Figure 1):
Two of these, Apache Junction and Chandler Heights, have largely shifted from agricultural lands to residential or industrial use lands, markedly increasing the hazard and risk that accompany fissuring.
Fissure activity ranges widely within and between study areas. Not all fissures are created equal, and not all fissures display activity each monsoon season. Most study areas retain some active fissures that display either slow incremental expansion or dramatic episodic growth powered by eroding sheet flow accompanying torrential rains: sheet flow fans across the valley surface as a thin sheet of water that spills into open fissures, eroding sidewalls and causing gullying.
Existing fissures frequently capture numerous drainages leading to incision (headcutting) on the up-channel side. Fissures often form orthogonal to the natural drainage direction, so channels and washes intersected by the fissure deliver water during and after rains.
With reduced groundwater harvesting and waning subsidence, fissures may transition from active to inactive status. When this occurs, they become sediment traps for wind- and water-borne sediments – clays, silts and sands – that fill the fissure, masking its diagnostic morphology: a wide open throat, steep to inclined sidewalls, and a hummocky, irregular base. Reactivation of dormant, partially filled earth fissures may occur if heavy runoff, coupled with even modest land subsidence, produces tensional forces sufficient to reopen the fissure, countering the ‘healing’ process and leading to a wider and more deeply incised fissure.
Apache Junction: Case Study of a Reactivated Fissure
The Apache Junction earth fissure study area map was first released in April 2008. Over the past several years, AZGS Earth Fissure manager Joe Cook has revisited the Apache Junction area, both virtually via Google Earth and physically, to examine new or reactivated earth fissures. According to Cook, ‘there’s about 0.8 miles of new fissures in Apache Junction since April 2008. Many of the new fissures formed parallel to existing fissures or connected gaps in formerly discontinuous fissures.’
On the morning of July 24, 2017, following heavy rains in the late evening and early morning hours of 23-24 July, nearly 320 feet of fresh fissure opened near West Houston Ave., Apache Junction (Figure 2). This new feature is part of a larger fissure zone that stretches for more than 2 miles from near the junction of Baseline and Meridian Roads to south of West Guadalupe Road (Figure 2a). The fissure complex runs below streets, state trust land, private property, and large power lines.
According to Joe Cook’s report: ‘The fissure cracked West Houston Ave and an open void was visible beneath the road through a collapsed pothole. The road was closed to vehicle traffic immediately, but additional road collapse occurred over the days and weeks that followed. Large open depressions approximately 5-15 feet across and up to 8 feet deep, partially filled with collapsed material, were visible on private property to the south of Houston Ave. These open depressions were connected by parallel cracks beginning at Houston Ave to the north. Hairline cracks continued south of the southern-most depression for approximately 80 feet. Locally, a void space was visible below the hairline cracks indicating a strong potential for additional collapse.’
‘North of Houston Ave, the new fissure paralleled numerous, active and inactive, older fissures across the undeveloped desert floor. Additional reactivation and collapse along previously mapped fissures was observed beneath the powerlines in the southern part of the Apache Junction Study area.’
‘The cause for collapse of the fissure beneath Houston Ave is probably related to years of subsurface erosion along a buried earth fissure trace which intercepts rainwater from numerous drainages captured by the open portion of the fissure north of Houston Ave. During heavy rain events a substantial volume of floodwater is delivered to the fissure in a drainage ditch adjacent to the north side of W Houston Ave. Waning flow in this drainage ditch was observed to be pouring into the open fissure on the morning of July 25, 2017. No flow in the drainage ditch was observed downstream of the intersection with the earth fissure. The water draining into the fissure was not visible along the fissure anywhere else; water poured into the fissure in a free fall of about 8 feet before disappearing to unknown depths. Void space for further collapse must be substantial, which suggests that continued collapse following heavy rains is possible.’
By August 15, 2017 the collapsed portion of the fissure within the private property south of Houston Ave had been filled, presumably by the owner. But additional damage was evident, and the collapsed section of Houston Ave. remained closed.
Tator Hills: Case Study of a Fresh Fissure
Over the past several years, Tator Hills in southern Pinal County displayed the greatest fresh fissure activity of the four study areas (Figure 3). Imagery served by Google Earth shows that between Mar. 2014 and Dec. 2014, a mile-long, north-south trending earth fissure unzipped about 13 miles south of Arizona City. Sometime after March 2016, the fissure extended an additional ¾ mile to the south. The appearance of this fresh, 2-mile long fissure in an area of modest land subsidence – about 1 inch (2.5 cm) annually over the past decade – and otherwise lacking active fissure formation since the early 1990s, was surprising (Cook, 2017).
Applying drone technology to fissures. In Jan. 2017, AZGS geoscientists captured a real-time synoptic view of the newest Tator Hills fissure using a DJI Phantom™ drone. AZGS research scientist Brian Gootee piloted the drone and captured videos at 2.7K horizontal resolution, as well as hundreds of high-resolution, 12 MB static JPEG images. The latter were stitched together using AgiSoft PhotoScan™ software and analyzed using both AgiSoft PhotoScan™ and ESRI’s ArcGIS™.
At our AZGS Youtube channel, the Tator Hills fissure videos have been viewed an astounding 780,000 times! See Drone technology examining an earth fissure or Drone video of a fresh earth fissure, Tator Hills, Arizona.
The drone’s bird’s-eye view yielded a suite of derivative products – oblique orthoimagery, relief/slope map, digital elevation model (DEM), and topographic maps with contour interval of 1- to 2-feet (Figure 4) – that afford a fresh view of fissure geometry, structure, and topography that may yield new insights into the formative and evolutionary processes of fissures.
Chandler Heights and Three Sisters Buttes. Since release of earlier mapping, we documented subtle changes in some fissures at Chandler Heights (2016), Maricopa County, and Three Sisters Buttes (2012), Cochise County. Chandler Heights’ infamous ‘Y-Fissure’, so called because of its Y-shaped geometry, remains active, but the dramatic reopening and lateral extension observed earlier has not recurred over the past several years. Nonetheless, the ‘Y-Fissure’ remains of great interest, in part because it winds through neighborhoods in east Queen Creek.
Agriculture is the economic engine that drives Cochise County. The Three Sisters Buttes study area lies several miles southeast of Willcox Playa. Groundwater withdrawal and concurrent basin subsidence continue there unabated: from May 2010 to April 2017, maximum subsidence in the basin reached 9.8-15.7 inches, 5 to 15 times greater than the subsidence observed at Tator Hills. In rural Cochise County, fissures chiefly threaten roads and pipelines, and road signs warning of fissures are a common sight (Figure 5).
Some Observations & Final Thoughts
For the foreseeable future, earth fissures remain a geologic hazard in central and southeastern Arizona. With urban and suburban areas aggressively expanding into former agricultural areas, county and municipal planners should anticipate new and renewed incidents of costly and potentially hazardous impacts, as evinced by the recent damage to W. Houston Ave. and a nearby industrial plant in Apache Junction.
The state of earth fissures in Arizona. Nonetheless, there is hope on the horizon for a diminished threat from earth fissures. According to a recent blog post by hydrologist Brian Conway (Arizona Dept. of Water Resources), ‘Land subsidence rates within the Phoenix and Tucson Active Management Areas (AMAs) have decreased between 25% and 90% compared to the 1990s. This is a result of decreased groundwater pumping, increased groundwater recharge, and recovering groundwater levels in the two AMAs.’
Controlling groundwater pumping reduces basin subsidence, which should in turn re-establish hydrostatic equilibrium across impacted basins, thereby reducing the extensional stresses that lead to fissure formation.
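The mechanics behind this can be sketched with Terzaghi’s effective-stress principle, a standard result from soil mechanics rather than anything taken from the AZGS report:

```latex
\sigma' = \sigma_{\mathrm{total}} - p
```

Pumping lowers the pore pressure $p$, so the effective stress $\sigma'$ carried by the aquifer skeleton rises and the sediments compact, which appears at the surface as subsidence. Where compaction varies laterally (for example, over buried bedrock ridges), the land surface is stretched, and fissures open where the horizontal tensile strain exceeds the strength of the sediments. Stabilizing pore pressure through reduced pumping and recharge halts further compaction, consistent with the declining subsidence rates described above.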
Since 2007, systematic mapping of fissure study areas in Maricopa, Pima and Pinal Counties has uncovered few new earth fissures. Moreover, many fissures mapped between 2007 and 2009 showed physical evidence of having formed years or decades before. It could well be that the rate of earth fissure formation in most study areas reached its apex in the latter quarter of the 20th century. If so, land managers in these impacted areas should anticipate seeing fewer new fissures forming and, perhaps, waning reactivation of existing fissures.
In Cochise County, where groundwater pumping and basin subsidence continues unabated, we anticipate new fissures forming annually and existing fissures reopening.
Note of caution. There could be a substantial time-lag between reduced pumping, waning subsidence rates, and the end of new or renewed fissuring. By way of example, subsidence in the Tator Hills area has slowed substantially since the latter quarter of the 20th century. From 2004 to 2017, total subsidence proximal to the 2-mile long fissure was between 1.6 and 3.2 inches, a magnitude of subsidence that seems inconsistent with the formation of a 2-mile long fissure. This fissure may have formed years before, only to break the ground surface in 2014. This could be true of concealed fissures in other study areas, too. We strongly recommend that civil authorities, farmers, contractors, and the public remain on the alert for the sudden emergence of rogue, outlier fissures.
Final Thoughts. The AZGS fissure mapping team continues to monitor earth fissure study areas, both virtually, via Google Earth and fresh National Agriculture Imagery Program (NAIP) imagery, and physically by returning to study areas. We confer regularly with county and municipal authorities regarding reports of reactivated or new fissures. Last, we remain aware of the potential of fissures forming in areas where the imbalance between groundwater harvesting and recharge leads to measurable basin subsidence, such as in agricultural lands of Cochise County and the McMullen Valley of Maricopa and La Paz Counties.
Arizona Land Subsidence Group, 2007, Land Subsidence and Earth Fissures in Arizona: Research and Informational Needs for Effective Management: Arizona Geological Survey Contributed Report CR-07-C, 29 p.
Schumann, H.H., and Cripe, L.S. (1986). Land subsidence and earth fissures caused by groundwater depletion in Southern Arizona, U.S.A. In A.I. Johnson, L. Carbognin & L. Ubertini (Eds.), Proceedings of the 3rd International Symposium on Land Subsidence, Venice, Italy, 19-25 March 1984 (pp. 841-851). International Association of Hydrological Sciences, Publication 151.
If you drive Interstate 10 between Tucson and Phoenix, about halfway you pass between the Picacho Mountains (on the northeast side) and Picacho Peak (on the southwest side). Picacho Peak State Park is a frequent destination for picnics, rock climbing, and viewing spring wildflowers.
The Arizona Geological Survey has recently made available for free download Geologic Field Guides to the Southeastern Picacho Mountains and Picacho Peak. (Link)
From the guide:
The Picacho Mountains consist largely of a compositionally diverse suite of Laramide to middle Tertiary biotite granite, muscovite granite, and heterogeneous to gneissic granite. At the southern end of the range, most of the crystalline rocks have been affected by middle Tertiary mylonitic deformation. Mylonitization is inferred to have accompanied normal faulting and ascent of the bedrock from mid-crustal depths to near the Earth’s surface. [Mylonitization is modification due to dynamic recrystallization following plastic flow.]
Ascent occurred in the footwall of a moderate to low-angle normal fault commonly known as a “detachment fault”. The crystalline rocks of the Picacho Mountains are part of the footwall of a south- to southwest-dipping detachment fault that is exposed only at the base of a small klippe of volcanic rock on a hill top in the southeastern Picacho Mountains. [A klippe is an isolated block of rock separated from the underlying rocks by a fault.]
Picacho Peak, itself, looks like the remnant of a volcano. However, it is an erosional remnant of volcanic rocks that were displaced from over the Picacho Mountains by a detachment fault.
Picacho Peak is composed of multiple andesitic lava flows interbedded with thin sequences of medium- to thin-bedded, well-sorted, medium- to coarse-grained arkosic sandstone and granule sandstone. See the guide for detailed descriptions.
The EPA’s “endangerment finding” classified carbon dioxide as a pollutant and claimed that global warming will have adverse effects on human health. Real research says the opposite: cold is deadlier. The scientific evidence shows that warming is good for health. This is discussed in detail in chapter 7 of Climate Change Reconsidered II: Biological Impacts published by the Heartland Institute. See links to the entire publication at: http://climatechangereconsidered.org/climate-change-reconsidered-ii-biological-impacts/
Here are the key findings based on extensive review of the scientific literature:
• Warmer temperatures lead to a net decrease in temperature-related mortality, including deaths associated with cardiovascular disease, respiratory disease, and strokes. The evidence of this benefit comes from research conducted in every major country of the world.
• In the United States the average person who died because of cold temperature exposure lost in excess of 10 years of potential life, whereas the average person who died because of hot temperature exposure likely lost no more than a few days or weeks of life.
• Some 4,600 deaths are delayed each year as people in the U.S. move from cold northeastern states to warm southwestern states. Between 3 and 7% of the gains in longevity experienced by the U.S. population over the past three decades are due simply to people moving to warmer states.
• Cold-related deaths are far more numerous than heat-related deaths in the United States, Europe, and almost all countries outside the tropics. Coronary and cerebral thrombosis account for about half of all cold-related mortality.
• Global warming is reducing the incidence of cardiovascular diseases related to low temperatures and wintry weather by a much greater degree than it increases the incidence of cardiovascular diseases associated with high temperatures and summer heat waves.
• The adverse health impacts of cold temperatures, especially with respect to respiratory health, are more significant than those of high temperatures in many parts of the world, including Spain, Canada, Shanghai, and Taiwan. In the subtropical island of Taiwan, for example, researchers found low minimum temperatures were the strongest risk factor associated with outpatient visits for respiratory diseases.
• A vast body of scientific examination and research contradicts the claim that malaria will expand across the globe and intensify as a result of CO2-induced warming.
• Concerns over large increases in vector-borne diseases such as dengue as a result of rising temperatures are unfounded and unsupported by the scientific literature, as climatic indices are poor predictors for dengue disease.
• While climatic factors largely determine the geographical distribution of ticks, temperature and climate change are not among the significant factors determining the incidence of tick-borne diseases.
• The ongoing rise in the air’s CO2 content is not only raising the productivity of Earth’s common food plants but also significantly increasing the quantity and potency of the many health-promoting substances found in their tissues, which are the ultimate sources of sustenance for essentially all animals and humans.
• Atmospheric CO2 enrichment positively impacts the production of numerous health-promoting substances found in medicinal or “health food” plants, and this phenomenon may have contributed to the increase in human life span that has occurred over the past century or so.
• There appears to be little reason to expect any significant CO2-induced increases in human health-harming substances produced by plants as the atmosphere’s CO2 concentration continues to rise.
Read the full report for details and supporting references.
The U.S. Global Change Research Program (USGCRP) has just released the final version of its Fourth National Climate Assessment report, one that many claimed the Trump administration would suppress because, like its predecessors, it is mainly a political document rather than a true scientific assessment. You can read the full 477-page report here: https://science2017.globalchange.gov/
The main conclusion is: “This assessment concludes, based on extensive evidence, that it is extremely likely that human activities, especially emissions of greenhouse gases, are the dominant cause of the observed warming since the mid-20th century.”
The “extensive evidence” is based entirely on climate modeling rather than on observations. The results produced by models diverge widely from reality. The new report makes the same claims and invokes the same junk science as the previous 2014 report which I analyzed here: National Climate Assessment Lacks Physical Evidence.
As an example of unfounded claims made in the new report we see this statement in the executive summary: “Heatwaves have become more frequent in the United States since the 1960s, while extreme cold temperatures and cold waves are less frequent.”
But plots from the EPA and NOAA show that the most intense heat waves occurred in the 1930s.
Claim in the report: “The incidence of large forest fires in the western United States and Alaska has increased since the early 1980s and is projected to further increase in those regions as the climate changes, with profound changes to regional ecosystems.” This statement is technically correct but it represents cherry-picking and lying by omission.
We see from the table that 18,229 fires were reported in 1983, a figure that increased to 67,743 fires reported in 2016. What the report doesn’t mention is that the annual fire counts from 1960 to 1982 were all in the six-figure range; e.g., in 1960 there were 103,387 fires and in 1981 there were 249,370 fires. The number dropped to 174,755 fires in 1982.
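The cherry-picking can be made concrete with a little arithmetic. This sketch uses only the fire counts quoted above (all other years are omitted) and shows how the choice of starting year flips the sign of the trend:

```python
# Annual U.S. wildfire counts quoted in the text; other years omitted.
fires = {1960: 103_387, 1981: 249_370, 1982: 174_755, 1983: 18_229, 2016: 67_743}

def percent_change(start_year, end_year):
    """Percent change in fire count between two years."""
    return (fires[end_year] - fires[start_year]) / fires[start_year] * 100

# Starting in 1983 (the report's baseline) shows a large increase...
print(f"1983 -> 2016: {percent_change(1983, 2016):+.0f}%")  # roughly +272%

# ...but starting in 1960 shows a substantial decrease.
print(f"1960 -> 2016: {percent_change(1960, 2016):+.0f}%")  # roughly -34%
```

Both numbers are computed from the same data; only the chosen baseline differs.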
Section 2.6.1 of the report discusses the “greenhouse effect.” They claim: “As increasing GHG [greenhouse gases] concentrations warm the atmosphere, tropospheric water vapor concentrations increase, thereby amplifying the warming effect.” Climate models depend on this assumption. But NOAA’s own data show that global humidity has been decreasing with warming.
Comments by others:
Theoretical physicist Steve Koonin has an op-ed in the Wall Street Journal entitled “A Deceptive New Report on Climate.”
Koonin was undersecretary of energy for science during President Obama’s first term and is director of the Center for Urban Science and Progress at New York University. The WSJ article is pay-walled but you can read extensive excerpts here.
Among his comments:
One notable example of alarm-raising is the description of sea-level rise, one of the greatest climate concerns. The report ominously notes that while global sea level rose an average 0.05 inch a year during most of the 20th century, it has risen at about twice that rate since 1993. But it fails to mention that the rate fluctuated by comparable amounts several times during the 20th century. The same research papers the report cites show that recent rates are statistically indistinguishable from peak rates earlier in the 20th century, when human influences on the climate were much smaller. The report thus misleads by omission.
Note: The rate of sea level rise and fall tends to be cyclical on decadal and bi-decadal periods. See my article: The Sea Level Scam.
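For readers more used to metric units, the rates Koonin quotes convert as follows (a simple unit conversion, not a figure from the report):

```python
MM_PER_INCH = 25.4

# "0.05 inch a year during most of the 20th century"
rate_20th_century = 0.05 * MM_PER_INCH      # about 1.27 mm/yr

# "about twice that rate since 1993"
rate_since_1993 = 2 * rate_20th_century     # about 2.54 mm/yr

print(f"20th-century average: {rate_20th_century:.2f} mm/yr")
print(f"Since 1993:           {rate_since_1993:.2f} mm/yr")
```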
Koonin also comments on heat waves: The report’s executive summary declares that U.S. heat waves have become more common since the mid-1960s, although acknowledging the 1930s Dust Bowl as the peak period for extreme heat. Yet buried deep in the report is a figure [6.3] showing that heat waves are no more frequent today than in 1900.
Comments by Dr. Patrick J. Michaels, director of the Center for the Study of Science at the Cato Institute, past president of the American Association of State Climatologists, and former program chair for the Committee on Applied Climatology of the American Meteorological Society. He was a research professor of Environmental Sciences at University of Virginia for 30 years. Read full post “What You Won’t Find in the New National Climate Assessment.”
Under the U.S. Global Change Research Act of 1990, the federal government has been charged with producing large National Climate Assessments (NCA), and today the most recent iteration has arrived. It is typical of these sorts of documents: much about how the future of mankind is doomed to suffer through increasingly erratic weather and other tribulations. It’s also missing a few tidbits of information that convincingly argue that everything in it with regard to upcoming 21st-century climate needs to be taken with a mountain of salt.
The projections in the NCA are all based upon climate models. If there is something big that is systematically wrong with them, then the projections aren’t worth making or believing.
The report does not tell you that:
1) Climate model predictions of global temperature diverge widely from observations.
2) No hot spot over tropics: The models predict that there should have been a huge “hot spot” over the entire tropics, which is a bit less than 40% of the globe’s surface. Halfway up through the atmosphere (by pressure), or at 500 hPa, the predicted warming is also twice what is being observed, and further up, the prediction is for seven times more warming than is being observed.
The importance of this is paramount. The vertical distribution of temperature in the tropics is central to the formation of precipitation.
Missing the tropical hot spot provokes an additional cascade of errors. A vast amount of the moisture that forms precipitation here originates in the tropics. Getting that wrong trashes the precipitation forecast, with additional downstream consequences, this time for temperature.
When the sun shines over a wet surface, the vast majority of its incoming energy is shunted towards the evaporation of water rather than direct heating of the surface. This is why in the hottest month in Manaus, Brazil, in the middle of the tropical rainforest and only three degrees from the equator, high temperatures average only 91 F (not appreciably different than humid Washington, DC’s 88 F). To appreciate the effect of water on surface heating of land areas, high temperatures in July in bone-dry Death Valley average 117 F.
Getting the surface temperature wrong will have additional consequences for vegetation and agriculture. In general, a wetter U.S. is one of bumper crops and good water supplies out west from winter snows, hardly the picture painted in the National Assessment.
If the government is going to spend time and our money on producing another assessment report, that report should be based on empirical evidence, not climate models. Note that USGCRP is a conglomeration of 13 federal agencies that had a 2016 budget of $2.6 billion for the climate assessment project. Did you get your money’s worth?
Climate modelers make some outlandish predictions, but occasionally there is a glimmer of honesty:
“The forcings that drive long-term climate change are not known with an accuracy sufficient to define future climate change.” — James Hansen, “Climate forcings in the Industrial era”, PNAS, Vol. 95, Issue 22, 12753-12758, October 27, 1998.
“In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the prediction of a specific future climate state is not possible.” — Final chapter, Draft TAR 2000 (Third Assessment Report), IPCC.
And remember: “The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.” – H. L. Mencken
One other point:
Temperatures recorded by the US Climate Reference Network (USCRN) show no statistically significant trend since the network was established in 2004. These data come from state-of-the-art, ultra-reliable, triple-redundant weather stations placed in pristine environments. As a result, they need none of the adjustments that plague the older surface temperature networks, such as USHCN and GHCN, which have been heavily adjusted in an attempt to correct for a wide variety of biases. Using NOAA’s own USCRN data, which eliminates all of the squabbles over the accuracy and adjustment of temperature data, we can get a clear plot of pristine surface data.
By Ken Haapala, President, Science and Environmental Policy Project (SEPP)
What is now called the USGCRP has a murky, politicized past. It was established in 1989 and mandated by Congress in 1990 to “assist the Nation and the world to understand, assess, predict, and respond to human-induced and natural processes of global change.” It is to produce a National Climate Assessment every four years. Since 1990, it has produced four reports. The last full report, the 3rd National Climate Assessment, appeared in May 2014. Apparently, after the election of Mr. Trump, the USGCRP decided on the CSSR, released last week. As with prior USGCRP reports, it ignores the “natural processes of global change,” which is part of its Congressional mandate.
Such political games are part of USGCRP’s established history. After the election of Mr. Bush in 2000, under a prior name, the USGCRP released the 2000 U.S. National Assessment of Climate Change report. As shown in the 2008 report of the Nongovernmental International Panel on Climate Change (NIPCC) (Fig. 16 & pp. 14 to 16), the government report contained projections / predictions that were nonsense. The government entity used two different climate models for climate change to 2090, which produced dramatically different results for precipitation, by region. The worst example was for the Red River watershed in the Dakotas and Minnesota: one model had precipitation dropping by about 80%, turning the region into a desert; the second model had precipitation increasing by about 80%, resulting in dramatic flooding. The disparity between the two models is but one example of how inadequately tested global climate models may be used to project / predict almost anything. The federal courts found that the 2000 report did not meet the standards of the Data Quality Act, also called the Information Quality Act. The recent reports of the UN Intergovernmental Panel on Climate Change (IPCC) and the USGCRP have tried to cover up the disparities in the results of their global climate models by blending them into an ensemble. Usually, there are too few runs of any model to establish realistic forecasts for that model; the forecasts change with each run.
Further, the major problem remains: the models are not adequately tested to be used to form government policies on global warming / climate change. As comments by Patrick Michaels carried in last week’s TWTW illustrate, the USGCRP ignores the important discrepancy between the atmospheric temperature trends forecast by global climate models and the atmospheric temperature trends actually observed. The USGCRP ignores physical science.
In 2009, the EPA ruled, under the Clean Air Act, that “the current and projected concentrations of the six key well-mixed greenhouse gases—carbon dioxide (CO2), methane (CH4), nitrous oxide (N2O), hydrofluorocarbons (HFCs), perfluorocarbons (PFCs), and sulfur hexafluoride (SF6)—in the atmosphere threaten the public health and welfare of current and future generations.” In essence, the EPA classified carbon dioxide as a pollutant even though carbon dioxide is necessary for life on Earth.
For some perspective, note that the current atmospheric concentration of carbon dioxide is about 400 ppm (parts per million), while the air we exhale with every breath contains about 40,000 ppm carbon dioxide. Is breathing causing air pollution?
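The comparison is easy to check. This sketch uses only the two concentrations quoted above:

```python
# Concentrations quoted in the text, in parts per million.
ambient_ppm = 400       # approximate atmospheric CO2 concentration
exhaled_ppm = 40_000    # approximate CO2 concentration in exhaled air

ratio = exhaled_ppm / ambient_ppm               # 100x the ambient level
percent_by_volume = exhaled_ppm / 1_000_000 * 100  # 4% CO2 by volume

print(f"Exhaled air holds {ratio:.0f}x the ambient CO2 level "
      f"({percent_by_volume:.0f}% CO2 by volume).")
```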
This EPA ruling in effect allowed EPA to regulate everything from automobile exhaust to power plants to refrigerators. In order to overturn the finding, one would have to successfully show that the underlying scientific basis is wrong – and it is. Another tactic would be to have Congress amend the Clean Air Act, something that is very unlikely in the current contentious Congress.
The EPA’s scientific basis is derived from climate models, predictions of which diverge widely from reality. See my ADI articles:
“In science, if your theory doesn’t take account of all the relevant data, you need a new theory.” Avery shows how the climate models fail to explain observations and notes that thousands of new coal-fired power plants are being built around the world – even in Europe. Avery is a former U.S. State Department senior analyst and co-author with astrophysicist Fred Singer of Unstoppable Global Warming: Every 1,500 Years.
by Alan Carlin. Carlin is a scientist and economist who worked for the RAND Corp. and the EPA.
“Revoking the EF is the only way to bring the climate alarmism scam to the untimely end it so richly deserves in the US and hopefully indirectly elsewhere. Until that happens the CIC [climate industrial complex] will continue to pursue its bad science through reports such as the National Climate Assessment with the recommended disastrous policies that would seriously damage the environment, impoverish the less wealthy, and bring economic disaster for our Nation by raising the prices and decreasing the availability and reliability of fossil fuel energy which is so central to our way of life and economy.”
In a separate post, Carlin also said that “EPA never engaged in a robust, meaningful discussion. Rather, there was a pro forma review after a decision had already been made which met many but not all of the legal requirements.” He lists “six crucial scientific issues that EPA did not actively discuss despite my best efforts to bring a few of them to their attention in early 2009.”
Michaels (see his biography above) recounts his testimony before the EPA. USGCRP is the U.S. Global Change Research Program.
“We the undersigned are individuals who have technical skills and knowledge relevant to climate science and the GHG Endangerment Finding. We each are convinced that the 2009 GHG Endangerment Finding is fundamentally flawed and that an honest, unbiased reconsideration is in order.”
If you had been in Southeastern Arizona eleven or twelve thousand years ago, the landscape would have looked much different from today’s. The climate was cooler and wetter, and the rivers actually flowed. You would also have encountered a suite of large mammals that later became extinct in North America. These animals included horses, camels, mastodons, mammoths, long-horned bison, tapirs, shrub oxen, and ground sloths, which were preyed upon by dire wolves, jaguars, cougars, bears, the American lion, and man. (Horses and camels were later re-introduced from Europe and Asia.)
We know this because remains of all these animals were found in several sites along the San Pedro River between Tombstone and Bisbee and at other sites in southern Arizona.
At the end of the last glacial epoch, climate became very unstable with the result that many of these megafauna became extinct in North America and the human Clovis culture dispersed. I go into greater detail on extinction hypotheses in my article “Cold case: What Killed the Mammoths?” linked below.
The Arizona Geological Survey published a paper about these animals in 1998, which has recently become available for free download:
Within this 32-page publication are drawings and brief descriptions of the animals and information about Clovis culture humans who hunted them. The paper describes how people hunted and speculates on causes of extinction.
According to AZGS:
Popular literature and illustrations often depict Clovis hunters using stone-tipped spears to attack full-grown mammoth. Archaeological evidence indicates, however, that they more often concentrated their efforts on calves and young adults, sometimes ambushing them near or at watering places. At the Lehner Mammoth Site bones of nine mammoths, all juveniles, were recovered. They were apparently trapped and killed in the stream bed where archaeologists uncovered their bones thousands of years later. The mammoth killed at the Naco Site was also a young adult.
Bison meat appears to have been popular among the Clovis people. At Murray Springs bones of eleven young bison were found along with bones of one mammoth. Both the mammoth and the bison were likely ambushed when they came to water.
Being so large and cumbersome to transport, a mammoth carcass was butchered where it fell. The presence of hearths at kill sites, such as Murray Springs and the Lehner Site, suggests that the hunters also ate some of the meat on the spot, perhaps roasting it as they proceeded with the butchering. Cut marks on bone surfaces, and broken cutting tools indicate that the meat was stripped from the carcass and transported to a nearby camp, where more of it could have been eaten or dried for future consumption.