Earth Hour: A Dissent

Reblogged from WUWT

by Ross McKitrick

Ross McKitrick, Professor of Economics, University of Guelph, Canada. (Photo credit: Wikipedia)

In 2009 I was asked by a journalist for my thoughts on the importance of Earth Hour.

Here is my response.

I abhor Earth Hour. Abundant, cheap electricity has been the greatest source of human liberation in the 20th century. Every material social advance in the 20th century depended on the proliferation of inexpensive and reliable electricity.

Giving women the freedom to work outside the home depended on the availability of electrical appliances that free up time from domestic chores. Getting children out of menial labour and into schools depended on the same thing, as well as the ability to provide safe indoor lighting for reading.

Development and provision of modern health care without electricity is absolutely impossible. The expansion of our food supply, and the promotion of hygiene and nutrition, depended on being able to irrigate fields, cook and refrigerate foods, and have a steady indoor supply of hot water.

Many of the world’s poor suffer brutal environmental conditions in their own homes because of the necessity of cooking over indoor fires that burn twigs and dung. This causes local deforestation and the proliferation of smoke- and parasite-related lung diseases.

Anyone who wants to see local conditions improve in the third world should realize the importance of access to cheap electricity from fossil-fuel based power generating stations. After all, that’s how the west developed.

The whole mentality around Earth Hour demonizes electricity. I cannot do that; instead, I celebrate it and all that it has provided for humanity.

Earth Hour celebrates ignorance, poverty and backwardness. By repudiating the greatest engine of liberation it becomes an hour devoted to anti-humanism. It encourages the sanctimonious gesture of turning off trivial appliances for a trivial amount of time, in deference to some ill-defined abstraction called “the Earth,” all the while hypocritically retaining the real benefits of continuous, reliable electricity.

People who see virtue in doing without electricity should shut off their fridge, stove, microwave, computer, water heater, lights, TV and all other appliances for a month, not an hour. And pop down to the cardiac unit at the hospital and shut the power off there too.

I don’t want to go back to nature. Travel to a zone hit by earthquakes, floods and hurricanes to see what it’s like to go back to nature. For humans, living in “nature” meant a short life span marked by violence, disease and ignorance. People who work for the end of poverty and relief from disease are fighting against nature. I hope they leave their lights on.

Here in Ontario, through the use of pollution control technology and advanced engineering, our air quality has dramatically improved since the 1960s, despite the expansion of industry and the power supply.

If, after all this, we are going to take the view that the remaining air emissions outweigh all the benefits of electricity, and that we ought to be shamed into sitting in darkness for an hour, like naughty children who have been caught doing something bad, then we are setting up unspoiled nature as an absolute, transcendent ideal that obliterates all other ethical and humane obligations.

No thanks.

I like visiting nature but I don’t want to live there, and I refuse to accept the idea that civilization with all its tradeoffs is something to be ashamed of.

Ross McKitrick
Professor of Economics
University of Guelph

The University of Arizona Guide for Snowflakes

If you plan to visit the campus of the University of Arizona in Tucson, you perhaps should read a new 20-page pamphlet produced by Jesús Treviño, Ph.D., Vice Provost for Inclusive Excellence, so that you will be politically correct at all times. The pamphlet is entitled: “Diversity and Inclusiveness in the Classroom.” (Link) This is just one of the things Dr. Treviño does to earn his reported salary of $214,000 per year. (Source)

The pamphlet is introduced with this paragraph:

“With the increase in diversity at institutions of higher education, campus communities are now commonly comprised of individuals from many backgrounds and with diverse experiences as well as multiple and intersecting identities. In addition, many campus constituents have social identities that historically have been under-represented (e.g. Black/African Americans, Latinx/Chicanx/Hispanic [sic], Asian American/Pacific Islanders, Natives Americans, LBTQIA+ folks, international students and employees, people with diverse religious affiliations, veterans, non-traditional students, women, first-generation college students, and people from lower socioeconomic backgrounds). The University of Arizona does not differ from other institutions when it comes to diversity. Considering race and ethnicity alone, currently the UA has over 40% students of color. The multiplicity of the groups mentioned above form a valuable part of our student body.”

This pamphlet was produced for both students and faculty who may occasionally find themselves outside of “safe spaces” and be subjected to or commit a “microaggression.”

Major topics include:

Understanding Diversity and Inclusive Excellence

Tools/Exercises for Preparing Students To Interact in the Classroom

Guidelines for Classroom Discussions

Dialogue vs. Debate

Microaggressions in the Classroom

Among the sage advice given by this document is this: “Oops/ouch: If a student feels hurt or offended by another student’s comment, the hurt student can say ‘ouch.’ In acknowledgement, the student who made the hurtful comment says ‘oops.’ If necessary, there can be further dialogue about this exchange.”

By the way, the document defines “microaggressions” as: “the everyday verbal, nonverbal, and environmental slights, snubs or insults, whether intentional or unintentional, that communicate hostile, derogatory, or negative messages to target persons based solely upon their marginalized group membership.” Welcome to the real world.

This pamphlet is apparently for all students whose parents never taught them how to behave in civil society.

This article was originally published in the Arizona Daily Independent and received many comments.

See also:

Free Speech and Tender Feelings

History of the Ajo Mining District, Pima County, Arizona by David Briggs

Geologist David Briggs has written another interesting paper on the history of mining in Arizona. This 18-page paper, History of the Ajo Mining District, Pima County, Arizona, was just published by the Arizona Geological Survey and is available as a free download: http://repository.azgs.az.gov/uri_gin/azgs/dlio/1710

I was particularly interested in the Ajo paper because, as a geologist, I conducted exploration at the mine and in the district. Although the mine is now inactive, there is remaining mineralization that could be mined given the right economic conditions. The Ajo orebody is particularly interesting to geologists because paleomagnetic and geologic evidence indicates that the ore deposit has been tilted to the south a total of approximately 120 degrees in two separate tectonic events. (Source) There is also speculation that a detached piece of the original orebody lies hidden nearby.

Briggs begins his story as follows: “The hostile environment of southwestern Arizona’s low desert presented many challenges to those who sought to discover and exploit the mineral wealth of the region. Ajo’s remote location combined with hot summer days and scarce water created a number of obstacles that needed to be overcome. Despite these impediments, the district’s wealth was mined by Native Americans long before the arrival of first Spanish explorers, who recognized its potential soon after establishing outposts in this region.”

The Ajo area has a long history. Prior to the arrival of the first Spanish explorers in the 1530s, the native Tohono O’odham Indians and their ancestors mined hematite, an iron oxide, which they used as body paint. The establishment of Spanish missions in Southern Arizona provided bases from which prospectors combed the country.

With the signing of the Treaty of Guadalupe Hidalgo at the end of the Mexican American War on February 2, 1848, and the subsequent Gadsden Purchase in June 1854, many prospectors tried their luck at Ajo.

Briggs provides great detail as he recounts the many lives of mining ventures in Ajo. Following is a very brief sketch of major events.

The first formal mining began in 1855, and a wagon road was constructed to the railroad at Gila Bend. Ore was also sent by wagon to San Diego and shipped to Swansea, Wales, for smelting. High transportation costs eventually made the venture uneconomic.

Briggs recounts the era between 1898 and 1908 when the Ajo deposit saw many promotions and fraudulent mining schemes.

In 1911, the Calumet and Arizona Mining Company, which was operating mines in Bisbee, became interested in the Ajo properties and acquired the New Cornelia Copper Company, which owned Ajo at the time. Calumet began an extensive drilling program that confirmed the presence of a large body of sulfide mineralization. Open-pit mining began in 1915.

In 1931, Phelps Dodge merged with the Calumet and Arizona Mining Company and operated the mine until 1985, when a combination of low copper prices and stricter smelter air-quality regulations forced its closure.

The Ajo property is now owned by Freeport-McMoRan, Inc. through its merger with Phelps Dodge. According to Briggs, “Freeport continues to periodically assess the economic feasibility of returning the Ajo project to production. As of December 31, 2015, this project is estimated to contain a sulfide resource of 482 million short tons, averaging 0.40% copper, 0.010% molybdenum, 0.002 oz. of gold/ton and 0.023 oz. of silver/ton.”
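Those grades translate into contained metal with simple arithmetic: tonnage times grade. Here is a back-of-the-envelope Python sketch using only the figures quoted above (units assumed to be short tons and troy ounces per ton):

```python
# Contained metal implied by the quoted Ajo resource: 482 million short tons
# at 0.40% Cu, 0.010% Mo, 0.002 oz Au/ton, and 0.023 oz Ag/ton.
tons = 482_000_000

copper = tons * 0.0040    # ~1.93 million short tons of copper
moly = tons * 0.00010     # ~48,200 short tons of molybdenum
gold = tons * 0.002       # ~964,000 ounces of gold
silver = tons * 0.023     # ~11.1 million ounces of silver

print(f"Cu {copper:,.0f} tons, Mo {moly:,.0f} tons, "
      f"Au {gold:,.0f} oz, Ag {silver:,.0f} oz")
```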

Other papers by David Briggs, published by the Arizona Geological Survey:

History of the Warren (Bisbee) Mining District

History of the San Manuel-Kalamazoo Mine, Pinal County, Arizona

Recovery of Copper by Solution Mining Techniques

Superior, Arizona – An Old Mining Camp with Many Lives

History of the Copper Mountain (Morenci) Mining District

History of Helvetia-Rosemont Mining District, Pima County, Arizona

The Pirate Fault of Canada del Oro

[Image: the Pirate fault]

The Pirate fault forms the western boundary of the Santa Catalina Mountains near Tucson and separates the mountains from the Cañada del Oro basin to the west. The fault occurs just east of the communities of Saddlebrooke, Catalina, and Oro Valley. Remnants of this fault, exposed for about 15 miles along the mountain front, are described in a paper from the Arizona Geological Survey (see reference below). The paper describes geological features of 10 sites along the fault trace.

The AZGS says that this fault represents an expression of Basin & Range faulting which was active between 12 million and 6 million years ago. Vertical displacement on the fault is estimated to be about 2.5 miles with the west side down relative to the Santa Catalina Mountains uplift on the east. The fault dips from 50° to 55° west along its entire trace. The Basin & Range era was a time of crustal extension which formed much of the topography in Southern Arizona.
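Given the quoted throw (vertical offset) and dip, simple trigonometry gives the slip along the fault plane and the horizontal extension across it. A minimal sketch, assuming a planar fault and pure dip slip:

```python
import math

# Pirate fault geometry from the figures above: ~2.5 miles of throw,
# dipping 50-55 degrees west.
throw_mi = 2.5

for dip_deg in (50, 55):
    dip = math.radians(dip_deg)
    slip = throw_mi / math.sin(dip)    # displacement along the fault plane
    heave = throw_mi / math.tan(dip)   # horizontal extension across the fault
    print(f"dip {dip_deg}: slip ~{slip:.1f} mi, heave ~{heave:.1f} mi")
```

At those dips the slip works out to roughly 3.0 to 3.3 miles, with about 1.8 to 2.1 miles of horizontal extension across the fault, consistent with the crustal stretching of the Basin & Range era.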

According to the AZGS: “Following cessation of active uplift, the fault was buried under detritus eroded from the uplifted Santa Catalina block and, currently, is being exhumed by the down-cutting Cañada del Oro and its tributaries. This field examination reveals the fault to have left a sparse but diverse collection of remains implying a varied history of fault development and evolution.”

“Deposition of basin-fill material in the Cañada del Oro basin culminated in Pleistocene time (1-2 Ma) following cessation of active uplift on the Pirate fault. Alluvium deposited during this latter time forms the high-stand surface of coalescent alluvial fans composed mostly of detritus eroded from the Santa Catalina Mountains.” That material contains placer gold deposits. The gold was derived from gold-bearing quartz veins in the Santa Catalina Mountains.

The Pirate fault disappears beneath alluvium to both the south and north. If one projects the northern trace, the Pirate fault could intersect the southeast-to-northwest trending Mogul fault. Indeed, near the projected intersection is a decorative stone quarry whose source rock is highly fractured, deformed, and altered bedrock that may be evidence of the projected fault intersection.

Parts of the exposed Pirate fault are stained red by hematite, an iron oxide, suggesting that mineralizing hydrothermal solutions were present during the development of the fault. The exact nature of this mineralization is enigmatic and according to the AZGS, “would seem to defy ready explanation.” “The picture that emerges is that of the Pirate fault as a geologic entity whose tenure as an active participant in the extensional Basin-Range tectonic event has left behind a somewhat sparse and locally enigmatic set of remains from which to infer, caveat emptor, its past.”
Reference:

Hoxie, D.T., Exhuming the Remains of the Inactive Mountain-Front Pirate Fault, Santa Catalina Mountains, Southeastern Arizona: Arizona Geological Survey, Contributed Report CR-12-F, 18 p.

Free download: http://repository.azgs.az.gov/sites/default/files/dlio/files/nid1483/cr-12-f_pirate_fault_report_v.1.pdf

See also: The Gold of Cañada del Oro

The Basin & Range Province of North America

American Geosciences Institute’s Critical Issues program

The Arizona Geological Survey’s winter e-magazine features an article about the American Geosciences Institute’s Critical Issues program (www.americangeosciences.org/critical-issues).

The aim of this AGI program is to pioneer a new approach to sharing societally-relevant science with state and local decision makers. “Here in Arizona, we are sharing this with state and local decision-makers to help them wrap their heads around the complex issues involving groundwater, geologic hazards, and sustainable natural resource management.”

The program aims to support connections and communication between the geoscience community and decision makers. Although the program caters to decision makers at all levels, it particularly focuses on state and local decision makers because these stakeholders are commonly underserved by geoscience policy efforts.

The program convenes meetings, such as the AGI Critical Issues Forum, but its main interface is a web-based platform of resources that bring the expertise of the geoscience community to decision makers by offering a curated selection of information products from sources that include state geological surveys, federal and state agencies, and AGI’s member societies.

The Critical Issues program offers the following freely accessible information services:

Research database: Over 4,000 publications primarily from state geological surveys and the U.S. Geological Survey.

Webinars: Free webinars on a variety of topics that bring geoscientists and decision makers together to discuss potential solutions to challenges at the interface of geoscience and society.

Maps & Visualizations: 144 interactive maps and visualizations covering all 50 states and the District of Columbia.

Case studies: A new product coming online in Spring 2017, presenting specific applications of geoscience to societal problems.

Fact Sheets: A new product coming online in Spring 2017, providing more in-depth information on the big issues.

Frequently Asked Questions: 105 questions on topics including: climate, energy, hazards, mineral resources, and water.

Read more at:

http://repository.azgs.az.gov/sites/default/files/dlio/files/nid1709/agi_critical_issues4-final.pdf

This AZGS e-Magazine also includes an article about groundwater use in the United States.

Humans caused 84% of US wildfires from 1992 to 2012

Although climate change has been blamed for an increase in wildfires in the United States, a new paper, published in the Proceedings of the National Academy of Sciences, concluded that humans ignited 84% of U.S. wildfires from 1992 to 2012 and, in doing so, tripled the length of the fire season.

Here is the paper abstract:

The economic and ecological costs of wildfire in the United States have risen substantially in recent decades. Although climate change has likely enabled a portion of the increase in wildfire activity, the direct role of people in increasing wildfire activity has been largely overlooked. We evaluate over 1.5 million government records of wildfires that had to be extinguished or managed by state or federal agencies from 1992 to 2012, and examined geographic and seasonal extents of human-ignited wildfires relative to lightning-ignited wildfires. Humans have vastly expanded the spatial and seasonal “fire niche” in the coterminous United States, accounting for 84% of all wildfires and 44% of total area burned. During the 21-y time period, the human-caused fire season was three times longer than the lightning-caused fire season and added an average of 40,000 wildfires per year across the United States. Human-started wildfires disproportionally occurred where fuel moisture was higher than lightning-started fires, thereby helping expand the geographic and seasonal niche of wildfire. Human-started wildfires were dominant (>80% of ignitions) in over 5.1 million km², the vast majority of the United States, whereas lightning-started fires were dominant in only 0.7 million km², primarily in sparsely populated areas of the mountainous western United States. Ignitions caused by human activities are a substantial driver of overall fire risk to ecosystems and economies. Actions to raise awareness and increase management in regions prone to human-started wildfires should be a focus of United States policy to reduce fire risk and associated hazards.
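The abstract’s headline numbers are easy to check. A quick sketch of the per-year arithmetic (the record count, time span, and 84% share are taken from the quote above):

```python
# Per-year breakdown of the 1.5 million wildfire records, 1992-2012.
records = 1_500_000
years = 21
human_share = 0.84

per_year = records / years                      # ~71,000 recorded fires/yr
print(f"total: ~{per_year:,.0f} fires/yr")
print(f"human-started: ~{per_year * human_share:,.0f} fires/yr")
print(f"lightning-started: ~{per_year * (1 - human_share):,.0f} fires/yr")
```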

Read the full paper here:

http://www.pnas.org/content/early/2017/02/21/1617394114.full.pdf?sid=97323811-0b09-4bfb-a5d8-8b6480f6aa0f

Arizona State University researchers want to deploy 100 million ice-making machines to the Arctic

Fourteen researchers from Arizona State University want to save Arctic sea ice by deploying up to 100 million ice-making machines at a cost of about $5 trillion over the next 10 years. Essentially, wind-powered pumps would spread ocean water over the ice, where it would freeze and thicken the sea ice. Their proposal was published January 24, 2017, in Earth’s Future, an open access journal of the American Geophysical Union. You can read their full paper here:

http://onlinelibrary.wiley.com/doi/10.1002/2016EF000410/epdf

The researchers claim that loss of Arctic sea ice is due to global warming caused by human release of CO2 (they don’t provide any evidence). Thus, there is an “urgent need to deal with climate change.” Within the paper they invoke all the usual boogeymen of dangerous global warming alarmism.

The paper abstract begins: “As the Earth’s climate has changed, Arctic sea ice extent has decreased drastically. It is likely that the late-summer Arctic will be ice-free as soon as the 2030s. This loss of sea ice represents one of the most severe positive feedbacks in the climate system, as sunlight that would otherwise be reflected by sea ice is absorbed by open ocean. It is unlikely that CO2 levels and mean temperatures can be decreased in time to prevent this loss, so restoring sea ice artificially is an imperative.”

Their ice-making machine:

“We propose that a wind pump mounted on a large buoy, could perform the function of capturing wind energy to pump seawater to the surface. The basic components of such a device would include: a large buoy; a wind turbine and pump, drawing up seawater from below the ice; a tank for storing the water; and a delivery system that takes the water periodically flushed from the tank and distributes it over a large area. The goal is to raise enough water over the Arctic winter to cover an area approximately 0.1 km² with approximately 1 m of ice. A system of such devices would have to be manufactured and delivered to the Arctic Ocean, probably repositioned each season, and would need to be maintained.”

The researchers recognize that “it is a challenge to prevent the water inside the device (tank, delivery system) from freezing,” but they provide no solution. Where will they get the energy to heat the water to prevent a freeze? They also say that the buoy-turbine contraption must be sturdy enough to keep it from tipping over in the fickle Arctic environment.

The researchers propose starting small with only 10 million pumps at a cost of $500 billion. They say we would need 100 million devices costing $5 trillion to cover the entire Arctic.
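The paper’s own numbers make the scale of the scheme easy to appreciate. A minimal sketch of the per-device arithmetic, using the area, ice thickness, device count, and price tag quoted above (the 180-day pumping season is my assumption, and the ~9% density difference between water and ice is ignored):

```python
# Per-device workload and cost implied by the proposal's figures.
area_m2 = 0.1 * 1e6            # 0.1 km^2 covered per device
ice_thickness_m = 1.0          # ~1 m of ice per winter
winter_s = 180 * 86400         # assumed ~6-month pumping season

volume_m3 = area_m2 * ice_thickness_m      # 100,000 m^3 of seawater per winter
rate_lps = volume_m3 * 1000 / winter_s     # ~6.4 liters per second, sustained

devices = 100_000_000
total_cost = 5e12
print(f"{volume_m3:,.0f} m^3 per device per winter (~{rate_lps:.1f} L/s)")
print(f"area covered: {devices * 0.1:,.0f} km^2, "
      f"cost per device: ${total_cost / devices:,.0f}")
```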

In my opinion, this is just another wacky and completely unnecessary geo-engineering scheme, and a complete waste of money and resources. Within the paper is a discussion of the need for multinational governance of the Arctic ice. This seems to me to be a plea for more bureaucracy and future funding. Why 14 authors for this paper? Maybe the group wants to get “publish or perish” credit, which is vital in academia, before President Trump pulls the plug. Or it could be a class project with professors and students. By the way, a note in the paper says: “The authors received no funding to carry out this work.” That probably means they had no special grant funding. I presume that the university pays the professors a salary (with taxpayers’ money).

I saw no mention in the paper of an unintended consequence of freezing ocean water: it will increase the amount of CO2 released into the atmosphere. “When sea water freezes, all of the CO2 that is bound up in that water is forced out. Not only is the dissolved gaseous CO2 released, but all of the CO2 held in the carbonate form is released as well.” (Source)

See also:

Predictions of an ice-free Arctic Ocean

Wacky Geoengineering Schemes to Control Climate

The Arctic-Antarctic seesaw

Climate models for the layman

The Global Warming Policy Foundation, a British think tank, has just published an excellent review of climate models, their problems and uncertainties, all of which show that they are inadequate for policy formulation. The paper is written by Dr. Judith Curry, the author of over 180 scientific papers on weather and climate. She recently retired from the Georgia Institute of Technology, where she held the positions of Professor and Chair of the School of Earth and Atmospheric Sciences. She is currently President of Climate Forecast Applications Network.

You can read the 30-page paper here:

http://www.thegwpf.org/content/uploads/2017/02/Curry-2017.pdf

Here is the executive summary:

There is considerable debate over the fidelity and utility of global climate models (GCMs). This debate occurs within the community of climate scientists, who disagree about the amount of weight to give to climate models relative to observational analyses. GCM outputs are also used by economists, regulatory agencies and policy makers, so GCMs have received considerable scrutiny from a broader community of scientists, engineers, software experts, and philosophers of science. This report attempts to describe the debate surrounding GCMs to an educated but nontechnical audience.

Key summary points

• GCMs have not been subject to the rigorous verification and validation that is the norm for engineering and regulatory science.

• There are valid concerns about a fundamental lack of predictability in the complex nonlinear climate system.

• There are numerous arguments supporting the conclusion that climate models are not fit for the purpose of identifying with high confidence the proportion of the 20th century warming that was human-caused as opposed to natural.

• There is growing evidence that climate models predict too much warming from increased atmospheric carbon dioxide.

• The climate model simulation results for the 21st century reported by the Intergovernmental Panel on Climate Change (IPCC) do not include key elements of climate variability, and hence are not useful as projections for how the 21st century climate will actually evolve.

Climate models are useful tools for conducting scientific research to understand the climate system. However, the above points support the conclusion that current GCMs are not fit for the purpose of attributing the causes of 20th century warming or for predicting global or regional climate change on timescales of decades to centuries, with any high level of confidence. By extension, GCMs are not fit for the purpose of justifying political policies to fundamentally alter world social, economic and energy systems. It is this application of climate model results that fuels the vociferousness of the debate surrounding climate models.

NOAA caught manipulating temperature data – again

Dr. John Bates, a recently retired senior scientist at the National Oceanic and Atmospheric Administration (NOAA), alleges that a NOAA paper written before the historic climate conference in Paris in 2015 breached NOAA’s own rules and was based on misleading and unverified data. To many, that looks like the paper was designed to stoke up hysteria over global warming in the run-up to the conference. (Source)

NOAA has often been accused of manipulating data for political purposes. See, for instance, my ADI article The past is getting cooler, which documents a curiosity of published government temperature records: the 1930s get cooler and cooler with each update of the record. The more recent scandal stems from NOAA’s attempt to erase the 18-year “pause” in global warming. Even though atmospheric carbon dioxide has been rising, global temperature has failed to respond as the climate models say it should. (See El Nino to El Nino – no warming of global temperature) This recent scandal was exposed by David Rose in an article in the British newspaper Daily Mail.

A comparison of global temperatures published by NOAA with those published by the British Met Office shows that the NOAA temperatures are consistently higher. In the graph below (source), the red line shows the current NOAA world temperature graph, which relies on the ‘adjusted’ and unreliable sea temperature data cited in the flawed ‘Pausebuster’ paper. The blue line is the UK Met Office’s independently tested and verified ‘HadCRUT4’ record, showing lower monthly readings and a shallower recent warming trend.

[Graph: NOAA global temperature record (red) vs. UK Met Office HadCRUT4 record (blue)]
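The phrase “a shallower recent warming trend” can be made concrete with an ordinary least-squares fit to each monthly series. A minimal sketch with made-up placeholder arrays standing in for the NOAA and HadCRUT4 records (the real series would be downloaded from the respective agencies):

```python
import numpy as np

# Fit a linear trend to two monthly anomaly series and compare the slopes.
# These arrays are hypothetical placeholders, not the actual records.
months = np.arange(240)  # 20 years of monthly anomalies
rng = np.random.default_rng(0)
noaa = 0.0015 * months + rng.normal(0, 0.1, months.size)
hadcrut = 0.0012 * months + rng.normal(0, 0.1, months.size)

for name, series in (("NOAA", noaa), ("HadCRUT4", hadcrut)):
    slope = np.polyfit(months, series, 1)[0]     # deg C per month
    print(f"{name}: {slope * 120:+.2f} deg C per decade")
```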

David Rose notes: NOAA’s 2015 ‘Pausebuster’ paper was based on two new temperature sets of data – one containing measurements of temperatures at the planet’s surface on land, the other at the surface of the seas. Both datasets were flawed. This newspaper has learnt that NOAA has now decided that the sea dataset will have to be replaced and substantially revised just 18 months after it was issued, because it used unreliable methods which overstated the speed of warming. The revised data will show both lower temperatures and a slower rate in the recent warming trend. The land temperature dataset used by the study was afflicted by devastating bugs in its software that rendered its findings ‘unstable’.

To add to the confusion, NOAA also changed the computer programs it uses to compile temperature data, and guess what? The new program creates global warming where there had been none before. These changes are documented in a post by Rud Istvan.

“A 2011 paper announced that NOAA would be transitioning to updated and improved CONUS software around the end of 2013. The program used until the upgrade was called Drd964x. The upgrade was launched from late 2013 into 2014 in two tranches. Late in 2013 came the new graphical interfaces, which are an improvement. Then about February 2014 came the new data output, which includes revised station selection, homogenization, and gridding. The new version is called nClimDiv.” The graphs below show some of the results for temperatures from 1900 to 2010; the left panels show results from the old system, the right panels from the new.

[Graphs: Maine, Michigan, and California temperature records, 1900–2010, old system (left) vs. new system (right)]

Another way NOAA influences the official temperature is by the removal of thousands of land-based weather station thermometers from remote, high-altitude, and/or non-urban locations since the 1970s. These stations do not show the warming trends predicted by models because, unlike urban weather stations, they are not affected by proximity to artificial or non-climatic heat sources (pavements, buildings, machinery, industry, etc.). (Thermometers near urban heat sources can introduce warming biases of between 0.1 and 0.4°C per decade.) This inflates the reported average temperature. Read more
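The sampling effect described above is easy to illustrate: drop the cooler stations from a network and the network average rises even though no individual station warmed. A toy sketch with invented numbers (real indices use anomalies and homogenization, so this is only a cartoon of the concern):

```python
import numpy as np

# Network average before and after removing cooler rural/high-altitude stations.
rural = np.array([8.0, 9.5, 7.2, 10.1])   # cooler remote stations (deg C)
urban = np.array([14.3, 15.0, 13.8])      # warmer urban stations (deg C)

before = np.concatenate([rural, urban]).mean()
after = urban.mean()                       # remote stations dropped
print(f"network mean before: {before:.1f} C, after: {after:.1f} C")
```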

Perhaps the Trump administration can get NOAA out of politics and back to science.

American mineral production for 2016

The U.S. Geological Survey has just released its annual summary of non-fuel mineral production in the U.S. for 2016. It estimates that the value of all non-fuel minerals produced from U.S. mines was $74.6 billion, a slight increase over production in 2015. “Domestic raw materials and domestically recycled materials were used to process mineral materials worth $675 billion. These mineral materials were, in turn, consumed by downstream industries with an estimated value of $2.78 trillion in 2016.”

Principal contributors to the total value of metal mine production in 2016 were gold (37%), copper (29%), iron ore (15%), and zinc (7%). The estimated value of U.S. industrial minerals production in 2016 was $51.6 billion which was dominated by crushed stone (31%), cement (18%), and construction sand and gravel (17%).
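Those shares convert directly to dollar values. A quick sketch using the $51.6 billion industrial-minerals total and the percentages quoted above:

```python
# Dollar values implied by the 2016 industrial-minerals shares.
industrial_total = 51.6e9   # dollars

for name, share in (("crushed stone", 0.31),
                    ("cement", 0.18),
                    ("construction sand and gravel", 0.17)):
    print(f"{name}: ~${industrial_total * share / 1e9:.1f} billion")
```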

Nevada was ranked first with a total mineral production value of $7.65 billion, mainly from gold. Arizona came in second in total production with a value of $5.56 billion and first in U.S. copper production. Texas, California, Minnesota, Florida, Alaska, Michigan, Wyoming, Missouri, and Utah, in that order, were next in value of production.

“In 2016, U.S. production of 13 mineral commodities was valued at more than $1 billion each. These were, in decreasing order of value, crushed stone, cement, construction sand and gravel, gold, copper, industrial sand and gravel, iron ore (shipped), lime, phosphate rock, salt, soda ash, zinc, and clays (all types).” Does that order surprise you?

Most of the material mined (stone, sand, lime, clay) is used in construction of our infrastructure.

Gold is used as coinage and to manufacture jewelry. Because gold does not corrode, it is used in solid-state electronic devices that use very low voltages and currents, which are easily interrupted by corrosion or tarnish at the contact points.

Copper is used mainly to generate and transmit electricity and it occurs in all our electronic devices.

Zinc is used for galvanizing to prevent corrosion and is combined with copper to make brass. Zinc is also combined with other metals to form materials used in automobiles, electrical components, and household fixtures. Zinc oxide is used in the manufacture of rubber and as a skin ointment.

Iron is used mainly to make steel.

Phosphate rock is used mainly as a fertilizer and also as a nutritional supplement for animals and humans.

Soda ash (sodium carbonate) is an essential raw material used in manufacturing glass, detergents, and chemicals, in softening water, in making baking soda, and in many other industrial products.

“U.S. mine production of copper in 2016 increased slightly, to about 1.41 million tons, and was valued at about $6.8 billion. Arizona, New Mexico, Utah, Nevada, Montana, and Michigan, in descending order of production, accounted for more than 99% of domestic mine production; copper also was recovered in Missouri. Twenty-four mines recovered copper, 17 of which accounted for about 99% of production.”

A note on reserves and resources:

Reserves data are dynamic. They may be reduced as ore is mined and (or) the feasibility of extraction diminishes, or more commonly, they may continue to increase as additional deposits (known or recently discovered) are developed, or currently exploited deposits are more thoroughly explored and (or) new technology or economic variables improve their economic feasibility. Reserves may be considered a working inventory of mining companies’ supplies of an economically extractable mineral commodity. As such, the magnitude of that inventory is necessarily limited by many considerations, including cost of drilling, taxes, price of the mineral commodity being mined, and the demand for it. Reserves will be developed to the point of business needs and geologic limitations of economic ore grade and tonnage. For example, in 1970, identified and undiscovered world copper resources were estimated to contain 1.6 billion metric tons of copper, with reserves of about 280 million tons of copper. Since then, more than 500 million tons of copper have been produced worldwide, but world copper reserves in 2016 were estimated to be 720 million tons of copper, more than double those of 1970, despite the depletion by mining of almost double the original estimated reserves.
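The copper example in that note checks out arithmetically. A quick sketch using the quoted figures:

```python
# Copper reserves grew even as cumulative production exceeded the
# original 1970 reserve estimate (figures from the USGS note above).
reserves_1970 = 280e6    # metric tons of copper
produced_since = 500e6   # "more than" this mined worldwide, 1970-2016
reserves_2016 = 720e6

print(f"mined since 1970: {produced_since / reserves_1970:.1f}x the 1970 reserves")
print(f"2016 reserves:    {reserves_2016 / reserves_1970:.1f}x the 1970 reserves")
```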

[Table: U.S. mineral industry trends, 2016]

As can be seen in the table above, there was a decline in the production of coal, probably due to the rise in natural gas production. Metal production also decreased. According to the USGS, “Several U.S. metal mines and processing facilities were idled or closed permanently in 2016, including iron ore mines in Michigan and Minnesota; three primary aluminum smelters in Indiana, Missouri, and Washington; one secondary zinc smelter in North Carolina; a titanium sponge facility in Utah, the only such facility in the United States; and titanium mineral operations in Virginia.” In 2016, imports made up more than one-half of the U.S. apparent consumption of 50 non-fuel mineral commodities, and the United States was 100% import reliant for 20 of those.

The 200-page report gives detailed information for each commodity.

The full report is available online here: https://minerals.usgs.gov/minerals/pubs/mcs/2017/mcs2017.pdf