2015-11 November

People for the West – Tucson

PO Box 86868, Tucson, AZ 85754-6868 pfw-tucson@cox.net

Newsletter, November, 2015

WHO’s Food Scares

by Jonathan DuHamel

The media are hyping a report from the UN’s World Health Organization (WHO) which claims to link eating processed meat and red meat to cancer.

Scientists from WHO did a data dredge from several hundred epidemiological studies looking for associations between meat and cancer, specifically colon cancer.

Epidemiologic studies never provide a causal link; they can only provide statistical correlations. WHO is in the business of linking things to cancer, and the data show red meat was correlated with just 3 extra cases of bowel cancer per 100,000 people. (Source)

Steve Milloy, proprietor of the JunkScience blog says that “Not a single epidemiological study credibly links meat-eating with cancer.” (Source) He notes:

A cardinal principle of epidemiology is that it is a very useful methodology when looking for linkage between high rates of rare diseases – the sort of relationship classically found, for example, in outbreaks of food poisoning.

But epidemiology is wholly incapable of identifying low risks of relatively common diseases or conditions, such as most cancers. The reason for this is simple: the margin of error in study data due to inaccurate and incomplete data collection is typically far greater than the size of any statistical relationship that may exist or be detected.

Accordingly, the rule of thumb in epidemiology, as famously espoused by the National Cancer Institute, is that, “In epidemiologic research, [increases in risk of less than 100 percent] are considered small and usually difficult to interpret. Such increases may be due to chance, statistical bias or effects of confounding factors that are sometimes not evident.”
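The NCI rule of thumb above is easy to put in perspective with a little arithmetic. A minimal sketch in Python; note the 18 percent relative-risk figure is IARC's widely reported estimate for processed meat, an assumption supplied here for illustration, not a number taken from this newsletter:

```python
# Convert a relative risk (RR) into a percent increase and compare it
# against the 100 percent threshold cited by the National Cancer Institute.
def percent_increase(rr):
    """Percent increase in risk implied by a relative risk."""
    return (rr - 1.0) * 100.0

rr_processed_meat = 1.18  # IARC's widely reported estimate (assumption)
increase = percent_increase(rr_processed_meat)

# A 100 percent increase means RR = 2.0 (a doubling of risk);
# an 18 percent increase falls far below that rule-of-thumb bar.
below_threshold = increase < 100.0
```

By this yardstick, the reported meat-cancer association sits well inside the range the NCI calls "small and usually difficult to interpret."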

Further, just because a reported risk is greater than 100 percent, that does not necessarily indicate a cause-and-effect relationship. Such reported risks may be statistically insignificant (indicating they could have occurred by chance) or have wide margins of error (indicating flaky data). And, of course, for any statistical risk to have meaning, it must be backed up by biological plausibility.

The other bogeyman is said to be nitrates and nitrites in processed meat. For a thorough discussion of that subject see: Does banning hotdogs and bacon make sense? by Sandy Szwarc.

The terms “nitrate” and “nitrite” seem to be used interchangeably. Nitrate (NO3) and nitrite (NO2) are inorganic ions that occur naturally and are part of the nitrogen cycle.

Nitrites and nitrates have been long used to preserve meat. They block the growth of botulism, prevent spoilage and rancidity, and preserve the color.

According to Szwarc: “Nitrite is formed in especially high amounts in our mouths from bacteria. Salivary nitrite accounts for 70-97% of our total nitrite exposure. Ingested nitrate (from foods and water) is converted to nitrite when it comes into contact with the bacteria in our saliva. About 25% of the nitrate we eat is converted to salivary nitrate, and up to 20% is converted to nitrite. Most absorbed nitrate is simply excreted in the urine within five hours.”

WHO is concerned with nitrites/nitrates in processed meat, which contains about 10 ppm (parts per million). They seem unconcerned with vegetables that contain from roughly ten to several hundred times that amount of nitrate. The European Food Safety Authority (EFSA) tested the nitrate content of vegetables and reported the results in its journal in June 2008. [Note: they report results as mg/kg, which is the same as ppm.] Here are some examples of the nitrate content of vegetables:

arugula 4,677 ppm, basil 2,292 ppm, butterhead lettuce 2,026 ppm, beets 1,279 ppm, celery 1,103 ppm, spinach 1,066 ppm, pumpkin 874 ppm.
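Since mg/kg and ppm are equivalent, the EFSA figures can be compared directly against the roughly 10 ppm in processed meat. A quick sketch using only the numbers quoted above:

```python
# Ratio of vegetable nitrate content (EFSA figures above, mg/kg == ppm)
# to the ~10 ppm found in processed meat.
PROCESSED_MEAT_PPM = 10

vegetables_ppm = {
    "arugula": 4677,
    "basil": 2292,
    "butterhead lettuce": 2026,
    "beets": 1279,
    "celery": 1103,
    "spinach": 1066,
    "pumpkin": 874,
}

# How many times more nitrate each vegetable carries than processed meat.
ratios = {veg: ppm / PROCESSED_MEAT_PPM for veg, ppm in vegetables_ppm.items()}
```

On these figures even pumpkin, the lowest on the list, carries nearly 90 times the nitrate of processed meat, and arugula close to 470 times.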

Szwarc also reports: “In 1981, the National Academy of Sciences reviewed the scientific literature and found no link between nitrates or nitrites and human cancers, or evidence to even suggest that they’re carcinogenic. Since then, more than 50 studies and multiple international scientific bodies have investigated a possible link between nitrates and cancers and mortality in humans and found no association.”


The Need for Civil Asset Forfeiture Reform

By Paul Albaugh, Patriot Post

Among the many reforms needed in government, civil asset forfeiture might be near the top. While a majority of Americans are not affected by the practice, it’s a problem that, left unchecked, will only get further out of hand. The French philosopher Frederic Bastiat warned of “legal plunder” by a government that takes what it wants, and that never ends well.

Jason Snead of the Heritage Foundation defines civil forfeiture as “a policy that enables law enforcement authorities to seize property or currency if they suspect it is involved in, or is the result of, a crime.”

Based on that definition, it sounds like a reasonable practice on behalf of law enforcement agencies. But that’s not always the case.

The problem lies in the fact that since forfeiture proceedings are civil, not criminal, people who are in the wrong place at the wrong time, or who hold property where a crime was committed or thought to have been committed, are not afforded due process under the law. They are not given the right to an attorney, and their cash or property is taken with no way of getting it back short of costly and lengthy legal action.

What part of the Constitution does our government, including state law enforcement agencies, not understand?

The Fourth Amendment to the Constitution clearly states, “The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.”

The Fifth Amendment is equally applicable. It states, “No person shall be … deprived of life, liberty, or property, without due process of law; nor shall private property be taken for public use, without just compensation.”

It is absolutely appalling that law enforcement agencies are using their power in some cases to take people’s property and money without any guarantee of returning it should that person(s) be found not to be involved with criminal activity. But as long as law enforcement agencies can continue to pad their pockets with this cash to purchase new equipment or whatever else is deemed necessary by the agency, why would they stop? Read more

Habitat Exchange: A Bad Idea

By Jeffrey Folks

The idea of habitat exchange has been gaining ground at the state and federal level. Already several western states have implemented small-scale exchanges through which businesses, energy companies, government entities, and developers purchase credits to set aside habitat for endangered species. Once this habitat has been purchased, development is then allowed to proceed without the threat of regulatory delay.

If this sounds a bit like mafia protection money, maybe that’s because it is. Developers who are targeted with the threat of expensive permitting delays are given an offer they can’t refuse. Liberals in government aligned with environmental groups are willing to give them a free pass as long as they pay up.

Habitat exchange has the potential to be something big – much bigger than anything Al Capone could think up. It has the potential to affect nearly all development in the U.S. This is because endangered and threatened species, those now designated and those yet to be, are spread across nearly every county and every state. Once the principle of paying to offset even hypothetical environmental impact has been established, developers everywhere will be forced to pay. This of course is just what environmentalists want – more and more land permanently set aside and less available for future development. Read more



Greenland melting hype

The New York Times is reporting that “Greenland Is Melting Away.” The Times has been reporting this hysteria for 85 years (see historical headlines here).

Tony Heller reports on his Real Science blog that “Even the most ridiculous estimates of ice loss in Greenland are less than 200 km³ per year. The volume of the ice sheet is 3,000,000 km³. Using the most aggressive claims, it would take 15,000 years for the ice sheet to melt. That accounts for a sea level rise of about one hundredth of an inch per year. Does Coral Davenport think that one inch of sea level rise over the next century is going to drown her? But the reality is that the surface of Greenland gains about 300 billion tons of ice every year. Greenland is not melting.”
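Heller's back-of-the-envelope numbers are straightforward to check. A sketch in Python; the ice density and ocean surface area are standard reference values assumed here, not figures from the article:

```python
# Check the arithmetic: years to melt the Greenland ice sheet, and the
# sea-level-rise equivalent of the claimed annual ice loss.
ICE_SHEET_KM3 = 3_000_000   # volume of the Greenland ice sheet (quoted above)
LOSS_KM3_PER_YEAR = 200     # "most aggressive" annual loss estimate (quoted above)

years_to_melt = ICE_SHEET_KM3 / LOSS_KM3_PER_YEAR  # 15,000 years

# Sea-level equivalent (standard reference values, assumed):
ICE_DENSITY = 0.917         # density of glacial ice relative to water
OCEAN_AREA_KM2 = 3.61e8     # global ocean surface area

rise_km_per_year = LOSS_KM3_PER_YEAR * ICE_DENSITY / OCEAN_AREA_KM2
rise_inches_per_year = rise_km_per_year * 1e6 / 25.4  # km -> mm -> inches
```

This yields about half a millimeter, roughly two hundredths of an inch, per year: the same order of magnitude Heller cites, and about two inches per century.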

A 2013 report from researchers at the University at Buffalo finds that fossil records show the Greenland ice sheet was smaller 3,000 to 5,000 years ago. “What’s really interesting about this is that on land, the atmosphere was warmest between 9,000 and 5,000 years ago, maybe as late as 4,000 years ago. The oceans, on the other hand, were warmest between 5,000-3,000 years ago.”

Pierre Gosselin of the NoTricksZone blog reports that the paper “Ice thickness in the Northwest Passage,” published in Geophysical Research Letters, shows that in 2014 “more ice survived the summer as multi-year ice (MYI) than in the nine most recent years” and it was only “slightly less than during 1968–2015 on average.”

Also “between November 2014 and April 2015, winter air temperatures were between -0.5°C and -1.5°C colder than during 1980–2010.”

Moreover, the study has climate experts postponing yet another prediction: the Northwest Passage will not be navigable for another 40 years…let alone the Arctic becoming ice-free. As the authors note, “Results indicate that even in today’s climate, ice conditions must still be considered severe.” – Read full paper

French mathematicians say “The battle against global warming: an absurd, costly and pointless crusade”

White paper by French Société de Calcul Mathématique SA

There is not a single fact, figure or observation that leads us to conclude that the world’s climate is in any way disturbed. It is variable, as it has always been, but rather less so now than during certain periods or geological eras. Modern methods are far from being able to accurately measure the planet’s global temperature even today, so measurements made 50 or 100 years ago are even less reliable.

Concentrations of CO2 vary, as they always have done; the figures that are being released are biased and dishonest. Rising sea levels are a normal phenomenon linked to upthrust buoyancy; they are nothing to do with so-called global warming. As for extreme weather events – they are no more frequent now than they have been in the past. We ourselves have processed the raw data on hurricanes.

We are being told that a temperature increase of more than 2°C by comparison with the beginning of the industrial age would have dramatic consequences, and absolutely has to be prevented. When they hear this, people worry: hasn’t there already been an increase of 1.9°C? Actually, no: the figures for the period 1995-2015 show an upward trend of about 1°C every hundred years! Of course, these figures, which contradict public policies, are never brought to public attention. Download 195-page PDF for full report.

House investigating NOAA effort to disappear the pause

NOAA researchers tried last June to make the global warming pause disappear. The House Science Committee is trying to investigate. Dems are squawking. Read more

German Climate Experts Conference: Antarctica Temperatures Show No Warming Trend In 20th Century

By P Gosselin

Helmholtz Center in Germany: Antarctic temperatures show no warming trend in 20th century. Climate models unable to reproduce real temperature development. Read more

New Study: ‘Climate change’ made California drought ‘less likely’ – Published in Journal of Climate

American Meteorological Society Journal:


The current California drought has cast a heavy burden on statewide agriculture and water resources, further exacerbated by concurrent extreme high temperatures. Furthermore, industrial-era global radiative forcing brings into question the role of long-term climate change on CA drought.

How has human-induced climate change affected California drought risk? Here, observations and model experimentation are applied to characterize this drought employing metrics that synthesize drought duration, cumulative precipitation deficit, and soil moisture depletion. The model simulations show that increases in radiative forcing since the late 19th Century induces both increased annual precipitation and increased surface temperature over California, consistent with prior model studies and with observed long-term change. As a result, there is no material difference in the frequency of droughts defined using bivariate indicators of precipitation and near-surface (10-cm) soil moisture, because shallow soil moisture responds most sensitively to increased evaporation driven by warming, which compensates the increase in the precipitation. However, when using soil moisture within a deep root zone layer (1-m) as co-variate, droughts become less frequent because deep soil moisture responds most sensitively to increased precipitation. The results illustrate the different land surface responses to anthropogenic forcing that are relevant for near-surface moisture exchange and for root zone moisture availability. The latter is especially relevant for agricultural impacts as the deep layer dictates moisture availability for plants, trees, and many crops. The results thus indicate the net effect of climate change has made agricultural drought less likely, and that the current severe impacts of drought on California’s agriculture has not been substantially caused by long-term climate changes. Source

Dr. Patrick Moore: Should We Celebrate Carbon Dioxide?

2015 Annual GWPF Lecture

Institute of Mechanical Engineers, London 14 October 2015

My Lords and Ladies, Ladies and Gentlemen:

Thank you for the opportunity to set out my views on climate change. As I have stated publicly on many occasions, there is no definitive scientific proof, through real-world observation, that carbon dioxide is responsible for any of the slight warming of the global climate that has occurred during the past 300 years, since the peak of the Little Ice Age. If there were such a proof through testing and replication it would have been written down for all to see.

The contention that human emissions are now the dominant influence on climate is simply a hypothesis, rather than a universally accepted scientific theory. It is therefore correct, indeed verging on compulsory in the scientific tradition, to be skeptical of those who express certainty that “the science is settled” and “the debate is over”.

But there is certainty beyond any doubt that CO2 is the building block for all life on Earth and that without its presence in the global atmosphere at a sufficient concentration this would be a dead planet. Yet today our children and our publics are taught that CO2 is a toxic pollutant that will destroy life and bring civilization to its knees. Tonight I hope to turn this dangerous human-caused propaganda on its head. Tonight I will demonstrate that human emissions of CO2 have already saved life on our planet from a very untimely end. That in the absence of our emitting some of the carbon back into the atmosphere from whence it came in the first place, most or perhaps all life on Earth would begin to die less than two million years from today. Read full speech

Recent lunar eclipse reveals a sign of global cooling in the atmosphere

by Dr. Richard Keen and Anthony Watts

On Sept. 27th, millions of people around the world watched the Moon pass through the shadow of our planet. Most agreed that the lunar eclipse was darker than usual. Little did they know, they were witnessing a sign of global cooling. But only a little.

Lunar eclipses tell us a lot about the transparency of Earth’s atmosphere. When the stratosphere is clogged with volcanic ash and other aerosols, lunar eclipses tend to be dark red. On the other hand, when the stratosphere is relatively clear, lunar eclipses are bright orange.

This is important because the stratosphere affects climate; a clear stratosphere ‘lets the sunshine in’ to warm the Earth below. At a 2008 SORCE conference Keen reported that the lunar eclipse record indicates a clear stratosphere over the past decade, and that this has contributed about 0.2 degrees to recent warming.

The eclipse of Sept. 27, 2015, however, was not as bright as recent eclipses. Trained observers in seven countries estimated that the eclipse was about 0.4 magnitude dimmer than expected, a brightness reduction of roughly a third.
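The magnitude-to-brightness conversion follows the standard astronomical magnitude scale, where a difference of Δm magnitudes corresponds to a flux ratio of 10^(−0.4·Δm). A quick check of the quoted figures:

```python
# Convert the observed 0.4-magnitude dimming into a fractional
# brightness loss using the standard astronomical magnitude scale.
delta_m = 0.4  # observed dimming in magnitudes

flux_ratio = 10 ** (-0.4 * delta_m)        # fraction of expected brightness
reduction_percent = (1 - flux_ratio) * 100  # ~31%, i.e. roughly a third
```

The arithmetic gives a reduction of about 31 percent, consistent with the roughly one-third dimming the observers reported.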

What happened? There is a layer of volcanic aerosols in the lower stratosphere, says Steve Albers of NOAA. It comes from Chile’s Calbuco volcano, which erupted in April 2015. Six months later, we are still seeing the effects of this material on sunsets in both hemispheres, and it appears to have affected the eclipse as well.

Volcanic dust in the stratosphere tends to reflect sunlight, thus cooling the Earth below. Read more

Methane is no big deal

by Dr. Tim Ball

Methane is 0.00017% of all atmospheric gases and only 0.36% of the total greenhouse gases. These fractions are so small that even people who didn’t understand the science became skeptical of the claims that it was doing harm. This is when the concept of climate sensitivity appeared, except it was called the Global Warming Potential (GWP). The story was that methane was a small percentage of the greenhouse gases, but much more effective than CO2 or H2O. It is a meaningless measure because:

The infrared absorption bands of methane, at wavelengths of roughly 3 and 8 microns, are overlain by absorption from water vapor. But once the water vapor absorbs the radiation in these bands, there is really nothing left for methane to absorb. So the estimates of methane being 20-70 times more effective per molecule than CO2 (as estimated by IPCC), or that methane forcing is 20% of CO2 forcing, as shown in various IPCC reports, makes absolutely no sense. (source)
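Dr. Ball's headline fraction can be reproduced from the atmospheric methane concentration. A sketch assuming the commonly cited value of about 1.8 ppmv, which is not stated in the article:

```python
# Express the atmospheric methane concentration as a percentage
# of all atmospheric gases.
CH4_PPMV = 1.8  # assumed ~1.8 parts per million by volume

ch4_percent = CH4_PPMV / 1e6 * 100  # ppm -> fraction -> percent
```

The result, 0.00018%, matches the 0.00017% figure quoted above to within rounding of the assumed concentration.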


Yale Study Concludes Fracking Does Not Contaminate Drinking Water

The Sierra Club should start printing retractions (something they’ve been getting a lot of practice doing), because researchers from Yale University have concluded that hydraulic fracturing, or fracking, doesn’t contaminate drinking water. “[There is] no evidence of association with deeper brines or long-range migration of these compounds to the shallow aquifers,” concludes the new study, published in the highly prestigious Proceedings of the National Academy of Sciences. The study, the largest of its kind, sampled 64 private water wells near fracking sites to determine whether they had been contaminated by fracking fluids. The Yale researchers found essentially no contamination in well water, and the amounts they did detect were hundreds or thousands of times smaller than can be detected by commercial labs. Read more

Bacteria in the world’s oceans produce millions of tons of hydrocarbons each year

University of Cambridge

Scientists have calculated that millions of tons of hydrocarbons are produced annually by photosynthetic bacteria in the world’s oceans. Read more

Study: Wind Electricity Several Times as Expensive as Conventional Sources

by D. Brady Nelson

On average, electricity from new wind resources is nearly four times as expensive as that from existing nuclear sources and nearly three times as expensive as that from existing coal sources, a new study by the Institute for Energy Research (IER) reports. The study, What is the True Cost of Electricity?, used data from the Energy Information Administration and the Federal Energy Regulatory Commission. Read more here and see the IER report “The Levelized Cost of Electricity from Existing Generation Resources.”

The conclusion from IER:

Most existing coal, natural gas, nuclear, and hydroelectric generation resources could continue producing electricity for decades at a far lower cost than could any potential new generation resources. At a coal-fired power plant, for example, when a component wears out, only the component must be replaced, not the entire plant. The same is true for nuclear plants, until they reach their regulatory end of life, which is currently defined to be 60 years but could be extended to 80. Under current laws, rules, and regulations, large amounts of generating capacity are slated to retire and will be replaced with new generating capacity that will produce electricity at a far higher average levelized cost. The Institute for Energy Research recently identified more than 110 GW of coal and nuclear generation capacity set to close as a direct result of federal regulations.

When electricity from an existing electric generating plant costs less to produce than the electricity from the new plant technology expected to be constructed to replace it—and yet we retire and replace the existing plant despite the higher costs—ratepayers must expect the cost of future electricity to rise faster than it would have if we had instead kept existing power plants in service.

An unprecedented amount of generating capacity is set to close due to ongoing renewables policies, undervalued capacity markets, currently low natural gas prices, and additional environmental regulations. In the absence of even some of these factors, most existing power plants would remain operational, helping keep electricity costs low for many years or decades into the future.

For Sustainable Energy, Choose Nuclear

By S. Fred Singer

Many believe that wind and solar energy are essential, when the world “runs out” of non-renewable fossil fuels. They also believe that wind and solar are unique in providing energy that’s carbon-free, inexhaustible, and essentially without cost. However, a closer look shows that all three special features are based on illusions and wishful thinking. Nuclear may be the preferred energy source for the long-range future, but it is being downgraded politically.

Fossil fuels, coal, oil, and natural gas, are really solar energy stored up over millions of years of geologic history. Discovery and exploitation of these fuels has made possible the Industrial Revolution of the past three centuries, with huge advances in the living standard of an exploding global population, and advances in science that have led to the development of sustainable, non-fossil-based sources of energy — assuring availability of vital energy supplies far into the future.

Energy based on nuclear fission has many of the same advantages and none of the disadvantages of solar and wind: it emits no carbon dioxide (CO2) and is practically inexhaustible. Nuclear does have special problems; but these are mostly based on irrational fears. Read more

The Failure of U.S. Biofuels Program

By Belinda Silva

Ending a relationship is never easy, even one with a proven history of broken promises, twisted logic, weak justifications and financial exploitation. Such is the bond between the American taxpayer and the domestic ethanol industry. In the beginning, statements of common goals sparked hopeful enthusiasm. Many eagerly supported the romantic notion of growing our way to energy independence and an American-led green-based movement towards world prosperity. But, alas, the thrill is gone, and the truth exposed. The once proud, almost pompous, biofuels sector is struggling for justification.

The affair began in 2007 with the Energy Independence and Security Act (EISA). Contained within the act are the Renewable Fuel Standard (RFS) provisions, which set forth incentives for the development of biofuels such as plant-based ethanol and biodiesel. At the time, President Bush had committed to the goal of ending America’s addiction to fossil fuel. The original promise was reduced dependency on Middle Eastern oil, cleaner air, a boon to agriculture and lower fuel costs for consumers.

Unfortunately, ethanol has failed to live up to its promised benefits. Recent low prices at the pump have exposed its life-support dependency on the government. Although direct subsidies have expired, ethanol producers continue to benefit from other financial incentives and federal mandates. A study by the NARC Consulting Group calls the program an economic death-spiral and discloses its many flaws. Yet, industry groups rally for maintaining, even increasing, RFS percentages in the face of mounting evidence of the program’s failure. Still, in a recent rule change proposal, the EPA published a plan to amend the mandates.

The statutory requirement to blend government-supported biofuels with free-market fuels is market manipulation. If the value of ethanol and other biofuels were legitimate, forced consumption, through the RFS, would not be necessary. Congress should end this failed relationship and costly experiment. Let the free market drive innovation and job development. Source



Politically Correct Conditioning: It Starts Early In School

Investor’s Business Daily

Indoctrinated: Some can recall a time when our campuses of higher education were zones where free speech thrived. That was another era, though. Today’s students want speech restricted. How did it come to this?

The results of a poll that should be shocking, but sadly aren’t, show that 51% of students favor their “college or university having speech codes to regulate speech for students and faculty.”

Oddly, 95% say that “the issue of free speech” is important at their college or university, while 73% believe that the First Amendment is “an important amendment that still needs to be followed and respected in today’s society.” Only 21% told the Buckley Free Speech Survey that it is “outdated” and “can no longer be applied in today’s society and should be changed.”

Maybe these findings are not so odd, after all. In today’s America, “free speech” and “First Amendment rights” tend not to include any expression that doesn’t conform to left-wing ideology.

Seven years ago, almost two entire college generations in the past, the Acton Institute observed, “Students at colleges and universities who articulate conservative and traditional views are at particular risk of bullying and indoctrination by campus administrators and faculty who are zealous ideologues.”

In that same commentary, author Ray Nothstine noted, “Some administrators practice a brand of radicalism intent on punishing students who dissent from the ideology of the campus power structure.”

This, says Nothstine, is a danger to free society because “students (will) become accustomed to having their rights limited and will be more lethargic in countering possible oppression from a growing and intrusive state.” Read more

The New Cultural Marxism and the Infantilization of College Students

By Thomas DiLorenzo

When socialism finally collapsed all around the world in the late ‘80s/early ‘90s the academic Marxists did not just throw in the towel and face reality. Indeed, not one of them has ever apologized for providing intellectual support for some of the worst mass murderers in world history – Stalin, Mao, Castro, and the rest of the communist/socialist gangsters. Instead, they reinvented themselves in several different ways, including posing as “environmentalists,” and as “cultural Marxists.”

Taking their cue from socialist economist Robert Heilbroner in a September 10, 1990 New Yorker article entitled “After Communism,” many Marxists began promoting socialist central planning of the economy and of society as a whole (a.k.a. totalitarianism) in the name of “saving the planet” from capitalism. The old Marxism was sold in the name of “the people”; the new Marxism said “to hell with people, we’re for the ants, the lizards, snakes, rocks, trees, etc. – Mother Earth. People Schmeople.” Hence the “watermelons” were born: green on the outside, red on the inside.

The cultural Marxists take a different approach. They replaced the Marxist theory of class conflict between the capitalist “class” and the working class with a new set of classes. Now the supposed eternal conflict is between an “oppressor” class and an “oppressed” class. In essence, the oppressor class consists of white heterosexual males. The oppressed class is everyone else. Armed with this new totalitarian ideology, egalitarianism is still the secular religion of the academic Marxists, with “diversity” being the mating call of the modern academic administrator.

Now that the cultural Marxists are in charge of so many colleges and universities, they no longer even pretend to defend academic freedom and free speech. Read more

“A precarious and threatening situation has developed for climatology: a tremendous effort was made to land research funds in all countries, mostly the USA, on the basis of frightening people about the possible drastic effects of Man’s activities, and so much has been said about climate warming that there will be an awkward situation if the warming doesn’t happen, or not to the extent predicted.” – Hubert Lamb, founder of the British Climatic Research Unit (CRU) (source)

“It is the manners and spirit of a people which preserve a republic in vigor. A degeneracy in these is a canker which soon eats to the heart of its laws and constitution.” —Thomas Jefferson, 1781

“Guard with jealous attention the public liberty. Suspect every one who approaches that jewel. Unfortunately, nothing will preserve it but downright force. Whenever you give up that force, you are inevitably ruined.” —Patrick Henry, 1788

* * *

Visit Jonathan’s Wryheat Blog:


Recent past newsletters can be viewed online:


The Constitution is the real contract with America.

* * *

People for the West – Tucson, Inc.

PO Box 86868

Tucson, AZ 85754-6868


Jonathan DuHamel, President & Editor

Dr. John Forrester, Vice President

Lonni Lees, Associate Editor

People for the West – Tucson, Inc. is an Arizona tax-exempt, 501(c)(3) corporation.

Newsletter subscriptions are free.

In accordance with Title 17 U.S.C. section 107, any copyrighted material herein is distributed without profit or payment to those who have expressed a prior interest in receiving this information for non-profit research and educational purposes only.