Comments On Federal Scientific Integrity
By Kenneth Haapala, President,
The Science and Environmental Policy Project (SEPP)
July 28, 2021
[See original paper with End Notes here]
“It is one thing to impose drastic measures and harsh economic penalties when an environmental problem is clear-cut and severe. It is quite another to do so when the environmental problem is largely hypothetical and not substantiated by careful observations.
This is definitely the case with global warming.” – Frederick Seitz, 17th president of the United States National Academy of Sciences
This paper addresses the scientific integrity involved in the fear that human additions to atmospheric carbon dioxide will cause significant global warming. To understand how carbon dioxide influences the globe’s temperatures one must understand the greenhouse effect, how different greenhouse gases influence the loss of heat to space, how they influence the effectiveness of one another in the atmosphere, and how an increasing greenhouse effect influences climate. Further, understanding how human emissions of greenhouse gases affect climate requires separating the greenhouse effect from other human impacts on climate, such as urbanization. It also requires separating human impacts from natural climate change, including changing ocean circulation and changing solar influences, which we do not fully understand.
This brief paper is divided into six basic sections: One, the importance of the scientific method for understanding the physical world; Two, the changing climate; Three, the importance of the
greenhouse effect, including carbon dioxide, for life on this planet; Four, problems with global climate models used to predict dire consequences from increasing greenhouse gases, particularly
carbon dioxide; Five, modern physical evidence supporting an alternative analysis of the greenhouse effect; and Six, a suggested policy for going forward with steps the nation needs to take.
Section 1, The Scientific Method: The scientific method is a process of eliminating error in thinking and concepts by constantly subjecting concepts to rigorous testing using all available
physical evidence that is appropriate. As physical evidence changes, the concepts must be changed accordingly. The 20th century began without theories of relativity or quantum mechanics, which upset classical physics, and with the belief that continents did not move. Today, we
make use of these concepts and are experiencing constant change in communications, electronics, and similar technologies. Who knows what new developments may bring?
With dramatic change in our knowledge of the physical world, including science and science-based technology, such as nuclear weapons, scientists acquired political influence and responsibility. There should be no issue as to the rigorous application of the scientific method, particularly by scientists employed or sponsored by the US government, who are responsible to the American public. Political beliefs need to be set aside. As Steven Koonin (Under Secretary for Science at the Department of Energy during the Obama Administration), who is familiar with complex mathematical physics, mathematical modeling, and the IPCC process, wrote:
“Philip Handler, a former president of the National Academy of Sciences, identified the problem in a 1980 editorial that resonates eerily four decades later:
‘With scientists’ unique role comes a special responsibility. We’re the only people who can bring objective science to the discussion, and that is our overriding ethical obligation. Like judges, we’re obligated to put personal feelings aside as we do our job. When we fail to do this, we usurp the public’s right to make informed choices and undermine their confidence in the entire scientific enterprise. There’s nothing at all wrong with scientists as activists, but activism
masquerading as The Science is pernicious.’”
Since the 1970s there has been a dramatic increase in evidence (data) on the greenhouse effect and how increasing greenhouse gases influence the earth’s atmosphere, thus the climate. It is
incumbent on government scientists and government-sponsored scientists to apply the scientific method and incorporate these new data in their reports, so they do not mislead the public.
Section 2, The Changing Climate: Atmospheric physicist Richard Lindzen is noted for his work in dynamic meteorology, atmospheric tides, ozone photochemistry, quasi-biennial oscillation, and the Iris hypothesis. Lindzen described the generally accepted view of the earth’s climate system as circulation of two fluids (atmosphere and oceans) interacting with each other and the uneven land, made turbulent by the rotation of the globe – exposing the fluids and the
land to uneven heating by the sun. (The energy flow from the sun to the earth varies as well.) The entire system involves fluid dynamics which is not fully understood. As such, “The fact that these circulations carry heat to and from the surface means that the surface itself is never in equilibrium with space. There is never an exact balance between incoming heat from the sun and outgoing radiation generated by the Earth. This is because heat is always being stored in (and released from) the oceans. Therefore, surface temperature is always varying somewhat.”
After discussing the substantial energy transfers from the phase changes of water, Lindzen brings up the greenhouse effect and states: “…that the two most important greenhouse substances by far are water vapor and clouds. Clouds are also important reflectors of sunlight.
“The unit for describing energy flows is watts per square meter. The energy budget of this system involves the absorption and reemission of about 200 watts per square meter. Doubling CO2 involves a 2% perturbation to this budget. So do minor changes in clouds and other features, and such changes are common….”
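The 2% figure can be checked with a back-of-the-envelope calculation. The logarithmic approximation for CO2 radiative forcing, ΔF ≈ 5.35 ln(C/C0) W/m² (Myhre et al., 1998), is an assumption introduced here for illustration; the text itself cites only the ~200 W/m² budget and the 2% perturbation:

```python
import math

# Commonly cited logarithmic approximation for CO2 radiative forcing
# (Myhre et al., 1998): delta_F = 5.35 * ln(C / C0) in W/m^2.
# This formula is an assumption for this sketch, not from the paper.
def co2_forcing(c_ppm, c0_ppm):
    return 5.35 * math.log(c_ppm / c0_ppm)

doubling_forcing = co2_forcing(2.0, 1.0)   # ~3.7 W/m^2 for any doubling
budget = 200.0                             # W/m^2, per Lindzen's figure
fraction = doubling_forcing / budget       # ~0.019, i.e. about 2%

print(f"Doubling forcing: {doubling_forcing:.2f} W/m^2")
print(f"Fraction of {budget:.0f} W/m^2 budget: {fraction:.1%}")
```

Under that assumed formula, a doubling yields roughly 3.7 W/m², or about 2% of the 200 W/m² budget, consistent with the quoted figure.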
Lindzen concludes the section by discussing “unforced” natural variation that may take thousands of years to appear. Thus, climate change involves two parts of physics for which we have no comprehensive theories established by physical evidence: 1) fluid dynamics and 2) the greenhouse effect.
Section 3, The Greenhouse Effect: The greenhouse effect makes the earth habitable. Developing laboratory experiments starting in 1859, John Tyndall recognized that greenhouse gases warm the atmosphere by slowing heat loss from the surface to space. This slowing of infrared energy to space makes the earth habitable, with the principal greenhouse gas being water vapor. Tyndall noted that the influence of some greenhouse gases is not proportional to their concentration.
Decades of laboratory experiments show that carbon dioxide is an effective greenhouse gas only at extremely low concentrations. Its effectiveness was largely exhausted at less than half the concentration present when humans began using fossil fuels. Increasing the concentration of carbon dioxide further is like adding a thin sheet on top of a thick quilt: it does little or nothing. In the 1970s, with parts of the world undergoing economic development and carbon dioxide concentrations increasing, surface temperatures indicated the earth shifted from cooling to warming. The National Research Council formed an influential panel which asserted that even
though laboratory experiments demonstrate that carbon dioxide has a modest effect on temperature, the slight warming caused by carbon dioxide – less than 3 percent of the total atmospheric warming effect – would be greatly amplified by increases in water vapor. This was a guess, without physical evidence. The guess managed to turn a modest 2-degree Fahrenheit (°F) maximum increase due to a doubling of carbon dioxide alone into a speculative 6 °F total increase.
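The arithmetic behind that amplification can be made explicit. Using the standard linear feedback relation from textbook climate algebra, T_total = T_direct / (1 − f) (this relation is an assumption of the sketch, not something the panel is quoted as using):

```python
# Linear feedback algebra (a standard textbook relation, assumed here):
# T_total = T_direct / (1 - f), where f is the net feedback fraction.
direct_warming = 2.0   # degrees F, CO2-only maximum cited in the text
claimed_total = 6.0    # degrees F, with the assumed water-vapor amplification

gain = claimed_total / direct_warming   # amplification factor: 3x
f = 1.0 - 1.0 / gain                    # implied feedback fraction: 2/3

print(f"Implied amplification: {gain:.0f}x, feedback fraction f = {f:.2f}")
```

Turning 2 °F into 6 °F implies a threefold amplification, i.e., a net feedback fraction of two-thirds, which is the quantity the paper argues was guessed rather than measured.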
The UN Intergovernmental Panel on Climate Change (IPCC) and many US scientific bodies have incorporated the guess into unstated assumptions. However, starting in 1979, the US
developed a significant body of observations of the atmosphere using satellites. Forty years of measured atmospheric temperature trends, the only comprehensive global temperature dataset
existing, confirm a century of laboratory experiments. The effect of increasing carbon dioxide is small, much less than natural variation. At the surface, it is difficult to separate the increase in the greenhouse effect from natural variation. Further, the speculated amplification from increased water vapor cannot be found. For over 40 years the US has compiled data on the greenhouse effect itself, supporting the atmospheric temperature trends: increasing carbon dioxide will produce a modest warming. The current warming of the atmosphere is about 0.25 °F per decade since January 1979, roughly 1 °F in total, or about 2.5 °F per century. It falls in the middle of the lowest set of warming estimates currently developed by the IPCC, which assumes little increase in carbon dioxide. This includes
all greenhouse gases and natural variation. It is well within the range of natural historic warming.
Based on observations by NOAA at Mauna Loa, Hawaii, the maximum atmospheric carbon dioxide concentration each year occurs in May. May measurements grew from 339 parts per million by volume (ppm) in 1979 to 419 ppm in 2021, an increase of 80 ppm, or 24%. Yet the increase in atmospheric temperatures from all sources was only about 1 °F. The most appropriate physical evidence does not support the fear that increasing carbon dioxide is causing dangerous warming.
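A quick sketch, using only the figures quoted above, reproduces both numbers:

```python
# Checking the arithmetic cited in the text (NOAA Mauna Loa May maxima
# and the satellite-era atmospheric temperature trend).
co2_1979 = 339.0   # ppm, May 1979
co2_2021 = 419.0   # ppm, May 2021

increase_ppm = co2_2021 - co2_1979              # 80 ppm
increase_pct = increase_ppm / co2_1979 * 100    # ~24%

trend_per_decade = 0.25          # degrees F per decade, from the text
years = 2021 - 1979              # satellite record, January 1979 onward
total_warming = trend_per_decade * years / 10   # ~1 F over the record
per_century = trend_per_decade * 10             # 2.5 F per century

print(f"CO2 rise: {increase_ppm:.0f} ppm ({increase_pct:.0f}%)")
print(f"Warming: ~{total_warming:.1f} F total, {per_century:.1f} F/century")
```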
For increasing carbon dioxide to cause surface warming, the atmosphere must warm at a greater rate than the surface, but the opposite is happening. The probable causes of surface warming are
urbanization, changes in ocean circulations, and solar variations that we do not fully understand, not greenhouse gases. In general, those using surface data to claim dangerous warming ignore such changes. They use models which have never been validated (using physical evidence from the atmosphere) to speculate 30 to 80 years into the future.
Section 4, Problems with Global Climate Models: In his book “Unsettled,” Steven Koonin identifies numerous deficiencies in the IPCC process that need to be addressed, since its findings are used for public policy. Among the more serious deficiencies Koonin discusses are:
A) a confusion between the Celsius and Kelvin scales when estimating the influence of doubling carbon dioxide, resulting in significant error, and
B) IPCC models do not track the warming trend in the surface temperature record between 1910 and 1940.
Koonin also points out the complexity of the climate models, which divide the surface and the atmosphere into various hypothetical boxes called cells. Accurate measurements are needed for all the cells, but the measurements don’t exist. Further, the cells are so large that important weather events may be missed. Most important, the IPCC conclusions are political, not scientific:
“And—a very key point—the IPCC’s ‘Summaries for Policymakers’ are heavily influenced, if not written, by governments that have interests in promoting particular policies. In short, there are many opportunities to corrupt the objectivity of the process and product.”
A book by Japanese climatologist and former NASA researcher Mototaka Nakamura details the deficiencies in Earth’s surface temperature records and considers them unreliable before 1980. Nakamura writes:
“A quasi-global observation system has been operating only for 40 years or so since the advent of artificial satellite observation. Temperature data before then were collected over extremely small (with respect to the Earth’s entire surface area) areas and, thus, have severe spatial bias. We have an inadequate amount of data to calculate the global mean surface temperature trend for the pre-satellite period. This severe spatial bias in reality casts a major uncertainty over the meaningfulness of ‘the global mean surface temperature trend’ before 1980.”
Nakamura also discusses efforts, which failed, to discredit his views. Unlike Koonin, who accepts global mean surface temperature trends before 1980, Nakamura states these trends are highly questionable. For example, for over 2,000 years changes in land use – such as draining wetlands, clearing forests, irrigation, and urbanization – have been recognized to change local climate temperatures. Nakamura states that global surface temperature trends are based on a few, highly localized measurements, and that the entire record on which global climate models are based may be highly biased. Indeed, when the results of global climate models are compared to what is occurring in the atmosphere, where the greenhouse effect occurs, the models greatly overestimate the warming effect of carbon dioxide and other greenhouse gases. Both authors identify major difficulties in the approach used by the UN IPCC and the US government in assessing the effect of greenhouse gases.
The atmospheric temperature effects of greenhouse gases are far less than what the models show.
Christy et al. compared four different satellite datasets, four different weather balloon datasets, and four sets of weather reanalyses with the average of the model simulations used in the Fifth Assessment Report of the IPCC. The researchers found that the models grossly overestimate actual atmospheric temperature trends and that the disparity is increasing. Global climate models may be useful teaching tools, but they are not useful for setting government policy on greenhouse gases.
Section 5, Modern Physical Evidence Supporting an Alternative Analysis of the Greenhouse Effect: During the 20th century great changes occurred in physics, such as relativity and quantum mechanics, which describes the physical properties of nature at the molecular, atomic, and subatomic levels. Quantum mechanics led to the field of physics called Atomic, Molecular, and Optical Physics (AMO), which enabled the development of databases that can be
used to directly calculate the greenhouse effect in the atmosphere.
Using the HITRAN database, AMO authorities W. A. van Wijngaarden and W. Happer have estimated the influence of water vapor, carbon dioxide, ozone, nitrous oxide, and methane in a cloud-free atmosphere on global temperatures. Under current atmospheric conditions, further increases in water vapor and carbon dioxide have a minimal effect on temperatures. At last, we have calculations that agree with measurements of what is occurring in the atmosphere.
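The saturation behavior described above can be illustrated crudely. The simple logarithmic fit ΔF ≈ 5.35 ln(C/C0) is an assumption standing in for the line-by-line HITRAN computation of van Wijngaarden and Happer, but it shows the shape of the effect: each additional 100 ppm adds less forcing than the last:

```python
import math

# Crude illustration of diminishing marginal CO2 forcing using the simple
# logarithmic approximation delta_F = 5.35 * ln(C / C0), Myhre et al. (1998).
# This is NOT the line-by-line HITRAN calculation of van Wijngaarden and
# Happer; it only illustrates why each added increment matters less.
def forcing(c_ppm, c0_ppm=280.0):
    return 5.35 * math.log(c_ppm / c0_ppm)

for start in (280, 380, 480, 580):
    step = forcing(start + 100) - forcing(start)
    print(f"{start} -> {start + 100} ppm: +{step:.2f} W/m^2")
```

Each successive 100 ppm increment yields a smaller forcing step, the logarithmic behavior Tyndall's non-proportionality observation anticipated.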
Section 6, Going Forward: Scientific integrity requires that the Biden Administration employ the most rigorous application of the scientific method. As shown above, observations using 21st-century technology support certain concepts of the 20th century and demonstrate others to be false. Scientific integrity also requires that the administration not use long-range models for policy until the models meet the high standards for verification and validation met by Sandia National Laboratories in modeling the reliability of nuclear weapons, or the standards required by the Apollo team of scientists and engineers for manned lunar exploration.
Since there is no current physical evidence of dangerous global warming from greenhouse gases or their effects, and no physical evidence of a climate crisis, the administration should use
atmospheric temperature trends and the MODTRAN and HITRAN databases to estimate the effects of increasing greenhouse gases in the atmosphere. Further, the government should continue to
• Monitor atmospheric temperatures as has been done for 40 years and
• Monitor outgoing electromagnetic radiation as is being done by the CERES project.
Above all, the Biden Administration should inform the public that there is no current threat, and that it is using the best science possible to monitor the situation to assure that a threat does not develop.
[Note: other papers from SEPP and a weekly review “The Week That Was”, published every Monday, can be found at http://www.sepp.org/ ]