Big Brother and Fake People

It appears that the Federal government, or at least the U.S. Air Force, is seeking computer software to create fake people for propaganda purposes. Federal contract #RTB220610, Persona Management Software:

Software will allow 10 personas per user, replete with background, history, supporting details, and cyber presences that are technically, culturally and geographically consistent. Individual applications will enable an operator to exercise a number of different online persons from the same workstation and without fear of being discovered by sophisticated adversaries. Personas must be able to appear to originate in nearly any part of the world and can interact through conventional online services and social media platforms. The service includes a user friendly application environment to maximize the user’s situational awareness by displaying real-time local information.

The software enables the government to shield its identity through a number of different methods including the ability to assign unique IP addresses to each persona and the ability to make it appear as though the user is posting from other locations around the world.

The intent is to create fake people to post on social network sites for propaganda purposes. So, is this the electronic equivalent of dropping propaganda leaflets, or is it something more ominous? See full story here.

Chicago election officials take note.

This story comes to you from the Ministry of Truth, providing fair & balanced news since 1984.

Is Ford’s New High-Tech Control Dangerous?

“Ford has a better idea” was the company’s slogan in the 1960s. I’m wondering if its current “better idea” is dangerous. I’m referring to a new driver interface in the Ford Edge (also available in the Lincoln).

A touch-screen interface does away with most knobs and buttons. According to Consumer Reports (Feb. 2011):

The driver interface systems use an 8-inch video touch screen in the center of the dashboard, with a panel of touch-sensitive buttons under it. It also includes two 4.2-inch dashboard displays flanking the speedometer that can be configured to show different gauges and perform some of the same functions as the center screen.

If that sounds confusing, it gets worse: The system also recognizes and responds to voice commands. It all adds up to three or four ways to make what should be simple adjustments. None of the options works as well or is as easy to use as old-fashioned knobs and switches, and they can be more time-consuming and distracting to operate.

The center screen’s cluttered pages, tiny buttons, and small fonts make choosing the right spot to touch difficult. The screen can be slow to respond.

Touch-sensitive buttons are designed to respond to a finger tap or swipe across their surface. They look high-tech but tend either to make bigger adjustments than you want or not respond at all – especially if you are wearing gloves. Their small size makes them difficult to find at a glance.

How is use of this interface less distracting than talking or texting on a cell phone? In places where use of cell phones while driving is outlawed, could this car be outlawed? Bring back the knobs and buttons.


Robo-Calls and Push Polls

I do not like robo-calls from political campaigns or any other telemarketers. I know that a call is much less expensive than campaign literature, but a call is an intrusion. And I tend to vote against intruders. I’m also wise to the trick of push polls. A push poll is one in which, under the guise of opinion polling, disinformation about a candidate or issue is planted in the minds of those being ‘surveyed.’ Push polls are designed to shape, rather than measure, public opinion.

As for other telemarketers, I have a policy of not doing business with any company that makes cold calls to me. And I always report them to the FTC. (You can file a complaint on the FTC’s website.)

So politicians, do not call. Send your campaign literature if you must; it makes good compost.

Biosphere 2 Ready for New Research

Biosphere 2, that grand experiment with a checkered history, is being readied for new research conducted by the University of Arizona. Tuesday evening, Dr. Travis Huxman discussed plans for the facility with a group of about 30 people at the Cushing Street Bar.

Huxman, who holds a doctorate in biological sciences and is an associate professor of ecology and evolutionary biology at the U of A, is the new director of the Biosphere 2 research program.

For those who may be unfamiliar with Biosphere 2, here is some background. The concept was to construct a self-contained biosphere to investigate what would be needed to colonize other planets, such as Mars. The main structure, built near the town of Oracle, AZ, is a 3.15-acre greenhouse that was to be a self-sustaining ecosystem containing several plant biomes and an “ocean” to grow fish. The facility was built with $150 million in private funds in the late 1980s.

In September of 1991, a group of “biospherians” (four men and four women) entered the greenhouse for a planned two-year stay. It was intended that they depend only on what was inside the enclosure. As noted in a Wikipedia article: “All seven ecosystems of Earth exist within the confines of Biosphere II. They are a rainforest, a desert, a savanah, a marsh, a farmland (in an area called the Intensive Agriculture Biome), and a ‘human habitat’.” [I guess the ocean makes seven.] “Thus, it contains soil, air, water, animals, and plants. About 4,000 plants and animals were introduced to Biosphere II, and the ocean contained 900,000 gallons of water. It was hoped that these provisions would give the ecosystems enough material to be self-sustaining.”

As with many experiments, things didn’t go as planned. One of the main problems was that the organic-rich soil consumed too much oxygen. The original oxygen content of 20.9% dropped to 14.5% after 18 months. That’s the equivalent of an altitude of 13,400 feet, and the biospherians suffered from high-altitude effects. Because they were in a greenhouse, the daily fluctuation of carbon dioxide was about 600 ppm (the current atmospheric concentration is about 390 ppm). During the day, with strong sunlight, plants revved up photosynthesis and drew down carbon dioxide, but respired it back at night. There was also a seasonal variation in carbon dioxide, and wintertime levels reached about 4,000 ppm.
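The oxygen-to-altitude equivalence can be sketched with the standard-atmosphere pressure profile. This is only a back-of-envelope check, not the study’s own calculation; the function name and the use of the ICAO pressure formula are my assumptions.

```python
def equivalent_altitude_m(o2_fraction, sea_level_fraction=0.209):
    """Altitude (in meters) at which normal air supplies the same
    oxygen partial pressure that `o2_fraction` supplies at sea level,
    using the ICAO standard-atmosphere pressure profile:
        P/P0 = (1 - 2.25577e-5 * h) ** 5.25588
    """
    ratio = o2_fraction / sea_level_fraction   # required pressure ratio
    return (1.0 - ratio ** (1.0 / 5.25588)) / 2.25577e-5

h = equivalent_altitude_m(0.145)
print(f"{h:.0f} m ({h * 3.28084:.0f} ft)")
```

This simple pressure-ratio model gives roughly 3,000 m (about 9,800 feet); physiological equivalence calculations use different conventions, which presumably accounts for the 13,400-foot figure reported by the crew.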

This first phase ended in September 1993, as planned. After a six-month transition, another group of seven people entered the greenhouse, but injuries and social problems caused abandonment of the project in 1994.

Columbia University took over in 1995 and operated the facility until 2003. Columbia “broke the seal” and formed a flow-through system to test effects of carbon dioxide among other things.

Through all of this, the facility was open for tours and derived much of its operating revenue from visitors. By 2006 the property was zoned for urban development, and in 2007 it was sold to a developer who planned houses and a resort hotel. However, the University of Arizona took over management responsibilities in June 2007. And that brings us back to Huxman.

Huxman said that U of A research will “focus on environmental challenges of the day.” And by that he meant they would study initially, at least, the relationship between carbon, water, and energy, essentially photosynthesis, and how it can be applied to current issues.

Huxman mentioned solar power and the smart grid system, since Biosphere 2 gets some of its electricity from solar collectors. He said that with a smart grid, the power company can remotely shut down an individual’s solar system that feeds power into the common grid, in order to protect workers doing repairs on the lines. Biosphere 2’s solar installation will not participate in the smart grid, so such shutdowns won’t interrupt experiments; this will allow researchers to better control variables while still testing software that manages smart grids.

Huxman says that under U of A management, Biosphere 2 will be more committed to the relationship between science and society, and that even now visitors can watch graduate students conducting experiments.

One of the planned projects is to build a model of a watershed to study the dynamics of how water gets to plants and how soil structures evolve. He wants to know how water gets into the aquifers. (A geologist could tell him that most aquifer recharge occurs at the mountain front.) After the “naive” model is working, they will introduce plants to see how that changes the soil structure. Once they learn from the model, they plan to try it outside in the real world.

They will also study ways to stabilize mine tailings.

Who is paying for all this? According to Huxman, major funding is coming from the facility owners and foundations. Much of the operating budget will come from visitor admissions; a minor part comes from the University and from corporations.

Will they be successful? Only time will tell. You can visit Biosphere 2; information is available online, by email, or by calling 520-838-6200. Currently, admission is $20 for adults, with lower prices for seniors and children.

And, by the way, the Cushing Street Bar has Guinness on tap.

Another Federal Boondoggle?

“Obama’s federal government can weatherize your home for only $57,362 each.” That was the headline of a Los Angeles Times story yesterday, based on numbers from the GAO. Nobel Prize-winning energy secretary Steven Chu blames government red tape. Who would have thought it? The Energy Department disputes the GAO figures.


Climategate: The Plot Thickens

Here’s some of the news that our Main Stream Media didn’t report.

Russian IEA claims CRU tampered with climate data – cherry-picked warmest stations

“On Tuesday, the Moscow-based Institute of Economic Analysis (IEA) issued a report claiming that the Hadley Center for Climate Change based at the headquarters of the British Meteorological Office in Exeter (Devon, England) had probably tampered with Russian-climate data. The IEA believes that Russian meteorological-station data did not substantiate the anthropogenic global-warming theory. Analysts say Russian meteorological stations cover most of the country’s territory, and that the Hadley Center had used data submitted by only 25% of such stations in its reports.”

Source: Scroll about halfway down the page.

Global weather dataset being systematically corrupted

“For the past six days, several climate scientists have discovered an alarming trend: clear evidence of alteration of historical data at weather stations around the world, in order to support the contention of anthropogenic global warming (AGW). The changes appear to affect the Global Historical Climate Network (GHCN), a project of the National Oceanographic and Atmospheric Administration’s National Climate Data Center.”


Antarctic GHCN uses single warmest station instead of whole dataset

“Of all the stations available in the antarctic, GHCN has chosen to use a single station on the Antarctic Peninsula to represent an entire continent of the earth for the past 17 years. But it’s not just any station, it’s a special one. Rothera Point has the single highest trend of any of the adjusted station data.”


Computer programmer makes case that release of files from CRU was an inside job.


How Wikipedia’s green doctor rewrote 5,428 climate articles


Politicians take note: It could be that all the sound and fury over climate change is based on bad data. For traders in carbon credits: the house of cards is beginning to fall and your market may be the next multi-billion dollar bubble to burst.

Feedback from a Vested Interest

In my previous post on the global warming industry, I mentioned the names of several companies that I thought had a vested interest in maintaining the myth that carbon dioxide is a major driver of temperature.

Yesterday, I received some feedback from one of those companies: “I represent Hara and wanted to clarify a sentence you wrote: ‘Al Gore’s venture capital firm, Hara Software, which makes software to track greenhouse gas emissions, stands to make billions of dollars from cap-and-trade regulation.’ My intention is not to dispute your opinion, but rather to make clear a fact: Hara is one of Kleiner Perkins Caufield & Byers portfolio companies. Al Gore is a partner at KPCB, but Hara is not his VC firm.”

The KPCB website says, “Think of it as relationship and venture capital.” So KPCB is a venture capital organization, and Al Gore is a partner. The Hara website says, “Hara was originally funded in 2008 by Kleiner Perkins Caufield & Byers.” That means to me that Hara is a firm founded by a venture capital investment from Al Gore and his partners. Glad we cleared that up.

Hara sells, among other things, software to track greenhouse gas emissions. From the Hara website I learned that the City of Palo Alto, California, has a “Climate Protection Team” and a “Sustainability Team,” and that with Hara software “each employee can enter commute and other data that impact overall City emissions.” I bet the city employees love that.

Note: The City of Tucson has a Climate Change Advisory Committee. Maybe they are potential customers for Hara.

Upon looking at KPCB’s website, under “initiatives” I found this statement: “At the same time we face climate crisis.” Recent events show the “crisis” is manufactured; see here and here.

KPCB’s next statement, “Atmospheric CO2 levels are at an all-time high, with accelerating growth,” is wrong on two counts. Atmospheric carbon dioxide has, for most of the history of this planet, been more than 10 times the current level. And according to the NOAA Mauna Loa Observatory, carbon dioxide levels are increasing but not accelerating. See graphs below.
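The distinction between increasing and accelerating is testable: fit a quadratic to the series and inspect the second-order coefficient. A minimal sketch of that test, using made-up illustrative numbers rather than actual Mauna Loa data:

```python
import numpy as np

def curvature(years, values):
    """Second-order coefficient of a quadratic least-squares fit.
    Positive => growth is accelerating; near zero => steady growth."""
    x = np.asarray(years, dtype=float)
    x -= x.mean()                     # center for a well-conditioned fit
    return np.polyfit(x, values, 2)[0]

# Hypothetical series rising a constant 2 ppm/yr (illustrative only)
years = np.arange(2000, 2010)
steady = 370.0 + 2.0 * (years - 2000)
print(curvature(years, steady))       # ~0: rising but not accelerating
```

A genuinely accelerating series would return a clearly positive coefficient from the same test.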
Slack Lines Cause Power Outages

On Tuesday, Oct. 27, my westside neighborhood experienced approximately 30 short power outages during the very windy day and evening. Each would last from 30 seconds to five minutes. That made work on a desktop computer very difficult. By the way, my neighborhood has all underground utilities.

I inquired of TEP to see what was happening and what they were doing about it.

This morning I received a call from TEP spokesman Joseph Barrios. He said that the overhead route leading into the neighborhood carries both high-voltage transmission lines and lower-voltage distribution lines. Slack in the lines caused them to touch and short out, tripping circuit breakers. The breakers would automatically reset.

Barrios said that if this happens more than four times, the TEP crews search out the cause. Apparently over time, the lines stretch, resulting in too much slack. Barrios said that crews have now tightened the lines and installed “separators” which hopefully will prevent future problems, at least for a while.
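Barrios’s description matches how automatic reclosers generally work: momentary faults clear on their own, and only a persistent fault escalates to a crew. A toy sketch of that logic, where the more-than-four threshold comes from his description but the rolling 24-hour window is my assumption:

```python
from collections import deque
from datetime import datetime, timedelta

class Recloser:
    """Toy model of an auto-reclosing breaker: momentary faults clear
    automatically, and more than `threshold` trips within `window`
    flags the circuit for manual inspection by a crew."""
    def __init__(self, threshold=4, window=timedelta(hours=24)):
        self.threshold = threshold
        self.window = window
        self.trips = deque()

    def record_trip(self, when):
        self.trips.append(when)
        # drop trips that have fallen out of the rolling window
        while self.trips and when - self.trips[0] > self.window:
            self.trips.popleft()

    def needs_crew(self):
        return len(self.trips) > self.threshold

r = Recloser()
start = datetime(2009, 10, 27, 8, 0)
for i in range(30):                    # ~30 momentary outages that day
    r.record_trip(start + timedelta(minutes=20 * i))
print(r.needs_crew())  # True: well past the threshold, dispatch a crew
```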

Although the power outages were inconvenient, I appreciate that TEP crews got to work to identify the problem and that Mr. Barrios telephoned me with an explanation.

Arctic Temperatures: Not So Hot

A new study claims that Arctic temperatures have risen 2.2 degrees Fahrenheit over the last decade, bringing to an end a 2,000-year cooling trend. The study authors claim that human CO2 emissions are the cause.


The authors claim: “Our reconstruction shows that the last half-century was the warmest of the last 2,000 years. Not only was it the warmest, but it reversed the long-term, millennial-scale trend toward cooler temperatures. The cooling coincided with the slow and well-known cycle in Earth’s orbit around the sun, and it should have continued through the 20th century.” “The evidence was found by generating a 2,000-year-long reconstruction of Arctic summer temperature using natural archives of climate change from tree rings, glacier ice and mostly from lake sediments from across the Arctic, a region that responds sensitively to global changes.”

Why did they use proxy data for the last 100 years when they could have just looked at thermometer records? Oh, but thermometry shows that it was warmer in the 1930s and 1940s.

The new study presents a curve reminiscent of the thoroughly debunked “Hockey Stick” of Michael Mann. The new proxy reconstruction fails to show the well-documented Medieval Warm Period of 1,200 years ago, when temperatures were higher than now. It appears that the authors of the new study are using the same statistical malfeasance and cherry-picking of data that were used for the old hockey stick.

Steve McIntyre of Climate Audit discusses the new study. “The problem with these sorts of studies is that no class of proxy (tree ring, ice core isotopes) is unambiguously correlated to temperature and, over and over again, authors pick proxies that confirm their bias and discard proxies that do not.”
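McIntyre’s point about proxy selection is a known statistical pitfall, sometimes called the screening fallacy: if you generate many series of pure noise and keep only those that happen to correlate with a rising calibration record, their average reproduces the rise even though no series contains a real signal. A minimal simulation of the effect, with entirely synthetic data (nothing here comes from any real proxy network):

```python
import numpy as np

rng = np.random.default_rng(0)
n_proxies, length, calib = 1000, 500, 100  # last 100 steps = "calibration"

# Pure-noise "proxies": random walks containing no climate signal
proxies = rng.normal(size=(n_proxies, length)).cumsum(axis=1)

# A rising instrumental record over the calibration period
target = np.linspace(0.0, 1.0, calib)

# Screening step: keep only proxies that correlate with the rising record
corrs = np.array([np.corrcoef(p[-calib:], target)[0, 1] for p in proxies])
selected = proxies[corrs > 0.5]

# The composite of the "good" proxies rises over the calibration window
# even though every series is pure noise
composite = selected.mean(axis=0)
print(f"kept {selected.shape[0]} of {n_proxies} proxies")
print(f"composite rise over calibration window: "
      f"{composite[-1] - composite[-calib]:.2f}")
```

Outside the calibration window the noise averages out, so the composite is comparatively flat before the screened period, giving a hockey-stick shape from nothing.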

Records from the Danish Meteorological Institute show no warming since 1958 and that the 2009 temperature variation is almost identical to 1958. DMI says that the Arctic was warmer in the 1940s than now.

A Duke University-led analysis of available records shows that while the North Atlantic Ocean’s surface waters warmed in the 50 years between 1950 and 2000, the sub-polar regions cooled at the same time that subtropical and tropical waters warmed. This pattern can be explained largely by the influence of a natural and cyclical wind circulation pattern called the North Atlantic Oscillation (NAO).

A 2008 study by Håkan Grudd of Stockholm University’s Department of Physical Geography and Quaternary Geology, found that “The late-twentieth century is not exceptionally warm in the new Torneträsk record: On decadal-to-century timescales, periods around AD 750, 1000, 1400, and 1750 were all equally warm, or warmer. The warmest summers in this new reconstruction occur in a 200-year period centred on AD 1000. A ‘Medieval Warm Period’ is supported by other paleoclimate evidence from northern Fennoscandia.”

Besides the controversy over temperatures, there is also media attention given to Arctic sea ice extent. For instance, news media made much of the fact that during the summer of 2007, Northern Hemisphere sea ice area was at a historic minimum (2.92 million sq. km). What was little reported, however, was that in 2007, Southern Hemisphere sea ice extent broke the previous maximum record of 16.03 million sq. km and reached 16.26 million sq. km. (August, 2007). [Source: The Cryosphere Today, a publication of The Polar Research Group, University of Illinois]

To put things in further perspective, consider these reports:

“A considerable change of climate inexplicable at present to us must have taken place in the Circumpolar Regions, by which the severity of the cold that has for centuries past enclosed the seas in the high northern latitudes in an impenetrable barrier of ice has been, during the last two years, greatly abated.”

“2000 square leagues [approximately 14,000 square miles] of ice with which the Greenland Seas between the latitudes of 74 and 80 N have been hitherto covered, has in the last two years entirely disappeared.”

These paragraphs, however, are not the latest scare story from the greenhouse industry, but extracts from a letter by the President of the Royal Society addressed to the British Admiralty, written in 1817 (Royal Society, London. Nov. 20, 1817. Minutes of Council, Vol. 8. pp.149-153).

When this report was written, 192 years ago, the planet was in the midst of the Little Ice Age. How could the ice disappear during a Little Ice Age?

There is also the following story:


“The Arctic ocean is warming up, icebergs are growing scarcer and in some places the seals are finding the waters too hot, according to a report to the Commerce Department yesterday from Consul Ifft, at Bergen , Norway .
Reports from fishermen, seal hunters and explorers, he declared, all point to a radical change in climatic conditions and hitherto unheard-of temperatures in the Arctic zone. Exploration expeditions report that scarcely any ice has been met with as far north as 81 degrees 29 minutes. Soundings to a depth of 3,100 meters showed the gulf stream still very warm.
Great masses of ice have been replaced by moraines of earth and stones, while at many points well known glaciers have entirely disappeared. Very few seals and no white fish are being found in the eastern Arctic, while vast shoals of herring and smelts, which have never before ventured so far north, are being encountered in the old seal fishing grounds.”
This is from an AP story that appeared in the Washington Post, November 2, 1922.

Could it be that carbon dioxide and global warming have nothing to do with it? Well, yes.

A study conducted by NASA’s Jet Propulsion Laboratory says that unusual winds caused the 2007 Arctic minimum. Its press release says:

“Unusual atmospheric conditions set up wind patterns that compressed the sea ice, loaded it into the Transpolar Drift Stream and then sped its flow out of the Arctic. When that sea ice reached lower latitudes, it rapidly melted in the warmer waters.”

“The winds causing this trend in ice reduction were set up by an unusual pattern of atmospheric pressure that began at the beginning of this century.”

The fact that a 192-year-old report on Arctic ice is very similar to one today lends credence to the contention that changes in ice cover are natural cyclic phenomena and not due to the increase in atmospheric carbon dioxide. AccuWeather says the changes in wind may be due to changes in the Arctic Oscillation (AO) and the North Atlantic Oscillation (NAO), which are large atmospheric circulations that have major impacts on the weather in certain parts of the world.

Perhaps reporters should do some investigation so they can report all of the news and put things in perspective. Ah, but only sensational headlines sell papers.

Oceans Warmer?

A front-page story in the Arizona Daily Star today (9-21-09) proclaims “World’s Oceans Warmer Than Ever.” Well, not exactly; it depends on which data set you are reading. The graph below, from NOAA (ERSST.v3b), shows that past July ocean temperatures have exceeded current values.

To see an analysis of this Associated Press story, see