USA Surface Stations


Two years ago I tackled the subject of the reliability of the published USA average surface temperature record, which has been maintained through 1,200-plus stations for over a hundred years. A poster on a climate forum had wrapped himself in the mantle of authority to proclaim that this had all been rigorously audited and could be accepted as such. I was looking for some actual work on the subject, because everyone was relying on these numbers. I was never satisfied at the time, and I came across some pretty bad examples that reinforced my concern that the impact of urbanization was huge and not accounted for.

Next month's Analog magazine tackles the problem through Koonstra, who goes to a site organized a couple of years back (about the time I was getting frustrated) for hard data:

http://www.surfacestations.org/

It is really pretty awful. I thought that we faced a modest erosion of data quality. It looks more like a landslide, and plausibly ninety percent of the data is compromised. The system has never been subjected to a scientific audit. In fact, it is so bad that it is necessary to build a parallel monitoring station for each and every extant site, in a nearby location that properly replicates the original conditions. This would then allow a decade of parallel data collection to take place and give us a present-day measure of the bias.

It will not chart the historical development of the bias, but it will develop subsets that are naturally credible. The Midwest and Great Plains should be easy to tighten up and properly correct, while I would not even want to try it for many mountain sites. There is a lot of expensive work to be done to get the data properly back to being reliable.
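As a rough illustration of what that parallel data collection would yield, here is a minimal sketch in Python, assuming daily mean temperatures are available from a legacy station and its well-sited companion over the same overlap period. The station readings and values are hypothetical, not real data.

def estimate_bias(legacy_temps, reference_temps):
    """Mean difference (legacy - reference) over the overlap period, in deg C."""
    if len(legacy_temps) != len(reference_temps):
        raise ValueError("series must cover the same dates")
    diffs = [l - r for l, r in zip(legacy_temps, reference_temps)]
    return sum(diffs) / len(diffs)

# Hypothetical daily means (deg C) for the same week at both sites.
legacy = [14.2, 15.1, 16.0, 15.4, 14.8, 15.9, 16.3]     # badly sited station
reference = [13.1, 14.0, 15.2, 14.5, 13.9, 14.8, 15.5]  # well-sited companion

bias = estimate_bias(legacy, reference)
print(f"Estimated present-day warm bias: {bias:+.2f} C")

A real effort would of course use the full decade of observations and check the bias season by season, but the basic idea is just this paired comparison.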

The evaluation that was prepared can be downloaded and includes excellent maps that show just how bad it is. I quote from the summary as follows:

During the past few years I recruited a team of more than 650 volunteers to visually inspect and photographically document more than 860 of these temperature stations. We were shocked by what we found.

We found stations located next to the exhaust fans of air conditioning units, surrounded by asphalt parking lots and roads, on blistering-hot rooftops, and near sidewalks and buildings that absorb and radiate heat. We found 68 stations located at wastewater treatment plants, where the process of waste digestion causes temperatures to be higher than in surrounding areas.

In fact, we found that 89 percent of the stations – nearly 9 of every 10 – fail to meet the National Weather Service’s own siting requirements that stations must be 30 meters (about 100 feet) or more away from an artificial heating or radiating/reflecting heat source.

In other words, 9 of every 10 stations are likely reporting higher or rising temperatures because they are badly sited. It gets worse. We observed that changes in the technology of temperature stations over time also has caused them to report a false warming trend. We found major gaps in the data record that were filled in with data from nearby sites, a practice that propagates and compounds errors. We found that adjustments to the data by both NOAA and another government agency, NASA, cause recent temperatures to look even higher.

The conclusion is inescapable: The U.S. temperature record is unreliable.

The errors in the record exceed by a wide margin the purported rise in temperature of 0.7°C (about 1.2°F) during the twentieth century. Consequently, this record should not be cited as evidence of any trend in temperature that may have occurred across the U.S. during the past century. Since the U.S. record is thought to be “the best in the world,” it follows that the global database is likely similarly compromised and unreliable.

The report can be downloaded here:

http://wattsupwiththat.files.wordpress.com/2009/05/surfacestationsreport_spring09.pdf


Once we know how badly a given site is biased, it becomes possible to isolate that site, and through historical investigation it may be possible to correct some sites to one degree or another by establishing the dates of the changes that affected the original location. I am not too optimistic about this, but the good subsets and a firm end point should give us a corrective factor that brings the data back into line.
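To make that concrete, here is a minimal sketch of the kind of correction I have in mind, assuming a single documented change date and a bias already estimated from the parallel record. The dates, values, and the single-step assumption are all hypothetical.

from datetime import date

def correct_series(records, change_date, bias):
    """Subtract `bias` (deg C) from every reading on or after `change_date`.

    records: list of (date, temperature) tuples in chronological order.
    """
    return [(d, t - bias if d >= change_date else t) for d, t in records]

# Hypothetical annual means (deg C) around a 1995 parking-lot installation.
raw = [
    (date(1993, 1, 1), 11.2),
    (date(1994, 1, 1), 11.3),
    (date(1995, 1, 1), 12.1),  # asphalt laid next to the sensor this year
    (date(1996, 1, 1), 12.2),
    (date(1997, 1, 1), 12.0),
]

corrected = correct_series(raw, change_date=date(1995, 1, 1), bias=0.9)
for d, t in corrected:
    print(d.year, round(t, 1))

Real sites will rarely be this tidy; many have several undocumented changes, which is exactly why the historical investigation matters.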

At this point, I do not think that we are going to find a temperature shift that is in excess of normal experimental error. Climate did change in the past century, but only within its normal range. Yet this data was central in the drive to find a theory of causation. In short, the conjecture of anthropogenic global warming may be a response to nonexistent data.

This was supposed to be the best data available on the globe. OMG.
