Wednesday 9 December 2009

That Met Office Data In Full...

The Met Office has released some land surface climate station records.

"The data subset consists of a network of individual land stations that has been designated by the World Meteorological Organization for use in climate monitoring. The data show monthly average temperature values for over 1,500 land stations."
But without the full station list, we can't verify that this subset is either correct or complete against their selection criteria.
"The data that we are providing is the database used to produce the global temperature series. Some of these data are the original underlying observations and some are observations adjusted to account for non-climatic influences, for example changes in observation methods."
So it's not the raw data, which means we can't verify that the adjustments were correctly made.
"The data set of temperatures back to 1850 was largely compiled in the 1980s, when it was technically difficult and expensive to keep multiple copies of the database."
The unzipped database is 33 MB in size. You could comfortably fit that on a single mag tape.
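For scale, here's the back-of-the-envelope arithmetic. A standard 2400-foot 9-track tape written at 6250 bpi holds somewhere north of 100 MB; the exact figure depends on the block size you assume, because every block is separated by an inter-record gap. The block size and gap length below are typical assumed values, not figures from the Met Office release:

```python
# Rough capacity of a 2400 ft, 9-track magnetic tape at 6250 bpi.
# Block size and inter-record gap are assumed typical values; real
# usable capacity varies with how the data were blocked.

TAPE_LENGTH_FT = 2400
BPI = 6250                # bytes per inch (9-track writes one byte per frame)
BLOCK_SIZE = 32768        # assume 32 KB blocks
GAP_INCHES = 0.3          # inter-record gap between blocks

tape_inches = TAPE_LENGTH_FT * 12
inches_per_block = BLOCK_SIZE / BPI + GAP_INCHES
capacity_bytes = (tape_inches / inches_per_block) * BLOCK_SIZE
print(f"~{capacity_bytes / 1e6:.0f} MB per tape")  # roughly 170 MB

DATABASE_MB = 33
print(f"database fills ~{DATABASE_MB * 1e6 / capacity_bytes:.0%} of one tape")
```

On those assumptions the whole database occupies well under a quarter of a single reel, which is the point: storage cost is not a plausible reason to discard the originals.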

Let's be clear about this from a data-processing perspective: you don't delete raw data. You keep it, because then you can always reconstruct everything derived from it. Mag tapes weren't so expensive that someone couldn't afford to keep all the data on one.
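The point is easy to make concrete. If both the raw observations and the adjustment program are archived, anyone can regenerate the adjusted series and check it against what was published. The station IDs, values, and the adjustment step below are invented for illustration, not taken from the actual database:

```python
# Hypothetical sketch: with raw data and the adjustment code archived,
# the published (adjusted) series is reproducible and therefore checkable.
# All station IDs, temperatures, and the adjustment itself are made up.

raw = {
    "037401": [3.2, 4.1, 6.8],     # hypothetical monthly means, deg C
    "724940": [9.5, 10.2, 11.7],
}

def adjust(series, offset=0.5):
    """Stand-in for a homogenisation step, e.g. correcting for a
    documented change in observation time or instrumentation."""
    return [round(t + offset, 1) for t in series]

# What gets released: the adjusted database.
published = {sid: adjust(series) for sid, series in raw.items()}

# Verification is then trivial: re-run the adjustment on the raw data
# and compare. Without `raw` and `adjust`, this check is impossible --
# which is exactly the situation the release leaves us in.
for sid in raw:
    assert adjust(raw[sid]) == published[sid]
print("adjusted series reproduced from raw data")
```

Release only `published`, and the loop at the bottom cannot be run by anyone outside; that is the whole complaint in three lines of code.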

"This is not a new data set. Data sets are only released when they have gone through the proper process of scientific review."

Can someone explain this? We've got altered temperatures, yet no original data and no program code for the alterations. How, then, was this peer-reviewed?
