Sunday, March 24, 2019

Climate data is the "This is Rachel from cardholder services . . . " of science.

From Bring the Proxies Up to Date!! by Steve McIntyre.

One of my longstanding criticisms of climate alarmism is that the underlying data sets are such a kluge. As a management consultant who has performed many IT system implementations and conversions over the years, I know that data quality, data consistency, and data constancy are critical elements for the smooth management of incredibly complex global enterprises. I add data genealogy to that list: where did the data originate, what has been done to it, and what has been the chain of custody?

If you ask what would give us a good temperature data set for understanding climate change (and its sources), you would probably answer, at a minimum (a code sketch of such checks follows the list):
A standard technology for measuring the temperature across the vertical (troposphere to surface to oceanic depths)

A standard technology for measuring the temperature across the territory, i.e. a uniform network of measurements across the entirety of the globe.

A common degree of accuracy (one degree, point one degree, point zero one degree?)

A clear documentation record of raw data adjustments (why, how, how much, etc.)

A record that stretches uniformly across centuries and millennia.

A record that is publicly available to all.
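
To make the list concrete: below is a minimal sketch, in Python, of the kind of automated checks a data-conversion project would run against such a data set. The schema (station readings carrying an instrument type, a stated precision, a provenance field, and an adjustment log) is entirely hypothetical, chosen only to illustrate the checks; it is not any real climate archive's format.

```python
from dataclasses import dataclass, field

# Hypothetical record schema: each reading carries its measurement technology,
# stated precision, provenance (data genealogy), and a log of raw-data edits.
@dataclass
class Reading:
    station_id: str
    year: int
    temp_c: float
    instrument: str                                   # measurement technology
    precision_c: float                                # stated precision, deg C
    source: str                                       # where the record originated
    adjustments: list = field(default_factory=list)   # documented raw-data edits

def audit(readings, required_precision_c=0.1):
    """Flag records that fail the minimum data-quality criteria above."""
    problems = []
    for r in readings:
        if r.precision_c > required_precision_c:
            problems.append(f"{r.station_id}/{r.year}: precision {r.precision_c} C too coarse")
        if not r.source:
            problems.append(f"{r.station_id}/{r.year}: no provenance recorded")
        for adj in r.adjustments:                     # every adjustment needs a why/how/how much
            if not adj.get("reason"):
                problems.append(f"{r.station_id}/{r.year}: undocumented adjustment {adj}")
    years = sorted({r.year for r in readings})        # uniform record across time?
    gaps = [y for y in range(years[0], years[-1] + 1) if y not in years] if years else []
    if gaps:
        problems.append(f"coverage gaps in years: {gaps}")
    return problems

readings = [
    Reading("STN-001", 1979, 14.2, "mercury", 0.5, "national met service",
            adjustments=[{"delta_c": -0.3}]),         # edit with no documented reason
    Reading("STN-001", 1981, 14.6, "MMTS", 0.1, "national met service"),
]
for p in audit(readings):
    print(p)
```

Nothing in that sketch is exotic; it is the sort of audit any enterprise conversion runs as a matter of course, which is exactly the point.
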
But we are a long way from that minimum, for many legitimate (and also illegitimate) reasons. That is nobody's fault, but it raises concerns about just how much reliance can be placed on the forecasts of klugey models using even more klugey data. Consider what we actually have:
We use multiple proxies at different points in time (ice cores, pollen, tree-rings, etc.).

We don't understand the sensitivities of those proxies, how they correlate with one another, and how they evolve with evolving environmental circumstances.

We use different modern technologies which are not well calibrated with one another.

We use technologies with different degrees of precision.

Prior to 1970 the record consists of multiple technologies with no global calibration.

Prior to 1970 the record is geographically patchy, with some areas oversampled and other (large to immense) geographic areas not sampled at all for long stretches of time (a toy illustration of this follows the list).

Proxies typically come as small samples, over small time frames, in a very small set of locations.

Records of raw data adjustments are poor or non-existent.

Many data sets have limited public availability.
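
The patchiness point deserves a number attached to it. Here is a toy Python illustration, with invented anomaly values, of how a "global" mean computed from an unevenly sampled station network moves depending on how the sampled latitudes are weighted, and how no weighting scheme can conjure up data for the regions that were never sampled.

```python
import math

# Toy illustration (invented numbers): tropical latitudes oversampled,
# high latitudes nearly unsampled, southern hemisphere absent entirely.
# Stations are (latitude_degrees, temperature_anomaly_C).
stations = [(5, 0.9), (10, 0.8), (15, 0.9), (20, 0.7),  # four tropical stations
            (60, 0.1)]                                  # one high-latitude station

naive = sum(a for _, a in stations) / len(stations)

# Weight each station by cos(latitude), a crude stand-in for the surface
# area of the latitude band it represents.
weights = [math.cos(math.radians(lat)) for lat, _ in stations]
weighted = sum(w * a for w, (_, a) in zip(weights, stations)) / sum(weights)

print(f"naive mean:    {naive:+.2f} C")   # dominated by the oversampled tropics
print(f"weighted mean: {weighted:+.2f} C")
# Neither number says anything about the half of the globe with no stations.
```

The two means differ, and neither is trustworthy: weighting can rebalance oversampled regions, but it cannot manufacture observations for the unsampled ones.
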
It is not that there definitely is global warming or definitely not global warming. The only thing that is definite is that we do not have adequate data sets to make any firm inferences.

In this post, McIntyre raises many of these issues, both in the piece itself and in the commentary attached to it. He is focused on a simple question: what has happened to the historical proxies, such as tree rings, since the 1980s (when proxies were replaced with satellite data)?
I will make here a very simple suggestion: if IPCC or others want to use “multiproxy” reconstructions of world temperature for policy purposes, stop using data ending in 1980 and bring the proxies up-to-date. I would appreciate comments on this note as I think that I will pursue the matter with policymakers.

Let’s see how they perform in the warm 1990s – which should be an ideal period to show the merit of the proxies. I do not believe that any responsible policy-maker can base policy, even in part, on the continued use of obsolete data ending in 1980, when the cost of bringing the data up-to-date is inconsequential compared to Kyoto costs.

For example, in Mann’s famous hockey stick graph, as presented to policymakers and to the public, the graph used Mann’s reconstruction from proxies up to 1980 and instrumental temperatures (here, as in other similar studies, using Jones’ more lurid CRU surface history rather than the more moderate increases shown by satellite measurements). Usually (but not always), a different color is used for the instrumental portion, but, from a promotional point of view, the juxtaposition of the two series achieves the desired promotional effect. (In mining promotions, where there is considerable community experience with promotional graphics and statistics, securities commissions prohibit the adding together of proven ore reserves and inferred ore reserves – a policy which deserves a little reflection in the context of IPCC studies.)
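
To see what McIntyre is describing, here is a sketch in Python with synthetic, invented series; it is not Mann's data or method, only an illustration of the juxtaposition: a proxy reconstruction drawn up to 1980 with an instrumental series appended in a different color.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic, invented series: a flat, noisy proxy reconstruction ending in
# 1980, and a sharply rising instrumental record overlaid on its tail.
rng = np.random.default_rng(0)
proxy_years = np.arange(1000, 1981)
proxy = rng.normal(0.0, 0.1, proxy_years.size)          # reconstruction to 1980
instr_years = np.arange(1900, 1999)
instr = 0.008 * (instr_years - 1900) + rng.normal(0, 0.05, instr_years.size)

plt.plot(proxy_years, proxy, color="steelblue", label="proxy reconstruction (ends 1980)")
plt.plot(instr_years, instr, color="firebrick", label="instrumental record")
plt.axvline(1980, ls="--", color="gray")                # where the proxies stop
plt.ylabel("temperature anomaly (C)")
plt.legend()
plt.show()
```

The visual impression of the combined chart comes from the red series; whether the blue series would have tracked it after 1980 is exactly the question the chart cannot answer.
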
It's a great question. Apparently, nobody has looked.

McIntyre is pretty outraged, as is warranted, by the alarmist community's dishonesty:
One of the first questions that occurs to any civilian becoming familiar with these studies (and it was one of my first questions) is: what happens to the proxies after 1980? Given the presumed warmth of the 1990s, and especially 1998 (the “warmest year in the millennium”), you’d think that the proxy values would be off the chart. In effect, the last 25 years have provided an ideal opportunity to validate the usefulness of proxies and, especially, the opportunity to test the confidence intervals of these studies, put forward with such assurance by the multiproxy proponents. What happens to the proxies used in MBH99 or Moberg et al [2005] or Crowley and Lowery [2000] in the 1990s and, especially, 1998?

This question about proxies after 1980 was posed by a civilian to Mann in December at realclimate. Mann replied:
Most reconstructions only extend through about 1980 because the vast majority of tree-ring, coral, and ice core records currently available in the public domain do not extend into the most recent decades. While paleoclimatologists are attempting to update many important proxy records to the present, this is a costly, and labor-intensive activity, often requiring expensive field campaigns that involve traveling with heavy equipment to difficult-to-reach locations (such as high-elevation or remote polar sites). For historical reasons, many of the important records were obtained in the 1970s and 1980s and have yet to be updated. [my bold]
Pause and think about this response. Think about the costs of Kyoto and then think again about this answer. Think about the billions spent on climate research and then try to explain to me why we need to rely on “important records” obtained in the 1970s. Far more money has been spent on climate research in the last decade than in the 1970s. Why are we still relying on obsolete proxy data?

As someone with actual experience in the mineral exploration business, which also involves “expensive field campaigns that involve traveling with heavy equipment to difficult-to-reach locations”, I can assure readers that Mann’s response cannot be justified and is an embarrassment to the paleoclimate community. The more that I think about it, the more outrageous is both the comment itself and the fact that no one seems to have picked up on it.
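
The validation McIntyre is asking for has a standard form in this literature: calibrate the proxies against the instrumental record in one period, then score the reconstruction out of sample in a withheld period, typically with the Reduction of Error (RE) and Coefficient of Efficiency (CE) verification statistics. A minimal sketch in Python with invented series (the numbers are illustrative only):

```python
import numpy as np

def re_ce(obs, pred, cal_mean):
    """Reduction of Error (RE) and Coefficient of Efficiency (CE).
    Both compare the reconstruction's squared error against a trivial
    baseline; values near 1 indicate skill, values <= 0 indicate the
    reconstruction does no better than the baseline."""
    sse = np.sum((obs - pred) ** 2)
    re = 1 - sse / np.sum((obs - cal_mean) ** 2)    # baseline: calibration-period mean
    ce = 1 - sse / np.sum((obs - obs.mean()) ** 2)  # baseline: verification-period mean
    return re, ce

# Invented data: an 18-year verification window (say 1981-1998) withheld
# from calibration, with a warming instrumental record and a noisy
# proxy-based prediction of it.
rng = np.random.default_rng(1)
obs = 0.02 * np.arange(18) + rng.normal(0, 0.05, 18)   # instrumental anomalies
pred = obs + rng.normal(0, 0.15, 18)                   # proxy reconstruction
cal_mean = 0.0                                         # calibration-period mean anomaly

re, ce = re_ce(obs, pred, cal_mean)
print(f"RE = {re:+.2f}, CE = {ce:+.2f}")
```

If the post-1980 proxy data existed, a table of RE and CE scores for MBH99, Moberg et al [2005], and Crowley and Lowery [2000] over 1981-1998 would directly answer the question McIntyre poses; per the post, nobody has produced one.
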
From a data integrity perspective, I become alarmed when data is hidden or restricted, when there are claims about data complexity, and when there is arm-waving about how expensive it would be to validate the data.

These are all reasonable excuses under particular circumstances. But just as when you take a call and discover that "This is Rachel from cardholder services . . ." and you know without listening further that a time-suck or a fraud is about to be committed, so too when researchers, data analysts, or business people start talking about the complexity and cost of data integrity and arguing for why the data cannot be shared, you immediately know that something is rotten in the state of Data Denmark.
