With climate-change diplomacy in some disarray after Copenhagen, back to all those emails.

Here is a powerful piece from Dale Amon at Samizdata that looks at the technical methodology. Namely: if (big assumption) one pulls together a great mass of reliable data, how does one present it in a meaningful way?

Lots of assumptions are built into this process. And it can be done with more or less transparency. And with no electronic hiccups.

Or not:

Let me explain. Computers represent numbers in binary. Any signed representation (i.e. one that handles both plus and minus) uses some encoding trick to distinguish the two. The problem is, if a positive number gets incremented to be one bit too big… it may suddenly become a negative number.
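
A minimal sketch of the failure mode, in Rust (an illustration of two's-complement wraparound, not anything taken from the CRU code):

```rust
fn main() {
    // i32 is a 32-bit two's-complement signed integer.
    let biggest: i32 = i32::MAX; // 2_147_483_647

    // wrapping_add makes the wraparound explicit; a plain `+` here
    // would panic in a debug build, precisely because it is a bug.
    let wrapped = biggest.wrapping_add(1);

    // Prints: 2147483647 + 1 wraps to -2147483648
    println!("{} + 1 wraps to {}", biggest, wrapped);
}
```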

Regardless of what actually happens, any calculation using the value after an overflow might as well be a random number generator. The results are totally, utterly worthless. There is not a chance in hell that the output will be meaningful.

There are ways of dealing with this sort of thing, but I will not go into the techno-detail here. My goal is simply to point out that if the statements I heard are true, I must cease to believe the validity of any output from CRU and CRU-related models.
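
The quoted post waves at "ways of dealing with this sort of thing" without detail; one such defence, again in Rust and again purely illustrative, is checked arithmetic that reports overflow instead of silently wrapping:

```rust
fn main() {
    let biggest: i32 = i32::MAX;

    // checked_add returns Option<i32>: None on overflow instead of a
    // silently wrapped value, so the caller must handle the failure.
    match biggest.checked_add(1) {
        Some(sum) => println!("sum = {}", sum),
        None => eprintln!("overflow detected; result discarded"),
    }
}
```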

There is really only one acceptable way for the field to recover credibility and rebuild trust. The code for the models must all be made open source. It must be released into the public domain, where experts in numerical programming can openly argue about the validity of the code, the mathematical techniques, and the mathematical and physical simplifications and assumptions it contains.

I will no longer believe results that lack this corroboration. If an author refuses, I am going to assume they have misdeeds to hide.

Fair enough?

But also this:

A collapse in carbon output is going to occur, and the reasons for it have nothing to do with cap and trade, Copenhagen, or any other state- or NGO-foisted crisis plan. By the middle of this century, liquid fuels such as gasoline will be generated using the Fischer-Tropsch process in some updated form.

It will be carbon neutral because part of the feedstock will be free for the taking: atmospheric CO2. The CO2 will be split using either grid power, mechanical nanotechnology or genetically modified algae (some of which is purportedly working already). With the addition of energy, 2 CO2 -> 2 CO + O2, and the carbon monoxide may be fed into the same FT process that was used to fuel the Nazi war machine. Towards the end of World War II this was nearly the only source of fuel available to Germany. Anyone who believes this technology is unproven on an industrial scale is simply historically ignorant.
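
For concreteness, a sketch of the chemistry; the stoichiometry below is standard textbook material rather than anything spelled out in the quoted post:

```latex
% Energy-driven dissociation of captured atmospheric CO2:
2\,\mathrm{CO_2} \xrightarrow{\text{energy}} 2\,\mathrm{CO} + \mathrm{O_2}

% Fischer-Tropsch synthesis: CO and hydrogen recombine into alkane
% chains (e.g. the components of gasoline) plus water:
(2n+1)\,\mathrm{H_2} + n\,\mathrm{CO} \longrightarrow \mathrm{C}_n\mathrm{H}_{2n+2} + n\,\mathrm{H_2O}
```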

Carbon-based grid power is already declining as a relative portion of US energy (30% according to a recent SciAm article). I expect that decline to accelerate as the use of ever-cheaper, ever-improving solar panels really starts to bite.

We will also see inputs from Space-Based Solar Power growing explosively by 2050. New-technology nuclear, and perhaps even game-changing wild cards like Polywell fusion, will be taking up major roles by then as well.

If you toss in the huge impacts nanotechnology will have on all facets of technological civilization, and the expected population decline in the second half of the century, one begins to wonder: exactly what will be the climate-change problem of 2100? If human CO2 inputs collapse and population declines, what climatic impact will the modeling of that scenario show?

Good questions.

If you want LOTS more analysis of the leaked emails from someone with, ahem, strong academic science credentials, here is John P. Costella B.E.(Elec.)(Hons.) B.Sc.(Hons.) Ph.D.(Physics) Grad.Dip.Ed.