Lord Krebs from Jesus College Oxford lays it on the line about scientists’ human frailties:
First, scientists, just like every other trade — bus drivers, lawyers and bricklayers — are a mix. Most are pretty average, a few are geniuses, some are a bit thick, and some dishonest.
Glad that’s cleared up! Maybe the BBC can help us by putting its various scientists in the right category when they appear on TV or radio.
Scepticism is fine but science is not a free-for-all. Whether or not you accept the sceptics’ view should depend on careful weighing of the evidence.
Fair enough.
Is not the problem here a deep one of Method?
I was at Harvard when the Y2K scare was at its scariest. Various experts told us that computing is built on layer after layer of coding, some of which is now old and buried so deep in the programs that it is for all practical purposes undiscoverable.
Which was why the world was about to grind to a halt when the year 2000 dawned. Buried deep in our myriad computer programs was long-lost or inaccessible code which would not recognise the new year’s 00 format, and this would cause everything to stop working.
Luckily we saw that one coming. And life went on.
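(For anyone who never met that bug in the wild, here is a minimal sketch of the mechanism. It is hypothetical toy code, not lifted from any real system: a program that stores the year as two digits and does arithmetic on it goes haywire the moment "00" arrives.)

```python
def years_since_purchase(purchase_yy: int, current_yy: int) -> int:
    """Naive age calculation using two-digit years, as plenty of old code did."""
    return current_yy - purchase_yy

print(years_since_purchase(95, 99))  # 4   -- looks fine all through the 1990s
print(years_since_purchase(95, 0))   # -95 -- nonsense once the year reads '00'
```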
So when different people try to assess (a) which areas of human activity are causing significant climate changes, (b) whether such changes are evidently negative, and (c) what to advise us all to do about it, the argument stands or falls on the issue of measurement.
Voluminous quantities of data need to be gathered from all round the world for sustained periods, and then aggregated to try to see what trends and patterns might be found.
Of course it looks convincing when all that data is fed into the computer and elegant hockey-stick graphs pop out of the printer.
But, crucially, the results are only as good as the assumptions factored into the computer programs which have spewed them out, and the accuracy of the computer programs themselves in reflecting those assumptions in the ensuing calculations.
Small errors buried deep in the assumptions may create an impression of trends stretching far into the future where this is just not so. And/or an impression of accuracy and precision which is just not correct or fair.
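To make that concrete, here is a deliberately simple toy simulation (invented numbers, nothing to do with any real climate dataset or model) showing how a tiny systematic bias hidden in one adjustment assumption can conjure an apparent trend out of data that has none:

```python
import random

random.seed(42)

bias_per_step = 0.01   # tiny error buried in one adjustment assumption
n_steps = 100

# The "true" series is flat: no trend at all, just noise...
# ...but the buried bias creeps in a little more at each step.
adjusted = [random.gauss(0, 0.2) + bias_per_step * step for step in range(n_steps)]

first_half = sum(adjusted[:50]) / 50
second_half = sum(adjusted[50:]) / 50
print(f"Apparent rise, second half vs first half: {second_half - first_half:+.2f}")
# Prints roughly +0.5: a 'trend' manufactured entirely by the 0.01-per-step error.
```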
Which is why the rigour of the scientific principles and outcomes described by Lord Krebs can in fact be tested not so much by other scientists expert in the field, but only by geeks clever and able enough to confirm that the computers' underlying processing of the data has been done to tip-top accuracy.
Hence the work being done by Bishop Hill and others to explore that angle. Try this for size:
Now to get the station error, εᵢ, the three error components are joined together by quadrature as follows:

εᵢ = √(ε₁² + ε₂² + ε₃²)
Just as I expected.
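For readers wondering what "joined together by quadrature" actually means, it is just the square root of the sum of the squares. A minimal sketch, with the three component values invented purely for illustration:

```python
import math

def station_error(components):
    """Combine independent error components 'in quadrature':
    the square root of the sum of their squares."""
    return math.sqrt(sum(e * e for e in components))

# Three hypothetical error components (made-up values, not from any real station).
print(round(station_error([0.2, 0.4, 0.3]), 2))   # 0.54
```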
Which is why the Climate Change argument is going nowhere, for better or worse, until not only the data but also the underlying computer coding is made available so that it can be independently checked.
We expect new drugs to pass the most stringent tests before they are marketed. Not to mention new cars.
That same rigour has to be applied to all this Climate science. And the best if not only way to do that is via 100% transparency about data and programming alike.
Update: see also this excellent Guardian piece on this very subject, where Professor Darrel Ince is ruthless:
… if you are publishing research articles that use computer programs, if you want to claim that you are engaging in science, the programs are in your possession and you will not release them then I would not regard you as a scientist; I would also regard any papers based on the software as null and void.
Phew.
No good the house looking terrific if the foundations are rotten.