Re 2010 “records”. Saying that 2010 is heading globally for being the warmest on record must seem ironic to inhabitants of Dublin, which is heading locally for something of a low, especially if the last month of 2010 is included in the year as normal and continues cold.
Dublin Airport has bizarre anomalies in its “homogenization” of annual averages which frustrate attempts to look back for high and low temperatures in the historical record.
In the homogenized data the high of 2006 (9.92) beats the high of 1989 (9.76), whereas in the raw data the high of 2006 (10.12) does not beat the high of 1989 (10.88).
Similarly there are problems with the “record cold”: in the raw data 2010 is clearly a contender for the “cold” record, with only 1963 and 1947 colder since the late 1910s. In the homogenized data, however, 2010 looks unremarkable compared with the 1960s.
A few minutes with a spreadsheet is needed to unpick this. Comparing the raw with the homogenized year by year, by subtracting raw from homogenized for each of the 130 years, it is clear that the homogenized data has been generated by adding a very clear downward staircase with bogus increments of exactly 0.100 deg and a “step” of 12 years, leaving the early years untouched and the later years lowered, ultimately by more than one degree, with the staircase stopping around 1988-1991 at an accumulated decline of 1.2 degrees. This gives a bogus decline of 0.8 degree per century(!) over the preceding hundred years. At that point an upward staircase was added at a much steeper slope, so that the “adjustment” eventually vanished completely by 2010. This means that the homogenized data shows a totally bogus, phenomenal rate of rise of 0.6 degree per decade over the last two decades. Kiwigate was, AFAIR, solely about bogus upward “adjustments”. To see a progressive downward “adjustment” over a century and then an upward adjustment in two decades, all in the same 130-year record for one station, is brilliant!
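The spreadsheet exercise is easy to reproduce in a few lines of code. Here is a minimal sketch in Python, assuming the raw and homogenized series have been copied from the GISS pages into parallel lists; the short arrays below are illustrative placeholders, NOT the actual Dublin Airport data.

```python
def adjustment_steps(years, raw, hom):
    """Return (year, adjustment) pairs at which hom - raw steps to a new value."""
    # Adjustment applied by homogenization, snapped back to the 0.1 deg grid
    delta = [round(h - r, 2) for r, h in zip(raw, hom)]
    steps, prev = [], None
    for y, d in zip(years, delta):
        if d != prev:
            steps.append((y, d))
            prev = d
    return steps

# Illustrative placeholder data, NOT the real station record
years = list(range(1900, 1908))
raw = [10.0, 10.1, 9.9, 10.2, 10.0, 10.1, 9.8, 10.0]
hom = [10.0, 10.1, 9.8, 10.1, 9.9, 10.0, 9.6, 9.8]
print(adjustment_steps(years, raw, hom))
```

With the real 130-year series in place of the placeholders, a staircase of 0.100 deg increments every 12 years would show up directly as the list of step years.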
 Dublin Airport RAW EARLY: http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=621039690000&data_set=0&num_neighbors=1
 Dublin Airport HOMOGENIZED http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=621039690003&data_set=2&num_neighbors=1
 Dublin Airport RAW LATE: http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=621039690003&data_set=0&num_neighbors=1
Now here’s an odd thing. Here are the mean surface air temperature readings from Richmond, VA, according to NASA/GISS for the last eleven years. The RAW column shows what the thermometer read. The HOM column shows what it “should” have read, that is, it has been “homogenized” to correct for, erm, whatever homogenization corrects for. DLTA = HOM - RAW. You will see that the correction is always an exact multiple of 0.10 degree, and that it goes down in steps year by year (except 2006-7). The effect is that over the last ten years the homogenized data has an extra decline of about 0.6 degree per decade compared to the actual data. My rhetorical question is: what was the rationale behind this “correction”? For example, what physical process might there be which caused the Richmond thermometer to be consistently over-reading by 0.6 degree (for several decades, not shown here), and then gradually become more accurate over the last ten years until it is spot on this year? The graph of this is shown here.
YR    RAW    HOM    DLTA
2000  14.69  15.29  0.60
2001  14.39  14.99  0.60
2002  15.59  16.19  0.60
2003  14.21  14.81  0.60
2004  14.97  15.47  0.50
2005  15.50  15.90  0.40
2006  15.47  15.77  0.30
2007  15.79  16.09  0.30
2008  15.38  15.58  0.20
2009  15.06  15.16  0.10
2010  15.92  15.92  0.00
 Delta: http://wp.me/P1ecES-7#Richmond
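The size of the artificial trend can be checked straight from the table. A sketch in Python rather than a spreadsheet, with the Richmond figures above transcribed into a dict and an ordinary least-squares slope fitted to the adjustment:

```python
# Richmond, VA table transcribed: year -> (raw, hom)
data = {
    2000: (14.69, 15.29), 2001: (14.39, 14.99), 2002: (15.59, 16.19),
    2003: (14.21, 14.81), 2004: (14.97, 15.47), 2005: (15.50, 15.90),
    2006: (15.47, 15.77), 2007: (15.79, 16.09), 2008: (15.38, 15.58),
    2009: (15.06, 15.16), 2010: (15.92, 15.92),
}

years = sorted(data)
# DLTA column: homogenized minus raw, back on the 0.1 deg grid
delta = [round(data[y][1] - data[y][0], 2) for y in years]

# Ordinary least-squares slope of the adjustment against year
n = len(years)
mx = sum(years) / n
my = sum(delta) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(years, delta))
         / sum((x - mx) ** 2 for x in years))
print(f"adjustment trend: {slope * 10:.2f} deg/decade")
```

The fitted slope comes out at roughly -0.64 deg/decade, i.e. the homogenization alone imposes an extra decline of about 0.6 degree over the decade relative to what the thermometer actually read.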
This is such an interesting plot that it makes sense to look at the effect on the temperature chart itself:
Personally I prefer to fit a sine wave with a period of about 120 years through the raw data:
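Fitting a fixed-period sine through a yearly series is just linear least squares on sine and cosine basis functions plus a constant. A sketch with NumPy, using a synthetic series since the raw Dublin data is not reproduced here; the 120-year period is the assumption stated above, not something the fit discovers:

```python
import numpy as np

def fit_sine(years, temps, period=120.0):
    """Least-squares fit of c + a*sin(2*pi*t/period) + b*cos(2*pi*t/period)."""
    t = np.asarray(years, dtype=float)
    w = 2 * np.pi / period
    # Design matrix: constant, sine and cosine columns
    A = np.column_stack([np.ones_like(t), np.sin(w * t), np.cos(w * t)])
    coef, *_ = np.linalg.lstsq(A, np.asarray(temps, dtype=float), rcond=None)
    return coef, A @ coef  # coefficients and fitted values

# Synthetic demo: a pure 120-year cycle about a 9.5 deg mean (placeholder data)
years = np.arange(1880, 2011)
temps = 9.5 + 0.4 * np.sin(2 * np.pi * years / 120.0)
coef, fitted = fit_sine(years, temps)
print(np.round(coef, 3))
```

Fixing the period keeps the problem linear; letting the period vary as well would need a nonlinear fit (e.g. scipy's curve_fit), which is a different exercise.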