This article, Tamino vs random walk (ignore the cuteness at the beginning), clearly explains why the basic methodology of global warming science, as practiced by the IPCC, is wrong.
The premise of global warming modeling is the assumption that global climate models cannot account for all of the observed variation in surface temperature using natural forces alone; therefore, whatever variation is left unexplained must be human-caused.
Here’s how it’s done: the pre-industrial global average temperature is defined as “normal” and plotted as a straight line through time. Temperature variations caused by natural forces are treated as “random” and rejected from the model, and whatever remains is taken to be the signal of human-caused climate influence.
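The decomposition described above can be sketched in a few lines. This is a hypothetical illustration on synthetic data only; the baseline value, the 60-year cycle, the noise level, and the trend slope are all invented for the example, not taken from any real temperature record or from the IPCC’s actual models.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1850, 2011)

baseline = 13.7                                           # assumed "pre-industrial normal" (deg C)
natural = 0.1 * np.sin(2 * np.pi * (years - 1850) / 60)   # assumed ~60-year natural cycle
noise = rng.normal(0.0, 0.05, years.size)                 # short-term "random" variation
trend = 0.005 * (years - 1850)                            # some underlying trend

observed = baseline + natural + noise + trend

# Step 1: express the record as anomalies from the fixed "normal" baseline.
anomaly = observed - baseline

# Step 2: remove the component classified as "natural noise"; the residual
# is what the described methodology attributes to human influence.
residual = anomaly - natural
```

Under this procedure, `residual` contains the imposed trend plus the unmodeled noise — which is exactly the step the article is criticizing: whatever natural variation the model does not represent ends up in the residual and is labeled human-caused.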
There are two fatal errors in this assumption:
1) “Normal” global average temperature is not linear through time. Variation is the norm, and has been throughout the history of the Earth, in patterns that are predictable in the long term. Measuring present average global temperatures as departures from a linear, short-term norm presents a false picture of overall climate variation and its causes.
2) Natural climate change patterns appear random, but are predictable in the long term. When the global climate record is arbitrarily cut off at 1850, this rich resource of earlier climate data is thrown out, along with everything else classified as “random.” The “signal” left over once the “random noise” is discarded is then mistakenly assumed to be the human contribution to atmospheric forcing. What remains is an artificially isolated “trend” that gives a false impression of warming.
As demonstrated in this graph of global temperature variation from 1980, extrapolated to 2100, patterns of past temperature fluctuations predict dramatically cooling temperatures over the next 90 years.
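The kind of extrapolation described above can be sketched as a least-squares fit of a periodic model to a past series, projected forward. Everything here is an assumption for illustration — the synthetic series, the 65-year period, and the date ranges are invented, and this is not the method behind the graph the article links.

```python
import numpy as np

# Synthetic "past record": a cycle plus an offset, invented for illustration.
years = np.arange(1880, 2011).astype(float)
series = 0.2 * np.sin(2 * np.pi * (years - 1880) / 65.0) + 0.1

# Least-squares fit of A*sin + B*cos + C at an assumed 65-year period.
period = 65.0
X = np.column_stack([
    np.sin(2 * np.pi * (years - 1880) / period),
    np.cos(2 * np.pi * (years - 1880) / period),
    np.ones_like(years),
])
coef, *_ = np.linalg.lstsq(X, series, rcond=None)

# Extrapolate the fitted cycle out to 2100.
future = np.arange(2011, 2101).astype(float)
Xf = np.column_stack([
    np.sin(2 * np.pi * (future - 1880) / period),
    np.cos(2 * np.pi * (future - 1880) / period),
    np.ones_like(future),
])
projection = Xf @ coef
```

Whether such a projection is meaningful depends entirely on whether the assumed cycle is real and persistent; the sketch only shows the mechanics of extending a fitted pattern beyond the data.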