Hm, as you can see, my hide has been worn thin by these topics. There are plenty of creationists, Nazi apologists, GW deniers, tobacco lobbyists etc. out there who would write such a statement and MEAN it.
Well, I get frustrated by having my scepticism (for both sides, BTW) compared to holocaust denial.
Perhaps a

would have been a good idea?
Check what I posted....
To me, the data so far suggests (given the retraction of the correlation paper etc.) that it simply is not a significant part of the issue. Solar radiation is, to my knowledge, well enough understood for the broad trend - and remember that all your arguments FAIL in one respect:
if solar irradiance were behind the massive warming, then where did all the effect of the CO2, methane and other greenhouse gases go?
OK - firstly, my goal was to discuss the possibility of a lack of solar activity potentially causing severe cooling, but it seems to have been hijacked (partly by myself, incidentally) into another GW thread.
To answer that part: it is predicated on the assumption that CO2 etc. cause warming - which is fair enough. It does not necessarily follow, though, that the magnitude of that warming is known with a high degree of precision. To demonstrate this point: the IPCC claimed both in 2001 and again in 2008 that the radiative forcing effect of CO2 had a 'high' level of scientific understanding (incidentally, in 2001 they noted that the LOSU for solar was 'very low', improving to 'low' in 2008). However, in 2001 they put the magnitude of the CO2 forcing at 1.46 W/m^2 ± 10%. In 2008, this had increased to 1.66 W/m^2 ± 10%. Why is this relevant?
Because the value used in 2008 was not within the uncertainty margins of the 2001 estimate: 1.46 plus 10% only reaches about 1.61, short of 1.66. This suggests two things to me: first, that a 'high' level of understanding can't be that high in absolute terms; and second, that the value they have assumed for solar, when they recognise that they know little about it, could be very, very wrong. Yes, they put a high uncertainty on it, but that's a percentage. The error band does suggest that they think it's a skewed uncertainty - have they under-estimated it?
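To make the arithmetic explicit, here's a trivial check using the figures quoted above, taking the ±10% band at face value (just a sketch, not anything drawn from the IPCC reports themselves):

[code]
# Does the 2008 central estimate (1.66) fall inside the 2001 band (1.46 +/- 10%)?
# Values are the CO2 radiative forcings quoted above, in W/m^2.
est_2001, est_2008 = 1.46, 1.66
lower, upper = est_2001 * 0.9, est_2001 * 1.1
print(f"2001 band: {lower:.3f} to {upper:.3f} W/m^2")               # 1.314 to 1.606
print(f"2008 value inside the band? {lower <= est_2008 <= upper}")  # False
[/code]

The upper bound is about 1.61, so 1.66 sits outside it.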
In short - with their estimates and uncertainties, can we really believe that the relative forcings are correct?
It is all well and good to say that there might be some little factor out there that may or may not have some influence - but said factor has been shown either to play no significant part, or to totally hide the signal from a well-known, perfectly understood other factor.
But it's not a 'perfectly understood (...) factor'. It's a 'high' level of understanding, and even that appears fairly subjective.
For example, I understand that they have derived their relative values by trying to back-cast models to fit the temperature history. Errors in that temperature history, failure to incorporate all significant factors into the model, getting cause and effect wrong, or getting the weightings of the various components wrong will all distort the answers. This modelling is basically a massive degrees-of-freedom issue, where they have to make a lot of assumptions to lock down the degrees of freedom just to get a result.

One such example is the formation of low-level clouds (cooling effect - not included in the models) vs. high-level ones (heating effect - included as a positive feedback loop) as the globe warms (this is more about the 'tipping point', but it is related to the models). Another is the sulphate aerosols, used as a fudge factor to match the temperature history. Interestingly, they are believed to provide a strong cooling effect, and it is predicted that as pollution standards improve, this effect will be lost, leading to more net warming. However, recent studies have suggested that a reported post-war temperature drop was not a real phenomenon; it was an artefact of a change in how sea surface temperatures were measured. As such, the 'fudge factor' introduced with sulphate aerosols may not have been required, and it means we may not see the increased net warming predicted from cleaning up these aerosols.
The question is not 'what does solar activity do, and how much' - we know the broad basics. Nor, at first, is the question what will happen with it. The question is rather how you can explain the past if we assume a strong influence of sunspots or whatever else fluctuates on the sun's surface. And to my knowledge, you simply can't. Therefore, it is perfectly sound to assume that the influence of unknown factors is at worst small.
I disagree. I don't think that they have explained the past. If they had a reliable temperature history, supported by models that accurately history-match those temperatures, in which they had a high level of understanding of all input factors, and for which they could use sensitivity analysis on the input factors to back up the robustness of the model, then by all means use it for forecasting. Note that the IPCC does NOT claim that it makes forecasts, because its models do not follow the scientific rules for forecasting - they are projections. In fact, they are basically sensitivity studies around CO2 levels. Get the CO2 impact wrong, or get the knock-on effects wrong, and the projections are wrong. Note also that most of these models assume positive feedback loops. And how many positive feedback loops are there in nature?
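For reference, here is a minimal sketch of the textbook feedback-amplification relation (a generic illustration with made-up numbers, not any specific climate model): with a no-feedback response dT0 and a net feedback factor f, the equilibrium response is dT = dT0 / (1 - f), so even a modest positive f inflates the result sharply.

[code]
# Generic feedback amplification: dT = dT0 / (1 - f)
# dT0 = no-feedback response; f = net feedback factor (f > 0 amplifies).
# Numbers are purely illustrative.
dT0 = 1.2  # hypothetical no-feedback warming, in K
for f in (-0.5, 0.0, 0.3, 0.5, 0.7):
    print(f"f = {f:+.1f}  ->  dT = {dT0 / (1 - f):.2f} K")
[/code]

At f = 0.5 the response doubles, and as f approaches 1 it blows up - which is why the assumed sign and size of the feedbacks matter so much to the projections.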
I agree that it would be nice and helpful to understand it all in detail, but the words '(little) ice age' suggest a massive influence - one that can be confidently excluded from the list of options.
So is it your position that the CO2 effect and global warming have a stronger influence on the planet than variations in solar activity? Do you believe that if the CO2 concentration had been (say) 350 ppm in the 1600s, there wouldn't have been a 'little ice age'?