Climate Science Glossary


All IPCC definitions taken from Climate Change 2007: The Physical Science Basis. Working Group I Contribution to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, Annex I, Glossary, pp. 941-954. Cambridge University Press.








Comment Search Results


Comments matching the search TSI measurements:

    More than 100 comments found. Only the most recent 100 have been displayed.

  • At a glance - Empirical evidence that humans are causing global warming

    walschuler at 00:39 AM on 20 July, 2023

    This brief greenhouse gas theory history omits a very important paper, "On the Influence of Carbonic Acid in the Air Upon the Temperature of the Ground," the first true model of the effect: the hand-calculated model by Svante Arrhenius, published in 1896. It modeled the atmosphere as a single layer and the effect of setting the concentration of CO2 to 2/3 of the value of his time, and at values up to 3 times higher, for 10-degree latitude steps both north and south, for the 4 seasons and for the annual mean. His results for this simple model were within a factor of 10 of current calculations and measurements as the value has grown. His interest was in explaining newly discovered evidence of ancient ice ages.


    You also omit Horace-Benedict de Saussure's important measurements (in "Continuation du Voyage Autour du Mont-Blanc," Chapter XIII, Voyages dans les Alpes VII, 1779, S. Fauche, Neuchatel, pp. 353-355 and pp. 365-367). The first demonstrates the existence of "chaleur obscure" ("dark heat", i.e. infrared radiation) and its reflection and concentration, using metal mirrors, just like visible light. The second records measurements of the greenhouse-effect temperature rise in a cubic-foot wooden box, insulated on all but one side with blackened cork, that side being closed by two layers of glass. He placed thermometers between the glass layers and inside and outside the box, and carried the assembly from sea level up to high altitude in the Alps, measuring the temperatures inside and outside the box as he went. He ascribes the decrease of temperature with altitude to the increasing transparency of the air as you ascend. I made a translation from the French which is available upon request. Fourier refers to this work in the paper of 1827 cited above.

  • Climate's changed before

    theSkeptik at 06:48 AM on 25 February, 2020

    @michal sweet, MA Rodger and KR:

    First of all I very much appreciate your quick comments on my post.

    to 1) Yes, I realize now that the ice core measurements were taken in the Antarctic (not the Arctic), which I assume means it's about CO2 from the atmosphere, not the sea water. The direct relationship shown to temperature is therefore plausible to me.

    to 2) I am discussing solely the meaning of the presented data from the University of Copenhagen. The article claims it supports the assumption that there is a causal relationship between the GHG and global temperature. Don't get me wrong, there may be other evidence for that claim but that's not my point here.

    @michael sweet

    The prediction you mentioned about global warming 100 years ago is outside the scope of the discussed data. Apart from that, there are only two possible outcomes from such a prediction: a) it can turn out right - temperatures are rising - or b) it can turn out wrong - temperatures are falling. So even with an uneducated guess one would have a 50% chance of being right. Finally, the graphic doesn't even show any evidence of global warming, though it does show a very significant rise in methane and CO2 in the last decades.

    @MA Rodger

    I see your first argument is in line with another claim of the article, that recent data show a phase shift in GHG and temperature: since 2012, GHG movement is said to no longer lag behind the temperature data. I agree this would be an indication of a significant change. Unfortunately this data is not shown in the article and can't be seen in the presented graph. Your second argument just seems to support my concern: global warming can't be seen in Antarctica, according to the chart, so far. It is possible that it will show up in the future, but the data shown gives no evidence for that assumption.

    Finally, I do not make any claims about any relationships between GHG and temperature or other related parameters. I am just looking for unbiased information and constantly come across overinterpreted data and conclusions driven by preconception. If one claims a causal relationship between two parameters, it's up to them to give evidence, not up to me to prove otherwise.

    @KR

    As a physicist working for several decades in R&D companies, I am not easily convinced by simple models describing the behaviour of a complex reality. At least it's not obvious to me that a gas at a concentration of only several hundred ppm is likely to have such a significant influence on global temperature. It might not be impossible, and I will surely have a closer look at this matter in due course. However, as I mentioned before, this is not my current point. I am discussing the presented graphic, which seems to support none of the claims about global warming apart from unusually high greenhouse gas concentrations.

  • It's cosmic rays

    Daniel Bailey at 06:22 AM on 4 December, 2019

    jmh530, the best available evidence we have is that there is no direct linkage between the sun’s output and cosmic rays impacting the Earth’s climate. Now that’s a broad statement, but let’s examine some more in-depth evidence on those individual components.

    Scientists use a metric called Total Solar Irradiance (TSI) to measure the changes in output of the energy the Earth receives from the Sun. And TSI, as one would expect given the meaning behind its acronym, incorporates the 11-year solar cycle AND solar flares/storms.

    The reality is, over the past 4 decades of significant global warming, the net energy forcing the Earth receives from the Sun has been negative. As in, the Earth should be cooling, not warming, if it were the Sun.

    It's not the sun

    The scientists at CERN designed an experiment called CLOUD to evaluate the potential impacts of cosmic rays on clouds and cloud nucleation (Cloud Condensation Nuclei = CCN).

    Per CLOUD director Kirkby:

    "At the present time we can not say whether cosmic rays affect the climate."

    Looking at the results of CLOUD, if cosmic rays were a significant factor in affecting our climate, the Earth should have been cooling, not warming. Instead, 8 of the 10 warmest years have all occurred in the most recent 10 years.

    Erlykin et al 2013 - A review of the relevance of the ‘CLOUD’ results and other recent observations to the possible effect of cosmic rays on the terrestrial climate

    "The problem of the contribution of cosmic rays to climate change is a continuing one and one of importance. In principle, at least, the recent results from the CLOUD project at CERN provide information about the role of ionizing particles in 'sensitizing' atmospheric aerosols which might, later, give rise to cloud droplets. Our analysis shows that, although important in cloud physics the results do not lead to the conclusion that cosmic rays affect atmospheric clouds significantly, at least if H2SO4 is the dominant source of aerosols in the atmosphere. An analysis of the very recent studies of stratospheric aerosol changes following a giant solar energetic particles event shows a similar negligible effect. Recent measurements of the cosmic ray intensity show that a former decrease with time has been reversed. Thus, even if cosmic rays enhanced cloud production, there would be a small global cooling, not warming."

    Modern CCN are pretty much insensitive to cosmic rays and changes in TSI from the Sun, compared to the very much larger anthropogenic and natural contributions (volcanoes, oceanic oscillations and wildfires):

    "New particle formation in the atmosphere is the process by which gas molecules collide and stick together to form atmospheric aerosol particles. Aerosols act as seeds for cloud droplets, so the concentration of aerosols in the atmosphere affects the properties of clouds. It is important to understand how aerosols affect clouds because they reflect a lot of incoming solar radiation away from Earth's surface, so changes in cloud properties can affect the climate.

    Before the Industrial Revolution, aerosol concentrations were significantly lower than they are today. In this article, we show using global model simulations that new particle formation was a more important mechanism for aerosol production than it is now. We also study the importance of gases emitted by vegetation, and of atmospheric ions made by radon gas or cosmic rays, in preindustrial aerosol formation.

    We find that the contribution of ions and vegetation to new particle formation was also greater in the preindustrial period than it is today.

    However, the effect on particle formation of variations in ion concentration due to changes in the intensity of cosmic rays reaching Earth was small."

    And

    "...solar cycle variations of ion concentration lead to a maximum 1% variation of CCN0.2% concentrations. This is insignificant on an 11 year timescale compared with fluctuations due to, for example, the El Nino-Southern Oscillation, variations in wildfires, or volcanoes."

    Gordon et al 2017 - Causes and importance of new particle formation in the present-day and preindustrial atmospheres

    And the coup de grace for cosmic rays: CCN respond too weakly to changes in Galactic Cosmic Rays to yield a significant influence on clouds and climate.

    Pierce 2017 - Cosmic rays, aerosols, clouds, and climate: Recent findings from the CLOUD experiment

    Scientist Richard Alley pretty much killed the cosmic ray hypothesis here (the relevant part of the lecture starts at 42:00)

    "We had a big cosmic ray signal, and the climate ignores it. And it is just about that simple! These cosmic rays didn’t do enough that you can see it, so it’s a fine-tuning knob at best."

    To recap: the Laschamp excursion (the strongest cosmic ray event in the past 40,000 years) bombarded the Earth with an elevated cosmic ray flux for some 2,550 years, about 40,000 years ago. The flux of beryllium-10 produced by cosmic rays greatly increased as the Earth's magnetic field weakened by 90%.

    Climate ignored it.

    Here is the chart he’s referring to, showing how the flux of beryllium-10 produced by cosmic rays greatly increased as the Earth’s magnetic field weakened by 90% about 40,000 years ago.

    It's not cosmic rays

    From the AR5, WG1, Chapter 7, p. 573:

    "Cosmic rays enhance new particle formation in the free troposphere, but the effect on the concentration of cloud condensation nuclei is too weak to have any detectable climatic influence during a solar cycle or over the last century (medium evidence, high agreement). No robust association between changes in cosmic rays and cloudiness has been identified. In the event that such an association existed, a mechanism other than cosmic ray-induced nucleation of new aerosol particles would be needed to explain it. {7.4.6}"

  • Other planets are warming

    MA Rodger at 19:53 PM on 25 January, 2019

    S0urce @53,

    There is usually no dispute that the strength of the Sun has increased over the eons. (So says the Standard Solar Model, although the 'faint young Sun paradox' does occasionally throw up some contradictory ideas, eg Graedel et al 1991.) Yet an increase of 200K in the Sun's temperature over a period of a century, or even centuries, is in a different league. A rise in temperature from 5800K to 6000K (with the Sun unchanged in size) would result in a ~14% increase in solar output, boosting Earth's insolation from 1366Wm^-2 to roughly 1564Wm^-2. Solar insolation has been accurately measured for 40 years with no sign of such a rise. [Solar insolation graph] And actually, we wouldn't have required such measurements to notice an increase of that size: a 200K rise in the Sun's temperature would have applied a forcing of roughly 35Wm^-2 to Earth's climate, enough to boost Earth's global temperature by ~10ºC even before feedbacks, a bit of a game changer if it happened over a period of a century or so.
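    A quick Stefan-Boltzmann check of that scaling (an editorial sketch, not part of the original comment; the 0.3 albedo and the no-feedback framing are assumptions). Solar output goes as T^4, so a photospheric warming from 5800 K to 6000 K scales the insolation by (6000/5800)^4:

    ```python
    # Stefan-Boltzmann scaling of solar output with photospheric temperature.
    # Illustrative values only: S1 is the measured solar "constant" and the
    # albedo of 0.3 is the usual round figure for the Earth.
    T1, T2 = 5800.0, 6000.0   # K, assumed photospheric temperatures
    S1 = 1366.0               # W/m^2, present-day insolation
    albedo = 0.3              # planetary albedo (approximate)

    S2 = S1 * (T2 / T1) ** 4                 # insolation after the warming
    forcing = (S2 - S1) * (1 - albedo) / 4   # spread over the sphere, minus reflection

    print(round(S2))       # 1564 W/m^2, a ~14.5% increase
    print(round(forcing))  # 35 W/m^2 of climate forcing
    ```

    With the no-feedback Planck response of roughly 0.3 K per W/m^2, that ~35 W/m^2 corresponds to the ~10ºC figure quoted above.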

  • Increasing CO2 has little to no effect

    scaddenp at 07:12 AM on 10 July, 2018

    jessicars - when you do the experiment outside and you want to calculate the effect of your plastic bag of CO2, you need to consider that the column of gas under consideration is 55km tall and you have changed the CO2 content of the bottom few centimeters. Yes, that CO2 will trap IR radiation and re-radiate it to the surface. Because that surface is receiving more radiation, it will heat up, and the surface will heat the bag. Note that it is the surface under the bag that is heated by back radiation, not the gas in the bag (hence the thermometer under the bag). The question is how much, and in your setup, yes, it is insignificant. To replicate the atmospheric GHE, you need a column of gas 55km or so high.

    The experiments we pointed you to isolate other effects and amplify the CO2 effect to make it measurable with a thermometer. Radiation is radiation, whether it comes from a heat lamp or a warm surface.

    If you are looking for empirical evidence of the GHE, then that paper on direct measurement is one, but see also our reference, especially the "CO2 traps heat" section, with papers from Harries, Evans, Griggs, Philipona and Chen, who all compared the calculated radiation at either the top of the atmosphere or the surface of the Earth to direct measurements.

  • Increasing CO2 has little to no effect

    Glenn Tamblyn at 18:30 PM on 8 July, 2018

    Tristan, adding some points.

    When the CO2 is released from the spritzer, the gas will come out somewhat cooled, so that may explain the lower temperature in the CO2 bag. How good an insulator is the plastic? That will determine how quickly any temperature difference due to this initial cooling evens out.

    Better would be to set up the entire setup outside in the same environment, and take measurements before you add any gas. Then look at how the temperatures change immediately after you add the gases. Then monitor how they change subsequently after that.

    Next, although CO2 absorbs infrared, most of that absorption in the atmosphere takes hundreds to thousands of metres to approach 100%. How much will be absorbed over inches?
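    To put a rough number on that question (an editorial sketch; the 1000 m e-folding length is an assumed, illustrative value for a CO2 band, not a measured constant), Beer-Lambert attenuation gives the fraction absorbed over a path x as 1 - exp(-x/L):

    ```python
    import math

    # Beer-Lambert attenuation: fraction of IR absorbed over path length x_m
    # when the e-folding absorption length is L_m. L_m = 1000 m is an assumed,
    # illustrative value, chosen only to match the "hundreds to thousands of
    # metres" scale mentioned above.
    def fraction_absorbed(x_m, L_m=1000.0):
        return 1.0 - math.exp(-x_m / L_m)

    inches_10 = 0.254                    # 10 inches in metres
    print(fraction_absorbed(inches_10))  # ~0.00025, i.e. about 0.025% absorbed
    ```

    Over a bag-sized path essentially nothing is absorbed, which is the commenter's point: the effect only becomes large over kilometre-scale columns.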

    Next, what is the transmissivity of the plastic - how much radiation passes through the bag? The bags may be transparent to visible light (very high transmissivity), but most materials behave very differently for infrared light. Most naturally occurring materials have extremely low transmissivity to infrared. You would need to research the properties of the plastic involved. Otherwise you are effectively carrying out the experiment with an opaque (to infrared) bag.

    The greenhouse effect depends on the behaviour of the entire atmosphere over vertical distances of many kilometers. Extra absorption by CO2 at the surface is only a small part of any change in the GH effect. The big changes involve how much emission by CO2 changes at high altitude - 10 km or so. It is very hard to model the GH effect with small, surface-based setups.

  • 2018 SkS Weekly Climate Change & Global Warming Digest #5

    NorrisM at 10:19 AM on 20 February, 2018

    Bob Loblaw @ 93

    My apologies. Your question was obviously not a yes-or-no one. My answer should simply have stated that I cannot pick one of your arbitrary groups. We simply have to let the chips fall where they may on a cost/benefit analysis (outside of pollution costs) when it comes to the use of fossil fuels over the last 150 years, because all groups, however you classify them, have benefitted. To try to do some kind of "weigh scale" measurement of costs and benefits as between different groups would be impossible. Again, to the extent we are talking pollution rather than the consequences of rising temperatures and rising sea levels, I think measurements can and should be made.

  • Why the Republican Party's climate policy obstruction is indefensible

    NorrisM at 16:31 PM on 6 July, 2017

    Rightly or wrongly, I think that ClimateGate had a very damaging effect on the climate change views of conservatives everywhere. It is very similar to evidence given by a witness testifying in some legal case who is completely honest in his testimony until the last question, where, in his desire to "win the case" for whichever side, he "fudges" his last answer. The cross-examining lawyer then leads another witness who proves on that very point that the witness was not telling the truth. For any jury, ALL of the evidence of that witness is tainted. I truly think this happened with this issue. Judith Curry has herself admitted that this made her seriously question her position, which until then was "mainstream". It is just about irrelevant now what was or was not the intention of those emails. The damage has been done. End of story.

    When you add to this the issue of the "hiatus" of X number of years - whether or not it was really there (the IPCC at least coined that term in 2013) - it has added to the legitimate questions of conservatives as to whether we are being led down a garden path. The models did not predict this and therefore are unreliable. That is not an unreasonable position to take IF the hiatus really occurred. For now let us not get into arguments about this, because you will NOT convince the Republicans with one "new study" that shows that the IPCC was mistaken.

    Then you add on John Christy's famous graph, which so impressed Steve Koonin, comparing the predictions of the models with the actual observations (see the APS panel hearing below). Do you not think that those pressing the Republicans not to do anything on the climate change file have read the transcript of the APS panel hearing, where three (3) of the top IPCC contributing climate scientists - Collins, Held and Santer - admitted that the model predictions do not track the observations? Their answer was that they do not trust the observations. Can you not see how this would make conservatives suspicious?

    So "97% of climate scientists" does not cut it with Republicans. They simply do not trust the climate scientists, believing, rightly or wrongly, that their bread and butter depends on making sure that climate change is primarily man-made. Can anyone really be a scientist and say that 100% of climate change is man-made? On that point I fully agree with Perry. Climate change has been ongoing for the life of the planet, and man-made CO2 emissions simply cannot be 100% of it unless you have strong evidence that we are in a natural "cooling period". It is not possible that the climate is naturally neither warming nor cooling. When you say "100%" you sound like an extremist. Most people, and especially conservatives, do not like extremists. Not a smart thing to say.

    But back to the Republican position.  When they see there are real-life climate scientists like Judith Curry (who I have to admit sounds much more balanced than Michael Mann in testimony before the various Congress committees and who is not subject to any "ad hominem" attacks that seem to be levelled at Christy and Lindzen), then the "red team blue team" approach with other scientists (primarily physicists I hope) may be the best answer to the Republicans.  Give it a go and see what happens.  If the Koch Bros result happens again, then you will have a very legitimate and strong position to force the Republicans to act.  If their own "red team blue team" comes to the conclusion that CO2 emissions are really the cause then we are at least then only into the question of how much warming and decisions as to how best to approach this.  So I say, fully support the "red team blue team" even if it has been done before. 

    Once we get past what Dessler calls "positive statements" (in his very good book on climate change) which are the facts, then we can get into "normative statements" on what we think the results are in economic terms and what we should do about it, both as to mitigation and adaptation.

    I do suspect that such a "red team blue team" debate will get bogged down on the facts and largely because we do not have the proper instruments to measure what is happening year to year.  If the result is that the Republicans do at least decide to dedicate much more money to funding both weather/climate satellites and water buoys and on-land temperature measurements then it will be a "win" for the majority of climate scientists who believe that we are the cause.

    What I found most unsatisfying about the APS panel struck in 2014 to re-evaluate their statement on Climate Change is that after having somewhat of an "appellate hearing" there were no "reasons for judgment", just a decision by the Board of Directors of the APS one year later to effectively stick with their previous statement.  I have no problem with them sticking with their same statement but by providing their reasons they could have provided massive "independent evidence" outside the climate science community that man made warming is a major threat to our world.  On another post, I have made reference to the APS panel.  You can read the APS Workshop Framework Questions and transcript of the proceeding with 6 of the top climatologists on both sides of this debate on the APS.org website just searching "Climate Change Policy Review".

    I just think the climate science community has to do a reality check. Trump won, and in all likelihood he is here at least for the remainder of his first term and possibly 8 years (would Pence be any better?). Anyone who does not accept this is really like the ostrich in the sand pictured on the home page of this website.

    I personally am very unhappy with this situation but the American people have spoken!  Get used to it!  As Winston Churchill has noted, democracy is close to unworkable but compared to the alternatives, it is the best.  Comey must stay awake at nights realizing how he might have turned the course of history.  

     

  • Increasing CO2 has little to no effect

    Tom Curtis at 09:13 AM on 1 May, 2017

    vatmark @312, sorry for my delayed response.  I am suffering from poor health at the moment, and am finding it difficult to respond to involved posts in a timely manner.  Unfortunately this may mean a further delay in responding to two other posts directed to me by you on another thread, for which I also apologize.

    1)

    "This does not convince me that climate models are doing it right by using backwards calculations where emitted radiation is causing the temperature of layers below."

    I should hope not, as that is not what General Circulation Models (GCMs) do. Rather, they divide the ocean and atmosphere into a number of cells, and for each time step solve for all energy entering, absorbed by and emitted from each cell, including energy transfers by radiation, latent heat, diffusion and convection. In doing so, they maintain conservation of energy and momentum (or at least as close an approximation as they can, given the cellular rather than continuous structure of the model world). When they do this, properties of the simplified models of the greenhouse effect used primarily for didactic purposes are found to emerge naturally, thereby showing those simplified models to capture essential features of the phenomenon.

    2)

    "He says that observed heat from the earth is not in balance, the heat flux from the sun that heats earth is larger than the amount of heat that earth emit to space. I find that logical, the earth is not equally warm throughout, and then it has to emit less energy. Only when the system is equally warm in every point inside, it emits as much heat to space as it receives."

    You have taken a requirement for a body heated externally, and equally from all directions, and assumed it is a universal condition. It is not.

    To take a simple example, consider a spherical body having the same thermal conductivity throughout, bathed in a fluid of uniform temperature, but having a significant heat source at the center. According to you, it must have the same temperature throughout before energy in can equal energy out. But, based on Fourier's law of conduction, if there is no temperature gradient, there is no movement of energy by conduction. It follows that, on your theory, the heat from the source at the center could never leave, which would result in an infinite energy build-up at the center.

    Your assumed requirement does not even describe such very simple models. It has been falsified, in fact, since Fourier's experiments that led to his seminal work. It certainly does not apply to the complicated situation of an atmosphere, or a large, massive rotating spheroid heated intensely from one side and situated in a heat bath of near zero degrees absolute, ie, to the Earth.

    Your claim is also refuted by the Earth itself, which has existed for long enough, with a very stable energy source, that it is in near thermodynamic equilibrium.  If your supposed condition held, then there would be no significant difference in temperature with altitude.  Despite that, ice has existed at altitude in the tropics for hundreds of thousands of years. 
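    The spherical-body example above can be made quantitative (an editorial sketch with made-up illustrative numbers). In steady state, the heat Q generated at the centre must cross every spherical shell, so Fourier's law, -k dT/dr = Q/(4 pi r^2), integrates to T(r) = T_R + Q/(4 pi k) (1/r - 1/R): energy out equals energy in precisely because the interior is warmer than the surface.

    ```python
    import math

    # Steady-state conduction in a sphere of radius R with heat source Q at
    # the centre and surface held at T_R. All numbers are invented for
    # illustration; only the shape of the profile matters.
    def temperature(r, Q=100.0, k=2.0, R=1.0, T_R=300.0):
        return T_R + Q / (4 * math.pi * k) * (1 / r - 1 / R)

    # The interior is warmer than the surface: that gradient is exactly what
    # carries the centrally generated heat outward, refuting the idea that a
    # body must be isothermal before energy in can equal energy out.
    print(temperature(0.1))  # well above the 300 K surface temperature
    print(temperature(1.0))  # 300.0 (the surface)
    ```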

    3)

    "Hansen wrote about satellite measurements showing an imbalance of 6.5W/m^2 averaged over 5 years. Then he says it was thought to be implausible and they made instrumentation calibrations to align the devices with what the models say, 0.85W/m^2."

    Satellite measurements currently suffer a disadvantage, in that while they are very accurate in showing relative changes in Total Solar Irradiance (TSI) and Outgoing Long Wave Radiation (OLR), they are fairly inaccurate in showing absolute values.  This was known from design specifications, and also by comparison of the data from instruments of the same, or different design over the same period, as here:

    That means that while we can know the annual change in the energy imbalance quite accurately, we cannot know its absolute value from satellites alone. Two different methods are used to compensate for this. In the past, the values from climate models were used of necessity. Since the advent of Argo, the rise in OHC is sufficiently well known that it can be used to calibrate the absolute energy imbalance. Hansen discusses both methods (which approximately agree, and certainly agree far better than either does with the value from the satellites). Further, the specific calibration you mention was not Hansen's, but that of Loeb (2006).

    4)

    "How can forcings be known accurately if they are not a result of measurements? Not any of the studies show how any numbers of forcing has been achieved."

    Hansen does not say the forcings are known accurately. Rather, he shows the Probability Density Functions of the forcings:

    As can be seen, the 95% confidence limits of the greenhouse gas forcing span a range of about 1 W/m^2, or approximately a third of the best estimate forcing. In contrast, the aerosol forcing has a 95% confidence range of about 3 W/m^2, or just over twice the best estimate.
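    Those two ratios pin down the implied best estimates; a line of editorial arithmetic (the "best estimate" values below are back-calculated from the quoted ranges, not read off Hansen's figure):

    ```python
    # Back out the implied best-estimate forcings from the ranges quoted above.
    ghg_range = 1.0      # W/m^2, 95% range, "about a third of the best estimate"
    aerosol_range = 3.0  # W/m^2, 95% range, "just over twice the best estimate"

    ghg_best = ghg_range * 3          # ~3 W/m^2 greenhouse-gas forcing
    aerosol_best = aerosol_range / 2  # ~1.5 W/m^2 (negative) aerosol forcing

    print(ghg_best, aerosol_best)  # 3.0 1.5
    ```

    So the aerosol forcing is known far less precisely, in relative terms, than the greenhouse-gas forcing, which is the point being made.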

    5)

    "And I can´t find any descriptions of the heat flow the way I think it should be done, or rather, the way I like it."

    Given the level of understanding of thermodynamics shown in your claims about equal temperature, it is neither a surprise nor a problem that you cannot find descriptions of heat flow the way you like. GCMs do, however, use the standard laws of thermodynamics, and of heat flow in its various forms.

  • There's no correlation between CO2 and temperature

    Tom Curtis at 14:29 PM on 28 December, 2016

    HB @167 claims:


    "And yet you use a fudge factor called albedo. Trenberth himself makes no secret of how they adjust albedo to cover up for imbalance.

    How can anyone make an argument of "540.1" when ~30% is yanked from the input value without justification from real measurements?"


    In fact, the outgoing Short Wave radiation at the Top Of the Atmosphere is measured by the CERES instrument flown on the Terra and Aqua satellites. Together with Total Solar Irradiance (TSI) data from the TIM instrument, that allows the direct calculation of the energy balance and albedo as:

    Energy balance = TSI/4 - (OLWR + OSWR)

    Albedo = 4 x OSWR / TSI,

    where OLWR is Outgoing Long Wave Radiation, and OSWR is Outgoing Short Wave Radiation.

    TSI is divided by 4 in the energy balance equation because it is measured relative to a flat plane perpendicular to the incoming radiation, and needs to be averaged over the sphere to match the other two products, which are measured as averages over the Earth's surface. Likewise, to convert the OSWR to the equivalent of the TSI, it needs to be multiplied by 4 in the albedo equation.

    For CERES best product (syn1deg), the values are:

    OLWR:  237.2 +/- 10 W/m^2

    OSWR: 97.7 +/- 3 W/m^2

    Incoming Solar (=TSI/4): 341.3 +/- 0.2 W/m^2

    That yields an energy imbalance of 6.4 W/m^2, which contradicts the far more accurately measured energy imbalance from ocean heat content measurements.  Knowing the large errors in absolute magnitude of the values, they are therefore adjusted by 27%, 73% and 450% of the 2 sigma error values respectively (for the values shown in the figure shown @159 above).  Note that graph is from a slightly different time period from the error values and absolute values I have shown, so that part of the discrepancy may be a difference in the observed values.
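    Plugging the CERES values above into the two expressions (a quick editorial check; note the albedo here is reflected over incoming, 4 x OSWR / TSI):

    ```python
    # Check of the energy-balance arithmetic using the CERES syn1deg values
    # quoted in the comment.
    OLWR = 237.2        # outgoing long-wave, W/m^2
    OSWR = 97.7         # outgoing short-wave, W/m^2
    incoming = 341.3    # TSI/4, W/m^2
    TSI = 4 * incoming  # 1365.2 W/m^2 at a plane facing the Sun

    imbalance = incoming - (OLWR + OSWR)  # energy retained by the Earth
    albedo = 4 * OSWR / TSI               # fraction of sunlight reflected

    print(round(imbalance, 1))  # 6.4 W/m^2, as stated below
    print(round(albedo, 3))     # 0.286, i.e. ~29% reflected
    ```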

    The upshot is that the adjustment to the albedo term in the energy budget amounts to approximately 3 W/m^2.  HB instead describes it as a greater than 100 W/m^2 fudge.  His fudge on the adjustment amounts to a factor of >33.  At the same time he describes the OSWR as unobserved which is blatantly false, and neglects that the reason for the fudge is to bring the energy balance into line with observed changes in surface heat content, ie, a decision to use the more accurate determination of the total energy imbalance in preference to one whose inaccuracy due to instrument limitations was an order of magnitude greater.  In HB's version of science, scientists should always place greatest weight on their least accurate observations.

    I need only add that Trenberth describes the above sources of data, and the reasons for the adjustments, at the same place as he mentions them. Given the standard etiquette of quotation and citation, if you are relying on somebody else's word as to what somebody said, you need to quote them rather than the original source. As HB mentions Trenberth directly, he should be assumed to be referencing Trenberth directly, and hence has demonstrated a complete inability to understand the cited source, or a breathtaking dishonesty. Perhaps, however, he is as uninformed about the etiquette of citation as he is about climate science, and has merely demonstrated an abominable lack of desire to fact-check any factoid he gleans which supports his bizarre theory of what science is.

  • It's the sun

    Bill N at 03:06 AM on 16 October, 2016

    Bob,

    You have mischaracterized what I have said on almost every count.

    The only thing you have corrected me on is that the FOV for these instruments encompasses the entire solar disk, which only serves to strengthen my argument that the solar light hitting the cavity wall will degrade it and cook any outgassing contamination on it.  I never said an adjustment must be made to account for what I thought was a limited FOV, so again you mischaracterize.

    I never said that there were any other optics besides the open aperture and the optical cavity, so again you mischaracterize.  You yourself said that the only room for optical change is the cavity reflectance (absorption ratio).  But that's exactly what I have been talking about, nothing more, nothing less.  It is the long term stability of this optical property under degrading environmental influences that is the question here.

    I did not feel the need to explain what "active cavity radiometers" are beyond the optical components being evaluated, assuming that interested readers here can Google this in a split second and get a lot more detail on them than what I could provide in writing.  So what's your beef here?

    Again you BS when claiming that I said the instrument developers did not consider stability.  I have stated here till I'm blue in the face that there is disagreement amongst the developers and users over what the stability is, even to the point that some think they cannot be relied upon for long-term TSI variability measurements.  So obviously this is a hot topic for them, as I have said again and again.

    Finally, I have made no claim whatsoever as to being an authority on space-based TSI measurements, so your last mischaracterization of me about this is a cheap shot.  My only "authority" is my general experience as an optical engineer flying spaceborne instruments, with radiometry of course part of that experience (some of which covered solar measurements).

    So summarizing Bob, you're really a piece of work.

    After receiving so much vitriol about reasonably posed questions and thoughts by me, I decided to Google what other folks think about what is going on at Skeptical Science.  Wow!  It seems the world opinion is that this site is populated by a bunch of alarmist trolls (kids mainly) who engage in dirty tactics to voraciously defend their pseudo-scientific viewpoints, so that anyone who comes here with a differing viewpoint, no matter how reasoned, will not be treated fairly.  Well, that is certainly what I have experienced here (with a few exceptions).  So goodbye.

  • It's the sun

    Bob Loblaw at 01:38 AM on 16 October, 2016

    I will try to keep this short for the moment, as it looks as if BillN may be leaving.

    BillN has made several assertions about space-based measurements of TSI. I have not been involved in any space-based measurements, but I have a dozen years of experience in ground-based measurements of direct beam solar radiation using Eppley Hickey-Frieden (HF) cavity radiometers, of identical type to those that have been used in space. [All Eppley HF radiometers are built to the same space-rated specifications. I can't point to a peer-reviewed article that says so, so in a scientific paper I would have to reference this as "John Hickey, personal communication". He's the "Hickey" in "HF"...]

    Anyway, BillN has made several questionable assertions. I will respond to a few:

    • He refers to "optical stability". In the Eppley HF, the only "optics" are a black cavity that is fully-exposed to sunlight - no glass, no optics to focus sunlight, just an exposed cone-shaped receiver. The important "optical" characteristic of this receiver is its absorption ratio (or reflectivity, if you prefer). If that were to change, then stability would be affected, but all that cavity does is absorb solar radiation.
    • The radiometer also has a tube and calibrated orifice arrangement to limit the field of view. You may also call this "optics", if you like, but it's not as if there is a telescope or anything like that. It's much like limiting your field of view by holding a paper towel tube in front of your eye. It's fancier than that - black interior, etc., to limit stray light reflections, and a controlled-area aperture at the end so that you get an exact field of view, but that's it. The view of the sun is completely unobstructed.
    • The field of view of the Eppley HF is slightly larger than the diameter of the sun, so BillN's assertion in comment #1180 that "Even though the FOV (Field of View) of the instrument picks up only a small fraction of the solar disk..." is simply wrong for the Eppley HF. For ground-based measurements, this means that the instrument also views a bit of scattered sunlight around the sun, but in space this will not happen. There is no adjustment for seeing a portion of the solar disk, as BillN has stated.
    • BillN correctly refers to "active cavity radiometers", without explaining what they are. The Eppley HF can be operated either in active or passive modes. The principle of operation is that the cavity that absorbs solar radiation will heat up, which introduces a temperature gradient measured by a thermopile. In active mode, this heating is offset by an electrical heater, and by measuring the electrical heating rate you will know the solar heating rate. In passive mode, you measure the thermopile output caused by solar heating (no electrical offset), but periodically shade the instrument (no sun) and substitute a short period of electrical heating to check the calibration. The calibration result is used to convert the solar-heating output to irradiance. Ground-based observations using the HF will usually use passive mode (e.g. at the International Pyrheliometer Comparisons held every five years in Davos, Switzerland, where the World Radiation Reference is maintained). These IPCs (which have been happening since the 1960s) are a primary indicator of instrument stability in ground-based measurements.
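    The active-mode electrical-substitution principle described above can be put in a few lines of code. This is only a sketch: the absorptance, aperture area, and heater power below are invented round numbers for illustration, not actual Eppley HF constants.

```python
# Electrical-substitution (active cavity) radiometry, in outline: the
# electrical heater power needed to replace the solar heating of the
# cavity, divided by cavity absorptance and aperture area, gives the
# irradiance. All constants here are illustrative, not real.

ABSORPTANCE = 0.999     # cavity absorption ratio (assumed)
APERTURE_AREA = 0.5e-4  # precision aperture area, m^2 (assumed)

def irradiance_from_substitution(p_electrical_watts: float) -> float:
    """Solar irradiance (W/m^2) inferred from the substituted heater power."""
    return p_electrical_watts / (ABSORPTANCE * APERTURE_AREA)

# A heater power that balances the solar heating in this toy example:
print(round(irradiance_from_substitution(0.068), 1))  # 1361.4 W/m^2
```

    The point of the substitution is that the measurement reduces to electrical power and geometry, which is why the cavity absorptance is the one optical property whose stability matters.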

    So, stability of an HF instrument depends on the absorption in the cavity remaining stable, and the electronics that measure the electrical heating remaining stable. There are no other "optics" involved.

    Rather than taking my word on any of this, Hickey, Frieden, and Brinker have reported on the stability of the Eppley HF after six years in space:

    Report on an H-F Type Cavity Radiometer after Six Years Exposure in Space Aboard the LDEF Satellite

    J R Hickey, R G Frieden and D J Brinker

    Metrologia, Volume 28, Number 3

    This is a 1990 paper, unfortunately paywalled, but the abstract reports a 0.1% stability, with a 0.1% uncertainty on that value. 0.1% of 1368 W/m2 works out to less than 1.5 W/m2. After accounting for global albedo (30%) and dividing by 4 (area of sphere vs. area of circle), this leads to an uncertainty of less than 0.25 W/m2 in global absorbed solar radiation. Much less than the CO2 forcing.
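    The arithmetic can be checked directly (a quick sanity check using the same 1368 W/m2 solar constant, 30% albedo, and factor-of-4 sphere-vs-disc geometry quoted above):

```python
# Propagate a 0.1% stability uncertainty on TSI through to the
# globally averaged absorbed solar radiation.

TSI = 1368.0   # W/m^2, value used above
ALBEDO = 0.30  # global albedo

tsi_uncertainty = 0.001 * TSI                       # 0.1% of TSI
absorbed_uncertainty = tsi_uncertainty * (1 - ALBEDO) / 4  # sphere/disc = 4

print(round(tsi_uncertainty, 2))       # 1.37 -> "less than 1.5 W/m2"
print(round(absorbed_uncertainty, 2))  # 0.24 -> "less than 0.25 W/m2"
```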

    BillN is wrong in implying that the developers of such instruments have not considered stability. Tom Curtis' post above also explains how examination of multiple instruments and multiple sources of analysis increases confidence in the readings of TSI.

    In short, BillN's implied position of infallible authority on matters of spaced-based TSI measurements is fallible.

  • It's the sun

    Bill N at 00:05 AM on 16 October, 2016

    Hello Tom,

    Thank you for the excellent post about the satellite instruments.  It looks like you put a lot of work into it, which is greatly appreciated.

    I especially took notice of the part about comparing 3 instruments at once to reduce errors.  This of course significantly reduces all "drift errors" in which the drift mechanism has an equal likelihood of moving in either direction.  Unfortunately, optical radiometric changes are typically "one way," so that for instance all 3 compared instruments will have a reduced sensitivity over time due to the three error sources identified previously: outgassing-induced contamination, solar-light-induced degradation of the cone optical surface, and accumulation of spaceborne dust.  All three sources will increase the diffuse reflectivity of the (designed) specular cone, thereby reducing the amount of light collected, yielding a lower measured TSI over time for all instruments flown.

    Of course, the degree of change for any particular source will vary between instruments.  One of these error sources is likely to be predominant over the others (my guess would be either the outgassing contamination or the solar-light-induced degradation), and if its rate of change varied significantly from one instrument to the next, it would be detected as a relative change in the sensitivity of the 3 instruments being compared.  But IMHO, the variation between instruments in the rate of change of the predominant error source is likely to be less than the average rate of change for the instruments combined, so as long as the average rate of change is not dramatic, the variation in the rate of change between instruments will not be significant enough to indicate the presence of the error.  The TSI measurements themselves are "stable" enough to indicate that the optical error sources are not dramatic, so the predominant one is of course also not dramatic.  Therefore, whatever the predominant optical error source is, it is reducing the sensitivity of all instruments together, but not so much as to reveal the variation in the rate of change from one instrument to the next.  However, the average variation can still be large enough to significantly exceed the 0.01% stability requirement necessary to rely on these instruments to show the TSI levels are not changing.  Therefore, the issue still remains.

    You are right in not taking my word on this issue with any more credence than what anyone else has to say, especially against the engineers/scientists building and using these instruments.  However, in one respect, I am simply pointing out what any objective optical engineer will tell you, which is that an optical stability level of <0.01% for radiometric instrumentation is extremely difficult to obtain and prove, especially for "field instruments" such as ones flown in space.  You should also be aware that there is disagreement amongst the instrument engineers and scientists as to whether they can be relied upon at all for measuring TSI changes (as opposed to absolute TSI measurements, which everyone seems happy with).  In particular, the claims of achieved stability made by PMOD are in dispute, as are their resultant measurements.  I suggest you Google this issue and take a look for yourself (I mentioned a starting point in my original post).

    I believe I've "done my job here" in pointing out that the claimed stability of these instruments, needed to successfully infer that the TSI is not changing (or even going down), is in question even amongst the engineers and scientists that built and use these instruments.  So I am now "signing off."

  • It's the sun

    michael sweet at 23:52 PM on 15 October, 2016

    Bill,

    At this time you have provided no links to any "peer reviewed" studies that you used to form your opinions.  I have only your unsupported claim that you are an expert on TSI (although you have apparently never been involved with the measurement of TSI).  I have provided links to peer reviewed studies that support my position.  The Comments rules for Skeptical Science require peer reviewed links to support your position.

    Eclectic and I have shown that TSI measurements are not required to determine that carbon pollution, and not an unmeasured increase in TSI, is the cause of warming.  It is incumbent on you to answer these arguments.  Since you have refused to even acknowledge them, I presume that you have realized that you cannot respond and concede our position.

    You have not "[shot] down a major cornerstone used by folks claiming that the observed warming must be due then to manmade greenhouse gas emissions"; you have made unsupported assertions from unsubstantiated authority.  Tom Curtis above has demonstrated (using peer reviewed sources) that your argument is incorrect.  That leaves you with only your unsupported opinion.  Since I have shown that TSI measurements are unnecessary to prove that the warming is not due to the sun, you have shown nothing.

    You are welcome to decide who you want to communicate with.  If you want to argue by claiming expertise in a subject you clearly have not studied well (I look forward to your peer reviewed studies that answer Tom above), claim you have proven a substantial part of AGW theory incorrect when you have not, refuse to provide peer reviewed data, and ignore arguments from other sources of analysis that show your argument fails, go for it.  I think you will find that your unsupported opinions do not convince people at this web site.

    A word to the wise: people who come here with bold assertions that they have overturned everything scientists have learned over the past 150 years are frequently received harshly.  If you instead ask questions about what you do not understand, people are happy to discuss these issues in great detail.  If you say you do not understand the TSI measurements and how they relate to overall AGW theory, you will be much better received.  If you say you will not discuss how scientists know that the sun is not the cause of warming because you are an expert (at something you have never measured), you will not get friendly responses.

  • It's the sun

    Tom Curtis at 17:36 PM on 15 October, 2016

    Bill N @various, the IPCC AR5, Working Group 1, Chapter 8.4.1 says:

    "Total solar irradiance (TSI) measured by the Total Irradiance Monitor (TIM) on the spaceborne Solar Radiation and Climate Experiment (SORCE) is 1360.8 ± 0.5 W m–2 during 2008 (Kopp and Lean, 2011) which is ~4.5 W m–2 lower than the Physikalisch-Meteorologisches Observatorium Davos (PMOD) TSI composite during 2008 (Frohlich, 2009)."

    The SORCE/TIM home page gives an accuracy to 1σ of 350 ppm, or 0.48 W/m^2 of 1360.8 W/m^2, so I take that accuracy to represent one standard deviation, giving a 2σ accuracy of 0.07%.  The SORCE/TIM home page also gives a "long term repeatability" of 10 ppm per annum (1σ), or 0.002% per annum (2σ), where the former is the accuracy of the absolute estimate of TSI, and the latter is the accuracy of estimates of relative change in TSI from year to year.

    It also says:

    "TSI variations of approximately 0.1% were observed between the maximum and minimum of the 11-year SC in the three composites mentioned above (Kopp and Lean, 2011). This variation is mainly due to an interplay between relatively dark sunspots, bright faculae and bright network elements (Foukal and Lean, 1988; see Section 5.2.1.2). A declining trend since 1986 in PMOD solar minima is evidenced in Figure 8.10. Considering the PMOD solar minima values of 1986 and 2008, the RF is –0.04 W m–2. Our assessment of the uncertainty range of changes in TSI between 1986 and 2008 is –0.08 to 0.0 W m–2 and thus very likely negative, and includes the uncertainty in the PMOD data (Frohlich, 2009; see Supplementary Material Section 8.SM.6) but is extended to also take into account the uncertainty of combining the satellite data."

    (Emphasis in original)

    The Guidance Note on Uncertainty indicates that something is "very likely" if it has a 90-100% probability (Table 1), so we can take that uncertainty range to be the 90% uncertainty range.  The reduction in the PMOD composite over that period is 0.04 W/m^2, giving an estimated reduction, with 90% confidence intervals, of 0.04 +/- 0.04 W/m^2, which represents an accuracy of 0.003% relative to the 1360.8 W/m^2 estimate of TSI from SORCE/TIM.  Assuming a normal distribution, a 90% confidence interval of 0.04 W/m^2 is equivalent to a 2 standard deviation interval of 0.05 W/m^2.  Even allowing that the error on the IPCC estimate represents an 80% range (the minimum consistent with their expressed likelihood) yields a 2 standard deviation interval of 0.06 W/m^2.
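    The confidence-interval conversions above can be reproduced with standard normal quantiles (a sketch using Python's `statistics.NormalDist`; the relevant z-values are ~1.645 for a 90% interval and ~1.282 for an 80% interval):

```python
from statistics import NormalDist

def two_sigma_from_ci(half_width: float, ci_level: float) -> float:
    """Convert a symmetric CI half-width at ci_level to a 2-sigma interval."""
    z = NormalDist().inv_cdf(0.5 + ci_level / 2)  # e.g. ~1.645 for 90%
    return 2 * half_width / z

# Half-width of the IPCC range (-0.08 to 0.0 W/m^2) is 0.04 W/m^2:
print(round(two_sigma_from_ci(0.04, 0.90), 2))  # 0.05 W/m^2
print(round(two_sigma_from_ci(0.04, 0.80), 2))  # 0.06 W/m^2
```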

    The current version of the PMOD composite (with monthly values estimated by taking the mean of daily values, excluding missing values) shows an OLS trend of -0.014 +/- 0.008 W/m^2 per annum (2σ error margin).  The absolute difference between the months with the lowest Sunspot Number in the respective minima (June 1986 and Aug 2008 respectively) is -0.255 W/m^2, while that between the averages of the two years is -0.219 W/m^2.  The trend difference is -0.3 +/- 0.18 W/m^2.  Expressed as changes in forcing to two significant figures, these values are -0.04 W/m^2, -0.04 W/m^2 and -0.05 +/- 0.03 W/m^2 respectively.  It appears, therefore (and is confirmed by the IPCC 8.SM.6), that the value expressed by the IPCC is the absolute difference between the relevant years expressed as a forcing rather than as TSI.  Expressed as percentages of the 2008 TSI (SORCE/TIM), they all represent 0.02% of the value, rounded to two significant figures.
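    The TSI-to-forcing conversions quoted above follow from ΔF = ΔTSI × (1 − albedo)/4. A quick check, assuming the usual 0.3 global albedo:

```python
# Convert the quoted PMOD TSI differences to radiative forcing:
# dF = dTSI * (1 - albedo) / 4.

ALBEDO = 0.30

def tsi_to_forcing(d_tsi: float) -> float:
    """Radiative forcing (W/m^2) from a TSI change (W/m^2)."""
    return d_tsi * (1 - ALBEDO) / 4

for d_tsi in (-0.255, -0.219, -0.3):  # W/m^2, differences quoted above
    print(round(tsi_to_forcing(d_tsi), 2))
# -0.04, -0.04, -0.05, matching the forcings given in the text
```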

    The IPCC's (and PMOD's) stated accuracy is less than the relative accuracy of TIM (by all accounts their most reliable single instrument) if extended over the June 1986-Aug 2008 interval, which would have an accumulated error of 0.6 W/m^2 (or 0.11 W/m^2 expressed as a forcing; 2σ error margin).  This is in part due to the fact that typically (and always from 1986 to 2008) at least three spaceborne instruments have observed TSI at any given time.  This is important, both because the drift in satellite instruments is unlikely to be synchronous, and because multiple measurements reduce error (as errors are summed in quadrature).
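    The accumulated-drift figure can be reproduced (a sketch assuming the 10 ppm/yr 1σ repeatability accumulates linearly over the roughly 22 years from June 1986 to August 2008, with a 0.3 albedo for the forcing conversion):

```python
import math

TSI = 1360.8          # W/m^2, SORCE/TIM value for 2008
DRIFT_1SIGMA = 10e-6  # long-term repeatability: 10 ppm per annum (1 sigma)
YEARS = 22            # June 1986 to Aug 2008, approximately

# Linear accumulation of the 2-sigma annual drift over the interval:
accumulated_2sigma = 2 * DRIFT_1SIGMA * YEARS * TSI
print(round(accumulated_2sigma, 1))  # 0.6 W/m^2

# Expressed as a forcing (albedo 0.3, sphere/disc factor of 4):
print(round(accumulated_2sigma * (1 - 0.30) / 4, 2))  # 0.1 W/m^2

# With three instruments observing at once, independent random errors
# summed in quadrature shrink the combined error by a factor of sqrt(3):
print(round(accumulated_2sigma / math.sqrt(3), 2))  # 0.35 W/m^2
```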

    More importantly, in developing PMOD, each satellite record of TSI was fit against the square root of the Sunspot Number (SSN), which then provided a framework to develop the composite TSI record.  In that way, the superior accuracy of the SSN record is used to overcome the deficiencies of the satellite record.  Because of this use of the SSN, and because of the use of multiple instruments observing simultaneously, no consideration of innate instrument accuracy alone can correctly characterize the error in TSI observations.
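    The fitting step described above can be illustrated schematically. The sunspot numbers and TSI values below are invented for the sketch, not real data, and the real PMOD procedure is considerably more involved:

```python
import math

# Toy version of the approach described above: regress a TSI record
# against sqrt(SSN), then use the fit to predict TSI from SSN alone.
# All data here are invented for illustration.

ssn = [10, 40, 90, 160, 120, 60, 20]
tsi = [1360.6, 1360.9, 1361.2, 1361.5, 1361.3, 1361.0, 1360.7]

x = [math.sqrt(s) for s in ssn]
mx = sum(x) / len(x)
my = sum(tsi) / len(tsi)

# Ordinary least squares slope and intercept for tsi = a + b*sqrt(ssn):
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, tsi)) / sum(
    (xi - mx) ** 2 for xi in x)
a = my - b * mx

def tsi_from_ssn(s: float) -> float:
    """Predict TSI (W/m^2) from a sunspot number via the fitted relation."""
    return a + b * math.sqrt(s)
```

    A fit of this shape gives each instrument's record a common frame of reference, which is what lets the composite lean on the SSN record's long-term stability.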

    The use of SSNs by PMOD also indicates that belief in the decline in solar intensity is not based on satellite instruments alone.  Indeed, the SSNs show a trend of -3.09 +/- 1.33 per annum (2σ range).  Given that you accept the accuracy of the SSN record, and accept a high correlation between SSNs and solar intensity, that should mean you also agree that solar intensity has declined over the last four decades, although we may be in doubt as to just by how much.

    In the meantime, and with respect, you retain effective anonymity, so your authority on this or any topic (as is mine) is just that of "some random guy on the internet".  Even if we take your claimed credentials and expertise at face value (as I am inclined to do), those credentials and experience are no better than those of the teams of scientists working on individual TSI instruments, composites, and on reviewing the data for the IPCC.  Given that, I see no reason to give your beliefs on this matter particular credence.

  • It's the sun

    Eclectic at 11:32 AM on 15 October, 2016

    Bill N . . . my apologies for not completely following your chain of logic.  Surely there must be some factor I am overlooking?

    You have said there is no clear evidence (from the satellite measurements) that TSI has increased or fallen since 1978.  As to TSI in earlier times: observed sunspot activity has a less than perfect correlation, and ice-core and tree-ring proxies are based on the deposition of certain isotopes - an isotopic deposition suffering considerable variation from changes in the terrestrial magnetic field and changes in atmospheric circulation.

    It is well established that there has been rapid global warming in the last 50 or 100 years.  And the general scientific view is that that change is well explained by the radiative properties of CO2 (and other greenhouse gases).

    Yet you yourself feel that possibly there has been a relatively large [0.1% or greater] rise in TSI over the past century or two, which possibly may have caused all the recent warming.

    In addition, you will somehow have to abolish the warming effect of CO2 and other greenhouse gases in order for your proposed (but not demonstrated) large and rising increment of TSI to do its work in heating up the Earth.  On top of that, you will need to explain away the observed diurnal and climatic changes mentioned by Michael Sweet (above).

    Bill N, your line of argument does not hold together.

    So there must be some other factor which I have overlooked?

  • It's the sun

    michael sweet at 09:47 AM on 15 October, 2016

    Bill,

    It strikes me that you are speaking with a great deal of confidence for someone who has not read very much about AGW.

    In the Scientific Guide to Global Warming Skepticism, there is an illustration at the bottom of page 3 that shows why we know without doubt that the warming is due to carbon dioxide and not the sun.  If the Sun were causing the warming, we would expect days to warm faster than nights, summer faster than winter, the stratosphere to warm with the troposphere, the same amount of heat to be returning to Earth as backradiation, the same heat escaping to space, and several others.  We measure that nights are warming faster than days, winter is warming faster than summer, the stratosphere is cooling as the troposphere warms, there is more backradiation, and less heat is escaping to space.  You will have to counter all of these observations if you wish to support your claim that an unmeasured TSI increase could be the cause of warming.

    I suggest that you forget all the propaganda that you have read at WUWT and other skeptical sites and try reading the Newcomers Start Here post on the home page.  If you continue to post here with claims supported only by your opinion, you will not get very far.  Your opinion as an engineer about TSI measurements does not count for much against the observations I have summarized above.  It is not necessary to have any TSI measurements to be sure that the warming is caused by carbon dioxide pollution and not the Sun.

    Keep in mind that the warming caused by carbon dioxide pollution was predicted in 1896 by Arrhenius.  Arrhenius predicted most of the observations that I listed above 100 years before they were measured.  You are countering a 120 year old scientific prediction with an ad hoc explanation that has a great deal of evidence against it and no measured support.

  • It's the sun

    Bill N at 06:28 AM on 15 October, 2016

    Hi Michael,

    You have hit the nail on the head bringing up the other measurements spanning well before the satellite measurements.  Indeed, a key reason for me "throwing out" reliance on the satellite-measured TSI changes is to force reliance on just the ground-based measurements you are talking about.

    These measurements primarily fall into two groups: accurate measurements of sunspots and faculae since around 1850, and measurements of carbon-14 levels in tree rings going back much further.

    On the sunspot/faculae measurements, the best solar modelling to date establishes a causal relationship between the average magnetic flux in the solar outer layer (averaged over the 11-yr sunspot cycles) and the "amplitude" of the sunspot cycles, with a higher amplitude meaning a higher magnetic flux.  In turn, the solar modelling also establishes a causal relationship between the average magnetic flux and the TSI, with a higher flux meaning a higher TSI.  Now until the latest sunspot cycle, the cycle "amplitude" (variation in areal sunspot/faculae coverage) has been observed to increase since the beginning of when such measurements were made, starting around 1850.  Using the latest and greatest TSI vs magnetic flux vs sunspot cycle amplitude modelling, the predicted averaged TSI has increased on the order of 0.1% from 1850 to present.  Based on our latest and greatest atmospheric temperature modelling under solar loading, this predicts beautifully the observed solar warming of about 0.7C from 1850 to present.  So there you go.  Once you throw out the unreliable satellite data, then the observed global warming since 1850 is completely explained by the observed/modelled solar TSI increase, down to evidently the observed "stratospheric cooling," by including the above-mentioned "atmospheric layering effect" of the natural greenhouse gas that is always present.

    The carbon-14 data is used to determine past TSI levels using modelling in which the cosmic rays inducing carbon-14 production are partially "blocked" by solar wind production that has a known causal relationship with sunspot activity.  The carbon-14 is absorbed by trees as they grow, with the tree rings giving a timeline for the past carbon-14 presence in our atmosphere.  This then is used to infer past sunspot and solar wind activity, which in turn infers past TSI levels.  The resultant historical TSI levels inferred from this technique fit beautifully with the TSI levels based on the directly observed sunspots from 1850 to present, providing key support for the conclusion that the observed global warming is solar induced.

    With my principal goal at this website established, namely throwing out reliance on the satellite data, the onus on folks thinking that the observed global warming is due to manmade greenhouse gas emissions is to prove that it is NOT indeed due to a solar TSI increase since the mid-1800's, as this is the most straightforward conclusion that can be made based on the best modelling and science we have to date.

  • It's the sun

    michael sweet at 03:02 AM on 15 October, 2016

    Bill,

    In the graph of TSI in the intermediate OP, their data goes back to 1880, well before satellite measurements.  In the link to their data (Krivova 2007) they describe a model that estimates the TSI based on a series of ground based measurements.  I am not an expert like you, but it seems to me that your argument that the satellites are not stable enough for the displayed data is moot, since the data is not satellite based.  Perhaps they use the satellites to ground truth the model?  Can you address the issues with the model used to generate the graph of the data?  Obviously it is best if you have a direct measurement, but for the period before satellites you have to use the data you have and a model.  The described calibration issues of the satellites are much larger than the changes in the TSI over the period of observation.  Since it is based on ground measurements, can the model be used to correct the calibration errors of the satellites?

    It seems reasonable to me that if you had 30 years of satellite data you could calibrate a model that would generate data covering the period before you had the satellite data.  For the climate argument they do not require the model to be absolutely correct; they only need relative TSI to determine if the warming correlates with TSI.

    In the linked data source they describe several models of TSI that are used to generate data over various time periods before we had satellite or ground-based machine TSI measurements.

  • It's the sun

    Bill N at 22:54 PM on 14 October, 2016

    There is no way that the satellite instruments measuring TSI are stable enough to make any claims one way or another about its change since they have been flown (1978 to present).  So the belief that TSI has been stable or even gone down based on these measurements is a myth.

    Stability requirement:  As is understood, a change on the order of 0.1% in TSI could yield the observed average warming.  So in order to use these instruments to successfully state that this is not happening, they would need to be stable to <0.01%!  This is the stability requirement stated by the instrument engineers and scientists themselves (see for instance the Wikipedia "Solar irradiance" article, and then click on the links to articles written by the instrument engineers/scientists).  Mind you, this requirement needs to be satisfied over the entire series of instruments flown since 1978, with the change in TSI "passed through" several of these instruments to the present, using each instrument in the "daisy chain" to calibrate the next one.
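    The requirement can be put in absolute terms (a quick sketch; the nominal ~1361 W/m^2 TSI is an assumption, not a figure from the comment):

```python
# Express the 0.1% signal and the <0.01% stability requirement
# in W/m^2, assuming a nominal TSI of 1361 W/m^2.

TSI = 1361.0              # W/m^2, nominal (assumed)

signal = 0.001 * TSI      # the 0.1% change at issue
required = 0.0001 * TSI   # the <0.01% stability requirement

print(round(signal, 2))   # 1.36 W/m^2 over the record
print(round(required, 2)) # 0.14 W/m^2, a tenth of the signal
```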

    Community stability assessment:  In the community of engineers and scientists that actually build and fly these instruments, there is a high degree of skepticism that they are stable to the 0.01% requirement (or even anywhere close to this).  Just Google the subject of satellite TSI instruments, or fan out from the Wikipedia "Solar irradiance" article, for yourself.

    My background:  I was an optical instrument engineer for my entire career, with a lot of that time working as a contractor for NASA GSFC (such as working to fix the Hubble telescope).  I designed, built, and calibrated flying instruments, as well as the instruments used to test same.  A considerable amount of my time was spent dealing with issues affecting the long term performance of optical instruments, including radiometric stability.  With this experience, I was constantly called upon by NASA to evaluate the instruments built by others.  If NASA had used GSFC to fly these instruments, as it "should" have done since it is GSFC's purview to fly any unmanned bird within the bounds of lunar orbit, there was a fair likelihood that the long term stability assessment of these instruments would have ended up on my desk.

    My stability assessment:  Any optical engineer with any radiometric experience whatsoever would ROFL if you claimed that you had a radiometric instrument stable to <0.01%, let alone a whole "daisy chain" of them flown over decades in space.  This is certainly true after examining the design of these instruments (active cavity electrical substitution radiometers) and the environment in which they operate.  Leaving the electronics aside, there is no way that these radiometers could be assumed to be optically stable to 0.01%, or anywhere near that performance.  OMG, you're hitting the radiometric cavity with full vacuum sunlight.  Do you have any idea what that does to any optical surface over time?  Also, no matter how careful you are, there can and will be outgassing from the internal surfaces of the instrument cavity (especially all that baffling), which will collect on the (designed) specular surface of the light cone used to trap the light.  When the sunlight hits that contamination, it will "fry" it, causing "globules" that will increase the diffuse reflectivity of the cone over time, thereby decreasing the amount of specular light absorbed by the cone, and therefore its sensitivity.  This is something we instrument engineers have seen time and again with spaceborne instruments measuring full sunlight.  The present generation has some accompanying ground "witness instruments" that are twins of what is being flown, and the engineers/scientists have wisely put one permanently in a vacuum chamber along with a solar simulator to measure its long term stability under the expected conditions.  But this measurement is still in progress (in my understanding), and I'll bet my last dollar that this witness won't even come close to being stable enough.

  • It's the sun

    Tom Curtis at 14:29 PM on 15 July, 2016

    sailingfree @1169, a brief read of Soon, Connolly and Connolly (SC&C) shows it to be a smorgasbord of cherry picks.  They start by cherry picking the ACRIM reconstruction of satellite measurements of Total Solar Irradiance (TSI) in preference to the PMOD reconstruction, or the IRMB reconstruction.  They do this despite their own admission that comparisons of the reconstructions to ground based data "were slightly better for PMOD".  They then cherry pick one of eight reconstructions of TSI since 1850, choosing the one with the highest variability in TSI.  From what I know of the issues, neither choice is justified, but I will leave that to be argued by others.

    Moving on, they proceed to cherry-pick their own NH temperature series using just rural stations from China, the United States, Ireland and the Arctic Circle. Last time I looked, there were more locations than that in the NH. Their resulting reconstruction is significantly different from that using the GHCN (essentially the NOAA temperature reconstruction). That is odd, because Caerbannog has repeatedly shown, using randomly selected rural stations chosen to maximize territorial coverage, that just a few tens of stations essentially reproduce the standard records:

     

    More troubling than the difference is the cherry pick of a NH only temperature reconstruction.  The NH temperature record is considerably more variable than that of the SH:

    Presumably global forcings will have global effects, so that effects seen primarily in one hemisphere cannot be attributed to global forcings. The choice of a NH temperature series (strictly a three-nation-plus-Arctic series) invalidates the study without further analysis.

    Proceeding further on, SC&C test the correlation between their cherry-picked reconstruction of TSI and their cherry-picked reconstruction of NH temperatures. They then assume that CO2 forcing accounts only for the residual of the TSI-based temperature reconstruction. This pairwise comparison procedure is not a valid statistical technique for testing the correlation of multiple factors. If it were valid, it would generate the same linear dependence between CO2 and temperature regardless of whether you tested CO2 against temperature and TSI against the residuals, or the reverse. As it happens, SC&C do test both and show that they do not generate the same factor. They claim this demonstrates they should give solar first priority, whereas it actually disproves the validity of their technique.

    Finally, SC&C find a variation in temperature relative to changes of TSI of 0.2112 C/(W/m^2) (Figure 28 a). Adjusting for albedo and the fact that the Earth is spherical, that becomes 1.207 C/(W/m^2) of solar forcing. For a forcing equivalent to the doubling of CO2, that represents a TCR of 4.465 C. In contrast, for CO2 they find a change in temperature relative to change in forcing of -0.1039 C/(W/m^2) (Figure 29 b). For a doubling of CO2, that represents a TCR of -0.384 C.

    The TCR of CO2 differs from their stated estimate, which was calculated based on CO2 concentration (Figure 29 a) rather than radiative forcing. We can default to their stated TCR value of 0.44 C for CO2. That leaves unexplained why their trend lines for CO2 concentration and for CO2 radiative forcing have opposite signs. It also leaves unexplained why they repeatedly misstate the TCR as being the "climate sensitivity". The most fundamental problem, however, is this: why is the temperature response to changes in solar radiative forcing ten times greater than that due to CO2 radiative forcing in their model? That is an extraordinary result that requires extraordinary explanation. The default assumption must be that the response to radiative forcing is approximately the same across all forcings.
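    The unit conversions above can be sketched as follows. This is an illustrative check only; the albedo of 0.3 and the canonical 3.7 W/m^2 forcing for a doubling of CO2 are standard assumed values, not figures taken from SC&C.

```python
# Convert a temperature sensitivity per W/m^2 of TSI into a sensitivity per
# W/m^2 of radiative forcing, then into a TCR-equivalent for 2xCO2.
# Assumed constants (not from SC&C): albedo 0.3, F_2xCO2 = 3.7 W/m^2.
ALBEDO = 0.3
F_2XCO2 = 3.7  # W/m^2, canonical forcing for a doubling of CO2

# A 1 W/m^2 change in TSI gives (1 - albedo)/4 W/m^2 of forcing
# (spherical Earth intercepts pi*R^2 but radiates over 4*pi*R^2).
dT_per_dTSI = 0.2112                         # C per (W/m^2) of TSI (Fig 28a)
dT_per_forcing = dT_per_dTSI / ((1 - ALBEDO) / 4)
tcr_solar = dT_per_forcing * F_2XCO2

dT_per_co2_forcing = -0.1039                 # C per (W/m^2) (Fig 29b)
tcr_co2 = dT_per_co2_forcing * F_2XCO2

print(round(dT_per_forcing, 3))  # 1.207
print(round(tcr_solar, 3))       # 4.465
print(round(tcr_co2, 3))         # -0.384
```

    The tenfold mismatch between the two implied sensitivities falls straight out of the arithmetic.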

    To summarize: even if we ignore their multiple cherry picks, the use of a NH-only temperature series and of singular sequential linear regression rather than multiple regression means the paper is scientific garbage. Unsurprisingly, it produces a garbage result (a temperature response to solar forcing ten times that due to CO2 forcing).

  • Venus doesn't have a runaway greenhouse effect

    HK at 07:31 AM on 30 June, 2016

    This discussion has been mostly about Earth lately, so I thought it was time to return to our beautiful/hellish sister planet!

    The chart below illustrates the point I made in @141 and in the last paragraph of @160:
    If the extreme temperature on Venus was caused by any kind of additional energy source rather than the atmosphere slowing down the heat loss to space, the red curve based on measurements outside the atmosphere would have been much closer to the black one.
    It’s hard to find a more crystal clear fingerprint of an extreme greenhouse effect in action than this!

    Radiation from Venus' surface vs. radiation escaping to space

    Source:
    SpectralCalc.com and figure 3c here.

  • Mars is warming

    HK at 02:53 AM on 13 June, 2016

    Dutchwayne@49:
    "There is little heat being generated in the planet unlike Earth and therefore virtually all the thermal impact comes from only one driver: Solar radiance."

    That is also true for our own planet! The sun delivers about 240 watts/m2 or at least 2600 times more than the tiny 0.09 watts/m2 trickling out from the interior of Earth. The corresponding number for Mars is probably about ¼ of that. Only the gas giants have enough internal heat to make a large impact on their climate.

    "Mars is in fact a better indicator of Solar activity variation than Earth…"

    Maybe, but if we can measure the solar activity directly from Earth much more precisely, what’s the point in making a poor estimate based on very incomplete temperature data from Mars? Those measurements clearly show a reduction of the total solar irradiance (TSI) over the last 35 years or so, especially during the last solar cycle (see the last chart in @51). In addition, the recent warming on Earth has certain fingerprints that don’t match what we would expect from a warming caused by increased solar activity. Some of these fingerprints are more warming in winters and at high latitudes, and a cooling of the upper atmosphere.

    If Mars actually has undergone a warming caused by the reasons mentioned by Tom in @51, it’s just a coincidence and certainly not related to the warming on Earth, as there is no way albedo changes on one planet can have an impact on another.

  • Mars is warming

    Tom Curtis at 08:20 AM on 12 June, 2016

    Dutchwayne @49, it is difficult to take seriously a comment that says, "The biggest weakness though is on the side of the critics of Mars warming" while ignoring that the evidence put forward for Mars warming by AGW "skeptics" seems primarily to be evidence of changing seasons. See my comments at 29 and 31, along with HK's comment at 40.

    If we consider the more serious evidence of a warming Mars, it comes in two forms.  The first is the study by Fenton (2007) that showed changes in albedo on the Mars surface:

    Averaged across all areas, the surface darkened, implying a greater absorption of sunlight. From this, Fenton deduced a global warming of the Mars surface by approximately 0.6 C. Note that we do not possess independent measurements of the Global Mean Surface Temperature (GMST) of Mars of sufficient duration to independently confirm the magnitude of that increase. Rather, from a change in albedo, we deduce a change in GMST for Mars.

    This immediately suggests a problem with your claim that we can use Mars as "a cleaner way to determine the true solar radiation impact on Earth". The GMST of Mars is not affected by incoming sunlight alone, as you claim, but also by changes in albedo. Worse, while we know of the changes in albedo, we do not know the value of the change in GMST on Mars in any event, so we cannot make your indirect measure.

    There is some direct, but inconclusive evidence of warming on Mars.  Specifically there have been reductions in the amount of CO2 ice at the South Pole, shown here:

     

    (See also here and here.)

    This evidence does show that there is polar warming at the South Pole of Mars, but does not conclusively show an increase of martian GMST, as it may be matched by cooling elsewhere (particularly tropical regions, in some of which there are significant increases in albedo). However, it also shows an increase in CO2 in the martian atmosphere, and consequently an increase in the martian greenhouse effect. This introduces a second confounding factor to the martian GMST proxy of changes in insolation, measurements of which we do not have in any event.

    Given these two confounding factors (changes in martian albedo, and changes in martian atmospheric CO2) and the lack of long-term temperature measurements on Mars, perhaps (as Eclectic suggests @50) we should rely on the satellite record of Total Solar Irradiance. That record, of course, shows an overall decline in TSI since the 1980s:

     

  • A striking resemblance between testimony for Peabody Coal and for Ted Cruz

    Tom Curtis at 13:43 PM on 23 January, 2016

    It occurs to me that part of the reason people buy the "satellites are more accurate" line is that they never see graphs of the raw data.  I'm not sure where you could find a graph of the true raw data, and certainly would not be able to make one.  But Po-Chedley et al (2014) have a graph of minimally processed data:

    As I understand it, the top panel already has processing in the form of a rough alignment of the means of each satellite's series. At least, it would be stunning if such a close alignment of absolute values was achieved in raw data from satellites. For comparison, here is the absolute alignment of satellite measurements of the solar constant:

    Certainly similar problems of alignment are experienced by Earth observing instruments measuring the IR spectrum, and I see no reason why the microwave observing instruments (which operate on the same principles as the IR and solar observing instruments) should be any different.

    Further, it is possible that the top panel in Po-Chedley's figure 3 also includes adjustments for problems with the hot target.

    In any event, the transition between the top panel and the third panel is not a given.  It represents serious adjustment to the data - and different teams disagree about how that adjustment is best done.  As a result they also significantly differ about satellite data trends.

  • It's the sun

    Tom Curtis at 07:04 AM on 13 October, 2015

    Pfc Parts @1153:

    1)

    "The paleo record clearly shows an upward trend in TSI. To counter the obvious conclusion reached from these measures, the author changes his reference to satellite observations, which show a locally declining trend. This is, without doubt, a choice biased by the author's ideology and his intention to refute a rising TSI either exists or is a significant factor in rising global temperature."

    This is transparently false.  The sunspot number shows the same decline since 1979 (ie, since the commencement of satellite observations) as is to be found in the PMOD TSI index.

    2)

    "In general, use of measures for either solar output (TSI) or surface temperature taken before the broad use of the telegraph should be discarded; these measures were taken by hand using uncalibrated instruments and communicated by horse-drawn carriage and sailing ship. They are not accurate or precise to the levels claimed by the models based on them, which are defined in fractions of a Watt and degree Centigrade. It's frankly absurd to use these data. Reconstructions (Wang et al.) are even more difficult to accept; the error of estimate exceeds the observed variation in the measured value."

    The uncertainty of the mean of n independent measurements equals the uncertainty of a single measurement divided by the square root of the number of measurements. Therefore, if we have a number of observations each with an uncertainty of 1, we have the following uncertainties of the mean (number of observations followed by uncertainty):

    2 0.71
    5 0.45
    10 0.32
    20 0.22
    100 0.10
    200 0.07
    1000 0.03
    2000 0.02
    10000 0.01

    As can be seen, uncertainty decreases rapidly with multiple measurements. Ergo, the uncertainty in such things as global mean surface temperature due to even quite large instrument errors is small. Uncertainty due to coverage biases is a different matter, but that has nothing to do with the accuracy of instruments or the means of communicating results.
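    The table above is just the standard-error-of-the-mean formula evaluated for a per-measurement uncertainty of 1; a minimal sketch:

```python
import math

# Standard error of the mean of n independent measurements, each with
# uncertainty sigma: sigma / sqrt(n).
def uncertainty_of_mean(sigma, n):
    return sigma / math.sqrt(n)

for n in (2, 5, 10, 20, 100, 200, 1000, 2000, 10000):
    print(n, round(uncertainty_of_mean(1.0, n), 2))
```

    Running this reproduces the listed values (0.71 for n=2 down to 0.01 for n=10000). Note the caveat in the formula: it assumes the errors are independent; correlated biases (such as coverage biases) do not average away like this.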

    Clearly your mathematical argument does not hold water (which is itself no surprise, as the scientists doing the reconstructions are themselves competent mathematicians).

    @1154, the linear regressions in question are not thermodynamic equations.  Ergo your "point" is a simple non sequitur.  

  • It's the sun

    Pfc. Parts at 05:23 AM on 13 October, 2015

    The fundamental problem with this analysis lies in the measurements used. The author begins with a paleo record (Wang 2005), which provides an estimate of TSI based on theoretical reconstructions, and concludes his argument with direct instrument measures of TSI obtained from orbit over the period between 1978 and 2010.

    The paleo record clearly shows an upward trend in TSI. To counter the obvious conclusion reached from these measures, the author changes his reference to satellite observations, which show a locally declining trend. This is, without doubt, a choice biased by the author's ideology and his intention to refute a rising TSI either exists or is a significant factor in rising global temperature.

    In general, use of measures for either solar output (TSI) or surface temperature taken before the broad use of the telegraph should be discarded; these measures were taken by hand using uncalibrated instruments and communicated by horse-drawn carriage and sailing ship. They are not accurate or precise to the levels claimed by the models based on them, which are defined in fractions of a Watt and degree Centigrade. It's frankly absurd to use these data. Reconstructions (Wang et al.) are even more difficult to accept; the error of estimate exceeds the observed variation in the measured value.

    This is the root of the problem climatologists face when building models or presenting their results: they lack sufficient data. Climate change is a slow process that is detectable only in very small changes. To be useful, measurements must come from calibrated instruments with the accuracy and precision needed to build models capable of making predictions with error bars significantly smaller than +/- 1 degree centigrade. It is statistically impossible to use data such as those presented in this article to achieve that goal.

    Impossible. This is not an ideologically based argument; it is mathematical. The problem Climate Science faces isn't theoretical, it's based on measurement. Measurements with the necessary precision and accuracy simply are not available over the necessary time frame. There is no way to correct this problem.

  • Models are unreliable

    Tom Curtis at 13:50 PM on 5 May, 2015

    Klapper @891:

    1)

    ""...because of recent small scale volcanism (also not included in the models).."

    I don't accept that argument."

    I really don't care about your propensity for avoiding inconvenient information. Recent papers show that the volcanic effect has influenced temperature trends and TOA energy imbalance. Thus we have Santer et al (2014):

    "We show that climate model simulations without the effects of early twenty-first-century volcanic eruptions overestimate the tropospheric warming observed since 1998. In two simulations with more realistic volcanic influences following the 1991 Pinatubo eruption, differences between simulated and observed tropospheric temperature trends over the period 1998 to 2012 are up to 15% smaller, with large uncertainties in the magnitude of the effect."

    Haywood et al (2013):

    "Using an ensemble of HadGEM2-ES coupled climate model simulations we investigate the impact of overlooked modest volcanic eruptions. We deduce a global mean cooling of around −0.02 to −0.03 K over the period 2008–2012. Thus while these eruptions do cause a cooling of the Earth and may therefore contribute to the slow-down in global warming, they do not appear to be the sole or primary cause."

    And most directly of all, Solomon et al (2011):

    "Recent measurements demonstrate that the “background” stratospheric aerosol layer is persistently variable rather than constant, even in the absence of major volcanic eruptions. Several independent data sets show that stratospheric aerosols have increased in abundance since 2000. Near-global satellite aerosol data imply a negative radiative forcing due to stratospheric aerosol changes over this period of about –0.1 watt per square meter, reducing the recent global warming that would otherwise have occurred. Observations from earlier periods are limited but suggest an additional negative radiative forcing of about –0.1 watt per square meter from 1960 to 1990. Climate model projections neglecting these changes would continue to overestimate the radiative forcing and global warming in coming decades if these aerosols remain present at current values or increase."

    If you add the -0.1 W/m^2 additional aerosol load after 2000 to the approximately -0.1 W/m^2 from the discrepancy between modeled and observed solar forcing, you get a CMIP5 absolute-value energy imbalance of 0.72 W/m^2 from 2000 to 2010, ie, only 16% greater than observed (Smith et al); using drift-corrected figures, the modelled TOA energy imbalance becomes 14.5% less than the observed values. Forster and Rahmstorf used values from prior to these analyses and so cannot be expected to have incorporated them. Therefore citing Forster and Rahmstorf is not a counter-argument. It is merely an appeal to obsolete data.

    2) With regard to the SORCE data, the situation is very simple. The SORCE reconstruction is essentially an earlier reconstruction, benchmarked against PMOD, that has been rebenchmarked against the SORCE data. The effect of that is to shift the entire reconstruction down by the difference between the TSI as determined by PMOD and that as determined by SORCE. Consequently the TOA shortwave down radiation is shifted down by a quarter of that value over the entire length of the reconstruction. Because that shift occurs over the entire length of the reconstruction, the difference between twentieth-century values of the solar forcing and preindustrial values (ie, rsdt(y) minus rsdt(pi), where rsdt(y) is the downward shortwave radiation at the tropopause in a given year, and rsdt(pi) is the downward shortwave radiation at the tropopause in 1750) does not change, because both the twentieth-century values and the preindustrial values have been reduced by the difference between PMOD and SORCE. Ergo there is no appreciable change in the solar radiative forcing in the twentieth century as a result of the difference.
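    The point that a uniform calibration shift cancels out of the forcing anomaly can be sketched numerically. The rsdt values and the size of the shift below are hypothetical, used only to illustrate the cancellation:

```python
# Solar forcing is defined relative to the preindustrial (1750) value, so
# subtracting the same calibration offset from every year leaves it unchanged.
def solar_forcing(rsdt_year, rsdt_pi):
    return rsdt_year - rsdt_pi

rsdt = {"1750": 340.30, "2000": 340.55}           # hypothetical W/m^2 values
shift = -0.25                                      # PMOD-to-SORCE style rescale
rsdt_shifted = {k: v + shift for k, v in rsdt.items()}

f_before = solar_forcing(rsdt["2000"], rsdt["1750"])
f_after = solar_forcing(rsdt_shifted["2000"], rsdt_shifted["1750"])
print(abs(f_before - f_after) < 1e-9)  # True: the forcing anomaly is unaffected
```

    A direct comparison of absolute TOA fluxes, by contrast, sees the full shift, which is why the absolute imbalance comparison is the tricky one.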

    In contrast, for twenty-first century values, the models use a projection so that the difference between (model rsdt minus SORCE value) and the mean twentieth century difference is significant because it does represent an inaccurate forcing in model projections.

    The tricky bit comes about in a direct comparison of TOA energy imbalance. In determining the "observed" energy imbalance, Smith et al, following Loeb et al, adjust the satellite-observed rsdt, rsut and rlut so that the net value matches the calculated increase in OHC from 2005-2010, and so as to maximize the likelihood of the adjustments given the error margins of the three observations. Consequently, in all likelihood, they have adjusted the rsdt upward from the SORCE estimate. Therefore, when comparing observations to models, we are dealing with two adjustments to rsdt. First, we have an implicit adjustment in the models that results in the radiative forcing being preserved in the models. This implicit adjustment is equivalent to the average difference between the model rsdt and the SORCE reconstruction. Second, we have another, smaller adjustment to the SORCE value that results from the benchmarking of the empirical values. Because this adjustment is smaller than the first, it generates a persistent gap between the observed and modelled rsdt, resulting in a persistent difference in the energy balance.

    From the fact that this gap is persistent, the size of the TOA energy imbalance and that temperatures were rising from 1861-1880, it is evident that the gap (and hence the persistent bias) is less than 0.2 W/m^2.  I suspect, however, that it is at least 0.1 W/m^2 and probably closer to 0.2 than to 0.1 W/m^2.

    3) 

    ""....KNMI climate exporer (sic) are strictly speaking top of troposhere"

    What makes you think that?"

    The fact that the graph of rsdt shows a clear downward spike in 1992 (Pinatubo) and another smaller one in 1983 (El Chichon). That makes sense with increases in stratospheric aerosols, but is impossible if the data is truly from the TOA (rather than the TOA by convention, ie, the tropopause).

    4) 

    ""...CMIP5 forcings are known to be overstated by 0.2-0.4 W/m^2..."

    "...Ergo it is jumping the gun to conclude from this that the models are in error."

    Both above statements cannot be true. The models according to you are (currently at least) in error. If the models are not in error why do they need to correct the TOA imbalance numbers for model drift?"

    By "both of these statements cannot be true", you are really only indicating that you don't understand the issue. In fact, every time you said it in the post above, you were wrong.

    So, let's start from basics. Climate models are models that, given inputs in the form of forcings, produce outputs in the form of predictions (or retrodictions) of a large number of climate variables. When you have such a model, if you feed it non-historical values for the forcings, it is not an error of the model if it produces non-historical values for the climate variables. So, when we discover that forcings have been overstated for the first decade and a half of the twenty-first century, we learn absolutely nothing about the accuracy of climate models. We merely rebut some inaccurate criticisms of the models. It follows that the first sentence does not contradict, but rather provides evidence for, the second.

    With regard to model drift, had you read the relevant scientific paper (to which I linked), you would have learnt that it is impossible to determine, without exhaustive intermodel comparisons, whether drift is the result of poor model physics, too short a run-up time, or poor specification of the initial conditions. Only the first of these counts as an error in the model. Ergo, you cannot conclude from model drift that the models are flawed. All you can conclude is that, if you accept that the model drift exists, then you ought to correct for it, and that uncorrected model projections will be rendered inaccurate by the drift. Now here you show your colours, for while you steadfastly refuse to accept the drift-corrected TOA energy imbalance figures as the correct comparator, you want to count model drift as disproving the validity of the models. That is an incoherent position. Either the models drift, and we should compare drift-adjusted projections to empirical observations, or they don't drift, in which case you can't count drift as a problem with the models.

  • Models are unreliable

    Klapper at 17:13 PM on 4 May, 2015

    @Tom Curtis #889 & 890:

    "... It follows that minor discrepancies over more recent periods between model predictions..."

    I don't think they are minor, I think they help explain the recent lack of surface temperature gain in the observations compared to that projected by the models. The discrepancy from observations to models is currently 48% (0.90 to 0.62 W/m^2 TOA energy imbalance).

    "...because of recent small scale volcanism (also not included in the models).."

    I don't accept that argument. Forster and Rahmstorf 2011 did multivariate regression on the effects of TSI, ENSO and AOD, albeit against surface temperature, not TOA imbalance, but their Figure 7 shows essentially no significant effect from aerosols after the mid-nineties (at least compared to ENSO and TSI). You'd be better off including ENSO in your arguments than small volcanoes, as I doubt the latter come close to the effect of the former. I suspect that's your next argument: ENSO deflated the observed TOA imbalance in the first decade of the twenty-first century, which the models didn't include.

    "...Slight changes in a forcing consistently applied over the whole duration will not effect the anomaly and therefore are not relevant.."

    That's a rather astounding statement given it's untrue if you mean that changes in forcing won't affect the delta in the temperature anomaly.

    "...You will notice that the multi-model mean is about 0.2 C less than (ie colder than) the observed values..."

    Irrelevant. The forcing changes the warming rate, not the baseline, which is dependent on the starting temperature/starting heat content. The warming rate in the models is essentially the same as the observations for surface temperature, yet the magnitude of the solar input appears to be approximately 0.85 W/m^2 too high (if we can believe the SORCE TSI reconstruction) in the CMIP5 model inputs. This is a serious issue you chose to treat as if it's not important, but it is. Either the models are using the wrong input, or the SORCE 20th century TSI reconstruction is wrong.

    "...Further note with respect to your "models always run hot" comment on another thread, in this and many other cases, they run cold..."

    Calculate the SAT trend in all of the models and tell me what percentage run "hot" and what percentage run "cold". Not many of them run cold, and we shouldn't waste our time on semantic arguments when the ensemble mean is consistently above the observations for TOA imbalance. Look at your own table above. The model forcing is higher than the observations in all but 1 of 10 period comparisons to the observations (5 CMIP5, 5 CMIP5 "adjusted").

    "....KNMI climate exporer (sic) are strictly speaking top of troposhere"

    What makes you think that? Maybe there's an issue with translation from Dutch, but the description in the CMIP5 "standard output" document for the "rlut" variable is:

    "at the top of the atmosphere (to be compared with
    satellite measurements)"

    And if the "rsdt" variable were top of the troposphere, it should be lower than the TSI reconstruction, not higher, as some incoming SW would not make the tropopause due to absorption in the stratosphere.

    "...(Note again, such a constant offset of a forcing would not affect appreciably changes in anomaly temperature values.)..."

    Once again, we are not talking about offsetting forcings (I agree that doesn't matter); we are talking about a difference in the net between input and output at the TOA, which does affect anomaly values. It is not true that the net forcing in the models is the same as the observations.

    "...CMIP5 forcings are known to be overstated by 0.2-0.4 W/m^2..."

    "...Ergo it is jumping the gun to conclude from this that the models are in error."

    Both above statements cannot be true. The models according to you are (currently at least) in error. If the models are not in error why do they need to correct the TOA imbalance numbers for model drift?

    I think my next step will be to compare the CMIP5 model TSI input to the ACRIM TSI reconstruction.

  • Missing Arctic warming does contribute to the hiatus, but it is only one piece in the puzzle.

    Tom Curtis at 10:17 AM on 21 February, 2015

    drebich @19, your questions are nonsensical.

    Taking the first question: sea levels change regionally due to changes in wind circulation and ocean heat content. On top of that, there are tides, waves, and storm surges, all of which contribute to a very variable local sea level. Finally, different shores are rising or falling due to plate tectonics and, in some locations, due to an ongoing rebound from the melting of the massive ice sheets of the last glacial. Consequently, while taking an average of a globally distributed collection of tide gauge measurements can unequivocally show that sea levels have been rising at a rate inconsistent with the planet not warming (see graph below), no mark on a beach can plausibly be a demarcation point, unless set high enough that it will not be passed for several decades.

    Your second question makes even less sense, in that you want a demarcation point between ice age and global warming from sea ice. That is a nonsense request because, first, the cause of ice ages is the spread of ice sheets on land, rather than the sea ice itself; and second, because we have just come off 10,000 years of interglacial (colloquially, not an ice age) which was not a period of global warming, let alone anthropogenic global warming. While the history of sea ice in the NH unequivocally shows the current dearth of sea ice to be astonishing (absent global warming) and unprecedented in recent times (see graph below), there is no limit such as you illogically ask for.

    Your third question unfortunately shows that your post is an entirely rhetorical exercise. That is unfortunate because, allowing for a small middle range, it is the most easily answered. The fact is that Global Mean Surface Temperature (GMST) has not recently varied greatly with time. Over the last 10 thousand years, the temperature range has been about 1 C, yet in the last century we have seen the temperature rise from near the lowest value in that period to probably the highest:

    Again, this rate of temperature increase is unprecedented over the last 10,000 years and probably over the history of the Earth.  Further, that temperature keeps on rising, with new records for GMST having been set in 1973, 1980, 1981, 1987, 1988, 1990, 1995, 1997, 1998, 2005, 2010, and now 2014 (GISS LOTI, other indices will vary slightly), ie, on average once every four years over the last forty odd years.  (In contrast, the last cold record GMST was in 1909.)  If that average rate of new records is maintained, then we have global warming.  If temperatures fall below the 1970s average without major volcanism, a nuclear war or a massive asteroid bombardment, then global warming has stopped.  The evidence currently certainly indicates continuing global warming.
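    The record-counting described above is simple to make precise: walk the annual series and flag each year that exceeds every earlier year. The anomaly values below are made up for illustration; they are not the GISS LOTI data.

```python
# Count the years in an annual temperature series that set a new record
# maximum (hypothetical anomaly values, in C).
def record_years(years, anomalies):
    records, best = [], float("-inf")
    for year, value in zip(years, anomalies):
        if value > best:       # strictly warmer than every previous year
            records.append(year)
            best = value
    return records

years = list(range(1970, 1976))
anoms = [0.02, 0.10, 0.05, 0.18, 0.12, 0.20]   # hypothetical values
print(record_years(years, anoms))  # [1970, 1971, 1973, 1975]
```

    Applied to a real GMST series, the interesting statistic is the average spacing of the flagged years: in a stationary climate new records become ever rarer, whereas a steady once-per-four-years rate over four decades is the signature of an ongoing trend.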

  • Temp record is unreliable

    Tom Curtis at 12:55 PM on 26 January, 2015

    MEJ @332, in addition to my prior comment, ATTP shows the BEST unadjusted record for Puerto Casado, with dubious data noted, and break points shown:

    You will note that the break points coincide with known station moves.  As they are known station moves, Homewood is arguing without evidence that those station moves had no effect on the temperature, whereas comparing the station temperature record with the regional average clearly shows that there were effects (as we would expect).

    We can also see the break point adjusted data:

    Steve Mosher, one of the BEST team comments:

    "The first step we take is to find all those stations that are duplicates. And the duplicate test involves fuzzy logic. So you look at the locations and see how close they are, you look at the name and see how close they are, and then we look at the first differences in the data to see how close they are.

    There is a second pass that looks for data similarity first, because sometimes you get stations that have exactly the same data, but due to a metadata issue the metadata is way wrong. These are really rare. I can’t recall one off the top of my head.

    This particular record has multiple sources and multiple locations that differ very slightly. When the location differs, we ‘slice’ the record. In other words, the record is NOT adjusted for a station move; rather, it's computed as three different stations.

    Slicing has ZERO effect if there was in fact
    A) no real station move
    B) a station move that doesn't result in a change in temp

    That is something most people don't get. That is, if you “over slice” or split a record that should not be split, the effect is zero. In this case we have three different locations given for a station with the same name. So we treat it (mathematically) as if it is three different stations. If there was a move and the move had an effect on temperature, then splitting the record will allow us to fit the final surface treating those segments as independent. Again, if the metadata was wrong and there was no move, or a move that didn't affect things, then there is no adjustment to make."
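    The "slicing" Mosher describes can be sketched as splitting a station record at known move dates and treating each piece as an independent station. This is a toy illustration of the idea only, not BEST's actual code, and the years and temperatures below are invented:

```python
# Toy illustration of "slicing" a station record at known move dates.
# Each slice is treated as an independent station; this sketches the
# idea described above, not BEST's actual implementation.

def slice_record(dates, temps, move_dates):
    """Split a record into independent segments at each known move."""
    segments, start = [], 0
    for i, d in enumerate(dates):
        if d in move_dates and i > start:
            segments.append((dates[start:i], temps[start:i]))
            start = i
    segments.append((dates[start:], temps[start:]))
    return segments

years = [1950, 1951, 1952, 1953, 1954, 1955]
temps = [24.1, 24.3, 24.2, 23.1, 23.0, 23.2]  # apparent step at the 1953 move
segments = slice_record(years, temps, move_dates={1953})
print(len(segments))  # 2: one segment before the move, one after
```

    Slicing a record that needed no split simply yields segments that agree with each other, which is why "over-slicing" is harmless, as the quote notes.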

    Richard Betts also comments:

    "Actually one of the largest adjustments is the ‘bucket correction’, which aims to remove a systematic cold bias earlier in the SST record which arose from sea temperatures being measured in buckets hauled up to the deck, rather than at engine intakes as more recent measurements are. The bucket correction reduces the apparent warming over the 20th Century – it’s very well-known, but Mr Booker seems to have either not heard of it, or somehow forgot to mention it….!"

    (As fair notice, I disagree with Betts and Mosher on a number of topics.  In the area of temperature records, however, they are undoubted experts and I would not disagree with them on that record unless I was aware of other similar experts with whom I not only agreed, but agreed for the same reasons those experts give for their views.  Recognition of their expertise does not, however, mean I agree that they are right, or even sensible in some cases, in other topics outside of that expertise.)

  • It's not us

    MA Rodger at 21:02 PM on 21 January, 2015

    dvaytw @85.

    Regarding Trenberth's ERB from CERES, it does suffer massively from calibration issues. Its decadal value is more an inference relying on OHC data than a result in itself. IPCC AR5 Chapter 2 Section 2.3.1 says the net satellite measurements are 'calibrated' to +/-2 W/m2, which is rather a lot. There is also quite a big trend calibration issue (tenths of W/m2 per decade) for which a reference doesn't immediately spring to hand.

    What we importantly do have with ERB measurements is sight of the general wobbles, which are valuable checks on GCM results. This also allows gaps to be filled in that allow demonstration that some of the crackpot wobblology theories (Stadium Waves etc.) are nonsense and cannot be happening given what we know of ERB. That sort of answers some of your second question @82 and is a bit of background to a graphic of mine that shows ERB less smoothed than normal (here - usually two clicks to 'download your attachment'). Unsmoothed, it can be seen how any trend in ERB has yet to emerge from the wobbles.

     

    Your TSI graph comes from Lean (2000) (data here @ NCDC). More recent assessments of historical TSI (see graph from CU here) do not yield such a large rise since the seventeenth century. And also, ΔTSI has to be divided by 4 to be equivalent to climate forcing, as TSI is measured over the disc and forcing over the sphere. Perhaps one thing to remember with this TSI calibration is that TSI is a component of the ERB measurement and its 'calibration' has been revised by quite a bit recently, usually downwards, much to the annoyance of denialists.
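    The divide-by-4 geometry mentioned above can be written out explicitly. A minimal sketch (a full forcing estimate would also multiply by 1 − albedo, which is omitted here):

```python
# Geometry behind "divide TSI by 4": sunlight is intercepted over the
# Earth's cross-sectional disc (pi * r^2) but spread over its whole
# surface (4 * pi * r^2), so a TSI change maps to a global-mean flux
# change of delta_TSI / 4. (A full forcing estimate would also
# multiply by 1 - albedo.)

def tsi_change_to_forcing(delta_tsi):
    """Convert a TSI change (W/m^2 over the disc) to a global-mean flux."""
    return delta_tsi / 4.0

print(tsi_change_to_forcing(1.0))  # 0.25
```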

    Positive forcing is now above 3 W/m2 (AR5 Table A.II 1.2 gives positive forcing of +3.4 W/m2 for 1750-2011, and it is rising at about 0.04 W/m2 pa). But there are also less-well-defined negative forcings, yielding a net anthropogenic forcing of ~+2.3 W/m2 for 1750-2011 according to AR5 A.II.

  • 2014 SkS Weekly Digest #36

    Dikran Marsupial at 17:14 PM on 12 September, 2014

    Philip Shehan, there are two ways in which the scientific community deal with journal papers which contain fundamental errors.  The most usual way is for scientists to read the paper, raise their eyebrows, think "boy, the reviewers have let that author down!", and then ignore it.  The result of this is that the paper rarely gets cited and the authors' academic reputation slips somewhat.  The second way is that someone writes a comment paper (a "letter to the editor") pointing out the errors, and then the scientific community continues as in the first way.

    I suspect in this case nobody will be very interested in Prof. McKitrick's paper; I doubt it will get many citations (outside e.g. Energy and Environment), and it won't change anything.  The fact that Prof. McKitrick published it in a lowly journal that few people read and that rarely contains ideas worth using (c.f. the h-index mentioned by Tom) suggests that even Prof. McKitrick doesn't think his method has much value.  If he did, he would have sent it to a more prestigious journal.


    So asking you to write a journal paper on this is basically their way of saying "we have no answer to your criticism, but we are not going to admit that, and we are not going to change our minds".  In this case, there is little value in publishing a comment as the journal is not sufficiently prestigious for the paper to attract much scientific interest, and it is unlikely that the skeptics on the blogosphere would pay any attention to the comment anyway.

    BTW I have written several "letters to the editor", and whether they are effective depends on the nature of the audience.  I wrote a paper explaining why Prof. Essenhigh's argument about the residence time of atmospheric CO2 does not mean the rise is not anthropogenic.  However, that has not stopped the skeptics from citing Prof. Essenhigh's paper while ignoring the refutation and not citing my paper alongside it (e.g. the NIPCC report).  I wrote another explaining the flaws in a paper about estimating the body mass of dinosaurs from their long bone measurements.  While the authors of the original study were unable to accept that their method was incorrect for this particular application, the research community often cite my paper whenever the original one is cited.  This rather shows a difference between some skeptics in the climate debate and the way in which scientists generally behave.

  • Joseph E. Postma and the Greenhouse Effect

    Tom Curtis at 11:57 AM on 17 July, 2014

    JPostma @51 writes in response to my post @50:

    "sufficient to emit greater energy at source than is received by incident radiation on the device"

    That violates the first law of thermodynamics, and though I've refrained from engaging in the type of ad-hominem attack continually thrown my way, a statement like this really does expose scientific incompetency and a clutching of straws. I do apologize for having to make that remark, but alas, it couldn't be passed over in kindness, this time. Those preeminent experimentalists did not indeed interpret that their apparatus was magically producing more energy than received, ostensibly finding an exception to the 1st Law of thermodynamics. They would have laughed at that. What they found is that they could get sunlight to induce its maximum temperature on a plane, a temperature which is well above +100C. It is not an unexpected result."

    (Initial quote italicized for clarity.)

    I will concede that my thought was poorly expressed, so that it could be misunderstood.  Allow me to clarify.

    The de Saussure hot box consisted of an insulated box painted black in the interior, above which were mounted two or more glass panels to allow in sunlight but trap heat (both convective and radiant):

    It should be noted that the addition of more than one glass panel in no way improves the suppression of heat escape by convection.  One panel is sufficient to stop the mixing of the gasses from inside and outside the box, and hence to stop the convective transfer of heat from inside to outside the box.  It does, however, improve the trapping of heat both by radiant transfer and by conduction through the glass.  In doing so, it brings the hotbox model closer to a model of the atmospheric greenhouse effect, in that no energy escapes to space by either conduction or convection.

    Using de Saussure hot boxes, de Saussure measured temperatures as high as 383 K, and John Herschel measured temperatures as high as 388.5 K, in the innermost compartment of the device.  They did so in regions where (due to latitude) peak surface insolation was almost certainly less than 1100 W/m^2, and probably less than 1000 W/m^2.  However, a black body at 383 K radiates 1220 W/m^2, while at 388.5 K it radiates 1291.7 W/m^2.  That is, at the "source" (ie, the interior of the innermost compartment of the device), the energy radiated by black body radiation was at least 100 W/m^2 (9%) greater than the "incident radiation on the device" (ie, the solar energy falling on the uppermost panel of glass).
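    The quoted fluxes follow directly from the Stefan-Boltzmann law, j = σT^4. A quick check:

```python
# Check of the quoted black-body fluxes via the Stefan-Boltzmann law,
# j = sigma * T^4.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(T):
    """Black-body radiant flux (W/m^2) at temperature T (kelvin)."""
    return SIGMA * T ** 4

print(round(blackbody_flux(383.0)))   # 1220 W/m^2 (de Saussure)
print(round(blackbody_flux(388.5)))   # 1292 W/m^2 (Herschel)
```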

    Postma does not like the result, so he disputes the written record (and in de Saussure's case, publicly demonstrated in London and Paris) of two of the foremost experimental scientists of their day as "anecdotal".  Consequently I refer to a recent test of a single panel hot box (solar cooker) in Jordan:

    The "black coated, fixed" cooker is the one most analogous to a de Saussure hot box.  As can be seen, the average peak temperature over three days' measurements achieved in the water in the black kettle within the box was 341.4 K.  The equivalent black body radiation was 770.2 W/m^2.  In contrast, the peak insolation recorded on any day during the experiment was 717.4 W/m^2, and the insolation at the 14th hour (ie, the time of peak temperature) was just 364.5 W/m^2, less than half of the black body energy of the water.  Without water to act as thermal ballast, the interior temperature of the hot box would have peaked earlier in time, and been greater than that recorded.

    The obvious conclusion is that the interior temperatures in de Saussure hot boxes can easily be high enough that the interior black body emission from the innermost compartment exceeds in energy that of the incident sunlight.  Postma says that this is impossible.  Indeed, it is essential to his claims that this is impossible.  It cannot be explained by the prevention of convection, and nor (given the high thermal conductivity of glass) can it be explained by insulation against conduction through the glass.  That means any explanation of the increased temperature must include a greenhouse effect.

    To illustrate this point, consider two hot box designs:

    The first hot box is sealed by a panel that is transparent to both visible and IR light, but impermeable to air.  Because it is impermeable to air, it prevents any mixing of external with internal air, and hence any escape of heat by convection.  Because it is transparent to IR light, it neither absorbs nor radiates IR light.  Therefore any IR radiation leaving the box must come from the floor of the box, as illustrated in (1) above.

    The second hot box is sealed by a panel that is transparent to visible light, but absorbs IR light perfectly (emissivity = 1 for IR).  Because it absorbs all IR radiation that falls on it, any IR radiation from the floor of the box is absorbed by it.  Because absorptivity equals emissivity, that energy is then reradiated, with half of it going up, and half of it going down, back into the box, as illustrated in (2) above.

    Now enter the laws of thermodynamics.  In particular, in this context the first law states that for any horizontal line drawn through a "box" above (horizontal plane for actual 3 D boxes), the energy going up equals the energy going down.  That is, 1U = 1D, 2u = 2D, 2u = 2d, and 2U = 2D + 2d = 2 x 2D.  (Note: 1U is energy flux U for box 1, etc.  It is not 1 x U.)  

    Further, the second law of thermodynamics states that for each such horizontal line, the entropy of the energy going up will not be less than the entropy of the energy going down.  Entropy, however, is the energy divided by the temperature.  The temperature of the light for black body radiation is just the temperature of the black body that emitted it.  Where it combines the light from two distinct black bodies, the entropy will be the energy weighted average of the entropies of the two black bodies.

    So, let's assume that 1D = 2D equals 1100 W/m^2.  Let us also assume the boxes are cubes with dimensions of 1 meter per side.  Then the temperature of the base of box 1 equals 373.2 K (~100 C), and the entropy of 1U = 1100/373.2 = 2.95 J/K.  In contrast, the black body emitting 1D was the Sun, with a surface temperature of approximately 5,750 K.  Consequently the entropy of 1D is 0.2 J/K, and as required the entropy of all downward energy at a given distance above the bottom of the box is less than the entropy of all upward energy at the same distance.  Indeed, the temperature of the bottom of the box would have to reach 5,750 K for that not to be the case - something it cannot do because of the first law of thermodynamics.

    In the second box, the temperature of the panel is also 373.2 K, and hence the entropy of 2u is 2.95 J/K.  The temperature of the base of box 2, however, rises to 443.8 K (~170 C).  The upward power from that base (2U) equals 2200 W/m^2.  The entropy of that energy is, therefore, 4.96 J/K.  That is comfortably greater than that of both 2D and 2d(=2u), and certainly greater than their combined entropy of 1.57 J/K.  Therefore the 2nd law of thermodynamics cannot forbid a situation such as illustrated in box 2, and the first law requires that the temperature of the floor of the box be 1.19 times greater than the temperature of the panel.
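    The box-2 arithmetic can be verified directly. A sketch under the ideal assumptions above (perfectly absorbing panel, no conduction, entropy taken as energy divided by temperature, as in the text):

```python
# Verify the box-2 arithmetic under the ideal assumptions above: the
# panel returns half the floor's emission, so the floor must emit twice
# the incident flux, giving T_floor = 2**0.25 * T_panel.
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
D = 1100.0               # incident solar flux, W/m^2

T_panel = (D / SIGMA) ** 0.25        # ~373.2 K
T_floor = (2 * D / SIGMA) ** 0.25    # ~443.8 K

# entropy fluxes as used in the text: energy divided by temperature
s_up = 2 * D / T_floor               # ~4.96 J/K (floor emission, 2U)
s_sun = D / 5750.0                   # ~0.19 J/K (incoming sunlight, 2D)
s_panel = D / T_panel                # ~2.95 J/K (back-radiation, 2d)
s_down = (s_sun + s_panel) / 2       # ~1.57 J/K (energy-weighted average)

print(round(T_floor, 1), round(s_up, 2), round(s_down, 2))  # 443.8 4.96 1.57
```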

    (Postma, and others of similar belief, appear to confuse themselves by using imprecise statements of the 2nd law, to the effect that no body can gain heat from a cooler body.  Heat, however, is net energy transfer.  In box 2, the floor of the box (443.8 K) gains heat from the Sun (5,750 K).  It then transfers heat to the panel (373.2 K).  There is energy flow from the panel to the floor, but a greater energy flow from the floor to the panel, so the net energy flow (heat flow) is from the floor to the panel.  This means that the floor is heated by the Sun, not the panel; but the floor is heated more by the Sun than it would be without the panel.  There are no entropy considerations preventing this unless the floor approaches temperatures near to that of the Sun's surface.)

    The important thing to note, however, is that mere prevention of convection cannot heat the floor beyond the point where its black body radiation equals the incident radiation in power.  Neither can prevention of conduction where radiant heat can escape; in the examples above, conduction is taken to be zero in both boxes.  Adding conduction can cool the floor, but it cannot increase its temperature in either case.  Therefore floor temperatures greater than the black body temperature for the power of incident solar radiation are proof that a greenhouse effect is in operation.  And just such temperatures have been observed historically by de Saussure and John Herschel, and more recently in testing of solar ovens.

  • The Other Bias

    Paul Pukite at 04:00 AM on 17 November, 2013

    Kevin, the bias that you have accurately isolated is the one during WWII.  I agree that a measurement error of about +0.1C occurs between 1940 and 1945.  It is not clear whether time series such as GISTEMP actually correct for this.  I have been doing my own time-series "reanalysis" via what I refer to as the CSALT model.  This recreates the temperature record via non-temperature measurements such as CO2, SOI, Aerosols, LOD, and TSI (thus the acronym).

    What I find is that there is a significant warming spike during the WWII years, which I correct below. The amount of correction is 0.1C, and when I apply it the model residuals trend closer to white noise over the entire record.

    CSALT model residual

     

     

  • Why trust climate models? It’s a matter of simple science

    grindupBaker at 19:00 PM on 28 October, 2013

    Ironbark #5, you say "All that leaves is the models" then later "whether CO2 emissions are to blame", but they are largely unrelated. The simulation "models'" purpose is to get a reasonable picture of future climate, not to prove that CO2 is "to blame". Whether CO2 emissions are to blame (are the primary warming agent) is determined by many means, all telling a very similar picture: the physics of CO2 (adults do it & schoolkids do it on videos), and hundreds of thousands of thermometers in the oceans over decades taking (must be millions of) measurements showing it's warming. The IPCC report even shows how much warming or cooling is attributed to each major factor each year, not just CO2. None of this science has anything to do with the simulation "models" that project the future climate; the only slight interrelation is that the ocean part of the models is also used to estimate the water temperatures between the floats with thermometers in them, because that's far more accurate than just taking a straight line between them.

    You say "ordinary people who don't have the time or expertise", "don't have the time or skills", so you say you have other life priorities and you imply strongly that you lack the mental equipment or life-conditioning or both that's required for logical analytical thought, so you base your opinions on the popular news. We all see that many humans are similar to that so they are easily led by the well honed techniques of those with power and intelligence well versed in advertising and the masses-manipulation techniques (that is, not naive obsessive scientist brainiac types). You must not be massively under the gun for time though because you've posted a few words so far.

    The tree ring data goes back 2,000 years. It's not worth me looking back at the data for details (you can't even be bothered for a 2-minute look to assess it a bit and comment, you are so busy) but humans have had pretty good thermometers the last few decades, and the tree ring data showed cooling while the thermometers showed reasonable warming. If you had a good thermometer outside your home and recorded the temperature each day, and chopped down your cherry tree after 10 years and found its rings showed it freezing outside when your thermometer log showed it warm, would you throw out the thermometer readings you took and assume the tree rings were right? It's ridiculous. The trees were poorly chosen at the treeline where they were bashed by the elements, instead of trees deep in the forest. Prof. Muller says the whole 2,000-year tree record should have been thrown out then, rather than just throwing out the last bit, and he seems to have a point, but that would not change the last decade of big warming record at all (as Prof. Muller has also said) and there are other ways that were used to measure temperature back 650,000,000 years, not just 2,000 years, and they all show the same story as near as matters - it's getting warm very suddenly lately. The clever professionals in the human-based soft sciences lead the masses around by nose rings with any sparkly bit of trivium because the masses don't have the necessary mental equipment, have much bigger priorities (we are all selfish by nature) and prefer simple entertainments.

  • IPCC model global warming projections have done much better than you think

    Leto at 23:33 PM on 4 October, 2013

    Sereniac,

    The problem I have with that analogy is that the fitness training is not just a distractor that hides the true fitness signal; it leads to genuine improvements in fitness. The ENSO fluctuations do not lead to analogous true improvements in the global heat balance.

    What if your Mr Jones is becoming morbidly obese, and this can be accurately projected using a metabolic model, but his measured weight fluctuates in the short term by as much as 2 kg (eating 2 kg of donuts, as he does at random times, inflates his apparent weight by 2 kg)?

    He visits the doctor immediately after a donut splurge, and posts a record weight (c.f. 1998 el nino). He then continues to eat excessively, but 15 days later, his next weight assessment happens to be just prior to his daily donut splurge. He has actually gained 1.5kg in weight, and is now posting a record empty weight (c.f. recent record la nina), but his measured weight is 0.5kg lower than the last measurement. He boasts that his weight trend is going down, and declares the doctor's metabolic model to be bogus. On the contrary, he is fatter than ever, and his next post-splurge weight is expected to break all records (c.f. next significant el nino).

    A plot of his post-splurge measurements shows no overall change in the post-splurge trend, as does a plot of his pre-splurge weights (c.f the separate el nino and la nina trends in the Neilson-Gammon plot), but the short term fluctuations in apparent weight mean that he often has pseudo-pauses in his relentless weight gain. The existence of such pseudo-pauses is entirely expected in the metabolic model, though the timing of the pauses is outside the scope of the model, and an ensemble of model runs will average out the pauses so that they are not apparent. Nonetheless, he uses the pauses as an excuse to continue his unhealthy lifestyle.

     

  • Lu Blames Global Warming on CFCs (Curve Fitting Correlations)

    Dave123 at 12:06 PM on 6 June, 2013

    I'm also not sure what "UV effects on the stratosphere" is supposed to mean.  UV is included in the TSI satellite measurements, so it's not like it's missing from the TOA energy budget.  Is this some kind of 'wings of the butterfly' effect that is supposed to do something else?  Or is this more unicorn chasing... the endless parade of "what ifs" that don't actually formulate a hypothesis.

  • It warmed before 1940 when CO2 was low

    Bob Loblaw at 07:09 AM on 23 March, 2013

    Klapper @25: "They are the most recent I could find so I'll assume they are the best."

    Why would you assume that? If they do not reflect the range of reasonable values that are supportable in the literature (remember, we have no direct satellite-based measurements of TSI from the 1910-1940 period to check the reconstructions against), then you are assuming that TSI for the period is known with greater accuracy than it is. "The best" in science isn't necessarily decided by "the most recent".

    You just may be on your way to assuming your conclusions again.

    Have you read and understood my example in #18? Do you agree with it, or do you find fault in the reasoning? IMO, you keep sending yourself off on the Bad Idea path...

  • It warmed before 1940 when CO2 was low

    Bob Loblaw at 18:26 PM on 22 March, 2013

    A second comment, because the issue is sufficiently different from the ones that I addressed in the comment above. In #12, Klapper says

    "My real point is that I think you have a problem with your models in this period, which is a good one to evaluate as I've discussed above and it's not enough to just say it's within the margin of error and call it a day."

    I think this is following on the discussion on the other thread, where it was stated that lack of accuracy in model inputs places limits on how well we can expect models to match past temperature records. This has led to the current discussion of reconstructing past TSI and aerosol forcings.

    Klapper seems to feel that the models need some fixing to better match past climates, and seems unwilling to accept that there may be not much that can be done about it, due to inaccuracies. I will try to illustrate why limited accuracy of forcings limits our ability to "fix" a model, by using a very simple (non-climate) model.

    Let's assume that we have a model that says A + B = T. We have measurements of T that tell us it was 15.2 +/- 0.1 at some past time. Unfortunately, we do not have good estimates of A or B at that time - we only have proxies, and our best guesses are that the past values were A = 10.4 +/- 0.6, and B = 5.6 +/- 1.0. Thus, our model says that T = 16.0 +/- 1.2 (assuming independent errors for A and B).

    Is the model right or wrong? Well, the direct measurement of T is 15.2 +/- 0.1, and our model says it should be 16.0 +/- 1.2. The observation falls within the errors for the model, so the observation does not disprove the model. The model could be right.

    More importantly, are we justified in modifying the model? We could "improve" the model, by making it 0.923*A + B = T, and our model would then predict T = 15.4, which matches the observation. Is this justified? We could just as easily make the model A + 0.857*B = T, and get an exact match. Or, we could play with any combination of fudge factors for both A and B to get a match - there are an infinite number of them that would work.

    The problem is that there is no way of determining if any of these arbitrary "fixes" is correct, because we have no further observations of any kind that can differentiate amongst the possibilities. Indeed, within the error bounds of our data, the fudges that give an exact match between predicted and observed T are no better than our original model A + B = T. They all fall within the error bounds. Those error bounds already tell us that the original model may be correct.
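    The toy model's arithmetic can be reproduced directly, with independent errors combined in quadrature:

```python
# The toy model above: T = A + B, with independent uncertainties
# added in quadrature: err_T = sqrt(err_A**2 + err_B**2).
import math

A, err_A = 10.4, 0.6
B, err_B = 5.6, 1.0
T_model = A + B
err_T = math.sqrt(err_A**2 + err_B**2)   # ~1.17, quoted as +/- 1.2

T_obs, err_obs = 15.2, 0.1
# The observation lies within the combined error bounds, so the
# original model is consistent with the data; no fudge is justified.
consistent = abs(T_model - T_obs) <= err_T + err_obs
print(round(T_model, 1), round(err_T, 1), consistent)  # 16.0 1.2 True
```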

    When a model's output (value plus error bounds)  already falls within the error bounds of the observations, it is a Bad Idea to try to tune a model through purely arbitrary adjustment of parameters. Such adjustments, even if they improve the match between model output and observations, do not mean that we have improved the model. It is a Good Idea to try to improve upon the knowledge of the various input variables/parameters, but you do not accomplish this by just trying different numbers in the model - you need an independent source of information. Blindly fudging parameters is just fitting to the noise.

    The Blogosphere is full of fake skeptics who think they have a good model just because they can fit an arbitrary series of equations (usually "cycles") with arbitrarily tuned parameters, all while ignoring the known physics. It will be gussied up in terms of statistics or Fourier Analysis, or some fancy words, but it is not good science. A great place to see these things taken down is Tamino's, where it is often called Mathturbation.

    In Klapper's case, it appears that he is looking at the experts' models, for which observations do fall within model output (value plus uncertainties), and replacing them with worse models, which show a poorer match with the observations, and convincing himself that the experts are wrong.

  • It warmed before 1940 when CO2 was low

    Bob Loblaw at 17:32 PM on 22 March, 2013

    First, a quick clarification for anyone that reads this portion of the thread at some point in the distant future. Klapper's comment #8 is the result of a thread happening here that was getting off topic, and is moving into this thread where it is more on topic.

    Now, to address a few of Klapper's points:

    From what I understand, you have regressed TSI in the recent period against sunspot numbers, then used that to estimate TSI in the period 1910-1945, and then used that TSI trend to compare to the temperature trend. In essence, what this is doing is just attempting to explain the temperature trend by correlating it with sunspot numbers. This is not a very sophisticated model, and the idea that your model is better than the models in the literature is - shall we say - somewhat dubious. All I can suggest is that you take scaddenp's comment to heart, and think about how crude your model is (and it is a model).

    But to expand on certain points, let's first think about your use of sunspots. Sunspots are dark areas on the surface of the sun, and thus are spots that emit less radiation, not more. This has been directly observed using space-borne measurements of TSI during times of transit of sunspots across the visible solar disk. Yet increased sunspot numbers are associated with increased TSI on average, so the increase can't be just because of the sunspots. In fact, sunspots are just an indicator of something else that is going on with the sun, just as tree rings are an indicator that something is going on with the local climate. Sunspots are a proxy for solar activity, not a measurement of TSI, and they are not the best that science has to offer. Thus, you are basing your conclusions on a poor proxy for TSI, and this is leading you astray. Better estimates of past TSI use more sophisticated models, and are more likely to provide more useful results.
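    The kind of crude proxy model being criticized can be sketched as a straight-line fit of TSI against sunspot number, extrapolated backwards. The sunspot/TSI pairs below are invented purely for illustration, not real observations:

```python
# Sketch of the crude proxy model being criticized: fit TSI against
# sunspot number over the satellite era, then "reconstruct" TSI for
# years with only sunspot counts. Data pairs are invented for
# illustration only.
ssn = [10, 50, 90, 130, 170]                    # sunspot numbers
tsi = [1360.6, 1360.8, 1361.0, 1361.2, 1361.4]  # TSI, W/m^2 (made up)

# ordinary least-squares fit: tsi ~ intercept + slope * ssn
mx, my = sum(ssn) / len(ssn), sum(tsi) / len(tsi)
slope = sum((x - mx) * (y - my) for x, y in zip(ssn, tsi)) / \
        sum((x - mx) ** 2 for x in ssn)
intercept = my - slope * mx

# Extrapolating to a pre-satellite sunspot count: the "reconstruction"
# inherits every weakness of sunspots as a proxy, which is the point.
print(round(intercept + slope * 60, 2))  # 1360.85
```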

    Your understanding of aerosols and their effects is also very simplistic. In #14, you say "aerosols can't cool", and this is wrong. The effect of aerosols differs greatly depending on whether they are mostly absorbing (e.g., soot) or mostly scattering (e.g., light-coloured dust). This is usually quantified in terms of the "single scattering albedo". Although aerosols generally cause surface cooling, this is not always the case. A highly scattering aerosol over a highly reflective surface can have a warming effect, because it also affects the solar radiation reflected off the surface. A highly absorbing aerosol reduces the solar radiation reaching the ground, but causes warming at the altitude where the aerosol is located. To thoroughly account for aerosols, you have to have knowledge of their optical properties and size distributions, as well as their geographical location and altitude. The trends in these properties over time will affect the temperature trends. Thus, you are basing your conclusions on a poor understanding of aerosols, and this is leading you astray.

    Looking solely at direct relationships between forcing factors (TSI, aerosols, etc.) and temperature ignores any time lags in the climate system. We know that forcings don't instantaneously result in temperature changes - it takes time for the climate to equilibrate. Thus, your analysis uses a simplistic relationship between forcings and temperature, and this is leading you astray.

    You have assumed that your simplistic reconstruction of TSI is better than the experts. You have assumed that aerosol effects over the period are zero. You have assumed that a simplistic model of T = f(CO2, sunspots) is valid. You then conclude that the experts have something wrong, even though the experts' models are much more sophisticated. Thus, you have assumed your conclusion. This is leading you astray.

     

  • New Study, Same Result - Greenhouse Gases Dominate Global Warming

    Bob Loblaw at 08:02 AM on 20 March, 2013

    Klapper @ 29:

    I would agree that errors in our knowledge of CO2 forcing in the period 1910-1945 are likely small, simply as a result of there being no reason to think that it varied significantly.

    Would you care to explain to me why you are confident about the estimates of atmospheric aerosol levels that are available for that period? Perhaps you'll wish to compare the methods used during that period with the kinds of estimates we can obtain today with networks such as AERONET, or satellite data?

    As well, perhaps you are willing to explain how accurate the measurements of solar output (you've used the acronym TSI, which is Total Solar Irradiance) for that period are? Please feel free to compare that accuracy to those recently available from satellite data used in this PMOD analysis. Feel free to be as technical as you wish - I have worked with people from PMOD, and I am quite familiar with the types of instruments used to measure TSI on these satellites.

    Note that I consider phrases such as "the correlation between TSI and SSN are pretty good so I can't see a large error in that parameter" to be nothing more than handwaving (regardless of what you are referring to as SSN, which is an acronym that escapes me at the moment). A correlation with something that is not the item of measurement is not a fundamental estimate of the accuracy to which that element (TSI) is measured.

    I suspect that you are confusing the accuracy of an input to a model with the accuracy of the model, but I can't be sure where you are going wrong until you give a more elaborate explanation of your thought process.

     

  • It's the sun

    kcron24 at 13:06 PM on 2 March, 2013

    The Sun is the dominant energy source for life on planet Earth.  According to the Intergovernmental Panel on Climate Change (IPCC), the Sun’s radiation provides around 10,000 times more energy to Earth’s climate system than any other energy source (IPCC 2007).  Because of the magnitude of the Sun’s energy, even small fluctuations matter: solar radiation is not constant, and sunspots and other solar phenomena cause changes in incoming solar radiation.  Collecting such data is imperative for climate research and radiative budget modeling.


                Measurements of total solar irradiance began with the ERB satellite in 1979 and continued with the ACRIM series of measurements.  Ten years ago NASA launched the Solar Radiation and Climate Experiment (SORCE) into orbit.  This satellite provides measurements of solar radiation from 1 nm to 2000 nm, encompassing about 95% of the Sun's spectral output.  Instruments on SORCE such as the Total Irradiance Monitor (TIM) allow the measurement of spectrally integrated solar radiation incident at the top of the Earth's atmosphere.


    Background


    Keeping human consumption in equilibrium with Earth’s climate system to prevent extreme positive feedbacks remains one of today’s paramount challenges.  Anthropogenic gases such as CO2 contribute to climate change through the greenhouse effect.  Global mean temperature has increased by about 0.8 K since preindustrial times.  However, the current climate sensitivity in climate models indicates that the forcing by greenhouse gases would have contributed a rise of 2.1 K (range 1.5–3.2 K) (Schwartz et al. 2010).  In the IPCC Third Assessment Report (TAR), the IPCC recognizes that there is more work to be done to understand solar variation and its effect on climate change (Forster et al. 2007).  However, the IPCC notes that greenhouse gases play a bigger role in recent warming than solar variation.


    Gaining more knowledge of climate sensitivities will provide added momentum for restructuring the current dependence on fossil fuels.  Some estimate the value of this knowledge to be in the tens of trillions of dollars (Edmonds and Smith 2006).  According to Hansen et al., uncertainties in the current knowledge of aerosol radiative forcing and solar irradiance limit the ability of current climate models to make predictions with high confidence (Hansen et al. 2007).  Suspect data collected by SORCE and other total solar irradiance monitors show that uncorrected instrumental drifts could contribute to errors in TSI trends (Kopp and Lean 2011).  The small data set of 33 years and the imprecise measurements further contribute to unreliable trends and analysis (Kopp and Lean 2011).


    In order to alleviate this problem, more data need to be collected with better equipment.  Unfortunately, an improved version of SORCE with an upgraded TIM, called Glory, never had an opportunity to make a measurement: it crashed shortly after launch, costing NASA $435 million.  SORCE continues to be the main TSI collector.  The future TIM measurement goals of 0.01% uncertainties with stabilities

  • Mars is warming

    scaddenp at 10:14 AM on 22 February, 2013

    Whether travel along the galactic path had anything to do with past climate change is harder to decide, but we can see clearly that it has very little to do with post-1970 climate change. Why? Because we have good measurements of the climate determinants. The only "space" factor affecting climate is TSI, whether from change of angle, change of solar luminosity or "space dust". This has been accurately measured by satellites since the 1970s and is stable if not decreasing (see "it's the sun" argument), and yet the earth warms. Why look to weird, unknowable, out-there sources of warming when there is a perfectly reasonable, physically plausible solution coming right out of the smoke stack? Or to put it another way, we have measurably increased the energy flux onto the surface of the earth with increased GHGs. What mechanism do you propose by which this would not cause warming?

  • Humidity is falling

    Jeff313 at 23:19 PM on 15 February, 2013

    (-multiple off-topic and inflammatory snipped-)

  • 2012 Shatters the US Temperature Record. Fox, Watts, and Spencer Respond by Denying Reality

    Tom Curtis at 15:42 PM on 19 January, 2013

    Backslider @37:

    1) You may not be able to get your head out of Sydney, but the rest of Australia can. We (those Australians outside of Sydney) are talking about a heat wave not because of two very hot days in Sydney, but because of 46 individual maximum temperature records in Australia so far in 2013; and because of repeated Australian mean maximum temperatures at or near record levels.

    2) The early explorers were not scientists, but ex-Navy or Army officers like Charles Sturt and Burke, with the occasional surveyor like Wills. (And for non-Australians, please read up on the Burke and Wills expedition to see just how absurd is the suggestion that because records were taken by explorers, they were taken with unusual competence.) Charles Sturt was, of course, very competent, but the suggestion that measurements made on thermometers packed and carried every day, then set up in an ad hoc fashion in tents or in the partial shade provided by eucalypt trees, and at an unknown distance from the ground, should be more accurate than those provided by Stevenson Screened instruments is absurd.

    3) I do not have to make a comparison between Stevenson Screened instruments and others. It has already been done:

    "In view of the implications for the assessment of climatic changes since the mid-nineteenth century, systematic changes of exposure of thermometers at land stations are reviewed. Particular emphasis is laid on changes of exposure during the late nineteenth and early twentieth century when shelters often differed considerably from the Stevenson screens, and variants thereof, which have been prevalent during the past few decades. It is concluded that little overall bias in land surface air temperature has accumulated since the late nineteenth century: however, the earliest extratropical data may have been biased typically 0.2°C warm in summer and by day, and similarly cold in winter and by night, relative to modern observations. Furthermore, there is likely to have been a warm bias in the tropics in the early twentieth century: this bias, implied by comparisons between Stevenson screens and the tropical sheds then in use, is confirmed by comparisons between coastal land surface air temperatures and nearby marine surface temperatures, and was probably of the order of 0.2°C."


    See also here for comparisons between modern methods.

    I also know, as you obviously do not, that temperatures taken in the shade in poorly ventilated locations, or unusually close to the ground, can exceed temperatures recorded in Stevenson Screens by up to 10 or 15 degrees. That is why it is now law in Australia that you are not permitted to leave unattended children in parked cars. Poor house design in outback and subtropical Australia can easily result in internal temperatures several degrees above those found in neighbouring Stevenson Screened instruments, as indeed can internal temperatures in tents, particularly tents lacking a fly.

    @38, living in Mount Isa, I know from personal experience that the numbers of birds described as dying are way in excess of the carrying capacity of the land they are described as dying in. I also know that hyperbole is a favoured technique in Australian story telling. You do the maths. So many birds they snapped the branches of the trees, according to one account. Which makes a good yarn, and a better one when some (--snipped--) can't recognize when a yarn is being spun.
  • We're heading into an ice age

    Robertgj at 02:23 AM on 28 November, 2012

    Michael,
    (-snip-).
  • Climate time lag

    Bob Loblaw at 00:47 AM on 5 October, 2012

    Falkenherz:

    Sphaerica has given you a couple of good diagrams. You can also see the effects of the 1:4 ratio in the equations in the text of the OP. Look at the one that has:

    S(1 - A) = 4εσT^4

    ...and divide both sides by 4. The 4 disappears on the right, and we have S/4 on the left.
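As a sanity check on that algebra, here is a minimal numeric sketch (the values of S, A and σ below are standard round figures assumed for illustration, not taken from the discussion above, and emissivity is taken as 1):

```python
# Effective radiating temperature from S(1 - A) = 4*sigma*T^4
# (emissivity = 1; S and A are standard round numbers, assumed here)
S = 1361.0       # solar constant, W/m^2
A = 0.3          # planetary albedo
sigma = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

# Divide the absorbed flux by 4 (sphere vs. disc), then invert T^4
T = (S * (1 - A) / (4 * sigma)) ** 0.25
print(round(T, 1))  # roughly 255 K
```

The factor of 4 is exactly the disc-to-sphere ratio discussed above: intercepted sunlight falls on a disc of area πr², while the planet radiates from a sphere of area 4πr².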


    ...now, you were asking about the "fails to appear" comment of mine, and said "So what you mean to say is, temp rises much higher/faster since roughly 1990 than ever before since 1850, and this cannot match to a TSI "lag pattern" from 1960 onwards?"

    That is basically it, but we don't even need to look back to the 1800s - just look at T through the mid-1900s to now. Look at Riccardo's graph, and let's assume that his 0 time value represents 1960, when TSI stopped rising. What we would expect is to see temperature rising like it does in Riccardo's graph - most rapidly in the first few years, followed by a tapering off and eventual equilibrium. What you would not expect is to see temperatures steady for a couple of decades, then a sharp rise like Riccardo's graph - but delayed to start around 1980 or so. From Riccardo's graph, we see that the flux imbalance starts at time 0, and gradually drops as T rises (due to extra IR loss to space as T rises). If the atmosphere did not warm for 20 years, the flux imbalance (delta-F) would remain at 1.0 for 20 years, and that energy has to go somewhere. Since it is not appearing in the atmosphere, where would it be hiding? (You won't get an answer from "skeptics".) That's where you would have to provide a plausible physical explanation for the "missing" energy and why it doesn't start to affect the atmosphere until many years later. That's the difficulty in the argument that the 1980s-onwards warming is delayed from earlier TSI increases. It ignores physics.
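The lag behaviour described above can be sketched with a toy one-box energy-balance model; every parameter value below is an illustrative assumption, not fitted to real data:

```python
# Toy one-box model: C * dT/dt = F - lambda_ * T
# T relaxes toward F/lambda_, warming fastest right after the
# forcing step and tapering off - never "waiting" for decades.
C = 8.0        # effective heat capacity, W yr m^-2 K^-1 (assumed)
lambda_ = 1.2  # climate feedback parameter, W m^-2 K^-1 (assumed)
F = 1.0        # step forcing applied at t = 0, W m^-2 (assumed)

dt = 0.1       # time step in years
T = 0.0
temps = [T]
for _ in range(600):  # integrate 60 years forward with Euler steps
    T += dt * (F - lambda_ * T) / C
    temps.append(T)

# Per-step warming increments shrink monotonically: the response is
# front-loaded, not delayed.
increments = [b - a for a, b in zip(temps, temps[1:])]
print(increments[0] > increments[100] > increments[500])
```

The e-folding time here is C/λ (about 7 years with these assumed numbers), which is why a response that only begins 20 years after the forcing change has no place to hide the intervening energy.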

    As for pre-1850: we don't show measured temperatures for earlier times because the direct measurements don't exist. Think about when the thermometer and the concept of temperature were invented, and then think about how long it might take to get decent global coverage, and you'll see the problem. Before 1850, proxies are required - things that we can measure now, like tree rings, sediments, ice cores, etc., that have a record from the past that responds to temperature (but isn't a direct measurement of temperature). That's how we fill in the gaps from the past, and that's the sort of data that goes into Mann's work (and other similar studies) and shows the Hockey Stick.
  • Do we know when the Arctic will be sea ice-free?

    KR at 02:52 AM on 9 September, 2012

    Dirt Girl - I for one am quite curious about the Soon and Briggs TSI reconstruction - it isn't supported by any of the information I am aware of. They state the temperature data is from BEST, but do not source their TSI data.

    If you look at either the sunspot numbers or direct TSI measurements (averaged over the 11-year solar cycle for clarity) versus temperatures (as in this plot), you see that insolation is dropping while temperatures are rising over the last 40 years or so. S&B's claims are contradicted by the evidence.
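The cycle-averaging mentioned above can be sketched on synthetic data (the series below is invented for illustration only, not real TSI):

```python
import math

# Synthetic "TSI": flat baseline + 11-year sinusoidal solar cycle
# + a slight downward drift (all values assumed, purely illustrative)
years = list(range(1970, 2013))
tsi = [1361.0 + 0.5 * math.sin(2 * math.pi * (y - 1970) / 11)
       - 0.005 * (y - 1970) for y in years]

def running_mean(series, window=11):
    """Centered moving average over `window` points."""
    half = window // 2
    return [sum(series[i - half:i + half + 1]) / window
            for i in range(half, len(series) - half)]

smoothed = running_mean(tsi)
# Averaging over one full solar cycle cancels the oscillation,
# leaving only the small underlying (here, downward) trend.
print(smoothed[0] > smoothed[-1])
```

This is the point of plotting TSI "averaged over the 11-year solar cycle": the large but self-cancelling cycle is removed so the long-term trend is visible.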

    [ And yes, the Washington Times is a hyperconservative low-circulation paper, owned by the late Sun Myung Moon, not known for any balance in their views. I would not consider it a reliable source. S&B are welcome to try to publish their column in a peer-reviewed journal - it might give the editors a good laugh while getting rejected. ]
  • It's the sun

    scaddenp at 07:27 AM on 9 July, 2012

    And as a final point - if we were getting warmer because we were getting closer to the sun, then we should see that in the TSI measurements. As shown in the main article, this is not the case.
  • It's Urban Heat Island effect

    michael sweet at 22:26 PM on 1 July, 2012

    Koyaanisqatsi,
    When uncorrected data are compared there is no statistical difference between rural and urban sites. GISS has pointed out that their adjustments only lower urban measurements. Detailed comparisons show that some urban sites are actually cooler than nearby rural sites, yet those data are never adjusted upward. That means that the UHI corrections result in a small lowering of the estimated temperature trend relative to the actual trend. The difference is not statistically significant; the estimated trend is left slightly too low, to be conservative.

    UHI is constantly raised by fake skeptics to muddy the waters. In reality, it is a very small effect that does not alter measured temperature trends.
  • Why Are We Sure We're Right? #2

    jyyh at 14:40 PM on 6 May, 2012

    Ok the more serious attempt to answer this:

    Why we're sure we're right

    To claim 'we are right', we must define the words. 'We' has traditionally been interpreted as a specific communion of entities which can understand the English language, that means the humans who understand that this is written in English. 'We' has a broader definition too, but I think this suffices for now. The specific communion of entities WRT the context here would be the people who understand how the world works, i.e. the physics, AND accept there is a world, which might be defined here as an assortment of environmental variables outside the entity's (if we wish to call a solipsist an entity) head, in the sense of the proverb: 'One can make a fence but not eternally block the world outside'. The dinosaurs of old times got this too late, according to the prevailing theory of 'why are there no dinosaur fossils found in the layers of rock dated (by radiological measures that have been and still can be reproduced in a laboratory containing an isotope counter) after the occurrence of a large crater on the Yucatan?' We humans (here encompassing all of our species) have at least had the LONEOS and Catalina surveys, among others, looking outside our comfort zone (broadly, the human biotope on earth (it's pretty large compared to nearly all other species on earth; only some marine species have had it wider; of mammals on land the next most widespread would be the rat and the dog, I guess)), so some development has happened. Most people notice the outside world as a kid when they get hurt, so it's quite natural to be afraid of it. One might even make a bold assumption that the so-called 'theory of gravity' is accepted more readily than the GHG theory because people have continuing first-hand experience of it
    (f.e. saggy eyes).

    'right' is the tricky bit here; taken out of context this could mean almost anything. However, here we, as separate entities sharing some common characteristics (f.e. we're
    humans who can read and understand English), which is why the use of 'we' or 'these (inclusive)' (in case there are artificial intelligences present) may be allowed, have some context. In the opening chapter Rob Honeycutt tells us this is about climate issues and about being sure of these. So we might take the headline 'right' to mean 'certain', like an entity capable of self-recognition is. Talking to oneself isn't very fruitful, though, and one might even define the word 'talk' so it necessarily includes a receptive other entity; so if one uses language, it necessarily follows one assumes there is an environment that has beings capable of understanding your words, and thus there would be no 'Matrix', or at least if there was there would be no way of talking of it. Oops, might have made an error there.

    'are' is just a statement of being.

    So, after a brief exploration of the meaning of words, on to the subject. We're certain of the GHG effect and AGW because we talk (exchange information with others) of the
    measurements (one sort of entities); if necessary we check them repeatedly (in order to rule out the possibility of changing physical constants), try to find the simplest way to make sense of them (no need to complicate things in science, it's hard enough as it is (ref. above text on the meaning of words)), and have come to the conclusion that there is a greenhouse effect (satellites give us the normal temperature in space this far from the sun), and that humans can increase the DWR (down-welling radiation) by adding insulation to the roof... oops, meant to say CO2 in the atmosphere. I'm tempted to end this note with a thinly veiled
    insult to some people willing to lie and obfuscate this issue; instead I link to this: http://xkcd.com/54/
  • Measurements show Earth heating up, think tanks & newspapers disagree

    Tom Curtis at 13:06 PM on 3 February, 2012

    Elsa @15:

    "I find it rather difficult to see why you think your own approach is in any way superior to the cherrypicking that you highlight. If we are looking for evidence of temperature changes then surely it is temperature that we should look at."


    Contrary to Mark R, the Daily Mail article was a response to this press release rather than news of the upcoming revision of the Hadley/CRU Land/Ocean Temperature Index. That press release contains data from four Land/Ocean Temperature Indices, three of which are in close agreement and show significant temperature increase over the last fifteen years.

    There is, however, one which is an outlier, and which shows little or no temperature increase over that period. The land component of that index is also an outlier compared to various land-only temperature indices, including the BEST analysis (not included in the Met Office press release as it is not a Land/Ocean index). As we also know, it is an outlier with respect to its revised version which, though not yet fully implemented, is known to show warming over that period due to the addition of more station data from regions which were formerly sparsely covered.

    Naturally, the Daily Mail article focuses exclusively on the outlier.

    The fact that it is an outlier, however, puts Elsa's suggestion that we look exclusively at the temperature into perspective. The GWPF and the Daily Mail did not look exclusively at the temperature, but exclusively at just one temperature index. That is the nature of the cherry picking she has taken it upon herself to defend.

    When faced with contradictory temperature data, the obvious thing to do is to exclude the outlier. It is also obvious that we should exclude the temperature index with the least geographical coverage (and hence the most unrepresentative of the globe as a whole). We should also exclude the temperature index with the least raw data, ie, actual station records. On all three counts, the temperature index we would drop is the HadCRUT3 temperature index, ie, the temperature index that the Daily Mail (and apparently Elsa) focus on to the exclusion of all others.

    Of course, given that there may be problems with the analysis in any of the indices, the other sensible thing to do is to look for corroborating evidence. If we look at natural events which are significantly affected by temperature, and they tell us a different story from what our temperature index is telling us, then we have significant reason to distrust our temperature index.

    This is in fact a method recommended by the scientists at NASA:

    "This derived error bar only addressed the error due to incomplete spatial coverage of measurements. As there are other potential sources of error, such as urban warming near meteorological stations, etc., many other methods have been used to verify the approximate magnitude of inferred global warming. These methods include inference of surface temperature change from vertical temperature profiles in the ground (bore holes) at many sites around the world, rate of glacier retreat at many locations, and studies by several groups of the effect of urban and other local human influences on the global temperature record. All of these yield consistent estimates of the approximate magnitude of global warming, which now stands at about twice the magnitude that we reported in 1981. Further affirmation of the reality of the warming is its spatial distribution, which has largest values at locations remote from any local human influence, with a global pattern consistent with that expected for response to global climate forcings (larger in the Northern Hemisphere than the Southern Hemisphere, larger at high latitudes than low latitudes, larger over land than over ocean)."

    (Source, my emphasis.)

    If you follow NASA's advice and look at other indicators of global temperature increase, it becomes obvious that global temperatures continue to rise. Arctic, and global (Arctic plus Antarctic), sea ice have declined:



    Glaciers have retreated:



    The Greenland and Antarctic Ice sheets have lost mass, while other smaller ice sheets are disappearing entirely:





    The oceans are gaining heat:



    And, among a host of other smaller signs, the Donner Christmas family hockey game is a dying tradition:



    These secondary indicators clearly show the Earth has continued to warm over the last 15 years. That is, it is GISS and NOAA who are giving us the straight dope on temperatures, not HadCRUT3, on which the Daily Mail keeps its eyes so firmly fixed.

    Apparently Elsa wants our eyes firmly fixed on HadCRUT3 as well. Not for her any glance outside of that little black circle (figure 2 above). For if we do glance at the additional evidence, we won't believe the Daily Mail's con.
  • Galactic cosmic rays: Backing the wrong horse

    Eric (skeptic) at 02:02 AM on 17 December, 2011

    muoncounter, Dr. Laken kindly responded to more of my questions and suggests caution in reading a lot into long term changes in GCR flux - too many unknowns in the relationship to clouds over those timescales. He also cautioned on Dragic's selection of GCR events. In the Laken paper they selected using TSI criteria that were designed to preclude bias in event selection. Dr. Laken also questioned the use of Diurnal Temperature Range rather than direct cloud measurements, and he might have a point there, but personally I'm not sure what is wrong with DTR, which is a localized climatological response.
  • Galactic cosmic rays: Backing the wrong horse

    Eric (skeptic) at 07:05 AM on 15 December, 2011

    Lucky for me, Dr. Laken took part of his evening to explain the TSI-GCR link. Over the long run an active sun means more TSI, and an active sun means less GCR (due to more solar wind). The measurements of solar activity are smoothed and somewhat qualitative sunspot counts, and TSI and GCR are running averages or proxies. Everything works the way I thought.

    But on a timescale of days the TSI relationship to GCR is complex due to positioning of the sunspots and other features and the movement of those features. So TSI and GCR (and solar UV) have more complex relationships including time delays. The wording in the abstract refers to those short term relationships (because that is what the paper is about).
  • Arctic sea ice has recovered

    CBDunkerson at 23:52 PM on 14 December, 2011

    Actually, if you want a simple explanation your best bet is the last graph at the bottom of the first link in my note above.

    This shows a comparison of the PIOMAS model volume estimates to a regression analysis of US Navy submarine readings of Arctic sea ice and the ICESat satellite's Arctic ice volume readings. The solid black line shows the PIOMAS model results and the dashed black line the resulting trend. The red line with '+' signs is the submarine regression analysis... basically, this means that they took spotty submarine records of ice thickness/area and used mathematical analysis to fit them together into a trend. The large pink shaded area above and below this line shows the uncertainty range around these values. The red line with triangles on it shows readings from the short-lived ICESat satellite, which measured the surface area of sea ice and calculated thickness based on the measured height of the ice above the water line (which, due to the relative densities of sea water and ice, represents roughly 10% of the total ice thickness). The red dashed line shows the trend of the submarine and satellite results.
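The freeboard-to-thickness conversion follows from hydrostatic balance. A minimal check, using textbook densities that are assumptions here rather than ICESat's actual retrieval parameters (the effective fraction also depends on snow load and ice density):

```python
# Archimedes: floating ice displaces its own weight of seawater, so
# freeboard / total_thickness = 1 - rho_ice / rho_seawater.
rho_ice = 917.0        # kg/m^3, typical sea ice (assumed)
rho_seawater = 1025.0  # kg/m^3, typical seawater (assumed)

freeboard_fraction = 1 - rho_ice / rho_seawater
print(round(freeboard_fraction, 3))  # ~0.105: about a tenth above water
```

Equivalently, a satellite-measured freeboard is multiplied by roughly 1 / 0.105 ≈ 10 to estimate total thickness, which is why small freeboard errors translate into large thickness uncertainties.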

    Note how closely the ICESat "actual measurements" agree with the PIOMAS 'estimates'. Likewise, note the similarity of the two trend lines. In both cases, they actually show MORE ice loss than the PIOMAS model. PIOMAS itself is also based on direct measurements BTW... they take satellite ice area measurements and estimate the ice thickness based on drilled sample readings, temperatures, past thickness data, et cetera.

    We thus have three different methods of calculating Arctic ice volume estimates... all of which show close agreement with each other.

    The detailed paper in the second link covers these issues in more detail and also determines uncertainty ranges by comparing PIOMAS ice thickness calculations to actual measurements of ice thickness in the same areas. Their analysis shows (again) that PIOMAS is actually under-estimating the rate of ice loss. They also found that the volume decline in 2010 was so great that it represented a new record minimum to a degree of certainty outside the uncertainty bounds... that is, even if we assume the prior record minimum year was the lowest possible value in the uncertainty range and 2010 the highest possible value, 2010 would still be a record low. Since then the September 2011 value actually came in lower than 2010, but not outside the uncertainty range.
  • 2nd law of thermodynamics contradicts greenhouse theory

    Tom Curtis at 11:43 AM on 14 December, 2011

    TOP @1152:

    1) You misquote Connolly as saying the troposphere is opaque to IR radiation. What he actually said is that it is "largely opaque to IR radiation". (My emphasis). That is easily verified by examining the downward IR radiation at the surface, as for example in these two spectra:



    You will notice that even with low humidity (Barrow Island), the atmosphere is essentially opaque to IR radiation outside the bands between wavenumbers 800 and 1000 cm-1, and between 1100 and 1200 cm-1. With high humidity (Nauru) there are significant local emissions even in those bands.

    These facts were first discovered by the US Air Force, which conducted experiments in the IR transmission properties of the atmosphere so that they could effectively deploy heat seeking missiles. Consequently heat seeking missiles, and IR cameras, and IR thermometers are all tuned to the bands of low IR emission by the lower atmosphere. This model, for example, is tuned to the entire band of low atmospheric emissions, 8 µm to 14 µm (see spectral response under specifications).

    Arguing that the atmosphere is not largely opaque to IR radiation because you can use an instrument tuned to the wavelengths in which the atmosphere is least opaque is bizarre, although certainly not unique to you among fake skeptics. Neither is misquoting a source to strengthen your case. I hope both were accidental, and that you will now recognize that Connolly's claim was correct.

    2) If known temperatures and humidities (from observations) are fed into a Line By Line (LBL) radiation model, the result looks something like this:



    This is the result of an actual comparison between a LBL model and observations over the Gulf of Mexico.

    This is the same comparison with theoretical and observed spectra offset for clarity:



    Here is a detail of the first image with black body curves shown for clarity:



    You will notice that the absorption band of CO2 emits radiation consistent with a black body curve of 220 K. As the Earth must on average emit energy equivalent to a black body curve of 255 K to not continuously gain heat, it follows that that low emission must be compensated for by a higher emission somewhere else. Because of the absorption by water, the band from which that higher emission can come is largely restricted to the area of IR transparency, ie, in which the radiation is coming from the surface. In order to emit IR radiation with a black body equivalence greater than 255 K, the surface of the Earth has to be at a temperature greater than 255 K. Ergo, the absorption of IR by CO2 forces the surface to have a temperature greater than that which it would have had in the absence of the CO2.
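The gap between emission near 220 K and emission at a typical surface temperature can be checked with the Planck function; a quick sketch (the band centre and both temperatures are round illustrative numbers, not values read off the spectra above):

```python
import math

# Planck spectral radiance per wavenumber, B(nu, T), used here only to
# show that emission from the cold (~220 K) CO2 band is much weaker
# than surface-temperature (~288 K) emission at the same wavenumber.
h = 6.626e-34  # Planck constant, J s
c = 2.998e8    # speed of light, m/s
k = 1.381e-23  # Boltzmann constant, J/K

def planck_wavenumber(nu_cm, T):
    nu = nu_cm * 100.0  # convert cm^-1 to m^-1
    return 2 * h * c**2 * nu**3 / math.expm1(h * c * nu / (k * T))

b_cold = planck_wavenumber(667.0, 220.0)  # CO2 band, high cold emission level
b_warm = planck_wavenumber(667.0, 288.0)  # surface temperature
print(b_warm / b_cold)  # roughly a factor of 3
```

That factor-of-several deficit in the CO2 band is the "bite" visible in the observed spectra, and it is the energy that must be made up elsewhere, principally through the atmospheric window.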

    That was all simple physics, and follows immediately from the observed emissions and the conservation of energy. No amount of experimentation with a toy box can prove these observations false, and therefore no amount of experimentation with a toy box can show that the greenhouse effect does not exist.

    In case you think that that observation/model comparison was cherry-picked because it was an unusually good fit, here is the scatterplot of 134,862 comparisons between measured and modeled Outgoing Longwave Radiation (OLR) measurements:



    3) A very minor point, but Wood's experiment was designed to show whether or not greenhouses warm because of increased IR back radiation from the glass panels. His experiment successfully showed that they do not. It does not show, and is incapable by its design of showing, that the greenhouse effect does not exist. People who think it does do not understand the physics of the greenhouse effect.

    In order to successfully test whether a slab model of the IR effect is physically sound, you need to isolate the radiated surface (floor of the box) and the window (top of the box) by means of a vacuum. I do not believe it is possible to model the actual greenhouse effect as seen in Earth's atmosphere in so small a physical model.

    4) Names are acquired through history and retained from convenience. Yes, the "greenhouse effect" is not in fact the effect that warms greenhouses. But neither are tin cans made from tin, nor are rubber ducks either rubber, or ducks. Get over it. If such trivia are all you have to criticize the theory, then that theory is very well grounded indeed.
  • 10 Indicators of a Human Fingerprint on Climate Change

    adelady at 16:25 PM on 3 November, 2011

    New Boy. Concluding that CO2 is "the principal cause of the current warming" is a result of two processes.

    One. Physicists and chemists identified the properties of CO2 and advised (and then warned) that increasing the proportion of CO2 in the atmosphere would result in warming.

    Two. Other people have done lots of work on analysing all the various causes and sources of both warming and cooling now and in the past. Having measured, analysed, checked and analysed yet more data, they've concluded that the other known causes of variations in the climate cannot account for the current warming. The properties and the increasing concentrations of CO2 can explain it.

    One - theory about ghgs predicted warming long before there was any real evidence. Two - measurements of warming were compared with other measurements (greenhouse gases, TSI, volcanoes, declining ice, orbital variations), and only greenhouse gas warming can account for the increases we've seen.
  • Berkeley Earth Surface Temperature Study: “The effect of urban heating on the global trends is nearly negligible”

    Bob Lacatena at 07:59 AM on 3 November, 2011

    29, lancelot,

    1) Smoothing over 60 years is inappropriate for this purpose, because the time span is too long. It's like putting on foggy sunglasses to fix a watch.

    You use smoothing to eliminate noise. In this case you would be using smoothing to eliminate the signal, too.

    2) Solar activity does not equal sunspots.

    This needs clarification in three ways.

    First, no one denies that the sun is involved in climate (it is the continuous source of virtually all energy on earth, for crying out loud!). But the fact is that the variations in total solar irradiance (TSI) are very, very small, though obviously over long time spans they can be a factor. Other factors (such as GCRs) are being researched, but there are currently more reasons to doubt their involvement in current warming than to give it any weight.

    Second, sunspots are a proxy for total solar irradiance and other relevant factors (such as magnetic field strength). There is some correlation between sunspots and these values, and there are sunspot counts going back 400 years, so they serve as a proxy for those important values, while direct measurements only exist since the modern instrument era and space age.

    But sunspots themselves are nothing more than visible blemishes on the sun which signal changes in solar activity. Sunspots themselves are nothing from that point of view. They are the greening leaves that tell you the warmth of spring has arrived, but they are not the warmth of spring themselves.

    Third, climate models have been successful at identifying the causes of short-term climate change (volcanic activity, solar variations, etc.) throughout the recent climate record. Basically, we can account for the forcings (including the sun) for the length of that graph, and within that the forcing which is causing the warming in the past 30 years is CO2.

    So to say that there has been constant warming since 1800 is misleading, because it implies that such warming is the result of one source, or some ongoing, mystical trend, when in fact there are periods of warming and cooling within that period, and more importantly the factors involved in each period of warming or cooling are known and quantified. There's no reason to think that they are all caused by something else that somehow lets CO2 off the hook.
  • Understanding climate denial

    elsa at 21:00 PM on 11 October, 2011

    DSL You say "climate modeling is an attempt to forecast climate based on a sound physical model and probable conditions."
    I don't really understand what you mean by "a sound physical model". As far as I know we have no knowledge outside the models themselves of the relationship between e.g. CO2 and temperature. We are not given the information by physics that we could then test. What we have is information obtained by measurements about temperature and CO2, which we then fit together in a model. But the logic here is circular. Of course the data will fit, because we have made it do so! Where the relationship between the two variables does not hold, we add in something else to ensure that it continues to do so.

    This is quite the reverse of Sir Richard Doll's work. When he started out he had no preconceptions about what was causing the rise in lung cancer rates. His initial hunch was that the tarmacing of roads was the problem. Only after his work did he come up with the smoking/cancer relationship.

    In my view this is the flaw in the warmist argument. The "scientific" evidence for AGW is completely lacking because it is dependent on the models, which are not really scientific at all, although by using maths and complex jargon they give a sophisticated impression. Until the warmists are able to come up with falsifiable propositions a truly scientific consensus cannot come about.
  • Venus doesn't have a runaway greenhouse effect

    Rosco at 11:22 AM on 5 September, 2011

    Sphaerica a question.

    http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch1s1-4-3.html

    1.4.3 Solar Variability and the Total Solar Irradiance

    3rd paragraph

    "Between 1902 and 1957, Charles Abbot and a number of other scientists around the globe made thousands of measurements of TSI from mountain sites. Values ranged from 1,322 to 1,465 W m–2, which encompasses the current estimate of 1,365 W m–2. Foukal et al. (1977) deduced from Abbot’s daily observations that higher values of TSI were associated with more solar faculae (e.g., Abbot, 1910)."

    How did they measure this ?
  • Venus doesn't have a runaway greenhouse effect

    Rosco at 10:30 AM on 5 September, 2011

    muoncounter - the 51% was after albedo of ~30% and atmospheric absorption of ~19%.

    Please read this from the IPCC
    http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch1s1-4-3.html

    1.4.3 Solar Variability and the Total Solar Irradiance

    3rd paragraph

    "Between 1902 and 1957, Charles Abbot and a number of other scientists around the globe made thousands of measurements of TSI from mountain sites. Values ranged from 1,322 to 1,465 W m–2, which encompasses the current estimate of 1,365 W m–2. Foukal et al. (1977) deduced from Abbot’s daily observations that higher values of TSI were associated with more solar faculae (e.g., Abbot, 1910)."

    the solar insolation is certainly a vector quantity

    This link shows ~51 % solar irradiance reaches Earth.

    http://www.physicalgeography.net/fundamentals/7f.html
  • Mars is warming

    jdixon1980 at 04:47 AM on 27 July, 2011

    Do I understand correctly that Fenton 2007 was not based on actual measurements of temperatures on Mars, but rather an inference that temperatures must be going up because the albedo was lower? In my crude understanding (I have a bachelor's degree in mechanical engineering, but I work as an attorney and am by no means up on scientific trends in general, much less trends in climate science), the inference is that a lower *proportion* of light energy from the sun was being reflected, meaning a greater *proportion* of light energy from the sun was being absorbed, and therefore temperatures must be rising as a result? If I am getting this right, then to rely on Fenton 2007 as evidence that Mars warmed between 1977 and 1999, wouldn't you have to accept in the first instance that albedo is a reliable measure of Mars temperature, and doesn't that theory imply that the TSI is fairly constant (for the moment passing over StanislavLem's comment about other solar phenomena besides TSI impacting Martian climate)? Otherwise couldn't the perceived decrease in brightness on Mars actually be due to a decrease in TSI rather than a lower proportion of TSI that is reflected? If that were the case, that would undermine the argument that the Earth is heating up *because* of solar changes, as the Earth would be heating up *despite* a decrease in TSI. Of course, if we are actually measuring TSI directly, then the foregoing line of reasoning is irrelevant, and please forgive my ignorance.
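    jdixon1980's ambiguity can be put in numbers. A minimal sketch, with rough assumed values for Mars (irradiance near 590 W/m^2, albedo near 0.25 - illustrative, not measured figures): a 10% albedo drop at constant TSI and a 10% TSI drop at constant albedo produce the same observed reflected brightness, but opposite changes in absorbed energy. Direct TSI measurements are what break the ambiguity.

```python
# Reflected flux = albedo * TSI; absorbed flux = (1 - albedo) * TSI.
S0, a0 = 590.0, 0.25   # assumed Mars-orbit irradiance (W/m^2) and albedo

# Scenario A: albedo falls 10%, TSI constant (warming inference)
aA, SA = a0 * 0.9, S0
# Scenario B: albedo constant, TSI falls 10% (cooling inference)
aB, SB = a0, S0 * 0.9

print(aA * SA, aB * SB)              # identical reflected flux observed
print((1 - aA) * SA, (1 - aB) * SB)  # absorbed flux differs: A > B
```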

    Turning to StanislavLem's comment about other solar phenomena causing dust storms on Mars, reasoning that those are "possible" on Earth as well, is anybody putting forth a cogent theory that dust storms on Earth actually are happening, that they follow the patterns of dust storms on Mars, and that they impact Earth's climate in a significant way?

    Is there a general consensus that the only actual significant heat energy from the sun is from TSI, even if other solar phenomena might have other impacts (gravitational/magnetic?) that could indirectly affect climate?
  • The Medieval Warm(ish) Period In Pictures

    muoncounter at 08:20 AM on 25 July, 2011

    Camburn#56: "I have not seen any rebuttals to the Sargasso Sea temperature proxies."

    Perhaps not. However, here is an excellent rebuttal to the numerous misrepresentations of Keigwin's Sargasso Sea data that continue to rebound throughout deniersville.

    Keigwin’s Fig. 4B (K4B) shows a 50-year-averaged time series along with four decades of SST measurements from Station S near Bermuda, demonstrating that the Sargasso Sea is now at its warmest in more than 400 years, and well above the most recent box-core temperature. Taken together, Station S and paleo-temperatures suggest there was an acceleration of warming in the 20th century, though this was not an explicit conclusion of the paper. Keigwin concluded that anthropogenic warming may be superposed on a natural warming trend. ...

    Keigwin’s Fig. 2 showed that δ18O has increased over the past 6000 years, so SSTs calculated from those data would have a long term decrease. Thus, it is inappropriate to compare present-day SST to a long term mean unless the trend is removed.
    -- emphasis added

    This analysis, Misrepresentation of Scientific Data by Hillary Olson at UT, is based on a 2010 GSA talk by Boslough and Keigwin. It features a point-by-point demonstration of the manner in which deniers cherry-pick from a legitimate study, modify, distort and misrepresent. It includes a discussion of how internet memes arise and gain traction despite being factually incomplete or incorrect. This particular 'Sargasso Sea was warmer way back when' meme is traced to the folks behind the Oregon Petition.
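    The detrending point can be sketched with synthetic numbers (purely illustrative, not Keigwin's data): when a series carries a long-term drift, the departure of the latest value from the raw long-term mean conflates the drift with any real anomaly; removing a fitted linear trend first isolates the anomaly.

```python
# Synthetic proxy series: a slow long-term cooling drift plus a recent
# warm anomaly of +0.5 in the final value.
n = 120
series = [-0.01 * i for i in range(n)]
series[-1] += 0.5

# Naive comparison to the raw long-term mean: the drift dominates, and
# the latest value even looks *below* average despite the warm anomaly.
raw_departure = series[-1] - sum(series) / n

# Least-squares linear fit, then remove the trend before comparing.
xs = list(range(n))
xbar, ybar = sum(xs) / n, sum(series) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, series))
         / sum((x - xbar) ** 2 for x in xs))
detrended = [y - slope * x for x, y in zip(xs, series)]
departure = detrended[-1] - sum(detrended) / n  # close to the true +0.5

print(raw_departure, departure)
```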

    For the benefit of any skeptical educators, Olson includes the relevant sections from the Texas Essential Knowledge and Skills (TEKS):

    The student uses critical thinking, scientific reasoning, and problem solving to make informed decisions within and outside the classroom. The student is expected to:
    (A) in all fields of science, analyze, evaluate, and critique scientific explanations by using empirical evidence, logical reasoning, and experimental and observational testing, including examining all sides of scientific evidence of those scientific explanations, so as to encourage critical thinking by the student;
    (B) communicate and apply scientific information extracted from various sources such as current events, news reports, published journal articles, and marketing materials;
    (D) evaluate the impact of research on scientific thought, society, and public policy;


    It is too bad those skills are in such short supply these days.
  • Milankovitch Cycles

    Bob Lacatena at 01:02 AM on 25 July, 2011

    To elaborate on scaddenp's comment at 17 (not sure if this has already been clearly stated elsewhere in the thread), my own understanding is that current theory states that the changes in insolation do not actually affect much themselves except to shorten/cool NH summers at the onset of a glacial, or to lengthen/warm NH summers at the onset of an interglacial.

    This change is enough to cause a slow (meaning a lot slower than what we're doing to the Arctic) advance or retreat of northern hemisphere snow cover. Because of the amount of land in the NH, this results in a substantial change in albedo, which of course pushes temperatures further in the same direction, and advances/retreats snow cover further.

    The change in albedo further results in other feedbacks, primarily CO2, through things such as vegetation changes and ocean temperature changes. These, of course, evoke further feedbacks, as is well described by current climate science literature.

    The fact that changes in TSI are so minimal, and yet the glacials/interglacials occur, is an important clue that climate sensitivity is high.

    Ultimately, these effects all combine enough to cause the level of climate change required.

    The main problem I've seen in the literature is in trying to identify the cause/mechanism behind what appears to be an abrupt release of CO2 (which is both detected in ice core measurements, and also required for the degree of climate change seen) early in the termination of a glacial period.

    There's a lot of literature to be found just by searching for "CO2 glacial termination."
  • 2010 - 2011: Earth's most extreme weather since 1816?

    Norman at 12:38 PM on 5 July, 2011

    scaddenp @ 240 and 241

    Not sure I understand your line of reasoning. The definition of science is given above. Gravity has a linking mechanism: all matter attracts, and it does so according to the working equation F = G(M1*M2)/r^2. Certain systems governed by gravity cannot be predicted. They are outside the realm of science. Even accumulating more information on the system will not make it more predictable.

    Example of chaotic gravity.

    After 10 years with Hyperion you may not be able to build a model to predict its motion but gravity is still scientific. If you have the measurements of its mass, Saturn's mass, its distance from Saturn, you can come up with a precise measure of the gravitational force acting on this moon.
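    Norman's point - that the instantaneous force is exactly computable even for a chaotically tumbling moon - can be sketched with rough published values for Hyperion (masses and distance are approximate):

```python
# Newton's law as quoted in the comment: F = G*M1*M2 / r^2.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
m_saturn = 5.683e26  # mass of Saturn, kg
m_hyperion = 5.6e18  # mass of Hyperion, kg (approximate)
r = 1.481e9          # Hyperion's mean orbital distance from Saturn, m

# The force is precisely determined even though Hyperion's rotation
# is chaotic and cannot be forecast far ahead.
F = G * m_saturn * m_hyperion / r**2
print(F)  # on the order of 1e17 N
```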

    If a climate model is to be considered scientific then it must pass the test of predictability. If the model is incapable of making valid tested predictions why would you consider it scientific?
  • It's the sun

    JoeRG at 12:39 PM on 3 July, 2011

    KR

    First is that natural variability means that on a short time scale (5-10 years) 'climate' models can only give an approximation of the 'weather', where on 20-30 years they do an excellent job of looking at trends. It's not a miss unless the observations go outside the envelope of model predictions, the orange and blue bands representing the multiple-run envelope. Therefore the fit with anthropogenic forcings is quite good.
    If you didn't notice, the two peaks I mentioned are the beginning and end of a trend that lasted about 35 years and that, after removing the noise of ENSO effects, was as straight as a temperature trend could ever be. So I wasn't speaking about a short-term effect but about a significant climatic scenario. Besides, it is a good example of the underestimation of natural forcings, especially the solar forcing.
    Given that the forcings are described as a function like in comment #845 by scaddenp, the function for the natural forcings is: (nat)Temp = Func(Sun, Albedo(clouds), Aerosols(volcanoes)). Looking at the conditions shows that in this period the albedo can be assumed nearly constant and the aerosols were slightly declining, with only a very small change after 1915. So the solar forcing remained the main driver of the trend that occurred.
    As well, if only the natural forcings were considered, this trend should have continued until 1963 because there were no significant changes.
    This leads to your next statement:
    Third, 'global dimming' shows up in both model and measurement data as change to a downward trend around 1940. I think your statement regarding that is unfounded.
    Excuse me, but where is it? In the measurements clearly, but where in the models?
    Given the circumstance that no natural forcing had changed far enough to stop the trend (in the period from '45 until '63), the conclusion is that anthropogenic forcings were at work in the manner of global dimming. This would mean that natural forcings must have caused higher temperatures than anthropogenic forcings in this time. But this never happens in the models.

    Summarized: We have a model that
    1) doesn't consider significant trends,
    2) underestimates natural forcings and
    3) shows wrong values of anthropogenic forcings.
    Sorry, but 'excellent' is something different.

    Finally, as to models - they are an important tool for teasing out the contributions and effects of different forcings, as well as a good check on our understanding of the physics involved.
    Such models give the impression that the physics is not well understood, at least in climate science.

    Second, given recent higher grade measurements of forcings, the post 1950's fit is accordingly better in the models.
    Not quite. Forcings are calculated based on measured physical values and observed conditions.

    I suggest you read the Models are unreliable thread if you have such concerns about the use of models as tools.
    I didn't mean to go too far off topic, but this model that you've presented is a good example for analysing how underestimated solar activity is in the climate models.

    As I see it, because of the false trails that exist (and that are fuelled by such bad models), too little research is done into possible amplifications of solar forcing to get a better understanding.
    For example, as I said before, the magnetic field of the Earth weakened by 10% in the last century while the solar magnetic flux nearly doubled. I found only a few studies about this influence on climate, and most of them were made by persons that you would call 'deniers'. In the IPCC documents I found nothing at all, regrettably.
    As well, an influence of the number and intensity of solar flares is possible (and could of course explain the unusually sharp rise in the OHC in 2003).
    But as long as only the last 35 years of solar activity are considered (as in the 3 topics and the IPCC reports) there will probably be no change in research.
    And that is not only a 'miss', it is truly a mess.
  • It's the sun

    KR at 23:43 PM on 30 June, 2011

    JoeRG

    I don't know if you are aware of this, but there are several issues with your comment.

    First is that natural variability means that on a short time scale (5-10 years) 'climate' models can only give an approximation of the 'weather', where on 20-30 years they do an excellent job of looking at trends. It's not a miss unless the observations go outside the envelope of model predictions, the orange and blue bands representing the multiple-run envelope. Therefore the fit with anthropogenic forcings is quite good.

    Second, given recent higher grade measurements of forcings, the post 1950's fit is accordingly better in the models.

    Third, 'global dimming' shows up in both model and measurement data as change to a downward trend around 1940. I think your statement regarding that is unfounded.

    Finally, as to models - they are an important tool for teasing out the contributions and effects of different forcings, as well as a good check on our understanding of the physics involved.

    Based on our understanding of the physics of forcings, the measured changes in solar activity, volcanic activity, etc., natural forcings should have cooled the climate considerably since mid-century. That did not happen - anthropogenic forcings made the difference, hence current warming. So again, the statistically significant (obvious to the point of a boot to the head) break between natural forcings and climate response became visible mid-20th century.


    I suggest you read the Models are unreliable thread if you have such concerns about the use of models as tools.
  • Uncertainty in Global Warming Science

    KR at 01:14 AM on 28 June, 2011

    Ken Lambert - "If TSI is above an 'equilibrium' value and stays constant - there is a constant imbalance in forcing which translates to a linearly increasing gain in energy..."

    As has been pointed out more than once, Ken, a constant imbalance could only be maintained if the TSI was increasing (not 'constant') to stay ahead of the increasing TOA radiation to space due to increasing temperatures.

    That's not happening. We have excellent data on TSI, very precise in noting changes in insolation even if there are inter-satellite absolute calibration uncertainties - we simply do not have the constantly increasing insolation required to drive the temperature changes of the last 30-40 years. Your hypothesis is quite simply contradicted by the facts.
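    KR's argument can be sketched with a minimal zero-dimensional energy-balance model (all parameter values assumed purely for illustration): under a constant forcing step, the outgoing response grows as temperature rises, so the imbalance decays toward zero rather than staying constant; a permanent imbalance would require an ever-increasing forcing.

```python
# Toy energy balance: dT/dt = (F - lam*T) / C
F = 1.0    # constant forcing step, W/m^2 (assumed)
lam = 1.2  # climate feedback parameter, W/m^2/K (assumed)
C = 8.0    # effective heat capacity, W yr/m^2/K (assumed)
dt = 0.1   # time step, years

T = 0.0
for _ in range(int(200 / dt)):  # integrate 200 years forward
    T += dt * (F - lam * T) / C

# The outgoing response lam*T has caught up with F: the imbalance
# has decayed to ~0 and T sits near its equilibrium F/lam.
imbalance = F - lam * T
print(T, imbalance)
```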

    I suggest taking additional comments in your ongoing TSI discussion back to it's the sun, where this has been repeatedly disproven.

    As a serious point, Ken, you seem to be unable to absorb any information on this topic. You've repeated the same incorrect assertions over and over and ... (repeat as necessary), and have had the errors in your hypotheses pointed out each time. Why do you continue to insist on this contrary to actual measurements?
  • Uncertainty in Global Warming Science

    dhogaza at 23:22 PM on 27 June, 2011

    Camburn:

    "It would seem Bertrand's research on solar agrees with Dr. Svalgaard.
    "Bertrand was investigating the effect of solar and volcanic influence on climate and concluded "these are clearly not sufficient to explain the observed 20th century warming and more specifically the warming trend which started at the beginning of the 1970s"."

    I don't think Camburn understands that the implication of Bertrand's research (assuming he cites it correctly) is that TSI hasn't caused the warming trend that started at the beginning of the 1970s. In other words "it's not the sun, rather than CO2". Strengthens, not weakens, the case of sensitivity to a doubling of CO2 being higher than denialists like Camburn so fervently want to believe.

    Now if Svalgaard's paper holds up over time, then yes, there's something not well understood about early 1900s warming. However, it's not claimed by climate scientists that the cause of this warming is perfectly understood.

    Camburn will be sure, though, that Svalgaard's *reconstruction* regarding TSI trumps all observations regarding CO2's role as a GHG, positive water vapor feedback, TOA satellite measurements, etc etc etc.
  • Uncertainty in Global Warming Science

    Camburn at 12:47 PM on 27 June, 2011

    KR:
    And then we have this: a paper that discusses the range of different TSI measurements in the present and their potential effects on climate.
    The TSI uncertainty is not insignificant and has a large bearing on understanding. This is crucial to modeling etc.

    Scafetta
  • How would a Solar Grand Minimum affect global warming?

    Ken Lambert at 12:31 PM on 22 June, 2011

    Tom Curtis #76

    "I am now at a complete loss to explain why you should quote the irrelevant uncertainty for absolute measurements rather than the directly relevant uncertainties for relative TSI when attempting to rebut KR #62."

    Because we really don't know what absolute level of TSI will produce an equilibrium condition on Earth in the absence of AG forcings.

    It is OK to look at relative TSI back as far as satellite measurement goes and find only the 11 year ripple - but if that average TSI was at an absolute level higher than an 'equilibrium TSI' to start with (a 20th century solar maximum for example) - then that would contribute an increasing energy input to the Earth system.
  • How would a Solar Grand Minimum affect global warming?

    Tom Curtis at 00:19 AM on 22 June, 2011

    Ken Lambert @75, given that you now show every evidence of understanding the difference in uncertainty for absolute measurements of TSI and for relative TSI (ie, change of TSI with time), I am now at a complete loss to explain why you should quote the irrelevant uncertainty for absolute measurements rather than the directly relevant uncertainties for relative TSI when attempting to rebut KR #62.
  • Shapiro et al. – a New Solar Reconstruction

    PhillyWilly at 11:59 AM on 31 May, 2011

    TSI is one thing, but direct and indirect solar forcing mechanisms on the climate system are another thing completely. The whole argument "TSI has been decreasing since 1980" really is not even relevant, because it's assuming that overall "equilibrium" is reached immediately after changes in TSI occur. Only in this case of "rapid equilibrium" would TSI energy changes from the Sun be the only way the sun can modulate the climate.

    But that is an assumption. TSI basically covers changes in total energy from the Sun itself, but not how the climate system responds to these changes, whether it be long-term changes in cloud cover, effects on the ozone layer changing the amount of UV rays that can enter the atmosphere/oceans, etc. I could go on and on.


    If we were to look at low clouds, for example, none of our measurement systems are "state of the art", so to speak, in measuring them. Yet the assumption seems to be that clouds will remain fairly constant unless influenced by AGW (with no mechanism to boot), and that neither direct nor indirect solar influence can affect them.

    A change in total clouds of 3% would have a significant radiative impact on surface heating, a 0.5 W/m^2 net radiative impact, and a change in low clouds only of 3% would apply 1.8 W/m^2 of increased energy. Even if those values are incorrect, changes in low clouds would act to ( -All caps usage snipped- ) through more incoming SW radiation, and that is exactly what we have seen thus far: satellite measurements of the entire troposphere showing less warming overall than the surface measurements. AGW works the other way around.

    And the small proposed effect from GCRs on cloud cover, if GCRs are excessively low for some time, may have a significant effect on low clouds over time.


    So arguing about TSI in the first place, at least short term, is really a bunch of semantics.
  • Carter Confusion #1: Anthropogenic Warming

    jonicol at 14:49 PM on 24 May, 2011

    1. Tom Curtis at 08:13 AM on 24 May, 2011
    jonicol @40,

    I found my first encounter with your theories several years ago......
    .
    He is so incompetent that his colleague, you, is still refuting a back radiation .......... without having realized after three years ...... is not the theory of the greenhouse used by climate scientists.

    1. Thank you Tom for coming back on this. I realised after reading a number of physics and spectroscopy papers on the action of CO2 in the atmosphere, and in particular after carrying out my own basic quantum mechanical analysis, that mine was not the “physics” used by climatologists. I then spent the next three years asking the Australian climatologists what the theory of the greenhouse used by them is. None has been able to give me an answer beyond stating that there was a correlation between 1979 and 1998, and virtually that was it. If you have more details, I would be delighted to hear from you. You have my email address at bigpond from my paper, so I will look forward to hearing from you with genuine interest.

    Kinninmonth demonstrably makes fundamental errors on a repeated basis, whether by design or through incompetence.

    2. Could you give some examples of his errors?

    Your listing of Bill Kinninmonth's qualifications is a pure appeal to authority.......


    3. I was not intending to go through the background qualifications of anyone, since I believe that most scientists learn on the job. However, someone else mentioned Bill as being one of my colleagues and indicated that he, Bill, was not qualified to make a contribution to climate discussions. It seemed appropriate to at least outline what I believe does give him very significant credibility in this field, even though he does not claim to be a “climatologist”. So, no, I am not appealing to authority as I did not make any comment on the results of Bill’s work. It would also be interesting to know, for instance, whether Andy Pitman or Will Steffen are conversant with the required physics and mathematics which is used to set up an AOGCM climate model, the basis upon which the science of climatologists relies. Not that it is any concern if they cannot, but they do refer to themselves as climatologists. Nor am I saying either that Hansen is insufficiently knowledgeable, even though I do not agree with his analysis of the behaviour of CO2, since he also uses assumptions based only on Arrhenius’ hypothesis with a bit of embellishment from Callendar, whose work is interesting but, I am sure he would agree, simplistic. As I included in a longer response in this thread, the main portion of which was snipped, the most significant error from the point of view of physics that Arrhenius and Callendar made, as perpetuated also in modern climatology, is that only the greenhouse gases are responsible for the rise in temperature from 255 K to 288 K. While at low concentrations the greenhouse gases will assist in warming the atmosphere, the major transfer of heat from the surface to air is via wind cooling over land and evaporation over water. The sea surface for instance in the tropics has not been known to rise to 100 C (373 K), which would be required for it to radiate at the rate it receives heat from the sun. Similarly over land, the surface exposed to the midday tropical sun does not reach more than about 60 C, whereas the radiation equilibrium temperature is 119 C. On the basis of these observations, it is not difficult to calculate the fraction of heat lost by radiation - < 20% - and nothing will change that. Perhaps you could give me some references to the work by Rabbet, Colose and Tamino, since in my limited knowledge of them they have seemed to be working more on the veracity or otherwise of temperature measurements and concentrations of carbon dioxide, rather than discussing the physical links between the two parameters.

    Nor, apparently, for all of Kinninmonth's qualifications ...... radiative transfer is unphysical

    4. I am wondering if you could be more specific in explaining why my model of the atmosphere is “unphysical”. I have had many comments on my paper over the years it has been on the web, but none which described it as “non physical”. As you would be aware, I have always given my email address and invited comments and in particular criticisms of the physics – but disappointingly none has been critical, and many physicists have commended it – not that I am seeking commendation, as it is just pure textbook physics applied to the atmosphere, with carbon dioxide in particular as an example of a greenhouse gas.


    You say you want to "draw attention for the need to discuss the scientific aspects of climate change". Nothing done by the moderators or participants in this forum has prevented you from doing so.

    5. That isn’t quite correct, as some years ago I added my two pennies worth, as I did the other day, and was pilloried for it. I wasn’t discouraged this time – just had the main part of my contribution removed from the site. I accept that it may have just now gone outside this particular thread, but somewhere else on this I have been urged to stick to the science.

    You have just been required, like the rest of us, to post on-topic discussions which are confined to the topic of the post being discussed. As all manner of climate science (and non-science by deniers) is discussed on this forum, finding a suitable topic should be no problem. Apparently, however, it is too much effort for you. You would rather hijack threads with long screeds devoted solely to your theories. However, this is not your site. Out of politeness to your host, you should obey the forum rules (see the comments policy). Your inability or unwillingness to do so is your only impediment.
  • It's the sun

    dana1981 at 07:58 AM on 17 May, 2011

    Cole -
    "This paper shows the Models underestimate solar forcing by up to six times."
    It does no such thing. The paper suggests that other TSI reconstructions underestimate the amplitude of TSI changes in the past. It has very little to do with climate models, and in fact specifically notes that their TSI estimates over recent decades, during which we have good measurements, are no different than previous TSI reconstructions.
  • Lindzen Illusion #6: Importance of Greenhouse Gases

    Bob Lacatena at 00:51 AM on 13 May, 2011

    60, SteveS,

    Interesting paper.

    I haven't gone through it yet, but it gives barely a nod to temperature and climate, and makes no statement at all beyond "the climate warming during the steady increase in solar activity in the first half of the twentieth-century." It seems they're just trying to use proxies (sunspot counts, and tree rings and ice cores) to compile a more precise record of TSI and SSI prior to the existence of adequate instrumental observations (last 30 years) back to 1600 (based on proxy data availability).

    If someone wanted to read into it, though, I think you're right in saying that it can't explain post-1950 warming. Really, if anything, it helps disprove solar influence. Looking only at their graphs as compared to temperatures, the implication is that increasing TSI through the first half of the 20th century could account for the warming in the first half of that century, as well as some reason for the leveling of temperatures starting around 1950, but then cannot account for the warming in the last 30 years.

    But in the end, the paper isn't about climate change at all. It's just about coming up with a new and hopefully better interpretation of proxy measurements to establish TSI and SSI back to 1600.
  • Lindzen Illusion #5: Internal Variability

    Arkadiusz Semczyszak at 20:16 PM on 10 May, 2011

    @Dana1981
    - Even if you're right, that simplifies things mercilessly.

    A significant component of unforced multidecadal variability in the recent acceleration of global warming. DelSole, Tippett and Shukla, 2010.

    There is a sentence:
    “While the IMP can contribute significantly to trends for periods of 30 yr or shorter, it cannot account for the 0.8°C warming that has been observed in the twentieth-century spatially averaged SST.”

    ... but nevertheless (also):
    “The warming and cooling of the IMP matches that of the Atlantic multidecadal oscillation and is of sufficient amplitude to explain the acceleration in warming during 1977–2008 as compared to 1946–77 ...”

    In the past, even as big a change as the MCA to LIA transition may have been caused by INTERNAL VARIABILITY (Medieval Climate Anomaly to Little Ice Age transition as simulated by current climate models, González-Rouco et al., 2011):
    “Therefore, under both high and low TSI change scenarios, it is possible that the MCA–LIA reconstructed anomalies would have been largely influenced by INTERNAL VARIABILITY. [...]”

    INTERNAL VARIABILITY is not only a redistribution of the energy absorbed by the ocean; it also involves changes in ocean circulation, the strengthening or weakening of the AMOC, ENSO, and local circulations. These affect how and where the ocean accumulates energy (and, through changes in the quantities of GHGs (water vapor, methane, CO2) and of clouds, its spatial distribution). Today we see parts of the ocean losing energy, an obvious influence of ocean circulation. And those circulations are themselves influenced by INTERNAL VARIABILITY.

    Solar Influences on Climate, Gray et al., 2010. :
    “... anthropogenic forcings are needed to explain the observations after about 1975. It should be noted that this is true globally as well as in many, but not all, regions, indicating that internal variability is larger in some regions than in others and also is larger than in the global means.”

    ... Top Of the Atmosphere (TOA): Estimations of climate sensitivity based on top-of-atmosphere radiation imbalance, Lin et al., 2009:
    “Currently, there is a lack of high accuracy measurements of TOA radiation imbalance.” “The range of feedback coefficient is determined by climate system memory. The longer the memory, the stronger the positive feedback. The estimated time constant of the climate is large (70-120 years) mainly owing to the deep ocean heat transport, implying that the system may be not in an equilibrium state under the external forcing during the industrial era.” “Furthermore, the climate feedbacks should include not only short-term (including instantaneous) responses but also longer time scale (or historical) responses because the climate generally has certain memories, which are omitted in these energy balance models.”

    @ adelady
    1. The sun's been a bit cooler the last few years.

    And so what? The sun has always acted with considerable delay (as I will probably have to repeat many, many times).
    Sub-Milankovitch solar forcing of past climates: Mid and late Holocene perspectives, Helama et al., 2010.:
    “Thus, the warmer and cooler paleotemperatures during the Medieval Climate Anomaly and Little Ice Age were better explained by solar variations on a millennial rather than bimillennial scale. The observed variations may have occurred in association with internal climate amplification [...] (likely, thermohaline circulation and El Niño–Southern Oscillation activity). THE NEAR-CENTENNIAL DELAY in climate in responding to sunspots indicates that the Sun's influence on climate arising from the current episode of high sunspot numbers may not yet have manifested itself fully in climate trends.”
    “... 70 ~120 years ...”, “... NEAR-CENTENNIAL DELAY ...” - After that time the climate will respond to the fact that: “The sun's been a bit cooler the last few years ...”, NO EARLIER !!!
  • Tracking the energy from global warming

    Berényi Péter at 09:44 AM on 9 May, 2011

    The issue is misinformation being spread in another thread:

    "Satellites have measured an energy imbalance at the top of the Earth's atmosphere".

    Challenging this proposition where it occurred is not allowed; we were redirected here.

    Therefore let's reiterate the references given there.

    See e.g. Trenberth 2009: "There is a TOA imbalance of 6.4 W m-2 from CERES data and this is outside of the realm of current estimates of global imbalances that are expected from observed increases in carbon dioxide and other greenhouse gases in the atmosphere".

    Or Trenberth 2010: "The difference between the incoming and outgoing energy -- the planetary energy imbalance -- at the top of the atmosphere is too small to be measured directly from satellites".

    From this it is crystal clear that satellites have not, in fact, measured an energy imbalance at the top of Earth's atmosphere, contradicting the claim that they have.

    So far so good. But we can certainly do better than that.

    There was an interesting presentation on the 12th CERES-II Science Team Meeting Wednesday, November 4, 2009, Marriott Hotel, Fort Collins, CO at 9:30 am by Paul Stackhouse et al.: FLASHFLUX Update.

    They merged three datasets.
    1. CERES Terra EBAF Edition 1A (3/2000 - 10/2005)
    2. CERES Terra ERBE-like ES4 Edition2_Rev1 (1/2003 - 8/2007)
    3. FLASHFlux Terra+Aqua (7/2006 - 9/2009)

    They used overlap periods to remove mean difference between datasets and anchored the entire time series to the absolute values of the EBAF.

    Their (improved) result is seen on slide 21. At the moment we are only interested in the lower panel, the net TOA radiation balance.



    Unfortunately it is only a picture one can't do much with, other than staring at it. Therefore it had to be re-digitized: Net_TOA_Imbalance_Stackhouse_2009.txt.

    As the absolute accuracy of satellite radiative imbalance measurements is very low, the baseline is of course arbitrary. It is simply aligned to EBAF and has nothing to do with the actual imbalance. However, since their precision is somewhat better, we can still use the series to track changes in this imbalance over time.

    To anchor the baseline to reality we need another data source (not considered by Stackhouse et al.)

    Fortunately we have quarterly data for the heat content anomaly of the upper 700 m of oceans since 1955 at the NOAA NODC Global Ocean Heat Content page.

    OHC (Ocean Heat Content) anomaly is perfect for intercalibration purposes, because it is a linear function of the temporal integral of the radiative imbalance at TOA. That is, the average slope of OHC in a time interval is indicative of the average imbalance over the same period.

    For intercalibration we need several full years, because Stackhouse et al. only provides deseasonalized data while Levitus et al. of NODC include seasonal signal.

    It is best to use data from the ARGO period, because prior to that OHC was poorly and sparsely measured by diverse systems, while ARGO provides a homogeneous and dense dataset. Now, before about mid-2003 ARGO coverage was not yet global, so we have to settle for the 6 years between Q4 2003 and Q3 2009.

    In this period (taking into account the error bars provided by NODC) the slope of the OHC anomaly curve (for the upper 700 m) is -1.8±9.6×10^20 J/year, which translates to an imbalance of -11±60 mW/m2. That is, in this period the climate system was probably losing heat, not gaining it, but the gain, in any case, was more than an order of magnitude smaller than Trenberth's 850 mW/m2: "The TOA energy imbalance can probably be most accurately determined from climate models and is estimated to be 0.85 ± 0.15 W m-2". Therefore heat accumulation for this period can be considered zero well within error bounds.

    This is fortunate, because only a fraction of the net heat content anomaly is realized in the upper 700 m of the oceans; the rest comes from or goes to elsewhere (deep ocean, land, ice sheets), although at least two thirds remains in the upper ocean.

    The average of Stackhouse's net TOA imbalance for the 72 months between October 2003 and September 2009 is 202 mW/m2; that is, their baseline is probably too high. If 213 mW/m2 is subtracted from each of their values, it brings the net TOA radiative imbalance in line with the OHC data.
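As a sanity check, the arithmetic described above can be reproduced in a few lines. This is only a sketch of the stated numbers; the Earth's surface area and seconds-per-year constants are standard values assumed here, not given in the comment:

```python
# Back-of-envelope version of the OHC-slope-to-imbalance conversion.
EARTH_AREA = 5.1e14        # m^2, total surface area of the Earth (assumed)
SECONDS_PER_YEAR = 3.156e7

# NODC upper-700 m OHC slope over Q4 2003 - Q3 2009, from the text
ohc_slope = -1.8e20        # J/year

# Average radiative imbalance implied by the OHC trend
imbalance = ohc_slope / (SECONDS_PER_YEAR * EARTH_AREA)   # W/m^2
print(round(imbalance * 1000))        # -11 (mW/m^2)

# Offset needed to align the satellite baseline (202 mW/m^2 average
# over the same 72 months) with the OHC-derived imbalance
satellite_mean = 0.202                # W/m^2, from the text
offset = satellite_mean - imbalance
print(round(offset * 1000))           # 213 (mW/m^2)
```

The two printed values match the -11 mW/m2 and 213 mW/m2 quoted in the comment.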

    Now that we have the correct offset for the TOA imbalance, we can calculate heat accumulation for the entire timespan covered by Stackhouse's data. It looks like this:



    As you can see, the story the data tell is somewhat different from the standard one. The heat content of the climate system is not increasing, but decreasing. What is more, the radiative imbalance at TOA during the satellite era is about -0.26 W/m2, which is, according to Trenberth, inconsistent with the 0.85±0.15 W/m2 determined from climate models.

    Furthermore, if we suppose about 2/3 of heat content changes are realized in the upper 700 m of oceans, it turns out satellite radiative imbalance measurements at TOA are also inconsistent with pre-ARGO OHC measurements.



    It probably means before about mid-2003 OHC data are absolutely bogus and unusable for model testing and calibration.
  • Frauenfeld, Knappenberger, and Michaels 2011: Obsolescence by Design?

    lucia at 00:25 AM on 7 May, 2011

    Tom Curtis at 10:47 AM on 6 May, 2011

    In response to:

    1) Ice melt extent in the 2000's would be statistically indistinguishable from that for the highest period in the reconstruction. (This is true as it stands, but apparently was not worthy of comment in the paper.)


    Huh? I thought what everyone is upset about is that the paper basically says the melt during the 2000s is statistically indistinguishable from that for the prior reconstruction. But if you agree that the ice melt extent in the 2000s is statistically indistinguishable from that of the highest period in the reconstruction, I'm sure Michaels will be happy to report that Tom Curtis decrees this is so.

    2) The two years with the greatest ice melt extent would have occurred in the last ten years, and five of the most extensive melts would have occurred in the last ten years. In no other ten year period would more than two of the ten most extensive melts have occurred.

    3) The year with the highest melt extent would be the most recent year, with just eleven reconstructed values having 95% confidence intervals encompassing that value.

    So what if the record happens to fall in the most recent year? This summer will probably have a lower melt than 2010. I'm unaware of any scientific rule that says papers discussing reconstructions can't be published if they happen to end with a melt index that is not a record. Moreover, if 2010 is a record it will still be a record until it is broken. The fact that it might not have been broken again in 2011 won't prevent anyone from pointing out that a record was broken in 2010.(And of course, if 2010 is not a record, it's not a record. )


    On the claim that two years with the greatest ice melt would have occurred in the past ten years: How are you concluding this with any certainty?

    It's true that the satellite measurements would indicate that the melts for 2007 and 2009 are greater than all reconstructed melt indices. But the reconstructed melt indices have uncertainties associated with them. Based on the observation that the 2007 melt index falls inside the ±95% uncertainty intervals for 20 of the reconstructed values, there is at most a 60% probability that the 2007 melt is greater than all melts during the previous period. (The probability is actually much lower, because I did a quick calculation which assumed the 2007 melt index exactly equaled the upper 95% confidence value for those 20 cases. So I took 0.975^20 = 0.6027 as the probability that all previous 20 are lower. But this represents an upper bound, because in each individual case the probability that a particular value from the previous period is lower is less than 0.975.)

    So with respect to 2007 -- you can't know for sure its melt exceeded those during the 30s. That certainly makes your claim that the two years with the greatest melt occurred in the past 10 years tenuous. (I suspect if we got the more detailed data from Chip and did the full analysis, we'd find the probability your claim is true is less than 1/2.)

    But even your claim that one year with the greatest melt occurred in the past 10 years is tenuous. Assuming your estimate that 2010 would fall inside the uncertainty intervals for 11 years, there is at least a 24% probability that the 2010 value is not a record. Very few people would consider this probability sufficiently high to say with confidence that even 2010 was a record. So you can't even say 2010 must be a record. (Though of course it might be. If we got the data from Chip, we could do a better calculation, and the probability that it's a record is even lower.)
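The probability bound described above can be sketched in a few lines. This only reproduces lucia's stated simplifications (each reconstructed value independently has at most probability 0.975 of lying below the observation), not the fuller analysis she says the detailed data would allow:

```python
def p_all_below(n):
    """Upper bound on P(observation exceeds all n reconstructed values
    whose 95% intervals encompass it), assuming independence."""
    return 0.975 ** n

# 2007 melt vs. 20 encompassing intervals: "at most a 60% probability"
print(round(p_all_below(20), 4))      # 0.6027

# 2010 melt vs. 11 intervals: chance that 2010 is NOT a record
print(round(1 - p_all_below(11), 4))  # 0.2431
```

These are the ~60% and ~24% figures quoted in the comment.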

    If a reviewer had been thoughtful, they might have asked FKM to do this calculation to firm up the numbers -- but given the traditions for making statistical calls, no one looking at the probabilities would decree that a record can be called with only 11, even making the simplifying assumption I made above.

    But the reviewers didn't do that. The consequence is the text in FKM doesn't discuss this at all, and text that might have needed more extensive modification showing that the probability of a record is "x%" isn't included in the paper.

    As the abstract stands in the published paper, changing "11" to "20" and "2007" to "2010" does not need to be modified. (So, yeah, assuming your '11' is correct, I missed on edit.) As a practical matter, the abstract only needs a "tweak" and you would have been no happier with it.

    Note: When the observed index falls inside fewer than 5 of the ±95% uncertainty intervals, a more refined calculation will be needed to figure out if we can 'call' a record. At some point -- I'm SWAGing it's when the observed melt index falls inside fewer than 2 of the ±95% uncertainty intervals -- it will be impossible to say that there is any realistic probability that the melt still falls within the range experienced during the 20s-40s.

    I suspect this will happen during the next El Niño. Since FKM's reconstruction is now published, you'll be able to do this using the FKM reconstruction, and they will need to admit it. (I don't think this notion seems to have sunk in.)

    4) The relatively low ice melt extents in the early satellite period are due in large part to major tropical volcanic eruptions, eruptions which were absent in the 1930s. In the absence of these eruptions, the longest period of extensive melting would be that straddling the end of the 20th century, not that in the middle. Clearly natural forcings have favored extensive ice melt in the mid 20th century, while acting against it towards the end. (True also in 2009, and surely worth a mention in the paper.)

    First: If this is intended to engage or rebut anything I wrote, it's a bit misplaced. I wrote about changes to the existing manuscript that would be required if 2010 was incorporated.

    Second: I don't disagree with your explanation of why the data looks as they do. Given the nature of this paper, I even think the paper would be stronger with this sort of discussion inserted.

    However, the reviewer (Box) who made a rather vague suggestion to this effect, while simultaneously requesting inclusion of data that was not available (and is still unavailable more than 8 months later), bowed out because that not-yet-available data were not incorporated. Evidently, whatever happened, neither the editors, the other reviewers, nor the authors thought to incorporate this sort of thing.

    It's worth noting that not every paper showing a time series or reconstruction discusses why the time series looks the way it does -- for example, "Surface mass-balance changes of the Greenland ice sheet since 1866" by L.M. Wake, P. Huybrechts, J.E. Box, E. Hanna, I. Janssens, and G.A. Milne doesn't discuss volcanism when explaining why they reported
    "Higher surface runoff rates similar to those of the last decade were also present in an earlier warm period in the 1920s and 1930s and apparently did not lead to a strong feedback cycle through surface lowering and increased ice discharge. Judging by the volume loss in these periods, we can interpret that the current climate of Greenland is not causing any exceptional changes in the ice sheet."


    So, while I agree both the Wake paper and the FKM paper -- both decreeing that this century appears more or less similar to the previous melt period -- might have benefited from the inclusion of a few sentences mentioning causal factors for the previous high and low melt periods, neither did. It seems the editors' and reviewers' standards are consistent in this regard.

    A paper drawing these conclusions, IMO, would be substantially different from the paper actually produced. Certainly it would have been very hard for Michaels to place the spin on it that he has been doing.

    As I understand it, his "spin" amounts to your conclusion (1) above which is "Ice melt extent in the 2000's would be statistically indistinguishable from that for the highest period in the reconstruction. (This is true as it stands, but apparently was not worthy of comment in the paper.)"

    Since the other conclusions you make are unsupportable based on the data, your suggesting that including them would prevent him from "spinning" as he is seems a bit odd. It would be rather silly to suggest that FKM are required to include incorrect or tenuous conclusions to avoid whatever you, Tom, happen to consider "spin".

    Other issues that puzzle me in your comment:
    Tedesco shows mass loss, while FK&M show melt extent.

    The graph I inserted, figure 1c from Tedesco, shows "standardized melting index anomaly". The caption reads "(c) standardized melting index (the number of melting days times area subject to melting) for 2010 from passive microwave data over the whole ice sheet and for different elevation bands." Tedesco also shows a graph of SMB (Surface Mass Balance), labeled figure 3a in Tedesco. Since FKM use melt extent, incorporating data for 2010 would involve 2010 melt extent data, not SMB data.


    Second, this analysis has been done graphically, and has all the consequent uncertainties (ie, the numbers might be out by one or two in either direction).

    Of course it's done graphically, and your numbers might be out by one or two.... I believe I said I was discussing an estimate, and I assume you are too. We could add: done in blog comments, not even at the top of a blog post where many people could see it, using Tedesco data as a proxy for the data that would really be used by FKM, and so on.

    I've suggested before that this will be worth doing when the melt data used by FKM do become available.

    I see in your later comment you speculate that if only FKM had used a different proxy to reconstruct, they would get different answers, and you speculate as to what those results would be based on eyeballing graphs. Ok... but if their choice of proxy was tenuous, or the reviewers had wanted to see sensitivity to choice of proxy, then that was the reviewers call. They didn't make that call.

    Also: The fact that the choice of proxy makes a difference widens the true uncertainty intervals on the reconstruction relative to those shown in FKM. So it would take an even longer time to decree that we feel confident that the current melt index is a record. When the melt index data are available, would you recommend doing the analysis to determine how much to widen the uncertainty intervals on the reconstruction? It seems to me that may well be justified.
  • Solar Hockey Stick

    Berényi Péter at 05:18 AM on 18 April, 2011

    "The solar radiative forcing is the change in total solar irradiance (TSI) in Watts per square meter (W/m2) divided by 4 to account for spherical geometry, and multiplied by 0.7 to account for planetary albedo"

    TSI is not a particularly good climate indicator. UV-A (between wavelength 320 and 400 nm) is better, because
    1. solar variability is much larger in the UV than in the visible or near infrared
    2. the atmosphere is pretty transparent to UV-A, because it does not fall into the O3 absorption band
    3. water is extremely transparent for UV-A, so this kind of radiation can penetrate into the ocean (down to several hundred meters) and deposits its energy there as heat

    As UV heating of seawater occurs at a lower geopotential than evaporative and radiative cooling (which happen right at the surface), UV heating, unlike thermal IR, contributes to ocean mixing. It means increased UV irradiation has a delayed effect on temperature, as the heat capacity of the oceans is enormous compared to any other part of the climate system.

    Unfortunately UV-A variability is not well constrained by measurements.


    Absorption Coefficient of Water

  • Christy Crock #3: Internal Variability

    Gilles at 20:22 PM on 14 April, 2011

    Enough with the strawman arguments, please. I never stated that the influence of forcings was zero. I just pointed out that it is very difficult, almost impossible, to precisely quantify the amount of internal variability with computer models: you just get the amount of variability in your model, that's all.


    for instance in the "solar hockey stick" post, you find this kind of curve



    Do you believe we have good models to explain the variation of solar activity over thousands of years? (Again, I'm not speaking of the influence on the Earth, just of the origin of the solar variations.) No, absolutely not; we haven't the slightest idea where they come from. We have models of the sun, but nothing like explanations of that. So if you rely on models to know whether these variations could be "natural" or "anthropogenic", would you conclude that they cannot be natural since the models do not show them, and thus must be anthropogenic? Of course this would be totally absurd. So we can't rely on models to exclude natural cycles. And yes, natural cycles on timescales of 1000 years can exist, of course. Our modern measurements are much too recent to see them at the required accuracy.

    Again, I am *not* stating that the influence of CO2 is zero. Just that achieving a very clear separation between it and natural variations is extremely difficult in my opinion, thus reinforcing the uncertainty in the climate sensitivity, and the outputs of computer simulations are not really useful for settling this issue.
  • It warmed just as fast in 1860-1880 and 1910-1940

    Albatross at 09:33 AM on 14 April, 2011

    All,

    Re the dubious TSI data from Soon. Here is a graph by Kopp and Lean (2011), both eminent scientists in this field, that shows a distinct downward trend between 1979 and 2010:



    Caption: Contributions to the empirical model of temperature shown in Figure 1 are broken down here: El Niño Southern Oscillation (purple), volcanic eruptions (blue), anthropogenic effects (red), and solar irradiance (green).

    [Source]

    and compare those data with the SAT data:



    Caption: Global surface temperature from 1980 to 2010 has risen by 0.4 degrees Celsius (0.72 degrees Fahrenheit) according to Climate Research Unit measurements (black) and an empirical model (orange). (Courtesy Kopp and Lean), same source as above.

    Not surprisingly, Kopp and Lean conclude that:

    "Using this model, Lean estimates that solar variability produces about 0.1°C global warming during the 11-year solar cycle, but is not the main cause of global warming in the past three decades."
  • The e-mail 'scandal' travesty in misquoting Trenberth on

    Gilles at 00:15 AM on 14 April, 2011

    "If the energy budget were well constrained by the data Trenberth would not have made the comment that is the subject of the article."

    Yes but sorry, the OP was about two kinds of interpretations :

    interpretation a): the global value of 0.9 W/m2 imbalance has been validated by measurements (note that this COULD have been true if the TSI and the TOA outgoing flux had been measured with good enough precision), but we still have some problems in the distribution of this energy among the ground/ocean/ice etc....


    interpretation b): we don't have any validation of this value from global incoming/outgoing fluxes, and besides, some problems in the distribution of this energy among the ground/ocean/ice etc....

    The simple fact is that, as I understand the OP, one could think that the right interpretation is a) (and I'm afraid a number of readers/writers on this thread think or have thought that), whereas, actually, all the scientific literature says it's b). I think it is worth stressing; that's why I do it.

    Alec: thanks for giving this example to your students; for me, this is a good piece of real scientific dispute, conducted with a concern for accuracy and rigor. Concerning the "flaw" in the theory, I would simply say that it is still too imprecise to be fully tested against data; that's far from exceptional in science, and I can give you a dozen similar examples. I'm not saying it's bad science; I'm saying it's normal scientific research on still-unresolved issues, and as such, full of uncertainties. Give that to your students, please.
  • The e-mail 'scandal' travesty in misquoting Trenberth on

    Gilles at 19:16 PM on 13 April, 2011

    Well, I don't see your point. I used Trenberth's own quotes to remind people that the energy budget isn't well known and constrained by measurements, because it seemed to me that some people here overlooked that and misunderstood what Trenberth really meant: he meant that the sum of the variations of energy content he could measure didn't match the theoretical expectations. And again, concluding that the theory is right and that the measurements miss one of the heat sinks is obviously only *one* possibility; the other obvious one is that the theory is incomplete! And I used the plot to illustrate how difficult it is to get an absolute value of the incoming TSI; obviously the relevant point is to compare the discrepancies between instruments to the accuracy required to test the imbalance, around 1 W/m2. Things would have been quite different if it had been a matter of several dozen W/m2, for instance, but it is not. You can recalibrate each instrument against the other, but always with an overall systematic uncertainty in the absolute value, because you don't know, of course, which one is *really* right.
  • The e-mail 'scandal' travesty in misquoting Trenberth on

    Gilles at 15:34 PM on 13 April, 2011

    les: I agree, it's a pity that so few people really read the papers and understand them. In Trenberth's paper, it is clearly stated that the 0.9 W/m2 is not the result of accurate measurements:

    " The Clouds and the Earth’s Radiant Energy System (CERES) measurements from March 2000 to 2005 were used at top of atmosphere (TOA) but adjusted to an estimated imbalance from the enhanced greenhouse effect of 0.9± 0.5 W/m2 (with 90% confidence limits) [7]."

    So they're adjusted to an estimated imbalance level. And later:

    "We cannot track energy in absolute terms because the accuracy of several measurements is simply not good enough. This includes the TSI [4] and the Earth’s TOA energy budget [6,7,15]"

    Again: we don't have accurate measurements of the TOA energy budget, including the total incoming radiation on the Earth, with enough accuracy to measure so small an imbalance as 0.9 W/m2. I think Trenberth himself states it very clearly. It's only a theoretical value, which hasn't yet been confirmed by measurements.
  • Muller Misinformation #1: confusing Mike's trick with hide the decline

    Tom Curtis at 09:12 AM on 3 April, 2011

    Further to DBDunkerson @99, Michael Mann has commented on this issue at RealClimate. The most important point he makes is that the conclusions of the paper were made based on an analysis of the individual yearly records and decadal averages. Consequently the smoothing method makes no difference to the conclusions of the papers:

    "In some earlier work though (Mann et al, 1999), the boundary condition for the smoothed curve (at 1980) was determined by padding with the mean of the subsequent data (taken from the instrumental record). This does make a small difference near the end of the series. It doesn't effect any of the conclusions drawn in the paper though. These were based on comparisons of the individual reconstructed annual values (individual years and decadal averages over 10 consecutive years) from AD 1000-1980, with those from the recent instrumental record (1981-1998), and centered on the fact that the recent instrumental values were outside the error range of the reconstructed values over the past 1000 years and were not related to the smoothed curve."


    Astute readers will also notice that Mann padded with the mean of the instrumental period rather than with the instrumental measurements themselves. That is an important point. First, it means that in splicing the instrumental record to the proxy record, Jones was not emulating Mann's procedure. Therefore, "Mann's nature trick", contrary to TTTM, is not the three-step procedure described by him (which is not a procedure ever used by Michael Mann). In fact, in a post co-authored by Michael Mann at RealClimate, the "Nature Trick" is explicitly described:

    "The paper in question is the Mann, Bradley and Hughes (1998) Nature paper on the original multiproxy temperature reconstruction, and the ‘trick’ is just to plot the instrumental records along with reconstruction so that the context of the recent warming is clear. Scientists often use the term “trick” to refer to a “a good way to deal with a problem”, rather than something that is “secret”, and so there is nothing problematic in this at all."


    So, TTTM's contention about the nature of the trick is incorrect, and John Cook is correct.

    For those who are interested, the difference between padding with the instrumental record (as TTTM claims Mann did) and padding with the mean of the instrumental record (as Mann actually did) can be seen by comparing figures 1 and 3 above. In figure 1, Jones did pad with the instrumental record. The result is a much larger slope at the end of the tail, even in the final years of the 70's (ie, the end of MBH 98 and 99's smooth), as the final value closes on the mean. The exact behaviour does depend on the smoothing function used, so this difference is probably not that significant. As a side note, TTTM's third claim, about truncation, is definitely false. In MBH 98, a 50 year smooth is used, and the smoothed function terminates in 1973 (see figure 5 (PDF)), ie, 25 years before the padding data ends, and 7 years before the proxy data ends. In MBH 99 a 40 year smooth is used and the smoothed function ends in 1979 (see figure 3 above), ie, twenty years before the padding data ends and 1 year before the proxy data ends. Clearly the end of the smoothed function was simply a consequence of the algorithm used, not the result of a deliberate truncation.

    These errors in the "reconstruction" of MBH's methods are typical of the "climate auditors". They repeatedly think of methods to "reconstruct" climate scientists' procedures that sound suitably culpable to them (or can be spun that way) and then present that as though it was an actual reconstruction of the scientists methods without checking the fine details that distinguish between those methods and closely similar methods. They are so slack that, as seen above, they can describe a graph terminated in 1973 as being terminated in 1980, visually a very noticeable difference.

    In this case, even if Mann had used TTTM's methods 1 and 2, there would have been nothing wrong with it. To my mind, it would have been more defensible than using the 20 or 25 year mean of the last values in the proxy data (a common alternative). Any padding of a smoothing function constitutes a prediction of future values of the smoothed data. Using a 25 year mean of the proxy data would constitute a prediction that the data would have a negative trend after 1980, reaching a value equal to the mean of the 1955 to 1975 values in 2005. Given this is a temperature proxy, which has tracked temperatures very well since 1880, it seems far more likely that the proxies would continue to track temperatures. Using the instrumental values to pad the series, ie, predicting the proxies would continue to track temperatures, therefore seems wholly defensible.

    In contrast, it would not be defensible as a method of padding Briffa's series, which demonstrably does not track temperatures post 1960. That, of course, is not what Jones did in the WMO report. Rather, he created a hybrid temperature reconstruction from two sources of data, as he in fact informed us in that document.
  • It's not us

    Berényi Péter at 04:13 AM on 22 March, 2011

    In the advanced version of The human fingerprint in global warming dana1981 writes:

    "Trenberth et al. (2009) used satellite data to measure the Earth's energy balance at the top of the atmosphere (TOA) and found that the net imbalance was 0.9 Watts per square meter".

    This proposition is false. What Trenberth has actually found in said paper is this:

    "There is a TOA imbalance of 6.4 W m−2 from CERES data and this is outside of the realm of estimates of global imbalances that are expected from observed increases in carbon dioxide and other greenhouse gases in the atmosphere"

    That is, Trenberth says satellite data are useless for measuring Earth's energy balance. Then he continues:

    "The TOA energy imbalance can probably be most accurately determined from climate models and is estimated to be 0.85 ± 0.15 W m−2".

    So. The energy imbalance is not measured, it is determined using computational climate models.

    Then, what he actually did to satellite data is described like this:

    "An upper error bound on the longwave adjustment is 1.5 W m−2, and OLR was therefore increased uniformly by this amount in constructing a best estimate. We also apply a uniform scaling to albedo such that the global mean increase from 0.286 to 0.298 rather than scaling ASR directly, as per Trenberth (1997), to address the remaining error. Thus, the net TOA imbalance is reduced to an acceptable but imposed 0.9 W m−2 (about 0.5 PW)".

    That is, he increased both OLR and albedo relative to actual data by amounts he considered acceptable in order to arrive at an imposed value of TOA imbalance.

    Therefore it's not true he has "found that the net imbalance was 0.9 Watts per square meter", but took a value based on model calculations and imposed it on satellite measurements.

    What Trenberth did is questionable, but defensible in a sense. Whenever you have next to useless data with unknown but large error margins, you either throw it away or do odd things to it in the hope at least something can be saved. If the data are as expensive to collect as CERES data are, NASA scientists have no choice but follow the latter path.

    On the other hand, the grave misrepresentation of Trenberth's paper by dana1981 above is indefensible. Calculations can be verified against measurements, but they can never be verified against (the same!) calculations. That is, Trenberth's figure of 0.9 W/m2 net imbalance at TOA is still an unverified claim.

    There is an important difference in science between true and false statements. The latter kind implies anything along with its own negation, therefore it's a bit ill suited for deriving meaningful results.
  • Climate Sensitivity: The Skeptic Endgame

    Gilles at 19:17 PM on 4 March, 2011

    e: you're wrong. If you take a random Gaussian distribution of the amplification factor f, with an average value ⟨f⟩, the average value of the gain 1/(1−f) (and hence of the sensitivity) will be larger than 1/(1−⟨f⟩). This is a high bias.
    I don't want to prove that climate models and measurements are wrong. I'm just saying that the kind of line you adopt (reasoning on a large sample of different values) is not very convincing from a scientific point of view, if the issue is whether the whole model is correct or not. It relies on the implicit assumption that the models have been proved to be true - which is precisely the point. This is a kind of circular justification.
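    The bias claimed here is an instance of Jensen's inequality applied to the gain factor, and it can be checked with a quick Monte Carlo sketch. The mean and spread of f below are illustrative choices, not values taken from the thread:

```python
import random

# Monte Carlo check of the claimed high bias: for a Gaussian feedback
# factor f with mean mu, the average of the gain 1/(1 - f) exceeds
# 1/(1 - mu) (Jensen's inequality). mu and sigma are illustrative.
random.seed(0)
mu, sigma, n = 0.6, 0.1, 100_000

gains = []
while len(gains) < n:
    f = random.gauss(mu, sigma)
    if f < 1.0:                      # discard (rare) runaway draws with f >= 1
        gains.append(1.0 / (1.0 - f))

mean_gain = sum(gains) / len(gains)
naive_gain = 1.0 / (1.0 - mu)        # gain evaluated at the mean f
print(f"mean of 1/(1-f): {mean_gain:.3f}")   # noticeably above 2.5
print(f"1/(1 - mean f):  {naive_gain:.3f}")
```

    The excess over 1/(1−⟨f⟩) grows with the spread of f, which is also why sensitivity estimates derived from uncertain feedbacks tend to have long upper tails.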

    Concerning the point of relaxation timescales: in the simplest approximation there is a single timescale, and the relevant equation is dT/dt + T/tau = S·F(t)/tau, where tau is the relaxation timescale and S the sensitivity. The exact solution of this equation is
    T(t) = (S/tau) ∫ F(t′) exp(−(t−t′)/tau) dt′. Mathematically, T(t) tries to follow the variations of F(t), but with a delay of the order of tau, and it smoothes out all variations shorter than tau. Basically, it responds to the average of F(t) over a past period of length tau.

    If tau is small (with respect to the characteristic timescale of F(t)), T(t) closely follows S·F(t). If it is large, the T/tau term is negligible and one has rather dT/dt = S·F(t)/tau, so T(t) = (S/tau) ∫ F(t′) dt′.
    The "response" is then the integral of F(t) (the system "accumulates heat"), but it is reduced by the factor tau.
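    The two limits of this single-timescale model are easy to see numerically. A minimal forward-Euler sketch, with illustrative values of S and tau and a linear forcing ramp (none of these numbers come from an actual climate model):

```python
# Forward-Euler sketch of the single-timescale response model above:
#   dT/dt + T/tau = S * F(t) / tau
# driven by a linear forcing ramp. S, tau and the ramp are illustrative.
def respond(S, tau, years=100.0, dt=0.1):
    T, out = 0.0, []
    for i in range(int(years / dt)):
        t = i * dt
        F = t / years                  # linear ramp: F = 1 at t = years
        T += dt * (S * F - T) / tau    # Euler step of the ODE
        out.append(T)
    return out

fast = respond(S=3.0, tau=2.0)    # short tau: T tracks S*F(t) closely
slow = respond(S=3.0, tau=50.0)   # long tau: T lags, curving upward

print(f"fast response at t=100: {fast[-1]:.2f}")   # near equilibrium S*F = 3
print(f"slow response at t=100: {slow[-1]:.2f}")   # well below equilibrium
```

    The long-tau run also shows the curvature discussed below: it warms more in the second half of the ramp than in the first, because the system is still far from equilibrium.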

    Now there are interesting questions around tau. If tau is small, S should be just the ratio of T(t) to F(t), so it should be precisely determined by current variations, which is obviously not the case. So we are rather in a "long tau" regime, with tau longer than or comparable to 30 years. This allows some "flexibility" between S and tau, because a constant ratio S/tau will give the same signal T(t) - I think this is the main reason for the scattering of S (and tau): they are not well constrained by the data.

    However, if tau is large, the response to a linearly increasing forcing should be quadratic (this is obvious because the temperature has to increase faster in the future to exceed the 2°C threshold, for instance), so an acceleration should be measurable. Is that the case? Not really. Temperatures are increasing no faster than they were 30 years ago - you can discuss whether they're still increasing or not, but they're not accelerating. That's kind of puzzling in my sense (leading to the obvious observation that if they aren't accelerating, a warming rate of 0.15 °C per decade will only produce 1.5 °C after 100 years).

    So there is a small window for which the sensitivity is high but not too high, and the timescale long but not too long, such that the "curvature" will become significant in the near future, but not yet now. Outside this window, the curve T(t) is essentially linear with a linearly increasing forcing (as the forcing is logarithmic with the concentration and the production of GHG is supposed to increase more or less exponentially with a constant growth rate, the forcing should be close to linear).
    This is only possible for tau between 30 and 100 years, say (which is essentially what is found in current models).
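    The near-linearity of the forcing claimed above can be checked directly from the standard simplified expression F = alpha·ln(C/C0) with alpha = 5.35 W/m² (Myhre et al. 1998); the 1%/yr concentration growth rate below is an illustrative choice:

```python
import math

# Exponentially growing CO2 concentration under a logarithmic forcing
# law gives an exactly linear forcing in time: F(t) = alpha * growth * t.
alpha, C0, growth = 5.35, 280.0, 0.01    # W/m^2, ppm, fraction per year

forcings = [alpha * math.log(C0 * math.exp(growth * yr) / C0)
            for yr in range(0, 101, 25)]

# Successive 25-year increments of F are all identical (linear growth):
steps = [b - a for a, b in zip(forcings, forcings[1:])]
print(steps)   # every step = 5.35 * 0.01 * 25 ~ 1.34 W/m^2
```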

    But again this raises other interesting questions. 30 to 100 years is SHORT with respect to paleoclimatic times and astronomical (Milankovitch) changes of forcings. So IF tau were in this range, temperatures should follow the forcings rather closely, and change only very slowly with them. But we also hear here about a lot of "variations" of climate at the centennial time scale (the medieval "anomaly", whatever happened then, D-O events, and so on) which should NOT happen if the forcing is not changing at this time scale. But why would the forcing change? Aerosols, volcanoes - do they have a reason to statistically change when averaged over 100 years or so (remember that the temperature responds to the forcing averaged over this time scale)?
  • Crux of a Core, Part 1b

    NewYorkJ at 05:58 AM on 4 March, 2011

    Gilles (#17),

    The instrumental record appears to extend another decade. Note the recent decade is more than 0.5 C warmer in the Arctic regions than the previous one.

    2000-2009 compared to 1990-1999

    I often see contrarian types remove the instrumental record entirely, which often means cutting off 30-50 years of data, then claiming MWP was warmer than the "recent period".

    See also: Kaufman 2009

    A few of the proxies appear to be high latitude tree rings, which might have the modern divergence problem.

    Regarding scaling, if the purpose is to just show a correlation, there's no need to do the scaling, unless one is trying to greatly exaggerate very weak correlations. Scaling is appropriate if you have entirely different unrelated measurements, such as temperature vs TSI. Furthermore, Hall doesn't even accomplish showing a correlation in the context of his initial graph. Recall that his initial graph focused on the recent 10,000 year period. From his new "correlation" graph, which increases the time scale by an order of magnitude, the last 10,000 years are just a blur, and finding any correlation over that period is impossible.

    Also note that Hall is ultimately claiming recent global warming is not unprecedented, and using the scaled graph to support the idea that there were steep variations in other regions. To follow his lead, maybe we should scale recent global temperature changes 5x, which would show 4 C of warming over the last century.

    On a related note, there's an important distinction between "recent warming" and "recent warmth", which can be mistakenly used interchangeably. One refers to rate of change and the other to magnitude. It's easy to show "recent warmth" is not unprecedented. The Holocene peak was possibly a little warmer than recent temperatures, and of course millions of years ago when dinosaurs roamed the Earth it was considerably warmer. Hall uses the phrase "recent warming". Over the last 2000 years, there has not been a rapid rise in temperature.
  • Climate Sensitivity: The Skeptic Endgame

    Gilles at 04:26 AM on 3 March, 2011

    Alexandre and Albatross: with pleasure. Look toward the people of ASPO, such as Aleklett, Laherrère, and Rutledge:
    http://www4.tsl.uu.se/~aleklett/powerpoint/20100609_Aleklett_kva.pdf
    http://rutledge.caltech.edu/
    http://aspofrance.viabloga.com/files/JL_IPCCscenarios09.pdf
    They all say approximately the same thing: IPCC scenarios grossly overestimate the size of fossil resources, especially oil and gas, and the "current path" is simply untenable, so extrapolations aren't justified. Note that the amount of "official reserves" is not criticized - the point is just that the IPCC doesn't really take account of proven reserves; it assumes that expensive, unconventional resources could be extracted at the same pace or even a higher pace than the current conventional ones - which isn't justified by any real facts. If peak oil happens soon, which is more and more likely, then all SRES scenarios will fail on this point.

    KR: "Looking at that same graph, and at the numbers, you will note that the lower end of the range of 2-4.5°C is very solid, very little chance of the actual values falling below that level -"

    I disagree with you: inaccurate data and models are never "very solid", even from a statistical point of view. They can be prone to systematic biases. As an example, models of the sun can explain a lot of things, but not the 11-year cycle. That is, if you believe in computer models, they hardly describe activity cycles, and when they do, the cycles are much shorter than expected. The 11-year cycle can be dismissed on a "statistical study" of the models - yet it does exist. In the same way, supernovae never explode in numerical models. That's a pity, but supernovae don't exist in the world of numerical computations. Based on numerical experiments, they are statistically impossible. Yet they do exist. So I am personally rather reluctant in front of a "set of inaccurate measurements".

    "You seem to be arguing that natural forcings or feedbacks (outside the ones we know of, or poor measurements? Not clear...) will lead to overestimation?"

    No, I am talking about causes other than forcings and feedbacks - causes of change that are due to something else: changes in the oceanic circulation, atmospheric convection patterns, and so on - that can lead to a variation of the average temperature without a change of the average forcing.

  • Climate Sensitivity: The Skeptic Endgame

    KR at 03:56 AM on 3 March, 2011

    Gilles - "well, on the other hand, simple inspection of the Fig 2 in this post shows a scattering of almost a factor 10 in the estimates. So what's your definition of a 'reasonably well known factor'? "

    Looking at that same graph, and at the numbers, you will note that the lower end of the range of 2-4.5°C is very solid, very little chance of the actual values falling below that level - it would contradict just about all the data we have. We know about alligators near the Arctic circle and the "snowball Earth" - the data sets certain minima on what climate sensitivity could be. The high end is less certain - and I'm not going to take uncertainty indicating very high sensitivities as a good sign.

    There are definitely uncertainties in the data, although I believe you are overstating them - lots of lines of evidence support paleo reconstructions of CO2, solar activity, aerosol levels, temperatures, etc. And there are few issues indeed with Milankovitch forcings.

    "The issue I see is that IF natural, unforced variability is important , it will generally lead to overestimate the sensitivity. But excluding an unforced variability is very difficult just because you have no way of knowing if it is unforced or not - so by making the assumption that everything is due to forcings, you will end up with a figure, which will be always biased to high values. "

    I have to seriously disagree. You seem to be arguing that natural forcings or feedbacks (outside the ones we know of, or poor measurements? Not clear...) will lead to overestimation? But that's only true if for each of the studies there is an unaccounted for forcing that adds to the effect. It's just as (un)reasonable to argue that said unmeasured forcings subtract from it. If there are poor measurements, they only widen the range, not bias it.
  • Dispelling two myths about the tropospheric hot spot

    TimTheToolMan at 08:53 AM on 27 February, 2011

    I'm not actually sure you understand the argument Dan. This is a relatively new paper (Oct 2010) and looks at the spectral components of the TSI.

    I did a search for "Stratospheric cooling" and came up with the usual references to TSI increases resulting in a different warming profile to CO2 warming. For example

    "8.If the warming is due to solar activity, then the upper atmosphere (the stratosphere) should warm along with the rest of the atmosphere. But if the warming is due to the greenhouse effect, the stratosphere should cool because of the heat being trapped in the lower atmosphere (the troposphere). Satellite measurements show that the stratosphere is cooling."

    But this is an incorrect statement on TSI changes, as it assumes that all spectral components change together. It turns out that this is not correct, and so the observed changes since the satellite was put up there indicate a change that ought to mimic the CO2 profile.

    I'm happy to hear any arguments you have against this, but pointing to "it's not the sun" links is irrelevant.
  • It's not us

    Julian Flood at 22:32 PM on 25 February, 2011

    quote
    Can you explain why it would be informative to exclude some natural source and lump the remainder of the natural carbon cycle together with anthropogenic emissions?
    unquote

    Because that highlights the logic of the argument. Any increase outside the lumped-together sources can be pointed to as the cause of all of an increase. If one postulates e.g. an increase in metabolised methane from the permafrost (this is, IMHO, uncontentious as the suppression of methane efflux by acid rain is documented), or perhaps warmer deep water is increasing CO2 emissions from metabolised clathrates, then one can by the same line of reasoning say 'it's all coming from the permafrost'. This would of course be wrong, one needs to add up all the changes and then -- in this case -- one can say 'it's X Gt from permafrost and 27 Gt from fossil fuel. The proportions are X:27 and, since we don't know the absolute size of the sinks, we do not know what eliminating fossil fuel emissions will do to the rate of sink, but it will remove the CO2 in proportion to the contribution, i.e. X:27.' If we cut all of the fossil fuel emissions then we might find that the CO2 levels continue to increase because X from permafrost is bigger than the enhancement of the sink. Because we do not know the actual size of the permafrost contribution, we do not know the actual size of the sink which is taking up all but 14 Gt of the enhanced (fossil emissions + permafrost contribution). We do not know enough to make a meaningful statement.

    quote
    What is important is whether CO2 levels would be rising if not for anthropogenic emissions, and the answer is quite clearly "no, they would be falling" (which we know because the net effect of the natural environment as a whole is to absorb about half our emissions).
    unquote

    No, that is not the case. We know only that all sinks add up to more than the sum of all the sources. We do not know that CO2 levels would be falling because we have not measured the sinks and sources. See above.

    quote
    Given that CO2 levels would now be falling if we were to cut our emissions to zero, it seems odd to suggest we are not 100% responsible for the current rise.
    unquote

    No, you cannot truthfully make that assertion -- I could say 'if we cut our emissions to zero then the rate of increase of atmospheric CO2 would only diminish by 10%' and I would be talking equal nonsense. See above. This is just an assertion of what we are discussing and brings us no further forward. It does point up one of the problems I'm trying to understand: if we cut to zero, would the increase in atmospheric CO2 entirely cease? My contention -- perhaps too strong a word -- my fear is that it would not.


    quote
    I have repeatedly explained that you don't need to know the value of individual fluxes to know that the natural environment as a whole is a net sink. If you shared a bank account with your partner and always put in $100 a month more than you spent, but observed your monthly balance only increased by $50 a month, you would know your partner was a net sink (to the tune of $50 a month) without needing to know where he/she spent the money, or how much he/she spent in total or how much he/she deposited each month. The mass balance argument is essentially analogous.
    unquote

    But if you have a beneficent uncle who is adding untold amounts to your account, or not, depending on how his ulcer feels, you can then say nothing about what's going on. Your daughter, meanwhile, has found a way of silently tapping off an increased allowance, and a direct debit, which you have had running so long that you'd forgotten it, has ceased. Now you do not know who is doing what because there are too many unknowns, as I have repeatedly pointed out. Unless you know the details about what's going on, you don't know what's going on and you cannot make any meaningful statement about what's going on.

    I agree that the mass balance argument is analogous. It is, however, incomplete in your presentation. Perhaps here we have an insight into our disagreement -- you believe that the sources and sinks are all accounted for and the only things to be considered are one input and one output, while I am not sure they are, which is why I ask these questions. My own guess is that we have screwed up one or more biological sinks, the pull down of 12C has decreased, leaving more 12C in the atmosphere, and we're misinterpreting that as part of the fossil fuel signal. But that is just a guess -- it might even be MWP deep water at last reaching the deep ocean clathrates. Or something else.

    However, we'll have to wait for more measurements -- only then, to continue the analogy, will we be able to look at a bank statement and see what everyone's up to. Only then will we be able to truthfully say things like 'if we stop emitting fossil fuel CO2 then atmospheric CO2 levels will begin to fall.'
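    For reference, the arithmetic of the mass-balance argument being debated in this exchange can be written out in a few lines. The flux numbers below are round illustrative values, not measurements:

```python
# Mass-balance bookkeeping: the net natural flux is defined as the sum
# of ALL non-anthropogenic sources and sinks, whatever they are
# individually (oceans, biosphere, permafrost, clathrates, ...).
# Values are round illustrative figures in GtC/yr.
emissions = 9.0        # anthropogenic source (fossil fuels + cement)
atmos_rise = 4.5       # observed annual increase of carbon in the atmosphere

net_natural = atmos_rise - emissions
print(f"net natural flux: {net_natural:+.1f} GtC/yr")  # prints -4.5: a net sink
```

    The disagreement in the thread is not over this subtraction, but over whether the net figure can be meaningfully decomposed, and extrapolated, without measuring each flux separately.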
  • Radiative forcing by aerosol used as a wild card: NIPCC vs Lindzen

    SoundOff at 15:26 PM on 22 February, 2011

    rhjames,

    It’s not quite right to judge clouds’ cooling effects by how you feel as a cloud passes over your head. Clouds aren’t outside the climate system. The SW energy that did not reach your skin was partially absorbed by the cloud top and partially reflected to somewhere else, some of it outside the climate system. The warmer cloud warms the surrounding atmosphere. At the same time, some of the LW energy emitted by the Earth around you is intercepted by the water vapor held by that same cloud and again used to warm the surrounding atmosphere. A warmer atmosphere won’t let the Earth’s surface radiate as effectively so the surface will warm, though not as much as it cooled in the shady spot where you are standing. Some complicated measurements and accounting are needed to assess the overall effect and we aren’t there yet.
  • Coral: life's a bleach... and then you die

    Mila at 01:55 AM on 14 January, 2011

    #22 I am a PhD chemist by training, but as this is outside my field I am unable to appreciate the details - the papers I have read about isotope measurements as proxies of the past make sense to my general experience.

    Unfortunately, as soon as you reject isotope (and other) proxies as indirect evidence, we will have to wait for the first prototype of a time machine - which may take some time - especially to prove it really did work :)
  • Lindzen and Choi find low climate sensitivity

    RW1 at 14:26 PM on 22 December, 2010

    Eric (RE: Post 148),

    Eric: "RW1, the sun, measured by TSI changes in the historical measurements and proxies, increased by about 0.5 W/m^2 from the depths of the Little Ice Age to about 1900, see fig. 1 in A-detailed-look-at-the-Little-Ice-Age.html. The temperature increase, which also involved other factors, was at least 0.5C, maybe more like 1C. With no other factors considered the "gain" is something like 2.5 to 5 W/m^2 divided by 0.5 which is 5 to 10, rather than 1.6

    The problem, I believe, is you are calculating gain with full solar input (zero to current day) which will yield a much smaller result than a delta of solar input as I demonstrated, albeit crudely, using the LIA."

    You're assuming the temperature increase was caused entirely by the 0.5 W/m^2 increase in solar power. The overwhelming majority of it could have been caused by a countless number of other things or combination of things - most of which we still don't know.

    It's well known that the very small increases in total average solar radiance are not enough to cause the warming we've seen since the LIA.
  • Lindzen and Choi find low climate sensitivity

    Eric (skeptic) at 12:43 PM on 22 December, 2010

    RW1, the sun, measured by TSI changes in the historical measurements and proxies, increased by about 0.5 W/m^2 from the depths of the Little Ice Age to about 1900, see fig. 1 in A-detailed-look-at-the-Little-Ice-Age.html. The temperature increase, which also involved other factors, was at least 0.5C, maybe more like 1C. With no other factors considered the "gain" is something like 2.5 to 5 W/m^2 divided by 0.5 which is 5 to 10, rather than 1.6

    The problem, I believe, is you are calculating gain with full solar input (zero to current day) which will yield a much smaller result than a delta of solar input as I demonstrated, albeit crudely, using the LIA.
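    Eric's back-of-envelope "gain" can be reconstructed as follows. The conversion of the quoted 0.5-1 C warming into an equivalent flux via Stefan-Boltzmann at a nominal 288 K surface temperature is my assumption about where the "2.5 to 5 W/m^2" figure comes from:

```python
# Convert a surface temperature change to an equivalent emission change
# via Stefan-Boltzmann, then divide by the quoted forcing change.
sigma = 5.67e-8                        # Stefan-Boltzmann constant, W m^-2 K^-4
T_surf = 288.0                         # nominal mean surface temperature, K
flux_per_K = 4 * sigma * T_surf ** 3   # ~5.4 W/m^2 of emission per kelvin

d_forcing = 0.5                        # W/m^2, LIA-to-1900 change quoted above
gains = [dT * flux_per_K / d_forcing for dT in (0.5, 1.0)]  # 0.5-1 C warming
for dT, g in zip((0.5, 1.0), gains):
    print(f"dT = {dT} C -> gain ~ {g:.1f}")   # roughly 5 to 11
```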
  • A detailed look at climate sensitivity

    Eric (skeptic) at 12:39 PM on 19 December, 2010

    To RW1 on the Lindzen thread, muoncounter recommended that you visit this thread for evidence of high sensitivity. I second that, but IMO the arguments here come up short in several respects. One is that average measurements of higher water vapor do not take into account the distribution of WV. If it is higher and evenly distributed then it is a positive feedback to CO2 warming. But if WV is unevenly distributed in a world warmed slightly by CO2, then an average increase in WV will result in less or no amplification.

    Second, the derivation of sensitivity from paleo studies routinely ignores unmeasured confounding factors. I gave one possibility here: cosmic-rays-and-global-warming.htm, but there are others. Typically the response is to treat solar geomagnetic variations as a proxy for TSI and then dismiss it because of poor correlation and low amounts of TSI change. Also the last 30 years of detailed measurements don't show much in the way of GCR related climate effects. However the penultimate interglacial coincides with an abrupt decline in GCR, so a relatively small TSI increase could be amplified without the need for CO2 feedback.
  • The human fingerprint in the seasons

    Daniel Bailey at 03:20 AM on 8 December, 2010

    Re: Norman (130)

    It might be helpful to think of sunspot number as a surrogate marker for TSI. When lacking TSI data from modern measurements, sunspot number is a useful metric. But in this modern instrumental era, TSI is much more valuable. And as such, TSI shows at best a 5-10% attribution of the warming measured over the past 30 years.

    In the absence of CO2 forcing from anthropogenic fossil fuel emissions, such as in the paleo record, changes in TSI can act as a significant forcing (up or down) on global temperatures. As do Milankovich cycles.

    No observable mechanism other than the rise in CO2 explains the rise in temperatures we've measured since 1980.

    So it's not the sun.

    It is what it is.

    The Yooper




© Copyright 2024 John Cook