
Comments 29801 to 29850:

  1. Andy Lacis responds to Steve Koonin

    ranyl @15, although we have a stated likely range of 9 to 31 meters sea level rise at 392 ppmv, the lower bound does not start increasing again until about 550 ppmv, and the upper bound, though it increases, then decreases to about 29 meters from about 550-800 ppmv.  On the postulate that increasing temperatures will not result in decreasing sea levels, that rise in the upper bound must be treated as simply a loss of certainty due to limited data over that interval and we have a prediction of essentially no further increase in sea level other than by thermal expansion from 390-800 ppmv.  

    To reach that state of relative stability, the WAIS and GIS must be essentially completely melted.  That in turn indicates a sea level rise of about 14 meters from WAIS and GIS plus a little more from glacial ice and thermal expansion.  So, assuming we will only have 9 meters of sea level rise is indeed folly based on that data.  Sustained CO2 concentrations at current levels will give us 14 plus meters of sea level rise, but the same is true at 550 ppmv, and not till you get above 1000 ppmv is there any credible risk of the complete melt of the polar ice caps (the original point queried).

    So far I have broadly agreed with you, though perhaps disagreed on detail.  Where I truly disagree with you is on rates.  While it is reasonable that a greater TOA energy imbalance (not forcing) will result in a faster ice melt, there are a host of other factors involved.  One of those is the surface area subject to ice melt, which is far less with the surviving ice sheets than was the case at the end of the last glacial.  Another is the regional energy imbalance in the melt season, which was far greater at high northern latitudes (due to Milankovitch "forcing") during the end of the last glacial than during the current anthropogenic temperature increase.

    Further, even if I agreed with you about the possibility of 5 meters per century sea level rise in the current warming (which I do not), it would not be possible in this century.  The current rate of sea level rise is about 0.32 meters per century.  Even if that were to linearly increase to 5 meters per century by 2100, that would still only give us a sea level rise over the century of slightly less than 2.5 meters.  (Less than because 14% of the century has already occurred at low rates.)  An exponential increase to that rate gives us a lower rise, not a greater rise - although it would predict a greater sea level rise from 2100 to 2200.  That, however, is itself a problem.  Hansen's predictions suggest that most of the WAIS and GIS will melt out in less than fifty years early next century, a prediction which is not credible.
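    A quick back-of-the-envelope check of that arithmetic (a minimal Python sketch; the 0.32 m/century starting rate and the 5 m/century end-of-century rate are the figures from the paragraph above, while the 2015 start year is an assumption):

    ```python
    # Sea level rise this century if the rate grows linearly from ~0.32 m/century
    # now to 5 m/century by 2100 (figures from the comment above).
    start_year, end_year = 2015, 2100   # assumed; ~14% of the century has passed
    r0, r1 = 0.32, 5.0                  # rates in metres per century

    # The mean of a linearly increasing rate is the midpoint of its endpoints.
    avg_rate = (r0 + r1) / 2
    rise_remaining = avg_rate * (end_year - start_year) / 100
    rise_so_far = r0 * (start_year - 2000) / 100

    print(f"2015-2100 at a linearly increasing rate: {rise_remaining:.2f} m")
    print(f"Total for the century:                   {rise_so_far + rise_remaining:.2f} m")
    ```

    The exact total depends on when the acceleration is assumed to begin, but it stays well short of 5 m, which is the point being made.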

    There is a reason why virtually nobody (I know of no actual cases) who is an expert in ice sheet dynamics or sea level rise accepts Hansen's predictions.  There is a reason also why those who think the IPCC is too conservative expect sea level rises of the order of 2 meters or less for this century, not 5 meters.

    If you want a more realistic assessment for this century, you should turn to Jevrejeva et al (2014):

    "We construct the probability density function of global sea level at 2100, estimating that sea level rises larger than 180 cm are less than 5% probable. An upper limit for global sea level rise of 190 cm is assembled by summing the highest estimates of individual sea level rise components simulated by process based models with the RCP8.5 scenario. The agreement between the methods may suggest more confidence than is warranted since large uncertainties remain due to the lack of scenario-dependent projections from ice sheet dynamical models, particularly for mass loss from marine-based fast flowing outlet glaciers in Antarctica. This leads to an intrinsically hard to quantify fat tail in the probability distribution for global mean sea level rise. Thus our low probability upper limit of sea level projections cannot be considered definitive. Nevertheless, our upper limit of 180 cm for sea level rise by 2100 is based on both expert opinion and process studies and hence indicates that other lines of evidence are needed to justify a larger sea level rise this century."

    (My emphasis)

    Aslak Grinsted (one of the authors) writes on his blog:

    "Any value for the upper-limit would meet opposition. Some would see it as overly alarmist, and others would argue that things could go a lot worse. We believe that with the methods we use, we fairly represent the broader community uncertainty.

    For the ice sheet contribution we used a shap-shot of the expert uncertainty from 2012 (Bamber & Aspinall, 2013). Since then several studies have found that parts of Antarctica is already collapsing. This new knowledge may alter expert opinion (as we note in the paper), but we can only speculate by how much. This has led Joe Romm at Think Progress to argue that our study therefore "vastly* underestimates" worst case sea level rise. However, domain experts are ahead of the game, and ice sheet experts have long considered the possibility of a collapse. It is important to realize that the expert elicitation we used did not only ask for a best estimate, but asked each scientist to give a confidence interval. And it is clear from their responses that they did consider this possibility.

    "This indicates a growing view that a significant marine ice-sheet instability in the WAIS could initiate in the coming century." -From Bamber & Aspinall 2013.

    The new studies do not really inform on how fast that might happen, and I believe that the high-end would not change much if the same experts were asked the same question now. I speculate that these new studies will have a greater effect on what experts consider to be the most likely value, than the tail."

    Further on the expert elicitation by Bamber and Aspinall, their 95% bound on elicited ice sheet contribution to sea level rise in 2100 was 17.61 mm per annum.  Note that that is the sea level rise in that year.  It means the expected average contribution over the century is approximately half that (or about 0.9 meters total contribution).  The median is 5.4 mm per annum and the mean 6.9 mm per annum.  The maximum elicited expert contribution was 38 mm per year (for an approximate expected contribution over the century of 1.9 meters).  Hansen's views are so far outside the bounds of actual expert opinion on this topic as to be absurd.

  2. michael sweet at 07:15 AM on 15 April 2015
    2015 SkS Weekly Digest #15

    DSL,

    What was the period ending in December?  How far back do you need to go to get out of the current record streak?

  3. 2015 SkS Weekly Digest #15

    March GIS L-OTI is out: 0.84C.  That's the 5th warmest month and the 3rd warmest March.  The 12-month period ending in March is the warmest in the record, beating the periods ending in February (now 2) and January (now 3).  Feb-Mar Multivariate ENSO Index was .65.  

  4. The history of emissions and the Great Acceleration

    sailesh rao, excellent observation.  andyskuce's graphs are really illuminating to the idea that IF we could get fresh water to convert the deserts to agriculture, we could absorb massive amounts of CO2 from the atmosphere while letting more delicate lands, rich in biodiversity, return to nature.  Change in one direction releases massive amounts of CO2; in the other it absorbs similar amounts.  The deserts are a very big potential CO2 sponge.

    I'm thinking of something like massive and sustainable "San Joaquin" valleys through the desert belts of the world. The only challenge is fresh water, but if you think about it, that is also the problem, there is too much water in the atmosphere as the planet heats up. Water is also a great way to convert excess heat into humidity again leading to fresh water. From my studies this is the solution. This is what I meant to say by "continuing the great acceleration" of wealth creation, while creating a "Great Absorption" of atmospheric CO2.

    There are two ways to get that water: 1) from the atmosphere, which produces energy while producing fresh water; 2) diversion from the arctic rivers.  While the latter is potentially catastrophic in terms of environmental disruption, it is less so than loss of the Gulf Stream circulation (GSC), which such action would help to curb.  There is one more action to retain the GSC, but that's another story.

    I know this sounds crazy, but people here know better than anyone that what we are experimenting with, through geologic-scale oxidation of fossil fuels, is much crazier.

  5. Andy Lacis responds to Steve Koonin

    "Earth system were to reach equilibrium with modern or future
    CO2 forcing. Given the present-day (AD 2011) atmospheric CO2
    concentration of 392 ppm, we estimate that the long-term sea level
    will reach +24 +7/−15 m (at 68% confidence) relative to the present."

    That was at 392ppm.

    http://www.pnas.org/content/110/4/1209.full.pdf

    The early Pliocene was ~350-400ppm (most of the later papers, with more accurate calculations, point closer to 350ppm), and sea levels were 20-25m higher.

    This is where stats lets us down really, for everyone will grasp at the seeming 16% chance that sea level rise will be below 9m.  That figure results from taking a wide range of uncertainties in sea levels and CO2 levels and arriving at this larger uncertainty, yet no expert in paleoclimatology would dispute that the Pliocene was warmer and that sea levels were 20m higher when CO2 was ~350-400ppm.

    Also interesting that 5cm a year rates of sea level rise have occurred in deglaciations when the general global warming was occurring an order of magnitude or two slower than current rates.

    Seems reasonable to suggest that the faster the heat accumulates in the system, the faster the ice sheets will melt, and it is well known that melting an ice sheet is an ever-accelerating event due to ice sheet lowering and dynamic melting processes (e.g. the heat from surface melt ponds being transferred to the heart of the ice sheet).

    That is why Hansen wonders at 5m in a century, and why the IPCC's 0.8m by 2100 is generally regarded as very conservative.

    Moderator Response:

    [PS] Fixed link. It would be appreciated if you did this yourself with the Link button in the comment editor.

  6. 2015 SkS Weekly Digest #15

    Thank you very much Tom, I appreciate you spelling it out to me.

  7. Andy Lacis responds to Steve Koonin

    As Tom has pointed out, when one is considering eustatic sea level rise (i.e. the rise due to more water being added to the oceans) the behaviour of the East Antarctic Ice Sheet will not simply mimic that of the Greenland Ice Sheet, the West Antarctic Ice Sheet or the Antarctic Peninsula.

    The EAIS is at a much higher elevation than its smaller counterparts, and would need considerably more (and longer) planetary warming before it would even come close to melting out. Those wishing to learn a bit more might care to look at the Antarctic Glaciers website, or at the British Antarctic Survey site.

    There is little meaningful argument that the Mass Balance for each of the GIS, the WAIS and the Peninsula is in negative territory. However, the EAIS may actually be accumulating ice at present, as enhanced precipitation in its central regions (thanks to the good old Clausius-Clapeyron relationship) could be more than compensating for increased peripheral loss.

  8. CO2 limits will harm the economy

    Maybe this has been talked about in the comments: Where might I find a detailed examination of the likely performance of an ETS vs Carbon Tax/Fee and Dividend?

  9. 2015 SkS Weekly Digest #15

    Tristan @1, similar claims have been made on SkS before, and discussed in detail.  The validity of the claim depends, however, on the way it is formulated.  In this case, based on IPCC AR5 figures, to counterbalance total annual anthropogenic emissions, humans need only increase total terrestrial photosynthesis (ie, Gross Primary Productivity) by 8.9 GtC, or 7.24%.  That is, of course, the figure for all terrestrial photosynthesis, not just crop land.  Taking the figures of Haberl et al (2007), that equates to increasing agricultural productivity by 57.1% of current human appropriation of net primary productivity or 107.6% of the current harvestable yield.

    I don't know whether that is technically feasible, but it does not present a theoretical bar.  However, that increase must be the increase in persistent biomass.  Any actual crop, whether consumed by humans or as fodder for animals, returns to the atmosphere as CO2, and hence is not sequestered.  The sequestration, therefore, is limited to the annual increase in standing biomass and soil biomass.  Here, however, you face a problem.  Based on the IPCC AR5, human activity has reduced terrestrial biomass by just 30 GtC since the industrial revolution.  That is, if we could manage that rate of sequestration, we would exceed natural levels of biomass within 3.4 years.  That figure is net of the increase in biomass due to increased temperature, water, CO2 and the agricultural revolution.  On an alternative measure, we would be exceeding natural levels within 18.4 years (remember to convert from GtCO2 to GtC if checking the figures).  Allowing for preindustrial levels, we might have 36 to 40 years of such sequestration before we reached a situation where the land simply would not hold more biomass.  So, while such high rates of sequestration cannot be excluded, neither can they be maintained sufficiently to provide a long term solution.
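    The arithmetic behind those first figures, as a minimal sketch (the 123 GtC/yr Gross Primary Productivity value is implied by the 7.24% quoted above and matches the AR5 figure; everything else is taken from the comment):

    ```python
    # Figures as quoted in the comment above (IPCC AR5 based).
    annual_emissions = 8.9    # GtC/yr, total anthropogenic emissions
    gpp = 123.0               # GtC/yr, terrestrial Gross Primary Productivity
    biomass_lost = 30.0       # GtC, net reduction in terrestrial biomass since industrialisation

    required_gpp_increase = annual_emissions / gpp        # ~7.2%
    years_to_restore = biomass_lost / annual_emissions    # ~3.4 years

    print(f"Required increase in GPP:      {required_gpp_increase:.2%}")
    print(f"Years to restore lost biomass: {years_to_restore:.1f}")
    ```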

    Finally, many of the changes suggested require conversion to small scale, labour intensive farming (permaculture).  The problem with that is it requires the majority of the population to be farmers.  It represents a retreat back to a dark age, where we do not have sufficient resources to maintain a system of universities and scholarship, or of mass entertainment (which may be of interest to more people).

    I am not suggesting that implementation of the methods advocated by Savory or the Rodale institute is not worthwhile in itself, or that it will not help.  They are just not a silver bullet for global warming, and they make themselves incredible (ie, not capable of being believed) by suggesting otherwise.

  10. 2015 SkS Weekly Digest #15

    I'm not quite sure where to put this, it may not be within the remit of this site. I'm just wondering if this might be an example of 'misinformation' from team environment.

    "According to the Rodale Institute, small-scale farmers and pastoralists could sequester more than 100% of current annual CO2 emissions with a switch to widely available, safe and inexpensive agroecological management practices that emphasize diversity, traditional knowledge, agroforestry, landscape complexity, and water and soil management techniques, including cover cropping, composting and water harvesting."

    http://www.commondreams.org/views/2015/04/13/food-farming-and-climate-change-its-bigger-everything-else

    Without a background in ag science, I can't evaluate the truthiness of that.

  11. Andy Lacis responds to Steve Koonin

    Thanks for a very clear explanation.

  12. Andy Lacis responds to Steve Koonin

    mrkt @10, looking carefully at the graph in the upper right panel, it becomes clear that they show a "likely range" (68%) at 9 to 59 meters at 450 ppmv.  That makes their claim of a "likely (68% confidence) long-term sea-level rise by more than 9 m above the present" true, but obscure.  Ie, they are saying that the likely confidence interval, using IPCC methods of expressing confidence, has a lower limit of 9 m.  From that it follows that their results show an 84% probability of at least 9 meters of sea level rise for a long term CO2 concentration of 450 ppmv.

    That is not how I initially read it, so thank you for drawing attention to my error.

    Looking at their likely range over that interval, it appears evident that the upper value is so large due entirely to the limited observations.  On the assumption that increased temperatures will not cause water to refreeze, I think the lowest upper limit of the likely range at higher temperatures can also be used as a more realistic upper limit for a long term 450 ppmv concentration.  That produces an adjusted likely range between 9 and 30 meters, with 30 meters representing the almost complete melt of the WAIS and GIS, with any surviving ice from those ice sheets being more than compensated for by melt of the EAIS.

    bozza @11, the flat line is because while the WAIS and GIS are on the verge of melting, the EAIS is much more stable, and will mostly remain intact once the WAIS and GIS have melted away, with little melting other than at the fringes.  Only after considerable further rise in temperature will the EAIS melt away.

  13. Andy Lacis responds to Steve Koonin

    @ 9, I think this is surely the relevant question at the moment!

    @ 7, why things flatline and/or double dip I don't know but the links should prove interesting- cheers.

  14. Andy Lacis responds to Steve Koonin

    Tom Curtis @7:  In looking closely at the graphs, it appears that a 9 meter rise corresponds to 1 std. dev. below the mean. This would imply that the chance of less than 9 meters is 16%, and the chance of more than 9 meters is 84% rather than 68%. No? Clearly not a better situation. (I do note that you are quoting the paper.)
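    A minimal sketch of the percentile arithmetic being discussed here, assuming an approximately normal distribution (an idealisation of the paper's actual ranges):

    ```python
    from statistics import NormalDist

    # If 9 m marks the lower end of the central 68% ("likely") range, it sits
    # one standard deviation below the mean, so:
    nd = NormalDist()        # standard normal
    p_below = nd.cdf(-1)     # ~0.16
    p_above = 1 - p_below    # ~0.84

    print(f"P(rise < 9 m)  ~ {p_below:.0%}")
    print(f"P(rise >= 9 m) ~ {p_above:.0%}")
    ```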

  15. Andy Lacis responds to Steve Koonin

    I think it would be accurate, however, to say that 450ppm is incompatible with the ice-age cycle. (We didn't have ice ages when the atmosphere was last at 450ppm, though there were still polar ice caps.)

  16. Andy Lacis responds to Steve Koonin

    gregcharles @5, the CO2 in the atmosphere does not reflect IR radiation back towards the Earth.  Rather, it absorbs it, and then reradiates it.  The difference is important, because if reflected the energy returned would be the difference between that which the surface emits (398 W/m^2) and that which escapes the atmosphere (239 W/m^2), ie 159 W/m^2.  In fact, downwelling IR radiation (or back radiation) averages around 342 W/m^2 (figures from IPCC AR5).

    The much higher back radiation is due to the air mass immediately above the Earth's surface (from which most of the back radiation originates) having a temperature very close to that at the surface.  In contrast, most of the IR radiation to space comes from high in the troposphere, where the temperatures are much lower.

    Further, the actual back radiation is not important to the greenhouse effect (although may be important for local weather events).  That is because if the back radiation were to increase, with no change in the greenhouse effect, evaporation and sensible heat transfers would also increase to maintain a balance, and if it were to decrease, evaporation and sensible heat would also decrease.  The greenhouse effect is determined by the top of atmosphere energy balance.
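    To put rough numbers on that, a minimal sketch inverting the Stefan-Boltzmann law for the three fluxes quoted above (treating each flux as ideal blackbody emission, which is an approximation):

    ```python
    SIGMA = 5.670374e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

    # Effective blackbody temperature implied by each flux in the comment above.
    for label, flux in [("surface emission", 398.0),
                        ("back radiation", 342.0),
                        ("emission to space", 239.0)]:
        temp = (flux / SIGMA) ** 0.25
        print(f"{label:17s} {flux:5.0f} W/m^2  ->  ~{temp:.0f} K")
    ```

    The ~279 K implied by the back radiation is close to near-surface air temperature, while the ~255 K implied by the outgoing flux corresponds to the colder upper troposphere, which is the contrast described above.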

  17. Andy Lacis responds to Steve Koonin

    CBDunkerson @1, Bozza @2, the most recent credible data that I know of is encapsulated in this figure:

    (Source)

    Total melting of the ice caps is associated with a sea level rise greater than 70 meters.  Ergo, from paleo data, we cannot expect that unless we have sustained CO2 concentrations of 800-1500 ppmv.  From that paper (Foster and Rohling 2013), we learn that "our results imply that acceptance of a longterm 2 °C warming [CO2 between 400 and 450 ppm (46)] would mean acceptance of likely (68% confidence) long-term sea-level rise by more than 9 m above the present."  That probably represents the melting of the West Antarctic Ice Sheet (WAIS), and the partial melting of the Greenland Ice Sheet (GIS), with only limited melting of the East Antarctic Ice Sheet (EAIS).

    Looking at the figure, Andy Lacis may be basing his claim on van der Wal et al, 2011, except that when I actually look at van der Wal et al, the modelling shows a total loss of polar ice does not occur until a sustained temperature anomaly of plus 20 C is reached, ie, around 1600 ppmv with an Earth System Climate Sensitivity of 8 C.  It appears, therefore, that Foster and Rohling have incorrectly represented van der Wal et al's results.

    Aslak Grinsted gives a more detailed discussion.

  18. Andy Lacis responds to Steve Koonin

    gregcharles - The Earth isn't in equilibrium right now; increasing ocean heat content (OHC) measures show that over the last 50 years we've averaged about a 0.6 W/m2 imbalance, meaning the Earth has been receiving about 240 W/m2 and radiating about 239.4 W/m2. The difference points to the (currently) unrealized warming due to thermal lag, primarily in the oceans.

    The difference between the near-blackbody IR radiation at the Earth's surface (~396 W/m2, IR emissivity about 0.95-0.98) and what's radiated to space (~240 W/m2) is entirely due to the radiative greenhouse effect - in essence, energy isn't emitted to space until much higher (and colder) altitudes due to greenhouse gases, making the effective emissivity of the Earth in IR around 0.61. The Earth's just not as effective a radiator as a bare rock would be, and with increasing GHGs it becomes even less effective - hence a higher and higher surface temperature is required to radiate the same energy to space.
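    A minimal sketch of where that ~0.61 figure comes from, using a 288 K global mean surface temperature and the ~240 W/m2 outgoing flux quoted above (both round numbers):

    ```python
    SIGMA = 5.670374e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
    T_SURFACE = 288.0     # K, approximate global mean surface temperature
    OLR = 240.0           # W/m^2, outgoing longwave radiation at top of atmosphere

    surface_emission = SIGMA * T_SURFACE ** 4      # ~390 W/m^2 for a blackbody at 288 K
    effective_emissivity = OLR / surface_emission  # ~0.61

    print(f"Blackbody emission at 288 K: {surface_emission:.0f} W/m^2")
    print(f"Effective IR emissivity:     {effective_emissivity:.2f}")
    ```

    This is also where the ~390 W/m2 figure asked about in the following comment comes from: it is simply the Stefan-Boltzmann emission of a blackbody at the 288 K mean surface temperature.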

  19. Andy Lacis responds to Steve Koonin

    I agree that 450ppm by the end of the century is an understatement. Everything I've seen suggests we'll reach that mark within the next 20 years.

    I have a couple of questions about the Fourier calculations. First, there is the global mean temperature of 288K implying 390 W/m2 of radiation, some going into space and some reflected back to earth by greenhouse gases. I'd like to know more about how that's calculated.

    I also don't fully understand the near-global energy balance used as the reason the earth absorbs 240 W/m2 of radiation from the sun and radiates the same amount out of the upper atmosphere. The earth isn't in equilibrium now, is it? We'd still warm for quite a while even if CO2 levels remained constant, right? Doesn't that imply we're absorbing more radiation from the sun than we're radiating back into space? I feel like I'm missing something obvious here.

  20. Andy Lacis responds to Steve Koonin

    Has anyone put the basic Arrhenius/Hulburt calculation in a spreadsheet?  Something where at the top you input your preferred CO2 level and at the bottom it tells you how much warmer, or cooler, that will be, in equilibrium, compared to pre-industrial?  I understand it means breaking up the atmosphere into 200-500 nodes, each of which absorbs sunlight, radiates/convects/etc.  It means assumptions are made about feedbacks (clouds, vegetation, ocean response, ice cap extent).

    My point is: equilibrium sensitivity, at the level of Arrhenius or Hulburt, is just a calculation, like 2+2=4 (indeed, for them, it was a calculation done without benefit of electronic calculators).  If you can put those assumptions/calculations in a spreadsheet for anyone to see and manipulate, it reinforces the notion that this is, at base, just Math (i.e., once the Physics and Chemistry have been codified into Math).  It's Math anyone sufficiently trained can do/see/appreciate.

    Hence, when someone says "Doubling CO2 is no big deal" you can send them the spreadsheet and say "identify the specific location where 'no big deal' comes out of this Math".  Without the spreadsheet you're left saying "It IS a big deal. My expert says so."  Leading to the response "My expert says it isn't."
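    Not the multi-node Arrhenius/Hulburt calculation described above, but a minimal zero-dimensional sketch of the kind of "input CO2 at the top, read warming at the bottom" logic being suggested, using the standard simplified CO2 forcing expression and an assumed equilibrium climate sensitivity (the 3 C per doubling value is illustrative, and all feedbacks are bundled into that single number):

    ```python
    import math

    def equilibrium_warming(co2_ppm, co2_preindustrial=280.0, ecs_per_doubling=3.0):
        """Crude equilibrium warming relative to pre-industrial.

        Forcing from the simplified expression dF = 5.35 * ln(C/C0) W/m^2,
        scaled by an assumed sensitivity of 3 C per doubling (3.7 W/m^2).
        """
        forcing = 5.35 * math.log(co2_ppm / co2_preindustrial)   # W/m^2
        return ecs_per_doubling * forcing / 3.7                  # degrees C

    for ppm in (280, 400, 450, 560, 800):
        print(f"{ppm:4d} ppm -> ~{equilibrium_warming(ppm):+.1f} C vs pre-industrial")
    ```

    It is nothing like a full radiative-convective model, but it does make the "where does 'no big deal' come out of this Math" challenge concrete.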

  21. Andy Lacis responds to Steve Koonin

    @1, The truth may be more along the lines of "a sustained level of 450 ppm CO2"...

    Note that Dr David Mills was on youtube years ago saying the 440ppm barrier is impossible not to break... he says it is(/was) still being worked out whether or not it was possible to go over the limit and then duck back under it, but the point I'm making is ---> 450 ppm is written in stone as reality!

  22. Andy Lacis responds to Steve Koonin

    Some housekeeping may be required, as this article is breaking the formatting of other posts on the main page (no doubt due to the preview ending in a block quote).

  23. Andy Lacis responds to Steve Koonin

    "They should also pay attention to the geological record that points to an atmospheric CO2 level of 450 ppm as being incompatible with polar ice caps..."

    That figure seems much too low. 450 ppm is the most common 'target limit' to avoid the worst impacts of AGW, but I've never seen a claim that we'd eventually lose the ice caps and have corresponding 70 meter sea level rise at 450 ppm.

  24. 2015 SkS Weekly News Roundup #15B

     "... usual media hyperbole is beginning to come down on the pro AGW side"

    I do love the term "pro AGW", as if it is something that many people would actually want to see happening. In common with many people, I think there is a causative relationship between ozone depletion and melanoma incidence rate. That doesn't mean I'm "pro CFCs", or that I'm "anti the Montreal Protocol".

    Terms such as "pro democracy" or "pro equal rights" do tend to convey the correct impression, but "pro AGW", I rather think not. 

    If you are genuinely concerned about perceived press bias, it might be educational for you to have a look at the article John Mason wrote a few weeks ago concerning the reporting of certain climate/weather related events. 

    cheers   bill f

    Moderator Response:

    [JH] "Pro AGW" is typically shorthand for "pro AGW science".

    I also presume that your admonition is directed to the journalist who wrote the article. If so, you should post your concerns on the comment thread of the article as posted on the Los Angeles Times website.  

  25. 2015 SkS Weekly News Roundup #15B

    mjp, the article clearly states:

    "California has seen droughts before with less rainfall..."

    How does that constitute 'sidestepping' the severity of past droughts?

    You cite the headline, but it doesn't say anything about precipitation or 'drought' in general. It says that heat records have been broken. Which is true.

  26. Global warming hiatus explained and it's not good news

    amhartley - I don't immediately find McGregor's source, but I think you can guess its context. You are usually interested in heat capacity in terms of storage of added energy. Add energy to the earth (eg increased solar output or additional GW), and its heat capacity is going to store that energy, slowing the temperature rise. Warming from solar doesn't penetrate far into land - a few meters (ask horizontal ground source heat pump installers). The sea is different, with deeper penetration, but more importantly convection carries heat deeper and the heat capacity is huge. Atmospheric storage is pretty minimal.
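    For a rough sense of scale, a minimal sketch comparing the total heat capacities of the ocean and the atmosphere (round-number masses and specific heats; this illustrates why the ocean dominates, not the exact 97% split quoted in the article, which also counts land and ice):

    ```python
    # Approximate totals, order-of-magnitude only.
    ocean_mass = 1.4e21    # kg
    ocean_cp = 4000.0      # J/(kg K), seawater (approx.)
    atmos_mass = 5.1e18    # kg
    atmos_cp = 1000.0      # J/(kg K), air at constant pressure (approx.)

    ocean_capacity = ocean_mass * ocean_cp
    atmos_capacity = atmos_mass * atmos_cp

    print(f"Ocean heat capacity:      {ocean_capacity:.1e} J/K")
    print(f"Atmosphere heat capacity: {atmos_capacity:.1e} J/K")
    print(f"Ratio:                    ~{ocean_capacity / atmos_capacity:.0f} to 1")
    ```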

  27. Global warming hiatus explained and it's not good news

    scaddenp@14, Tom@16,19

    Your discussion belongs under "Underwater volcanoes are warming the oceans". I went there to check what SkS has to say about your last sentences (thermal insulation of the ocean depths and the thermal gradient of ocean warming vs. volcanic/geothermal heat) & sadly, I found that this would-be very useful article is empty! What happened to it?

    If it's lost it should be brought back. Some mod could check it...

    Otherwise it should be filled in with the results of your discussion, together with the references Tom had provided. That article is important to debunk claims like that by Peter Carson. Anyone want to do that?

    Moderator Response:

    [DB] That's a topic article stub, earmarked for a future piece.

  28. 2015 SkS Weekly News Roundup #15B

    The completely rational story for the California drought is that worse droughts have occurred in the deep past and are possible again, but a warming climate will make a recurrence of similar circumstances worse.  Bit of a mouthful, I know.

    When used in the context of the drought I find it unhelpful to see headlines like "California's new era of heat destroys all previous records". 

    Denial arguments play on the science papers that show worse droughts have occurred in the past.  When science-based articles appear to sidestep that and not acknowledge it, it looks as if the science is being deceptive.  Doesn't help.

    But this is mainstream media.  I guess the upside is that usual media hyperbole is beginning to come down on the pro AGW side.

  29. Global warming hiatus explained and it's not good news

    "About 97 per cent of all the heat capacity of the Earth is in the ocean — that's where all the energy gets stored."

    Is that accurate? Should a qualifier accompany that 97%? For instance, is the denominator the earth's heat capacity within, say, 10km of the surface?

  30. Models are unreliable

    Tom Dayton @860, that Hotwhopper article is pretty damning of Roy Spencer's choices.  What it does not mention is that 1983 was massively affected by the El Chichon volcano (which shows up in the models), but that the effect in observed temperatures was cancelled, or more than cancelled, by the 1983 El Nino in the observational record, which by some measures was stronger than the 1998 El Nino.

    As ENSO fluctuations are random in time in the models, they do not coincide with observed fluctuations.  The consequence is that while the volcanic signal was obscured in the observations, it was not in the models and the discrepancy between models and observations in 1983 was not coincidence.  Nor was the greater relative temperature in UAH relative to HadCRUT4, as satellite temperature indexes respond more strongly to ENSO.

    Spencer knows these facts.  Therefore, his arbitrary choice of 1983 as the baseline year must count as deliberate deception.  He is knowingly lying with the data.

  31. Global warming hiatus explained and it's not good news

    Tom, I have just got to stop doing this when I get out of bed! You are correct of course.

  32. Timothy Chase at 08:43 AM on 13 April 2015
    2015 SkS Weekly News Roundup #15B

    Although the Bloomberg California heat chart does not appear in the tech paper, at least with respect to the unprecedented nature of the current drought, the Bloomberg article appears to be going off of:

    "... the 2012–2014 drought stands out in the context of the last millennium. In terms of cumulative severity, it is the worst drought on record (−14.55 cumulative PDSI), more extreme than longer (4 to 9 year) droughts. Considering only drought episodes defined by at least three consecutive years all lower than −2 PDSI, only three such events occur in the last 1200 years, and 2012–2014 is the most severe of these."

    Open Access: Griffin, Daniel, and Kevin J. Anchukaitis. "How unusual is the 2012–2014 California drought?" Geophysical Research Letters (2014).
    http://onlinelibrary.wiley.com/doi/10.1002/2014GL062433/epdf

    The tech article makes clear that the reduced precipitation is by no means unprecedented, but over the past three years the higher temperatures have resulted in higher rates of evaporation that have amplified the drought such that cumulative drought severity has been unprecedented.

  33. Global warming hiatus explained and it's not good news

    scaddenp @18:

    "Argh! Dont attempt this stuff when you are in massive hurry and barely awake."

    I know the feeling, and good advice ;)

    Running through the calculation, and using your figures we have:

    3 x 10^23 Joules / 0.84 Joules/Kelvin.gramme = 3.57 x 10^23 Kelvin.grammes

    3.57 x 10^23 Kelvin.grammes / 2790 Kelvin = 1.28 x 10^20 grammes

    1.28 x 10^20 grammes / 3 grammes/cm^3 = 4.267 x 10^19 cm^3

    4.267 x 10^19 cm^3 / 10^15 cm^3/km^3 = 42,670 km^3 of basalt cooled from "melting point" to 10 C to release the amount of energy accumulated in the ocean.  That in turn works out at approximately 970 km^3 per annum.  That compares to the ~30 km^3 per annum of magma estimated to be deposited globally.
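    The same arithmetic with units tracked explicitly, as a minimal sketch (all figures are the ones quoted in this exchange; the ~44 year averaging period assumes the "since 1970" interval mentioned below):

    ```python
    # Volume of basalt that must cool by 2790 K to release 3 x 10^23 J.
    energy = 3e23          # J, approximate ocean heat accumulation since 1970
    specific_heat = 0.84   # J/(g K), basalt (figure used above)
    delta_t = 2790.0       # K, assumed cooling from "melting point" to ~10 C
    density = 3.0          # g/cm^3
    cm3_per_km3 = 1e15

    mass_g = energy / (specific_heat * delta_t)    # ~1.28e20 g
    volume_km3 = mass_g / density / cm3_per_km3    # ~4.3e4 km^3
    per_year_km3 = volume_km3 / 44                 # spread over ~44 years

    print(f"Mass cooled:   {mass_g:.2e} g")
    print(f"Volume cooled: {volume_km3:,.0f} km^3 (~{per_year_km3:,.0f} km^3 per year)")
    ```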

    You will notice the major difference between my and your working is that I divide by temperature rather than multiply, the division being necessary to get the correct units.

    As an estimate, the 3800 K magma temperature is absurdly high relative to recognized values (which are closer to 1300 K from wikipedia, National Geographic, and a couple of scientific articles I read).  However, you did not include the heat of fusion, nor the difference in heat capacity between magma and basalt.

    Using the values from the worked example @16, we would need an additional flow of 50,025 km^3 of magma at the ocean floor (ie, not mere crust formation) to account for the increase in OHC since 1970.  Even that, however, would not account for the problem of thermal insulation of the ocean depths, nor explain why ocean temperature decreases with depth rather than increasing with depth, as would be required if the major source of surface ocean heat were the ocean floor.

  34. Models are unreliable

    Rhoowl:

    "Phil computers do have rounding errors, iteration problems with real numbers."

    Indeed they do, which is why careful climate model programmers analyse their programs to ensure such errors are constrained, and the models (or their component parts) are tested to ensure they do not exert an undue influence. If they did, it would be obvious - a modelling program that produces results unduly influenced by rounding errors would give widely different results with very small changes to the input.

    "Input has fudge factors. I use fudge factors all the time when modeling. I enter objects that can't possibly exist just to make the program work..."

    Whatever you may do, it does not follow that climate modellers do it too.

  35. Global warming hiatus explained and it's not good news

    Argh! Don't attempt this stuff when you are in a massive hurry and barely awake. I had 30E22 in my head from looking at the OHC graph and transcribed it into the exponent too. However, I didn't make the typo when entering it into the calculator, and 3E11 is still the cubic kilometers. As Tom said, this would have to be very thin to transfer that much energy to the ocean. The main point stands: it is totally unrealistic to blame undersea volcanoes for GW.

  36. Models are unreliable

    JH, I wish to reply to your comment, and since it is off topic and more personal in nature, we should do this privately. My email is rhoowl at yahoo

    Moderator Response:

    [JH] Your request has been duly noted.

  37. Models are unreliable

    Rhoowl:  Your reply to my comment was not on the topic I had explained--differences between models.  Since you either will not or cannot focus on a topic long enough to have an actual conversation, I'm giving up on you.

  38. The history of emissions and the Great Acceleration

    Thanks, sidd and andyskuce.

    Howard Lee indicates that would be 340 Gt of carbon, not CO2, for land use changes from prehistoric times to pre-industrial times.

    As such, cumulatively, land use change dwarfs all the other contributors, which bodes well for the potential of land use changes to draw down CO2 in the near future, if we put our hearts and minds to it.

  39. Models are unreliable

    Rhoowl:  Spencer followed up his claim that you linked with another claim, this time about "90 models", but it is likewise severely flawed. Hotwhopper clearly explained Spencer's biggest...um, "mistake"...of playing fast and loose with baselines. There is also the issue of Spencer falsely giving the impression that the RSS and UAH satellite trends for the tropics are consistent, when in fact the UAH trend for the tropics is three times lower than the RSS trend, and recently RSS has been shown to be correct in the tropics and UAH wrong.

  40. 2015 SkS Weekly News Roundup #15B

    @ PC

    Ah, you have unerringly spotted the fingerprint of the 66 year cycle, which proves that it was the PDO "wot dunnit".

  41. PhilippeChantreau at 02:11 AM on 13 April 2015
    2015 SkS Weekly News Roundup #15B

    Here's a funny thought: if I were to use fake skeptic methods with the NOAA chart of California's temps, I would cherry-pick 1949 as the starting point and calculate the "trend" to the present...

  42. Models are unreliable

    robp, Dr Roy Spencer has also reviewed this; his conclusions don't agree with your graph. Tristan, Dr Spencer is a climate scientist.

    http://www.drroyspencer.com/2013/06/epic-fail-73-climate-models-vs-observations-for-tropical-tropospheric-temperature/

    Td, I have already agreed that the scenario spread was not a model error... there are errors in the models. I have also read the intermediate blog. What you pointed to as verification was a review of past ENSO from only 18 climate models. What about the other climate models? This appears to be a weak verification. He only matched the trend and not absolute values. I reviewed Steve Easterbrook's material. Much of what he professes is that scientists need to speak to the public in general terms so it is more understandable. Much of what he said didn't address the issues I am presenting.

    leto, I never claimed that the modelers do not have skill or infallibility.

    Quite the opposite, actually. The losing argument was started by someone else previously. Perhaps this was out of line.

    Phil, computers do have rounding errors and iteration problems with real numbers. Input has fudge factors. I use fudge factors all the time when modeling. I enter objects that can't possibly exist just to make the program work...

    After further reading about the water-CO2 interaction, it became clear that a grid resolution of 100 km x 100 km is too coarse to properly model the cloud-CO2 interaction. The material is too anisotropic for that resolution. Zhou, Zhang, Bao and Liu wrote a paper suggesting that the grid resolution be 1 x 1 mm to properly model turbulence in the atmosphere. Obviously this would be an impossible task.

    Moderator Response:

    [JH] Either English is not your first language, or you do not take the time to proof read what you have keyed in prior to hitting the "Submit" button. Either way, parts of your comment are nonsensical. In addition, some of your statements insult the intelligence of other commenters. If you keep going down this path, your future postings may be summarily deleted.

  43. Global warming hiatus explained and it's not good news

    3 x 10^30 joules?????

    As scaddenp refers to 3x10^23 J in comment #14, then 3x10^30 J in #15, I think we can safely assume there has been a typo.

    Unless, of course, there was a sudden increase in OHC of 2.9999997x10^30 joules in the space of approximately 22 minutes. (In which case, I think we would have noticed.)

    Glad to see I'm not the only one that does typos.

     

    cheers  bill f     ;-)

  44. Models are unreliable

    And two more ...

    3. Hoffmann seems to think that global temperatures are inputs to GCMs. This is just factually wrong.

    4. He makes the usual "denier" mistake of equating the atmospheric temperature record with the "global" temperature record (i.e. he ignores 93% of the energy imbalance).

  45. Models are unreliable

    @851

    The blog post by Dr Hoffmann is wrong in so many ways. Here are just two points from it.

    1. The illustration of rounding errors in computer programs would only be relevant if the errors had a systematic bias (i.e. they all rounded up or all rounded down). As Hoffmann's output shows, they don't; the rounding errors are randomly signed and therefore tend to cancel each other out, both within an individual run and between runs (a toy illustration of this follows after point 2).

    2. The discussion about modelling individual molecules in the atmosphere/planet is ludicrous; bulk matter has well defined properties that can be determined experimentally and used in a model without recourse to modelling individual molecules. We didn't know, or model, the individual atoms of the Apollo 11 space rocket, but that didn't affect our ability to predict its behaviour.
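    Regarding point 1, a minimal sketch of why randomly signed rounding errors wash out rather than accumulate (a pure toy illustration, not anything resembling a climate model):

    ```python
    import random

    random.seed(0)
    n_steps = 1_000_000
    eps = 1e-12   # size of a single rounding error

    # With a systematic bias the error grows like n*eps; with random signs it
    # grows only like sqrt(n)*eps.
    accumulated = sum(random.choice((-eps, eps)) for _ in range(n_steps))

    print(f"Systematic bias after {n_steps} steps: {n_steps * eps:.1e}")
    print(f"Random-sign accumulation:              {abs(accumulated):.1e}")
    print(f"Expected random-walk scale:            {n_steps ** 0.5 * eps:.1e}")
    ```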

  46. The global warming 'pause' is more politics than science

    Here's my analysis based on an LSQ fit to an RSS graph I saw on the WUWT site (unclear to me whether they were saying it was Monckton's or Knappenberger-Michaels' LSQ fit):

    1996/10-2014/10 = 0.0 degrees/century (18 years 1 mo. = 217 months)

    WUWT Monckton(?) LSQ fit, with the comment (I paraphrase) "exactly zero warming".

    1999/02-2014/10 = 1.2 degrees/ century (15 years 9 mo. = 189 months)

    *more recent* my well-estimated eyeballed fit *on the very same WUWT graph* for this later period.

    Thus, Monckton (I think it is) shows clearly that "global warming" increased just 16 years ago.
    Not only that but the fun part now: 1.2 degrees later warming trend / 0.0 degree earlier warming trend = infinity so the Monckton (?) analysis *shows an infinite increase of "global warming" just 16 years ago*.

    I'm like that other bunch, Monckton and that, I hugely prefer fun & games to actual work.

     

  47. The history of emissions and the Great Acceleration

    In particular, table 3 in Ruddiman (2013) estimates 320-343 gigatons of pre-1850 carbon emissions.

  48. The history of emissions and the Great Acceleration

    sailesh rao asked:

    "What is an estimate of the CO2 emissions due to land use changes from the start of the agricultural revolution, say 8000BC to 1750?"

    See Ruddiman(2013) doi:10.1146/annurev-earth-050212-123944 or his recent book "Earth Transformed"

    sidd

  49. Models are unreliable

    Rhoowl wrote,

    "you have to understand the the people who are writing and operating the programs for computer models are not climate scientists."

    Producing a useful model of anything is 99% based on understanding the domain you are modelling, and 1% putting some code together. The idea that a non-climatologist who knows about programming is particularly well-positioned to comment on the success or otherwise of a climate model is nonsense. The idea that climate science has a lack of intelligent people versed in both the necessary domain knowledge and the coding skill is also nonsense. Sure, you mustn't assume that the climate modellers are infallible, but your starting assumption should be that the people trying to educate you on this site know much more about this than you or some programmer.

    "btw..you only win the argument if you convince the other person that they are wrong."

    This is probably the silliest comment I have ever read on this site. For a start, you are wrong if you see this exchange as a contest people are trying to win. The people responding to you are trying to educate you, and if you refuse to be educated that is a reflection on you, not on the validity of their responses. I see no evidence that anyone has failed to understand your points (which have all been discussed before anyway), but I see plenty of evidence that you have not actually stopped to consider what you are being told.  Remaining stubbornly ignorant and then calling that result a win or a draw is simply foolish.

  50. Global warming hiatus explained and it's not good news

    scaddenp @15, I do not understand why you are using 3 x 10^30 Joules as your target, given that it is 10 million times the heat increase in the ocean since 1970.

    Anyway, in trying to check your numbers I came across a worked example by Assoc. Prof. Leslie Sonder at Dartmouth College.  This area falls close enough to her area of specialization that I suspect she has made no blunders in the basic calculation, but I am quite happy for others to point out blunders so that we can correct the example.

    In any event, she calculates that 2 x 10^13 Kg per annum of new crust is formed by mid-oceanic ridges.  That represents 6.67 x 10^9 cubic meters, or 6.67 cubic kilometers, or 22.23% of estimated global magmatic deposition.

    She also calculates an energy release of 4 x 10^19 Joules per annum, or less than 0.008% of the average annual energy accumulation in the ocean since 1970.
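    A minimal sketch of the unit conversions behind those two crust figures (the 3 g/cm^3 density matches the value used earlier in the thread; it is an assumption for this illustration):

    ```python
    # New oceanic crust formed per year at mid-ocean ridges, from the worked example above.
    mass_per_year = 2e13   # kg of new crust per annum
    density = 3000.0       # kg/m^3, typical basalt/gabbro (assumed)
    global_magma = 30.0    # km^3 per annum of magma deposited globally (figure from the thread)

    volume_km3 = mass_per_year / density / 1e9   # m^3 -> km^3

    print(f"New crust: {volume_km3:.2f} km^3/yr "
          f"({volume_km3 / global_magma:.1%} of global magmatic deposition)")
    ```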

    Several things should be noted about this.  First, the magma deposition as new crust in a given year does not all occur at the center of the mid-ocean ridge.  Rather, extrusions form pillow basalts at the surface (0.5 km thick according to wikipedia), which, because they cool in water, cool rapidly.  Below that, however, are sheeted dike complexes (1.5 km thick) which cool beneath the layer of pillowed basalt, and hence slowly.  Beneath that again are gabbro and layered ultramafic rocks (5 km thick) which, because of their depth below the sea floor, cool very slowly.  The process of formation appears like this:

     

    (Source)

    Because the vast majority of the solidified magma is not at the surface, it cools slowly, releasing its heat gradually over time.  This does not mean less heat is released in any given year, because heat is still being released from previous years.  It does mean the heat is not all released at the center of the mid-ocean ridge by volcanism.  The vast majority of it is released later by diffusion through the sea floor.  Hence the wide bands of increased geothermal heat surrounding each mid-ocean ridge.

    It also means that the majority of the rock does not cool to abyssal water temperatures.  Indeed, the rock immediately above the mantle is near the melting point of the rock, with temperatures declining approximately linearly as it approaches the ocean floor.  In other words, Sonder's estimate is likely an overestimate.  Put another way, 0.008% of recent annual OHC increase is an upper limit of the heat released by cooling magmas at the mid-oceanic ridges.  Further, the process of that release ensures that it is near constant over time so that it cannot be a significant contributor to any change in OHC.
