Ten Things I Learned in the Climate Lab

Posted on 23 June 2012 by climatesight

This is a re-post from ClimateSight.

  • Scientists do not blindly trust their own models of global warming. In fact, nobody is more aware of a model’s specific weaknesses than the developers themselves. Most of our time is spent comparing model output to observations, searching for discrepancies, and hunting down bugs.
     
  • If 1.5 C global warming above preindustrial temperatures really does represent the threshold for “dangerous climate change” (rather than 2 C, as some have argued), then we’re in trouble. Stabilizing global temperatures at this level isn’t just climatically difficult, it’s also mathematically difficult. Given current global temperatures, and their current rate of change, it’s nearly impossible to smoothly extend the curve to stabilize at 1.5 C without overshooting.
     
  • Sometimes computers do weird things. Some bugs appear for the most illogical reasons (last week, the act of declaring a variable altered every single metric of the model output). Other bugs show up once, then disappear before you can track down the source, and you’re never able to reproduce them. It’s not uncommon to fix a problem without ever understanding why the problem occurred in the first place.
     
  • For anyone working with climate model output, one of the best tools to have in your arsenal is the combination of IDL and NetCDF. Hardly an hour of work goes by when I don’t use one or both of these programming tools in some way.
     
  • Developing model code for the first time is a lot like moving to a new city. At first you wander around aimlessly, clutching your map and hesitantly asking for directions. Then you begin to recognize street names and orient yourself around landmarks. Eventually you’re considered a resident of the city, as your little house is there on the map with your name on it. You feel inordinately proud of the fact that you managed to build that house without burning the entire city down in the process.
     
  • The RCP 8.5 scenario is really, really scary. Looking at the output from that experiment is enough to give me a stomachache. Let’s just not let that scenario happen, okay?
     
  • It’s entirely possible to get up in the morning and just decide to be enthusiastic about your work. You don’t have to pretend, or lie to yourself – all you do is consciously choose to revel in the interesting discoveries, and to view your setbacks as challenges rather than chores. It works really well, and everything is easier and more fun as a result.
     
     
  • Climate models are fabulous experimental subjects. If you run the UVic model twice with the same code, data, options, and initial state, you get exactly the same results. (I’m not sure if this holds for more complex GCMs which include elements of random weather variation.) For this reason, if you change one factor, you can be sure that the model is reacting only to that factor. Control runs are completely free of external influences, and deconstructing confounding variables is only a matter of CPU time. Most experimental scientists don’t have this element of perfection in their subjects – it makes me feel very lucky.
     
  • The permafrost is in big trouble, and scientists are remarkably calm about it.
     
  • Tasks that seem impossible at first glance are often second nature by the end of the day. No bug lasts forever, and no problem goes unsolved if you exert enough effort.
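
The determinism point above (two runs of the UVic model with identical code, data, options, and initial state agree exactly) can be illustrated with a toy sketch in Python. This is not the UVic model, just a hypothetical stand-in: a seeded random walk plays the role of internal variability, so two identical runs agree bit-for-bit, and the difference between a control and a perturbed run isolates the response to the one factor changed.

```python
import random

def toy_run(forcing, seed=42, steps=100):
    """Toy stand-in for a climate model run (not the UVic model):
    a 'temperature' random walk nudged each step by a constant forcing."""
    rng = random.Random(seed)   # fixed seed: the 'weather' noise is reproducible
    temp = 0.0
    for _ in range(steps):
        temp += forcing + rng.gauss(0, 0.1)
    return temp

# Same code, same data, same seed: the two control runs agree exactly.
control_a = toy_run(forcing=0.0)
control_b = toy_run(forcing=0.0)
assert control_a == control_b

# Change one factor, and any difference is the response to that factor alone:
# the noise sequence is identical, so the runs differ by ~0.02 * 100 = 2.0.
response = toy_run(forcing=0.02) - control_a
print(round(response, 6))  # → 2.0
```

Real GCMs with stochastic weather components may not share this bit-for-bit reproducibility, as the bullet above notes, but the control-versus-perturbation logic is the same.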


Comments

Comments 1 to 11:

  1. It is indeed the permafrost "issue" that worries me the most: I have a few colleagues who study this, and if the decomposition of permafrost, and of clathrates, continues and/or accelerates, my tummy begins to ache a bit too.

    I'm a scientist: Hope springs eternal.
  2. It's nice to hear from the troops in the trenches! A lot of people could learn a lot from hearing about what it's like to actually work with climate models (as opposed to the vacuous, echo-chamber pseudo-understanding that most people have of climate models).

    I look forward to seeing more posts (with maybe more detail, and an anecdote or two) like this one.
  3. As I understand RCP 8.5, it is based on burning pretty well all of the globe's recoverable fossil fuels.

    Speaking from an Australian perspective (I am in the State of Queensland which has a lot of coal and CSG), burning all our fossil fuels irrespective of whether CCS is ever viable is pretty well the accepted paradigm from government, industry and the community.

    We (i.e. the State of Queensland) have something like 150 years of coal at current production rates (of 200 Mt/yr) and some truly enormous new mines being proposed, plus enormous CSG development. We plan to dig it all up and allow it to be burnt here or overseas.

    Our national and state governments, not to mention the mining and CSG industries, are fully committed to this future.

    While we will soon have a small price on direct carbon emissions in Australia (although no price on emissions from coal and CSG exports), our main strategy seems to be to postpone any major reduction in emissions for decades (we have a target of an 80% reduction by 2050, which punts it well down the track for future governments to deal with).

    With great sadness I think that, based on the current political discourse in Australia and looking at what is happening globally, at present RCP 8.5 is a realistic future.
  4. "With great sadness I think that, based on the current political discourse in Australia and looking at what is happening globally, at present RCP 8.5 is a realistic future."

    You may be right, Chris, but it isn't really a future in which humankind will actually survive beyond small pockets.

    Indeed, 2 °C isn't much rosier unless we start planning for it with an adaptation transformation process (i.e. adaptation, mitigation, and societal and economic transformation) starting more or less now, remembering that 2 °C by 2100 means getting to 350 ppm by 2100 if the Pliocene data is correct.

    But an adaptation transformation process involves basically stopping the use of fossil fuels within 5 years (and all that implies), and that is a very tight carbon budget (no room for spending thousands of tonnes of CO2e on renewables or nuclear). That seems impossible, for it means using a lot less power, and intermittent power for certain.

    Therefore I think you might be right, Chris. Despite this pre-knowledge, it seems humankind at present would rather deny what a global mass depopulation actually means in terms of process (i.e. war, starvation, disease and widespread death) and will keep burning fossil fuels; but let's hope not.
  5. Some think that RCP 8.5 is not realistic, because humans won't let it become that bad. I guess that's true, in theory; however what concerns me is lag. Already we've got people like James Lovelock saying that it's all happening slower than expected.

    The big danger -- and I base this on the track record to date -- is that our scientifically-illiterate politicians (for, let's face it, that's what they tend to be), encouraged by the short-termism of economists and the masses, might set us unwittingly on a trajectory for 8.5 and not realise where we're headed until it's too late to do anything about it. To me that's the big worry.
  6. How hard is it to set up a model with two grid patterns, one with inputs at the grid scale available from commonly available data, and one at very fine scale, say in grid cells of 10 meters instead of tens of kilometers? Is that doable? How many people would you need? How big a system? How long to build a working model? How long to test it and make it reliable?
  7. pluvial - what do you hope this will achieve? I don't work on climate code, but for the models I work with the immediate issues would be:
    1/ Halve the resolution and you double the processing time. This limits what we can do with our models. Resolution has improved as the number of CPUs goes up and they get faster.
    2/ The numerical method being used may have issues. It's little help to understanding reality if fine scales simply amplify rounding error in copying boundary conditions to nodes.
    3/ Parameterization is often done to account for subscale processes. Increasing the resolution should theoretically allow you to eliminate this, but only by directly modelling those processes. Doing so is likely to be at least as complex as the large-scale models, massively increasing CPU time requirements. And if you don't model those processes, you potentially lose the reason for the smaller scale in the first place.
  8. PluviAL,

    Like scaddenp asks... what is the point of your questions?

    Climate models often work with lots of grid sizes (different for ocean, land, atmosphere). The world is also three-dimensional. Grid choices are made primarily for execution time (twice as many cells in the grid along its width and height means four times as many calculations, four times as many means 16 times the calculations, etc.).

    To go to a scale of 10 meters instead of 10 km, you have 1,000 times as many cells across, and 1,000,000 times as many cells in total. Climate models don't even run at scales of 10 km; more like hundreds of kilometers, so your scale change would blow things way out of the water. You're not likely to find a computer on Earth that could even get things done at the 10 km scale (although that's really not necessary, either).

    Sometimes what you are looking at doesn't require better resolution, and can even be confounded by it (it requires even more complex and detailed modeling of physical processes to resolve the interaction at the higher resolution -- things that can easily be dispensed with at larger resolutions).

    The CMMAP project is particularly interesting. One of the great problems in climate models is cloud behavior, because the scales needed to properly model clouds are far too small to be performed efficiently, and most climate attributes do not need that small scale. Their approach is to model the climate on an achievable scale, and to model clouds for just one small grid cell within the larger cell, and then to apply that result throughout the cell (effectively assuming that their single result will apply, on average, throughout the larger cell).
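
The grid arithmetic in this comment is easy to check with a short sketch (the 1000 km square domain and single 2-D layer are hypothetical numbers, chosen only to make the ratio concrete):

```python
def cells_2d(domain_km, resolution_m):
    """Number of grid cells in one square 2-D layer at the given grid spacing."""
    cells_across = domain_km * 1000 // resolution_m
    return cells_across ** 2

coarse = cells_2d(domain_km=1000, resolution_m=10_000)  # 10 km cells
fine = cells_2d(domain_km=1000, resolution_m=10)        # 10 m cells

# A 1,000x finer grid means 1,000,000x the cells per layer, before even
# counting vertical levels or the shorter time step a finer grid requires.
print(fine // coarse)  # → 1000000
```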
  9. Actually, PluviAL's question isn't that unreasonable. Do a Google search on "Regional Climate Model": there has been a lot of work on using a fine-mesh local grid in the region of interest (say, the continental U.S.) embedded in a coarser-grid GCM.
  10. Which means the question can't be answered until pluvial tells us more. Embedding a WRF model into a GCM is a cool idea for resolving processes.
  11. Here's an example of coupling regional and global models: http://www.clim-past.net/8/25/2012/cp-8-25-2012.pdf. One advantage is being able to downscale global model output to look at local effects (when looking into the future). There was an article here about the future of the Los Angeles basin using that technique.

    I thought this 2004 paper explained the advantages of combining high- and low-res models pretty well: NS Diffenbaugh, LC Sloan - Journal of Climate, 2004. A key advantage is the availability of high-res regional data, so it may not seem as applicable to the distant past. However, I believe the regional models can still be validated against similar modern climate data (although I don't have a reference for that), and that would provide valuable input to the global model.

    My view is that although the papers mainly talk about the advantages of being able to handle topographic complexity in the regional models, the weather being simulated is often an example of "topographical complexity" without topography. That's because the lifting in a front, although not fixed and unyielding like a mountain chain, has many of the same small-scale effects noted as important in the paper.




© Copyright 2014 John Cook