
Andy Skuce

Andy Skuce is a recently retired geophysical consultant living in British Columbia. He has a BSc in geology from Sheffield University and an MSc in geophysics from the University of Leeds. His work experience includes a period at the British Geological Survey in Edinburgh and work for a variety of oil companies based in Calgary, Vienna and Quito. From 2005 until his retirement he worked as an independent consultant. Andy has published a handful of papers on tectonics and structural geology, which can be viewed here. He described how his views on climate change evolved in this blog post.

Follow Andy on Twitter @andyskuce and on his blog Critical Angle


Google Scholar profile.


Recent blog posts

B.C. lowballing fugitive methane emissions from natural gas industry

Posted on 8 October 2015 by Andy Skuce

This article was first published in the Corporate Knights Magazine.

There is a supplementary article at my blog Critical Angle that has more detail, links and references, along with an estimate of the GHG emissions (excluding end-use) associated with one liquefied natural gas project and the effect this will have on the feasibility of B.C. reaching its emissions targets.

There is a further piece at DeSmog Canada, where I compare the situation in B.C.'s gas industry to the Volkswagen emissions scandal, in which a corporation cheated on its emissions tests with the tacit approval of industry-friendly regulators and governments, only to be exposed by independent researchers performing tests under real-world conditions.

The push by British Columbia to develop a new liquefied natural gas (LNG) export industry raises questions about the impact such activities would have on greenhouse gas emissions, both within the province and globally.

One of the most important factors is the amount of methane and carbon dioxide that gets released into the atmosphere, either deliberately through venting or accidentally as so-called fugitive emissions. Fugitive emissions come partly from valves and meters that release, by design, small quantities of gas, but they can also come from faulty equipment and from operators who fail to follow regulations.

Photo by Jesús Rodríguez Fernández (creative commons)

According to the B.C. Greenhouse Gas Inventory Report 2012, there were 78,000 tonnes of fugitive methane emissions from the oil and natural gas industry that year. B.C. produced 41 billion cubic metres of gas in 2012. This means about 0.28 per cent of the gas produced was released into the atmosphere.

By North American standards, this is a very low estimate. The U.S. Environmental Protection Agency (EPA) uses a figure of 1.5 per cent leakage, more than five times higher. Recent research led by the U.S. non-profit group Environmental Defense Fund (EDF) shows that even the EPA estimate may be too low by a factor of 1.5. B.C.'s estimate, in other words, would be about one-eighth of what has been estimated for the American gas industry.

Although the amounts of methane released are small compared to carbon dioxide emissions, methane matters because it packs a much bigger global warming punch. Determining the effect of methane emissions is complicated because molecules of methane only last in the atmosphere for a decade or so and the warming effect from its release depends on the time interval it is measured over. Compared to a given mass of carbon dioxide, the same mass of methane will produce 34 times as much warming over 100 years, or 86 times as much over 20 years.
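As a rough back-of-envelope check on the figures above, here is a short Python sketch that reproduces the 0.28 per cent leakage rate and puts the fugitive methane into CO2-equivalent terms. The methane density is an assumed round number, and the calculation treats the produced gas as if it were pure methane, which is a simplification.

```python
# Back-of-envelope reproduction of the leakage figures quoted above.
METHANE_DENSITY = 0.68      # kg per cubic metre at standard conditions (assumed)
GWP_100, GWP_20 = 34, 86    # IPCC AR5 global warming potentials for methane

fugitive_ch4_t = 78_000     # tonnes of fugitive methane, B.C. inventory for 2012
gas_produced_m3 = 41e9      # cubic metres of gas produced in B.C. in 2012

leaked_m3 = fugitive_ch4_t * 1000 / METHANE_DENSITY   # treating produced gas as methane
leak_rate = leaked_m3 / gas_produced_m3
print(f"Implied leakage rate: {leak_rate:.2%}")                      # ~0.28%
print(f"EPA figure vs B.C.: {0.015 / leak_rate:.1f}x higher")        # ~5x
print(f"EDF-adjusted EPA vs B.C.: {0.015 * 1.5 / leak_rate:.1f}x")   # ~8x

# CO2-equivalent warming impact of the fugitive methane
print(f"CO2e over 100 years: {fugitive_ch4_t * GWP_100 / 1e6:.1f} Mt")
print(f"CO2e over 20 years:  {fugitive_ch4_t * GWP_20 / 1e6:.1f} Mt")
```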



Are we overestimating our global carbon budget?

Posted on 15 July 2015 by Andy Skuce

The latest research suggests that natural sinks of carbon on land may be slowing or even turning into sources, creating climate consequences potentially worse than first thought.

Nature has provided humans with a buffer against the worst effects of our carbon pollution. Since 1750, we have emitted about 580 billion tonnes of carbon into the atmosphere by burning fossil fuels, cutting down forests and making cement. If those emissions had simply accumulated in the air, the concentration of carbon dioxide would have increased from 280 parts per million (ppm), as it was before the Industrial Revolution, to about 550 ppm today. Instead, we currently measure around 400 ppm, which is still a whopping 40 per cent above the planet’s pre-industrial atmosphere, but much less than a doubling.

Some 60 per cent of our emissions have been taken up by natural sinks, in roughly equal parts by dissolving into the ocean and by being absorbed by plants growing faster on land. Were it not for these natural carbon sinks, we would by now be much deeper into dangerous climate change.
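A quick Python sketch of that arithmetic, using the common approximation that 1 ppm of atmospheric CO2 corresponds to about 2.13 GtC (an assumed conversion factor, not a number taken from the article):

```python
# Back-of-envelope check of the numbers in the text.
GTC_PER_PPM = 2.13           # assumed conversion: 1 ppm CO2 ~ 2.13 GtC

emissions_gtc = 580          # cumulative emissions since 1750, from the text
preindustrial_ppm = 280
observed_ppm = 400

no_sink_ppm = preindustrial_ppm + emissions_gtc / GTC_PER_PPM
print(f"CO2 if all emissions had stayed airborne: ~{no_sink_ppm:.0f} ppm")   # ~550 ppm

airborne = (observed_ppm - preindustrial_ppm) / (no_sink_ppm - preindustrial_ppm)
print(f"Fraction remaining in the air: {airborne:.0%}")            # ~44%
print(f"Fraction absorbed by natural sinks: {1 - airborne:.0%}")   # ~56%, i.e. 'some 60 per cent'
```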

As we continue to burn fossil fuels, our climate troubles will become worse should those sinks start to falter. And the outlook will be worse still if those sinks turn into sources of carbon. 

New research

According to the latest research, the carbon sink on land is unfortunately starting to show signs of trouble. Instead of providing a brake on human emissions, the land carbon sink could instead soon be giving our emissions a boost. (The ocean sink appears to be relatively safe for now, although there is a price to pay: ocean acidification, the consequence of carbon dioxide dissolving into seawater, has been called climate change’s evil twin, with its own non-climate-related consequences for marine life.)

Plants don’t thrive solely on carbon dioxide.



Carbon cycle feedbacks and the worst-case greenhouse gas pathway

Posted on 7 July 2015 by Andy Skuce

The worst-case emissions pathway, RCP8.5, is a scenario that burns a huge amount of fossil fuels, especially coal. The scenario has sometimes been criticized as implausible because of its huge resource consumption and emissions of ~1700 billion tonnes of carbon (GtC) over the century. Those emissions are based in part on carbon-cycle model assumptions, which recent work suggests may be too optimistic. New research shows that future plant growth may be restricted by nutrient availability, turning the land carbon sink into a source. Permafrost feedbacks (not considered in the IPCC CMIP5 models) may also add significant emissions to the atmosphere under the RCP8.5 pathway. In addition, the latest research on the Amazon Basin reveals that the tropical forest carbon sinks there may already be diminishing. Together, these feedbacks suggest that the greenhouse gas concentrations in the RCP8.5 case could be reached with ~400 GtC smaller human emissions, making the RCP8.5 worst-case scenario more plausible.

The climate models referred to in the recent IPCC Fifth Assessment Report (AR5) are founded on one of four Representative Concentration Pathways, or RCPs. The key word in RCP is concentration. In the RCPs, the concentration of greenhouse gases is prescribed at different times in the future and the climate model (or general circulation model, GCM) uses those atmospheric concentrations to calculate future climate states. Underpinning the concentration pathways are socio-economic and emissions scenarios, and there can be more than one underlying emissions scenario capable of producing a given concentration pathway.

If you are unfamiliar with RCPs, check out the great guide that Graham Wayne wrote in August 2013 for Skeptical Science.

This way of modelling differs from previous approaches in which the starting point was a story or scenario about economic and social development that led to emissions. These emissions are run through a carbon-cycle model (which may be simple or complex) to produce atmospheric concentrations over time. 

The schematic illustrates the differences in approach. The elements in red boxes are the prescribed inputs into the models, whereas the elements in blue ellipses are outputs. The advantage of the RCP prescribed-concentration approach is that the climate model outputs do not depend to the same degree on carbon-cycle models as they did in the emissions scenario method. The disadvantage is that there is no unique link between concentrations and emissions. The schematic is simplified in that there are feedbacks and loops in the processes that are not illustrated. 
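To make the distinction concrete, here is a deliberately toy Python sketch of the older, emissions-driven chain: a trivial stand-in for a carbon-cycle model (a constant airborne fraction) converts an emissions scenario into the concentration pathway a GCM would then consume. In the RCP approach that concentration pathway is prescribed directly, so the climate model no longer inherits the carbon-cycle assumptions. All the numbers below are illustrative assumptions, not values from any real scenario or model.

```python
# Toy illustration of the emissions-driven approach: emissions -> (simple
# carbon-cycle step) -> concentrations. In the RCP approach the output of
# this step is prescribed instead of calculated.
GTC_PER_PPM = 2.13
AIRBORNE_FRACTION = 0.45     # assumed constant; real carbon-cycle models vary this

def concentrations_from_emissions(annual_emissions_gtc, start_ppm=400.0):
    """Accumulate an annual emissions pathway (GtC/yr) into a CO2 concentration pathway (ppm)."""
    ppm = start_ppm
    pathway = []
    for e in annual_emissions_gtc:
        ppm += AIRBORNE_FRACTION * e / GTC_PER_PPM
        pathway.append(round(ppm, 1))
    return pathway

# A made-up pathway: emissions rising from 10 to ~28 GtC/yr over 90 years
emissions = [10 + 0.2 * year for year in range(90)]
print(concentrations_from_emissions(emissions)[-1])   # final-year concentration, ppm
```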

The worst-case scenario among the four Representative Concentration Pathways (RCPs) is known as RCP8.5. The number “8.5” refers to the radiative forcing level, measured in W/m2, in the year 2100. RCP8.5, despite often being called “business-as-usual”, has been criticized as an unlikely outcome. That may be true, but it is more feature than bug: as one of the two extreme pathways, it is designed to provide climate modellers with an unlikely, but still just plausible, “how bad could it be” scenario.

Let’s look briefly at some of the underlying socio-economic assumptions behind RCP8.5, then we’ll examine how the latest research on the terrestrial carbon cycle makes the GHG concentrations in the RCP8.5 model easier to reach.



Why the 97 per cent consensus on climate change still gets challenged

Posted on 18 May 2015 by Andy Skuce

Here are some excerpts from an article I wrote for the magazine Corporate Knights, published on May 14, 2015. Some references and links have been added at the end.

In 2004, science historian Naomi Oreskes published a short paper in the journal Science concluding there was an overwhelming consensus in the scientific literature that global warming was caused by humans.

After the paper’s release, there was some unexpectedly hostile reaction. This prompted Oreskes and her colleague Erik Conway to go even deeper with their research, leading to the publication of the book Merchants of Doubt. It documents how a small group of scientists with links to industry were able to sow doubt about the scientific consensus and delay effective policy on DDT, tobacco, acid rain and, now, global warming.

Fast forward to two years ago: a team of volunteer researchers (myself included) associated with the website Skeptical Science decided to update and extend Oreskes’ research. Led by University of Queensland researcher John Cook, we analyzed the abstracts of about 12,000 scientific papers extracted from a large database of articles, using the search terms “global warming” and “global climate change.” The articles had been published over a 21-year period, from 1991 to 2011.

As an independent check on our results, we also sent emails to the more than 8,500 scientist authors of these articles whose email addresses we were able to track down, asking them to rate their own papers for endorsement or rejection of man-made global warming.

Both approaches yielded a very similar result: 97 per cent of the scientific literature that expresses an opinion on climate change endorses the expert consensus view that it is man-made. The results were published in May 2013 in the journal Environmental Research Letters.
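For readers curious about the arithmetic, here is a minimal sketch of how a consensus share like that is computed: the percentage is taken over abstracts that express a position, with "no position" papers excluded. The rating labels and counts below are simplified, hypothetical stand-ins, not the study's actual categories or tallies.

```python
# Minimal sketch of the consensus calculation: share of position-taking
# abstracts that endorse the human cause of global warming.
from collections import Counter

def consensus_share(ratings):
    """ratings: iterable of 'endorse', 'reject', 'uncertain' or 'no_position'."""
    counts = Counter(ratings)
    with_position = counts["endorse"] + counts["reject"] + counts["uncertain"]
    return counts["endorse"] / with_position if with_position else float("nan")

# Hypothetical counts chosen only to illustrate the arithmetic:
sample = ["endorse"] * 970 + ["reject"] * 20 + ["uncertain"] * 10 + ["no_position"] * 2000
print(f"{consensus_share(sample):.1%}")   # 97.0%
```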

We were astonished by the positive reception. The paper was tweeted by U.S. President Barack Obama, Al Gore and Elon Musk, among others. Obama later referenced it in a speech at the University of Queensland, and U.S. Secretary of State John Kerry has referred to the 97 per cent consensus in recent speeches. John Oliver based an episode of his HBO comedy show Last Week Tonight around it; the clip has been viewed online more than five million times.



Permafrost feedback update 2015: is it good or bad news?

Posted on 20 April 2015 by Andy Skuce

We have good reason to be concerned about the potential for nasty climate feedbacks from thawing permafrost in the Arctic. Consider:

  • The Arctic contains huge stores of plant matter in its frozen soils. Over one-third of all the carbon stored in the soils of the Earth is found in this region, which hosts just 15% of the planet's soil-covered area.
  • The Arctic is warming at twice the rate of the rest of the planet. The vegetable matter in the soils is being taken out of the northern freezer and placed on the global kitchen counter to decompose. Microbes will take full advantage of this exceptional dining opportunity and will convert part of these plant remains into carbon dioxide and methane.
  • These gases will add to the already enhanced greenhouse effect that caused the Arctic warming, providing a further boost to warming. There's plenty of scope for these emissions to cause significant climatic mischief: the amount of carbon in the permafrost is double the amount currently in the air. 
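A quick sanity check on that last comparison, using rough, commonly cited round numbers (both figures below are assumptions for illustration, not values from the post):

```python
# Rough check of "the carbon in the permafrost is double the amount currently in the air".
GTC_PER_PPM = 2.13                    # assumed conversion: 1 ppm CO2 ~ 2.13 GtC
atmosphere_gtc = 400 * GTC_PER_PPM    # ~850 GtC at ~400 ppm CO2
permafrost_gtc = 1700                 # assumed round figure for total permafrost carbon
print(f"Atmosphere: ~{atmosphere_gtc:.0f} GtC")
print(f"Permafrost / atmosphere: ~{permafrost_gtc / atmosphere_gtc:.1f}x")
```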

But exactly how bad will it be, and how quickly will it cause problems for us? Does the latest research bring good news or bad?

Ted Schuur and sixteen other permafrost experts have just published a review paper in Nature: Climate change and the permafrost carbon feedback (paywalled). This long and authoritative article (7 pages of text, plus 97 references) provides a state-of-the-art update on the expected response of permafrost thawing to man-made climate change. Much of the work reported on in this paper has been published since the 2013 IPCC AR5 report. It covers new observations of permafrost thickness and carbon content, along with laboratory experiments on permafrost decomposition and the results of several modelling exercises.

The overall conclusion is that, although the permafrost feedback is unlikely to cause abrupt climate change in the near future, the feedback is going to make climate change worse over the second half of this century and beyond. The emissions quantities are still uncertain, but the central estimate would be like adding an additional country with unmitigated emissions roughly the current size of the United States', for at least the rest of the century. This will not cause a climate catastrophe by itself, but it will make preventing dangerous climate change that much more difficult. As if it wasn't hard enough already.


There's a lot of information in this paper and, rather than attempt to describe it all in long form, I'll try to capture the main findings in bullet points. 

  • The top three metres of permafrost contain about 1035 PgC (billion tonnes of carbon). This is similar to previous estimates, but is now supported by ten times as many observations below the top 1 m depth. Very roughly, the deepest and most carbon-rich deposits lie near the Russian, Alaskan and Canadian Arctic coasts, with the poorest in mountainous regions and in areas close to glaciers and the Greenland ice sheet.

The carbon content in the top three metres of permafrost soils. From Hugelius et al (2013).



The history of emissions and the Great Acceleration

Posted on 7 April 2015 by Andy Skuce

 This is a repost from the Critical Angle blog.

One of my pastimes is downloading data and playing around with it on Excel. I’m not kidding myself that doing this means anything in terms of original research, but I do find that I learn quite a lot about the particularities of the data and about the science in general by doing some simple calculations and graphing the numbers. There’s even occasionally a small feeling of discovery, a bit like the kind that you experience when you follow a well-trodden path in the mountains for the first time:

We were not pioneers ourselves, but we journeyed over old trails that were new to us, and with hearts open. Who shall distinguish? J. Monroe Thorington

Anyway, I downloaded some historical emissions data from the CDIAC site and played around with it. To repeat, there’s nothing new to science here, but there were a few things that I found that were new to me. First, let’s look at historical emissions of CO2 from man-made sources from 1850 to 2010. Note that for all of these graphs there are no data shown for 2011-2015.
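For anyone who prefers Python to Excel, a rough sketch of the same exercise is below. It assumes you have already downloaded a CDIAC global emissions file as a local CSV; the filename and column names are placeholders that will need to be adjusted to match the actual file.

```python
# Plot global CO2 emissions by source, 1850-2010, from a downloaded CDIAC file.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("global_co2_emissions.csv")          # hypothetical local filename
df = df[(df["Year"] >= 1850) & (df["Year"] <= 2010)]  # hypothetical column names

sources = ["Coal", "Oil", "Gas", "Cement", "Flaring"]  # emissions by source, MtC/yr
df.set_index("Year")[sources].plot.area(figsize=(9, 5))
plt.ylabel("CO2 emissions (million tonnes of carbon per year)")
plt.title("Global fossil-fuel and cement emissions, 1850-2010")
plt.show()
```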

What immediately struck me—something I hadn’t fully appreciated before—was how small oil consumption was before 1950. Both world wars were carried out without huge increases in oil use, despite the massive mobilizations of armies, navies and air forces. You can make out some downward blips in coal consumption for the Great Depression (~1930) and around the end of WW2 (~1945).

It wasn’t until after 1950 that fossil-fuel consumption went nuts. Some people have taken to calling this inflection point The Great Acceleration; there’s more on this later.

What do these emissions from different sources look like as a proportion of all human emissions over this time period?



Shell: internal carbon pricing and the limits of big oil company action on climate

Posted on 24 March 2015 by Andy Skuce

Shell evaluates all of its projects using a shadow carbon tax of $40 per tonne of carbon dioxide. That's great. But why is the company still exploring in the Arctic and busy exploiting the Alberta oil sands?

Of all of the big fossil-fuel companies, Shell has adopted perhaps the most constructive position on climate change mitigation. Recently, the company's CEO, Ben van Beurden, told an industry conference:

You cannot talk credibly about lowering emissions globally if, for example, you are slow to acknowledge climate change; if you undermine calls for an effective carbon price; and if you always descend into the ‘jobs versus environment’ argument in the public debate.

Shell employs engineer David Hone as their full-time Climate Change Advisor. Hone has written a small ebook, Putting the Genie Back: 2°C Will Be Harder Than We Think, priced at just 99¢, and he writes a climate change blog that should be part of every climate-policy geek's balanced diet.

Shell also has a position they call Vice President CO2, currently occupied by Angus Gillespie. Here's Gillespie talking recently at Stanford on the company's internal shadow carbon pricing strategy (hat-tip to John Mashey). It's worth watching if only for Gillespie's vivid example of the limitations of looking at averages. The slides can be downloaded here.
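For readers unfamiliar with how an internal shadow carbon price enters project screening, here is a minimal, hypothetical sketch: the projected emissions are costed at $40 per tonne and subtracted from the project's value. The cash flows, emissions and discount rate are invented for illustration; this is not Shell's actual methodology.

```python
# Minimal illustration of screening a project with a shadow carbon price.
def npv(cash_flows, rate=0.08):
    """Net present value of a series of annual cash flows at a fixed discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

SHADOW_PRICE = 40.0                    # dollars per tonne of CO2

annual_profit = [120e6] * 20           # hypothetical: $120M per year for 20 years
annual_emissions_t = [1.5e6] * 20      # hypothetical: 1.5 Mt CO2 per year

base_npv = npv(annual_profit)
carbon_cost = npv([e * SHADOW_PRICE for e in annual_emissions_t])
print(f"NPV without carbon cost: ${base_npv / 1e9:.2f}B")
print(f"NPV with $40/t shadow price: ${(base_npv - carbon_cost) / 1e9:.2f}B")
```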



Does providing information on geoengineering reduce climate polarization?

Posted on 4 March 2015 by Andy Skuce

Dan Kahan of Yale University and four colleagues have just published an article in the Annals of the AAPSS titled Geoengineering and Climate Change Polarization: Testing a Two-Channel Model of Science Communication, which investigates the effect on study participants' attitudes to climate change of reading an article about geoengineering. In their abstract, they write:

We found that cultural polarization over the validity of climate change science is offset by making citizens aware of the potential contribution of geoengineering as a supplement to restriction of CO2 emissions.

I will argue here that this experiment achieved no such result because its premise was wrong. Specifically, the information on geoengineering presented to the study participants (in the form of a fictional newspaper article) bears no relation to mainstream scientific opinion on geoengineering, nor even to the opinions of advocates of geoengineering. The fictional newspaper article portrays geoengineering as a strategy with no uncertainty about how well it might work, one that will, it is claimed, "spare consumers and businesses from the heavy economic costs associated with the regulations necessary to reduce atmospheric CO2 concentrations to 450 ppm or lower". This is hardly depicting geoengineering as a "potential solution" or "a supplement" to the restriction of emissions, as is claimed in the abstract of the paper.

In fact, what Kahan et al. have demonstrated is that presenting misinformation dressed up as fact can affect people's opinions about climate change. That may be interesting as a social science experiment conducted on consenting adults, but it is not much use as a guide to effective public science communication, constrained as it is to tell the truth.

The Kahan et al. 2015 paper is paywalled, but a 2012 version (updated in 2015), with the same title, similar figures, but different text, is available online here. The study looked at two representative samples, of about 1,500 individuals each, from the USA and England. Each sample was further split into three groups that were asked to read one of three fictional newspaper articles. One article, used as a control, had nothing to do with climate change. The second was an article that advocated tighter limits on atmospheric concentrations of CO2 (although this article contained what surely must be a typo, calling for CO2 concentrations of 175 ppm, which would send us back to the depths of the last ice age). The third piece called for geoengineering on the grounds that "limiting emissions is a wasteful and futile strategy". Articles two and three both quoted a (fictional) Dr Williams of Harvard University, the spokesman of the (fictional) "American Association of Geophysical Scientists". Both of these articles contained a couple of pictures designed to appeal to or to repel people at either end of the political spectrum.



Andy Skuce's AGU Fall Meeting 2014 poster presentation

Posted on 29 December 2014 by Andy Skuce

This is a re-post from Critical Angle

I gave a poster presentation on December 16th  at the 2014 Fall Meeting of the American Geophysical Union in San Francisco. The title is: Emissions of Water and Carbon Dioxide from Fossil-Fuel Combustion Contribute Directly to Ocean Mass and Volume Increases.

You can read the abstract here and I have uploaded a pdf of the poster here. A picture of the poster is shown below, although you will need to download the pdf to make out some of the fine print.


Some of the numbers changed a little bit between the time I submitted the abstract in August and now. I found one or two small errors and recalculated the uncertainty range using Monte Carlo analysis. “Min” and “Max” values bracket the 90% confidence interval. In the title I used “directly” to distinguish the physical effects of emissions on ocean volumes from the more “indirect” (and bigger and better-known) contributions of emissions to sea-level rise via the effect of emissions on global warming.
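For readers unfamiliar with the technique, the sketch below illustrates the general Monte Carlo approach to an uncertainty range like this: sample the uncertain inputs many times, propagate them through the calculation, and report the 5th and 95th percentiles as the "Min" and "Max" of a 90 per cent confidence interval. The inputs here are invented placeholders, not the values used in the poster.

```python
# Generic Monte Carlo uncertainty propagation with a 90% confidence interval.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

fuel_burned = rng.normal(10.0, 0.5, n)      # hypothetical quantity, arbitrary units
water_per_unit = rng.normal(1.4, 0.2, n)    # hypothetical emission factor

water_emitted = fuel_burned * water_per_unit
lo, mid, hi = np.percentile(water_emitted, [5, 50, 95])
print(f"Median: {mid:.1f}, 90% CI: [{lo:.1f} (Min), {hi:.1f} (Max)]")
```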



Keystone XL: Oil Markets and Emissions

Posted on 1 September 2014 by Andy Skuce

  • Estimates of the incremental emission effects of individual oil sands projects like the Keystone XL (KXL) pipeline are sensitive to assumptions about the response of world markets and alternative transportation options.

  • A recent Nature Climate Change paper by Erickson and Lazarus concludes that KXL may produce incremental emissions of 0-110 million tonnes of CO2 per year, but the article has provoked some controversy.

  • Comments by industry leaders and the recent shelving of a new bitumen mining project suggest that the expansion of the oil sands may be more transportation constrained and more exposed to cost increases than is sometimes assumed.

  • Looking at the longer-term commitment effects of new infrastructure on cumulative emissions supports the higher-end incremental estimates.

President Obama (BBC) has made it clear that the impact of the Keystone XL (KXL) pipeline on the climate will be critical in his administration’s decision on whether the pipeline will go ahead or not.  However, different estimates of the extra carbon emissions that the pipeline will cause vary wildly. For example, the consultants commissioned by the US State Department estimated that the incremental emissions would be 1.3 to 27.4 million tonnes of CO2 (MtCO2) annually. In contrast, John Abraham, writing in the Guardian (and again more recently), estimated that the emissions would be as much as 190 MtCO2 annually, about seven times the State Department’s high estimate (calculation details here).

The variation in the estimates arises from the assumptions made. The State Department consultants assumed that the extra oil transported by the pipeline would displace oil produced elsewhere, so that we should only count the difference between the life-cycle emissions from the shut-in light oil and those of the more carbon-intensive bitumen. In addition, they estimated that not building KXL would mean that bitumen would instead be transported by rail, at slightly higher transportation costs. Abraham simply totted up all of the production, refining and consumption emissions of the 830,000 barrels per day (bpd) pipeline capacity and did not consider any effect of the extra product on world oil markets.
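The arithmetic behind the higher estimate is easy to reproduce: multiply the full pipeline capacity by a well-to-wheels emission factor per barrel, with no allowance for market offsets. The per-barrel factor below is an assumed round figure for illustration, not a number taken from Abraham's calculation.

```python
# Rough arithmetic for the "count everything" approach to KXL emissions.
capacity_bpd = 830_000         # barrels per day of pipeline capacity
tco2_per_barrel = 0.6          # assumed well-to-wheels tonnes CO2 per barrel (illustrative)

annual_mtco2 = capacity_bpd * 365 * tco2_per_barrel / 1e6
print(f"~{annual_mtco2:.0f} MtCO2 per year")   # on the order of the ~190 MtCO2 figure
```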

Neither set of assumptions is likely to be correct. Increasing the supply of any product will have an effect on a market, lowering prices and stimulating demand (consumption) growth. Lower prices will reduce supply somewhere.  The question is: by how much?

An interesting new paper in Nature Climate Change (paywalled, but there is an open copy of an earlier version available here) by Peter Erickson and Michael Lazarus attempts to answer this question. The authors are based in the Seattle office of the Stockholm Environment Institute (SEI).


