
Andy Skuce

Andy Skuce is a recently retired geophysical consultant living in British Columbia. He has a BSc in geology from Sheffield University and an MSc in geophysics from the University of Leeds. His work experience includes a period at the British Geological Survey in Edinburgh and work for a variety of oil companies based in Calgary, Vienna and Quito. From 2005 he worked as an independent consultant. Andy has published a handful of papers on tectonics and structural geology over the years, which can be viewed here. He described how his views on climate change evolved in this blog post.

Follow Andy on Twitter @andyskuce and on his blog Critical Angle


Google Scholar profile.


Recent blog posts

The Road to Two Degrees, Part Two: Are the experts being candid about our chances?

Posted on 26 November 2015 by Andy Skuce

The first part of this three-part series looked at the staggering scale of, and the daunting timescale available for, deploying the fossil fuel and bioenergy carbon capture and storage technologies that many 2°C mitigation scenarios assume. In this second part, I outline Kevin Anderson's argument that climate experts are failing to acknowledge the near-impossibility of avoiding dangerous climate change under current assumptions of the political and economic status quo, combined with unrealistic expectations of untested negative-emissions technologies.

In plain language, the complete set of 400 IPCC scenarios for a 50% or better chance of meeting the 2 °C target work on the basis of either an ability to change the past, or the successful and large-scale uptake of negative-emission technologies. A significant proportion of the scenarios are dependent on both. (Kevin Anderson)

Kevin Anderson has just written a provocative article titled Duality in climate science, published in Nature Geoscience (open access text available here). He contrasts the up-beat pronouncements in the run-up to the Paris climate conference in December 2015 (e.g. “warming to less than 2°C” is “economically feasible” and “cost effective”; “global economic growth would not be strongly affected”) with what he sees as the reality that meeting the 2°C target cannot be reconciled with continued economic growth in rich societies at the same time as the rapid development of poor societies. He concludes that “the carbon budgets associated with a 2 °C threshold demand profound and immediate changes to the consumption and production of energy”.

His argument runs as follows: Integrated Assessment Models, which attempt to bring together physics, economics and policy, rely on highly optimistic assumptions, specifically:

  • Unrealistic early peaks in global emissions;
  • Massive deployment of negative-emissions technologies.

He notes that of the 400 scenarios that have a 50% or better chance of meeting the 2 °C target, 344 of them assume the large-scale uptake of negative emissions technologies and, in the 56 scenarios that do not, global emissions peak around 2010, which, as he notes, is contrary to the historical data.

I covered the problems of the scalability and timing of carbon capture and storage and negative emissions technologies in a previous article.

From Robbie Andrew, adjusted for non-CO2 and land-use emissions. Note that these mitigation curves assume no net-negative emissions technologies deployed in the latter part of the century.



The Road to Two Degrees, Part One: Feasible Emissions Pathways, Burying our Carbon, and Bioenergy

Posted on 16 November 2015 by Andy Skuce

This post looks at the feasibility of the massive and rapid deployment of Carbon Capture and Storage and negative-emissions Bioenergy Carbon Capture and Storage technologies assumed in the majority of IPCC scenarios that avoid dangerous global warming. Some observers question whether deployment of these technologies at these scales and within the required time frames is achievable. This is Part One of a three-part series on the challenge of keeping global warming under 2 °C.

The various emissions models that have been used to produce greenhouse gas concentration pathways consistent with 2 °C vary considerably, but the majority of them require huge deployment of Carbon Capture and Storage (CCS) as well as net-negative global emissions in the latter part of the twenty-first century. The only negative-emissions methods generally considered in these scenarios are bioenergy with carbon capture and storage (BECCS) and land-use changes, such as afforestation. For there to be net-negative emissions, positive emissions have to be smaller than the negative emissions.

Kevin Anderson (2015) (open-access text) reports that of the 400 scenarios that have a 50% chance or greater of no more than 2 °C of warming, 344 assume large-scale negative emissions technologies. The remaining 56 scenarios have emissions peaking in 2010, which, as we know, did not happen.

Sabine Fuss et al. (2014) (pdf) demonstrate that of the 116 scenarios that lead to concentrations of 430-480 ppm of CO2 equivalent, 101 require net-negative emissions. Most scenarios that have net-negative emissions have BECCS providing 10-30% of the world’s primary energy in 2100.


From Fuss et al. (2014), showing the historical emissions (black), the four RCPs (heavy coloured lines) and 1089 scenarios assigned to one of the RCPs (light coloured lines).



The thermometer needle and the damage done

Posted on 6 November 2015 by Andy Skuce

Rising temperatures may inflict much more damage on already warm countries than conventional economic models predict. In the latter part of the twenty-first century, global warming might even reduce or reverse any earlier economic progress made by poor nations. This would increase global wealth inequality over the century. (This is a repost from Critical Angle.)

A recent paper published in Nature by Marshall Burke, Solomon M. Hsiang and Edward Miguel, Global non-linear effect of temperature on economic production, argues that increasing temperatures will cause much greater damage to economies than previously predicted. Furthermore, this effect will be distributed very unequally, with tropical countries getting hit very hard and some northern countries actually benefitting.

Let me attempt a highly simplified summary of what they did. I’m not an economist and this analysis is not straightforward, so beware. If I confuse you, try Dana Nuccitelli’s take or Seth Borenstein’s or Bloomberg’s or The Economist's.

Firstly, Burke et al. looked at factors like labour supply, labour performance and crop yields and how they relate to daily temperature exposure. Generally, these show little variation up to temperatures in the high twenties Celsius, at which point they fall off quickly. Secondly, those trends were aggregated to predict the relationship between annual average temperatures and the annual impact on economic output. Thirdly, they looked at annual economic output and average annual temperatures for individual countries for the period 1960-2010. Note that they only compared the economic effects of temperature change on individual countries; they did not correlate one country with another. They were then able to see how these observations compared with their predicted aggregate curve.
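To make the aggregation step a little more concrete, here is a purely illustrative Python sketch. The daily response function and the spread of daily temperatures are invented for illustration (they are not Burke et al.'s estimates); the point is simply that averaging a flat-then-declining daily response over a year of daily temperatures makes a hot country's annual output far more sensitive to an extra degree of warming than a temperate country's.

import numpy as np

def daily_response(temps_c):
    # Hypothetical daily productivity: flat up to 27 C, then a made-up
    # 5% loss for each additional degree above that threshold.
    return np.where(temps_c <= 27.0, 1.0, 1.0 - 0.05 * (temps_c - 27.0))

def annual_output(annual_mean_c, daily_spread_c=7.0, days=365, seed=1):
    # Average the daily response over a year of (assumed normally
    # distributed) daily temperatures around the annual mean.
    rng = np.random.default_rng(seed)
    temps = rng.normal(annual_mean_c, daily_spread_c, days)
    return daily_response(temps).mean()

for mean_c in (13.0, 26.0):  # a temperate country versus a hot one
    change = annual_output(mean_c + 1.0) - annual_output(mean_c)
    print(f"annual mean {mean_c:.0f} C: effect of +1 C of warming = {change:+.3f}")

With these invented numbers, the temperate country loses only about 0.1 per cent of output from an extra degree, while the hot country loses a few per cent, because far more of its days are pushed past the threshold.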


All figures from Burke et al. (2015).

This work showed that the GDP of countries with an annual average temperature of about 13°C was the least sensitive to temperature changes. Colder countries on the left side of the hump would benefit from an increase in temperature, whereas warmer countries would see their output suffer as temperature increases. Note that the figure does not show that a certain temperature predetermines the level of wealth of a country (China, despite recent rapid growth, is poorer than the US and Japan even though average annual temperatures are similar). Rather, it illustrates how susceptible countries are to increases or decreases in productivity relative to their annual average temperature.

There is some evidence that rich countries are slightly less affected by changes in temperature (the curve is a little flatter for them). The study examined few hot, wealthy countries, so any general conclusions about them cannot be certain, but the evidence still points to them being more prone to damage from rising temperatures than rich, cooler countries. No matter how rich you are, extra heat hurts the warm lands more than it does the temperate and the cool. You can’t buy your way out of the effects of global warming, except by moving away from the Equator or up into the highlands.



B.C. lowballing fugitive methane emissions from natural gas industry

Posted on 8 October 2015 by Andy Skuce

This article was first published in the Corporate Knights Magazine.

There is a supplementary article at my blog Critical Angle that has more detail, links and references, along with an estimate of the GHG emissions (excluding end use) associated with one liquefied natural gas project and the effect this will have on the feasibility of BC reaching its emissions targets.

There is a further piece at DeSmog Canada, where I compare the situation in BC's gas industry with the Volkswagen emissions reporting scandal, in which a corporation cheats on its emissions tests, with the tacit approval of industry-friendly regulators and governments, only to be exposed by independent researchers performing tests in real-world situations.

The push by British Columbia to develop a new liquefied natural gas (LNG) export industry raises questions about the impact such activities would have on greenhouse gas emissions, both within the province and globally.

One of the most important factors relates to the amount of methane and carbon dioxide that gets released into the atmosphere, either deliberately through venting or by accident as so-called fugitive emissions. Fugitive emissions are the result of valves and meters that release, by design, small quantities of gas. But they can also come from faulty equipment and from operators who fail to follow regulations.

Photo by Jesús Rodríguez Fernández (Creative Commons)

According to the B.C. Greenhouse Gas Inventory Report 2012, there were 78,000 tonnes of fugitive methane emissions from the oil and natural gas industry that year. B.C. produced 41 billion cubic metres of gas in 2012. This means about 0.28 per cent of the gas produced was released into the atmosphere.

By North American standards, this is a very low estimate. The U.S. Environmental Protection Agency (EPA) uses a figure of 1.5 per cent leakage, more than five times higher. Recent research led by the U.S. non-profit group Environmental Defense Fund (EDF) shows that even the EPA estimates may be too low by a factor of 1.5. B.C.’s estimate, in other words, would be about one-eighth of what has been estimated for the American gas industry.
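As a rough check on the percentages above, here is a back-of-the-envelope sketch. The methane density of about 0.68 kg per cubic metre (near standard surface conditions) is my assumption; the official inventory may use slightly different conversion conventions.

fugitive_methane_t = 78_000   # tonnes of fugitive methane, B.C., 2012
gas_produced_m3 = 41e9        # cubic metres of gas produced in 2012
methane_density = 0.68        # kg per cubic metre (assumed)

fugitive_m3 = fugitive_methane_t * 1000 / methane_density
bc_rate = fugitive_m3 / gas_produced_m3
print(f"Implied B.C. leak rate: {bc_rate:.2%}")          # roughly 0.28%

edf_adjusted_rate = 0.015 * 1.5                          # EPA figure times the EDF factor
print(f"EDF-adjusted US rate / B.C. rate: {edf_adjusted_rate / bc_rate:.1f}")  # roughly 8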

Although the amounts of methane released are small compared to carbon dioxide emissions, methane matters because it packs a much bigger global warming punch. Determining the effect of methane emissions is complicated because methane molecules last in the atmosphere for only a decade or so, and the warming effect of a release depends on the time interval over which it is measured. Compared to a given mass of carbon dioxide, the same mass of methane will produce 34 times as much warming over 100 years, or 86 times as much over 20 years.
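To put those multipliers in context, here is a quick illustrative conversion of B.C.'s reported fugitive methane into CO2-equivalent terms, using the two warming factors quoted above (a sketch, not an official inventory calculation):

fugitive_methane_t = 78_000   # tonnes of methane (B.C., 2012)
gwp_100yr, gwp_20yr = 34, 86  # warming relative to CO2 over 100 and 20 years

print(f"CO2-equivalent over 100 years: {fugitive_methane_t * gwp_100yr / 1e6:.1f} Mt")  # ~2.7 Mt
print(f"CO2-equivalent over 20 years:  {fugitive_methane_t * gwp_20yr / 1e6:.1f} Mt")   # ~6.7 Mt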



Are we overestimating our global carbon budget?

Posted on 15 July 2015 by Andy Skuce

The latest research suggests that natural sinks of carbon on land may be slowing or even turning into sources, creating climate consequences potentially worse than first thought.

Nature has provided humans with a buffer against the worst effects of our carbon pollution. Since 1750, we have emitted about 580 billion tonnes of carbon into the atmosphere by burning fossil fuels, cutting down forests and making cement. If those emissions had simply accumulated in the air, the concentration of carbon dioxide would have increased from 280 parts per million (ppm), as it was before the Industrial Revolution, to about 550 ppm today. Instead, we currently measure around 400 ppm, which is still a whopping 40 per cent above the pre-industrial concentration, but much less than a doubling.
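A quick check of that arithmetic, assuming the commonly used conversion of roughly 2.12 billion tonnes of carbon per ppm of atmospheric CO2 (my assumption for this sketch, not a figure from the post):

emitted_gtc = 580          # cumulative emissions since 1750, GtC (from the text)
gtc_per_ppm = 2.12         # assumed conversion factor
preindustrial_ppm = 280
observed_ppm = 400

no_sink_ppm = preindustrial_ppm + emitted_gtc / gtc_per_ppm
print(f"CO2 if nothing had been absorbed: ~{no_sink_ppm:.0f} ppm")          # ~550 ppm

airborne_gtc = (observed_ppm - preindustrial_ppm) * gtc_per_ppm
print(f"Fraction taken up by sinks: {1 - airborne_gtc / emitted_gtc:.0%}")  # ~56%, i.e. "some 60 per cent"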

Some 60 per cent of our emissions have been taken up by natural sinks, in roughly equal parts by dissolving into the ocean and by absorption into plants growing faster on land. Were it not for these natural carbon sinks, we would by now be much deeper into dangerous climate change.

As we continue to burn fossil fuels, our climate troubles will become worse should those sinks start to falter. And the outlook will be worse still if those sinks turn into sources of carbon. 

New research

According to the latest research, the carbon sink on land is unfortunately starting to show signs of trouble. Instead of providing a brake on human emissions, the land carbon sink could soon be giving our emissions a boost. (The ocean sink appears to be relatively safe for now, although there is a price to pay: ocean acidification, the consequence of carbon dioxide dissolving into seawater, has been called climate change’s evil twin, with its own non-climate-related consequences for marine life.)

Plants don’t thrive solely on carbon dioxide.



Carbon cycle feedbacks and the worst-case greenhouse gas pathway

Posted on 7 July 2015 by Andy Skuce

The worst-case emissions pathway, RCP8.5, is a scenario that burns a huge amount of fossil fuels, especially coal. The scenario has sometimes been criticized as implausible because of its huge resource consumption and emissions of ~1700 billion tonnes of carbon (GtC) over the century. Those emissions are based in part on carbon cycle model assumptions, which recent work suggests may be too optimistic. New research shows that future plant growth may be restricted by nutrient availability, turning the land carbon sink into a source. Permafrost feedbacks (not considered in IPCC CMIP5 models) may also add significant emissions to the atmosphere under the RCP8.5 pathway. In addition, the latest research on the Amazon Basin reveals that the tropical forest carbon sinks may already be diminishing there. Together, these feedbacks suggest that the greenhouse gas concentrations in the RCP8.5 case could be achieved with human emissions ~400 GtC smaller, making the RCP8.5 worst-case scenario more plausible.

The climate models referred to in the recent IPCC Fifth Assessment Report (AR5) are founded on one of four Representative Concentration Pathways, or RCPs. The key word in RCP is concentration. In the RCPs, the concentration of greenhouse gases is prescribed at different times in the future, and the climate model (or general circulation model, GCM) uses those atmospheric concentrations to calculate future climate states. Underpinning the concentration pathways are socio-economic and emissions scenarios; there can be more than one underlying emissions scenario capable of producing a given concentration pathway.

If you are unfamiliar with RCPs, check out the great guide that Graham Wayne wrote in August 2013 for Skeptical Science.

This way of modelling differs from previous approaches, in which the starting point was a story or scenario about economic and social development that led to emissions. Those emissions were then run through a carbon-cycle model (which may be simple or complex) to produce atmospheric concentrations over time.

The schematic illustrates the differences in approach. The elements in red boxes are the prescribed inputs into the models, whereas the elements in blue ellipses are outputs. The advantage of the RCP prescribed-concentration approach is that the climate model outputs do not depend to the same degree on carbon-cycle models as they did in the emissions scenario method. The disadvantage is that there is no unique link between concentrations and emissions. The schematic is simplified in that there are feedbacks and loops in the processes that are not illustrated. 

The worst-case scenario among the four Representative Concentration Pathways (RCPs) is known as RCP8.5. The number “8.5” refers to the radiative forcing level, measured in W/m2, in the year 2100. RCP8.5, despite often being called “business-as-usual”, has been criticized as an unlikely outcome. While that may be true, it's more feature than bug, since, as one of the two extreme pathways, it is designed to provide climate modellers with an unlikely, but still just plausible, “how bad could it be” scenario.
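To connect the forcing number to a concentration, a commonly used simplified expression for CO2 forcing is dF ~= 5.35 ln(C/C0) W/m2 (Myhre et al., 1998). Treating the full 8.5 W/m2 as if it were CO2 alone is a simplification, since the pathway includes other greenhouse gases too, but it gives a sense of scale (a sketch under those assumptions):

import math

forcing_wm2 = 8.5   # radiative forcing in 2100 for RCP8.5
c0_ppm = 280.0      # pre-industrial CO2 concentration

c_equiv = c0_ppm * math.exp(forcing_wm2 / 5.35)
print(f"Implied CO2-equivalent concentration in 2100: ~{c_equiv:.0f} ppm")  # ~1370 ppm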

Let’s look briefly at some of the underlying socio-economic assumptions behind RCP8.5, then we’ll examine how the latest research on the terrestrial carbon cycle makes the GHG concentrations in the RCP8.5 model easier to reach.



Why the 97 per cent consensus on climate change still gets challenged

Posted on 18 May 2015 by Andy Skuce

Here are some excerpts from an article I wrote for the magazine Corporate Knights, published on May 14, 2015. Some references and links have been added at the end.

In 2004, science historian Naomi Oreskes published a short paper in the journal Science concluding there was an overwhelming consensus in the scientific literature that global warming was caused by humans.

After the paper’s release, there was some unexpectedly hostile reaction. This prompted Oreskes and her colleague Erik Conway to go even deeper with their research, leading to the publication of the book Merchants of Doubt. It documents how a small group of scientists with links to industry were able to sow doubt about the scientific consensus and delay effective policy on DDT, tobacco, acid rain and, now, global warming.

Fast forward to two years ago: a team of volunteer researchers (myself included) associated with the website Skeptical Science decided to update and extend Oreskes’ research. Led by University of Queensland researcher John Cook, we analyzed the abstracts of about 12,000 scientific papers extracted from a large database of articles, using the search terms “global warming” and “global climate change.” The articles had been published over a 21-year period, from 1991 to 2011.

As an independent check on our results, we also sent emails to the more than 8,500 scientist authors of these articles. (These were the scientists whose e-mail addresses we were able to track down). We asked them to rate their own papers for endorsement or rejection of man-made global warming.

Both approaches yielded a very similar result: 97 per cent of the scientific literature that expresses an opinion on climate change endorses the expert consensus view that it is man-made. The results were published in May 2013 in the journal Environmental Research Letters.

We were astonished by the positive reception. The paper was tweeted by U.S. President Barack Obama, Al Gore and Elon Musk, among others. Obama later referenced it in a speech at the University of Queensland, while U.S. Secretary of State John Kerry has referred to the 97 per cent consensus in recent speeches. John Oliver based an episode of his HBO comedy show Last Week Tonight around it; the clip has been viewed online more than five million times.



Permafrost feedback update 2015: is it good or bad news?

Posted on 20 April 2015 by Andy Skuce

We have good reason to be concerned about the potential for nasty climate feedbacks from thawing permafrost in the Arctic. Consider:

  • The Arctic contains huge stores of plant matter in its frozen soils. Over one-third of all the carbon stored in the Earth's soils is found in this region, which hosts just 15% of the planet's soil-covered area.
  • The Arctic is warming at twice the rate of the rest of the planet. The vegetable matter in the soils is being taken out of the northern freezer and placed on the global kitchen counter to decompose. Microbes will take full advantage of this exceptional dining opportunity and will convert part of these plant remains into carbon dioxide and methane.
  • These gases will add to the already enhanced greenhouse effect that caused the Arctic warming, providing a further boost to warming. There's plenty of scope for these emissions to cause significant climatic mischief: the amount of carbon in the permafrost is double the amount currently in the air (a rough sense of scale follows this list).
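Here is that rough sense of scale, assuming about 2.12 billion tonnes of carbon per ppm of CO2 and a current concentration near 400 ppm (both my assumptions for this sketch, not figures from the post):

gtc_per_ppm = 2.12          # assumed conversion factor
atmospheric_co2_ppm = 400   # approximate current concentration

carbon_in_air_gtc = atmospheric_co2_ppm * gtc_per_ppm
print(f"Carbon currently in the atmosphere: ~{carbon_in_air_gtc:.0f} GtC")                  # ~850 GtC
print(f"'Double' therefore implies roughly {2 * carbon_in_air_gtc:.0f} GtC in permafrost")  # ~1700 GtC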

But exactly how bad will it be, and how quickly will it cause problems for us? Does the latest research bring good news or bad?

Ted Schuur and sixteen other permafrost experts have just published a review paper in Nature: Climate change and the permafrost carbon feedback (paywalled). This long and authoritative article (7 pages of text, plus 97 references) provides a state-of-the-art update on the expected response of permafrost thawing to man-made climate change. Much of the work reported on in this paper has been published since the 2013 IPCC AR5 report. It covers new observations of permafrost thickness and carbon content, along with laboratory experiments on permafrost decomposition and the results of several modelling exercises.

The overall conclusion is that, although the permafrost feedback is unlikely to cause abrupt climate change in the near future, the feedback is going to make climate change worse over the second half of this century and beyond. The emissions quantities are still uncertain, but the central estimate would be like adding, for at least the rest of the century, an additional country with unmitigated emissions the size of the United States' current emissions. This will not cause a climate catastrophe by itself, but it will make preventing dangerous climate change that much more difficult. As if it wasn't hard enough already.


There's a lot of information in this paper and, rather than attempt to describe it all in long form, I'll try to capture the main findings in bullet points. 

  • The top three metres of permafrost contain about 1035 PgC (billion tonnes of carbon). This is similar to previous estimates, but is now supported by ten times as many observations below the top 1 m depth. Very roughly, the deepest and most carbon-rich deposits are near the Russian, Alaskan and Canadian Arctic coasts, with the poorest in mountainous regions and in areas close to glaciers and the Greenland ice sheet.

The carbon content in the top three metres of permafrost soils. From Hugelius et al. (2013).



The history of emissions and the Great Acceleration

Posted on 7 April 2015 by Andy Skuce

This is a repost from the Critical Angle blog.

One of my pastimes is downloading data and playing around with it on Excel. I’m not kidding myself that doing this means anything in terms of original research, but I do find that I learn quite a lot about the particularities of the data and about the science in general by doing some simple calculations and graphing the numbers. There’s even occasionally a small feeling of discovery, a bit like the kind that you experience when you follow a well-trodden path in the mountains for the first time:

We were not pioneers ourselves, but we journeyed over old trails that were new to us, and with hearts open. Who shall distinguish? (J. Monroe Thorington)

Anyway, I downloaded some historical emissions data from the CDIAC site and played around with it. To repeat, there’s nothing new to science here, but there were a few things that I found that were new to me. First, let’s look at historical emissions of CO2 from man-made sources from 1850 to 2010. Note that for all of these graphs there are no data shown for 2011-2015.

What immediately struck me—something I hadn’t fully appreciated before—was how small oil consumption was before 1950. Both world wars were carried out without huge increases in oil use, despite the massive mobilizations of armies, navies and air forces. You can make out some downward blips in coal consumption for the Great Depression (~1930) and around the end of WW2 (~1945).

It wasn’t until after 1950 that fossil-fuel consumption went nuts. Some people have taken to calling this inflection point The Great Acceleration; there’s more on this later.

What do these emissions from different sources look like as a proportion of all human emissions over this time period?



Shell: internal carbon pricing and the limits of big oil company action on climate

Posted on 24 March 2015 by Andy Skuce

Shell evaluates all of its projects using a shadow carbon tax of $40 per tonne of carbon dioxide. That's great. But why is the company still exploring in the Arctic and busy exploiting the Alberta oil sands?

Of all of the big fossil-fuel companies, Shell has adopted perhaps the most constructive position on climate change mitigation. Recently, the company's CEO, Ben van Beurden, told an industry conference:

You cannot talk credibly about lowering emissions globally if, for example, you are slow to acknowledge climate change; if you undermine calls for an effective carbon price; and if you always descend into the ‘jobs versus environment’ argument in the public debate.

Shell employs engineer David Hone as its full-time Climate Change Advisor. Hone has written a small ebook, Putting the Genie Back: 2°C Will Be Harder Than We Think, priced at just 99¢, and he writes a climate change blog that should be part of every climate-policy geek's balanced diet.

Shell also has a position they call Vice President CO2, currently occupied by Angus Gillespie. Here's Gillespie talking recently at Stanford on the company's internal shadow carbon pricing strategy (hat-tip to John Mashey). It's worth watching if only for Gillespie's vivid example of the limitations of looking at averages. The slides can be downloaded here.


