
Andy Skuce

Andy Skuce is a recently retired geophysical consultant living in British Columbia. He has a BSc in geology from Sheffield University and an MSc in geophysics from the University of Leeds. His work experience includes a period at the British Geological Survey in Edinburgh and work for a variety of oil companies based in Calgary, Vienna and Quito. From 2005 he worked as an independent consultant. Andy has published a handful of papers over the years in tectonics and structural geology that can be viewed here. He described how his views on climate change evolved in this blog post.

Follow Andy on Twitter @andyskuce and on his blog Critical Angle

Publications

Google Scholar profile.

 

Recent blog posts


The Quest for CCS

Posted on 13 January 2016 by Andy Skuce

This article was originally published online at Corporate Knights and will appear in the hard copy Winter 2016 Edition of the Corporate Knights Magazine, which is to be included as a supplement to the Globe and Mail and Washington Post later in January 2016. The photograph used in the original was changed for copyright reasons.

Human civilization developed over a period of 10,000 years during which global average surface temperatures remained remarkably stable, hovering within one degree Celsius of where they are today.

If we are to keep future temperatures from getting far outside that range, humanity will be forced to reduce fossil fuel emissions to zero by 2050. Halving our emissions is not good enough: we need to get down to zero to stay under the 2 °C target that scientists and policy makers have identified as the limit beyond which global warming becomes dangerous.

Shell boasting about its government-funded Quest CCS project on a Toronto bus: "Shell Quest captures over one-third of our oil sands upgrader emissions." (Photo: rustneversleeps)

Many scenarios have been proposed to get us there. Some of these involve rapid deployment of solar and wind power in conjunction with significant reductions in the amount of energy we consume.

However, many of the economists and experts who have developed scenarios for the Intergovernmental Panel on Climate Change (IPCC) believe that the only way to achieve the two-degree goal in a growing world economy is to invest in large-scale carbon capture and storage (CCS) projects. These technologies capture carbon dioxide from the exhausts of power stations and industrial plants and then permanently store it, usually by injecting it into underground rock layers.

Even with massive deployment of CCS over coming decades, most scenarios modelled by the IPCC overshoot the carbon budget and require that in the latter part of the century, we actually take more carbon out of the atmosphere than we put into it. Climate expert Kevin Anderson of the Tyndall Centre for Climate Change Research at the University of Manchester recently reported in Nature Geoscience that, of the 400 IPCC emissions scenarios used in the 2014 Working Group report to keep warming below two degrees, some 344 require the deployment of negative emissions technologies after 2050. The other 56 models assumed that we would start rapidly reducing emissions in 2010 (which, of course, did not happen). In other words, negative emissions are required in all of the IPCC scenarios that are still current.

One favoured negative emissions technology is bioenergy with carbon capture and storage (BECCS). This involves burning biomass – such as wood pellets – in power stations, then capturing the carbon dioxide and burying it deep in the earth. The technology has not yet been demonstrated at an industrial scale. Using the large amounts of bioenergy envisioned in such scenarios will place huge demands on land use and will conflict with agriculture and biodiversity needs.

Read more...

78 comments


Alberta's new carbon tax

Posted on 31 December 2015 by Andy Skuce

 

On Sunday November 22nd, 2015, Alberta's new centre-left Premier, Rachel Notley, announced that the province would be introducing an economy-wide carbon tax priced at $30 per tonne of CO2 equivalent, to be phased in over 2016 and 2017. Observers had been expecting new efforts to mitigate emissions since Notley's election in May 2015, but the scope and ambition of this policy took many by surprise.

Alberta, of course, is the home of the Athabasca oil sands and is one of the largest per-capita GHG emitters of any jurisdiction in the world. The new plan was nevertheless endorsed by environmental groups, First Nations and the biggest oil companies, an extraordinary consensus that many would not have thought possible.

How was this done? I will try and explain the new policy as far as I can (the details are not all available yet), but the short answer is that a huge amount of credit is due to the panel of experts led by University of Alberta energy economist Andrew Leach and his fellow panelists. Not only did they listen to what all Albertans had to say, but they were thoughtful in framing a policy that is acceptable to almost everyone. 

The background

Alberta is the wealthiest province in Canada, with a population of 4.1 million. In 2013, greenhouse gas emissions were 267 Mt CO2 equivalent, about 65 tonnes per capita, which compares with the average for the rest of Canada of about 15 tonnes. Among US states only North Dakota and Wyoming are worse. Alberta's fugitive emissions of methane alone amount to 29 Mt CO2e, about 7 tonnes per person, a little more than the world's average per-capita emissions of all greenhouse gases combined.
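As a quick sanity check, here is a minimal Python sketch of the per-capita arithmetic above; it uses only the population and emissions figures quoted in this paragraph.

```python
# Per-capita check of the Alberta figures quoted above.
population = 4.1e6          # Alberta population
total_ghg_t = 267e6         # 2013 emissions, tonnes CO2e
fugitive_ch4_t_co2e = 29e6  # fugitive methane, tonnes CO2e

print(f"Total: {total_ghg_t / population:.0f} t CO2e per person")                      # ~65
print(f"Fugitive methane: {fugitive_ch4_t_co2e / population:.1f} t CO2e per person")   # ~7.1
```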

From the Climate Leadership Report (http://alberta.ca/documents/climate/climate-leadership-report-to-minister.pdf). The 2030 emissions do not consider savings from the new policy.

Read more...

11 comments


The Road to Two Degrees, Part Three: Equity, inertia and fairly sharing the remaining carbon budget

Posted on 9 December 2015 by Andy Skuce

In the first part of this series, I examined the implications of relying on CCS and BECCS to get us to the two degree target. In the second part, I took a detailed look at Kevin Anderson's arguments that IPCC mitigation scenarios aimed at two degrees are biased towards unproven negative-emissions technologies and that they consequently downplay the revolutionary changes to our energy systems and economy that we must make very soon. In this last part, I'm going to look at the challenges that the world faces in fairly allocating future emissions from our remaining carbon budget and raising the money needed for climate adaptation funds, taking account of the very unequal past and present.

Until now, economic growth has been driven and sustained largely by fossil fuels. Europe and North America started early with industrialization and, from 1800 up to around 1945, this growth was driven mainly by coal. After the Second World War there was a period of rapid (~4% per year) economic growth in Europe, N America and Japan, lasting about thirty years, that the French refer to as Les Trente Glorieuses, The Glorious Thirty. This expansion was accompanied by a huge rise in the consumption of oil, coal and natural gas. After this there was a thirty-year period of slower growth (~2%) in the developed economies, with consumption fluctuations caused by oil-price shocks and the collapse of the Soviet Union. During this time, oil and coal consumption continued to grow, but not as steadily as before. Then, at the end of the twentieth century, economic growth took off in China, with a huge increase in the consumption of coal.

The emissions data are from the CDIAC. See the SkS post The History of Emissions and the Great Acceleration for further details.

If we are to achieve a stable climate, we will need to reverse this growth in emissions over a much shorter time period, while maintaining the economies of the developed world and, crucially, allowing the possibility of economic growth for the majority of humanity that has not yet experienced the benefits of a developed-country middle-class lifestyle.

Here are the annual emissions sorted by country and region:

Read more...

15 comments


The Road to Two Degrees, Part Two: Are the experts being candid about our chances?

Posted on 26 November 2015 by Andy Skuce

The first part of this three-part series looked at the staggering magnitude and the daunting deployment timescale available for the fossil fuel and bioenergy carbon capture and storage technologies that many 2°C mitigation scenarios assume. In this second part, I outline Kevin Anderson's argument that climate experts are failing to acknowledge the near-impossibility of avoiding dangerous climate change under current assumptions of the political and economic status quo, combined with unrealistic expectations of untested negative-emissions technologies.

In plain language, the complete set of 400 IPCC scenarios for a 50% or better chance of meeting the 2 °C target work on the basis of either an ability to change the past, or the successful and large-scale uptake of negative-emission technologies. A significant proportion of the scenarios are dependent on both. (Kevin Anderson)

Kevin Anderson has just written a provocative article titled: Duality in climate science, published in Nature Geoscience (open access text available here). He contrasts the upbeat pronouncements in the run-up to the Paris climate conference in December 2015 (e.g. “warming to less than 2°C” is “economically feasible” and “cost effective”; “global economic growth would not be strongly affected”) with what he sees as the reality that meeting the 2°C target cannot be reconciled with continued economic growth in rich societies at the same time as the rapid development of poor societies. He concludes that “the carbon budgets associated with a 2 °C threshold demand profound and immediate changes to the consumption and production of energy”.

His argument runs as follows: Integrated Assessment Models, which attempt to bring together physics, economics and policy, rely on highly optimistic assumptions, specifically:

- Unrealistic early peaks in global emissions;
- Massive deployment of negative-emissions technologies.

He notes that of the 400 scenarios that have a 50% or better chance of meeting the 2 °C target, 344 of them assume the large-scale uptake of negative emissions technologies and, in the 56 scenarios that do not, global emissions peak around 2010, which, as he notes, is contrary to the historical data.

I covered the problems of the scalability and timing of carbon capture and storage and negative emissions technologies in a previous article.

From Robbie Andrew, adjusted for non-CO2 and land-use emissions. Note that these mitigation curves assume no net-negative emissions technologies deployed in the latter part of the century.

Read more...

40 comments


The Road to Two Degrees, Part One: Feasible Emissions Pathways, Burying our Carbon, and Bioenergy

Posted on 16 November 2015 by Andy Skuce

This post looks at the feasibility of the massive and rapid deployment of Carbon Capture and Storage and negative-emissions Bioenergy Carbon Capture and Storage technologies in the majority of IPCC scenarios that avoid dangerous global warming. Some observers question whether the deployment of these technologies at these scales and within the required time frames is achievable. This is Part One of a three-part series on the challenge of keeping global warming under 2 °C.

The various emissions models that have been used to produce the greenhouse gas concentration pathways to 2 °C vary considerably, but the majority of them require huge deployment of Carbon Capture and Storage (CCS) as well as net-negative global emissions in the latter part of the twenty-first century. The only negative emissions methods generally considered in these scenarios are bioenergy with carbon capture and storage (BECCS) and land-use changes, such as afforestation. For there to be net-negative emissions, positive emissions have to be smaller than the negative emissions.

Kevin Anderson (2015) (open-access text) reports that of the 400 scenarios that have a 50% chance or greater of no more than 2 °C of warming, 344 assume large-scale negative emissions technologies. The remaining 56 scenarios have emissions peaking in 2010, which, as we know, did not happen.

Sabine Fuss et al. (2014) (pdf) demonstrate that of the 116 scenarios that lead to concentrations of 430-480 ppm of CO2 equivalent, 101 of them require net negative emissions. Most scenarios that have net-negative emissions have BECCS providing 10-30% of the world’s primary energy in 2100.

 

From Fuss et al. (2014), showing the historical emissions (black), the four RCPs (heavy coloured lines) and 1089 scenarios assigned to one of the RCPs (light coloured lines).

Read more...

22 comments


The thermometer needle and the damage done

Posted on 6 November 2015 by Andy Skuce

Rising temperatures may inflict much more damage on already warm countries than conventional economic models predict. In the latter part of the twenty-first century, global warming might even reduce or reverse any earlier economic progress made by poor nations. This would increase global wealth inequality over the century. (This is a repost from Critical Angle.)

A recent paper published in Nature by Marshall Burke, Solomon M. Hsiang and Edward Miguel, Global non-linear effect of temperature on economic production, argues that increasing temperatures will cause much greater damage to economies than previously predicted. Furthermore, this effect will be distributed very unequally, with tropical countries getting hit very hard and some northern countries actually benefitting.

Let me attempt a highly simplified summary of what they did. I’m not an economist and this analysis is not straightforward, so beware. If I confuse you, try Dana Nuccitelli’s take or Seth Borenstein’s or Bloomberg’s or The Economist's.

Firstly, Burke et al. looked at factors like labour supply, labour performance and crop yields and how they relate to daily temperature exposure. Generally these show little variation up to temperatures in the high twenties Celsius, at which point they fall off quickly. Secondly, those trends were aggregated to predict the relationship between annual average temperatures and the annual impact on economic output. Thirdly, they looked at annual economic output and average annual temperatures for individual countries for the period 1960-2010. Note that they only compared the economic effects of temperature change on individual countries; they did not correlate one country with another. They were then able to see how these observations compared with their predicted aggregate curve.


All figures from Burke et al. (2015).

This work showed that the GDP of countries with an annual average temperature of 13°C was the least sensitive to temperature changes. Colder countries on the left side of the hump would benefit from an increase in temperature, whereas warmer countries would see their output suffer as temperature increases. Note that the figure does not show that a certain temperature predetermines the level of wealth of a country (China, despite recent rapid growth, is poorer than the US and Japan even though average annual temperatures are similar). Rather, it illustrates how susceptible countries are to increases or decreases in productivity relative to their annual average temperature.
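To make the shape of that relationship concrete, here is an illustrative Python sketch of a hump-shaped response with its optimum at 13°C. The coefficients are invented purely for illustration and are not the values estimated by Burke et al.

```python
# Illustrative hump-shaped temperature/growth response with an optimum at 13 C.
# The coefficients are made up for this sketch; they are NOT the Burke et al.
# (2015) estimates.

def growth_effect(temp_c, a=0.013, b=-0.0005):
    """Contribution of average temperature to annual economic growth (toy model)."""
    return a * temp_c + b * temp_c ** 2

print(f"Optimum: {-0.013 / (2 * -0.0005):.0f} C")  # vertex of the parabola: 13 C

# One degree of warming helps a cold country, barely affects one at the optimum,
# and hurts a hot one.
for t in (5, 13, 25):
    delta = growth_effect(t + 1) - growth_effect(t)
    print(f"{t} C -> {t + 1} C: growth changes by {delta:+.4f}")
```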

There is some evidence that rich countries are slightly less affected by changes in temperature (the curve is a little flatter for them). There are few hot and wealthy countries examined in the study, so any general conclusions about them cannot be certain, but the evidence still points to them being more prone to damage from rising temperature than rich, cooler countries. No matter how rich you are, extra heat hurts the warm lands more than it does the temperate and the cool. You can’t buy your way out of the effects of global warming, except by moving away from the Equator or up into the highlands.

Read more...

15 comments


B.C. lowballing fugitive methane emissions from natural gas industry

Posted on 8 October 2015 by Andy Skuce

This article was first published in the Corporate Knights Magazine.

There is a supplementary article at my blog Critical Angle, which has more detail, links and references, along with an estimate of the GHG emissions (excluding end-use) associated with one liquefied natural gas project and the effect this will have on the feasibility of BC reaching its emissions targets.

There is a further piece at DeSmog Canada, where I compare the situation in BC's gas industry with the Volkswagen emissions reporting scandal, in which a corporation cheats on its emissions tests, with the tacit approval of industry-friendly regulators and governments, only to be exposed by independent researchers performing tests in real-world situations.

The push by British Columbia to develop a new liquefied natural gas (LNG) export industry raises questions about the impact such activities would have on greenhouse gas emissions, both within the province and globally.

One of the single most important factors relates to the amount of methane and carbon dioxide that gets released into the atmosphere, either deliberately through venting or by accident as so-called fugitive emissions. Fugitive emissions are the result of valves and meters that release, by design, small quantities of gas. But they can also come from faulty equipment and from operators that fail to follow regulations.

Photo by Jesús Rodríguez Fernández (creative commons)

According to the B.C. Greenhouse Gas Inventory Report 2012, there were 78,000 tonnes of fugitive methane emissions from the oil and natural gas industry that year. B.C. produced 41 billion cubic metres of gas in 2012. This means about 0.28 per cent of the gas produced was released into the atmosphere.

By North American standards, this is a very low estimate. The U.S. Environmental Protection Agency (EPA) uses a figure of 1.5 per cent leakage, more than five times higher. Recent research led by the U.S. non-profit group, Environmental Defense Fund (EDF), shows that even the EPA estimates may be too low by a factor of 1.5. B.C.’s estimate, in other words, would be about one-eighth of what has been estimated for the American gas industry.

Although the amounts of methane released are small compared to carbon dioxide emissions, methane matters because it packs a much bigger global warming punch. Determining the effect of methane emissions is complicated because molecules of methane only last in the atmosphere for a decade or so and the warming effect from its release depends on the time interval it is measured over. Compared to a given mass of carbon dioxide, the same mass of methane will produce 34 times as much warming over 100 years, or 86 times as much over 20 years.
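For those who want to check the arithmetic, here is a rough Python sketch reproducing the 0.28 per cent figure and the warming comparison above. The methane density at standard conditions (~0.68 kg/m³) is an assumption of mine, not a number given in the article.

```python
# Rough reproduction of the leakage and warming figures quoted above.
# Assumption (not in the article): methane density of ~0.68 kg/m3 at standard
# conditions, used to convert tonnes of methane to cubic metres of gas.

fugitive_ch4_t = 78_000   # tonnes of fugitive methane, B.C. inventory 2012
gas_produced_m3 = 41e9    # cubic metres of gas produced in 2012
ch4_density_kg_m3 = 0.68  # assumed

leak_fraction = (fugitive_ch4_t * 1000 / ch4_density_kg_m3) / gas_produced_m3
print(f"Leak rate: {leak_fraction:.2%}")  # ~0.28%

# EPA estimate (1.5%) times the EDF correction (1.5x) versus the B.C. figure:
print(f"B.C. is ~1/{0.015 * 1.5 / leak_fraction:.0f} of the U.S. estimate")  # ~1/8

# The warming impact depends on the time horizon (GWPs quoted in the post):
for years, gwp in ((100, 34), (20, 86)):
    print(f"GWP-{years}: {fugitive_ch4_t * gwp / 1e6:.1f} Mt CO2e")  # 2.7 / 6.7
```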

Read more...

0 comments


Are we overestimating our global carbon budget?

Posted on 15 July 2015 by Andy Skuce

The latest research suggests that natural sinks of carbon on land may be slowing or even turning into sources, creating climate consequences potentially worse than first thought.

Nature has provided humans with a buffer against the worst effects of our carbon pollution. Since 1750, we have emitted about 580 billion tonnes of carbon into the atmosphere by burning fossil fuels, cutting down forests and making cement. If those emissions had simply accumulated in the air, the concentration of carbon dioxide would have increased from 280 parts per million (ppm), as it was before the Industrial Revolution, to about 550 ppm today. Instead, we currently measure around 400 ppm, which is still a whopping 40 per cent above the planet’s pre-industrial atmosphere, but much less than a doubling.

Some 60 per cent of our emissions have been absorbed by natural sinks, in roughly equal parts by dissolving into the ocean and by uptake from plants growing faster on land. Were it not for these natural carbon sinks, we would by now be much deeper into dangerous climate change.
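A back-of-the-envelope Python check of those numbers follows; the conversion of roughly 2.13 GtC per ppm of atmospheric CO2 is a standard approximation I am supplying, not a figure from the article.

```python
# Back-of-the-envelope check of the concentration figures quoted above.
# Assumption (not in the article): ~2.13 GtC of carbon per ppm of CO2 in the air.

emissions_gtc = 580      # cumulative emissions since 1750, GtC
preindustrial_ppm = 280
gtc_per_ppm = 2.13       # assumed conversion factor

no_sinks = preindustrial_ppm + emissions_gtc / gtc_per_ppm
with_sinks = preindustrial_ppm + 0.4 * emissions_gtc / gtc_per_ppm  # ~40% stays airborne

print(f"No natural sinks: ~{no_sinks:.0f} ppm")          # ~550 ppm
print(f"~60% absorbed by sinks: ~{with_sinks:.0f} ppm")  # ~390 ppm, near the measured ~400 ppm
```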

As we continue to burn fossil fuels, our climate troubles will become worse should those sinks start to falter. And the outlook will be worse still if those sinks turn into sources of carbon. 

New research

According to the latest research, the carbon sink on land is unfortunately starting to show signs of trouble. Instead of providing a brake on human emissions, the land carbon sink could instead soon be giving our emissions a boost. (The ocean sink appears to be relatively safe for now, although there is a price to pay: the consequence of the process of carbon dioxide dissolving into seawater, ocean acidification, has been called climate change’s evil twin with its own, non-climate related consequences for marine life.)

Plants don’t thrive solely on carbon dioxide.

Read more...

10 comments


Carbon cycle feedbacks and the worst-case greenhouse gas pathway

Posted on 7 July 2015 by Andy Skuce

The worst-case emissions pathway, RCP8.5, is a scenario that burns a huge amount of fossil fuels, especially coal. The model has sometimes been criticized as implausible because of its huge resource consumption and emissions of ~1700 billion tonnes of carbon (GtC) over the century. Those emissions are based in part on carbon cycle model assumptions, which recent work suggests may be too optimistic. New research shows that future plant growth may be restricted by nutrient availability, turning the land carbon sink into a source. Permafrost feedbacks (not considered in IPCC CMIP5 models) may also add significant emissions to the atmosphere under the RCP8.5 pathway. In addition, the latest research on the Amazon Basin reveals that the tropical forest carbon sinks may already be diminishing there. Together, these feedbacks suggest that the greenhouse gas concentrations in the RCP8.5 case could be achieved with ~400 GtC smaller human emissions, making the RCP8.5 worst-case scenario more plausible.

The climate models referred to in the recent IPCC Fifth Assessment Report (AR5) are founded on one of four Representative Concentration Pathways or RCPs. The key word in RCP is concentration. In the RCPs, the concentration of greenhouse gases is fixed at different times in the future and the climate model (or general circulation model or GCM) uses those atmospheric concentrations to calculate future climate states. Underpinning the concentration pathways are socio-economic and emissions scenarios. There can be more than one underlying emissions scenario capable of producing the concentration pathway.

If you are unfamiliar with RCPs, check out the great guide that Graham Wayne wrote in August 2013 for Skeptical Science.

This way of modelling differs from previous approaches in which the starting point was a story or scenario about economic and social development that led to emissions. These emissions are run through a carbon-cycle model (which may be simple or complex) to produce atmospheric concentrations over time. 

The schematic illustrates the differences in approach. The elements in red boxes are the prescribed inputs into the models, whereas the elements in blue ellipses are outputs. The advantage of the RCP prescribed-concentration approach is that the climate model outputs do not depend to the same degree on carbon-cycle models as they did in the emissions scenario method. The disadvantage is that there is no unique link between concentrations and emissions. The schematic is simplified in that there are feedbacks and loops in the processes that are not illustrated. 

The worst-case scenario among the four Representative Concentration Pathways (RCPs) is known as RCP8.5. The number “8.5” refers to the radiative forcing level measured in W/m2 in the year 2100. RCP8.5, despite it often being called “business-as-usual”, has been criticized as an unlikely outcome. While true, that’s more feature than bug, since, as one of the two extreme pathways, it is designed to provide climate modellers with an unlikely, but still just plausible “how bad could it be” scenario.
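As a rough illustration of what an 8.5 W/m2 forcing implies, here is a short Python sketch using the simplified logarithmic CO2 forcing expression of Myhre et al. (1998); the formula is my addition and is not discussed in the post.

```python
import math

# Simplified CO2 forcing expression (Myhre et al., 1998): dF = 5.35 * ln(C / C0).
# Added here for illustration; it is not part of the post.

def forcing_wm2(conc_ppm, c0_ppm=280.0):
    """Approximate radiative forcing (W/m2) for a CO2-equivalent concentration."""
    return 5.35 * math.log(conc_ppm / c0_ppm)

def conc_ppm_for(forcing, c0_ppm=280.0):
    """Invert the expression: CO2-equivalent concentration for a given forcing."""
    return c0_ppm * math.exp(forcing / 5.35)

print(f"8.5 W/m2 corresponds to ~{conc_ppm_for(8.5):.0f} ppm CO2-equivalent")  # ~1370
print(f"A doubling of CO2 gives ~{forcing_wm2(560):.1f} W/m2")                 # ~3.7
```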

Let’s look briefly at some of the underlying socio-economic assumptions behind RCP8.5, then we’ll examine how the latest research on the terrestrial carbon cycle makes the GHG concentrations in the RCP8.5 model easier to reach.

Read more...

19 comments


Why the 97 per cent consensus on climate change still gets challenged

Posted on 18 May 2015 by Andy Skuce

Here are some excerpts from an article I wrote for the magazine Corporate Knights, published on May 14, 2015. Some references and links have been added at the end.

In 2004, science historian Naomi Oreskes published a short paper in the journal Science concluding there was an overwhelming consensus in the scientific literature that global warming was caused by humans.

After the paper’s release, there was some unexpectedly hostile reaction. This prompted Oreskes and her colleague Erik Conway to go even deeper with their research, leading to the publication of the book Merchants of Doubt. It documents how a small group of scientists with links to industry were able to sow doubt about the scientific consensus and delay effective policy on DDT, tobacco, acid rain and, now, global warming.

Fast forward to two years ago: a team of volunteer researchers (myself included) associated with the website Skeptical Science decided to update and extend Oreskes’ research. Led by University of Queensland researcher John Cook, we analyzed the abstracts of about 12,000 scientific papers extracted from a large database of articles, using the search terms “global warming” and “global climate change.” The articles had been published over a 21-year period, from 1991 to 2011.

As an independent check on our results, we also sent emails to the more than 8,500 scientist authors of these articles. (These were the scientists whose e-mail addresses we were able to track down). We asked them to rate their own papers for endorsement or rejection of man-made global warming.

Both approaches yielded a very similar result: 97 per cent of the scientific literature that expresses an opinion on climate change endorses the expert consensus view that it is man-made. The results were published in May 2013 in the journal Environmental Research Letters.

We were astonished by the positive reception. Mention of the paper was tweeted by U.S. President Barack Obama, Al Gore and Elon Musk, among others. Obama later referenced it in a speech at the University of Queensland, while U.S. Secretary of State John Kerry has referred to the 97 per cent consensus in recent speeches. John Oliver based an episode of his HBO comedy show Last Week Tonight around it, a clip viewed online more than five million times.

Read more...

32 comments


