Recent blog posts
Posted on 30 September 2016 by Andy Skuce
This article was originally published online at Corporate Knights Magazine and will appear in the publication's Fall 2016 hard-copy magazine.
Climate scientists are certain that human-caused emissions have increased carbon dioxide in the atmosphere by 44 per cent since the Industrial Revolution. Very few of them dispute that this has already caused average global temperatures to rise roughly 1 degree. Accompanying the warming is disruption to weather patterns, rising sea levels and increased ocean acidity. There is no doubt that further emissions will only make matters worse, possibly much worse. In a nutshell, that is the settled science on human-caused climate change.
What scientists cannot yet pin down is exactly how much warming we will get in the future. They do not know with precision how much a given quantity of emissions will lead to increased concentrations of greenhouse gases in the atmosphere. For climate impact it is the concentrations that matter, not the emissions. Up until now, 29 per cent of human emissions of carbon dioxide has been taken up by the oceans, 28 per cent has been absorbed by plant growth on land, and the remaining 43 per cent has accumulated in the atmosphere. Humans have increased carbon dioxide concentrations in the atmosphere from a pre-industrial level of 280 parts per million to over 400 today, a level not seen for millions of years.
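The partitioning described above can be checked with simple arithmetic. This sketch assumes the commonly used conversion factor of roughly 2.13 gigatonnes of carbon (GtC) per ppm of atmospheric CO2, which is not a figure from the article:

```python
# Back-of-envelope check of the carbon partitioning described above.
# Assumption (not from the article): 1 ppm of atmospheric CO2
# corresponds to roughly 2.13 GtC.

GTC_PER_PPM = 2.13  # approximate conversion factor

ocean_frac, land_frac, air_frac = 0.29, 0.28, 0.43

ppm_rise = 400 - 280  # pre-industrial 280 ppm -> over 400 ppm today
carbon_in_air = ppm_rise * GTC_PER_PPM    # carbon that stayed airborne
total_emitted = carbon_in_air / air_frac  # implied cumulative emissions

print(f"Carbon added to atmosphere: {carbon_in_air:.0f} GtC")
print(f"Implied cumulative emissions: {total_emitted:.0f} GtC")
print(f"Fractions sum to {ocean_frac + land_frac + air_frac:.2f}")
```

Working backwards from the 120 ppm rise and the 43 per cent airborne fraction gives cumulative emissions on the order of 600 GtC, consistent in magnitude with published estimates of historical fossil and land-use emissions.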
There’s a possibility that the 43 per cent atmospheric fraction may increase as ocean and terrestrial carbon sinks start to become saturated. This means that a given amount of emissions will lead to a bigger increase in concentrations than we saw before. In addition, the warming climate may well provoke increased emissions from non-fossil fuel sources. For example, as permafrost thaws, the long-frozen organic matter contained within it rots and oxidizes, giving off greenhouse gases. Nature has given us a major helping hand, so far, by the oceans and plants taking up more than half of our added fossil carbon, but there’s no guarantee that it will continue to be so supportive forever. These so-called carbon-cycle feedbacks will play a big role in determining how our climate future will unfold, but they are not the largest unknown.
Atmospheric physicists have long tried to pin down a number to express what they refer to as climate sensitivity, the amount of warming we will get from a certain increase in the concentration of greenhouse gases. Usually, this is expressed as the average global warming, in degrees Celsius, that results from a doubling of carbon dioxide concentrations. The problem is not so much calculating how much warming the doubling of carbon dioxide alone will cause – that is relatively easy to estimate, at about 1 degree C. The big challenge is figuring out the size of the feedbacks. These are the phenomena that arise from warming temperatures and that amplify or dampen the direct effects of the greenhouse gases that humans have added to the atmosphere.
The biggest feedback is water vapour, which is actually the most important single greenhouse gas in the atmosphere. Warm air holds more water vapour. As carbon dioxide increases and the air warms, there is plenty of water on land and in the sea available to evaporate. The increased amount of vapour in the air, in turn, provokes more warming and increased evaporation. If temperatures go down, the water vapour condenses and precipitates out of the atmosphere as rain and snow. Water vapour goes quickly into and out of the air as temperatures rise and fall, but the level of carbon dioxide stays around for centuries, which is why water vapour is considered a feedback and not a forcing agent. Roughly speaking, the water vapour feedback increases the sensitivity of carbon dioxide alone from 1 to 2 degrees C.
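The "about 1 degree C" of direct, no-feedback warming mentioned above can be reproduced with the widely used simplified forcing expression of Myhre et al. (1998). The Planck-response value of roughly 0.3 K per W/m² is an approximate textbook number, not a figure taken from the article:

```python
import math

# Simplified CO2 forcing formula (Myhre et al., 1998):
#   dF = 5.35 * ln(C / C0)   [W/m^2]
# combined with an approximate no-feedback (Planck) response
# of ~0.3 K per W/m^2 -- both standard textbook values.

def co2_forcing(c_ppm, c0_ppm):
    """Radiative forcing (W/m^2) from raising CO2 from c0_ppm to c_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

f_doubling = co2_forcing(560, 280)  # doubling pre-industrial CO2
planck_response = 0.3               # K per W/m^2, no feedbacks
warming_no_feedback = planck_response * f_doubling

print(f"Forcing from doubling CO2: {f_doubling:.1f} W/m^2")
print(f"No-feedback warming: {warming_no_feedback:.1f} C")
```

The forcing from a doubling works out to about 3.7 W/m², and the no-feedback warming to about 1.1 °C, matching the "about 1 degree C" quoted in the text; the water vapour feedback then roughly doubles that figure.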
Posted on 9 September 2016 by Andy Skuce
This is a re-post from Critical Angle
Climate scientist Michael Mann has teamed up with cartoonist Tom Toles to write The Madhouse Effect: How Climate Change Is Threatening Our Planet, Destroying Our Politics and Driving Us Crazy. It’s an excellent book, well-written, authoritative on the science, revealing on the politics and laced with the wit of many superb cartoons. Buy a copy for the climate science doubter in your family. They will be drawn in by the cartoons and may well be unable to resist dipping in to the text.
Michael Mann has previously written The Hockey Stick and the Climate Wars: Dispatches from the Front Lines, about how he was hounded for writing a paper that featured the striking hockey-stick graph. He also authored Dire Predictions: Understanding Climate Change with scientist Lee Kump. At the same time that he turns out first-class books, Mann is a prolific research scientist and has an active presence on social media. You can only wonder how he does it all.
Tom Toles is a Pulitzer Prize-winning cartoonist who works for the Washington Post. His main focus is politics, but his cartoons have often featured climate science and the absurd lengths that many American politicians go to in avoiding facing up to the reality of climate change.
Writing about scientific subjects like climate change for the non-specialist is not easy and authors have to walk a fine line. Many readers expect scientists to be detached about the implications of their work, but that would make their message less engaging, less human. The science needs to be explained in ways that the average person can understand, but oversimplification can gloss over some of the important complications. And treatments of the topic can so easily be depressing and dull. The Mann/Toles team have succeeded in bringing their talents together to overcome these problems. The writing is excellent and the cartoons add a much-needed satirical perspective.
Posted on 5 May 2016 by Andy Skuce
Originally published in Corporate Knights Magazine
In 1998 and 1999, American scientists Michael Mann, Raymond Bradley and Malcolm Hughes published two papers that reconstructed the average temperatures of the northern hemisphere back to the year 1000. The articles showed a temperature profile that gently declined from 1000 to 1850, fluctuating a little along the way, with a sudden increase in the late nineteenth and the twentieth centuries. The graph was nicknamed “the Hockey Stick”, with its long, relatively straight handle showing the stable pre-industrial climate and the blade representing the sudden uptick in the last 150 years.
The diagram was a striking depiction of the abrupt warming that had occurred since the Industrial Revolution compared to what happened before. For those opposed to the scientific consensus on Anthropogenic Global Warming (AGW), the Hockey Stick posed a threat and had to be broken.
As detailed in Mann’s 2013 book The Hockey Stick and the Climate Wars: Dispatches from the Front Lines, his critics employed a variety of tactics to try to break the hockey stick. They disputed the statistical methods that Mann and his colleagues used, although they never produced new results of their own. Stolen private conversations were quote-mined for damning phrases. Senior US politicians and the right-wing press denounced the work as a fraud.
Mann and other scientists were subjected to numerous investigations, all of which exonerated the Hockey Stick authors. Most importantly, other researchers, using alternative methods and new data, produced additional temperature curves that closely matched the original results of Mann et al. Nevertheless, the attacks on the original Hockey Stick continued, as has the harassment of Mann by right-wing pundits. If you need to deny the consensus on AGW, you have to keep repeating that the “Hockey Stick is Broken”. Never mind that it is intact and that there are enough new sticks to equip an NHL team.
There are parallels with the reception given to the paper Quantifying the consensus on anthropogenic global warming in the scientific literature, published in 2013 by University of Queensland researcher John Cook and eight volunteers associated with the website Skeptical Science (including me). The paper, published in the journal Environmental Research Letters (ERL), has been deemed a hoax and a fraud by contrarian bloggers as well as by Republican presidential hopefuls such as Ted Cruz and Rick Santorum.
Posted on 12 April 2016 by Andy Skuce
This is reposted from Critical Angle with slight modifications and updates.
In a recent article in Skeptical Inquirer, geologist and writer James Lawrence Powell claims that there is a 99.99% scientific consensus on Anthropogenic Global Warming (AGW). You might think that, after all of the harsh criticism the 2013 Cook et al. paper (C13) has received from climate contrarians, we would be pleased to embrace the results of a critique claiming that we were far too conservative in assessing the consensus. While it certainly does make a nice change from the usual rants and overblown methodological nit-picks from the contrarians, Powell is wrong to claim such a very high degree of agreement.
He makes many of the same errors that contrarian critics make: ignoring the papers self-rated by the original authors; and making unwarranted assumptions about what the “no-position” abstracts and papers mean.
Powell’s methodology was to search the Web of Science to review abstracts from 2013 and 2014. He added the search term “climate change” to the terms “global climate change” and “global warming” that were used by C13. He examined 24,210 papers co-authored by 69,406 scientists and found only five papers written by four authors that explicitly reject AGW. Assuming the rest of the abstracts endorsed AGW, this gives consensus figures of 99.98% (by abstract) and 99.99% (by author).
His definition of explicit rejection would align roughly with the seventh level of endorsement used in C13: “Explicitly states that humans are causing less than half of global warming”. In the abstracts from 1991-2011, C13 found 9 out of 11,914 that fit level 7, which, using Powell’s consensus calculation assumptions, would yield 99.92%. So, there is probably not much difference between the two approaches when it comes to identifying an outright rejection paper. It’s what you assume the other abstracts say—or do not say—that is the problem.
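The percentages quoted in these two paragraphs all follow from the same assumption, that every abstract which is not an explicit rejection counts as an endorsement. A quick sketch of that arithmetic:

```python
# Reproducing the consensus percentages quoted above, under Powell's
# assumption that everything not explicitly rejecting AGW endorses it.

def consensus_pct(total, rejecting):
    """Percent of items that are not explicit rejections."""
    return 100 * (total - rejecting) / total

powell_by_abstract = consensus_pct(24210, 5)  # Powell: 5 of 24,210 abstracts
powell_by_author = consensus_pct(69406, 4)    # Powell: 4 of 69,406 authors
c13_level7 = consensus_pct(11914, 9)          # C13: 9 level-7 abstracts of 11,914

for label, pct in [("Powell, by abstract", powell_by_abstract),
                   ("Powell, by author", powell_by_author),
                   ("C13, level-7 rejections only", c13_level7)]:
    print(f"{label}: {pct:.2f}%")
```

This reproduces the 99.98%, 99.99% and 99.92% figures, and makes the underlying point visible: once outright rejections are this rare, the headline number is driven almost entirely by what you assume about the "no-position" abstracts.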
C13 also counted as “reject AGW” abstracts that: “Implies humans have had a minimal impact on global warming without saying so explicitly, e.g., proposing a natural mechanism is the main cause of global warming”. These are more numerous than the explicit rejections and include papers by scientists who consider that natural causes are more important than human causes in recent warming, but who do not outright reject some small human contribution.
Competing Climate Consensus Pacmen. Cook on the left, Powell on the right.
Posted on 24 March 2016 by Andy Skuce
Originally published at Corporate Knights on March 17, 2016.
Last year was the warmest year recorded since the measurement of global surface temperatures began in the nineteenth century. The second-warmest year ever was 2014. Moreover, because of the persisting effects of the equatorial Pacific Ocean phenomenon known as El Niño, many experts are predicting that 2016 could set a new annual record. January and February have already set new monthly records, with February half a degree Celsius warmer than any previous February on record.
This news is deeply unsettling for those who care about the future of the planet. But it is even more upsetting for people opposed to climate mitigation, since it refutes their favourite talking point – that global warming has stalled in recent years.
U.S. Congressman Lamar Smith claims there has been a conspiracy among scientists to fudge the surface temperature records upwards and has demanded, by subpoena, to have scientists’ emails released.
Senator and presidential candidate Ted Cruz recently organized a Senate hearing on the temperature record in which he called upon carefully selected witnesses to testify that calculations of temperature made by satellite observations of the upper atmosphere are superior to measurements made by thermometers at the Earth’s surface.
It’s easy to cherry-pick data in order to bamboozle people. The process of making consistent temperature records from surface measurements and satellite observations is complicated and is easy to misrepresent.
But the fact remains that there are no conspiracies afoot. Here’s why.
Posted on 13 January 2016 by Andy Skuce
This article was originally published online at Corporate Knights and will appear in the hard copy Winter 2016 Edition of the Corporate Knights Magazine, which is to be included as a supplement to the Globe and Mail and Washington Post later in January 2016. The photograph used in the original was changed for copyright reasons.
Human civilization developed over a period of 10,000 years during which global average surface temperatures remained remarkably stable, hovering within one degree Celsius of where they are today.
If we are to keep future temperatures from getting far outside that range, humanity will be forced to reduce fossil fuel emissions to zero by 2050. Halving our emissions is not good enough: we need to get down to zero to stay under the 2 C target that scientists and policy makers have identified as the limit beyond which global warming becomes dangerous.
Shell boasting about its government-funded Quest CCS project, on a Toronto bus. (Photo: rustneversleeps) "Shell Quest captures over one-third of our oil sands upgrader emissions"
Many scenarios have been proposed to get us there. Some of these involve rapid deployment of solar and wind power in conjunction with significant reductions in the amount of energy we consume.
However, many of the economists and experts who have developed scenarios for the Intergovernmental Panel on Climate Change (IPCC) believe that the only way to achieve the two-degree goal in a growing world economy is to invest in large-scale carbon capture and storage (CCS) projects. These technologies capture carbon dioxide from the exhausts of power stations and industrial plants and then permanently store it, usually by injecting it into underground rock layers.
Even with massive deployment of CCS over coming decades, most scenarios modelled by the IPCC overshoot the carbon budget and require that in the latter part of the century, we actually take more carbon out of the atmosphere than we put into it. Climate expert Kevin Anderson of the Tyndall Centre for Climate Change Research at the University of Manchester recently reported in Nature Geoscience that, of the 400 IPCC emissions scenarios used in the 2014 Working Group report to keep warming below two degrees, some 344 require the deployment of negative emissions technologies after 2050. The other 56 models assumed that we would start rapidly reducing emissions in 2010 (which, of course, did not happen). In other words, negative emissions are required in all of the IPCC scenarios that are still current.
One favoured negative emissions technology is bioenergy with carbon capture and storage (BECCS). This involves burning biomass – such as wood pellets – in power stations, then capturing the carbon dioxide and burying it deep in the earth. The technology has not yet been demonstrated at an industrial scale. Using the large amounts of bioenergy envisioned in such scenarios will place huge demands on land use and will conflict with agriculture and biodiversity needs.
Posted on 31 December 2015 by Andy Skuce
On Sunday November 22nd, 2015, Alberta's new centre-left Premier, Rachel Notley, announced that the province would be introducing an economy-wide carbon tax priced at $30 per tonne of CO2 equivalent, to be phased in in 2016 and 2017. Observers had been expecting new efforts to mitigate emissions since Notley's election in May 2015, but the scope and ambition of this policy took many by surprise.
Alberta, of course, is the home of the Athabasca oil sands and is one of the largest per-capita GHG emitters of any jurisdiction in the world. The new plan was nevertheless endorsed by environmental groups, First Nations and by the biggest oil companies, an extraordinary consensus that many would not have thought possible.
How was this done? I will try to explain the new policy as far as I can (the details are not all available yet), but the short answer is that a huge amount of credit is due to the panel of experts led by University of Alberta energy economist Andrew Leach and his fellow panelists. Not only did they listen to what all Albertans had to say, but they were thoughtful in framing a policy that is acceptable to almost everyone.
Alberta is the wealthiest province in Canada, with a population of 4.1 million. In 2013, greenhouse gas emissions were 267 Mt CO2 equivalent, about 65 tonnes per capita, which compares with the average for the rest of Canada of about 15 tonnes. Among US states, only North Dakota and Wyoming are worse. Alberta's fugitive emissions of methane alone amount to 29 Mt CO2e, about 7 tonnes per person, which is a little more than the world's average per-capita emissions of all greenhouse gases combined.
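The per-capita figures above follow directly from the provincial totals. A quick check of the arithmetic:

```python
# Checking the Alberta per-capita figures quoted above.

population_millions = 4.1  # Alberta population
total_ghg_mt = 267         # 2013 GHG emissions, Mt CO2e
methane_mt = 29            # fugitive methane emissions, Mt CO2e

# Mt divided by millions of people gives tonnes per person.
per_capita = total_ghg_mt / population_millions
methane_per_capita = methane_mt / population_millions

print(f"Total GHG per capita: {per_capita:.0f} t CO2e")           # ~65 t
print(f"Fugitive methane per capita: {methane_per_capita:.1f} t")  # ~7 t
```

At roughly 65 tonnes per person, Alberta's emissions are more than four times the roughly 15-tonne average for the rest of Canada.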
Posted on 9 December 2015 by Andy Skuce
In the first part of this series, I examined the implications of relying on CCS and BECCS to get us to the two degree target. In the second part, I took a detailed look at Kevin Anderson's arguments that IPCC mitigation scenarios aimed at two degrees are biased towards unproven negative-emissions technologies and that they consequently downplay the revolutionary changes to our energy systems and economy that we must make very soon. In this last part, I'm going to look at the challenges that the world faces in fairly allocating future emissions from our remaining carbon budget and raising the money needed for climate adaptation funds, taking account of the very unequal past and present.
Until now, economic growth has been driven and sustained largely by fossil fuels. Europe and North America started early with industrialization and, from 1800 up to around 1945, this growth was driven mainly by coal. After the Second World War there was a period of rapid (~4% per year) economic growth in Europe, North America and Japan, lasting about thirty years, that the French refer to as Les Trente Glorieuses, The Glorious Thirty. This expansion was accompanied by a huge rise in the consumption of oil, coal and natural gas. After this there was a thirty-year period of slower growth (~2%) in the developed economies, with consumption fluctuations caused by oil-price shocks and the collapse of the Soviet Union. During this time, oil and coal consumption continued to grow, but not as steadily as before. Then, at the end of the twentieth century, economic growth took off in China, with a huge increase in the consumption of coal.
If we are to achieve a stable climate, we will need to reverse this growth in emissions over a much shorter time period, while maintaining the economies of the developed world and, crucially, allowing the possibility of economic growth for the majority of humanity that has not yet experienced the benefits of a developed-country middle-class lifestyle.
Here are the annual emissions sorted by country and region:
Posted on 26 November 2015 by Andy Skuce
The first part of this three-part series looked at the staggering magnitude and the daunting deployment timescale available for the fossil fuel and bioenergy carbon capture and storage technologies that many 2°C mitigation scenarios assume. In this second part, I outline Kevin Anderson's argument that climate experts are failing to acknowledge the near-impossibility of avoiding dangerous climate change under current assumptions of the political and economic status quo, combined with unrealistic expectations of untested negative-emissions technologies.
Kevin Anderson has just written a provocative article titled: Duality in climate science, published in Nature Geoscience (open access text available here). He contrasts the up-beat pronouncements in the run-up to the Paris climate conference in December 2015 (e.g. “warming to less than 2°C” is “economically feasible” and “cost effective”; “global economic growth would not be strongly affected”) with what he see as the reality that meeting the 2°C target cannot be reconciled with continued economic growth in rich societies at the same time as the rapid development of poor societies. He concludes that: “the carbon budgets associated with a 2 °C threshold demand profound and immediate changes to the consumption and production of energy”.
His argument runs as follows: Integrated Assessment Models, which attempt to bring together physics, economics and policy, rely on highly optimistic assumptions, specifically:
- Unrealistic early peaks in global emissions;
He notes that of the 400 scenarios that have a 50% or better chance of meeting the 2 °C target, 344 of them assume the large-scale uptake of negative emissions technologies and, in the 56 scenarios that do not, global emissions peak around 2010, which, as he notes, is contrary to the historical data.
I covered the problems of the scalability and timing of carbon capture and storage and negative emissions technologies in a previous article.
Posted on 16 November 2015 by Andy Skuce
This post looks at the feasibility of the massive and rapid deployment of Carbon Capture and Storage and negative-emissions Bioenergy Carbon Capture and Storage technologies in the majority of IPCC scenarios that avoid dangerous global warming. Some observers question whether the deployment of these technologies at these scales and within the required time frames is achievable. This is Part One of a three-part series on the challenge of keeping global warming under 2 °C.
The various emissions models that have been used to produce the greenhouse gas concentration pathway to 2° Celsius vary considerably, but the majority of them require huge deployment of Carbon Capture and Storage (CCS) as well as net-negative global emissions in the latter part of the twenty-first century. The only negative emissions methods generally considered in these scenarios are bioenergy with carbon capture and storage (BECCS) and land-use changes, such as afforestation. For there to be net-negative emissions, positive emissions have to be smaller than the negative emissions.
Kevin Anderson (2015) (open-access text) reports that of the 400 scenarios that have a 50% chance or greater of no more than 2 °C of warming, 344 assume large-scale negative emissions technologies. The remaining 56 scenarios have emissions peaking in 2010, which, as we know, did not happen.
Sabine Fuss et al. (2014) (pdf) demonstrate that of the 116 scenarios that lead to concentrations of 430-480 ppm of CO2 equivalent, 101 of them require net negative emissions. Most scenarios that have net-negative emissions have BECCS providing 10-30% of the world’s primary energy in 2100.
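The Anderson (2015) and Fuss et al. (2014) counts quoted above both tell the same story when expressed as shares of their scenario sets:

```python
# Share of 2C-compatible scenarios relying on negative emissions,
# using the Anderson (2015) and Fuss et al. (2014) counts quoted above.

anderson_total, anderson_neg = 400, 344  # scenarios with >=50% chance of <2C
fuss_total, fuss_neg = 116, 101          # scenarios reaching 430-480 ppm CO2e

anderson_share = 100 * anderson_neg / anderson_total
fuss_share = 100 * fuss_neg / fuss_total

print(f"Anderson: {anderson_share:.0f}% assume negative emissions")    # 86%
print(f"Fuss et al.: {fuss_share:.0f}% require net-negative emissions")  # 87%
```

Both studies, despite using differently constructed scenario sets, put the share of two-degree pathways dependent on negative emissions at close to nine in ten.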
From Fuss et al. (2014), showing the historical emissions (black), the four RCPs (heavy coloured lines) and 1089 scenarios assigned to one of the RCPs (light coloured lines).