
Huber and Knutti Quantify Man-Made Global Warming

Posted on 10 December 2011 by dana1981

Huber and Knutti (2011) have published a paper in Nature Geoscience, Anthropogenic and natural warming inferred from changes in Earth's energy balance. Their approach applies the principle of conservation of energy to the global energy budget in order to quantify the various contributions to the observed global warming since 1850 and since 1950. Over both timeframes, the authors find that human greenhouse gas emissions are the dominant cause of global warming.

Huber and Knutti summarize their methodology as follows:

"We use a massive ensemble of the Bern2.5D climate model of intermediate complexity, driven by bottom-up estimates of historic radiative forcing F, and constrained by a set of observations of the surface warming T since 1850 and heat uptake Q since the 1950s....Between 1850 and 2010, the climate system accumulated a total net forcing energy of 140 x 1022 J with a 5-95% uncertainty range of 95-197 x 1022 J, corresponding to an average net radiative forcing of roughly 0.54 (0.36-0.76)Wm-2."

Essentially, Huber and Knutti take the estimated global heat content increase since 1850, calculate how much of the increase is due to various estimated radiative forcings, and partition the increase between increasing ocean heat content and outgoing longwave radiation.  The authors note that more than 85% of the global heat uptake (Q) has gone into the oceans, including increasing the heat content of the deeper oceans, although their model only accounts for the upper 700 meters.
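As a quick sanity check (a minimal sketch, not the authors' own code), the quoted 140 × 10²² J of accumulated energy can be converted back into the ~0.54 W m⁻² average forcing by dividing by Earth's surface area and the number of seconds between 1850 and 2010:

```python
# Back-of-the-envelope check of the quoted numbers (not from the paper's code).
# Assumed constants: Earth's mean radius and a 160-year (1850-2010) interval.
import math

E_total = 140e22                       # J, net forcing energy accumulated 1850-2010
seconds = (2010 - 1850) * 365.25 * 24 * 3600
area = 4 * math.pi * 6.371e6 ** 2      # Earth's surface area in m^2

forcing = E_total / (area * seconds)   # average net radiative forcing, W/m^2
print(f"Average net radiative forcing: {forcing:.2f} W/m^2")  # ~0.54
```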

Figure 1 is a graphic similar to that presented in Meehl et al. (2004), comparing the average global surface warming simulated by the model using natural forcings only (blue), anthropogenic forcings only (red), and the combination of the two (gray).


Figure 1: Time series of anthropogenic and natural forcings contributions to total simulated and observed global temperature change. The coloured shadings denote the 5-95% uncertainty range.

In Figure 2, Huber and Knutti break down the anthropogenic and natural forcings into their individual components to quantify the amount of warming caused by each since the 1850s (Figure 2b), 1950s (2c), and projected from 2000 to 2050 using the IPCC SRES A2 emissions scenario as business-as-usual (2d).


Figure 2: Contributions of individual forcing agents to the total change in the decadal average temperature for three time periods. Error bars denote the 5–95% uncertainty range. The grey shading shows the estimated 5–95% range for internal variability based on the CMIP3 climate models. Observations are shown as dashed lines.

Natural Variability

Some recent articles on this paper have slightly misinterpreted its conclusions, claiming that it attributes three-quarters of the warming since 1950 to humans, and most of the rest to natural variability.  This is not correct, as the authors conclude (emphasis added):

"Our results show that it is extremely likely that at least 74% (+/- 12%, 1 sigma) of the observed warming since 1950 was caused by radiative forcings, and less than 26% (+/- 12%) by unforced internal variability."

Internal variability is shown in the grey shading of Figures 2b to 2d.  While it could account for as much as ~0.15°C warming since 1950, it could also account for ~0.15°C cooling, or anything in between.  What the authors have concluded is that natural variability can very likely account for no more than 26% of the warming since 1950, and no more than 18% since 1850 (and in both cases, the most likely value is close to zero).

The correct interpretation of this statement, confirmed by Dr. Knutti in a personal communication, is that there is 95% certainty that external forcings are responsible for between 74% and 122% of the observed warming since 1950, with a most likely value of close to 100%.  Or as Dr. Knutti put it (personal communication):

"Our best estimate is that close to 100% is forced, but because the whole method is probabilistic, the forced component could be a bit smaller, with some variability contributing to the warming, but the forced part could also be larger than observed, with variability opposing it.  More technically it’s saying that 95% of the PDF [probability distribution function] is above 0.74 times the observed warming."

Additionally, Huber and Knutti note that natural variability cannot account for the observed global warming:

"For global surface temperature it is extremely unlikely (<5% probability) that internal variability contributed more than 26 +/- 12% and 18 +/- 9% to the observed trends over the last 50 and 100 years, respectively.  Even if models were found to underestimate internal variability by a factor of three, it is extremely unlikely that internal variability could produce a trend as large as observed."

Natural Forcings

The authors also note that the relatively small contribution of natural variability to the observed long-term temperature change is consistent with past climate data:

"This is consistent with reconstructions over the last millennium indicating relatively small temperature variations that can mostly be explained by solar and volcanic forcing"

Solar and volcanic activity are the main natural forcings included in the Huber and Knutti study.  Both are slightly positive since 1850, and account for approximately 0.2°C of the observed 0.8°C surface warming over that period.  Since 1950, the volcanic forcing has been negative due to a few significant eruptions, and has offset the modestly positive solar forcing, such that the net natural external forcing contribution to global warming over the past 50 years is approximately zero (more specifically, the authors estimate the natural forcing contribution since 1950 at -10 to +13%, with a most likely value of 1%).

The authors also note that they chose a reconstruction with high variability in solar irradiance, so if anything they may have overestimated the natural contribution to the observed warming.

"Even for a reconstruction with high variability in total irradiance, solar forcing contributed only about 0.07°C (0.03-0.13°C) to the warming since 1950."

Human Forcings

As expected, Huber and Knutti find that greenhouse gases contributed substantial warming since 1850. In fact, greenhouse gases caused greater warming than was observed, because much of that warming was offset by the cooling effect of human aerosol emissions.

"Greenhouse gases contributed 1.31°C (0.85-1.76°C) to the increase, that is 159% (106-212%) of the total warming. The cooling effect of the direct and indirect aerosol forcing is about -0.85°C (-1.48 to -0.30°C). The warming induced by tropospheric ozone and solar variability are of similar size (roughly 0.2°C). The contributions of stratospheric water vapour and ozone, volcanic eruptions, and organic and black carbon are small."

This is similar to, and actually somewhat higher than, the Skeptical Science back-of-the-envelope estimate of 0.8°C warming over this period from CO2 (~1.1°C from all long-lived greenhouse gases). Those who constantly seek to downplay the role of greenhouse gases in the current warming would do well to learn from the results of Huber and Knutti.

Since 1950, the authors find that greenhouse gases contributed 166% (120-215%) of the observed surface warming (0.85°C of the 0.51°C estimated surface warming). The percentage is greater than 100% because aerosols offset approximately 44% (0.45°C) of that warming.

"It is thus extremely likely (>95% probability) that the greenhouse gas induced warming since the mid-twentieth century was larger than the observed rise in global average temperatures, and extremely likely that anthropogenic forcings were by far the dominant cause of warming. The natural forcing contribution since 1950 is near zero."

Climate Sensitivity

Huber and Knutti ran thousands of model simulations and came up with the following climate sensitivity distribution:

"The resulting distribution of climate sensitivity (1.7-6.5°C, 5-95%, mean 3.6°C) is also consistent with independent evidence derived from palaeoclimate archives."

As the authors note, these results are consistent with the climate sensitivity range given by the IPCC and Knutti and Hegerl (2008).
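As an aside, the reported range and mean are mutually consistent. A lognormal distribution (an assumed shape, used purely for illustration, not the paper's actual distribution) fitted to the quoted 1.7-6.5°C range implies a mean of almost exactly 3.6°C:

```python
# Consistency check: fit a lognormal (assumed shape) to the quoted 5-95%
# climate sensitivity range and compute the mean it implies.
import math

p5, p95 = 1.7, 6.5                        # deg C, quoted 5th and 95th percentiles
z95 = 1.645                               # standard normal 95th percentile
mu = (math.log(p5) + math.log(p95)) / 2
sigma = (math.log(p95) - math.log(p5)) / (2 * z95)
mean = math.exp(mu + sigma ** 2 / 2)      # mean of a lognormal distribution
print(f"Implied mean sensitivity: {mean:.1f} deg C")  # ~3.6
```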

The Future

If we continue with business-as-usual emissions scenario A2, Huber and Knutti project approximately 1.3°C warming between the 2000s and 2050s, or more than 2°C above pre-industrial levels, surpassing the "danger limit" by mid-century. Because aerosol emissions will likely decline as a result of clean air efforts and the transition away from fossil fuels, the aerosol cooling effect is expected to diminish by mid-century. Thus, as shown in Figure 2d above, the net warming by the 2050s is projected to be very similar to the greenhouse gas-caused warming.
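The arithmetic behind the "more than 2°C" figure is simple; the sketch below assumes the ~0.8°C of warming from pre-industrial levels to the 2000s quoted earlier in the post:

```python
# Rough arithmetic for the business-as-usual (SRES A2) projection above.
warming_to_2000s = 0.8        # deg C above pre-industrial (quoted earlier in the post)
warming_2000s_to_2050s = 1.3  # deg C, Huber and Knutti's A2 projection

total = warming_to_2000s + warming_2000s_to_2050s
print(f"~{total:.1f} deg C above pre-industrial by the 2050s")  # past the 2 deg C limit
```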

Summary

Overall, Huber and Knutti implement a robust method (using conservation of energy) to confirm that humans are the dominant cause of the observed warming over the past 150 years, and particularly over the past 50 years.  In fact, greenhouse gases have very likely caused more warming than has been observed, due to the offsetting cooling effect from human aerosol emissions.

Huber and Knutti's results are also consistent with the body of climate sensitivity research, with an average value (3.6°C for doubled CO2) slightly higher than the IPCC best estimate (3°C).  And they find that if we continue with business-as-usual, we will reach 2°C above pre-industrial levels within the next few decades.  All the more reason to change course.

Note: the rebuttals to "Increasing CO2 has little to no effect", "It's natural variability", and "It's the sun" have been updated to include the results from Huber and Knutti (2011).


Comments

Comments 1 to 19:

  1. Huber and Knutti write in their paper: "The basis for our energy balance model and a crucial step in determining the contributions of anthropogenic and natural (solar and volcanic) forcings to the observed changes is the magnitude of the internal unforced variability of global temperature and energy content. Figure 4 compares the observed trends in global average temperature and energy content over the past 50 years with the distribution of 50-year linear trends derived from unforced control runs in the World Climate Research Programme's (WCRP) phase 3 Climate Model Intercomparison Project." Where can we find information about this "crucial step"? It is unclear to me how the EBMs / AOGCMs deal with unforced or intrinsic variability, either at low or high frequency, but particularly for hypothetical low-frequency modes (> 30 yrs). If such variability exists, what do we know about its physical mechanisms and how are they implemented in models? What methods do modellers use to assess the quality (realism) of the "unforced control runs" from which they deduce the forced variability? Thank you for any recent link centered on these questions.
  2. skept - I can't answer your question, but you could try to contact Dr. Knutti if you really want an answer. His email is listed in the paper, which is linked at the top of the post.
  3. Link to liberated pdf. This result really just confirms common sense. There has been too much warming in too little time for it to be only coincidentally connected to the known drivers. As Gavin previously noted at RC about the fraction of warming likely due to human forcing:
    Over the last 40 or so years, natural drivers would have caused cooling, and so the warming there has been (and some) is caused by a combination of human drivers and some degree of internal variability. I would judge the maximum amplitude of the internal variability to be roughly 0.1 deg C over that time period, and so given the warming of ~0.5 deg C, I'd say somewhere between 80 to 120% of the warming. Slightly larger range if you want a large range for the internal stuff. - gavin
  4. Dana: Excellent post. I especially like the introductory paragraph that captures the essence of the article. You've set a good example for other SkS authors to follow.
  5. skept.fr @1, information about the CMIP3 model runs can be found here. The data can be downloaded here (requires registration, and may require non-commercial institutional affiliation). The data listed is probably from experiment 1 or experiment 2.
  6. Dana: maybe one of the authors will come here for comments, as Nathan Urban and Andreas Schmittner did in a previous discussion, so I'll wait before disturbing Dr Knutti. (And it would be more informative for SkS readers than a private exchange.) But thanks for the information. Tom: unfortunately, I'm a layman and I cannot interpret the terabytes of data from CMIP3; I'm not even sure I could read them on my computer! I hope my point is addressed in some published and freely available articles in the literature, where climate scientists discuss the challenges and methods for dealing with unforced change in the system. I did find some documents (like this Shukla 2010 presentation), but this is a bit elliptical and complex for my level of understanding; I'd prefer a more introductory paper.
  7. I'm not sure how the forcings in this study were determined. The text says "Although the estimates for most forcing agents are similar, we infer a larger energy flux from variations in solar irradiance as a result of the particular forcing reconstruction used. If anything our estimate of the solar contribution is likely to be overestimated (see Methods)." In the Methods they point to [15] Joos 2001, which has nothing on solar forcings, and [16] Crowley 2000, which shows solar forcing as quite variable (e.g. Crowley fig 2B), nothing like the smooth rise in the current paper's fig 2c. Nor do other depictions of TSI (e.g. the "It's the sun" thread) match the smooth rise in fig 2c. It looks to me like fig 2c is a model output in the current study. If that is the case, what is the model input, specifically for solar forcing, or is it simply an output (essentially what the simulation came up with to match observed temperature rises and other constraints)?
  8. skept.fr @6, unfortunately I cannot find any succinct discussion of the issue, so I'll have to do the best I can myself. The CMIP3 is a collection of model runs from 25 different models under different configurations. Some of the models are different versions of the same underlying architecture. For example, there are three GISS models, differing in ocean configuration and resolution. Each model did multiple runs. When set up for 1956 conditions and run for 50 years with no forcings, they show the following distribution of 50-year temperature trends (fig 4a from Huber and Knutti, 2011). This is a histogram of the 50-year trends obtained by the CMIP3 constant forcing experiment. As you can see, the mean of the trends is zero, with the 1 and 2 standard deviations shown by blue bars below the graph.

A quick measurement shows that the surface temperature record (red bars on the right) with the lowest trend is (rounded down) 6.8 standard deviations above the mean, which means that there is less than a 1 in 100 billion chance that the temperature trend over that period arose by unforced variability, if the climate models fairly represent internal variability in the climate (a quick numerical check of this tail probability appears below). I think the assumption of fair representation is a good approximation (though unlikely to be exactly true). More importantly, we definitely know that there have been forcings over that period, so attributing the trend to unforced variability while ignoring the known forcings is foolish.

Turning directly to your question @1: assuming the models fairly represent internal variability, we know that there are no significant natural internal cycles of 30-100 year length or greater, because if there were, the distribution of the histogram would not be so tightly constrained. Of course, many of the models had very simple oceans, so a long-term internal cycle may exist but not be reflected in most of the models. However, as seen in the residual of the CMIP3 21-model mean from HadCRUT3, there is no apparent cycle in the residual. That means there is little statistical evidence to suspect a cycle. Indeed, to attribute the large-scale temperature variations over the century to internal variability, you would need to find a reason why the known forcings did not apply. (Source PDF)

Further, there are good physical reasons to doubt the existence of such long-term cycles of internal variability. Specifically, such a cycle would mean the Earth must maintain a net energy imbalance for sustained periods. That is highly unlikely.

Finally, the internal variability that exists in the climate can be analogized to a pendulum, and under forcing may well be analogous to a forced pendulum. That means the internal variability in an unforced state may well not match that under a forced condition, i.e., the conditions that actually exist. In that case, we would expect an increase in natural variability with time as the forcing becomes stronger. Following the pendulum analogy, that increase would not be consistent over time, and may well include periods of reduced variability. But statistically, over time there would be an increase. There is in fact some evidence of that, but the increase in variability is uncertain with regard to ENSO and precipitation, and relatively small with regard to temperature. Therefore this possibility is unlikely to significantly alter Huber and Knutti's result.
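(A quick check of the quoted tail probability, under the stated assumption that the unforced 50-year trends are approximately normally distributed:)

```python
# One-sided Gaussian tail probability at 6.8 standard deviations.
# Assumes the unforced 50-year trends are approximately normally distributed.
from scipy.stats import norm

p = norm.sf(6.8)                      # survival function = 1 - CDF
print(f"P(>= 6.8 sigma) = {p:.1e}")   # ~5e-12
print(f"About 1 in {1 / p:.1e}")      # i.e. less than 1 in 100 billion
```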
  9. Eric (skeptic) @7, if you look at figure 1a, the solar cycle shows up in the solar forcing until 2000. Thereafter it is smooth. The fluctuations due to the solar cycle appear small due to the scale, not because they are ignored. Presumably, in the chart of cumulative contribution, the small fluctuations due to the solar cycle make so little difference to the cumulative change as to be indiscernible. Alternatively, that chart has an 11-year-plus smooth that would effectively eliminate the solar cycle from the data. Please note that by flatlining the solar contribution at the average value from 2000 forward, Huber and Knutti overestimate the solar contribution on average over the last eleven years, in that the most recent solar cycle was much smaller than the one that preceded it.
  10. Thanks Tom. I zoomed fig. 1a and the solar forcing there looks fairly representative. The flatlining of the data from 2000 on makes sense, since that is the date of those papers. It looks like ref 17 will help explain this paper; it is Huber's thesis (too big to download at the moment, but the abstract looks promising).
  11. I downloaded the thesis and it has the proper caveat: "Therefore, we emphasize that the distributions of climate sensitivity estimates derived here cannot be regarded as proper probability distribution functions since the prerequisite of independence of both the climate models and the indices is not fulfilled in the framework of this study." (page 26). That applies to the picture Tom posted: not a probability distribution, but a model-run distribution assuming particular models and model parameters.
  12. Please excuse these basic questions, but: 1) many sources I come across list the rise in land-surface temps in the 'last century' as 0.8 or 0.9°C, but the BEST graph, for example, seems to show a rise of about 1.2°C when taking 1900-1910 as a start point. Is 1.2 more accurate? 2) What is the generally accepted figure for total warming from the start of the industrial age (i.e. the Fossil Fuel Era), and what approximate date is used as the 'start date' of the ramp-up due to GHGs? I have found these basic questions a bit challenging to get a solid fix on, and I am writing a book using my avoidance/denial-to-acceptance journey with major disease as a metaphor for how humans are facing (or not facing) climate change. Thanks!
  13. dagold @12, the BEST project currently only has a land temperature index. Land has warmed faster than the sea, so full global indices show a lower temperature rise over the century. The three databases you need to consult for an accurate temperature are: GISSTemp (the column headed J-D is the annual mean), HadCRUT3v (the last column is the annual mean), and NOAA. Of these, NOAA and GISS are the best, IMO, with HadCRU running cooler than the others because of flaws in their methodology. However, both NOAA and GISS only extend back to 1880, while HadCRU extends back to 1850 with dubious reliability due to limited land station data. The preindustrial era is generally taken as being prior to 1750. No global temperature record exists back to that period, and reconstructions differ significantly. Temperatures were probably lower than in 1900-1910 (which was exceptionally cool), but not by much.
    0 0
    Moderator Response: [John Hartz] Typo in the third paragraph re the start dates shown?
  14. The scary thing in these graphs is that aerosol and ozone are strongly negative. If we clean the air, as we would like to do, the temperature increase will be 1/3 larger. I think other authors got a similar result.
  15. #8 Tom: thank you for this nice introduction. I'm pretty sure that we cannot explain the observed temperature trends without forcings, and mainly anthropogenic forcings, because it would be physically impossible ('foolish', as you say) to do so while ignoring their radiative and convective properties. But as we come to more precise estimates (the point of the Huber and Knutti 2011 paper, but also of Santer et al 2011, previously discussed on SkS), the methodology underlying these estimates becomes of interest in its details.

The relaxation time of the atmosphere being very short, the question of unforced / intrinsic variability more probably concerns the oceanic circulation, particularly its long-term change known as the thermohaline circulation (THC), connected to (more or less) low-frequency oscillations in large basins (e.g. AMO, PDO, etc.). So, in order to calculate the temperature distribution histogram you reproduce from HK2011, I suppose the GCMs (or EBMs) are obliged to begin with a kind of long-term (centennial to millennial) simulation of oceanic heat distribution, so as to constrain the disequilibrium state at the beginning of the modern period of the simulation (that is, the year 1956 in your figure).

The AR3 (2001) mentions this kind of reflection among modellers, for example in 14.2.2.1 (sorry, I can't link to the precise page of the report because its URL structure doesn't allow it): "Another important (and related) challenge is the initialisation of the models so that the entire system is in balance, i.e., in statistical equilibrium with respect to the fluxes of heat, water, and momentum between the various components of the system. The problem of determining appropriate initial conditions in which fluxes are dynamically and thermodynamically balanced throughout a coupled stiff system, such as the ocean-atmosphere system, is particularly difficult because of the wide range of adjustment times ranging from days to thousands of years. This can lead to a "climate drift", making interpretation of transient climate calculations difficult." Or in the same report, 8.4.1: "This "climate drift" can make interpretation of transient climate change simulations difficult, so models are generally allowed to adjust to a state where such drifts have become acceptably slow, before starting climate change simulations. A number of techniques have been used to achieve this (see Stouffer and Dixon, 1998), but it is not possible, in general, to say which of these procedures gives "better" initial conditions for a climate change projection run."

In the IPCC AR4 (2007), we can see in 9.4.1.2 and fig 9.5 an exercise very comparable to HK2011, with forced and unforced simulations of the 20th century. The legend of the figure stipulates: "The simulated global mean temperature anomalies in (b) are from 19 simulations produced by five models with natural forcings only. The multi-model ensemble mean is shown as a thick blue curve and individual simulations are shown as thin blue curves. Simulations are selected that do not exhibit excessive drift in their control simulations (no more than 0.2°C per century)." So, we find again this concept of 'drift' in control (no forcing) simulations, with a selection of those simulations that do not exhibit more than 0.2 K of drift. But why this limit value of 0.2 K? Does it reflect a physical impossibility (beyond this value) or an empirical adjustment? And if the latter, an adjustment to which set of observations constraining the signature of unforced variability? If there is such a secular drift, why would temperature change from unforced variability over six decades be centered on zero (your figure) rather than on a positive or negative value?

The appendix of AR4 chapter 9 gives some information about methods (optimal fingerprinting and methods of inference), but beyond the technical complexity (or because of it and my consequently poor level of understanding), it seems circular to me: "Fitting the regression model requires an estimate of the covariance matrix C (i.e., the internal variability), which is usually obtained from unforced variation simulated by AOGCMs (e.g., from long control simulations) because the instrumental record is too short to provide a reliable estimate and may be affected by external forcing. Atmosphere-Ocean General Circulation Models may not simulate natural internal climate variability accurately, particularly at small spatial scales, and thus a residual consistency test (Allen and Tett, 1999) is typically used to assess the model-simulated variability at the scales that are retained in the analysis. To avoid bias (Hegerl et al., 1996, 1997), uncertainty in the estimate of the vector of scaling factors a is usually assessed with a second, statistically independent estimate of the covariance matrix C which is ordinarily obtained from an additional, independent sample of simulated unforced variation." I basically read this as: AOGCMs constrain the realism of unforced variability from… AOGCM simulations of unforced variability! That is: there is no reference to an empirical (observation-based or proxy-based) assessment of the long-term change in oceanic circulation, the best candidate for unforced variability.

Of course, when you deal with a huge temperature change (e.g. 2, 3, 4 K), these questions are probably of minor importance. And if unforced variability could help us constrain the climate sensitivity range, it is very unlikely to change this range (it could even drive it to higher values). But when you try to address precisely the different contributions to an observed trend of 0.79 K in one century or 0.55 K in six decades (HK2011), maybe these questions of 'drift' and 'control runs' need to be addressed more precisely in the explanations of the results.
  16. dagold - as Tom Curtis said, the global surface temperature record extends back to about 1880. HadCRUT goes back to 1850, but it's biased lower than the other two main data sets, and its measurements from 1850 to 1880 are based on fewer observations. These three groups include sea surface temperatures, while BEST only has land temperature measurements, which is why it's higher. The GHG emissions ramp-up also really began right around 1880 or so, conveniently. The global surface temperature increase since then is right around 0.8°C.
  17. So this study would imply that the IPCC assessment of "very likely" (90% or above) and "most of the observed warming" (> 50%) since mid-century is too conservative. They appear to be putting a 90% range around 74%-122% (lower bound not extending down to 50% as the IPCC implies). "Skeptics" really need to stop pretending that IPCC assessments are too confident.
  18. Yes, because the IPCC uses a consensus process, it tends to be quite conservative, as is the case in global warming attribution.
  19. As the basis where all their calculations start, Huber and Knutti (2011) take "the Bern2.5D climate model of intermediate complexity". I read this as: we take evidence from a model (and assume that the model matches observations from the past). In figure 1a, the line for solar radiation is taken from "Radiative forcings from historical reconstructions and the SRES A2 scenario for different forcing agents". It is almost near zero. My point is: if you start with such an assumption, it is not very surprising to come to the result that GHGs must be the dominant contributor. So, to me, it seems like this article reaches a kind of non-conclusion and just elaborates on said assumption. How come TSI is assumed to be so low? I just wanted to point this out here, because I am coming from the "solar cycles" article, and also from "It's the sun", and I was pointed towards this article. I frankly do not understand how and why TSI is estimated with such a low increase in radiative forcing.
