Direct Air Capture

I thought I’d written about this before, but can’t seem to find a post. Either my searching ability is poor, or my memory is poor. I mostly wanted to highlight an interesting YouTube video by David Kipping that illustrates why Direct Air Capture (DAC) is thermodynamically challenging. I encourage you to watch the video (which I’ve put at the end of this post) but his basic conclusion is that thermodynamic constraints mean that implementing DAC at the necessary scale would require a significant fraction of all global electricity consumption.

I wanted, however, to work through some of the numbers myself and to do the calculation of how much DAC we would need to use in a slightly different way.

A key point is that given an atmospheric concentration of 400 ppm and a temperature of 300 K, it takes a minimum of 19505 J to remove 1 mole of CO2. 1 mole of CO2 is 44 g, so 1 tonne of CO2 has 22727 moles. Therefore, removing 1 tonne of CO2 requires a minimum of 4.43 x 10⁸ J.
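
For anyone who wants to check this, here’s a minimal Python sketch of the calculation. It uses the ideal-mixing result that the reversible work to separate 1 mole of a gas at mole fraction x is RT ln(1/x); the 400 ppm and 300 K values are those quoted above:

```python
import math

# Minimum (reversible) work to extract 1 mole of CO2 from air at
# mole fraction x: W = R * T * ln(1/x).
R = 8.31         # gas constant, J/(mol K)
T = 300.0        # temperature, K
x = 400e-6       # CO2 mole fraction (400 ppm)

w_per_mol = R * T * math.log(1.0 / x)       # ~19505 J/mol
moles_per_tonne = 1e6 / 44.0                # grams per tonne / molar mass
w_per_tonne = w_per_mol * moles_per_tonne   # ~4.43e8 J/tonne

print(f"{w_per_mol:.0f} J/mol, {w_per_tonne:.3g} J/tonne")
```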

Typically, however, we emit so much that we tend to think in terms of gigatonnes of CO2 (GtCO2). Removing 1 GtCO2 would require a minimum of 4.43 x 10¹⁷ J.

Recent (optimistic?) projections now suggest that we’re currently heading for warming of about 2.5°C. Given a Transient Climate Response to Cumulative Emissions (TCRE) of 0.45°C per 1000 GtCO2, this suggests total emissions of about 5500 GtCO2 (we’re currently at around 2500 GtCO2). If we want to limit overall warming to 1.5°C, then we’d need net emissions to be about 3300 GtCO2. If we’re going to rely on DAC, this means removing 2200 GtCO2, which would require a minimum energy of 10²¹ J.

To put this into context, current (2024) global electricity consumption is about 31000 TWh, which is 1.1 x 10²⁰ J. If we assume 3% growth in electricity consumption per year, then by the end of this century, we will have consumed a total of 3.5 x 10²² J of electrical energy between now and 2100. The minimum energy requirements for DAC would be about 3% of this total.

However, this assumes 100% efficiency, which seems unrealistic. More realistic estimates suggest something like 10% efficiency. Hence, if we want to use DAC to limit overall warming to 1.5°C, while emitting enough to reach 2.5°C, DAC would need to use something like 30% of all electricity consumed over the next 80 years.
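
Pulling the whole estimate together, here’s a rough sketch of the arithmetic; the TCRE, growth-rate, horizon and efficiency values are the assumptions stated above:

```python
# Rough sketch of the scaling argument in the text.
W_PER_TONNE = 4.43e8                      # minimum J per tonne of CO2 (above)
TCRE = 0.45 / 1000                        # K of warming per GtCO2

removal = (2.5 - 1.5) / TCRE              # GtCO2 to remove: ~2200
e_dac_min = removal * 1e9 * W_PER_TONNE   # J at 100% efficiency: ~1e21

e_annual = 1.1e20                         # J/yr of electricity (~31,000 TWh)
growth, years = 0.03, 80
e_total = e_annual * sum((1 + growth) ** n for n in range(years))  # ~3.5e22 J

for eff in (1.0, 0.1):
    print(f"at {eff:.0%} efficiency, DAC needs {e_dac_min / eff / e_total:.0%} "
          f"of all electricity to 2100")
# -> roughly 3% at 100% efficiency and ~30% at a more realistic 10%
```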

I should stress that although I have checked some of these numbers, and I think my basic calculations are correct, I am happy to be corrected if anyone sees an error. Also, my estimates are approximate and I’m ignoring uncertainties, which could make things slightly better, or considerably worse.

I do think that the basic conclusion that DAC at scale would consume a significant fraction of future electricity is broadly correct. This probably suggests that it would be better to use this electrical energy to avoid these emissions, rather than using DAC to later remove what has already been emitted.

Posted in advocacy, Climate change, Global warming, Scientists | 12 Comments

A “controversial” methane metric?

There’s a recent Carbon Brief article about a supposedly controversial methane metric. The metric in question is GWP*, which I’ve actually written about before. Methane emissions are typically compared to CO2 using a metric known as Global Warming Potential (GWP). These are often measured over periods of 20 years (GWP20) or 100 years (GWP100). For methane GWP20 has a value of about 80, while GWP100 has a value of about 30.

As the Carbon Brief article says, these are often interpreted as suggesting that

one tonne of methane causes the same amount of warming as around 80 tonnes of CO2, when measured over a period of 20 years… When calculated over 100 years, methane’s shorter lifetime means it causes around 30 times more warming than CO2.

These metrics highlight that methane is a potent greenhouse gas that can contribute substantially to global warming. The problem is that the interpretation of these metrics is not actually correct. These metrics are computed by integrating the radiative forcing of a pulse of emissions over the relevant time period. However, this doesn’t necessarily correctly represent the warming due to this pulse of emissions.

[Figure: warming due to equivalent pulses of different greenhouse gases. Credit: Allen et al. (2016)]

Unlike CO2, methane has an atmospheric lifetime of about 10 years. This means that half of a pulse of methane will be gone after 10 years, three-quarters after 20 years, and only a few percent will be left after 50 years. Hence, it’s really not correct to suggest that over 100 years, methane will cause around 30 times more warming than CO2.
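
To put numbers on that, here’s a small sketch. I’m treating the roughly 10-year lifetime as a half-life, since that’s what the “half after 10 years, three-quarters after 20” framing implies:

```python
# Fraction of a methane pulse remaining, with the ~10-year lifetime
# read as a half-life (an assumption matching the numbers in the text).
HALF_LIFE = 10.0  # years

def fraction_remaining(t_years):
    """Fraction of a methane pulse still in the atmosphere after t_years."""
    return 0.5 ** (t_years / HALF_LIFE)

for t in (10, 20, 50, 100):
    print(f"after {t:3d} years: {fraction_remaining(t):6.1%} remains")
# -> 50.0%, 25.0%, 3.1%, 0.1%: essentially none of the pulse is left
#    long before the 100-year window used for GWP100 closes.
```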

The figure on the right (from Allen et al. 2016) illustrates it very nicely. It shows the warming due to equivalent pulses of different greenhouse gases, using GWP100 to determine the equivalence. It’s certainly the case that an equivalent pulse of methane will cause more warming than CO2 initially, but after about 40 years the warming is about the same. The warming due to a pulse of CO2 levels off after about 20 years and persists well beyond 100 years, at which point there is virtually no warming from the equivalent pulse of methane.

The GWP* metric was introduced to more accurately represent methane-driven warming, and it does a pretty good job. It accounts for key differences between warming due to long-lived greenhouse gases (CO2) and short-lived greenhouse gases, like methane. Stopping CO2-driven warming requires getting CO2 emissions to (net) zero. This isn’t the case for methane. As this Carbon Brief article highlights, methane-driven warming will stabilise if methane emissions stabilise and will actually fall if methane emissions are reduced.
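
For anyone curious what GWP* actually looks like, one published formulation (associated with the Michelle Cain work linked below) can be sketched in a few lines. The 4.0 and 3.75 coefficients are from that work; the key feature is that the conversion depends on the rate of change of methane emissions, not just their level:

```python
GWP100 = 30.0   # approximate 100-year GWP of methane, as quoted above

def gwp_star(e_now, e_20yr_ago):
    """CO2-warming-equivalent emissions under one GWP* formulation
    (Cain et al. 2019): GWP100 * (4.0*E(t) - 3.75*E(t - 20yr))."""
    return GWP100 * (4.0 * e_now - 3.75 * e_20yr_ago)

print(gwp_star(100.0, 100.0))  # constant emissions ->   750 (vs 3000 via GWP100)
print(gwp_star(80.0, 100.0))   # falling emissions  -> -1650: warming reverses
```

Constant methane emissions map to small CO2-warming-equivalent emissions (roughly stable warming), while falling emissions map to negative ones, which is exactly the behaviour described above.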

[Figure 7.22 from the IPCC AR6 WGI report.]

The Figure on the left also compares GWP100, GWP20 and GWP* in two different emission scenarios. GWP20 (light blue) almost always overestimates the warming. GWP100 (purple) does pretty well when emissions are increasing, but starts to diverge when emissions start to decrease. GWP* (dark green) follows the actual warming (black line) in both emission scenarios.

The reason GWP* is seen as controversial is that some may use it to argue that we don’t need to reduce methane emissions, or don’t need to reduce them as much as might be suggested if using GWP20 or GWP100. I think this is a valid concern, but I don’t think it’s a good reason to keep using metrics that don’t actually represent what many think they do, over one that does.

One key point to make is that GWP20, GWP100 and GWP* are just metrics that are used to link methane emissions with warming. None of them actually tells us what we should do. Those are policy decisions that policymakers make based on the information provided.

One advantage of reducing methane emissions is that it will actually reverse some past warming, which is not the case when it comes to reducing CO2 emissions. I think there are perfectly good reasons for doing this, but I do think we should be clear about this and use a metric that properly represents it, rather than ones that suggest methane-driven warming will continue unless we do so.

We also have to be careful of thinking that we can offset some CO2 emissions through reductions in methane emissions. This is simply not the case, as illustrated by this 2010 Realclimate post called Losing time, not buying time. I think it’s really important to not treat methane and CO2 as equivalent, so it seems odd to characterise a metric that avoids doing so as being controversial. I’ll stop there and will put some links below to other posts that might be worth reading.

Links:
What the ‘controversial’ GWP* methane metric means for farming emissions – Carbon Brief article that motivated this post.
Methane – one of my earlier posts about methane emissions and GWP*.
Methane, again – another of my posts about methane.
Losing time, not buying time – 2010 Realclimate post about why it’s important to not treat methane and CO2 as equivalent.
Understanding methane – another post by me about understanding methane emissions.
A new way to assess ‘global warming potential’ of short-lived pollutants – Carbon Brief article by Michelle Cain introducing GWP*.
Agricultural emissions – a post by me about emissions from agriculture.
The definitive CO2/CH4 comparison post – Realclimate post comparing methane and CO2.

Posted in Climate change, Global warming, Policy | 24 Comments

Emergence vs Detection & Attribution

Since effective communication often involves repeating things, I thought I would repeat what others have pointed out already. The underlying issue is that there is a narrative in the climate skeptosphere suggesting that extreme weather events are not becoming more common, or that we can’t yet attribute changes in most extreme weather types to human influences (as suggested in the recent DOE climate report).

[Table 12.12 from the IPCC AR6 WGI report.]

This is often based on the Table shown on the right, taken from Chapter 12 of the IPCC AR6 WGI report. As, I think, originally pointed out by Tim Osborn, this table is not suggesting that we haven’t yet detected changes in most extreme weather events, or haven’t managed to attribute a human influence. It’s essentially considering if, or when, a signal has/will have emerged from the noise. Formally, emergence is defined as the magnitude of a particular event increasing by more than 1 standard deviation of the normal variability.

In some sense it’s highlighting when the typical properties of these events will be outside the range of what was normal in the baseline climate. As Andrew Dessler points out in this post, this does not indicate that we have yet to detect a change or attribute a human influence. Detection and attribution does not require emergence. You can detect a trend and attribute a human influence before the signal has emerged. For many extreme events, changes have indeed been detected and in many cases have been attributed to human influences.
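
A toy illustration of the distinction, with made-up numbers: take a linear signal of 0.01 units per year in noise with a standard deviation of 1. The signal itself doesn’t reach 1 standard deviation of the baseline variability (emergence) for a century, but a least-squares trend is usually statistically detectable a couple of decades earlier:

```python
import numpy as np

rng = np.random.default_rng(0)
trend, sigma = 0.01, 1.0   # signal per year, baseline variability (made up)

# Emergence: signal exceeds 1 standard deviation of baseline variability
print(f"emergence after ~{sigma / trend:.0f} years")

# Detection: fitted trend more than 2 standard errors from zero
for n in (60, 80, 100):
    t = np.arange(n)
    hits = 0
    for _ in range(1000):
        y = trend * t + rng.normal(0, sigma, n)
        slope, intercept = np.polyfit(t, y, 1)
        resid = y - (slope * t + intercept)
        se = np.sqrt(resid.var(ddof=2) / ((t - t.mean()) ** 2).sum())
        hits += abs(slope) > 2 * se
    print(f"after {n} years the trend is detected in {hits / 10:.0f}% of runs")
```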

Of course, there is a prominent voice in the climate skeptosphere who thinks Andrew Dessler is wrong and that it is ironic that someone trying to police the scientific discourse doesn’t understand IPCC terminology (as an aside, complaining about others being science police may itself be rather ironic). Given that one of the co-ordinating lead authors has posted a comment on LinkedIn that seems entirely consistent with Andrew’s post, the irony might be that someone professing to be an honest broker doesn’t really understand the topic nearly as well as they seem to suggest that they do. Of course, this will be surprising to no one familiar with the public climate debate.

I do find this all rather fascinating, but also concerning. Confident voices can become prominent and be treated as domain experts when they present arguments that are convenient to some audiences, even if they really don’t understand the topic nearly as well as their confidence would seem to suggest. Anyone paying attention at the moment is certainly not going to be surprised by something like this, but it still seems worth considering how to improve the dialogue or how to counter those who rise to prominence mostly by promoting erroneous, but convenient, arguments. Pointing out when they’re wrong seems like a reasonable place to start, even if it clearly won’t be sufficient.

Posted in Climate change, ClimateBall, Roger Pielke Jr, Severe Events | 219 Comments

Koonin providing clarity on climate?

It seems that the US Department of Energy has now disbanded the Climate Working Group that drafted the report that I discussed in this post. However, about a week ago, Steven Koonin – one of the authors of the report – had an article in the Wall Street Journal titled At Long Last, Clarity on Climate. Clarity is a bit of a stretch. Personally, I think it muddied the waters more than it brought clarity.

A general point that I didn’t really make in my previous post (and that has just been highlighted in a comment) is that the report is explicitly focussed on the US. The richest country in the world probably is more resilient than most others and could well decide that it’s better to deal with the impacts of climate change than to commit too much now to avoiding them. I happen to disagree with this as I think it ignores how the US has benefitted from something that will negatively impact others, ignores that countries can’t really exist in isolation, and ignores that there are potentially outcomes that even a wealthy country will struggle to deal with. However, I can see how some might conclude this, although it might be good if they were much more explicit about it.

What I thought I would do is try to address some of the claims and conclusions made in Steven Koonin’s article. There’s an element of truthiness to the article; some claims may be true, but they don’t really support the argument being made.

For example, he says:

While global sea levels have risen about 8 inches since 1900, aggregate U.S. tide-gauge data don’t show the long-term acceleration expected from a warming globe.

U.S. tide-gauges may indeed not show the expected long-term acceleration, but the rate of global sea level rise is accelerating.
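
As a sketch of what “acceleration” means operationally: fit a quadratic to the sea level record, and the acceleration is twice the quadratic coefficient. Here with synthetic data, using a rate and acceleration loosely like the satellite-era values on the NOAA page linked below (~3 mm/yr and ~0.08 mm/yr²):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(30.0)                     # years since ~1993
rate, accel = 3.0, 0.08                 # mm/yr and mm/yr^2 (illustrative)
gmsl = rate * t + 0.5 * accel * t**2 + rng.normal(0, 2.0, t.size)  # mm

c2, c1, c0 = np.polyfit(t, gmsl, 2)     # gmsl ~ c2*t^2 + c1*t + c0
print(f"fitted rate ~{c1:.1f} mm/yr, acceleration ~{2 * c2:.2f} mm/yr^2")
```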

Similarly, he says that:

Data aggregated over the continental U.S. show no significant long-term trends in most extreme weather events. Claims of more frequent or intense hurricanes, tornadoes, floods and dryness in America aren’t supported by historical records.

Some of the statements (no long-term trends, historical records) may indeed be technically true. However, there are numerous studies that have shown that climate change has affected extreme events in North America. You can find many examples in this Carbon Brief article that has mapped how climate change affects extreme weather around the world.

He also claims that:

Natural climate variability, data limitations and model deficiencies complicate efforts to attribute specific climate changes or extreme events to human CO2 emissions.

I suspect these factors do indeed complicate efforts, but so what? It is complicated, but that doesn’t mean that studies haven’t been done that do indeed demonstrate that human CO2 emissions are driving climate change and influencing extreme events.

I’ll end this bit with a comment about something he says about climate models:

Complex climate models provide limited guidance on the climate’s response to rising carbon-dioxide levels. Overly sensitive models, often using extreme scenarios, have exaggerated future warming projections and consequences.

There is a hot model problem, but there are ways to correct for this, and climate models have generally been skillful. Also, climate models are typically making projections – or conditional predictions – because the emission pathways are inputs to the models. Hence the result is telling us something about what might happen if we follow that emission pathway. The emission pathways that are considered range from ones where we soon start reducing emissions to ones where emissions continue increasing. To suggest that climate models have exaggerated future warming projections when the emission pathways are inputs seems a little confused.

I’m not writing this to try and change the minds of those who think the DOE climate report was excellent and who think that the authors are some of the best scientists in the field. That would be silly and naive. I’m partly writing this because it’s a rainy Saturday afternoon and it’s a topic I find interesting.

However, another reason is that I think it’s important to think about why people with relevant expertise can write something that seems intellectually weak and sloppy, but present it as if it’s a careful piece of work that’s provided clarity. It would be easy to conclude that it’s simply them being dishonest, but I’m not convinced it’s quite that simple or convenient.

I wouldn’t be surprised if the authors believe that they have written a good report and that what they’ve presented has provided some clarity. So, how do you have serious discussions about complex topics when people who are regarded as experts in the field can’t even agree on some of the scientific fundamentals, or the significance of what the scientific evidence suggests? I certainly don’t know the answer, but I do think it is something worth thinking about.

Links:
At Long Last, Clarity on Climate – Steven Koonin’s WSJ article.
The New DOE Climate Report – my earlier post on the DOE Climate Report, with a link to the report.
Trump’s Energy Department disbands group that sowed doubt about climate change – NPR article about the DOE CWG being disbanded.
Climate Change: Global Sea Level – NOAA webpage highlighting that the rate of global sea level rise is accelerating.
Mapped: How climate change affects extreme weather around the world – Carbon Brief article mapping attribution studies for extreme events.
The ‘hot model’ problem – my post about the hot model problem.
Evaluating the Performance of Past Climate Model Projections – paper by Hausfather et al.
Past warming trend constrains future warming in CMIP6 models – Tokarska et al. with a method for downweighting models based on how well they agree with past warming trends.

Posted in Climate change, Global warming, Research, Uncategorized | 31 Comments

Some thoughts about doing science

I’ve often thought that science communicators should spend a bit more time discussing what we might call the scientific method and a bit less time highlighting what are seen as exciting scientific results. With that in mind, I thought I would promote a couple of very good posts by John Kennedy.

The first is a post called how to science a science with science. I don’t want to say too much, because you should really read John’s post, but the basic suggestion is that even though people might describe some kind of scientific method, the details are complicated. Scientists are generally trying to answer questions about the world and are doing their best to find ways of doing so.

Sometimes they use existing methods, sometimes they develop new ones. Sometimes a method might not be the right one to have used, but you learn something anyway and do better next time. Sometimes there are disagreements that aren’t easy to resolve. Just because someone vocal claims that researchers are doing something wrong doesn’t mean they’re right – think RCP8.5, for example.

The other post is one about metathugs, people who investigate papers, check things, and make sure that researchers are doing things properly. There is certainly a tendency for some to become what I would call academic critics who seem to think that their role is simply to critique what others do. There are others who see themselves as auditors. Although you might imagine that this is generally a good thing, it can create problems.

One is that there isn’t really a hierarchy. If you’re interested in a scientific problem, you’re free to try and tackle it. If you think a research study has flaws, go ahead and do it the way you think it should be done. In a sense the process is meant to be iterative with our understanding emerging/improving as more and more people/groups tackle interesting problems.

There isn’t some special group of people whose role it is to decide when studies are done properly or when they have major flaws that somehow invalidate them. This is a reason why I’ve had issues with some of what I’ve seen in Science and Technology Studies (STS). Some STS work seems to be presented as policing, or overseeing, the scientific community and any push-back from scientists can be rather poorly received, rather than being engaged with in a constructive way.

This is a complex issue and I’ve summarised my thoughts rather briefly and missed many nuances. I encourage you to read John’s posts as he goes into much more detail and articulates his thoughts very nicely and more thoroughly than I have, or maybe even than I can.

Links:
How to science a science with science – John’s first post about the scientific method.
Metathugs – John’s second, bleaker, post about those who check scientific work.

Posted in Climate change, Philosophy for Bloggers, Research, The philosophy of science, The scientific method, We Are Science | 35 Comments

Climate sensitivity

In 2020, a large group of scientists published a paper in which they used multiple lines of evidence to assess Earth’s climate sensitivity. The lines of evidence they used were the physical processes that determine climate sensitivity, the historical climate record, and the paleoclimate record. The key results were

  • The 66% range is 2.6–3.9 K for the Baseline calculation and remains within 2.3–4.5 K under robustness tests.
  • The corresponding 5–95% ranges are 2.3–4.7 K, bounded by 2.0–5.7 K.
  • All three lines of evidence are difficult to reconcile with an equilibrium climate sensitivity, characterised by an effective sensitivity S, below 2 K.
  • The paleoclimate evidence provides the strongest evidence against S > 4.5 K.

All of this seems quite reasonable. A likely range from just above 2 K to about 4.5 K, little evidence to support an equilibrium climate sensitivity below 2 K, and evidence against it being above 4.5 K.

Unsurprisingly, however, Nic Lewis has views. He has published a response in which he objectively combines climate sensitivity evidence and finds that

[t]he estimates of long-term climate sensitivity are much lower and better constrained (median 2.16 °C, 17–83% range 1.75–2.7 °C, 5–95% range 1.55–3.2 °C)

and that

[t]his sensitivity to the assumptions employed implies that climate sensitivity remains difficult to ascertain, and that values between 1.5 °C and 2 °C are quite plausible.

As far as I can tell, the differences are mostly due to different choices about the various parameters. That different choices of values can give such large variations in the results does seem to suggest that climate sensitivity remains difficult to ascertain. However, it’s less clear that values between 1.5 °C and 2 °C are quite plausible, although it does depend on what one means by plausible.

I realise that one can select a set of potentially plausible parameters that will give values between 1.5°C and 2°C, but given that we’ve already warmed by ~1.5°C, that the planetary energy imbalance has recently been above 1 W m⁻², and that we haven’t yet reached a change in anthropogenic forcing equivalent to a doubling of atmospheric CO2, values in this range don’t seem particularly plausible.
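
One way to see this is the standard energy-budget relation S ≈ F₂ₓ ΔT / (ΔF − ΔN). A quick sanity check with illustrative numbers — the warming and imbalance are roughly the values quoted above, while the forcing values are assumptions spanning a plausible range:

```python
F_2X = 3.9   # W/m^2, forcing from a doubling of CO2 (approximate)
dT = 1.4     # K, warming to date (roughly the ~1.5 quoted above)
dN = 1.0     # W/m^2, recent planetary energy imbalance (as above)

for dF in (2.5, 2.8, 3.1):   # assumed anthropogenic forcing change, W/m^2
    print(f"dF = {dF} W/m^2 -> S ~ {F_2X * dT / (dF - dN):.1f} K")
# -> roughly 2.6-3.6 K; getting S below 2 K requires dF - dN > ~2.7 W/m^2,
#    i.e. a large forcing change and/or a small energy imbalance.
```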

I do like Nic Lewis’s work and I have learned quite a lot by working through some of it. However, I do think a weakness is a reluctance to properly interrogate why his work seems to suggest values for climate sensitivity that are lower than many other experts would regard as plausible.

I think there’s a tendency to think that if you’ve justified all your assumptions, carefully chosen your parameters, and ensured that the methodology is robust, that the results should then stand. In my view, it’s always worth sanity checking the results. I realise that you have to be careful of not introducing additional biases, but you also have to be careful of trusting a result simply because the analysis is supposedly objective.

Posted in Climate sensitivity, ClimateBall, The philosophy of science, The scientific method, Uncategorized | 57 Comments

Another pause?

It seems that a slowdown in the melting of Arctic sea ice is now being used to suggest that climate science is melting. This is very silly and is reminiscent of the claims of a pause in global warming that dominated much of the discourse in the 2010s.

Arctic sea ice is a small part of the climate system and it’s well known that variability can easily mask long-term trends on decadal timescales. Arctic sea ice extent was particularly low in 2012, so maybe it’s not that surprising that there’s been an apparent pause since then. A strong El Niño in 1998 that led to a record warm year was one of the main reasons for the subsequent supposed pause in global warming.
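
A toy simulation of that point, with made-up but loosely sea-ice-like numbers (a long-term decline of ~0.08 million km² per year against interannual noise of ~0.5 million km²): a fair fraction of 10-year windows shows no decline at all, and the fraction grows substantially if the window deliberately starts on an unusually low year, as 2012 was:

```python
import numpy as np

rng = np.random.default_rng(2)
trend, sigma, window = -0.08, 0.5, 10    # made-up but sea-ice-like magnitudes
t = np.arange(window)

slopes, starts_low = [], []
for _ in range(20000):
    noise = rng.normal(0, sigma, window)
    slopes.append(np.polyfit(t, trend * t + noise, 1)[0])
    starts_low.append(noise[0] < -1.5 * sigma)   # record-low-ish first year

slopes, starts_low = np.array(slopes), np.array(starts_low)
print(f"all windows: {(slopes >= 0).mean():.0%} show no decline")
print(f"windows starting on a very low year: "
      f"{(slopes[starts_low] >= 0).mean():.0%} show no decline")
```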

You’d hope that skeptics would have learned by now to not use short-term variability to claim that climate science is somehow broken, but that would be naive. This isn’t about genuinely trying to understand the climate system, it’s just about constructing a narrative that suits their ideology.

I do have a small vested interest in this. The only climate bet I’ve taken is that the average of the 2026/27/28 Arctic sea ice minimum would be smaller than the average of 2011/12/13. It’s not looking all that good for me at the moment, but there’s still a chance. In any case, it’s only for a pint, and me losing a climate bet doesn’t somehow undermine our basic understanding of climate science.

Posted in Climate change, ClimateBall, Global warming, Science | 93 Comments

Getting climate risk wrong

Ted Nordhaus has a recent article in The EcoModernist about why he stopped being a climate catastrophist. His basic argument is that we used to think that we were heading for 5°C of warming, which would have been catastrophic, but are now heading for more like 3°C of warming. Despite this good news, many in the climate science and advocacy community have refused to become less catastrophist. Ted, on the other hand, has changed his mind and is no longer a climate catastrophist.

I’ve been involved in discussions about this topic for more than a decade, and I don’t think I’d ever have described Ted as a catastrophist, at least not as I would expect it to be defined. This reminds me of when one of Ted’s early colleagues – Michael Shellenberger – also wrote an article in which he suggested that he was a reformed climate activist who was now condemning alarmism. It can be a convenient narrative; you get praised for changing your mind and others might think that if you can do it, maybe they can too.

What Ted seems to be suggesting is that those who continue to cling to climate catastrophe are getting climate risk wrong. There may well be some truth to this, but Ted’s article seems to largely dismiss any climate risk. Apparently, at local and regional scales, the impact of climate change is very small when compared to climate variability. Things like sea level rise and thawing of the permafrost will occur on very long timescales. Even though warming has clearly been measured, the normalised economic cost of climate-related disasters isn’t increasing. There is also apparently an absence of an anthropogenic signal in most climate and weather phenomena.

Also, technological innovation and the development of clean energy is happening anyway, and we decarbonised faster prior to climate change becoming a global concern than we have since. Although the article doesn’t argue against cleaner energy, it does suggest that if catastrophic climate change is not looming then there’s no reason for a rapid transformation of the global energy economy at the speed and scale necessary to avoid catastrophic climate change.

My problem with these kinds of arguments is that they’re not completely wrong, but they’re also not quite right. I don’t think it’s true that the impact of climate change is always small when compared to climate variability (think heatwaves and extreme precipitation). Just because the normalised cost of climate disasters isn’t increasing doesn’t mean climate change isn’t having an impact (how are you defining the null?). I also don’t think it’s true that there is an absence of an anthropogenic signal in most climate and weather phenomena (e.g., detection and attribution versus storyline).

Also, even if the trajectory we appear to now be on is heading in a less catastrophic direction than was thought to be the case in the past, we’re still increasing emissions and the climate will continue to change until anthropogenic emissions get to (net) zero. There are also various uncertainties that mean that even if we do continue on the currently expected emission trajectory, we still can’t rule out warming by more than 4°C.

I don’t think this means that we should catastrophise, but I don’t think we should be complacent either. It should be possible to recognise that climate change does present some risks, even if there are going to be other factors that need to be taken into account when considering how best to motivate decarbonising global energy. As Stoat once said, “if you can’t imagine anything between ‘catastrophic’ and ‘nothing to worry about’ then you’re not thinking”.

Posted in Climate change, Global warming | 42 Comments

Is there a conspiracy to ignore challenging new ideas?

A narrative associated with the new DOE report is that there are ideas that are being ignored by the research community and scientists who are being shunned for promoting these ideas. The suggestion is that this indicates some kind of major problem with the mainstream scientific community. There are similar narratives around Avi Loeb and his claims that aliens have visited, and Eric Weinstein’s Geometric Unity, which, according to Sabine Hossenfelder, has physicists afraid.

It’s true that there can be ideas that are very entrenched within a research community. However, there is generally a very good reason for this; these are the ideas for which there is the strongest supporting evidence. If someone wants to overthrow a paradigm, or suggest some kind of alternative explanation (an alien spacecraft instead of a comet, for example), then the evidence needs to be pretty convincing.

The research community doesn’t have the resources to seriously explore every idea, especially if the evidence is particularly weak, or if they’ve already done so many times before. If someone wants the research community to take their ideas seriously, then they need to put the effort into convincing them that it’s worth doing so.

In fact, researchers have an obligation to present their work so as to convince their peers that it’s worth taking seriously. People who have new ideas can’t simply expect them to be taken seriously without putting some effort in themselves. If people with such ideas start suggesting that they’re a modern-day Galileo and that the mainstream research community is afraid to consider their ideas because it challenges the orthodoxy, then it’s much more likely that their ideas are simply not worth considering than that they’re some kind of maverick who is being unfairly shunned by other researchers.

Of course, you can find examples of ideas that were ignored by the mainstream research community that later turned out to be right. However, I would argue that this is the nature of scientific progress. It takes time and effort to change a paradigm because the evidence needs to become strong enough before it becomes clear that the alternative indeed has merit. Also, just because someone’s ideas are being ignored now isn’t some indication that these ideas are one day going to be shown to be right. It’s much more likely that they’re simply wrong than promoting a groundbreaking idea that a cabal of mainstream researchers are actively suppressing.

I thought I would finish this by providing links to some blog posts and YouTube videos/channels that I’ve found useful when thinking about this issue. If anyone has other examples, feel free to provide them in the comments.

Jason Wright has some good blog posts on Avi Loeb and comet 3I/ATLAS. You could start here, but there are also some newer ones with updates.
Bad Boy of Science’s YouTube video on Avi Loeb.
Bad Boy of Science’s YouTube video on Eric Weinstein.
Bad Boy of Science’s YouTube video on Sabine Hossenfelder’s defence of Eric Weinstein.
Professor Dave Explains is a YouTube channel that has videos addressing Avi Loeb and Eric Weinstein. A bit blunter than I typically like, but may well be to some people’s taste.

Posted in Philosophy for Bloggers, Sound Science (tm), The philosophy of science, The scientific method | 128 Comments

The new DOE Climate report

Since ClimateBall™ (H/T Willard) is back with the publication of the first report from the US Department of Energy’s (DOE) Climate Working Group, I thought it was time to start writing posts again. The report is called A Critical Review of Impacts of Greenhouse Gas Emissions on the U.S. Climate, and the authors are John Christy, Judith Curry, Steven Koonin, Ross McKitrick, and Roy Spencer.

The first point to make is that if someone had asked me 10 years ago to guess the names of a group of scientists who might be recruited to write a contrarian report on climate science, I would have guessed a significant number of those listed above. Why are there so few contrarians that it’s pretty easy to guess most who might be invited to write such a report? Why are there no new names? Why were no early-career researchers invited? The answer is pretty obvious: the bench is pretty thin.

The report is rather long, so I’ve only read the Executive Summary and the chapter on Climate Sensitivity, since I’ve written quite a lot about that. The Executive Summary would do reasonably well in ClimateBall Bingo and the climate sensitivity chapter mostly complained about climate models and promoted what they refer to as data-driven estimates of climate sensitivity, such as those presented by Nic Lewis. These are perfectly valid ways of estimating climate sensitivity, but they tend to return lower values than some other methods and there are reasons to be cautious of accepting these as being somehow more reliable than other estimates.

So, there’s nothing particularly original about the chapter on climate sensitivity. Pretty much what you’d expect from this group of authors. It does, however, give me an opportunity to highlight a few things that I think are pretty important and which seem to be ignored by this report.

One key aspect of this issue is that carbon dioxide (CO2) accumulates in the atmosphere. In other words, while humans continue to emit CO2 into the atmosphere, atmospheric CO2 concentrations will continue to increase. There is also now strong evidence to indicate that warming will also continue until CO2 emissions get to (net) zero and that the overall warming depends linearly on cumulative CO2 emissions. In other words, if we emit twice as much, we warm twice as much.

So, whatever range of climate sensitivity you think is most likely, if you accept that there is a level of warming we might want to try to avoid, then anthropogenic/human emissions of CO2 need to get to (net) zero. How rapidly we’d need to do so does depend on climate sensitivity, but that we’d need to do so does not.
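
A back-of-envelope sketch of that logic, reusing the TCRE value from the Direct Air Capture post above (the ~2500 GtCO2 emitted so far and ~40 GtCO2/yr current emissions are rough figures):

```python
TCRE = 0.45 / 1000            # K per GtCO2, as in the DAC post above
warming_so_far = 2500 * TCRE  # ~1.1 K from ~2500 GtCO2 cumulative emissions

for target in (1.5, 2.0):
    budget = (target - warming_so_far) / TCRE       # remaining GtCO2
    print(f"{target} K target: ~{budget:.0f} GtCO2 left, "
          f"~{budget / 40:.0f} years at ~40 GtCO2/yr")
# Whatever the exact TCRE, the budget is finite, so emissions must reach
# (net) zero; the TCRE/sensitivity value only sets how quickly.
```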

Interestingly, the report does suggest that estimates for the transient climate response (TCR) tend to show more agreement than those for the equilibrium climate sensitivity (ECS). This might suggest that there should be more agreement about how rapidly we should be aiming to reduce emissions, even if there isn’t agreement about how we might achieve this. Of course, that there isn’t is not a great surprise.

I suspect there will be many other more detailed responses to this DOE report. I’ll try to highlight any I find in the comments.

Update: 14 August 2025
Carbon Brief has put together a detailed response with input from many of the authors of papers cited in the DOE report.

Posted in Climate change, Climate sensitivity, ClimateBall, Judith Curry, Roy Spencer, Scientists | 128 Comments