From the Concorde to Sci-Fi Climate Solutions

by Almuth Ernsting (Truthout)

The interior of the Concorde aircraft at Scotland’s Museum of Flight. (Photo: Magnus Hagdorn)

Touting “sci-fi climate solutions” – untested technologies not really scalable to the dimensions of our climate change crisis – dangerously delays the day when we actually reduce greenhouse gas emissions.

Last week, I took my son to Scotland’s Museum of Flight. Its proudest exhibit: a Concorde. To me, it looked stunningly futuristic. “How old,” remarked my son, looking at the confusing array of pre-digital controls in the cockpit. Watching the accompanying video – “Past Dreams of the Future” – it occurred to me that the story of the Concorde stands as a symbol for two of the biggest obstacles to addressing climate change.

The Concorde must rank among the most wasteful ways of guzzling fossil fuels ever invented. No other form of transport is as destructive to the climate as aviation – yet the Concorde burned almost five times as much fuel per person per mile as a standard aircraft. Moreover, by emitting pollutants straight into the lower stratosphere, the Concorde contributed to ozone depletion. At the time of the Concorde’s first test flight in 1969, little was known about climate change and the ozone hole had not yet been discovered. Yet by the time the Concorde was grounded – for purely economic reasons – in 2003, concerns about its impact on the ozone layer had been voiced for 32 years and the Intergovernmental Panel on Climate Change’s (IPCC) first report had been published 13 years earlier.

The Concorde’s history illustrates how the elites will stop at nothing when pursuing their interests or desires. No damage to the atmosphere and no level of noise-induced misery to those living under Concorde flight paths were treated as bad enough to warrant depriving the richest of a glamorous toy.

If this first “climate change lesson” from the Concorde seems depressing, the second will be even less comfortable for many.

Back in 1969, the UK’s technology minister marveled at Concorde’s promises: “It’ll change the shape of the world; it’ll shrink the globe by half . . . It replaces in one step the entire progress made in aviation since the Wright Brothers in 1903.”

Few would have believed at that time that, from 2003, no commercial flight would reach even half the speed that had been achieved back in the 1970s.

Despite vast amounts of public and industry investment, the Concorde remained as fast – yet as inefficient and uneconomical – as it had been at its commercial inauguration in 1976. The term “Concorde fallacy” entered British dictionaries: “The idea that you should continue to spend money on a project, product, etc. in order not to waste the money or effort you have already put into it, which may lead to bad decisions.”

The lessons for those who believe in overcoming climate change through technological progress are sobering: It’s not written in the stars that every technology dreamed up can be realized; that, with enough time and money, every technical problem will be overcome; or that, over time, every new technology will become better, more efficient and more affordable.

Yet precisely such faith in technological progress informs mainstream responses to climate change, including the response by the IPCC. At a conference last autumn, I listened to a lead author of the IPCC’s latest assessment report. His presentation began with a depressing summary of the escalating climate crisis and the massive rise in energy use and carbon emissions, clearly correlated with economic growth. His conclusion was highly optimistic: Provided we make the right choices, technological progress offers a future with zero-carbon energy for all, with ever greater prosperity and no need for economic growth to end. This, he illustrated with some drawings of what we might expect by 2050: super-grids connecting abundant nuclear and renewable energy sources across continents, new forms of mass transport (perhaps modeled on Japan’s magnetic levitation trains), new forms of aircraft (curiously reminiscent of the Concorde) and completely sustainable cars (which looked like robots on wheels). The last and most obscure drawing in his presentation was unfinished, to remind us that future technological progress is beyond our capacity to imagine; the speaker suggested it might be a printer printing itself in a new era of self-replicating machines.

These may represent the fantasies of just one of many lead authors of the IPCC’s recent report. But the IPCC’s 2014 mitigation report itself relies on a large range of techno-fixes, many of which are a long way from being technically, let alone commercially, viable. Climate justice campaigners have condemned the IPCC’s support for “false solutions” to climate change. But the term “false solutions” does not distinguish between techno-fixes that are real and scalable, albeit harmful and counterproductive, on the one hand, and, on the other, those that remain in the realm of science fiction or threaten to turn into another “Concorde fallacy” – that is, to keep guzzling public funds with no credible prospect of ever becoming truly viable. Let’s call the latter “sci-fi solutions.”

The most prominent, though by no means only, sci-fi solution espoused by the IPCC is BECCS – bioenergy with carbon capture and storage. According to their recent report, the vast majority of “pathways” or models for keeping temperature rise below 2 degrees Celsius rely on “negative emissions.” Although the report included words of caution, pointing out that such technologies are “uncertain” and “associated with challenges and risks,” the conclusion is quite clear: Either carbon capture and storage, including BECCS, is introduced on a very large scale, or the chances of keeping global warming within 2 degrees Celsius are minimal. In the meantime, the IPCC’s chair, Rajendra Pachauri, and the co-chair of the panel’s Working Group on Climate Change Mitigation, Ottmar Edenhofer, publicly advocate BECCS without any notes of caution about uncertainties – referring to it as a proven way of reducing carbon dioxide levels and thus global warming. Not surprisingly therefore, BECCS has even entered the UN climate change negotiations. The recent text, agreed at the Lima climate conference in December 2014 (“Lima Call for Action”), introduces the terms “net zero emissions” and “negative emissions,” i.e. the idea that we can reliably suck large amounts of carbon (those already emitted from burning fossil fuels) out of the atmosphere. Although BECCS is not explicitly mentioned in the Lima Call for Action, the wording implies support for it because it is treated as the key “negative emissions” technology by the IPCC.

If BECCS were to be applied at a large scale in the future, then we would have every reason to be alarmed. According to a scientific review, attempting to capture 1 billion tons of carbon through BECCS (far less than many of the “pathways” considered by the IPCC presume) would require 218 to 990 million hectares of switchgrass plantations (or similar-scale plantations of other feedstocks, including trees), 1.6 to 7.4 trillion cubic meters of water a year, and 75 percent more than all the nitrogen fertilizers used worldwide (global use currently stands at around 100 million tons a year). By comparison, just 30 million hectares of land worldwide have been converted to grow feedstock for liquid biofuels so far. Yet biofuels have already become the main cause of accelerated growth in demand for vegetable oils and cereals, triggering huge volatility and rises in the price of food worldwide. And by pushing up palm oil prices, biofuels have driven faster deforestation across Southeast Asia and increasingly in Africa. As a result of the ethanol boom, more than 6 million hectares of US land have been planted with corn, causing prairies and wetlands to be plowed up. This destruction of ecosystems, coupled with the greenhouse gas-intensive use of fertilizers, means that biofuels overall are almost certainly worse for the climate than the fossil fuels they are meant to replace. There is no reason to believe that the impacts of BECCS would be any more benign. And they would be on a much larger scale.
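
The scale gap in these figures can be made concrete with a quick back-of-envelope calculation (a sketch using only the hectare figures quoted above; the ratio arithmetic is ours):

```python
# Land-demand comparison from the figures above: hectares of switchgrass
# plantations needed to capture 1 billion tons of carbon via BECCS,
# versus hectares already converted for liquid-biofuel feedstock.
beccs_low, beccs_high = 218e6, 990e6  # hectares (range from the cited review)
biofuels_now = 30e6                   # hectares used for biofuels so far

print(f"BECCS at 1 GtC/yr would need roughly "
      f"{beccs_low / biofuels_now:.0f}x to {beccs_high / biofuels_now:.0f}x "
      f"the land area behind today's biofuel impacts")
```

Even at the low end, that is seven times the land base that has already driven the food-price and deforestation problems described above.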

Capturing carbon takes a lot of energy, hence CCS requires around one-third more fuel to be burned to generate the same amount of energy. And sequestering captured carbon is a highly uncertain business. So far, there have been three large-scale carbon sequestration experiments. The longest-standing of these, the Sleipner field carbon sequestration trial in the North Sea, has been cited as proof that carbon dioxide can be sequestered reliably under the seabed. Yet in 2013, unexpected scars and fractures were found in the reservoir and a lead researcher concluded: “We are saying it is very likely something will come out in the end.” Another one of the supposedly “successful,” if much shorter, trials also raised “interesting questions,” according to the researchers: Carbon dioxide migrated further upward in the reservoir than predicted, most likely because injecting the carbon dioxide caused fractures in the cap rock.
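
The energy penalty mentioned above is easy to quantify (a minimal sketch; the one-third figure is the article’s, the plant size is hypothetical):

```python
# CCS energy penalty: roughly one-third more fuel must be burned to
# deliver the same electrical output once capture equipment is added.

def fuel_with_ccs(base_fuel: float, penalty: float = 1 / 3) -> float:
    """Fuel needed with capture, for the same output as base_fuel without."""
    return base_fuel * (1 + penalty)

base = 900.0  # hypothetical tonnes of coal per day without CCS
print(f"Without CCS: {base:.0f} t/day; with CCS: {fuel_with_ccs(base):.0f} t/day "
      f"({fuel_with_ccs(base) / base - 1:.0%} more fuel for the same power)")
```

Every tonne of that extra fuel carries its own mining, transport and emissions footprint before a single tonne of carbon dioxide is stored.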

There are thus good reasons to be alarmed about the prospect of large-scale bioenergy with CCS. Yet BECCS isn’t for real.

While the IPCC and world leaders conclude that we really need to use carbon capture and storage, including biomass, here’s what is actually happening: The Norwegian government, once proud of being a global pioneer of CCS, has pulled the plug on the country’s first full-scale CCS project after a scathing report from a public auditor. The Swedish state-owned energy company Vattenfall has shut down its CCS demonstration plant in Germany, the only plant worldwide testing a particular and supposedly promising carbon capture technology. The government of Alberta has dropped its previously enthusiastic support for CCS because it no longer sees it as economically viable.

True, 2014 saw the opening of the world’s largest CCS power station, after SaskPower retrofitted one unit of its Boundary Dam coal power station in Saskatchewan to capture carbon dioxide. But Boundary Dam hardly confirms the techno-optimist’s hopes. The 100-megawatt unit cost approximately $1.4 billion to build – more than twice the cost of a much larger (non-CCS) 400-megawatt gas power station built by SaskPower in 2009. It became viable only thanks to public subsidies and to a contract with the oil company Cenovus, which agreed to buy the carbon dioxide for the next decade in order to inject it into an oil well to facilitate extraction of more hard-to-reach oil – a process called enhanced oil recovery (EOR). The supposed “carbon dioxide savings” predictably ignore all of the carbon dioxide emissions from burning that oil. But even with such a nearby oil field suitable for EOR, SaskPower had to make the plant far smaller than originally planned so as to avoid capturing more carbon dioxide than it could sell.
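
The cost figures above imply a striking capital-cost gap per kilowatt (a rough sketch; the $1.4 billion and the plant sizes are from the article, while the gas plant’s cost is taken at $0.7 billion, the upper bound implied by “more than twice”):

```python
# Capital cost per kilowatt implied by the Boundary Dam figures above.
ccs_cost, ccs_mw = 1.4e9, 100   # retrofitted CCS unit (article's figures)
gas_cost, gas_mw = 0.7e9, 400   # assumed: at most half the CCS unit's cost

ccs_per_kw = ccs_cost / (ccs_mw * 1000)
gas_per_kw = gas_cost / (gas_mw * 1000)
print(f"CCS unit: ${ccs_per_kw:,.0f}/kW vs gas: at most ${gas_per_kw:,.0f}/kW "
      f"-> at least {ccs_per_kw / gas_per_kw:.0f}x the capital cost per kW")
```

On these assumptions the CCS unit cost at least eight times as much per kilowatt of capacity as conventional gas generation.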

If CCS with fossil fuels is reminiscent of the Concorde fallacy, large-scale BECCS is entirely in the realm of science fiction. The supposedly most “promising” capture technology has never been tested in a biomass power plant and has so far proven uneconomical with coal. Add to that the fact that biomass power plants need more feedstock and are less efficient and more expensive to run than coal power plants, and a massive-scale BECCS program becomes even more implausible. And then there is the question of scale: Sequestering 1 billion tons of carbon a year would produce a volume of highly pressurized liquid carbon dioxide larger than the global volume of oil extracted annually. It would require governments and/or companies to stump up the money to build an infrastructure larger than that of the entire global oil industry – without any proven benefit.
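
The volume claim can be sanity-checked with rough numbers (a sketch; the 1 billion tons of carbon is the article’s figure, while the CO2 density and annual oil output are our approximate assumptions):

```python
# Order-of-magnitude check: volume of liquefied CO2 from sequestering
# 1 billion tons of carbon, versus annual global crude oil output.
carbon_kg = 1e9 * 1000                  # 1 billion tonnes of carbon, in kg
co2_kg = carbon_kg * 44 / 12            # each C atom becomes a CO2 molecule

co2_density = 700.0                     # kg/m3, dense-phase CO2 (assumption)
co2_volume = co2_kg / co2_density       # ~5.2e9 m3

oil_tonnes = 4.2e9                      # approx. annual crude output (assumption)
oil_volume = oil_tonnes * 1000 / 850.0  # ~4.9e9 m3 at ~850 kg/m3

print(f"CO2 to sequester: {co2_volume:.1e} m3/yr vs "
      f"oil extracted: {oil_volume:.1e} m3/yr")
```

Even with these rough assumptions, the CO2 stream’s volume already matches or exceeds the oil industry’s annual throughput, consistent with the comparison above.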

This doesn’t mean that we won’t see any little BECCS projects in niche circumstances. One of these already exists: ADM is capturing carbon dioxide from ethanol fermentation in one of its refineries for use in CCS research. Capturing carbon dioxide from ethanol fermentation is relatively simple and cheap. If there happens to be some half-depleted nearby oil field suitable for enhanced oil recovery, some ethanol “CCS” projects could pop up here and there. But this has little to do with a “billion ton negative emissions” vision.

BECCS thus appears as one, albeit a particularly prominent, example of baseless techno-optimism leading to dangerous policy choices. Dangerous, that is, because hype about sci-fi solutions becomes a cover for the failure to curb fossil fuel burning and ecosystem destruction today.

Is geoengineering research going outdoors?

by Blaž Gasparini and Prof. Ulrike Lohmann

Weather-balloon carrying an ozone measuring device. Could geoengineering soon go beyond computer simulations and lab experiments? (Photo: Penn State / flickr CC BY-NC 2.0)

Geoengineering research has so far been confined to modelling and laboratory studies. Serious research outside these limits has been taboo because of the risks it may pose for ecosystems and society. However, two recent publications are breaking the ice and bringing the discussion of field experiments into the limelight of the scientific community.

Climate science is giving a clear signal that action has to be taken to halt global warming. Rising greenhouse gas emissions are driving us towards a climate with negative consequences for society in most parts of the world, for instance through an increase in weather extremes. The fact that we still do not have any binding agreements on reducing greenhouse gases is pushing part of the scientific community towards researching technological fixes for the climate problem – geoengineering.

Geoengineering aims at treating the “symptoms” of climate change – most notably the temperature increase – by altering the Earth’s radiation balance. The proposed methods vary greatly in terms of their technological characteristics and possible consequences. In this blog post, we focus on the best-known solar geoengineering method: the “artificial volcano” or stratospheric aerosols method. Scientists propose injecting small, reflective sulphate particles (aerosols) into the stratosphere at 15–20 km altitude. The small particles reduce the amount of solar radiation reaching the Earth’s surface and thus cool the lower atmosphere. This effect has been observed after large volcanic eruptions (thus the name “artificial volcano”), most recently after the 1991 Mt. Pinatubo eruption, when the global average temperature decreased by almost 0.5°C in the year following the eruption. Unlike a volcanic eruption, however, geoengineering would require continuous injection of aerosols until greenhouse gas levels dropped below a level determined to be safe. So far, all solar geoengineering studies have been confined to computer models [1].

Field experiments

A group of atmospheric scientists have recently proposed nine field experiments to test solar geoengineering methods [2]. They divided ideas into those that aim at understanding the effectiveness and risks of geoengineering and those aimed at developing technologies needed for the deployment of geoengineering. Furthermore, the scientists made a clear distinction between experiments seeking to understand small-scale atmospheric processes like chemical reactions on the surface of artificially injected particles and those targeting large-scale climate responses, e.g. a decrease in global average temperature. The impact of large-scale experiments cannot be simply extrapolated from small-scale ones. However, large-scale experiments would only be performed in cases where numerous prior small-scale tests proved successful with only negligible environmental risks – which might be too late to avoid some of the negative consequences of global warming.

The proposed experimental design of a small-scale stratospheric sulphur (and water) injection field test (source: [3])

Of the proposed experiments, a small field test called the stratospheric controlled perturbation experiment (SCoPEx) is at the most advanced planning stage [3]. A Harvard research group designed the experiment to better quantify a side-effect of stratospheric sulphur injections: ozone depletion. A decrease in stratospheric ozone levels can increase the risk of skin cancer, which could be even more disruptive for society than the greenhouse gas-driven warming of the planet. A sudden decrease in ozone concentrations during SCoPEx would probably kill the idea of stratospheric sulphur geoengineering. As illustrated in the figure, the experiment consists of a balloon with a module carrying an aerosol generator, observational instruments, and an engine. The module both injects and monitors the aerosol plume. The experiment is expected to emit less sulphur and water than an intercontinental flight between Europe and the US. The researchers estimate the total costs of the field experiment to be around USD 10 million.

Why is field testing so controversial?

SCoPEx and the other currently proposed small-scale experiments most likely do not pose significant risks for the environment and society. Unlike full geoengineering deployment, with large-scale, decades-long injections, these experiments would not modify the planet’s energy balance. However, other issues remain with the proposed outdoor geoengineering research:

  1. The first field experiments could add momentum towards rapid deployment and commercialisation of geoengineering. Can we imagine a large multinational company taking over geoengineering research, and the economic interests this would create?
  2. Increased geoengineering research could discourage mitigation efforts. Why would we mitigate carbon emissions if we have a Plan B which can partially counteract global warming?
  3. Who/which body would be authorised to monitor outdoor tests? Who can define the limit between a small-scale experiment and full deployment? And finally, who would control the global thermostat if full deployment took place?

Why bother with geoengineering at all?

Curiosity-driven geoengineering research provides the information society and policymakers need to choose the best strategy in dealing with climate change [4]. Geoengineering modelling studies contribute to a better understanding of the stratosphere and more accurate representation of aerosol processes and their interactions with climate, e.g. the impact of volcanoes on global temperature, precipitation, crop yields, etc. This results in more robust modelling projections of the future climate.

We think geoengineering tests should be constrained to either computer models or laboratories until we develop a good understanding of all associated natural processes and risks. Small-scale process-based experiments could prove to be useful – however, we suggest taking a step back and focusing on open questions regarding natural atmospheric processes like stratospheric aerosol microphysics.

This blog was co-written by PhD student Blaž Gasparini and Prof. Ulrike Lohmann, and was originally posted to ETH Zürich

Further information

[1] The topic has already been discussed in more detail in previous blog entries (in German): Geoengineering – Ein gefährliches Spiel mit Aerosolen? (“Geoengineering – a dangerous game with aerosols?”) and Kann Geoengineering das Klimaproblem lösen? (“Can geoengineering solve the climate problem?”)

[2] Keith et al., 2014: Field experiments on solar geoengineering: report of a workshop exploring a representative research portfolio, Phil. Trans. Roy. Soc., doi: 10.1098/rsta.2014.0175

[3] Dykema et al., 2014: Stratospheric controlled perturbation experiment: a small-scale experiment to improve understanding of the risks of solar geoengineering, Phil. Trans. Roy. Soc., doi:10.1098/rsta.2014.0059

[4] see also Robock, A. 2012: Is geoengineering research ethical? (Pdf)

‘Climate hacking’ would be easy – that doesn’t mean we should do it

by Erik van Sebille and Katelijn Van Hende

Just mimic this a few dozen times and we’ll be right. Right? (Photo: Taro Taylor/Wikimedia Commons, CC BY)

Some people might argue that the greatest moral challenge of our time is serious enough to justify deliberately tampering with our climate to stave off the damaging effects of global warming.

Geoengineering, or “climate hacking”, to use its more emotive nickname, is a direct intervention in the natural environments of our planet, including our atmosphere, seas and oceans.

It has been suggested that geoengineering might buy us time to prevent warming above 2C, and that we should look at it seriously in case everything goes pear-shaped with our climate.

There are two problems with this argument. The first is that we already have an affordable solution with a relatively well-understood outcome: reducing our carbon emissions.

The second is that geoengineering itself is fraught with danger and that, worryingly, the most dangerous version, called solar radiation management, is also the most popular with those exploring this field.

Down in flames

In essence, solar radiation management is about mimicking volcanoes. Climate scientists have known for years that major volcanic eruptions can eject so much ash into the high atmosphere that they effectively dim the sun.

The tiny ash particles block the sunlight, reducing the amount of solar energy that reaches Earth’s surface. A major volcanic eruption like that of Mount Pinatubo in 1991 can cause worldwide cooling of up to about 0.5C over the following two or three years.

As global temperatures will rise in the business-as-usual scenario, leading to a projected increase of almost 4C in the coming century, the ash of a few volcanic eruptions each year could theoretically offset the temperature rise due to the burning of fossil fuels.

Science has also taught us that depositing the ash, or something similar, into the high atmosphere is not very difficult. Some studies show that by using balloons, it could cost as little as a few billion dollars per year.

It certainly sounds like a much cheaper and easier approach than trying to negotiate a worldwide treaty to cut carbon emissions from nations across the globe.

Unlike global emissions cuts, geoengineering has the potential to be financed and implemented by a single wealthy individual, and can arguably be accomplished with a lot less effort.

Major problems

If it is so easy, why aren’t we already pumping ash into the sky to dim the Sun? Perhaps predictably, it’s because this climate solution is likely to create new problems of its own.

The Intergovernmental Panel on Climate Change (IPCC) has completely rejected solar radiation management – not because it is too hard, but because there is no guarantee that the consequences will be benign.

There are three major problems that make this form of geoengineering so dangerous that, hopefully, it will never be used.

First, it does not address the root cause of climate change; it only addresses one of the symptoms – global warming – while failing to deal with related issues such as ocean acidification. This is because our carbon dioxide emissions will continue to build up in the atmosphere and dissolve in the oceans, making seawater more acidic and making it harder for species like corals and oysters to form their skeletons.

The second problem is also related to the continued build-up of atmospheric carbon dioxide. If, at some point in the future, we stop pumping ash into the skies, the ash will rapidly wash out from the atmosphere in a few years. Yet with atmospheric carbon dioxide levels even higher than before, Earth will experience rapid “catch-up” warming. According to the IPCC, this could be as much as 2C per decade – roughly 10 times the current rate. This would be very troubling, given that many species, including in places such as Sydney, are already struggling to adapt to the current pace of change.

Third, pumping dust into our skies will almost certainly change the weather. In particular, it is likely to alter the amount of rainfall from country to country. Some will become drier, others wetter, with a range of grave impacts on many types of agriculture. It is not yet clear how individual countries will be affected, but we know that unpredictable water and food supplies can provoke regional conflict and even war.

Safeguarding the future

The precautionary principle has been embedded into national environmental laws and some international agreements (such as Article 3(3) of the UN Framework Convention on Climate Change). While this principle impels countries to act to stave off climate harm, it would also arguably require geoengineering proposals to be scrutinised with care.

It is difficult to design cautious policies, or even draw up regulations, on issues like geoengineering, where the outcome can at best be only partly predictable. Policies and regulations should be designed to have an intended and purposeful effect, which geoengineering at the moment cannot deliver.

Some researchers have gone as far as to brand geoengineering immoral, while the concept has also been described as an Earth experiment, in addition to the experiment already being done with greenhouse emissions.

The only thing we know for certain is that we need a lot more certainty before deciding to hack our climate.

This was originally posted to the Conversation.

Reflecting sunlight into space has terrifying consequences, say scientists

Workers on Germany’s highest mountain, Zugspitze, cover the glacier with oversized plastic sheets to keep it from melting during the summer months. Scientists have said geoengineering must be researched to find a possible solution of last resort to dangerous levels of global warming. Photograph: Matthias Schrader/AP

But ‘geoengineers’ say urgent nature of climate change means research must continue into controversial technology to combat rising temperatures

This was originally published by the Guardian

by Damian Carrington

Fighting global warming by reflecting sunlight back into space risks “terrifying” consequences including droughts and conflicts, according to three major new analyses of the promise and perils of geoengineering. But research into deliberately interfering with the climate system must continue in search of technology to use as a last resort in combating climate change, scientists have concluded.

Billions of people would suffer worse floods and droughts if technology was used to block warming sunlight, the research found. Technology that sucks carbon dioxide from the air was less risky, the analysis concluded, but will take many more decades to develop and take effect.

The carbon emissions that cause climate change are continuing to rise and, without sharp cuts, the world is set for “severe, widespread, and irreversible impacts”. This has led some to propose geoengineering but others have warned that unforeseen impacts of global-scale action to try to counteract warming could make the situation worse.

Matthew Watson, at the University of Bristol, who led one of the studies in the £5m research programme, said: “We are sleepwalking to a disaster with climate change. Cutting emissions is undoubtedly the thing we should be focusing on but it seems to be failing. Although geoengineering is terrifying to many people, and I include myself in this, [its feasibility and safety] are questions that have to be answered.”

Watson led the Stratospheric Particle Injection for Climate Engineering (Spice) project, which abandoned controversial attempts to test spraying droplets into the atmosphere from a balloon in 2012. But he said on Wednesday: “We will have to go outside eventually. There are just some things you cannot do in the lab.”

Prof Steve Rayner at the University of Oxford, who led the Climate Geoengineering Governance project, said the research showed geoengineering was “neither a magic bullet nor a Pandora’s box”.

But he said global security would be threatened unless an international treaty was agreed to oversee any sun-blocking projects. “For example, if India had put sulphate particles into the stratosphere, even as a test, two years before the recent floods in Pakistan, no one would ever persuade Pakistan that that had not caused the floods.”

The researchers examined two types of geoengineering: solar radiation management (SRM) and carbon dioxide removal (CDR). Prof Piers Forster, at the University of Leeds, led a project using computer models to assess six types of SRM. All reduced temperatures, but all also worsened floods or droughts for 25%-65% of the global population, compared to the expected impact of climate change:

  • mimicking a volcano by spraying sulphate particles high into the atmosphere to block sunlight adversely affected 2.8bn people
  • spraying salt water above the oceans to whiten low clouds and reflect sunlight adversely affected 3bn people
  • thinning high cirrus clouds to allow more heat to escape Earth adversely affected 2.4bn people
  • generating microbubbles on the ocean surface to whiten it and reflect more sunlight adversely affected 2bn people
  • covering all deserts in shiny material adversely affected 4.1bn people
  • growing shinier crops adversely affected 1.4bn people

The adverse effect on rainfall results from changed differences in temperature between the oceans and land, which disrupts atmospheric circulation, particularly the monsoons over the very populous nations in SE Asia. Nonetheless, Forster said: “Because the [climate change] situation is so urgent, we do have to investigate the possibilities of geoengineering.”

Rayner said SRM could probably be done within two decades, but was difficult to govern and the side effects would be damaging. He noted that SRM does not remove carbon from the air, so only masks climate change. “People decry doing SRM as a band aid, but band aids are useful when you are healing,” he said.

In contrast, CDR tackles the root of the climate change problem by taking CO2 out of the atmosphere, would be much easier to govern and would have relatively few side effects. But Rayner said it will take multiple decades to develop CDR technologies and decades more for the CO2 reductions to produce a cooling effect. “You are going to have to build an industry to reverse engineer 200 years of fossil fuel industry, and on the same huge scale,” he said.

The recent landmark report by the UN Intergovernmental Panel on Climate Change (IPCC), signed off by 194 governments, placed strong emphasis on a potential technology called bioenergy carbon capture and storage (BECCS) as a way to pull CO2 from the atmosphere. It would involve burning plants and trees, which grow by taking CO2 from the air, in power plants and then capturing the CO2 exhaust and burying it underground.

“But if you are going to do BECCS, you are going to have to grow an awful lot of trees and the impact on land use may have very significant effects on food security,” said Rayner. He added that the potential costs of both SRM or CDR were very high and, if the costs of damaging side effects were included, looked much more expensive than cutting carbon emissions at source.

Both Watson and Rayner said the international goal of keeping warming below the “dangerous” level of 2C would only be possible with some form of geoengineering and that research into such technology should continue.

“If we found any [geoengineering] technology was safe, affordable and effective that could be part of a toolkit we could use to combat climate change,” said Rayner.

“If we ever deploy SRM in anger it will be the clearest indication yet that we have failed as planetary guardians,” said Watson. “It [would be] a watershed, fundamentally changing the way 7bn people interact with the world.”

Intergovernmental Climate Report Leaves Hopes Hanging on Fantasy Technology

by Rachel Smolker

Wellheads that check the temperature and pressure of the sequestered carbon dioxide gas at American Electric Power’s Mountaineer plant.

This year, the Intergovernmental Panel on Climate Change (IPCC) has confirmed for us, once again, that the planet is warming, even more and even faster than panel members thought. In fact, it is getting even warmer even faster than they thought the last time they admitted to having underestimated the problem. We humans are in deep trouble, and finding a way out of this mess – one that will ensure a decent future for us – is becoming increasingly difficult, if not nearly impossible.

That difficult task is what the latest installment from the IPCC, the Working Group 3 report on mitigation, is intended to address. This past weekend, the “summary for policymakers” was released after the mad rush of government negotiations over the scientists’ text in Berlin last week.

This is the fifth assessment report, and it differs from previous reports by also including some (contentious) discussion of ethical considerations. Notably, this report acknowledges that economic growth is the fundamental driver of emissions. It also offers economic analysis showing that taking the necessary steps to protect the climate would require an annual economic growth opportunity loss of a mere 0.06 percent. As Joe Romm noted, that is “relative to annualized consumption growth in the baseline that is between 1.6 percent and 3 percent per year. So we’re talking annual growth of, say, 2.24 percent rather than 2.30 percent to save billions and billions of people from needless suffering for decades if not centuries.”
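To make Romm’s point concrete, here is an illustrative back-of-envelope calculation. The 2.30 and 2.24 percent growth rates come from the quote above; the 50-year horizon is our own assumption, chosen only to show how small the gap remains even after decades of compounding:

```python
# Illustrative arithmetic only: compound the two annual consumption growth
# rates quoted above (2.30% baseline vs. 2.24% with mitigation costs) over
# an assumed 50-year horizon and compare the outcomes.
years = 50
baseline = 1.0230 ** years    # consumption multiple without mitigation costs
mitigation = 1.0224 ** years  # consumption multiple with mitigation costs

print(f"Baseline growth over {years} years:   {baseline:.2f}x")
print(f"Mitigation growth over {years} years: {mitigation:.2f}x")
print(f"Relative consumption loss: {(baseline - mitigation) / baseline:.1%}")
```

Even after half a century of compounding, the two paths differ by a consumption loss of roughly 3 percent, which is the sense in which the mitigation cost is “a mere 0.06%” per year.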

That’s great, but the big question is: What investments are recommended, and would they actually work? What became clear from leaked earlier drafts was the troubling prominence of false solutions and unicorns among the strategies for mitigation.

The report considered 900 stabilization scenarios, aiming to achieve anywhere from 430 to 720 ppm (parts per million of CO2 equivalent) by 2100. What they concluded is that to achieve (maybe) even the alarmingly high 450-550 ppm – the level thought to hold some chance of limiting warming to 2 degrees above pre-industrial levels – would at this point require not only reducing emissions, but also using some technology to actually remove CO2 from the atmosphere.

It seems that IPCC is at a loss to provide realistic pathways even to achieving 450 or 550 ppm, which is pretty alarming in itself, but also, it seems unrealistic to assume in any case that we are in control of earth systems such that we can pick a ppm target and just go there. We are already experiencing unanticipated, underestimated and uncontrollable feedbacks that make the discussions of targets and ppm modeling seem a bit obsolete. Nonetheless, this is the framework for the report.

IPCC is telling us that we will need not only to reduce the ongoing flow of emissions, but also to find a way to pull CO2 out of the atmosphere. The working group’s co-chair, Ottmar Edenhofer, a German economist, stated at the press briefing that many scenarios “strongly depend on the ability to remove large amounts of carbon dioxide from the atmosphere.”

How are we supposed to remove CO2 from the atmosphere? The only techniques on offer are bioenergy with carbon capture and sequestration, also called BECCS, and afforestation.

The problem with this conclusion, and the reason the media picked up on it even prior to the final report’s release, is that BECCS is almost entirely unproven: we already have a strong basis for assuming it will not actually work to remove CO2, and it is extremely risky and costly. IPCC acknowledges this, even as they deem it essential.

The media, starting with The Guardian, picked up on this even in advance of the final negotiations, referring to BECCS as “the dangerous spawn of two bad ideas,” and in another article referring to it as the “plan to worsen global warming.”

The BBC headlined “UN dilemma over ‘Cinderella’ technology.” And the UK Daily Mail asked, “Could we SUCK UP climate change?” referring to the great potential for carbon storage in Britain due to its many abandoned coal mines and gas wells.

Here is what the final summary report actually states: “Mitigation scenarios reaching about 450 ppm CO2eq (carbon dioxide equivalent) in 2100 typically involve temporary overshoot of atmospheric concentrations as do many scenarios reaching about 500-550 ppm CO2eq in 2100. Depending on the level of the overshoot, overshoot scenarios typically rely on the availability and widespread deployment of BECCS and afforestation in the second half of the century. The availability and scale of these and other Carbon Dioxide Removal (CDR) technologies and methods are uncertain, and CDR technologies and methods are, to varying degrees, associated with challenges and risks (see Section SPM 4.2, high confidence). CDR is also prevalent in many scenarios without overshoot to compensate for residual emissions from sectors where mitigation is more expensive. There is only limited evidence on the potential for large-scale deployment of BECCS, large-scale afforestation and other CDR technologies and methods.”

Biofuelwatch (the organization for which I serve as codirector) authored a report on BECCS in 2012, and so we have some familiarity with the nature of the “uncertainties” and the degree to which evidence on the potential is “limited.”

There is near-zero real-world experience with BECCS beyond a handful of attempts and a surprising number of canceled projects.

BECCS is not only “risky”; we already have very good reasons to assume it will fail. For one thing, the entire logic behind BECCS rests on false assumptions. One false assumption is that bioenergy (which so far appears to include all manner of processes, from corn ethanol refineries to coal plants retrofitted to burn trees in place of coal for electricity) is “carbon neutral.” The idea is that adding CCS to a carbon-neutral process will render it “carbon negative.” That simplistic thinking assumes that carbon absorbed out of the atmosphere by plants as they grow will be captured and buried, and then when more plants grow, they will absorb yet more carbon, a net “removal.” But much is left out of that story.

Virtually nobody still contends that corn ethanol is “carbon neutral.” Yet the premier BECCS project that is often referred to is an ADM corn ethanol refinery in Decatur, Illinois. In fact, when emissions from indirect impacts are included in analyses, along with a complete assessment of the impacts from growing, harvesting, fertilizer and chemical use and so on, most bioenergy processes actually cause more emissions than even the fossil fuels they are meant to replace. As for burning biomass (mostly wood) for electricity, there is a substantial literature – including peer-reviewed science – challenging the “carbon neutral” claim. It is well established that, counting just the emissions from smokestacks, burning wood releases around 50 percent more CO2 per unit of energy generated than even coal, along with many other pollutants. And it is simply incorrect to assume that this CO2 (as well as further emissions resulting from harvest, transport and many indirect impacts) will be resequestered in new tree growth. If new trees do in fact grow, it may take decades. Further, we know already from the current scale of biofuel and biomass demand – just look at the current corn ethanol debacle – that it is driving loss of biodiversity, higher food prices, land grabs and other damages. Scaling up bioenergy to the extent that would be required to supposedly reduce global CO2 levels would be a disastrous backfire.

IPCC might have noted that the US EPA, charged with regulating CO2 emissions, found itself stymied with regard to how to account for emissions from bioenergy. Under pressure from industry, the agency decided to exempt biomass-burning facilities from regulation for three years while it studied the problem. But that exemption was challenged in court, and the judge ruled there was no basis for it. In other words, CO2 from bioenergy should not be assumed “neutral” and therefore should not be exempted from regulation.

Most BECCS projects so far involve capturing CO2 streams from ethanol fermentation processes (because that is a relatively pure stream of CO2 that is cheaper and easier to capture). But then the CO2 is not stored safely away; rather, it is pumped into depleted oil wells to raise the pressure enough to force remaining oil out, a process called “enhanced oil recovery.” Oil industry analysts in fact estimate there is huge potential for accessing oil in this manner, and because it is profitable, it offsets some of the very substantial costs associated with CCS. This is hardly “carbon dioxide removal”! Furthermore, it is laying the practical groundwork for applying CCS to fossil fuels – i.e. so-called “clean coal.” Capturing CO2 from coal plants remains more expensive and difficult due to the mix of gases, but the coal industry is hopeful that the necessary technology development will occur with BECCS.

The largely prohibitive costs have to do with the fact that capturing, compressing, transporting and storing CO2 all require infrastructure and energy. Adding CCS is assumed to impose a “parasitic” energy load of at least 30 percent of the facility’s capacity. In other words, at least 30 percent more biomass would be needed simply to power the CCS process itself.
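The fuel-penalty arithmetic implied by that paragraph can be sketched as follows. The 30 percent figure comes from the text above; how the parasitic load is defined, relative to the plant’s original output or to its gross output, is our assumption, and it changes the answer:

```python
# Back-of-envelope sketch of the CCS "parasitic load" arithmetic.
# The 30% figure is from the text; the two interpretations below are
# illustrative assumptions, not claims about any specific facility.
parasitic_fraction = 0.30

# Interpretation 1: the capture equipment consumes energy equal to 30%
# of the plant's original output, so fuel input rises by the same 30%.
extra_fuel_vs_original = parasitic_fraction

# Interpretation 2: 30% of GROSS output is diverted to capture, so keeping
# net output constant requires 1 / (1 - 0.30) - 1, i.e. about 43% more fuel.
extra_fuel_vs_gross = 1 / (1 - parasitic_fraction) - 1

print(f"Extra fuel, load relative to original output: {extra_fuel_vs_original:.0%}")
print(f"Extra fuel, load relative to gross output:    {extra_fuel_vs_gross:.0%}")
```

Either way, a substantial share of any additional biomass harvested would go simply to running the capture process rather than delivering usable energy.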

Pumping and storing CO2 – from bio or fossil fuels – underground is downright foolhardy. We know full well that the earth’s crust is not static! There is the potential that CO2 deposits could increase seismicity (earthquakes). A catastrophic sudden release would be very dangerous, given that CO2 is lethal at high concentrations. There is also much concern that the vast infrastructure of pipelines, trucking and so on that would be entailed in large-scale deployment of CCS (with fossil or bio energy) would result in myriad small-scale leaks. Vaclav Smil calculated that to sequester just a fifth of current carbon dioxide emissions “. . . we would have to create an entirely new worldwide absorption-gathering-compression-transportation-storage industry, whose annual throughput would have to be about 70 percent larger than the annual volume now handled by the global crude oil industry, whose immense infrastructure of wells, pipelines, compressor stations and storages took generations to build.”

IPCC recognizes how risky and uncertain BECCS is, and yet they still deem it essential? We might have hoped they would offer a pathway with more likelihood of success, given all that is at stake.

IPCC also includes natural gas, nuclear and large-scale bioenergy as “low-carbon or zero-carbon” options. And, as with BECCS, the report pays lip service to the risks and concerns around these, but it minimizes those very real risks when the scenarios it relies on incorporate the same mitigation strategies (to differing degrees) as though they were viable.

To their credit, IPCC has recognized that geoengineering is not an option and should not be considered “mitigation.” While there was pressure, especially from Russia, to include geoengineering, including solar radiation management (SRM), in the mix, this was met with welcome resistance. Carbon Dioxide Removal techniques, including BECCS, are also considered in the context of geoengineering debates. But they are tightly linked to practices already in place, so it is more difficult to place them squarely in the geoengineering camp, where they would be subject to the Convention on Biological Diversity’s de facto moratorium. We already know the impacts of large-scale bioenergy, and they are not at all clean, green, sustainable, low-carbon or carbon negative. They make matters worse, not better. Under the influence of desperation, we risk making lethal blunders.

While IPCC painted a remarkably palatable economic picture of the costs of mitigation, they fell pretty flat in providing realistic means for putting that finance to successful ends. Perhaps the problem boils down to this: IPCC knows economic growth is the driver, but instead of suggesting that we dramatically ramp it down within a justice-based framework, they seek a means to keep the engines of growth revving, using “alternative,” so-called “zero- and low-carbon” sources of energy and materials. In so doing, they sidestep reality.

This article originally appeared in Truthout.

Five facts CBC listeners didn’t hear from Canada’s geoengineering cheerleader

What’s missing from David Keith’s climate change charm offensive

by Jim Thomas

This article was originally published by the Media Co-op.

David Keith’s preferred geoengineering scheme involves spraying sulphuric acid into the atmosphere.

Last Sunday, CBC listeners across Canada enjoyed their morning coffee and took care of a few chores around the house while the calm, mellifluous vocal cadences of Michael Enright and his guest David Keith washed over them. Keith, Enright said while introducing his guest, is a prominent and well-respected scientist, and the author of “The Case for Climate Engineering.”

Although both David Suzuki and Al Gore had branded Keith’s proposals “insane, utterly mad and delusional in the extreme,” Enright took pains to reassure listeners that his guest — a Harvard professor — was perfectly sane. Enright was kinder to Keith than Stephen Colbert had been a few months earlier, and so unfortunately avoided a number of tough questions.
Climate Geoengineering is the process of attempting to counteract climate change by large-scale methods other than reducing carbon emissions. These include spraying tonnes of sulphuric acid into the atmosphere (Keith’s preferred option), mounting giant space mirrors to reflect sunlight and slow its warming effects, dumping tonnes of iron filings into the ocean to stimulate plankton growth, and sucking carbon out of the atmosphere with giant fans.
These measures have been opposed both because of their unpredictable effects and the fact that they give an excuse to rich countries to continue to increase carbon emissions on the basis of trumped-up techno-promises. In the same breath, Keith acknowledges and dismisses these criticisms.
Environmentalists who oppose geoengineering, Keith told Enright, are “more committed to their answer to the problem than really thinking in what I feel is a morally clear way about what our duties are to this generation and reducing the risks that they feel.”
Keith made the case for geoengineering, but he also made the case that those who oppose geoengineering do so because they have priorities other than slowing the effects of climate change. He aligned geoengineering with concerns about “how we want to leave the planet for our great-grandkids.” He took the time to talk about kayaking trips, and how he was motivated by a love of the natural world.
Keith didn’t take the time to mention a few other details. For those who are skeptical about Keith’s case for geoengineering, here are five things that Keith didn’t mention, and Enright kindly didn’t bring up.
1. David Keith runs a geoengineering company funded by tar sands money
In addition to being an author and a professor, David Keith heads up Carbon Engineering, a Calgary-based startup that is developing air-capture technologies for removing carbon dioxide from the atmosphere. The company is funded by Bill Gates, who is also a geoengineering proponent, and by N. Murray Edwards, an Alberta billionaire who made his fortune in oil and gas. Edwards is said to be the largest individual investor in the tar sands, and is on the board of Canadian Natural Resources Limited, a major tar sands extraction company. Carbon Engineering hopes to sell the carbon dioxide it extracts to oil companies for use in Enhanced Oil Recovery (EOR), a technique for squeezing more fossil fuels out of the ground, which will in turn be burnt to produce more atmospheric carbon.
2. The geoengineering that Keith proposes could be disastrous for the Global South
A study of the likely effects of one of the methods Keith is promoting, spraying sulphuric acid into the atmosphere with the aim of reflecting sunlight, found that it could cause “calamitous drought” in the Sahel region of Africa. Home to 100 million people, the Sahel is Africa’s poorest region. Previous droughts have been devastating: a 20-year dry period ending in 1990 claimed 250,000 lives. Other models predict possible monsoon failure in South Asia or impacts on Mexico and Brazil, depending on where the sulphur is sprayed.
3. Keith’s geoengineering proposals are deeply aligned with the financial interests of the fossil fuel industry
If oil, natural gas and coal companies can’t extract the fossil fuels that they say they’re going to extract, they stand to lose trillions of dollars in stock value, $2 trillion in annual subsidies, and about $55 trillion in infrastructure. David Keith’s enthusiasm for geoengineering plays to the commercial interests of these companies, whose share value depends on their ability to convince investors that they can continue to take the coal out of the hole and the oil out of the soil. This may be why fossil-sponsored neoconservative think tanks such as the American Enterprise Institute and the Heartland Institute have been so gung-ho for geoengineering research and development along exactly the lines that David Keith proposes. For example, there is very little difference between what Keith proposes and what the American Enterprise Institute’s geoengineering project calls for.
4. Climate scientists just issued a new round of criticisms of geoengineering
In the most recent report of Working Group II of the Intergovernmental Panel on Climate Change (IPCC), released before Keith’s interview aired, climate scientists loosed a new salvo of problems with various geoengineering schemes. “Geoengineering,” according to the report, “poses widespread risks to society and ecosystems.” In some models, Solar Radiation Management (SRM) — what Keith is pitching — “leads to ozone depletion and reduces precipitation.” And if SRM measures are started and then stopped for whatever reason, there is a risk of “rapid climate change.”
5. There’s already a widely-backed moratorium on geoengineering
While David Keith discussed possible ways of governing geoengineering internationally, he failed to mention that at least one UN convention is already dealing with the topic. The broadest decision yet on geoengineering, a 193-country consensus reached at the UN Convention on Biological Diversity, specifies that unless certain criteria are met, “no climate-related geo-engineering activities that may affect biodiversity take place.” The moratorium is to remain in effect until geoengineering’s impacts on biodiversity and livelihoods are analyzed, scientific evaluation is possible, and “science based, global, transparent and effective control and regulatory mechanisms” exist.
In the interview, Keith said outright that he wants to bypass such a system. He considers the input of Africa and South America, and much of Europe and Asia, unnecessary for moving forward with a geoengineering scheme. It would be enough, he told Enright, to gain the agreement of a small but powerful group of “countries with democratic institutions,” citing China as an example, along with the US and the European Union. David Keith has been recognized for his achievements in applied physics, but when it comes to political science, it may be time for him to hit the books.
Jim Thomas is a Research Programme Manager and Writer at ETC Group.


Where’s the Lorax When We Need Him?

It’s a shame that the Lorax and his message, “Who Will Speak for the Trees,” have been relegated to the realm of children’s cartoons and fantasy, especially as trees, forests and ecosystems appear to be right smack in the epicenter of swirling debates about climate change. What those debates seem to boil down to (as the world burns around us) is whether it makes more sense to 1) cut down remaining forests and burn them for “renewable energy,” 2) put a fence around them, measure their carbon content and sell them to polluters as “offsets,” or 3) install vast plantations of trees (perhaps genetically engineered to grow faster) to suck up atmospheric carbon in hopes this will counter the ongoing gush of carbon into the atmosphere (geoengineering via “afforestation”), or follow a recent proposal which suggests that cutting down high-latitude temperate and boreal forests, or replacing them with short-rotation tree plantations, might help “fix” the climate.

Decisions, decisions! So many options. What shall we do with all the trees?

Here in the U.S., a debate is brewing out west in light of recent legislative proposals from Oregon representatives Wyden and DeFazio (among others) that would provide support for “thinning and restoration” (climate-speak for logging) on public lands. The underlying motive is to gain access to timber currently off limits, to supply the expanding demand for biomass to burn as “renewable energy.”

Meanwhile, one of the few proclaimed “successes” coming out of the Warsaw climate negotiations was movement toward an agreement on REDD+ (reducing emissions from deforestation and forest degradation). Yet this is hardly a success, given mounting evidence that, among other concerns, REDD fails to address the underlying drivers of deforestation. What it has achieved is to create bitter divisions among indigenous communities faced with proposals that would commodify their lands, and to utterly distract forest policymakers, who are now so caught up in endless debates over REDD that they seem barely to notice that their forests are meanwhile being liquidated.

As if we were not confused enough already, brilliant scientists at Dartmouth have now determined that the value of high-latitude temperate and boreal forests (let’s call them “Truffula” trees) should not be measured solely in terms of their carbon content, nor the value of their timber, but also with respect to the “ecosystem service” they provide (to us, that is) of absorbing or reflecting sunlight: their impact on albedo. The basic idea is that at higher latitudes, dark-colored tree cover absorbs light and has a net warming impact, whereas removing trees or keeping them small and immature allows light to penetrate, increasing the reflectivity of the white ground surface and contributing to a net cooling effect.

The Dartmouth scientists say: “Our results suggest that valuing albedo can shorten optimal rotation periods significantly compared to scenarios where only timber and carbon are considered… we expect that in high latitude sites, where snowfall is common and forest productivity is low, valuing albedo may lead [to] optimal rotation periods that approach zero.”

In other words those forests will be “more valuable” cut down or replaced with short rotation stunted tree plantations.

In their conclusions the Dartmouth authors state: “In particular, documenting relationships between forest biomass growth, the frequency of snowfall, latitude, and regional stumpage prices may help elucidate locations wherein different forest project strategies provide the maximum climatic benefits.”

Oh really? That sounds easy! Just plug those numbers into an equation and voila we can discover that “optimal climate benefits” indicate we should cut the forests down?

What would the Lorax say, I wonder? Probably that the “value” of forests should not be confined in so reductionist a manner, where consideration is granted solely to carbon content, albedo, or timber harvest pricing. What about, just for example, the role of forests in regulating temperature and rainfall patterns over large areas of the earth? Or the role of compounds released into the atmosphere by trees in stimulating cloud formation (and hence influencing albedo)? What about the role of forests in creating fertile soils? Or the production of hydroxyl radicals by forests, which are thought to play a key role in the breakdown of atmospheric pollutants? And what about the many, many life forms that depend on healthy forest ecosystems?

The authors seem to have had some inkling that there might be other views of the “value” of forests, and offer lip service, recommending that “forest management should include biodiversity considerations when managing the flow of timber, carbon, and albedo services in mid and high latitude temperate and boreal forests.”

Managing the flow of services? If we cut them down then it seems most of the “flow” of “services” will likely come to a screeching halt!

The paper closes with a plug for funding: “Thus, as in all modeling work, we must take caution to consider that optimal forest management may vary quite drastically as the planet responds to climate change. Consequently, detailed and refined projections of these changes are critical for future work in this arena.”

Well, it is good that the authors realize that things will drastically change as climate change progresses. Like, for example, we might find that there is a lot less snow, or that it melts much faster. Which raises the question: After we have cut down the high-latitude temperate and boreal forests to increase the reflective potential of snow cover, what happens when there is no more snow? Then we have no albedo and… no trees, no timber, no carbon, no biodiversity. This must be the Once-ler’s idea!

Perhaps they are taking their lead from another Once-ler, the Indonesian Ministry of Forestry, which, when recently questioned about the horrendous doubling of Indonesia’s deforestation rate in the year following the announcement of a moratorium on new concessions, responded that this was not “deforestation,” only “temporary deforestation” (euphemism of the year).

The paleoclimate record suggests that in previous cases where earth’s climate has heated up or cooled down, stability was regained in large part by the sequestering of carbon in plants — forests and ecosystems. For example, Southeast Asian tropical peat forests (now being destroyed for oil palm and pulp and paper plantations and “temporary deforestation”) are thought to have played a key role in stabilizing climate between glacial and interglacial periods over the course of the last few million years of earth history.

The survival of many species faced with warming depends upon a steady northward and uphill climb towards cooler and more favorable conditions. So let’s see… now we are going to cut those forests down, further diminishing the remaining pool of ecosystem diversity in order to gain some supposed cooling due to albedo enhancement over the coming season, or two?

I am certain that the Lorax, and the children to whom that story so appeals, have far more common sense.

Another study reveals dangers of geoengineering through ‘iron fertilisation’

Diatom. Picture: Alfred Wegener Institute for Polar and Marine Research

Yet another study shows how dangerously simplistic is the assumption that dumping iron filings into oceans will sequester carbon. This latest study, by Ellery D. Ingall et al., published in Nature Communications, looks at a particular type of phytoplankton, a diatom that soaks up iron from oceans and stores it in its skeleton and thus, when the phytoplankton dies, on the ocean floor.

Don’t Dump Iron — Dump Rogue Climate Schemes

Originally posted to Huffington Post

by Jim Thomas

The press release had a pretty stark headline: “Haida Announce Termination of Russ George.” If the name sounds familiar, it’s because he is the Californian businessman who last year led the world’s largest, and unapproved, geoengineering (climate manipulation) scheme: dumping over 100 tonnes of iron into the Pacific Ocean west of Haida Gwaii in British Columbia. Dubbed a “geo-vigilante” by The New Yorker and a “rogue” geoengineer by almost everyone else (including the World Economic Forum and Canada’s environment minister Peter Kent), George was the guy who persuaded the small impoverished indigenous community of Old Massett on Haida Gwaii to part with over $2.5 million. He did so under the pretense that dumping iron in the ocean to stimulate a plankton bloom would net lucrative profits in the carbon credit market and maybe even bring back salmon stocks.

Geoengineering Is a Dangerous Solution to Climate Change

by Rachel Smolker

As the realities of global climate change become ever more alarming, advocates of technological approaches to “geoengineer” the planet’s climate are gaining a following.

But the technologies that are promoted — from spraying sulphate particles into the stratosphere, to dumping iron particles into the ocean to stimulate carbon-absorbing plankton, to burning millions of trees and burying the char in soils — are all fraught with clear and obvious risks, and are most likely only going to make matters worse.

Yet zeal for these approaches continues unabated. According to the right-wing think tank the American Enterprise Institute, geoengineering offers:

“…the marriage of capitalism and climate remediation…What if corporations shoulder more costs and lead the technological charge, all for a huge potential payoff?…Let’s hope we are unleashing enlightened capitalist forces that just might drive the kind of technological innovation necessary to genuinely tackle climate change.”

Forget about cutting emissions: manipulating the atmosphere and biosphere through geoengineering is the only sensible option for business, and thus for policy makers, they claim.

Notably, on the very same website, the American Enterprise Institute claims that opponents of the Keystone Pipeline are exaggerating environmental risks while undermining economic gains and “neighborliness.”

The connection between the tar sands industry and geoengineering advocates is perhaps not immediately obvious, but it makes perfect, ugly sense. Tar sands investors and their allies have long realized that geoengineering could provide them an extended lease on life — and a convenient means to avoid the shuttering of their industry, which many consider the single most destructive and climate-damaging form of energy extraction.

Hence, it isn’t surprising that tar sands magnate Murray Edwards, director of Canadian Natural Resources Ltd, in fact funds Carbon Engineering, a geoengineering company that works on techniques for capturing CO2 from the air.

Carbon Engineering’s president, David Keith, is one of the most vocal and best-funded advocates of geoengineering. Carbon dioxide air capture is often viewed as benign or “soft” geoengineering. After all, what could possibly be wrong with removing carbon dioxide from the overloaded atmosphere?

For starters, air capture of CO2 requires vast amounts of water and, yes, more energy. According to one study, scrubbing all current annual fossil fuel emissions from the air would deprive 53 million people of water. Even capturing CO2 from power station smokestacks, where it is already in a relatively concentrated stream, requires those power plants to burn nearly one-third more fuel in order to generate the same amount of energy, the additional demand going to power the carbon capture process.

Capturing CO2 from the air, where it is measured in parts per million, would require vastly more power stations to be built in the first place. More carbon-spewing power stations that is, to help scrub a bit of the emitted CO2 back out of the air.

What Carbon Engineering is developing may be nonsensical from an environmental and scientific perspective, but it fits neatly into the tar sands industry’s agenda of portraying itself as “low carbon.” In 2011, Richard Branson chose Calgary to announce the shortlist of his “Virgin Earth Challenge,” which offers a $25 million prize to one project working to remove CO2 from the air. His spokesperson explained the rationale for this choice:

“Calgary is a good place to start low-carbon technology. It’s an energy centre” with the inventiveness and rigor to apply “to sustainable, low-carbon and economically viable technology.”

The tar sands industry’s influence behind so-called “soft” geoengineering can be found in unexpected places. Take a recent announcement by Vermont-based Green Mountain Coffee:

“Green Mountain Coffee Roasters is helping to fund nonprofit Radio Lifeline’s Black Earth Project, an initiative that uses biochar to help Rwandan farmers mitigate the effects of climate change. Radio Lifeline’s project partner Re:char, a Kenyan developer of small-scale biochar technologies, will use agricultural residues such as dried corn stalks, grasses, rice hulls, coffee pulp, cow manure and wood chips as feedstock for the biochar production.”

Green Mountain Coffee and Radio Lifeline may not associate such a project with ConocoPhillips Canada, but in fact, ConocoPhillips has been the foremost corporation promoting and funding biochar developments, apparently motivated by hopes that it can eventually purchase cheap offsets for its tar sands operations, for example under the Alberta Offset System. Re:char themselves have received funding from Conoco.

Far greater Conoco funds have gone to biochar developments in Iowa, to the Biochar Protocol, which aims to get biochar included in carbon offset markets, and to CoolPlanet, a US venture with the motto: “Imagine driving today’s cars & SUV’s while actually reversing global warming using fuel that costs less than $1.50/gallon.”

Other tar sands investors, including Cenovus Energy, BP and Shell, have also funded biochar developments, as has their friend Richard Branson.

Some might argue that it is acceptable to take dirty money to fund projects that will help African farmers make their soils more fertile and hold more carbon. Yet the scientific evidence and experience from field trials show that biochar cannot be relied on to achieve either of those goals.

It can even suppress yields and cause a loss of soil carbon. Farmers recruited for supposed “trials” tend to be ill-informed, hearing only the hype from project developers. In effect, they are being duped into taking part in these projects on the basis of incomplete and, in some cases, downright false information.

For example, when a Cameroonian researcher looked at a Biochar Fund project in his country, he found that farmers had been promised great benefits, including finance from nonexistent carbon markets. They had donated their land and labor. Yet the promised benefits failed to materialize, and the project was soon abandoned. It was nonetheless touted as a “success” on websites and in the media.

So far, biochar projects have invariably been small, largely serving PR purposes. Yet if, as many of its advocates hope, biochar were to be scaled up to the level needed to offset any significant share of fossil fuel emissions, the consequences would be grave. According to one study of the “sustainable biochar potential,” 556 million hectares of land would need to be converted to biochar production to “offset” just 12 per cent of annual CO2 emissions (presuming, of course, that all of that biochar would actually sequester carbon, which the evidence contradicts).

Carbon dioxide air capture and biochar, despite their potentially massive demands on energy, water and land, are among the geoengineering proposals considered more benign. They are being promoted in part to soften up public opinion for other, more intuitively objectionable forms of geoengineering, such as spraying vast amounts of sulphur particles into the stratosphere or manipulating clouds over large areas.

Those approaches would indeed be guaranteed to produce rapid effects. Among them: immediate crop failures, acid rain and ozone destruction. In sum, geoengineering options amount to “picking your poison.” The tar sands industry, with somewhere on the order of $50 billion invested and rapidly expanding operations, is hoping that this choice will enable it to continue profiting from its dirty business, at any and all cost to the planet.

This article originally appeared on HuffPo.