Post-Paris world: towards doom or salvation?

At the end of 2015, two events took place in Paris that resonated throughout the world and perhaps altered the course of modern history in significant ways: the ISIL terrorist attacks that killed 130 people, and the UN Conference on Climate Change (often referred to as COP21, which stands for “The 21st Conference of the Parties” in UN jargon). The former brought pain and misery both to the people directly affected by the attacks and to those who now face the resulting global wave of mistrust, hate, and xenophobia. The COP21 meeting, on the other hand, showed that the global community can come together in times of crisis and work together to address climate change – a threat that has the potential to displace and kill far more people than any terrorist organization ever could. Or did it?


Climate March in Downtown Los Angeles. Photo: J. Baronas, 2015.

TL;DR of the Paris Agreement

In contrast to the widely acknowledged failure that was the 2009 Copenhagen COP15, the meeting in Paris has been hailed as a huge success and “the end of the fossil fuel era” by both the mainstream media and climate change activists. So what makes it so much better?

Surprisingly, in terms of specific goals agreed upon, not that much:

  • A long-term goal of keeping global warming “well below” 2° C – previously agreed upon in Cancun in 2010;
  • At the insistence of the most vulnerable countries, such as low-lying island nations, an “urging” to limit the warming to 1.5° C – a new but wildly unrealistic proposal, considering that current emissions-reduction pledges put us on track for more than 3° C (see below);
  • The goal of setting up a $100 billion per year fund to help poor countries reduce their emissions and deal with the impacts of climate change – first proposed in Copenhagen in 2009;
  • A requirement for countries to pledge how much they want to cut their emissions, and a framework to review everybody’s progress every five years – with the “encouragement” for countries to increase the cuts each time. This one is new and the centerpiece of the whole Paris Agreement.

To be fair, the COP meetings build on each other, and the Paris Agreement text reflects the years of work since Copenhagen and even before then. What makes the Paris Agreement truly different is that it got the approval of all the 200+ nations involved and should become legally binding by mid-2016, once at least 55 of them (representing at least 55% of global CO2 emissions) ratify it. Whether that truly happens remains to be seen. The second key difference is that, although legally binding, the text itself is phrased so as not to be binding in any real way, i.e. there are very few specific numbers or timelines, and instead a whole lot of feel-good promises and vague “commitments”.

Some gems (the full text can be found here):

… emphasizing that enhanced pre‐2020 ambition can lay a solid foundation for enhanced post‐2020 ambition …

… emphasizing the enduring benefits of ambitious and early action, including major reductions in the cost of future mitigation and adaptation efforts …

Essentially, the agreement says that it would be great if everyone were to cut their emissions and the sooner, the better. But then, considering what the individual parties have promised, the text recognizes that the problem has not actually been addressed:

Emphasizing with serious concern the urgent need to address the significant gap between the aggregate effect of Parties’ mitigation pledges <…> and aggregate emission pathways consistent with holding the increase in the global average temperature to well below 2 °C above preindustrial levels…

(emphasis mine)

Well, at least the concern is serious. Any proposed solutions to address this significant gap? Of course there are:

In order to achieve the long-term temperature goal set out in Article 2, Parties aim to reach global peaking of greenhouse gas emissions as soon as possible

(emphasis mine)

Well, that’s reassuring.

The unfortunate reality of climate negotiations is that, had the agreement been any more specific and ambitious, it might not have passed. And if it had been phrased in a more truly legally binding fashion, the Republican-controlled U.S. Congress would have had to vote on the ratification, which could only go one way. The negotiators had learnt their lessons in Kyoto and Copenhagen, and this appears to truly be the best that could have been achieved.

Back to reality

Therefore, whether one considers the Paris Agreement a success or a failure depends on whether one is an optimist or a pessimist (realist?), as well as on the reference frame. Compared to Copenhagen, Paris indeed was a success. It seems that a fundamental shift in most politicians’ minds has occurred, and whether climate change is real is no longer up for discussion, at least outside of Republican-led Congress hearings in the U.S. It is highly likely that the recent heat records, multi-year droughts, and unpredictable extreme climate events around the globe since the Copenhagen meeting in 2009 have played a role in convincing the general public, and therefore the politicians who depend on the public’s votes.

While the agreement to try to hold warming below 2° C is certainly better than no agreement, question no. 1 now becomes whether countries will be able to stick to their current pledges of emissions cuts (which, again, would still result in more than 3° C of warming). The even bigger question no. 2 is whether they will truly “ratchet” the cuts over time so that emissions peak and start decreasing before 2050 at the latest. That essentially relies on the effectiveness of the proposed “name and shame” system, where countries report their progress for everyone to see. Peer pressure can be powerful when applied in real time, but what happens when a large number of countries show up in 2023 (that’s when the first assessment will take place) without having made any progress? If they outnumber the ones that stuck to their goals, the “naming and shaming” simply won’t work, as everyone will be able to point a finger at someone else.

Science that was sidelined

One might wonder what scientific arguments were used in coming up with the 1.5 and 2° C warming goals. The answer, unfortunately, is none. The “best case” (read: unlikely) scenario in the latest IPCC report represents 1.8° C of warming by 2100, whereas the “worst case” scenario goes up to 6.4° C within uncertainty. The Paris Agreement actually includes a request for the IPCC to consider a 1.5° C scenario in the next assessment. It will be interesting to see if the IPCC decides to essentially waste time and computing resources modeling it, considering how unrealistic it is.

Let’s consider the slightly more realistic, but still highly improbable, scenario where humanity manages to keep warming below 2° C (essentially the IPCC scenario B1, or RCP4.5). Here is how the IPCC describes how it would have to happen:

… a future world of very rapid economic growth, global population that peaks in mid-century and declines thereafter, and the rapid introduction of new and more efficient technologies. Major underlying themes are convergence among regions, capacity building and increased cultural and social interactions, with a substantial reduction in regional differences in per capita income.

… with rapid change in economic structures toward a service and information economy, with reductions in material intensity and the introduction of clean and resource-efficient technologies. The emphasis is on global solutions to economic, social and environmental sustainability, including improved equity, but without additional climate initiatives.

That sounds quite positive, perhaps even a bit utopian. But mind you, this would still result in 1.1 – 2.9° C of warming. What would that mean in terms of effects on the climate system and society? Below I am adopting the IPCC’s language, where very likely means > 90% and likely means > 66% probability. If you’re into gambling or betting, those are pretty amazing odds.

  • Sea level rise of 0.2-0.4 m, enough to inundate a number of low-lying islands and coastal areas, especially when coupled with likely more frequent and intense tropical cyclones (think Katrina in New Orleans becoming a regular event);
  • Further ocean warming and acidification with well-documented adverse effects on corals and other marine life. Coupled with overfishing, this could result in the collapse of various fisheries and ocean ecosystems;
  • Very likely more frequent heat waves and floods;
  • A shift in precipitation patterns, making high latitudes very likely wetter and lower latitudes, where many regions are already drought-stricken, likely even drier;
  • An additional 0.5° C of warming in the 22nd century;
  • Possibly, a complete melting of the Greenland Ice Sheet and a sea level rise of 7 meters over the next few centuries.

This is what the diplomats and media are calling the “world’s greatest diplomatic success”, “a historic turning point”, “a monumental success for the planet”, “the agreement that changes everything”, and so on.

What is to be done

Despite the optimistic goals set out in Paris and a lot of self-congratulation after governments agreed to essentially “do their best” for now and “try to try a little harder” in the future, the reality is that fossil fuels are still being extracted, and for the past few years at a rate significantly higher than demand. As long as oil remains so much cheaper than any other form of energy, it will continue to be extracted and will eventually end up as CO2 in the atmosphere. With the deluge of Saudi Arabian oil, and soon Iranian oil, on the market, it is anybody’s guess when (if ever) the oil price will start to rise again. And if it does, the fracking and deep offshore drilling industries will be ready to jump back into the game. Overall, there are roughly three times more fossil fuels still in the ground than would be needed to cause 2° C of warming by 2100. We cannot pretend to be addressing climate change, in Paris or elsewhere, while at the same time pumping and digging all that carbon up.
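As a rough, back-of-the-envelope illustration of that “three times more” ratio, here is a minimal calculation using round numbers that are my own assumptions (published estimates of both the remaining carbon budget and the carbon in proven reserves vary considerably):

```python
# Back-of-the-envelope check of the "three times more fossil fuels than
# the 2 degree budget" claim. Both numbers are assumed, rounded values
# used only for illustration; published estimates vary widely.

reserves_gtco2 = 2900.0  # approx. CO2 embodied in proven fossil fuel reserves (GtCO2)
budget_gtco2 = 1000.0    # approx. remaining budget for a likely chance of < 2 C (GtCO2)

ratio = reserves_gtco2 / budget_gtco2
print(f"Proven reserves exceed the 2 C budget by a factor of ~{ratio:.1f}")
# -> Proven reserves exceed the 2 C budget by a factor of ~2.9
```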

James Hansen, (ex-)NASA’s most prominent climate scientist, is not buying into the Paris Agreement and suggests that the only actual way to cut emissions is to establish a carbon fee. However, it is hard to imagine a carbon fee being implemented at a rate and on a scale necessary to keep warming below 2° C and to avoid the resulting ecological catastrophe. Increasingly, humanity will have to look to potential geoengineering (or climate engineering) solutions, that is, global-scale projects that aim either to reduce the solar radiation that our planet receives, or to actively capture and remove CO2 from the atmosphere. Interestingly, the latter possibility was hinted at in the Paris Agreement itself. The potential advantages and dangers of climate engineering require a separate discussion. However, as climate change starts adversely affecting millions and then perhaps billions in the decades to come, climate engineering might become a moral imperative. It might become the only way for future generations to save themselves from the astonishingly huge mess that our generation has left them to inherit.


Geotraces webinar series!

A quick post: if you want to hear great oceanographers explain their research and describe some exciting discoveries in ocean biogeochemistry, check out the COSEE Geotraces webinar series! Topics cover the cycling of metals and nutrients in the ocean, as well as the dynamics of oxygen minimum zones. It is extremely important for us to understand these processes well, since they will play a key role in regulating ocean life as climate change affects ocean chemistry and circulation. Ocean fertilization might also prove to be an important CO2 sequestration technique in the future – but once again, we really need to understand how and IF it works in the first place – something one of the webinars will address as well.

Enjoy!

Fukushima Mon Amour

March 11th, 2015 marked the 4th anniversary of the 9.0 magnitude Tōhoku earthquake and the subsequent tsunami that killed over 18 thousand people in Japan. However, most of the media attention in the days that followed shifted focus to the meltdown of the Fukushima Daiichi nuclear power plant, despite it having claimed zero human lives, in stark contrast to the tsunami. That is perhaps to be expected, and reflects our collective fear of invisible and difficult-to-measure threats. Or perhaps it’s simply a reflection of our sensationalist mass media and its short attention span. Even so, concerns over radiation release from Fukushima still float around in the mass media and the blogosphere to this day. Unfortunately, a lot of recent “coverage” comes with catchy fearmongering headlines, such as this one. So how much radiation was released, what are the consequences, and how worried should we be?

How much radiation was released during the Fukushima accident?

First, a quick recap. There are three major kinds of radioactive decay: alpha, beta, and gamma, the last being the most penetrating and the most dangerous. However, the level of exposure (i.e. how many radioactive particles are encountered), the length of time over which the exposure occurs, and whether the particles are inhaled or ingested all make a big difference. The most worrisome of all radionuclides in cases like Fukushima is Caesium-137 (Cs-137). It is a gamma emitter and, being chemically similar to potassium, dissolves in water and spreads easily through the environment.

It turns out that the Fukushima fallout amounts to anywhere between 14 and 90% of the radiation released during the Chernobyl meltdown (Buesseler 2014). However, in Fukushima’s case ~80% of the fallout ended up in the ocean (Morino et al., 2011), without (directly) affecting humans. Furthermore, this is 10-70 times less than the amount released during nuclear weapons testing in the 1960s and 70s.

It is also important to realize that the ocean has an inventory of naturally occurring radioactive nuclides that is many, many times bigger than what has been added by human nuclear escapades (Fig. 1). Therefore, the latter is only dangerous when it is released quickly and in concentrated amounts. However, the ocean is really good at mixing, diluting Caesium (a soluble element) to non-dangerous concentrations quickly as the water moves away from the source.


Figure 1. The oceanic inventory of different radionuclides. Anthropogenic Caesium-137 comprises only a small fraction of the ocean’s radioactivity budget, while naturally derived Potassium-40 and Uranium-238 dominate. 1 PBq = 10^15 Becquerels (one Bq = one radioactive decay event per second). Taken from Buesseler 2014.

As mentioned before, no people have died because of exposure to radiation in Fukushima. The World Health Organization predicts a slightly higher incidence of cancer in the most exposed part of the population, although the models used for such predictions are highly uncertain and tend to overestimate the actual effect. For others, even people from other parts of the Fukushima prefecture, no health effects are expected. For comparison, coal power plants kill 13 000 Americans each year – more than all nuclear power plant meltdown casualties in history combined.

With regard to the contaminated food supply – it appears that, as of 2014, only 0.6% of fish caught off Fukushima display Cs concentrations over the very strict limit of 100 Bq/kg. Nevertheless, the fisheries stay closed. It must be noted that some bottom-dwelling fish still display higher levels, most likely due to some accumulation of Cs in the sediments (Buesseler 2014). Therefore, rigorous monitoring of the local biota should be continued, especially since there appears to still be some leakage of contaminated water from the accident site.

What should we be (not) worried about?

First and foremost, we should not be worried about swimming in the Pacific. Yes, the concentration of Cs-137 in the Eastern Pacific is going to increase slightly over the next couple of years, but it will be barely detectable and poses no risk to health. To measure Cs-137 concentration, scientists don’t just go and point a Geiger counter at the ocean – the levels are much lower than what a Geiger counter can detect. Here’s a quick summary of the methods from Buesseler et al., 2012. First, 22 liters (6 gallons) of seawater are passed through a resin that extracts and pre-concentrates all the Caesium. This sample is then placed in an extremely sensitive gamma counter (essentially a huge box made of lead that blocks any natural background radiation from penetrating inside) and allowed to sit for hours and sometimes DAYS. It takes that long for an extremely sensitive instrument to detect enough radiation coming from several gallons of seawater to provide an accurate number. If you live on the West coast of the U.S. (or elsewhere around the Pacific) and are particularly worried about radioactive seawater, you can collect a sample and send it to Ken Buesseler’s lab at Woods Hole Oceanographic Institution to be analyzed for Cs-137 (as long as you also provide the funding needed for the analysis, since, as we know, the U.S. government does not prioritize science funding nowadays). The above project is also a perfect example of how citizens can contribute to science by both collecting samples and crowdfunding (over $100 000 raised!).
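To get a feel for why the counting takes so long, here is a rough, illustrative estimate. The seawater activity, detector efficiency, and target precision below are assumed round numbers of my own, not the actual figures from Buesseler et al., 2012:

```python
# Illustrative counting-time estimate for Cs-137 in a 22 L seawater sample.
# All inputs are assumed, rounded values for the sake of the example.

activity_bq_per_m3 = 2.0    # assumed Cs-137 activity in surface seawater (Bq per cubic meter)
sample_volume_m3 = 0.022    # the 22 liter sample described above
gamma_fraction = 0.85       # fraction of Cs-137 decays that yield the 662 keV gamma
detector_efficiency = 0.2   # assumed overall counting efficiency of the gamma counter

decays_per_s = activity_bq_per_m3 * sample_volume_m3                # ~0.044 decays/s
counts_per_s = decays_per_s * gamma_fraction * detector_efficiency  # ~0.0075 counts/s

target_counts = 1000        # roughly a 3% statistical uncertainty (1/sqrt(N))
hours_needed = target_counts / counts_per_s / 3600
print(f"~{counts_per_s:.4f} counts/s -> ~{hours_needed:.0f} hours for {target_counts} counts")
# -> ~0.0075 counts/s -> ~37 hours for 1000 counts
```

With numbers like these, a day or more of counting per sample is entirely plausible, even before accounting for background subtraction.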

Eating seafood is also not an issue, especially if you’re in Japan. Firstly, Caesium does not bioaccumulate to the same degree as, for example, mercury: Caesium is essentially a salt and therefore gets regularly replaced in any given organism (Doi et al., 2012; Tateda et al., 2013). Either way, Japan has reduced the legal Cs limit in fish to 100 Bq/kg from the previous 500 Bq/kg. For comparison, the U.S. limit is 1000 Bq/kg. Buesseler (2014) estimates that the radiation dose a typical U.S. consumer will receive from eating seafood is equivalent to 1/5th of a dental X-ray per year, i.e. absolutely nothing to worry about. Here is a relevant XKCD by Randall Munroe, comparing different radiation doses we receive normally to those from Fukushima.
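For a rough sense of where a number like “1/5th of a dental X-ray” comes from, here is a hypothetical version of that dose calculation. The fish activity and consumption rate are assumptions of mine, not Buesseler’s inputs; the dose coefficient and X-ray dose are standard ballpark figures:

```python
# Hypothetical annual dose from Fukushima-derived Cs-137 in seafood.
# The fish activity and consumption rate are assumed, illustrative values.

cs137_in_fish_bq_per_kg = 1.0   # assumed Cs-137 activity in the fish eaten (Bq/kg)
seafood_kg_per_year = 20.0      # assumed annual seafood consumption (kg)
dose_coeff_sv_per_bq = 1.3e-8   # ingestion dose coefficient for Cs-137, adults (Sv/Bq)
dental_xray_sv = 5e-6           # typical dose from one dental X-ray (Sv)

annual_dose_sv = cs137_in_fish_bq_per_kg * seafood_kg_per_year * dose_coeff_sv_per_bq
print(f"~{annual_dose_sv * 1e6:.2f} microsieverts per year,"
      f" i.e. ~{annual_dose_sv / dental_xray_sv:.2f} dental X-rays")
# -> ~0.26 microsieverts per year, i.e. ~0.05 dental X-rays
```

With these particular assumptions the dose comes out even lower than Buesseler’s 1/5th-of-an-X-ray figure; either way, it is a negligible addition to the natural background dose we all receive.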

Generally, it does not hurt to be cautious when dealing with such things as radioactive fallout, but in some cases over-cautiousness CAN cause harm. For example, obesity (and associated health issues) among Japanese children in Fukushima and nearby prefectures has been increasing because they were (and possibly still are) needlessly prohibited from playing outside.

It is estimated that the evacuation and relocation of the population during the nuclear incident indirectly caused the deaths of around 1000 mostly elderly people, far outweighing the current (zero) and future (possibly some tens to hundreds due to increased cancer incidence) radiation exposure casualties. And although radiation levels have since returned close to natural background values, the majority of the 80 000 or so evacuees are still not allowed by the Japanese government to return home.

Debris from Japan washed up on Canadian shores (The Canadian Press/Jonathan Hayward).

Some silver linings?

Overall, of the triple earthquake-tsunami-nuclear tragedy that struck Japan in 2011, the nuclear plant meltdown caused the least damage, despite attracting the majority of the media attention. One can only hope that it has served as a good cautionary tale for future nuclear programs and will perhaps result in a push towards renewable sources. Unfortunately, it appears that the nuclear power plant closures spurred by the Fukushima catastrophe will mostly result in a resurgence of coal and other fossil fuel burning energy. However, China knows the costs of coal-burning pollution very well (670 000 deaths in 2012 alone), and it is of note that they decided to carry on with their expansive nuclear energy program, building some 25+ power plants in the coming decades. In this case, Fukushima has at least prompted a review of safety guidelines and plant siting.

Finally, some of the radionuclides, although diluted to harmless concentrations, will prove to be useful tools for physical oceanographers studying ocean currents and mixing processes, as little consolation to the Japanese as that may be.

UPDATE: I have just discovered a couple of papers published last week on radionuclide levels in seawater, fish, and rice from Fukushima. They’re open access, so go ahead and read through! In case you’re too lazy/busy, here’s a brief summary:

1) Cs-137 in seawater off Fukushima remains elevated due to continuing discharge of contaminated water (Povinec & Hirose, 2015) but the levels are not dangerous for marine life or humans.

2) Consumption of fish, shellfish, and seaweed from coastal Japan or open Pacific ocean waters provides a dose of Fukushima radionuclides (Cs-137, Sr-90) that is LOWER than the natural background level (mostly in the form of Po-210) (Povinec & Hirose, 2015).

3) Since 2012, only a very tiny fraction of rice grown in Fukushima exceeds the strict government standard for radioactivity level. Consumption of this rice, like the fish, provides a negligible dose to the consumers (Nihei et al., 2015).

Ice Ice Baby

Sorry for the recent drought of updates! I was a bit buried in work the past few weeks. You know, grad school.

In any case, I really recommend you read this. It’s a response to a recent paper published in Nature Geoscience reporting that the Antarctic Peninsula ice sheet is rapidly melting. It brilliantly captures the rhetoric and style of argument of some of our very confused peers. I won’t be surprised if a number of them fail to recognize the text for what it is and end up embracing this great new “evidence” supporting their noble cause, as detecting sarcasm does require some intelligence.

I also found a great example of how another breed of denialists operates. These ones are slightly more sly. Instead of spewing outright bullshit, they cherry-pick like real champs to craft their more refined “reports”.

Just to set the stage: another paper was published a couple of days ago in the same issue of Nature Geoscience. It essentially concludes that there is too much uncertainty in the ice oxygen isotope data to firmly determine whether the current warming over West Antarctica is caused by anthropogenic greenhouse gas emissions or by natural climate variability. Fair enough. (Oh, and so how about that whole mass conspiracy where only “alarmist” papers get published?)

Let’s take a look at what this guy does. He reports on the second paper (which uses d18O of precipitation – NOT a measure of any kind of melting in itself and, as the authors mention, affected by climate as far away as the tropics) and conveniently forgets to mention the first paper, which looks directly at how much ice is melting. It is important to note that these two papers investigate different regions of Antarctica, and it has been shown that some regions are more sensitive to warming than others. Next, he uses this source to quote the lead author’s comments that a similar warming has previously happened naturally over the West Antarctic ice sheet, but then again conveniently skips over the sentence that says “The same is not true for the Antarctic Peninsula <..> where rapid ice loss has been even more dramatic and where the changes are almost certainly a result of human-caused warming.”

Good job, buddy, good job.

But you know what’s funny? Scientists have known for a while that the Antarctic ice sheets are doing relatively OK. It’s the Arctic that is currently taking the biggest hit:

Source: National Snow and Ice Data Center

 

Wait, forget that. It’s probably just the polar bears trying to cash in on taxpayers’ money.

Cool science on mosh pits!

Tumblr: triumphawaits

Being a metal fan and having participated in quite a few mosh pits myself, I found this New Scientist article really interesting and entertaining. Jesse Silverberg from Cornell analyzed the behavior of moshers and showed that they move in the same random way as gas particles. Then his team did some more modeling and even managed to recreate a circle mosh (see the gif above)!
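For the curious, here is a minimal sketch of the kind of self-propelled-particle simulation the article describes. This is not Silverberg’s actual model or code; the force terms and parameters are my own assumptions, chosen just to illustrate the “moshers as gas particles” idea:

```python
# Minimal, illustrative "mosh pit" particle simulation. The forces and
# parameters below are assumptions for demonstration, not the published model.
import numpy as np

rng = np.random.default_rng(0)
n, active_frac = 200, 0.3           # crowd size and fraction of active "moshers"
box, dt, steps = 20.0, 0.05, 2000   # arena size, time step, number of steps

active = rng.random(n) < active_frac
pos = rng.random((n, 2)) * box
vel = np.zeros((n, 2))

def forces(pos, vel):
    # Soft pairwise repulsion: people push apart when they bump into each other.
    diff = pos[:, None, :] - pos[None, :, :]
    diff -= box * np.round(diff / box)                 # periodic (wrap-around) arena
    dist = np.linalg.norm(diff, axis=-1) + np.eye(n)   # avoid divide-by-zero on self
    overlap = np.clip(1.0 - dist, 0.0, None)           # contact radius of ~1
    f_rep = 25.0 * (overlap[..., None] * diff / dist[..., None]).sum(axis=1)
    # Moshers get random kicks; everyone is damped so speeds stay finite.
    f_kick = np.where(active[:, None], 5.0 * rng.normal(0.0, 1.0, (n, 2)), 0.0)
    return f_rep + f_kick - 1.0 * vel

for _ in range(steps):
    vel += forces(pos, vel) * dt
    pos = (pos + vel * dt) % box

# With just random kicks + collisions, the moshers' speed distribution ends up
# looking roughly like that of molecules in a 2D gas (Maxwell-Boltzmann-like).
speeds = np.linalg.norm(vel[active], axis=1)
print(f"mean mosher speed: {speeds.mean():.2f}, max: {speeds.max():.2f}")
```

In the published model, adding a flocking (velocity-alignment) term to the active agents is what turns this random jostling into the ordered, rotating circle pit shown in the gif above.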

This is a great example of how scientific principles can be applied to investigate essentially anything we encounter. And a cool reminder that no matter how highly we humans might think of ourselves, essentially we are just clumps of particles, subject to the same laws of nature as an atom of gas (granted, mosh pits are probably not the best expression of our intelligence, but still).

Also, if you’ve never tried moshing, I highly recommend it (except for the wall of death, gas particles would never be that stupid). :)

Some brilliant advice on science reporting for scientists. Again, the main leitmotif is “get rid of jargon” (check out the table of “forbidden” words!), with compelling storytelling a close second. I think it’s time for me to pick up some of those books Kristen mentioned.

Sciopic

It’s a daunting time to be a professional science writer. Science, it seems, is working too well. As Carl Zimmer told an auditorium full of science graduate students, “It’s hopeless to cover it all, and it’s only getting worse.”

Or as an exasperated Ed Yong put it:

[Embedded tweet from Ed Yong]

This point in the history of science represents an embarrassment of riches – exciting discoveries are made every day, but there aren’t enough people to take scientific reports and craft them into stories that non-scientists understand and actually want to read. As Zimmer told those of us gathered in the auditorium of Yale’s Peabody Natural History Museum, “Let’s not restrict the wealth to this room.”

There is, in fact, a world outside of our departments, and there’s a lot at stake out there.

Nearly half of Americans believe that humans were created in their present form by God within the past 10,000 years…


Paloozin’

Okay, so this is not the “next post” I was planning on, but I have to do this while it’s still fresh in my memory. A couple of days ago I attended an event titled “Climate Palooza 2013”, held at the Annenberg School of Communication and Journalism here at the University of Southern California (USC).


The event was organized by ESCInitiative (which is itself a very welcome product of a collaboration between USC and NASA’s Jet Propulsion Laboratory (JPL)) and included a number of booths by various environmental and research organizations, in addition to panel discussions featuring researchers from JPL and USC along with writers, activists, and policy people.

The event was aimed at the general public, with the panelists explaining their research very accessibly and sharing ideas on how to communicate science effectively. For example, Dr. Chip Miller (who is in charge of CARVE, a project investigating what happens to the carbon stored in permafrost as the planet warms) used three volunteers to represent the atoms of a CO2 molecule and show how it is able to absorb infrared radiation, producing the greenhouse effect. Once again, I saw how powerful humor is in keeping the audience engaged – that’s something to keep in mind.

Also, the panelists unanimously agreed that it is extremely important to avoid jargon while explaining one’s research to a broad audience. That seems like a given, but it does require active effort and is sometimes not that easy. Although I have had to explain my research to people many times by now, I still find that my “story” usually ends up being somewhat long-winded and contorted. As a scientist, I have the impulse to be as accurate as I can with my words – and that is important in an academic setting – but for a lay audience a good amount of generalization and simplification is a must. Not to mention the benefit of certain metaphors, analogies, and punchlines. In the end, what every researcher has to do is sit down, write up an elevator-pitch-type spiel, and memorize it. And then, after a while, sit down again and polish it. And so on. Again, such things are not usually taught in graduate school, which is a shame (it could be that the humanities and social sciences emphasize this more, I am not sure).

In general, the event was educational for both scientists and non-scientists, and I think it’s a great example of the kind of event that should take place more often on university campuses and elsewhere.

P.S. I am not sure if they were filming the panel discussions but I will post an update if they become available.