
Tuesday, 5 January 2016

Thoughts after COP21 and the role of risk assessment and insurance

I recently published a post on my employer's blog summarising some of the outcomes of the COP21 meeting in Paris. I focused on some of the developments in the financial world that may help with adapting to the impacts of global warming in the coming decades. You can find the post here if you're interested.

The various schemes and forums that have been set up will no doubt broaden the reach of the insurance industry, whether through government risk pools or through microinsurance, and many of them are quite new and innovative for the industry. Some of the most successful are built as bespoke parametric insurance products that can pay out very quickly because the payout is based solely on a defined parameter. The main thing such a product requires is a reliable dataset upon which to build a relationship to losses or costs.

Some financial support for African farmers

A very relevant example, given this year's strong El Niño (see another of my company blogs here) and droughts in some parts of Africa (Figure 1), is the African Risk Capacity (ARC).

Figure 1: Pumping well-water from a borehole in the village of Bilinyang, near Juba, South Sudan. Source: World Bank/Arne Hoel

ARC's approach uses a drought index and an agreed precipitation threshold which triggers payouts to help small farmers via their governments. The number of African countries signed up is growing year on year, and the scheme is a great example of the insurance sector providing financial stability, in an efficient manner, for farmers who would otherwise be at risk of losing their livelihoods in extreme droughts. According to the ARC website, an analysis by Boston Consulting Group for ARC showed that the potential benefit of running the scheme is 4.4 times the cost of emergency response in times of drought.

Basically, every dollar spent through ARC saves four dollars and forty cents in emergency response costs, and the money will be where it needs to be in a matter of days rather than the weeks or months it can take for governments to reassign funds or wait for international aid. This is a great example of the financial sector providing a cushion against climatic impacts which may well get worse in the future.
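To make the parametric idea a little more concrete, here is a minimal sketch of how a rainfall-triggered payout might work. The trigger, exhaustion point and payout values below are invented purely for illustration; ARC's actual drought model is far more sophisticated than this.

```python
# Minimal sketch of a parametric drought payout. The trigger, exhaustion point
# and maximum payout are hypothetical illustration values, not ARC's.

def parametric_payout(seasonal_rainfall_mm: float,
                      trigger_mm: float = 300.0,      # payout starts below this
                      exhaustion_mm: float = 150.0,   # full payout at or below this
                      max_payout_usd: float = 30_000_000.0) -> float:
    """Payout scales linearly once seasonal rainfall falls below the trigger."""
    if seasonal_rainfall_mm >= trigger_mm:
        return 0.0
    if seasonal_rainfall_mm <= exhaustion_mm:
        return max_payout_usd
    shortfall = (trigger_mm - seasonal_rainfall_mm) / (trigger_mm - exhaustion_mm)
    return shortfall * max_payout_usd

# Example: a season with 220 mm of rain against a 300 mm trigger
print(f"Payout: ${parametric_payout(220.0):,.0f}")
```

The appeal of a scheme like this is exactly the point made above: no loss adjusters are needed, so once the rainfall data are in, the payout can be calculated and released within days.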

Global initiatives from COP21

As I explain in my company blog, the UN Secretary-General Ban Ki-moon announced his climate resilience initiative named A2R, which stands for Anticipate, Absorb, Reshape. Much of the scientific endeavour of projecting climate change, and of understanding and providing early warnings for current climate extremes, broadly fits within the "Anticipate" strand of the initiative. "Absorb" seems to fit naturally with financial mechanisms, as well as with building resilient infrastructure (see here for a link to a fellow MSc blogger) and mitigating actions to reduce CO2. And "Reshape" is again about resilience but with more of a focus on the future: building partnerships between the public and private sectors to foster sustainable growth and better decision-making for future infrastructure.

There are many complementary initiatives getting started in this push for resilience. One is the Task Force on Climate-Related Financial Disclosures, chaired by Michael Bloomberg, who has been an ardent supporter of building climate resilience, galvanised no doubt by seeing first-hand the impacts of severe weather on New York while presiding as mayor during Hurricane Sandy. This aligns well with a UN-endorsed initiative called the '1-in-100 initiative', which aims to encourage companies to better assess and disclose their 'tail' risk (the risk of a loss with a 1% annual probability), giving them a financial incentive to become more resilient in order to attract investment.

Public/private sector partnerships

There seems to be a groundswell of activity in the private sector. While COP21 was underway I was invited, through my employer, to attend a meeting in Paris on climate resilience hosted by one of my company's clients. The hosts are a large international management, engineering and development consultancy, and are therefore interested in finding out how different industries and sectors plan to approach the challenges that global warming will bring. It was a Chatham House Rule session so I won't go into any details, but the meeting involved delegates from the World Bank, the European Investment Bank, the Rockefeller Foundation, the Global Sustainability Institute at Anglia Ruskin University, and a senior professor from our very own UCL Geography Department, to name but a few - an interesting line-up indeed.

Discussions covered a number of topics, from city resilience to financial stability with respect to climate change, but my general feeling from the event was that there is a tangible motivation to deal with the future impacts of global warming sooner rather than later. There was recognition of a good business opportunity in building sustainable cities and in offering risk assessment products and services in areas that will see increasing climate risk.

I feel the key to building climate resilience is to engage all parties, ideally within mutually beneficial partnerships. Initiatives such as the UN's A2R or the Insurance Development Forum (IDF, also announced at COP21) can help. My own job is part of this too: I may have mentioned it before, but my MSc studies are part time, alongside my day job in the risk and (re)insurance sector. I work with business users and academics to try to match up their respective needs and capabilities, and to work towards tangible outputs through research and internal client-related projects - an applied science coordinator of sorts, coordinating a network of academic institutions working with my company on a wide range of risk-related subjects, including climate extremes.

There are also academic-led partnerships such as the Engineering for Climate Extremes Partnership (ECEP), hosted by the National Center for Atmospheric Research (NCAR), which aims to "strengthen society's resilience to weather and climate extremes" (ECEP website: About). I also have a separate blog on this on my company website here. This vision can only truly be achieved through partnerships between the public and private sectors.

The power of partnerships is examined in the Stern Review, which also highlights the potential economic downsides of not adapting to climate change. Furthermore, a recent paper by Estrada et al. (2015) estimates the economic costs of climate change in terms of hurricane damage, attributing 2 to 12% of the normalized losses from the busy 2005 U.S. hurricane season to climate change. It is an interesting finding, but since they also found an increase in both the frequency and intensity of storms in the geophysical data, where other papers have only found an increase in intensity, it seems worth exploring further in a future blog.

In summary, it seems clear that the financial world certainly has a key part to play, and when fully committed to investing in new technology and research, it can act as a powerful driver for change in terms of building resilience and financial stability in the face of changing climate extremes.

Afterthought

I know this blog is supposed to be about storms, but I'm starting to realise just how much climate change is a multidisciplinary challenge, and that focusing on one subject, one problem or one solution can reduce our ability to bring together different expertise and opportunities.

I think it’s healthy to take a step back and look at the wider interaction of various adaptation and mitigation initiatives, and then perhaps work out how they can fit into your own area of expertise and capability to do something useful for society.


Sunday, 3 January 2016

Disastrous Return Periods

When talking about return periods it's easy to assume that a 1 in 100 year event will occur only once in 100 years. This may lead to misconceptions in the understanding of risk and, consequently, to poor decision-making. The stakes can be fairly high: misconceptions may affect whether or not you invest in flood protection for your home, or may influence engineering and building code regulations when communicating with decision-makers. The reality is that a 1 in 100 year storm can happen once in 100 years, twice in the same 100 years, three times, or even not at all! At the risk of ranting, it strikes me as a hugely misleading communication tool which continues to pervade the risk management world and communications in the general media today.

I am not alone in this. Francesco Serinaldi, an applied statistician at Newcastle University, wrote a paper in 2014 called Dismissing Return Periods! Using an exclamation mark in the title of an academic paper gets a thumbs up from me! He goes into much more detail than I could on this subject, describing how univariate frequency analysis can be prone to misconceptions when return period terminology is used.

Serinaldi also suggests better alternatives for engineering and risk modelling applications: the probability of exceedance, and the risk of failure over the lifetime of a project (or perhaps the average life expectancy of a person). These are more objective and robust quantifications of the frequency of specific events, or categories of events, defined by a parameter or index.

Perception of safety

Another example is described by Greg Holland in his blog on the Engineering for Climate Extremes Partnership (ECEP) website. Discussing Hurricane Katrina, he suggests how misleading it was to describe the levee protection as able to withstand a 1 in 100 year storm, evoking a 'sense of safety'. He elaborates (as I mentioned above) that a 1 in 100 year storm simply has a 1% chance of happening in any given year (irrespective of climate variability). This means that there is roughly a 65% chance that such a storm will occur in the next 100 years, and, changing the time period, a 25-30% chance that it will occur within the next 30 years. This starts to concern a much wider group of stakeholders, including small businesses and home owners.
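To see where figures like these come from, here is a minimal sketch of the underlying arithmetic, assuming each year is independent and the climate is stationary (both simplifications):

```python
# For a "1 in T year" event, the chance of at least one occurrence in n years
# is 1 - (1 - 1/T)**n, assuming independent years and a stationary climate.

def prob_at_least_one(return_period_years: float, horizon_years: int) -> float:
    annual_prob = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_prob) ** horizon_years

for horizon in (30, 100):
    print(f"1-in-100-year event, {horizon}-year horizon: "
          f"{prob_at_least_one(100, horizon):.0%}")
# -> roughly 26% over 30 years and 63% over 100 years, in line with the
#    ~25-30% and ~65% figures quoted above.
```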

Return periods can also vary widely depending on the spatial scale of an event. The Great Storm of 1987 in the UK was reported as a 1 in 200 year event for many southern counties, whereas for parts of the south coast it has been assessed as more like 1 in 10 years! The storm was large enough to cover a broad swath of land with severe impacts, yet within that single storm the return period estimates vary, depending on how the calculations are conducted, which data are used, and the thresholds assigned to define an event.

One generalised return period statistic is not adequate to clearly describe the risks associated with a storm. The more objective methods suggested by Serinaldi are an alternative for engineering applications.

Public perception tangent…

As a bit of a tangent, this has made me think about public communications too. I think that the use of analogues or a narrative, to recreate conditions in a viewer's mind using past experiences, is very powerful in changing perceptions and behaviour, much more so than a misleading return period estimate. I find it fascinating how perception can change based on storytelling: one thing at which humans have always excelled.

An interesting paper by Lowe et al. (2006) studies the effects of blockbuster movies such as "The Day After Tomorrow", which can skew the perception of risk but also sensitise viewers and increase their motivation to act on climate change. The paper also notes a lack of knowledge on how to use this new-found, Hollywood-induced motivation. This is an interesting area of research in its own right.

Too many blog subjects, not enough time!

Tuesday, 29 December 2015

Will GCMs really tell us everything we need to know about climate change?

In a previous blog, I discussed General Circulation Models (GCMs) at varying resolutions.

Here, I’ll highlight a few limitations, especially when looking at tropical cyclones.

Even though GCMs are able to capture tropical cyclone tracks and storm formation, providing hugely valuable forecasts for public safety, we should be aware of their limitations when looking at climate-scale variability and change. For example, looking seasons or years ahead into a climate projection, GCMs have less ability to say how many storms there might be and how intense they might be. Hurricane season forecasts are put together using a variety of statistical and GCM-based techniques, and we can get a lot of value from both approaches, but there is only so much that we can say.

However, papers by Deser et al. (2012) and Done et al. (2014) are useful in determining what can be explained on seasonal or decadal time-scales. James Done found that, based on one season, his regional climate model experiments show that around 40% of the variability in tropical cyclone frequency in the North Atlantic is simply natural variability, not associated with forcing from greenhouse gases, volcanoes, aerosols or solar variability (external forcing). He notes from Deser et al. (2012) that at regional scales internal variability can become greater than externally forced variability. This also highlights the difficulty of attributing a single regional event to changes in climate on a global scale.

To sum up, GCMs
  • as numerical weather prediction models, offer great ability to provide operational forecasts and warnings on a day-to-day basis, 
  • as global/regional climate models, allow us to experiment with the atmosphere and explore sensitivities in the processes that bring about extremes of climate, global climate variability or climate change. 


When looking at seasonal or longer timescales, GCMs run at lower resolution and so lose the ability to capture the small-scale features that drive tropical cyclones; instead we have to model the large-scale influences to look at more general shifts in the probabilities of single or seasonal phenomena (e.g. hurricanes or droughts).

Deser et al. (2012) also call for greater dialogue between science and policy/decision-makers to improve communication and avoid raising expectations of regional climate predictions. I totally agree. Better communication between scientists and stakeholders is important because talking about storms and climate change is highly political. Poor communication can lead to gross misrepresentations both by those aiming to mitigate and adapt to climate change, and by those who do not accept that climate change is a concern.

Future for GCMs?

I can see how GCMs have great ability in helping us understand the sensitivities of the climate system, and as they improve and as computing power increases (along with big data solutions), so too should our understanding of various climate processes. In fact, growth in GCM capabilities may well increase the level of uncertainty as we start to model more and more complexity. I do wonder where the next big step will be though. Between CMIP3 and CMIP5 (two rounds of climate model inter-comparison projects - see previous blog), Bellenger et al. (2015) showed some progress, but also commented that overall there were limited improvements in how ENSO (a dominant mode of climate variability) is characterised.

An interesting article here by Shackley et al., back in 1998, called "Uncertainty, Complexity and Concepts of Good Science in Climate Change Modeling: Are GCMs the Best Tools?", raises a range of interesting discussion points asking whether GCM-based climate science is actually the best approach from a number of perspectives. Are there alternative types of models that could allow us to better engage with the public, with policy makers or with the private sector? There are certainly alternatives that show promise, as discussed on the blog of Judith Curry, who is of the opinion that climate modelling is in a "big expensive rut." I hope I can find time to expand on this interesting topic in my blog here.


Personally, I am a big fan of GCMs. It's amazing that they can represent the atmosphere with such high fidelity, but it's good to ask these questions and not to forget alternative approaches which may be much more practical and 'fit-for-purpose' in particular situations. 

In a future blog, I’ll discuss a little about how we talk about probability of future events, and then follow on with a blog on how we currently stand on tropical cyclones and climate change. 

Saturday, 26 December 2015

A Model Family

Many of my recent blogs have been quite focussed on the past. It seems clear that we have a few useful methods that can help us understand storm frequency, with less certainty on how severe they have been. As powerful as palaeotempestology might be, it is sadly unlikely to be able to provide enough data for us to truly compare the climate proxy outputs at the fidelity with which we have been observing storms in the last 100 or so years, especially since we began to use satellites to observe the weather.

However, as an ex-professional in the world of weather forecasting, I often get asked about the chances of a certain intensity of storm occurring: could we see another Hurricane Katrina, will the Philippines see another Typhoon Haiyan, or, closer to home (UK), when will we see another Great Storm of 1987 (aka 87J)? Of course, these questions are difficult to answer unless a storm of similar characteristics is starting to form and is picked up in numerical weather prediction models such as the UK Met Office's Unified Model (UM) or the U.S. NOAA's Global Forecast System (GFS) (there are many more).

This blog will talk a little about what I know of the types of models that are based on the physical laws at work in the atmosphere and oceans, and that take supercomputers bigger than my flat (not saying much) to run.

General Circulation Modelling – the granddaddy of physical modelling

General Circulation Models (GCMs) focus on the actual physical dynamics of the atmosphere, modelling them as a system of grid cells (Lego-like blocks) which talk to each other through momentum and heat exchanges. The size of these grid cells defines the scale of the weather phenomena that can be modelled.

However, there is a trade-off between three facets of a GCM configuration. With limited computing resources, a balance must be struck between complexity (the physics included in the actual lines of code), resolution (the size of the grid cells) and run length (how much time the model represents, whether into the future or over a period in the past). Basically, climate models use Duplo bricks and high-resolution models use normal Lego bricks. The analogy also works because the two can fit together nicely (Figure 1).
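As a rough back-of-the-envelope illustration of why that trade-off bites (this is only a scaling argument, not how any real GCM is costed), halving the horizontal grid spacing roughly quadruples the number of grid columns, and the smaller stable time step adds another factor of about two:

```python
# Rough cost scaling for refining the horizontal grid of an atmospheric model.
# Ignores vertical resolution and any added physical complexity.

def relative_cost(refinement: float) -> float:
    horizontal_cells = refinement ** 2  # more columns in both x and y
    time_steps = refinement             # CFL-type limit: finer grid, shorter step
    return horizontal_cells * time_steps

for factor in (1, 2, 4, 8):
    print(f"{factor}x finer grid -> roughly {relative_cost(factor):.0f}x the compute")
```

This is part of why a configuration that is affordable for a ten-day forecast quickly becomes unaffordable for a century-long climate run.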

Figure 1: Larger Duplo (climate models) bricks and smaller Lego (weather forecasting models) bricks working together. Source: Wiki Commons Contributor: Kalsbricks

I wonder what type of modelling is analogous to Meccano? Thoughts on a postcard, please, or in the comments section below.

In case you were wondering, the Lego analogy came about since that's what I bought my three year old nephew, Harry, for Christmas. The present that keeps on giving! Merry Christmas by the way!

Lego Bricks

High-resolution configurations of some of the big GCMs have been built that can, for example, capture the small-scale eddies around the headlands of the Isle of Wight in the UK (run by the Met Office for the London 2012 Olympics). Models with grid spacings of the order of a few hundred metres are used for this detailed work and are run over a very small region.

Another example of high-resolution modelling: a regional model was used to reanalyse Cyclone Megi of 2010, which had one of the lowest central pressures ever recorded. The comparison shows satellite imagery alongside a model run (by Stuart Webster at the Met Office) with amazing detail in the eye structure and outer bands of convection. Presented side by side, the two are difficult for the untrained eye to distinguish (Figure 2).


Figure 2: Cyclone Megi simulation (top) showing eye- wall and convective bands, compared to similar locations and overall size of the real storm in a satellite image from MT-SAT 2. Source: Met Office.

Duplo bricks

GCMs traditionally struggle to match the intensity of storms in climate model configurations, as described in the IPCC AR5 chapter on the evaluation of climate models (IPCC WG1 AR5: 9.5.4.3), but examples such as the Met Office's Cyclone Megi run, and other models with resolutions of 100 km or so, show that the science is able to model many features of tropical cyclone evolution.

GCMs are also used to model the large-scale planetary interactions that govern phenomena such as ENSO, which are captured well according to the selection of models used in the Coupled Model Inter-comparison Project (CMIP). CMIP is currently on its fifth incarnation, CMIP5, which is used by the IPCC to understand future climate change. This paper by Bellenger et al. (2015) shows some of the progress made between CMIP versions in recent years; however, because of their similar ability to represent large-scale features when examining ENSO, CMIP3 and CMIP5 models can be used in conjunction for a broader comparison.

Assembling the ensemble

The “ensemble” is a technique in which a model is run multiple times with slightly different starting conditions to capture a range of uncertainty in the outputs. No model is perfect, so its products shouldn't be taken at face value, but ensembles can help by showing the range of possibilities as we try to represent what we don't know in the input data.

This addresses some of the observational uncertainty. A GCM's starting point is based on the network of observations connected up throughout the world and standardised by the World Meteorological Organisation (WMO) for weather forecasting. These include ground-based observations (manual and automatic), radar imagery of precipitation, satellite images, aircraft reconnaissance (for tropical cyclones), sea surface readings, weather balloon ascents and more, which are all assimilated into an initial condition that is then gradually stepped forward in time by the gridded global model. In a forecasting model the starting point is also called 'the initialisation'. For climate models the starting point can be the current climate, or whatever version of the climate is relevant to the experimental design.

Regardless of how a model is started on its time-stepping through a defined period, ensembles provide an idea of the range of possible outcomes through minor perturbations in the observed conditions, or even in how certain physical processes are handled (i.e. through different parameterisation schemes for features too small to be represented at a given resolution). In my forecasting days at the Met Office, looking at the solutions from a variety of the world's big weather modelling organisations (NOAA, Met Office, ECMWF, JMA) was colloquially termed 'a poor man's ensemble', as normally an ensemble will consist of many tens of solutions. A similar concept, although not using GCMs, is found in risk modelling applications such as catastrophe loss modelling, where many tens of thousands of simulations are performed to try to represent extreme events statistically, using extreme value theory and statistical fits to the rare events on a probability distribution. A useful review of loss modelling methods for hurricanes is given by Watson et al. (2004).
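As a toy illustration of the ensemble idea (nothing like a real NWP ensemble, which perturbs a full atmospheric state and uses far better numerics), here is a sketch that integrates the chaotic Lorenz-63 equations from slightly perturbed starting points and reports the spread of outcomes:

```python
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Time derivatives of the Lorenz-63 system."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def integrate(state, n_steps=2000, dt=0.01):
    """Crude forward-Euler integration, good enough for a demo."""
    for _ in range(n_steps):
        state = state + dt * lorenz63(state)
    return state

rng = np.random.default_rng(42)
base = np.array([1.0, 1.0, 1.0])

# Ten ensemble members: same model, starting points nudged by ~0.001
members = [integrate(base + rng.normal(scale=1e-3, size=3)) for _ in range(10)]

xs = [m[0] for m in members]
print(f"x after 20 time units: min {min(xs):.1f}, max {max(xs):.1f}")
# Tiny differences in the starting conditions grow into a wide range of outcomes.
```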

And the weather today...

So numerical weather prediction models used for day-to-day forecasting are run at high resolution and high complexity, but can only go a week or so into the future. Their accuracy has improved greatly in the last few decades: according to the Met Office, a forecast for three days ahead is now as accurate as a forecast for one day ahead was in the 1980s. Below (Figure 3) is the European Centre for Medium-Range Weather Forecasts' (ECMWF) verification of forecast skill at different ranges over the decades.


Figure 3: ECMWF’s verification scores for a range of forecast ranges. Source: ECMWF.
Climate models on the other hand are run with lower complexity and lower resolution, allowing them to be run out to represent decades. Since large scale climate modes such as ENSO (or the AMO, or MJO, or many others) can influence storm activity, intensity and track, GCMs are invaluable tools in helping us understand the broader climate, as well as the small-scale processes.


Basically, GCMs can be run at different resolutions with different input data depending on the application (e.g. weather forecasting or climate experimentation). Computing power dictates how these model configurations perform and the range over which they can produce outputs in a reasonable run time. They have developed into the key tool for understanding our weather and climate and their interactions with the Earth's surface (via other modelling approaches such as land surface models or ocean circulation models). 

Wednesday, 23 December 2015

A Vanishing Sea of Toxic Dust Storms


In the last Climate Change MSc lecture of 2015, a case study was presented regarding the changes that have happened in a relatively short space of time in the Aral Sea, on the border between Kazakhstan and Uzbekistan. Figure 1 below clearly shows the reduction in the area covered by water in the sequence which runs from 2000 to 2015.


Figure 1: Aral Sea satellite image sequence from 2000 to 2015 (looping). The black outline is the approximate lake shoreline in 1960. Source: Constructed animating gif from NASA Earth Observatory images.

Otherwise known as the 'Sea of Islands', this endorheic sea was once the fourth largest inland sea in the world, and allowed fishing communities and agriculture to sustain themselves for decades in the first half of the 20th century. As an endorheic sea (meaning it has no outflow to the ocean) it acts as a terminus for the surrounding hydrological systems, and is also termed a terminal lake. Terminal seas and lakes such as this are very sensitive to changes in climate, for example through changes in evaporation rates. In fact, the Aral Sea has undergone cycles of drying out and filling up over the past 10,000 years (Micklin 2007).

Another picture, Figure 2 (cited by an article on the Aral Sea Crisis on the Columbia University website), shows some images older than those in Figure 1, which highlight the longer-term reduction.
Figure 2: Clear reduction in Aral Sea. When combined with Figure 1 we see the extremes of the reduction in water surface area. Source: http:/www.envis.maharashtra.gov.in and cited by Thompson 2008


The main cause of this reduction was the development of the Karakum Canal, built for agricultural irrigation, shipping and fisheries to allow for the economic development of Turkmenistan. It was started in 1954 and completed in 1988, and has enabled huge areas of Turkmenistan to be committed to high-intensity agriculture, essentially draining the Aral Sea of water.

The reason for this huge engineering endeavour was the farming of cotton. The cotton, nicknamed ‘white gold’, requires a huge amount of water. To make matters worse the engineering practices used to construct the canal allow around 50% of the water to be lost into the ground and to evaporation.

Impact

Micklin noted that the water surface area had fallen by around 75% and the lake level by around 23 to 30 metres (Glantz 2007), which led to a volume reduction of 90% and an increase in salinity of over an order of magnitude, from 10 g/l to over 100 g/l. This led to tragic and severe impacts on the local ecosystems, mainly fish species, and these impacts devastated local communities and made the area extremely inhospitable. 

Knock-on impacts on local communities and industries are numerous. Obviously, the fishing industry has been decimated by the increasing salinity, and agricultural practices are now hampered by the loss of water resources. Mammals and birds have also seen a sharp decline in species diversity: from 1960 to 2007, the area lost roughly half of its species (Micklin 2007).

The other major impact is that over 36,000 km2 of dusty seabed (Wiggs et al. 2003) has been created, providing a large source of extra dust to be picked up by the winds and on occasion whipped into dust storms (Figure 3). Roughly ten dust storms occur in the region per year (Glantz 1999, cited by Thompson in a 2008 article on the Columbia University website).



Figure 3: Dust storms on the coast of the Aral Sea in May 2007 (Source: NASA)

Agricultural waste products containing pesticides, insecticides, herbicides and fertilisers have drained into the sea and accumulated over time; once the sea dried, they became baked into the exposed sediments. The desiccated land surface also potentially contains remnants of the Soviet Union's biological warfare testing in the 1950s, including anthrax, just waiting to be transported around by aeolian processes. Vozrozhdeniye Island, also known as Resurrection Island, remains a controversial subject as it was one of the chief locations for such testing.

Wiggs et al. (2003) studied the link between aeolian dust and child health in populations close to the Aral Sea, and found some associations with local respiratory illness, although there are significant long-distance sources of dust in the region too. Micklin (2007) also confirmed the negative impact on human health and agriculture in the wider area from dust storms that can grow to 500 km in size.

Climate perspective

Although the Aral Sea's reduction is an extreme example, it seems fair to assume that endorheic lakes will come under pressure from global climate change (Timms 2005), whether there is significant human influence or not. The Aral Sea has suffered a two-pronged attack from regional warming and from agricultural exploitation and over-use. Strategies to preserve the remaining water in the North Aral Sea through damming projects, after the sea split into two basins in 1987, seem to be succeeding, which should enable the communities in the area to hold on to their way of life to an extent.

The larger portion of the Aral Sea (the Big Aral), once majestic, now resembles little more than a salty (and toxic) dust bowl, its former islands parched monuments to the impact of cotton farming and, to a lesser but still significant extent, climate change (Aus Der Beek et al. 2011). The region will only come under more pressure if water resources become scarcer as global warming and high evapotranspiration rates take hold. 

Small et al. (2001) examined how the desiccation of such a large area through excessive irrigation has modified sea surface temperatures, precipitation regimes and the hydrological cycle in the area. I wonder if the original plans to build the Karakum Canal took any of these knock-on effects into consideration.

To end, I'll post this interactive storymap hosted by Esri, which highlights some human-induced change since 1990 using Landsat satellite imagery from NASA. The first example is the Aral Sea and, by swiping the dividing line, you can see again how the lake has dried out dramatically and rapidly in the last 25 years. The other pages of the map show further cases of anthropogenic land use change from urban expansion, damming, land reclamation and agriculture.



*UPDATE*
My brother's comment below makes a very good point: such a sad story now serves as an evocative reminder of the impact of human over-exploitation of the environment. This reminder should be documented as it happens, not only in the scientific literature but in art too. We are both keen photographers, so I thought I'd add this link to herwigphoto.com's Aral Sea project. Some amazing and poignant images.

Saturday, 12 December 2015

Palaeotempestology: Tree rings

In my last blog, I explored how the layers of calcium carbonate that build up as a coral skeleton grows can be used as a climate proxy. We find a similar process in tree rings. One of the more established practices in palaeoclimatology is dendroclimatology (the use of tree rings to study past climates). Like other palaeoclimatological proxies, it allows us to extend the range of our observational record beyond that of conventional weather recording instrumentation.

Just as corals live for hundreds of years (sometimes over a thousand years), trees can keep on recording the composition of the atmosphere in their layers of cellulose for many hundreds of years, and beyond when fossilised. Figure 1 below shows an example of Huon pine samples ready for analysis, each dark line denoting a season of growth.

Figure 1: Huon Pine ready for analysis. Source: Edward Cook, Lamont-Doherty Earth Observatory, Columbia University, Palisades, NY

Isotopic differences

Ancient pines are often the favoured study subjects due to their longevity. They can give annual or seasonal information on atmospheric composition. To extend the record beyond a single sample, a variety of sources can be combined together using distinctive signatures as shown in Figure 2 below.
Figure 2: Sources of tree ring data showing how various samples can be linked together. Source: Laboratory of Tree-Ring Research, The University of Arizona

The main process that allows us to look at past storms is the fractionation of stable oxygen isotopes through condensation and evaporation. I touched upon this in my previous blog about corals: it is the difference in atomic weight between the heavier oxygen-18 isotope and the lighter oxygen-16 isotope that allows us to glean clues about past climate events from tree cores.

The difference in atomic weight between oxygen isotopes comes from the number of neutrons in the atomic structure. The most common natural isotope is oxygen-16 (over 99% of atmospheric oxygen), which has 8 protons and 8 neutrons (electrons are virtually weightless by comparison), but stable oxygen atoms can also have 9 or 10 neutrons, making up the other isotopes that we find useful for palaeoclimatology. As mentioned before, water molecules containing the lighter oxygen-16 are preferentially evaporated in warm conditions, while water molecules containing the heavier oxygen-18 tend to condense and form clouds or precipitation more easily. It is this property that allows us to identify different sources of precipitation in tree ring samples.

In extreme precipitation events associated with tropical cyclones, the level of oxygen-18 depletion in the rain water is high, due to the highly efficient condensation processes forming precipitation in the core of a tropical cyclone (Lawrence 1998, Munksgaard et al. 2015). In Lawrence's paper, five tropical cyclones that made landfall in Texas, U.S., were studied; they showed much lower oxygen-18 to oxygen-16 ratios (δ18O) than normal summer convective storms.
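For reference, δ18O is simply the per-mil deviation of a sample's oxygen-18 to oxygen-16 ratio from a reference standard (commonly VSMOW for water samples). A minimal sketch of the calculation, using the commonly quoted VSMOW ratio:

```python
VSMOW_RATIO = 2005.2e-6  # 18O/16O ratio of the VSMOW reference standard

def delta_18o(sample_ratio: float, standard_ratio: float = VSMOW_RATIO) -> float:
    """Return delta-18-O in per mil (parts per thousand)."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

# A sample depleted in oxygen-18 relative to the standard gives a negative value,
# the kind of signature attributed above to tropical cyclone rainfall.
print(f"{delta_18o(1980.0e-6):.1f} per mil")  # -> about -12.6 per mil
```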

This finding was further corroborated by a study of Hurricane Olivia by Lawrence et al. in 2002. Tropical cyclones are also large and long-lived, creating vast areas of precipitation that can stay in the water system for weeks and carrying different isotopic characteristics depending on the location of the heaviest rain bands and the storm centre (Munksgaard et al. 2015). Deep soil water can remain unaffected by normal summer rainfall and, in the absence of further heavy rain events, is available to be taken up by trees (Tang and Feng, 2001).

Oxygen isotope analysis, then, seems to be the favoured form of tree ring analysis for palaeotempestology.

Tapping the potential

Having learned about these methods, it seems reasonable to assume that storms of different intensities and characters will result in different levels of oxygen-18 depletion. It also seems likely that there is much uncertainty in inferring a storm's intensity from isotope fractionation (I'll keep looking for more research on this). At the moment, that uncertainty may preclude a reliable intensity measure for past storms using this approach.

The uptake of oxygen isotopes into the tree's structure depends on many factors, including biological processes that vary with species, tree age, exposure to the storm and soil composition. Growth cycles are also taken into account, to limit the degree to which the mismatch between growth season and storm season can cloud useful information.

In the North Atlantic basin, for example, hurricane season runs from early June to late November and so overlaps mainly with the latewood (as opposed to earlywood) growing phase. It is therefore these sections of the tree rings that palaeotempestological studies focus on.

Miller et al. (2006) presented the emerging case for using oxygen isotopes more widely after the devastation left by the busy 2004 and 2005 hurricane seasons, building a 220-year record that identifies past storms from unusually low oxygen-18 values in pine forests. This is potentially very useful for engineering and loss modelling.

“Can’t see the wood for the trees”

There are many uncertainties in the application of tree ring data to palaeoclimatology, let alone palaeotempestology, as summarised in the review paper by Sternberg et al. (2009): the complex biology of cellulose uptake, changes in the isotopic composition of soil water, and assumptions about the relationship between leaf temperature and ambient temperature, among others.

However, every study adds to the wealth of information, and since each site represents a single-location slice through time, the science of dendroclimatology will only continue to benefit from new data. There still seems to be a push to collect and analyse more: the National Climatic Data Center, hosted by NOAA, is a font of old and recent tree ring datasets.

A recent review by Schubert and Jahren, published in October this year (2015), takes a wide view. It aims to unify tree ring datasets to build a global picture of past extreme precipitation events based on low oxygen-18 records. They conducted 5 new surveys and used 28 sites from the literature to create a relationship, using seasonal temperature and precipitation, which can explain most of the isotopic oxygen ratio in tree cellulose. This seems to be a step up in resolution: looking at seasonal variations rather than annual cycles may be a step closer to identifying individual storms or storm clusters in tree ring data. It is interesting to see a comment in the paper's conclusion that much of the uncertainty remaining in this relationship is derived from disturbances, such as storms.
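For a flavour of what "a relationship using seasonal temperature and precipitation" might look like in practice, here is a generic least-squares sketch. It is not Schubert and Jahren's actual model, and the site data below are random placeholders rather than real measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites = 33
temperature = rng.uniform(5, 30, n_sites)        # seasonal mean temperature (deg C)
precipitation = rng.uniform(100, 2000, n_sites)  # seasonal precipitation (mm)

# Synthetic cellulose delta-18-O values, standing in for real site measurements
d18o = 0.4 * temperature - 0.003 * precipitation + rng.normal(0, 0.5, n_sites)

# Fit d18o ~ a*T + b*P + c by ordinary least squares
X = np.column_stack([temperature, precipitation, np.ones(n_sites)])
coeffs, *_ = np.linalg.lstsq(X, d18o, rcond=None)
print("Fitted coefficients (T, P, intercept):", np.round(coeffs, 4))
```

The point of a relationship like this is that once average seasonal conditions explain most of the signal, whatever remains unexplained, including unusually depleted rings, is where disturbances such as storms are likely to show up, which is exactly the comment made in their conclusion.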


Figure 3: Comparison between measured δ18O in the cellulose of studied trees and the δ18O calculated using the model developed by Schubert and Jahren from known climate characteristics. It shows a good correlation when relating seasonal temperature and precipitation to oxygen-18 isotope ratios. Source: Schubert and Jahren, 2015

It seems clear that it would be much more difficult to develop a simple equation that explains the extremes of the isotopic ratio chronologies in order to identify extreme storms. However, Schubert and Jahren seem to have taken a step forward while remaining focussed on average seasonal conditions. Nevertheless, I can't help but wonder if there is a way for extreme events to be linked in somehow.

Alternatives to isotopes

When looking specifically at past storms in tree rings, I did find a couple of other approaches that may also be worth a mention.  

Firstly, an interesting pair of papers by Akachuka, in 1991 and 1993, examined trees that had been forced to lean by a hurricane, assessing how they recover from such disturbances for any extra clues this might provide. Although the papers do not look specifically at characterising the storms themselves (i.e. there is no wind speed to bole displacement relationship), I couldn't help but wonder whether there is extra information to gather from these trees, and whether we could build a relationship to specific storms or storm seasons.

Another paper, by Sheppard et al. in 2005, looks at the effect of a 1992 tornado on a specific dendrochronology and re-evaluates pre-historical records from wood samples retrieved from an 11th-century ruin in Arizona, looking for similar patterns in wood growth (see Figure 2 for a conceptualisation). Unfortunately, the growth patterns caused by the 1992 tornado were not replicated in the ring patterns of the 11th-century sample. This is certainly interesting work, but I imagine that finding enough data from trees that are damaged by tornadoes yet still survive is not easy, especially when comparing against single older samples.

Conclusions

Although individual studies using tree lean or damage from specific events like tornadoes are interesting and worthwhile academic endeavours that help us understand how storms of various scales affect tree growth, they seem somewhat less applicable to thinking about climate change and how the frequency and severity of storms are changing over a wide area.

With so many subtleties arising from factors such as tree species or the topography of a study site, I feel that the broader synthesis approaches using stable oxygen isotopes (as per Schubert and Jahren above) offer greater immediate potential for aiding our understanding of past changes in storm activity, with the possibility of application to risk assessments and to projecting the impacts of future climate change. 

Saturday, 28 November 2015

Palaeotempestology: Lake sediment records

Digging in to sediment records

With continued debate among scientists on exactly how future climate change will affect storm frequency and severity, it seems logical to see if we can find out more about variability in storm activity from the past.

Lake sediments are extremely useful for studying past climates for which we have no observational record (from conventional weather recording equipment). They provide a slice through time in which to look at changes in lake chemistry and in the environmental activity affecting the make-up of the suspended particles that eventually settle at the bottom of the lake.

Radiocarbon dating, thickness of layers of different sediments, analysis of diatoms and inference from the occasional break in the record (a hiatus, perhaps due to the drying out of a lake), are various ways in which lake sediments can give us clues about the past.

Within this range of approaches there are a few ways in which lake sediments can be used to look at past storm events. In my previous blog, I highlighted a paper by Dr Jeff Donnelly et al. in 2015 entitled "Climate forcing of unprecedented intense-hurricane activity in the last 2000 years". It presents a history of storm events over the past two thousand years, using an analysis of sediment grain size in the collected samples, with a resolution of around one year. The work uses evidence gathered from fieldwork during the project (and previous studies) to identify two distinct periods of higher intense-hurricane activity along the western North Atlantic coastline of North America: one between 1400 and 1675 C.E., and an earlier period of frequent storms between 250 and 1150 C.E.

The study location is a place called Salt Pond, in Massachusetts. It has a tidal inlet linking it to the ocean, making its waters brackish. This proximity to the ocean means the pond is exposed to 'overwash' during storm surge events associated with large storms heading northwards along the Eastern Seaboard of the United States. These salt water incursions occur when the storm surge is higher than any natural or man-made defences. The 'overwash' leaves 'coarse grain event beds', which can be used as an indicator of severe storm activity. The process is validated using known hurricane landfalls, which are represented in the sediment records and act as 'anchors' to verify that the samples are valid.

The study builds on a number of papers that were produced after a workshop on Atlantic palaeohurricane reconstructions convened in 2001 at the University of South Carolina, which aimed to identify new opportunities in the field of palaeotempestology. A summary of the workshop can be found here. Dr Jeff Donnelly and colleagues studied a number of lakes in the Northeast of the US, in New Jersey and New England, so to learn a bit about the methodology I dug into some of the papers in more depth.


Getting your hands dirty

It seems the only way to get at the clues available from sediment records is to get your hands dirty. I found an earlier paper by Donnelly et al. from 2001 which built a 700-year sediment record of severe storms in New England. This paper (along with a couple more: Boldt et al. 2010, Liu and Fearn, 2000) started to show me that each project strategy is subtly different. 

Various sampling schemes are planned based on the conditions of the study sites, to find the best locations for sampling overwash deposits in a consistent manner. The aim is to capture, consistently, the process by which more intense storms erode more sand from the coastal beach and carry this coarse sediment into the brackish lakes and ponds; larger storms are assumed to produce wider fans of overwash sand deposits, thicker near the shore and thinner towards the centre of the study lake. A range of samples should be taken to try to represent the range of possible characteristics of past intense storms. Figure 1 (below) is a hypothetical diagram from Liu and Fearn (2000) showing various patterns of deposition. Note the radial patterns associated with the various directions of storm approach, with the larger fans associated with more intense storms.

Figure 1: Hypothetical coarse grain deposition fans in severe storm surge events. Source: Liu and Fearn, 2000 
The coarse sand creates a layer over the more usual organic-based deposits that settle on the bottom of a lake in stratified layers. This happens most effectively in anoxic lake beds (lacking dissolved oxygen), since any mixing from plant or animal life will be minimal.
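As an illustrative sketch of the resulting signal (not Donnelly et al.'s actual procedure, and using invented placeholder numbers), the coarse event beds show up as spikes in the sand fraction of a downcore record, which can be flagged against the organic-mud background:

```python
import numpy as np

# Hypothetical downcore sand fractions, top of core first; values are invented.
sand_fraction = np.array([0.05, 0.04, 0.06, 0.55, 0.07, 0.05,
                          0.04, 0.62, 0.48, 0.06, 0.05, 0.04])

background = np.median(sand_fraction)   # typical organic-mud background
threshold = background + 0.2            # hypothetical exceedance criterion
event_layers = np.where(sand_fraction > threshold)[0]

print("Candidate overwash event beds at sample indices:", event_layers)
# -> indices 3, 7 and 8: one isolated bed plus a thicker (or clustered) deposit,
#    which is exactly the kind of ambiguity discussed further below.
```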

Having never been in the field to collect sediment samples, I found it interesting to see how Donnelly et al. (and other teams) maintained a consistent chronology in the sediment records. They took multiple samples and used a variety of methods to build their chronology.


Markers in time

Radiocarbon dating and stratigraphic markers are used to establish control points that validate the data. Pollution horizons are useful in this respect: for example, lead concentrations mark the beginning of the industrial revolution, as lead quickly made its way into water systems and lakes and was then 'fixed' by anoxic sediments. The presence of lead pollution is an indicator of the late 1800s (Donnelly et al. 2001), and another change occurs when lead was removed from gasoline in the 1970s and 1980s. This is a good example because it shows how these markers can calibrate sediment records in a way that is easily understood and recognised.

Pollen records can also mark certain points in history, for example the European colonisation of the eastern U.S. led to large scale clearance of the vegetation for farmland meaning that the pollen composition changes drastically (Russell et al. 1993).

Once these markers are established, known historical storms are used to calibrate the record of storm events, and earlier coarse grain event layers are then identified and carbon dated.


Clear as mud?

So, having learned a lot more about sediment analysis in relation to palaeotempestology, I now have a greater respect for what these cores of old mud and sand can tell us about the past. However, it does seem to me that there is still a large degree of uncertainty in the data when trying to discern individual storms. For example, what if two storms occur in quick succession, as a cluster, before a sediment layer has had a chance to settle and 'lock in' the information? This may end up looking like one larger or more intense storm, when actually it is the frequency of storms in that season which is varying. Donnelly et al. (2001) give an example from their study location of a lack of agreement between historical accounts of two intense storms, in 1635 and 1638, which likely created overwash signatures, but for which the sediment proxy data indicated only one event. This means that the estimated frequencies may carry significant uncertainty.

Also, the response of a lake or pond to overwash events may change over time due to changes in natural or man-made barriers. Even with these uncertainties in mind, however, it is clear that there is great value in understanding the clues left behind by past storms in our coastal lake sediments. 

Without any alternative information, the best that we can do is to piece together palaeotempestological proxies and glean snippets of information to build a longer record of storms.

It also provides grounds for comparison when using climate models to try to understand past variability, another subject I intend to explore in a future blog.

For now, I'll leave you with an informational video by Ocean Today, produced with the Smithsonian Institution and NOAA just after Hurricane Sandy in 2012, which should give a clear demonstration of what overwash looks like and how coastal beach material can be dragged across to end up in lakes or ponds that lie close to the ocean, giving us these markers of past events.



My next blog will be on the evidence that can be derived from coral cores.

Thursday, 26 November 2015

Palaeotempestology series: Introduction

In a previous blog, I talked about the various ways in which historical documents, records and anecdotal evidence are used in climatology. I also mentioned briefly some of the environmental proxies used to derive information about the climate throughout the whole of the Earth's history using palaeoclimatological techniques. Studying past climates is an essential part of any debate on climate change, and a huge amount of science has been produced in this field, both in improved methods and in developing datasets.

Depth of data

Ice core, lake sediment, tree ring and coral analyses (and more) have been conducted around the world over the last few decades to build the picture of past climates that we have today. The National Oceanic and Atmospheric Administration (NOAA) in the U.S. has an online portal and interactive map (Figure 1) showing the geographical spread of the data. I knew there was a lot of data out there, but this map really puts into perspective the amount of work that has been done to gather information around the world, while also showing that there are still many gaps and much more that could be done. Check out the Climate Data Online interactive map of palaeo records here.

Figure 1: Screen shot on NOAA's Paleoclimatology interactive map at Climate Data Online. Source: NOAA (https://gis.ncdc.noaa.gov/map/viewer/#app=cdo&cfg=paleo&theme=paleo)


Depth of study

As a snapshot to show the amount of research into palaeoclimatology, a useful list of just one year’s worth of research is compiled here by the team at the 'Skeptical Science' website.

Palaeoclimatological proxies are signatures left behind in the natural environment that can tell us something about the climate in the past. They require detective work and often sophisticated laboratory analysis, but can provide windows into the past to show us data that are otherwise not available.

They are often used to derive temperature trends over thousands of years, from which drought periods can be inferred, or to develop records of past atmospheric composition (useful for revealing changes in greenhouse gas concentrations), but certain proxies can also be used to investigate past storm activity.

Pre-historical storm evidence

Since I am obsessed with storms, when thinking about pre-historical records I couldn't help but be drawn towards palaeotempestology (a term coined by Professor Kerry Emanuel at MIT), the study of pre-historic storms. In this context 'pre-history' refers to the time before the instrumental record of weather and climate observations, which is generally no more than 100-150 years long at best, and shorter still if you consider that full representation of all the storms that occur has only really been possible since we began observing the weather with satellites.

The first satellite used to observe weather conditions was TIROS I, launched on April 1st 1960; initially it could only tell us some basics about the locations of clouds, as analysed by hand. Figure 2 below shows the very first image from this satellite.

Figure 2: The first image sent back from the first satellite used to observe the weather. SOURCE: NOAA/NESDIS
Satellite technology and application has come a long way since then (I’ll likely cover this in a future blog).


Palaeotempestology aims to look back hundreds or even thousands of years, so I'll take a bit more time on this subject. In my next few blogs, I shall aim to investigate, and share, more on the various sources of data: I'll drill down into sediments (Figure 3), 
Figure 3: Heavy duty sediment core retrieval. Source: NOAA image by Ane Jennings. (ftp://ftp.ncdc.noaa.gov/pub/data/paleo/slidesets/heinrich/heinrich08.jpg)


swim through the information on coral cores (Figure 4),
Figure 4: SCUBA scientists extracting a core from coral. Source: NOAA image by Maris Kazmers. (ftp://ftp.ncdc.noaa.gov/pub/data/paleo/slidesets/coral/coral12.jpg)

and circle around the subject of tree rings (Figure 5).
Figure 5: Scientist preparing to take a sample from a Giant Sequoia tree. Source: NOAA image by Peter Brown. (ftp://ftp.ncdc.noaa.gov/pub/data/paleo/slidesets/treering/tree01.jpg)





Sunday, 22 November 2015

Notes on COP21 - follow up interactive infographic

While thinking about the COP21 negotiations, now under two weeks away, I came across this excellent interactive infographic produced by the World Resources Institute. It shows the greenhouse gas contributions of different countries, split by sector sources too. 

It's a good quick reference guide when comparing countries that I couldn't resist sharing, as a quick follow up to my previous blog post about the critical meeting in Paris.





I found it at the bottom of an interesting article regarding China's INDC; the article comments on the boldness of China's commitments to a low-carbon future. My favourite comment from the article is that the INDCs should be seen as a 'floor' rather than a 'ceiling' on ambition! Hopefully, this advice is heeded at the negotiations.

Sunday, 15 November 2015

Notes on COP21

A couple of weeks ago, on the 5th of November, I attended an evening presentation on COP21. There weren’t any fireworks but it was an illuminating talk, so I thought it would be useful to turn my notes into a blog and discuss the various points raised.


Source: Official COP21 logo



The presentation was by Jesse Scott, from the International Energy Agency (IEA). She's an ex-campaigner and lobbyist who has also worked in civil service in Paris before moving to the IEA.

Initially, she charmed us with her passion for the subject of climate change by describing how, with so many different issues, interests and stakeholders, climate change is simply too interesting to ignore from a political perspective.

She spoke to the audience with authority, about what COP21 actually is, and what it is trying to do. She gave an overview of the science, technology and economic linkages, and then moved on to discuss how COP will work in practice.

She explained it all in real terms, so the talk worked very well as a primer on the 21st conference coming up soon. In this blog, I have used her presentation as the basis of my discussion of the COP21 meeting.

Firstly, a brief overview and history is as follows:
  • The Conference of the Parties (COP) has met roughly annually since 1995 (the first was held in Berlin; the full list of meetings can be found here) to assess progress in dealing with climate change under the UNFCCC, and it is the decision-making body of the framework.
  • It was the driving force behind the Kyoto Protocol (COP3, in Japan), which was the first major example of legally binding obligations for developed countries to reduce their greenhouse gas emissions.
  • The UNFCCC comprises 196 parties and is committed to stabilising greenhouse gas concentrations at a level that presents as little danger as possible to the global community.
  • Since Kyoto there have been attempts to update the legal obligations; the biggest and most recent was in Copenhagen in 2009 (COP15), which was deemed a failure as a universal agreement could not be reached.
  • Paris is the next big concerted effort to reach binding legal agreements, based on the commitments outlined by each country ahead of the conference in their Intended Nationally Determined Contributions (INDCs), most of which have already been submitted.
  • At COP19 and COP20 the decision was taken that these INDCs would be declared before the conference, to promote clarity, transparency and understanding of each country’s position and of the methods they will choose for their adaptation and mitigation strategies.


Recommended Reading
Jesse Scott recommended an article on Christiana Figueres in the New Yorker as an excellent primer on COP21. Ms Figueres is the Executive Secretary of the UNFCCC. A transcript of her speech to the 1st Global Climate Legislation Summit a couple of years ago expresses how focussed she is on delivering the goals of the UNFCCC, and the strength of her advocacy for climate change legislation. She will be blogging throughout COP21, so it is worth following her articles. A recent article in the Guardian, here, also shows her optimism for these talks.

During the presentation, Ms Scott gave us a whistle-stop tour of the science (via the IPCC) as a precursor to a dialogue on broader governance and political aspects.

A few initial areas she touched upon include:

Procrastination: She discussed examples of action today being more valuable than reactive action in the future. This reminded me of an often-quoted figure regarding resilience which is replicated in various reports; in one case, the UNDP state that for every one dollar spent on disaster preparedness, we save seven dollars on emergency response, as highlighted in their #Actnow campaign. This is relevant in a warmed world that may see greater extremes of climate. It also reminds me of the old saying 'a stitch in time saves nine': act now to stop a worse situation in the future.

Technological advances: In recent years, advances in technology have allowed companies to start realistically considering how to maintain their economic growth trajectories while investing in sustainable and energy-efficient technologies.


Communication: She also talked about the difficulty of communicating risk and uncertainty, and described a game designed by Pablo Suarez and his team that allows us to experience the difficulty of managing climate risk. Gamification can be an effective way to communicate complex processes.

Climate Justice
Climate Justice is an important and complex principle in the debate on climate change. Climate change doesn't deal out its impacts evenly from a human-centred perspective; as Ms Scott described, the poor and the young tend to be most vulnerable. Historically, there is also a disparity in that the countries with a long history of high carbon dioxide emissions are those who have benefited most and are most resilient to future impacts. Furthermore, in terms of where the changes are required, the biggest emitters of greenhouse gases in the past, present and future are argued to be those that need to take most responsibility.

The question of who should do what to mitigate anthropogenic climate change (as we understand it) is a question of science, politics and responsibility, and it relies largely on metrics. In some respects, it depends on how emissions are compared and on the political sway of those in power.

When looked at in the context of the 17 UN Sustainable Development Goals, different countries and regions will have differing priorities across many of the goals, but with climate change everyone is a stakeholder and everyone has some exposure to the risk.

Being a trans-boundary and inter-generational challenge, climate change requires the concerted, long-term commitments and efforts that COP21 is aiming towards.

Mary Robinson (former President of Ireland) was quoted on Climate Justice and human rights, summing up that 'Climate Change impacts are biggest on those who have done least to produce them'. Using IPCC parlance, it is extremely likely that this is the case.

The inter-generational aspects also highlight that those who have made no past contribution to greenhouse gas emissions (those as yet unborn) will be the ones feeling the impacts of climate change for the longest.


Some small island nations are starting to plan for the displacement that will be caused by climate change. The people of Bikini Atoll, a site of nuclear testing in the 1940s and 1950s (and the namesake of the swimsuit), have applied for land in the U.S. to relocate the population due to rising sea levels.

Source: Getty Images via BBC (http://www.bbc.co.uk/news/science-environment-34642692)

Ms Scott affirms, from the perspective of the IEA, that the main solution is to find a way to provide 'clean energy' for everyone. Allowance must be made both for the ever-increasing demand for energy in developed countries and for the extra energy needed to foster development in poorer countries.


Good COP, Bad COP
The talks in Copenhagen in 2009 (COP15) were largely seen as a failure, in that binding commitments could not be reached for the countries with the highest emissions. However, there is reason to be optimistic; Ms Scott describes herself as being ‘very cautiously optimistic’. A lot has changed since 2009:

  • The science has moved on, especially through another round of IPCC research.
  • Technology is offering new solutions for renewable and efficient energy at a dramatically reduced cost. 
  • The U.S. and Chinese governments have steadily shifted towards addressing climate change over the last half-decade, with recent confirmation of their positive intentions on the climate (the joint presidential White House statement on September 25th).
  • G7 countries are addressing climate change in a practical manner, and expressing a feeling of responsibility, as seen here in a White House press summary from June.
  • Lessons have been learnt from the difficulties in Copenhagen. There is generally a much brighter outlook on the potential for meaningful and binding agreements being reached, based on the INDCs.
  • The Pope has issued a number of statements regarding climate change.
  • Mark Carney of the Bank of England has delivered recent speeches regarding climate change and the role of the insurance industry in managing future risks. The ‘1-in-100 initiative’ is an example of how methods used in the insurance industry can benefit both the public and private sectors if adopted more widely in risk management processes (see the short sketch after this list).
  • Military interests (e.g. NATO) are concerned about their resources in a warmer world. If they need to use their troops for disaster relief after severe events, both at home and abroad, how does that affect their ability to maintain national security? This issue is amplified if we can expect changes for the worse in the frequency and/or severity of extreme events like floods and storms, as well as in migration and the likelihood of conflict arising from severe droughts.
  • On the public front, celebrity endorsements (for example Leonardo DiCaprio, who spoke eloquently at the UN climate summit last year) have continued, and activism continues to put pressure on companies and governments to invest in environmentally responsible technologies.
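
As an aside on the ‘1-in-100 initiative’: the 'tail' risk it refers to is essentially the annual loss a company would expect to exceed only about once in a hundred years. Purely as an illustration (my own toy example in Python, not the initiative's actual methodology), here is a minimal sketch of how such a figure might be estimated from a simulated loss distribution; the distribution and its parameters are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(2015)

# Illustrative only: simulate 100,000 years of annual losses (in $m) from a
# heavy-tailed lognormal distribution. The parameters are made up for this
# sketch and are not taken from any real portfolio or from the initiative.
annual_losses = rng.lognormal(mean=2.0, sigma=1.2, size=100_000)

# The "1-in-100" (tail) loss is the 99th percentile of annual losses:
# the loss expected to be exceeded with roughly 1% probability in any year.
one_in_100_loss = np.percentile(annual_losses, 99)

print(f"Average annual loss:  ${annual_losses.mean():.1f}m")
print(f"1-in-100 annual loss: ${one_in_100_loss:.1f}m")
```

In practice insurers would build such loss distributions from catastrophe models and historical claims rather than a single lognormal, but the headline number a company might disclose is the same kind of percentile.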


INDC - Are all of the cards on the table?
As of last week, 85% of the INDCs had been submitted. Hopefully these will be much more robust than previous efforts, focusing on the three elements that are required for any mitigation strategy to work: Monitoring, Reporting, Verification.

It should be noted that these INDCs are self-defined, so the cynic in me suspects that they may be quite lenient. Equally, there is probably little alternative to this approach, as every country has different processes and issues that need to be understood before it can act on its obligations. The sum of the INDCs should add up to one global agreement that is achievable within the context of each nation - not easy to achieve.

If the details of the INDCs were not country-specific, it would be very difficult to find common ground. For example, in terms of monitoring, there is the question of which metric to use. Per capita CO2 emissions may favour China, with its large population, but would such an average be representative given the huge disparity between its high and low (rich and poor, respectively) emitters?





Would using per capita metrics put other countries at a disadvantage? Conversely, using absolute (total) emissions would perhaps put China at a disadvantage, since it tops the list of total emitters.
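
To make the metric question a little more concrete, here is a toy Python sketch comparing total and per capita CO2 emissions for a handful of large emitters. The numbers are rough, rounded illustrative figures from around the mid-2010s, chosen only to show how the ranking changes with the metric; they are not official statistics.

```python
# Rough, rounded figures for illustration only (approximate mid-2010s
# fossil-fuel CO2 emissions and populations); NOT official statistics.
emitters = {
    # name: (total CO2 emissions in Mt per year, population in millions)
    "China":         (10_300, 1_360),
    "United States": (5_300, 319),
    "India":         (2_300, 1_295),
    "EU-28":         (3_400, 508),
}

print(f"{'Country':<15}{'Total (Mt)':>12}{'Per capita (t)':>16}")
for name, (total_mt, pop_m) in sorted(emitters.items(),
                                      key=lambda kv: kv[1][0], reverse=True):
    per_capita = total_mt / pop_m  # Mt per million people = tonnes per person
    print(f"{name:<15}{total_mt:>12,}{per_capita:>16.1f}")
```

Ranked by total emissions China sits at the top, yet on a per capita basis it falls well below the United States, which is exactly why the choice of metric is so politically charged.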


This type of conversation will no doubt be had during the negotiations.

Disclaimer
I'm still getting to grips with the politics of climate change (luckily I'll be studying it more specifically next term), but this talk certainly helped me gain a deeper appreciation of the complexity and importance of the COP21 meeting. If there is anything in this blog with which you disagree, or that looks to be misunderstood, then feel free to comment and let me know. Most likely it is down to me being new to thinking about these negotiations in depth, and I'm eager to learn more.

Everyone is a stakeholder in looking after our climate and developing a sustainable environment, and that also means that there are lots of different opinions and views. I agree with Jesse Scott that it really is a fascinating topic to study.

A final word
I wrote most of this blog last week, but between then and posting, the tragic events in Paris have unfolded. My thoughts are with the families and friends of those directly affected and the people of Paris as they recover from this despicable and horrific act. The Foreign Minister of France, Laurent Fabius, has a short quote on the home page of COP21 today, saying simply:


“The COP is maintained”