
Sunday, 10 January 2016

After the storm, comes the calm…

This is my last blog before taking a break, but I just wanted to take a moment and use one of my favourite paintings as an inspiration for a few final words.

Figure 1: Snow Storm - Steam-Boat off a Harbour’s Mouth by Joseph Mallord William Turner. Source: Tate.org.uk


‘Snow Storm’ by Turner is a perfect way to end this blog. He is one of my favourite painters and inspired me to go to art school before studying science. He also helped me along the way to becoming a weather forecaster (a career lasting for 10 years) and now to being obsessed with big storms (although the Great Storm of 1987 in the UK also had a strong imprinting effect).

The painting shows his interpretation of a storm; one of many pictures he painted on the subject of vortices. Legend has it that he strapped himself to the mast of a ship during a storm, to experience extreme weather first-hand. In doing so he was risking his own life for his art. Whether true or not, it's a great story.

It strikes me as a poignant image in light of today's debate on climate change. To me, it represents the belligerent march of industry while the environment carries on in its naturally ferocious, unforgiving, but ultimately beautiful way. It also motivates me to consider more of what we can do to mitigate and adapt to a future of more severe extreme weather – a future which is likely to have been brought about by the actions of a few industrializing nations.

Even though I’m signing off from this blog for now, I feel it is certainly the beginning of many more. I have learnt a lot about climate change while writing these posts, and I sincerely hope that you have enjoyed what you have read. I will continue to grow in my knowledge of climate issues, and try to synthesize the science with an even and balanced view.

If you too are finding out more about our natural environment and have any climate-related comments or questions about anything I have posted, then please feel free to leave a comment.


But for now, goodbye and happy blogging!

Thursday, 7 January 2016

The Great Hurricane Debate

I have talked about the past in my palaeotempestology series, and some more about how models can help us and what they are capable of representing at different scales. But what can we really say about the present-day climate and recent changes when it comes to tropical cyclones like the devastating Hurricane Katrina (Figure 1)?

In this blog, I ask: What is the current thinking on tropical cyclones in a changing climate? 

Figure 1: Hurricane Katrina just before landfall. Source: NASA


When we talk about storms and global warming there is a lot of uncertainty. Although we cannot attribute any particular storm to global warming, we can talk about a shift in the likelihood of certain types of storm. Figure 2 below shows this concept (using temperatures). The graphs show a distribution of possible futures, with the central vertical line representing the most likely. The curves falling away on either side represent a lower probability of occurrence as we head towards the extremes.

With climate change, what is normal now is likely to shift one way or another (warmer, in terms of global temperature), which will change the likelihood of what we currently perceive as extremes. The phrase ‘new normal’ is an evocative way of emphasising the shift. Below is an example of how different changes in probability distributions affect temperature, but any variable can be displayed in this way, for example the probability of occurrence of storms.

Figure 2: Different changes in probability distributions of temperature. Source: Kodra and Ganguly 2014.
Figure 2 is useful in explaining probability shifts on a normal (Gaussian) distribution. Don't worry about the small text:
  • The top graph (a) shows how shifting the whole distribution to the left or right increases the likelihood of extremes (in the shallow edges of the curve) in the direction of the shift.
  • The middle graph (b) shows the effect of reducing the peak probability of occurrence (the average, most frequently occurring conditions) and the accompanying ‘fattening’ of the extremes (also known as ‘fat tails’), which represents a change in variability.
  • And the bottom graph (c) shows how changing the shape of the curve may affect average conditions while leaving the extremes largely untouched.

This is how we need to think about possible climates in the future, and how distributions of storm intensity may change.
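The shift in Figure 2(a) can be sketched numerically. The temperatures and threshold below are invented purely for illustration, not taken from the paper:

```python
# Illustrative only: how a small shift in the mean of a normal (Gaussian)
# distribution changes the probability of exceeding a fixed "extreme" threshold.
from statistics import NormalDist

baseline = NormalDist(mu=15.0, sigma=5.0)   # hypothetical temperature climate
shifted = NormalDist(mu=16.0, sigma=5.0)    # same variability, mean shifted by +1

threshold = 25.0  # an arbitrary "extreme heat" threshold

p_before = 1 - baseline.cdf(threshold)
p_after = 1 - shifted.cdf(threshold)

print(f"P(exceed {threshold}) before shift: {p_before:.2%}")
print(f"P(exceed {threshold}) after shift:  {p_after:.2%}")
```

A one-degree shift in the mean leaves the middle of the distribution looking much the same, but increases the probability of exceeding the fixed threshold by more than half, which is exactly why small average changes matter so much for extremes.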

Dicing with extremes

Another way to put it is with a dice analogy. On a ten-sided die (yes, I do own one, from my Dungeons and Dragons days), let’s say rolling a 1 or 2 represents a category 1 hurricane and a 9 or 10 represents a category 5 hurricane. These are obviously not real probabilities, but if we assume global warming is shifting storm intensity towards the severe end, all we are saying is that in the future a category 5 storm might occur on a roll of 8, 9 or 10, while the category 1 storm we know today only occurs on a roll of 1.
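The die analogy can be simulated directly. The category assignments below are the made-up ones from the paragraph above, not real storm statistics:

```python
# Simulate the ten-sided-die analogy: a "baseline" die and a "warmed" die
# where the category 5 faces have expanded and the category 1 faces shrunk.
import random

def classify(face, warmed):
    """Map a die face (1-10) to a storm category under either climate."""
    if warmed:
        return "cat5" if face >= 8 else ("cat1" if face == 1 else "other")
    return "cat5" if face >= 9 else ("cat1" if face <= 2 else "other")

random.seed(42)
rolls = 100_000

for warmed in (False, True):
    counts = {"cat1": 0, "cat5": 0, "other": 0}
    for _ in range(rolls):
        counts[classify(random.randint(1, 10), warmed)] += 1
    label = "warmed:  " if warmed else "baseline:"
    print(label, {k: round(v / rolls, 3) for k, v in counts.items()})
```

The simulated frequencies settle near 20%/20% for the baseline die and 10%/30% for the warmed one: the overall number of "storms" rolled is unchanged, only the mix of intensities shifts.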

So what’s the story for tropical cyclones?

There has been much debate over the years. Theoretical reasoning relies largely on the impacts of increasing temperatures on sea surface temperature, and on extra water vapour in the air, which are key factors in generating and developing tropical cyclones. However, the climate is not so simple. In a ‘Science: Perspectives’ piece by Kevin Trenberth in 2004, he explains how there is large variability in hurricane activity linked to ENSO, but as sea surface temperature and water vapour are increasing, they could enhance convection and therefore impact the intensity and rain-making potential of tropical cyclones. He also comments that trends in tracks and activity rates are harder to quantify.

It seems that much recent study concerns changes in the frequency, rather than the intensity, of storms. In a previous blog, I noted that climate models can represent the atmospheric conditions that encourage tropical cyclone genesis; intensity, however, is more difficult, due to the small-scale features involved in storm development and convection that govern exactly how strong the wind becomes or the depth of pressure in the eye of the storm. This means that although we may be able to identify trends, we are unlikely to be able to quantify the changes, which limits their application.

Through the Coupled Model Intercomparison Project, now on its fifth round of comparisons of the world’s biggest climate models (CMIP5), we are steadily making progress. Bellenger et al. 2014 describe how the latest models can capture modes of climate variability that influence tropical cyclone formation and evolution. The paper highlights an improvement in a previous cold bias of sea surface temperatures in the Pacific Ocean, but generally not much difference elsewhere, allowing the use of both CMIP3 and CMIP5 models when assessing ENSO. Climate models now also capture monsoon rains with high confidence (IPCC: WG1 Summary for Policy Makers). Historically, CMIP models have developed vortices that represent tropical cyclones, but they are generally too weak (IPCC: WG1 AR4 Chapter 8).

Turning up the dial on Tropical Cyclone Intensity

The general consensus seems to be that tropical cyclones are not necessarily expected to increase in frequency, but they are likely to increase in severity. A shift in the intensity of storms towards the stronger wind speeds is a likely impact of global warming according to Holland and Bruyere (2013) as we can see from Figure 3.

Figure 3: Saffir-Simpson scale hurricane category proportion of total North Atlantic Tropical Cyclones (including Tropical Storms), changing through time (years indicated in the legend). Source: Holland and Bruyere (2013)
  
Back in 2005, Kerry Emanuel also discussed trends in tropical cyclone activity, and how their destructiveness had increased over the previous 30 years. A recent paper from Estrada et al. (2015) concurred with this finding, linking US$2 to $12 billion of the losses incurred during the busy 2005 hurricane season to the effects of climate change. Tom Knutson (2004) also found similar results by studying the choice of climate models used to define the CO2-related warming, and the choice of parameterization schemes for convection in hurricanes. He found increases in tropical cyclone intensities linked to high-CO2 environments (anthropogenically warmed simulations) in his model analysis.

Christopher Landsea, of the National Hurricane Centre in Miami, questioned many of Emanuel’s methods in an article in Nature (2005). A debate ensued that caused a rift in the meteorological community. Landsea agreed with Will Gray in concluding that most of the variability in tropical cyclone frequency, and especially intensity, derives from natural variability, or at least that the observed data cannot establish a significant link to global warming.

Gray preferred a theory linking hurricane intensity to the Thermohaline Circulation in the world’s oceans. An interesting article in the Wall Street Journal in 2006 highlights some of the awkward moments surrounding this passionate debate. Another, more scientific, angle from the guys at RealClimate.org highlights where the different sides of the argument formed regarding global warming’s effect on tropical cyclones.
Looking back now, it seems apparent that the extra attention from two active hurricane seasons in a row, 2004 and 2005, may well have added fuel to the fire of the debate (Trenberth 2005).

Poleward Bound?

The IPCC synthesis of past and future global changes in tropical cyclone frequency provides only low confidence (IPCC: Summary for Policy Makers); however, some regional patterns have been elucidated.

Another interesting recent paper by James Kossin et al. 2014 found a slow poleward migration of tropical cyclone maximum lifetime intensity; a metric which is relatively insensitive to uncertainty in past data. The trend is fairly small and covers only the last 30 years, so I wonder whether this is indeed another anthropogenic signal or part of natural variability. The main implication of such a poleward shift of the region affected by tropical cyclones is that areas that have never experienced them before (and are therefore perhaps not built to withstand their destructive force) may become exposed in the future. Conversely, areas nearer the Equator that currently lie in tropical cyclone-affected regions may see a lower frequency of events.

Interesting stuff, and I look forward to more papers on this subject.

Latest models

Mizuta et al. (2012) showed how recent high-resolution climate models (at 20 km resolution or so) have improved the characterisation of tropical cyclone intensity, at least to the extent of being able to examine distribution shifts within their own outputs, but they still cannot tell us exactly how future storms will look in the year 2100. Models are also able to represent variability in yearly activity rates at fairly low resolution (100 km) (IPCC: WG1 AR5 Chapter 9), which is striking given that most tropical cyclones are only a few times bigger than such a grid cell.

It’s an exciting time for climate modelling. I remember only 5 or so years ago as a forecaster that the global model resolution of the operational weather forecasting models was around 20 km. It’s amazing to think that this resolution is now being used to experiment with future climates over years and decades.

Conclusion

Recent findings echo the higher intensity theory, hence the inclusion of increases in tropical cyclone intensity in the late 21st century described as: “More likely than not in the Western North Pacific and North Atlantic.” (IPCC: Summary for Policy Makers). The latest IPCC report also concludes that it is “virtually certain” that there have been increases in intense tropical cyclone activity in the North Atlantic since 1970, but low confidence that this is anthropogenic in origin.

An increase in intensity certainly makes sense to many in the field – more water vapour and higher sea surface temperatures in the system may not create more tropical cyclones, but may well allow them to become stronger. After all, tropical cyclones are only trying to redistribute heat towards the poles, so more heat potential has to end up somewhere… right? But then, if global warming is affecting the poles more than the tropics, surely this counteracts the effect somewhat by reducing the gradient. Perhaps the global gradient has something to do with changes in frequency. This will have to be a future blog subject too!

Ultimately, most of the scientists studying tropical cyclones around the world agree that global warming is happening, and that it is very likely to be anthropogenic in origin (IPCC: Summary for Policy Makers). Although some still contend that there is not enough evidence to confidently maintain that the intensity of tropical cyclones is increasing globally, there is a strong signal that tropical cyclones in some regions have already increased in intensity, and strong hints that intensity will continue to increase in the future.


Hopefully, we’ll get to a point where the science is settled and we can get on with adapting to the consequences of our changing climate. It certainly would be better to have more study to prove the idea beyond reasonable doubt.

Tuesday, 5 January 2016

Thoughts after COP21 and the role of risk assessment and insurance

I recently published a blog for my employer's blog site summarising some of the outcomes of the COP21 meeting in Paris. I focused on some of the developments in the financial world that may be able to help with adapting to the impacts of global warming in the coming decades. You can find the blog here if you’re interested.

Although the various schemes and forums that have been set up will no doubt broaden the reach of the insurance industry through government risk pools, or through micro insurance, many are quite new and innovative for the industry. Some of the new successful options are built as bespoke parametric insurance products that can be very quick to pay out since they are based solely on a defined parameter. The main thing required for such a product is a reliable dataset upon which to build a relationship to losses or costs.

Some financial support for African farmers

A very relevant example, due to this year’s strong El Nino (see another of my company blogs here) and droughts in some parts of Africa (Figure 1), is the African Risk Capacity (ARC).

Figure 1:  Pumping well-water from a borehole in the village of Bilinyang, near Juba, South Sudan. Source: World Bank/Arne Hoel

The ARC approach uses a drought index and an agreed precipitation threshold which triggers payouts to help small farmers via their governments. The number of countries signed up across Africa is growing year on year, and it is a great example of the insurance sector efficiently providing financial stability for farmers who would otherwise be at risk of losing their livelihoods in extreme drought. According to the ARC website, an analysis by Boston Consulting Group for ARC showed that the potential benefit of running the scheme is 4.4 times the cost of emergency response in times of drought.

Basically, every one dollar spent through ARC saves four dollars and forty cents in emergency response costs, and the money through ARC will be where it needs to be in a matter of days, rather than the weeks and months it can take for governments to reassign funds or wait for international aid. This is a great example of the financial sector providing a cushion against climatic impacts which may well get worse in the future.
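The payout logic of a parametric product like this can be sketched in a few lines. This is a toy illustration only: real ARC contracts are built on a calibrated drought model, and the rainfall trigger, exit point and payout cap below are entirely invented:

```python
# Toy sketch of a parametric drought trigger. Payouts depend only on the
# observed index (here, seasonal rainfall), not on assessed losses, which is
# why parametric products can pay out within days.
def parametric_payout(seasonal_rainfall_mm, trigger_mm=300.0,
                      exit_mm=150.0, max_payout_usd=10_000_000):
    """Payout scales linearly between the trigger and exit rainfall levels."""
    if seasonal_rainfall_mm >= trigger_mm:
        return 0.0                    # enough rain: no payout
    if seasonal_rainfall_mm <= exit_mm:
        return float(max_payout_usd)  # severe drought: full payout
    shortfall = (trigger_mm - seasonal_rainfall_mm) / (trigger_mm - exit_mm)
    return shortfall * max_payout_usd

print(parametric_payout(400))   # wet season: no payout
print(parametric_payout(225))   # halfway between trigger and exit: half payout
print(parametric_payout(100))   # severe drought: full payout
```

The appeal, and the main requirement mentioned above, is clear from the code: everything hinges on a reliable rainfall dataset and a well-chosen relationship between the index and actual losses (the so-called basis risk).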

Global initiatives from COP21

As I explain in my company blog, the UN Secretary-General Ban Ki-moon announced his climate resilience initiative named A2R, which stands for Anticipate, Absorb, Reshape. Much of the scientific endeavour of projecting climate change, and of understanding and providing early warnings for current climate extremes, broadly fits within the “Anticipate” pillar of the initiative. “Absorb” fits naturally with financial mechanisms, as well as building resilient infrastructure (see here for a link to a fellow MSc blogger) and mitigating actions to reduce CO2. And “Reshape” is again about resilience but with more focus on the future: building partnerships between the public and private sectors to foster sustainable growth and better decision-making for future infrastructure.

There are many complementary initiatives getting started in this push for resilience. One is a special Task Force on Climate-Related Financial Disclosures, chaired by Michael Bloomberg, who has been ardent in his support of building climate resilience, galvanized no doubt by having seen first-hand the impacts of severe weather on New York as mayor during Hurricane Sandy. This aligns well with a UN-endorsed initiative called the ‘1-in-100 initiative’, which aims to encourage companies to better assess and disclose their ‘tail’ risk (the risk of a 1% probability loss), giving them a financial incentive to become more resilient in order to attract investment.

Public/private sector partnerships

There seems to be a groundswell of activity in the private sector. While COP21 was underway I was invited, through my employer, to attend a meeting in Paris regarding climate resilience hosted by one of my company's clients. The hosts are a large international management, engineering and development consultancy firm, and are therefore interested in finding out how different industries and sectors are planning to approach the challenge that will befall us due to global warming. It was a Chatham House Rule session so I won’t go into any details, but the meeting involved delegates from the World Bank, the European Investment Bank, the Rockefeller Foundation, the Global Sustainability Institute and Anglia Ruskin University, and a senior professor from our very own UCL Geography Department, to name but a few - an interesting line-up indeed.

Discussions covered a number of topics from city resilience to financial stability with respect to climate change, but my general feel from the event was that there was a tangible motivation to deal with the future impacts of global warming sooner rather than later. There was a recognition that there is good business opportunity through building sustainable cities, and offering risk assessment products and services in areas that will see increasing climate risk.

I feel the key to building climate resilience is certainly to engage all parties, ideally within mutually beneficial partnerships. Initiatives such as the UN’s A2R or the Insurance Development Forum (IDF, also announced at COP21) can help. My own job is part of this too: I may have mentioned it before, but my MSc studies are part time, alongside my day job in the risk and (re)insurance sector. I work with business users and academics to try to match up their respective needs and capabilities, and work towards tangible outputs through research and internal client-related projects. An applied science coordinator/leader of sorts, coordinating a network of academic institutions working with my company on a wide range of risk-related subjects, including climate extremes.

There are also academic led partnerships such as the Engineering for Climate Extremes Partnership (ECEP) hosted by the National Centre for Atmospheric Research (NCAR) which aims to “strengthen society’s resilience to weather and climate extremes” (ECEP website: About). I also have a separate blog on this on my company website here. This vision can only truly be achieved through partnerships between the public and private sectors.

The power of partnerships is examined in the Stern Review, which also highlights the potential economic downsides of not adapting to climate change. Furthermore, a recent paper by Estrada et al. (2015) shows the economic costs of climate change in terms of hurricane damage. They estimate that 2 to 12% of the normalized losses from the busy 2005 hurricane season in the U.S. are attributable to climate change. It is an interesting finding, but since they also found an increase in both the frequency and intensity of storms in the geophysical data, where other papers have found only an increase in intensity, it seems worth exploring in a future blog.

In summary, it seems clear that the financial world certainly has a key part to play, and when fully committed to investing in new technology and research, it can act as a powerful driver for change in terms of building resilience and financial stability in the face of changing climate extremes.

Afterthought

I know this blog is supposed to be about storms, but I’m starting to realise just how much climate change is a multidisciplinary challenge and so to focus on one subject, one problem, or one solution can reduce our ability to bring together different expertise and opportunity.

I think it’s healthy to take a step back and look at the wider interaction of various adaptation and mitigation initiatives, and then perhaps work out how they can fit into your own area of expertise and capability to do something useful for society.


Sunday, 3 January 2016

Disastrous Return Periods

When talking about return periods, it’s easy to assume that a 1 in 100 year event will occur only once in 100 years. This may lead to misconceptions in the understanding of risk and, consequently, to poor decision-making. The stakes can be fairly high: misconceptions might affect whether or not you invest in flood protection for your home, or influence engineering and building code regulations when communicating with decision-makers. The reality is that a 1 in 100 year storm can happen once in 100 years, twice in the same 100 years, three times, or even not at all! At the risk of ranting, it strikes me as a hugely misleading communication tool which continues to pervade the risk management world, and communications in the general media today.

I am not alone in this. Francesco Serinaldi, an applied statistician at Newcastle University, wrote a paper in 2014 called: Dismissing Return Periods! Using an exclamation mark in the title of an academic paper gets a thumbs up from me! He goes into much more detail than I could on this subject, describing how univariate frequency analysis can be prone to misconceptions when return period terminology is used.

Serinaldi also suggests better alternatives for engineering and risk modelling applications: the probability of exceedance, and the risk of failure over the lifetime of a project (or perhaps the average life expectancy of a person). These provide more objective and robust quantifications of the frequency of specific events, or categories of events, defined by a parameter or index.

Perception of safety

Another example is described by Greg Holland in his blog on the Engineering for Climate Extremes Partnership (ECEP) website. Discussing Hurricane Katrina, he suggests that describing the levee protection as able to withstand a 1 in 100 year storm was misleading, evoking a false ‘sense of safety’. He elaborates (as I mentioned above) that a 1 in 100 year storm simply has a 1% chance of occurring in any given year (irrespective of climate variability). This means there is around a 65% chance that such a storm will occur in the next 100 years and, changing the time period, a 25-30% chance that it will occur within the next 30 years. That starts to concern a much wider group of stakeholders, including small businesses and home owners.
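Figures like Holland's follow from a one-line binomial calculation, assuming a fixed 1% annual probability and independence between years (a simplification, since it ignores climate variability):

```python
# Probability of seeing at least one "1-in-100-year" event over a planning
# horizon, given a fixed annual probability p and independent years.
def prob_at_least_one(p_annual, years):
    return 1 - (1 - p_annual) ** years

for n in (30, 100):
    print(f"P(at least one event in {n} years) = {prob_at_least_one(0.01, n):.1%}")
```

Under these assumptions the calculation gives roughly 26% over 30 years and 63% over 100 years, in line with the figures quoted above, and it is a far more honest statement of the risk than "once in 100 years".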

Return periods can also vary widely depending on the spatial scale of an event. The Great Storm of 1987 in the UK was reported as a 1 in 200 year event for many of the southern counties, whereas for parts of the south coast it has been assessed as more like 1 in 10 years! This is misleading in that the storm itself was large enough to cover a broad swath of land with severe impacts, yet within that one storm the return period estimates vary. It depends on how the calculations are conducted, which data are used, and the thresholds assigned to define the event.

One generalised return period statistic is not adequate to clearly describe the risks associated with a storm. The more objective methods suggested by Serinaldi are an alternative for engineering applications.

Public perception tangent…

As a bit of a tangent, this has made me think about public communications too. I think that the use of analogues or a narrative, to try to recreate conditions in a viewer’s mind using past experiences, is very powerful in changing perceptions and behaviour, much more so than a misleading return period estimate. I find it fascinating how perception can change based on storytelling: one thing at which humans have always excelled.

An interesting paper by Lowe et al (2006) studies the effects of blockbuster movies such as the “The Day After Tomorrow” which can act to skew perception of risk, but also increase motivation to act on climate change and sensitize the viewer. The paper also notes a lack of knowledge on how to use this new found Hollywood-induced motivation. This is an interesting area of research in its own right.

Too many blog subjects, not enough time!

Tuesday, 29 December 2015

Will GCMs really tell us everything we need to know about climate change?

In a previous blog, I discussed General Circulation Models (GCMs) at varying resolutions.

Here, I’ll highlight a few limitations, especially when looking at tropical cyclones.

Even though GCMs are able to capture tropical cyclone tracks and storm formation to provide hugely valuable forecasts for public safety, we should be aware of their limitations when looking at climate-scale variability and change. For example, looking seasons or years ahead into a climate projection, GCMs have less ability to say how many storms there might be and how intense. Hurricane season forecasts are put together using a variety of statistical and GCM-based techniques, and we can get a lot of value from both approaches. But there is only so much that we can say.

However, papers by Deser et al. 2012 and Done et al. 2014 are useful in determining what can be explained on a seasonal or decadal time-scale. James Done found, based on one season, that his regional climate model experiments show around 40% of the variability in tropical cyclone frequency in the North Atlantic is simply natural variability, and not associated with forcing from greenhouse gases, volcanoes, aerosols or solar variability (external forcing). He notes from Deser et al. 2012 that, at regional scales, internal variability can become greater than externally forced variability. This also highlights the difficulty of assigning a single regional event to changes in climate on a global scale.

To sum up, GCMs:
  • as numerical weather prediction models, offer great ability to provide operational forecasts and warnings on a day-to-day basis;
  • as global/regional climate models, allow us to experiment with the atmosphere and explore sensitivities in the processes that bring about extremes of climate, global climate variability or climate change.


When looking at seasonal or longer timescales, GCMs run at lower resolution and so lose the ability to capture the small-scale features that drive tropical cyclones; we therefore have to model the large-scale influences and look at more general shifts in the probabilities of single or seasonal phenomena (e.g. hurricanes or droughts).

Deser et al. 2012 also calls for greater dialogue between science and policy/decision-makers to improve communication and avoid raising expectations of regional climate predictions. I totally agree. Better communication between scientists and stakeholders is important because talking about storms and climate change is highly political. Poor communication can lead to gross misrepresentations by those aiming to mitigate and adapt to climate change, as well as those who do not accept that climate change is a concern.

Future for GCMs?

I can see how GCMs have great ability in helping us understand the sensitivities of the climate system, and as they improve and as computing power increases (along with big data solutions), so too should our understanding of various climate processes. In fact, growth in GCM capabilities may well increase the level of uncertainty as we start to model more and more complexity. I do wonder where the next big step will be, though. Between CMIP3 and CMIP5 (two rounds of climate model comparison projects – see previous blog), Bellenger et al. (2014) showed some progress, but also commented that overall there were limited improvements in how ENSO (a dominant mode of climate variability) is characterised.

An interesting article here by Shackley et al., back in 1998, called “Uncertainty, Complexity and Concepts of Good Science in Climate Change Modeling: Are GCMs the Best Tools?”, raises a range of interesting discussion points asking whether GCM-based climate science is actually the best approach from a number of perspectives. Are there alternative types of models that could allow us to better engage with the public, with policy makers or with the private sector? There are certainly alternatives that show promise, as discussed on the blog of Judith Curry, who is of the opinion that climate modelling is in a “big expensive rut.” I hope I can find time to expand on this interesting topic here.


Personally, I am a big fan of GCMs. It's amazing that they can represent the atmosphere with such high fidelity, but it's good to ask these questions and not forget alternative approaches which may be much more practical and 'fit-for-purpose' in particular situations.

In a future blog, I’ll discuss a little about how we talk about probability of future events, and then follow on with a blog on how we currently stand on tropical cyclones and climate change. 

Saturday, 26 December 2015

A Model Family

Many of my recent blogs have been quite focussed on the past. It seems clear that we have a few useful methods that can help us understand storm frequency, with less certainty on how severe they have been. As powerful as palaeotempestology might be, it is sadly unlikely to be able to provide enough data for us to truly compare the climate proxy outputs at the fidelity with which we have been observing storms in the last 100 or so years, especially since we began to use satellites to observe the weather.

However, as an ex-professional in the world of weather forecasting, I often get asked about the chances of a certain intensity of storm occurring. Could we see another Hurricane Katrina? Will the Philippines see another Typhoon Haiyan? Or, closer to home (the UK), when will we see another Great Storm of 1987 (aka 87J)? Of course, these questions are difficult to answer, unless a storm of similar characteristics is starting to form and is picked up in numerical weather prediction models such as the UK Met Office’s Unified Model (UM) or the U.S. NOAA’s Global Forecast System (GFS) (there are many more).

This blog will talk a little about what I know of the types of models that are based on physical laws at work in the atmosphere and oceans, and take super computers bigger than my flat (not saying much) to run.

General Circulation Modelling – the granddaddy of physical modelling

General Circulation Models (GCMs) focus on the actual physical dynamics of the atmosphere, modelling them by building a system of grid cells (Lego-like blocks) which exchange momentum and heat with one another. The size of these grid cells defines the scale of the weather phenomena that can be modelled.

However, there is a trade-off between three facets of a GCM configuration. With limited computing resources, a balance must be struck between complexity (the physics included in the model, in the actual lines of code), resolution (the size of the grid cells) and run-length (how much time the model represents, i.e. into the future, or perhaps a period in the past). Basically, climate models use Duplo bricks and high-resolution models use normal Lego bricks. The analogy also works because they can fit together nicely (Figure 1).

Figure 1: Larger Duplo (climate models) bricks and smaller Lego (weather forecasting models) bricks working together. Source: Wiki Commons Contributor: Kalsbricks
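To get a feel for that three-way trade-off, here is a toy back-of-the-envelope cost model (my own illustration, not how any real modelling centre budgets its runs). It assumes compute cost scales with the number of horizontal grid cells multiplied by the number of timesteps, and that halving the grid spacing also halves the stable timestep (the CFL condition).

```python
# Toy illustration (not a real GCM costing model): relative compute cost
# of a model run. Halving the grid spacing quadruples the number of
# horizontal cells and (via the CFL stability condition) roughly halves
# the timestep, so refining resolution is far more expensive than it looks.

def relative_cost(grid_km, run_days, baseline_km=100, baseline_days=10):
    """Cost relative to a baseline 100 km, 10-day configuration."""
    refinement = baseline_km / grid_km                    # horizontal refinement factor
    cells = refinement ** 2                               # 2-D horizontal grid
    timesteps = refinement * (run_days / baseline_days)   # CFL + run length
    return cells * timesteps

# A 10 km weather forecast for 10 days vs a 100 km climate run for 100 years:
weather = relative_cost(grid_km=10, run_days=10)
climate = relative_cost(grid_km=100, run_days=365 * 100)
print(f"weather config: {weather:.0f}x baseline")   # → 1000x baseline
print(f"climate config: {climate:.0f}x baseline")   # → 3650x baseline
```

Under these (simplified) assumptions, a short high-resolution forecast and a century-long coarse climate run cost the same order of magnitude, which is exactly why the Duplo/Lego split exists.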

I wonder what type of modelling is analogous to Meccano? Thoughts on a postcard, please, or in the comments section below.

In case you were wondering, the Lego analogy came about since that's what I bought my three year old nephew, Harry, for Christmas. The present that keeps on giving! Merry Christmas by the way!

Lego Bricks

High-resolution configurations of some of the big GCMs have been built that can, for example, capture the small-scale eddies around the headlands of the Isle of Wight in the UK (by the Met Office during its involvement in the London 2012 Olympics). Models with grid spacing on the order of a few hundred metres are used for this detailed work and are run over a very small region.

Another example of high-resolution modelling: a regional model was employed to reanalyse Cyclone Megi from 2010, which had one of the lowest central pressures ever recorded. The comparison shows satellite imagery alongside a model run (by Stuart Webster at the Met Office) with amazing detail in the eye structure and outer bands of convection. Given the presentation of the model data, the two are difficult for the untrained eye to distinguish (Figure 2).


Figure 2: Cyclone Megi simulation (top) showing the eye-wall and convective bands, compared with the real storm's location and overall size in a satellite image from MT-SAT 2. Source: Met Office.

Duplo bricks

GCMs traditionally struggle to match the intensity of storms in climate model configurations, as described in the IPCC AR5 chapter on the evaluation of climate models (IPCC WG1 AR5: 9.5.4.3), but examples such as the Met Office’s Cyclone Megi run, and other models with resolutions of 100 km or so, show that the science is still able to model many features of tropical cyclone evolution.

GCMs are also used to model the large-scale planetary interactions that govern phenomena such as ENSO, which are captured well by the selection of models used in the Coupled Model Inter-comparison Project (CMIP). CMIP is currently on its fifth incarnation, CMIP5, which is used by the IPCC to understand future climate change. This paper by Bellenger et al. (2015) shows some of the progress made between CMIP versions in recent years; however, due to their similar ability to represent large-scale features when examining ENSO, both CMIP3 and CMIP5 models can be used in conjunction as a broader comparison.

Assembling the ensemble

The “ensemble” is a technique in which a model is run multiple times with slightly different starting conditions to capture a range of uncertainty in the outputs. No model is perfect, so their products shouldn't be taken at face value, but ensembles can help by showing the range of possibilities as we try to represent what we don't know in the input data.
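The ensemble idea can be demonstrated with a toy chaotic system rather than a full GCM. Below is a sketch using the classic Lorenz (1963) equations: twenty members start from almost identical states, and the spread that develops between them is the ensemble's measure of uncertainty. The parameter values are the standard textbook ones; nothing here is a real forecast system.

```python
import numpy as np

# A minimal ensemble sketch using the Lorenz (1963) system -- a classic
# toy model of atmospheric chaos, not a real GCM. Each member starts from
# a slightly perturbed initial state; the spread across members gives a
# feel for how forecast uncertainty grows.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

rng = np.random.default_rng(42)
n_members, n_steps = 20, 1500

# All members start near the same 'analysis', with tiny perturbations
members = np.tile([1.0, 1.0, 20.0], (n_members, 1)) + rng.normal(0, 1e-3, (n_members, 3))

for _ in range(n_steps):
    members = np.array([lorenz_step(m) for m in members])

# The spread in x across members is our (toy) measure of uncertainty;
# it grows from 0.001 to something of the order of the attractor itself.
print(f"ensemble mean x: {members[:, 0].mean():.2f}")
print(f"ensemble spread (std dev) in x: {members[:, 0].std():.2f}")
```

The design point is that the perturbations are tiny compared to the state, yet the members still diverge: that divergence, not any single member, is the forecast information.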

This addresses some of the observational uncertainty. A GCM's starting point is based on the network of observations connected up throughout the world and standardised by the World Meteorological Organisation (WMO) for weather forecasting. These observations include ground-based observations (manual and automatic), radar imagery of precipitation, satellite images, aircraft reconnaissance (with tropical cyclones), sea surface readings and weather balloon ascents (and more), which are all assimilated into an initial condition and gradually stepped forward in time by the gridded global model. The starting point is also called ‘the initialisation’ in a forecasting model. For climate models the starting point can be the current climate, or whatever version of the climate is relevant to the experimental design.

Regardless of how a model is started on its time-stepping through a defined period, ensembles provide an idea of the range of possible outcomes through minor perturbations in the observed conditions, or even in how certain physical processes are handled (i.e. through different parameterisation schemes for features too small to be represented at a given resolution). In my forecasting days at the Met Office, looking at the solutions from a variety of the world’s big weather modelling organisations (NOAA, Met Office, ECMWF, JMA) was colloquially termed ‘a poor man’s ensemble’, as normally an ensemble will consist of many tens of solutions. A similar concept, although not using GCMs, is found in risk modelling applications such as catastrophe loss modelling, where many tens of thousands of simulations are performed to try to statistically represent extreme events, using extreme value theory and statistical fits to the rare events on a probability distribution. A useful paper reviewing loss modelling methods for hurricanes is by Watson et al. (2004).

And the weather today...

So numerical weather prediction models used for day-to-day forecasting are run at high resolution and high complexity, but can only go a week or so into the future. Their accuracy has improved greatly in the last few decades: a forecast for three days ahead is now as accurate as a forecast for one day ahead was in the 1980s, according to the Met Office. Below (Figure 3) is the European Centre for Medium-Range Weather Forecasts' (ECMWF) verification of different forecast ranges over the decades.


Figure 3: ECMWF’s verification scores for a range of forecast ranges. Source: ECMWF.
Climate models on the other hand are run with lower complexity and lower resolution, allowing them to be run out to represent decades. Since large scale climate modes such as ENSO (or the AMO, or MJO, or many others) can influence storm activity, intensity and track, GCMs are invaluable tools in helping us understand the broader climate, as well as the small-scale processes.


Basically, GCMs can be run at different resolutions with different input data depending on the application (e.g. weather forecasting or climate experimentation). The available computing power dictates how these model configurations perform and the range at which they can produce outputs in a reasonable run time. They have developed into the key tool for understanding our weather and climate, and their interactions with the Earth’s surface (via other modelling approaches such as land surface models or ocean circulation models).

Wednesday, 23 December 2015

A Vanishing Sea of Toxic Dust Storms


In the last Climate Change MSc lecture of 2015, a case study was presented regarding the changes that have happened in a relatively short space of time in the Aral Sea, on the border between Kazakhstan and Uzbekistan. Figure 1 below clearly shows the reduction in the area covered by water in the sequence which runs from 2000 to 2015.


Figure 1: Aral Sea satellite image sequence from 2000 to 2015 (looping). The black outline is the approximate lake shoreline in 1960. Source: Constructed animating gif from NASA Earth Observatory images.

Otherwise known as the ‘Sea of Islands’, this endorheic sea was once the fourth largest inland sea in the world, and allowed fishing communities and agriculture to sustain themselves for decades in the first half of the 20th century. As an endorheic sea (meaning no outflow to the ocean) it acts as a terminus for the surrounding hydrological systems, and is also termed a terminal lake. Terminal seas and lakes such as this are very sensitive to changes in climate, for example through changes in evaporation rates. In fact, the Aral Sea has undergone cycles of drying out and filling up over the past ten thousand years (Micklin 2007).

Another picture, Figure 2 (cited by an article on the Aral Sea Crisis by Columbia University), shows some older images than Figure 1, which highlight the longer-term reduction.
Figure 2: Clear reduction in the Aral Sea. When combined with Figure 1 we see the extremes of the reduction in water surface area. Source: http://www.envis.maharashtra.gov.in, cited by Thompson 2008


The main cause of this reduction was the development of the Karakum Canal, built for agricultural irrigation, shipping and fisheries to allow for the economic development of Turkmenistan. It was started in 1954 and completed in 1988. It has enabled huge areas of Turkmenistan to be committed to high-intensity agriculture, essentially draining the Aral Sea of water.

The reason for this huge engineering endeavour was the farming of cotton. Cotton, nicknamed ‘white gold’, requires a huge amount of water. To make matters worse, the engineering practices used to construct the canal allowed around 50% of the water to be lost to the ground and to evaporation.

Impact

Micklin noted the reduction in water surface area to be around 75%, and the lake level reduction to be around 23 to 30 metres (Glantz 2007), which led to a volume reduction of 90% and an increase in salinity of over an order of magnitude, from 10 g/l to over 100 g/l. This led to tragic and severe impacts on the local ecosystems, mainly fish species, as well as enhancing the frequency of dust storms (Glantz 1999, cited by an article by Thompson in 2008 on the Columbia University website). These impacts devastated local communities and made the area extremely inhospitable.
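As a quick sanity check on those figures: if we assume the total mass of dissolved salt stayed roughly constant while the water evaporated, salinity should scale inversely with the remaining volume, so a 90% volume loss and a tenfold salinity rise are mutually consistent. A minimal sketch of that back-of-envelope reasoning:

```python
# Back-of-envelope consistency check on the reported Aral Sea numbers,
# assuming the dissolved salt mass is roughly conserved as water
# evaporates: salinity scales inversely with the remaining volume.

initial_salinity_g_per_l = 10.0
volume_reduction = 0.90                  # 90% of the volume lost
remaining_fraction = 1.0 - volume_reduction

final_salinity = initial_salinity_g_per_l / remaining_fraction
print(f"expected salinity after 90% volume loss: {final_salinity:.0f} g/l")
# → expected salinity after 90% volume loss: 100 g/l
```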

The knock-on impacts on local communities and industries are numerous. Obviously, the fishing industry has been decimated by the increasing salinity, and agricultural practices are now hampered by the loss of water resources. Mammals and birds have also seen a sharp decline in species diversity: from 1960 to 2007, the area lost roughly half of its species (Micklin 2007).

The other major impact of over 36,000 km2 (Wiggs et al. 2003) of dusty seabed being exposed is that there is now a large source of extra dust available to be picked up by the winds and, on occasion, whipped into dust storms (Figure 3); roughly ten occur in the region per year (Glantz 1999, cited by Thompson 2008).



Figure 3: Dust storms on the coast of the Aral Sea in May 2007 (Source: NASA)

Agricultural waste products containing pesticides, insecticides, herbicides and fertilisers drained into the sea and accumulated over time; once the sea dried, they became baked into the exposed sediments. The desiccated land surface also potentially contains remnants from the Soviet Union's biological warfare testing in the 1950s, including anthrax, just waiting to be transported around by aeolian processes. Vozrozhdeniye Island, also known as Resurrection Island, remains a controversial subject as it was one of the chief locations for such testing.

Wiggs et al. (2003) studied the link between aeolian dust and child health in the populations close to the Aral Sea, and found associations with respiratory illness in local populations, although there are significant long-distance sources of dust in the region too. Micklin (2007) also confirmed this negative impact on human health and agriculture in the wider area from dust storms that can grow to be 500 km in size.

Climate perspective

Although the case of the Aral Sea’s reduction is an extreme example, it seems fair to assume that endorheic lakes will come under pressure due to global climate change (Timms 2005), whether there is significant human influence or not. The Aral Sea has suffered a two-pronged attack of regional warming and agricultural exploitation and over-use. Strategies to preserve the remaining water in the North Aral Sea through damming projects, after the sea split into two basins in 1987, seem to be successful, which will enable the communities in the area to hold on to their way of life to an extent.

The former majesty of the larger portion of the Aral Sea (the Big Aral) now seems to resemble no more than a salty (and toxic) dust bowl, its former islands parched monuments to the impact of cotton farming and, to a lesser but still significant extent, climate change (Aus der Beek et al. 2011). The region will only come under more pressure if water resources become scarcer in the area, linked to global warming and high evapotranspiration rates.

Small et al. (2001) examined how the desiccation of such a large area through excessive irrigation has modified the sea surface temperatures, precipitation regimes and the hydrological cycle in the area. I wonder if the original plans to build the Karakum Canal took any of these knock-on effects into consideration.

To end, I’ll post this interactive storymap hosted by Esri, which highlights some human-induced change since 1990 using Landsat satellite imagery from NASA. The first example is the Aral Sea, and you can see again, by swiping the dividing line, how the lake has undergone a dramatic and rapid drying out in the last 25 years. The other pages of the map show cases of anthropogenic land use change from urban expansion, damming, land reclamation, and agriculture.



*UPDATE*
My brother's comment below makes a very good point: such a sad story now serves as an evocative reminder of the impact of human over-exploitation of the environment. This reminder should be documented as it happens, not only in scientific literature, but in art too. We are both keen photographers, so I thought I'd add this link to herwigphoto.com's Aral Sea project. Some amazing and poignant images.

Saturday, 12 December 2015

Palaeotempestology: Tree rings

In my last blog, I explored how the layers of calcium carbonate, which build up as a coral skeleton grows, can be used as a climate proxy. We can find a similar process by looking at tree rings. One of the more established practices in palaeoclimatology is dendroclimatology (the use of tree rings to study past climates). Like other palaeoclimatological proxies, it allows us to extend the range of our observational record beyond that of conventional weather recording instrumentation.

Just as corals live for hundreds of years (sometimes over a thousand years), trees can keep on recording the composition of the atmosphere in their layers of cellulose for many hundreds of years, and beyond when fossilised. Figure 1 below shows an example of Huon pine samples ready for analysis, each dark line denoting a season of growth.

Figure 1: Huon Pine ready for analysis. Source: Edward Cook, Lamont-Doherty Earth Observatory, Columbia University, Palisades, NY

Isotopic differences

Ancient pines are often the favoured study subjects due to their longevity. They can give annual or seasonal information on atmospheric composition. To extend the record beyond a single sample, a variety of sources can be combined together using distinctive signatures as shown in Figure 2 below.
Figure 2: Sources of tree ring data showing how various samples can be linked together. Source: Laboratory of Tree-Ring Research, The University of Arizona

The main process that allows us to look at past storms is the fractionation of stable oxygen isotopes through condensation and evaporation. As I touched upon in my previous blog about corals, it is the difference in atomic mass between the heavier oxygen-18 isotope and the lighter oxygen-16 isotope that allows us to glean clues about past climate events from tree cores.

The difference in atomic mass between oxygen isotopes comes from the number of neutrons in the atomic structure. The most common natural isotope is oxygen-16 (over 99% of atmospheric oxygen), which has 8 protons and 8 neutrons (electrons are virtually weightless by comparison), but stable oxygen atoms can also have 9 or 10 neutrons, making up the other isotopes we find useful for palaeoclimatology. As mentioned before, water molecules containing the lighter oxygen-16 are preferentially evaporated, while water molecules containing the heavier oxygen-18 tend to condense and form clouds or precipitation more easily. It is this property that allows us to identify different sources of precipitation in tree ring samples.
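To make the notation concrete, isotope ratios are usually reported in 'delta' notation (δ18O): the sample's 18O/16O ratio relative to a reference standard, in parts per thousand. Below is a small sketch; the VSMOW reference ratio is the commonly quoted value, while the sample ratio is invented for illustration.

```python
# How delta notation works, in code. d18O expresses the 18O/16O ratio of
# a sample relative to a standard (VSMOW), in parts per thousand (per mil).
# The VSMOW ratio is the commonly quoted reference value; the sample
# ratio below is made up purely for illustration.

R_VSMOW = 0.0020052   # 18O/16O of Vienna Standard Mean Ocean Water

def delta_18O(r_sample, r_standard=R_VSMOW):
    """Return d18O in per mil (parts per thousand)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Tropical-cyclone rain is strongly depleted in 18O, so its ratio is
# lower than the standard and d18O comes out negative:
depleted_ratio = R_VSMOW * 0.988   # hypothetical 1.2% depletion
print(f"d18O = {delta_18O(depleted_ratio):.1f} per mil")
# → d18O = -12.0 per mil
```

Negative δ18O values thus indicate depletion relative to ocean water, which is why unusually low values in a tree ring can flag a heavy, efficiently condensed rainfall event.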

In extreme precipitation events associated with tropical cyclones, the level of oxygen-18 depletion in the rain water is high, due to the highly efficient process of forming precipitation via condensation in the core of a tropical cyclone (Lawrence 1998; Munksgaard et al. 2015). In Lawrence’s paper, five tropical cyclones that made landfall in Texas, U.S., were studied. They showed much lower oxygen-18 to oxygen-16 ratios (or δ18O) for tropical cyclones than for normal summer convective storms.

This finding was further corroborated by a study of Hurricane Olivia by Lawrence et al. in 2002. Tropical cyclones are also large and long-lived, creating vast areas of precipitation that can stay in the water system for weeks and giving different isotopic characteristics associated with the location of the heaviest rain bands and the storm centre (Munksgaard et al. 2015). Deep soil water can remain unaffected by normal summer rainfall and, in the absence of further heavy rain events, is taken up by trees (Tang and Feng, 2001).

Oxygen isotope analysis therefore seems to be the favoured form of tree ring analysis for palaeotempestology.

Tapping the potential

Upon learning about these methods, it seems reasonable to assume that different intensities and characters of storms will result in different levels of oxygen-18 depletion. However, there is likely to be much uncertainty in inferring a storm's intensity from isotope fractionation (I'll keep looking for more research on this). At the moment, that uncertainty may preclude a reliable intensity measure of past storms using this approach.

The uptake of oxygen isotopes into a tree's structure depends on many factors, including biological processes that vary with species, tree age, exposure to the storm, and soil composition. Growth cycles are also taken into account, to limit the degree to which the mismatch between growth season and storm season can cloud useful information.

In the North Atlantic basin, for example, the hurricane season runs from early June to late November and as such overlaps mainly with the latewood (as opposed to earlywood) growing phase. It is therefore these sections of the tree rings which are focussed upon for palaeotempestological studies.

Miller et al. (2006) presented the emerging case for using oxygen isotopes more widely after the devastation left behind by the busy 2004 and 2005 hurricane seasons, building a 220-year record that identifies past storms from unusually low oxygen-18 isotope ratios in pine forests. This is potentially very useful for engineering and loss modelling concerns.

“Can’t see the wood for the trees”

There are many uncertainties in the application of tree ring data to palaeoclimatology, let alone palaeotempestology, as summarised in the review paper by Sternberg et al. in 2009: complex cellulose uptake biology, changes in the isotopic composition of soil water, and assumptions about the relationship between leaf temperature and ambient temperature.

However, every study adds to the wealth of information, and since each site represents a single-location slice through time, the science of dendroclimatology will only benefit from new data. There still seems to be a push to collect and analyse more: the National Climatic Data Center, hosted by NOAA, is a fount of old and recent tree ring datasets.

A recent review by Schubert and Jahren, published in October this year (2015), takes a wide view. It aims to unify tree ring data sets to bring together a global picture of past extreme precipitation events based on low oxygen-18 isotope records. They conducted 5 new surveys and used 28 sites from the literature to create a relationship, using seasonal temperature and precipitation, which can explain most of the isotopic oxygen ratio in tree cellulose. This seems to be a step up in resolution, as looking at seasonal variations rather than annual cycles may be a step closer to identifying individual storms or storm clusters in tree ring data. It is interesting to see a comment in the conclusion of this paper that much of the uncertainty still remaining in this link is derived from disturbances, such as storms.


Figure 3: Comparison between measured δ18O in the cellulose of studied trees and the δ18O calculated using the model developed by Schubert and Jahren, which uses known climate characteristics. It shows a good correlation relating seasonal temperature and precipitation to oxygen-18 isotope ratios. Source: Schubert and Jahren, 2015

It seems clear that it would be much more difficult to develop a simple equation relating the extremes of the isotopic ratio chronologies to extreme storms. However, Schubert and Jahren seem to have taken a step forward while remaining focussed on average seasonal conditions. Nevertheless, I can’t help but wonder if there is a way for extreme events to be linked in somehow.

Alternatives to isotopes

When looking specifically at past storms in tree rings, I did find a couple of other approaches to using tree ring data that may also be worth a mention.

Firstly, an interesting pair of papers by Akachuka, one in 1991 and another in 1993, examined trees that had been forced to lean by a hurricane, assessing how they recover from such disturbances for any extra clues this may provide. Although the papers do not look specifically at characterising the storms themselves (i.e. there is no wind speed to bole displacement relationship), I couldn’t help but wonder whether there is extra information to be gathered from these trees and whether we could build a relationship to specific storms or storm seasons.

Another paper, by Sheppard et al. in 2005, looks at the effect of a 1992 tornado on a specific dendrochronology and re-evaluates the pre-historical record from wood samples retrieved from an 11th-century ruin in Arizona, searching for similar patterns in wood growth (see Figure 2 for a conceptualisation). Unfortunately, the patterns caused by the 1992 tornado were not replicated in the ring patterns of the 11th-century sample. This is certainly interesting work, but I imagine that finding enough data from trees that are damaged by tornadoes yet survive is not easy, especially when comparing to single older samples.

Conclusions

Although individual studies using tree lean or damage from specific events like tornadoes are interesting and worthwhile academic endeavours that help us understand how storms of various scales impact tree growth, they do seem somewhat less applicable to thinking about climate change and how the frequency and severity of storms are changing over a wide area.

With so many subtleties based on factors such as tree species or the topography of a study site, I feel that the broader synthesis approaches (as per Schubert and Jahren above) using stable oxygen isotopes offer greater immediate potential for aiding our understanding of past changes in storm activity, with possible application to risk assessments and projecting the impacts of future climate change.

Saturday, 28 November 2015

Palaeotempestology: Lake sediment records

Digging in to sediment records

With continued debate among scientists on exactly how future climate change will affect storm frequency and severity, it seems logical to see if we can find out more about variability in storm activity from the past.

Lake sediments are extremely useful in studying past climates, for which we have no observational record (through conventional weather recording equipment). They provide a slice through time to look at the changes in lake chemistry and environmental activity affecting the make up of suspended particles in the lake that eventually settle at the bottom.

Radiocarbon dating, thickness of layers of different sediments, analysis of diatoms and inference from the occasional break in the record (a hiatus, perhaps due to the drying out of a lake), are various ways in which lake sediments can give us clues about the past.

Within this range of different approaches there are a few ways in which sediments from lakes can be used to look at past storm events. In my previous blog, I highlighted a paper by Dr Jeff Donnelly et al. in 2015 entitled “Climate forcing of unprecedented intense-hurricane activity in the last 2000 years”. It presents a history of storm events over the past two thousand years, using an analysis of sediment grain size in the collected samples, with a resolution of around one year. The work uses evidence gathered from fieldwork during the project (and previous studies) to determine two distinct periods of higher activity in severe hurricanes along the western North Atlantic coastline of North America: one between 1400 and 1675 C.E., and another period of high storm frequency further back in time, between 250 and 1150 C.E.

The study location is a place called Salt Pond, in Massachusetts. It has a tidal inlet linking it to the ocean, filling it with brackish water. This proximity to the ocean means that the pond is exposed to ‘overwash’ during storm surge events associated with large storms heading northwards along the Eastern Seaboard of the United States. These salt water incursions occur when the storm surge is higher than any natural or man-made defences. The ‘overwash’ leaves ‘coarse grain event beds’, which can be used as an indicator of severe storm activity. The process is validated using known hurricane landfalls, which are represented in the sediment records and act as ‘anchors’ to verify that the samples are valid.
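As a toy illustration of the event-bed idea (the numbers and threshold below are invented, not the authors' actual data or method), coarse overwash layers show up as grain-size spikes against the fine organic background mud in a core:

```python
import numpy as np

# Simplified sketch of flagging coarse-grain event beds in a core:
# background organic mud has a small mean grain size, and overwash
# layers appear as coarse spikes. Data and threshold are invented for
# illustration; real studies use sieving/imaging and site-specific
# calibration against historically documented hurricane landfalls.

depth_cm = np.arange(0, 20)
# Mean grain size (microns) down-core: mostly fine mud, two coarse spikes
grain_size = np.array([30, 32, 31, 180, 33, 30, 29, 31, 30, 210,
                       34, 31, 30, 32, 29, 30, 31, 33, 30, 31])

threshold = 100   # microns; hypothetical cut-off separating mud from sand
event_depths = depth_cm[grain_size > threshold]
print(f"possible event beds at depths (cm): {event_depths.tolist()}")
# → possible event beds at depths (cm): [3, 9]
```

With depth converted to age (via the dating markers discussed below), each flagged depth becomes a candidate storm event in the chronology.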

The study builds on a number of papers that were produced after a workshop on Atlantic palaeohurricane reconstructions convened in 2001 at the University of South Carolina, which aimed to identify new opportunities in the field of palaeotempestology. A summary of the workshop can be found here. Dr Jeff Donnelly and colleagues studied a number of lakes in the Northeast of the US, in New Jersey and New England, and so, to learn a bit about the methodology, I dug into some of the papers in more depth.


Getting your hands dirty

It seems the only way to get at the clues available from sediment records is to get your hands dirty. I found an earlier paper by Donnelly et al. from 2001 which built a 700-year sediment record of severe storms in New England. This paper (and a couple more: Boldt et al. 2010; Liu and Fearn, 2000) started to show me that each project strategy is subtly different.

Sampling schemes are planned around the conditions of each study site, to find the best locations for sampling overwash areas in a consistent manner. The aim is to consistently capture the process by which more intense storms erode more sand from the coastal beach and bring this coarse sediment into the brackish lakes and ponds; larger storms are assumed to produce wider fans of overwash sand deposits, thicker near the shore and thinner near the centre of the study lake. A range of samples should be taken to try to represent the range of possible characteristics of past intense storms. Figure 1 (below) is a hypothetical diagram from Liu and Fearn (2000) showing various patterns of deposition. Note the radial patterns associated with the various directions of storm approach, with the larger fans associated with more intense storms.

Figure 1: Hypothetical coarse grain deposition fans in severe storm surge events. Source: Liu and Fearn, 2000 
The coarse sand creates a layer over the more usual organic-based deposits that settle on the bottom of a lake as a stratified layer. This happens most effectively in anoxic lake beds (lacking dissolved oxygen), since any mixing from plant or animal life will be minimal.

Having never been in the field to collect sediment samples, I found it interesting to see how Donnelly et al. (and other teams) maintained a consistent chronology in the sediment records. They took multiple samples and used a variety of methods to build their chronology.


Markers in time

Isotopic radiocarbon dating and stratigraphic markers are used to mark certain control points that validate the data. Pollution horizons are useful in this respect: for example, lead concentrations mark the beginning of the industrial revolution, as lead quickly made its way into water systems and lakes and was then 'fixed' by anoxic sediments. The presence of lead pollution is an indicator of the late 1800s (Donnelly et al. 2001), and another change occurs when lead was removed from gasoline in the 1970s and 1980s. This is a good example as it shows how these markers are useful for calibrating sediment records, in a way that is easily understood and recognised.

Pollen records can also mark certain points in history: for example, the European colonisation of the eastern U.S. led to large-scale clearance of vegetation for farmland, meaning that the pollen composition changes drastically (Russell et al. 1993).

Once these markers are established, historically documented storms are used to calibrate the record, and then earlier coarse grain event layers are identified and carbon dated.
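The age-depth reasoning above can be sketched as a simple interpolation between dated markers: known horizons (a radiocarbon date, the industrial lead horizon, the leaded-petrol phase-out) pin certain depths to certain years, and layers in between are dated by interpolating. The depths and ages below are invented for illustration; real chronologies use more sophisticated age-depth models with uncertainty estimates.

```python
import numpy as np

# Minimal age-depth model: dated markers pin depths to calendar years,
# and intermediate layers are dated by linear interpolation. All depths
# and ages here are hypothetical.

marker_depth_cm = np.array([5.0, 40.0, 120.0])     # shallow -> deep
marker_year = np.array([1975.0, 1880.0, 1400.0])   # young -> old

def layer_age(depth_cm):
    """Interpolate a calendar year for a given core depth."""
    return np.interp(depth_cm, marker_depth_cm, marker_year)

# Date a coarse-grain event bed found at 80 cm:
print(f"estimated age of event bed: {layer_age(80.0):.0f} CE")
# → estimated age of event bed: 1640 CE
```

The linear assumption is the weak point: it presumes a constant deposition rate between markers, which is exactly what hiatuses or sudden event beds can violate.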


Clear as mud?

So, having learned a lot more about sediment analysis in relation to palaeotempestology, I now have a greater respect for what these cores of old mud and sand can tell us about the past. However, it does seem to me that there is still a large degree of uncertainty in the data when trying to discern individual storms. For example, what if two storms occur in quick succession as a cluster, before a sediment layer has had a chance to settle and ‘lock in’ the information? This may end up looking like one larger or more intense storm, when actually it is the frequency of storms in that season which is varying. Donnelly et al. (2001) give an example from their study location of a lack of agreement between historical accounts of two intense storms, in 1635 and 1638, which likely created overwash signatures, yet only one event was indicated in the sediment proxy data. This means that the estimated frequencies may have significant uncertainty.

Also, the response of a lake or pond to overwash events may change over time due to changes in natural or man-made barriers. Even with these uncertainties in mind, however, it is still clear that there is great value in understanding the clues left behind by past storms in our coastal lake sediments.

Without any alternative information, the best that we can do is to piece together palaeotempestological proxies and glean snippets of information to build a longer record of storms.

It also provides grounds for comparison when using climate models to try to understand past variability, another subject I intend to explore in a future blog.

For now, I’ll leave you with an informational video by Ocean Today, in conjunction with the Smithsonian Institution and NOAA, made just after Hurricane Sandy in 2012, which will hopefully give a clear demonstration of what overwash looks like and how coastal beach material can be dragged across into lakes or ponds lying close to the ocean to give us these markers of past events.



My next blog will be on the evidence that can be derived from coral cores.