
Saturday, 26 December 2015

A Model Family

Many of my recent blogs have been quite focussed on the past. It seems clear that we have a few useful methods that can help us understand storm frequency, with less certainty on how severe past storms have been. As powerful as palaeotempestology might be, it is sadly unlikely to provide enough data for us to compare climate proxy outputs with the fidelity of the observations of the last 100 or so years, especially since we began using satellites to observe the weather.

However, as an ex-professional in the world of weather forecasting, I often get asked about the chances of a certain intensity of storm occurring: could we see another Hurricane Katrina, will the Philippines see another Typhoon Haiyan, or, closer to home (UK), when will we see another Great Storm of 1987 (aka 87J)? Of course, these questions are difficult to answer, unless a storm of similar characteristics is starting to form and is picked up by numerical weather prediction models such as the UK Met Office’s Unified Model (UM) or the U.S. NOAA’s Global Forecast System (GFS) (there are many more).

This blog will talk a little about what I know of the types of models that are based on the physical laws at work in the atmosphere and oceans, and that take supercomputers bigger than my flat (not saying much) to run.

General Circulation Modelling – the granddaddy of physical modelling

General Circulation Models (GCMs) focus on the actual physical dynamics of the atmosphere and model them by building a system of grid cells (Lego-like blocks) which talk to each other regarding momentum and heat exchanges. The size of these grid cells defines the scale of the weather phenomena that can be modelled.
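
To make the grid-cell picture concrete, here is a minimal toy sketch (nothing like a real GCM; the grid size, values and simple diffusion rule are all illustrative assumptions) in which each cell of a small grid exchanges heat with its neighbours at every time step:

```python
import numpy as np

# Toy sketch of grid cells "talking" to their neighbours. A real GCM
# solves the full equations of fluid motion; here we simply let heat
# diffuse between neighbouring cells on a small 2D grid.
nx, ny = 8, 8
T = np.full((nx, ny), 280.0)   # temperature field in kelvin
T[3:5, 3:5] = 300.0            # a warm anomaly in the middle
kappa = 0.1                    # exchange (diffusion) coefficient

for step in range(50):
    # Each cell exchanges heat with its four neighbours (periodic edges).
    neighbour_sum = (np.roll(T, 1, axis=0) + np.roll(T, -1, axis=0) +
                     np.roll(T, 1, axis=1) + np.roll(T, -1, axis=1))
    T = T + kappa * (neighbour_sum - 4 * T)

print(T.round(1))  # the anomaly has spread out and smoothed
```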

However, there is a trade-off between three facets of a GCM configuration. With limited computing resources, a balance must be struck between complexity (the physics included in the model in the actual lines of code), resolution (the size of the grid cells) and run-length (how much time the model represents, whether into the future or over a period in the past). Basically, climate models use Duplo bricks, and high-resolution models use normal Lego bricks. The analogy also works because they can fit together nicely (Figure 1).

Figure 1: Larger Duplo (climate models) bricks and smaller Lego (weather forecasting models) bricks working together. Source: Wiki Commons Contributor: Kalsbricks
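
To put rough numbers on that trade-off, here is a back-of-envelope sketch. It assumes, purely for illustration, that cost scales with the number of horizontal grid cells multiplied by the number of time steps, with the CFL stability condition tying the time step to the grid spacing:

```python
# Rough cost scaling behind the resolution / complexity / run-length
# trade-off. Illustrative assumption: cost scales with the number of
# horizontal grid cells times the number of time steps, and the CFL
# condition ties the time step to the grid spacing, so each halving of
# the spacing costs roughly 2 x 2 x 2 = 8 times more.

def relative_cost(dx_km, run_days, base_dx_km=100.0, base_days=1.0):
    """Cost relative to a 100 km grid run for 1 day (toy numbers only)."""
    refinement = base_dx_km / dx_km
    horizontal_cells = refinement ** 2  # more cells in both x and y
    time_steps = refinement            # shorter steps to stay stable (CFL)
    return horizontal_cells * time_steps * (run_days / base_days)

print(relative_cost(dx_km=10.0, run_days=7))          # high-res weather run
print(relative_cost(dx_km=100.0, run_days=365 * 30))  # coarse 30-year climate run
```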

I wonder what type of modelling is analogous to Meccano? Thoughts on a postcard, please, or in the comments section below.

In case you were wondering, the Lego analogy came about since that's what I bought my three year old nephew, Harry, for Christmas. The present that keeps on giving! Merry Christmas by the way!

Lego bricks

High-resolution model configurations of some of the big GCMs have been built that can, for example, capture the small-scale eddies around the headlands of the Isle of Wight in the UK (by the Met Office during their involvement in the London 2012 Olympics). Models with grid spacings of the order of a few hundred metres are used for this detailed work and are run over a very small region.

Another example of high-resolution modelling: a regional model was employed to reanalyse Typhoon Megi from 2010, which had one of the lowest central pressures ever recorded. The comparison shows satellite imagery alongside a model run (by Stuart Webster at the Met Office) with amazing detail of the eye structure and outer bands of convection. Because of the way the model data are presented, the two are difficult for the untrained eye to distinguish (Figure 2).


Figure 2: Typhoon Megi simulation (top) showing eye-wall and convective bands, compared to the similar location and overall size of the real storm in a satellite image from MTSAT-2. Source: Met Office.

Duplo bricks

GCMs traditionally struggle to match the intensity of storms in climate model configurations, as described in the IPCC AR5 chapter on the evaluation of climate models (IPCC WG1 AR5: 9.5.4.3), but examples such as the Met Office’s Typhoon Megi simulation, and other models with resolutions of 100 km or so, show that the science is still able to model many features of tropical cyclone evolution.

GCMs are also used to model the large-scale planetary interactions that govern phenomena such as ENSO, which are captured well according to the selection of models used in the Coupled Model Intercomparison Project (CMIP). CMIP is currently on its fifth incarnation, CMIP5, which is used by the IPCC to understand future climate change. This paper by Bellenger et al. (2015) shows some of the progress made in recent years between CMIP versions; however, given their similar ability to represent large-scale features when examining ENSO, both CMIP3 and CMIP5 models can be used in conjunction as a broader comparison.

Assembling the ensemble

The “ensemble” is a technique in which a model is run multiple times with slightly different starting conditions to capture a range of uncertainty in the outputs. No model is perfect, so their products shouldn’t be taken at face value, but ensembles can help us by showing the range of possibilities as we try to represent what we don’t know in the input data.
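
Here is a minimal sketch of the idea using the classic Lorenz-63 toy system rather than a real GCM (the parameters are the standard textbook values, and the perturbation size is an arbitrary choice): tiny differences in the starting conditions grow into a wide spread of outcomes, which is exactly what an ensemble is designed to reveal.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz-63 system, a classic toy for chaos."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

rng = np.random.default_rng(42)
n_members = 20
base = np.array([1.0, 1.0, 1.0])
# Each ensemble member starts from a slightly perturbed initial condition.
members = [base + rng.normal(scale=1e-3, size=3) for _ in range(n_members)]

for _ in range(2000):  # integrate ~20 time units
    members = [lorenz_step(m) for m in members]

x_values = [m[0] for m in members]
print(f"ensemble spread in x: {np.std(x_values):.2f}")  # tiny perturbations have grown
```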

This addresses some of the observational uncertainty. A GCM’s starting point is based on the network of observations connected up throughout the world and standardised by the World Meteorological Organisation (WMO) for weather forecasting. These observations include ground-based observations (manual and automatic), radar imagery of precipitation, satellite images, aircraft reconnaissance (with tropical cyclones), sea surface readings and weather balloon ascents (and more), which are all assimilated into an initial condition that is then gradually stepped forward in time by the gridded global model. The starting point is also called ‘the initialisation’ in a forecasting model. For climate models the starting point can be the current climate, or whatever version of the climate is relevant to the experimental design.

Regardless of how a model is started on its time-stepping through a defined period, ensembles provide an idea of the range of possible outcomes through minor perturbations in the observed conditions, or even in how certain physical processes are handled (i.e. through different parameterisation schemes for features too small to be represented at a given resolution). In my forecasting days at the Met Office, looking at the solutions from a variety of the world’s big weather modelling organisations (NOAA, Met Office, ECMWF, JMA) was colloquially termed ‘a poor man’s ensemble’, as normally an ensemble will consist of many tens of solutions. A similar concept, although not using GCMs, is found in risk modelling applications such as catastrophe loss modelling, where many tens of thousands of simulations are performed to try to statistically represent extreme events, using extreme value theory and statistical fits to the rare events on a probability distribution. A useful paper reviewing methods in loss modelling for hurricanes is Watson et al. (2004).
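
As a hedged sketch of the loss modelling side, the snippet below applies the peaks-over-threshold method from extreme value theory, fitting a Generalised Pareto Distribution to the tail of a synthetic loss catalogue (the lognormal losses, threshold choice and return period are all illustrative assumptions, not any real model):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic stand-in for a catalogue of simulated annual losses; a real
# catastrophe model would generate these from tens of thousands of
# synthetic storm seasons run against an exposure portfolio.
losses = rng.lognormal(mean=0.0, sigma=1.5, size=50_000)

# Peaks-over-threshold: fit a Generalised Pareto Distribution (GPD) to
# the exceedances above a high threshold, as extreme value theory suggests.
threshold = np.quantile(losses, 0.99)
exceedances = losses[losses > threshold] - threshold
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

# Return level: the loss exceeded on average once every T years.
T = 1000
p_u = (losses > threshold).mean()  # chance a given year exceeds the threshold
loss_T = threshold + stats.genpareto.ppf(1 - 1 / (T * p_u),
                                         shape, loc=0.0, scale=scale)
print(f"estimated {T}-year loss: {loss_T:.1f}")
```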

And the weather today...

So numerical weather prediction models used for day-to-day forecasting are run at high resolution and high complexity, but can only go a week or so into the future. Their accuracy has improved greatly in the last few decades: a forecast for three days ahead is now as accurate as a forecast for one day ahead was in the 1980s, according to the Met Office. Below (Figure 3) is the European Centre for Medium-Range Weather Forecasts’ (ECMWF) verification of different forecast ranges over the decades.
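
For a flavour of how such verification works, below is a small sketch of the anomaly correlation coefficient, the kind of headline score shown in Figure 3 (the ‘forecast’ and ‘analysis’ fields here are made-up numbers for demonstration):

```python
import numpy as np

def anomaly_correlation(forecast, analysis, climatology):
    """Anomaly correlation coefficient: correlate forecast and verifying
    analysis anomalies (departures from climatology) over all grid points."""
    f_anom = forecast - climatology
    a_anom = analysis - climatology
    return (f_anom * a_anom).sum() / np.sqrt((f_anom**2).sum() *
                                             (a_anom**2).sum())

# Toy example with made-up 500 hPa geopotential height fields.
rng = np.random.default_rng(0)
climatology = np.full((10, 10), 5500.0)
analysis = climatology + rng.normal(scale=50.0, size=(10, 10))
forecast = analysis + rng.normal(scale=20.0, size=(10, 10))  # imperfect forecast
print(f"ACC = {anomaly_correlation(forecast, analysis, climatology):.3f}")
```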


Figure 3: ECMWF’s verification scores for a range of forecast ranges. Source: ECMWF.
Climate models, on the other hand, are run with lower complexity and lower resolution, allowing them to be run out to represent decades. Since large-scale climate modes such as ENSO (or the AMO, or the MJO, or many others) can influence storm activity, intensity and track, GCMs are invaluable tools in helping us understand the broader climate, as well as the small-scale processes.


Basically, GCMs can be run at different resolutions with different input data depending on the application (e.g. weather forecasting or climate experimentation). The available computing power dictates how these model configurations perform and the range over which they can produce outputs in a reasonable run time. They have developed into the key tool for understanding our weather and climate and their interactions with the Earth’s surface (via other modelling approaches such as land surface models or ocean circulation models).

Saturday, 12 December 2015

Palaeotempestology: Tree rings

In my last blog, I explored how the layers of calcium carbonate, which build up as a coral skeleton grows, can be used as a climate proxy. We can find a similar process by looking at tree rings. One of the more established practices in palaeoclimatology is dendroclimatology (the use of tree rings to study past climates). Like other palaeoclimatological proxies, it allows us to extend the range of our observational record beyond that of conventional weather recording instrumentation.

Just as corals live for hundreds of years (sometimes over a thousand years), trees can keep on recording the composition of the atmosphere in their layers of cellulose for many hundreds of years, and beyond when fossilised. Figure 1 below shows an example of Huon pine samples ready for analysis, each dark line denoting a season of growth.

Figure 1: Huon Pine ready for analysis. Source: Edward Cook, Lamont-Doherty Earth Observatory, Columbia University, Palisades, NY

Isotopic differences

Ancient pines are often the favoured study subjects due to their longevity. They can give annual or seasonal information on atmospheric composition. To extend the record beyond a single sample, a variety of sources can be combined together using distinctive signatures as shown in Figure 2 below.
Figure 2: Sources of tree ring data showing how various samples can be linked together. Source: Laboratory of Tree-Ring Research, The University of Arizona

The main process that allows us to look at past storms is the fractionation of stable oxygen isotopes through condensation and evaporation. I touched upon this in my previous blog about corals: it is the difference in atomic weight between the heavier oxygen-18 isotope and the lighter oxygen-16 isotope that allows us to glean clues about past climate events from tree cores.

The difference in atomic weight of oxygen isotopes derives from the number of neutrons in the atomic structure. The most common natural isotope is oxygen-16 (over 99% of atmospheric oxygen), which has 8 protons and 8 neutrons (electrons are virtually weightless by comparison), but stable oxygen atoms can also have 9 or 10 neutrons, making up the other isotopes that we find useful for palaeoclimatology. As mentioned before, water molecules with the lighter oxygen-16 isotope are preferentially evaporated in warm temperatures, while conversely water molecules with the heavier oxygen-18 isotope tend to condense and form clouds or precipitation more easily. It is this property that allows us to identify different sources of precipitation in tree ring samples.
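
For reference, the δ18O notation used in these studies expresses a sample’s oxygen-18 to oxygen-16 ratio relative to a standard (VSMOW for water) in parts per thousand; a quick sketch, with the depleted sample value below chosen purely for illustration:

```python
# delta-18O (in per mille) expresses an 18O/16O ratio relative to a
# reference standard; for water that standard is VSMOW.
R_VSMOW = 2005.2e-6  # 18O/16O ratio of Vienna Standard Mean Ocean Water

def delta_18O(r_sample, r_standard=R_VSMOW):
    """delta-18O in per mille: negative values mean 18O-depleted."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A sample depleted in oxygen-18 (e.g. tropical cyclone rainfall)
# gives a strongly negative delta value:
print(delta_18O(1975e-6))  # about -15 per mille (illustrative ratio)
```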

In extreme precipitation events associated with tropical cyclones, the level of oxygen-18 depletion in the rain water is high due to the highly efficient process of forming precipitation via condensation in the core of a tropical cyclone (Lawrence 1998; Munksgaard et al. 2015). In Lawrence’s paper, five tropical cyclones that made landfall in Texas, U.S., were studied; they showed much lower oxygen-18 to oxygen-16 ratios (δ18O) than normal summer convective storms.

This finding was further corroborated by a study of Hurricane Olivia by Lawrence et al. in 2002. Tropical cyclones are also large and long-lived, creating vast areas of precipitation that can stay in the water system for weeks and carry the distinct isotopic characteristics associated with the location of the heaviest rain bands and the storm centre (Munksgaard et al. 2015). Deep soil water can remain unaffected by normal summer rainfall and, in the absence of further heavy rain events, can later be taken up by trees (Tang and Feng, 2001).

It seems clear that oxygen isotope analysis is the favoured form of tree ring analysis for palaeotempestology.

Tapping the potential

Upon learning about these methods, it also seems reasonable to assume that different intensities and characters of storm will result in different levels of oxygen-18 depletion. However, it seems likely that there would be much uncertainty in inferring a storm’s intensity from isotope fractionation (but I’ll keep looking for more research on this). At the moment, that uncertainty may preclude a reliable intensity measure of past storms using this approach.

The uptake of oxygen isotopes into the tree’s structure will depend on many factors, including biological processes that vary with species, tree age, exposure to the storm and soil composition. Growth cycles are also taken into account, to limit the degree to which uncertainty arising from the mismatch between growing season and storm season can cloud useful information.

In the North Atlantic basin, for example, hurricane season runs from early June to late November and as such overlaps mainly with the latewood (as opposed to earlywood) growing phase. It is therefore these sections of the tree rings that palaeotempestological studies focus upon.

Miller et al. (2006) presented the emerging case for using oxygen isotopes more widely after the devastation left behind by the busy 2004 and 2005 hurricane seasons, building a 220-year record that identifies past storms from unusually low oxygen-18 values in pine forests. This is potentially very useful for engineering and loss modelling concerns.

“Can’t see the wood for the trees”

There are many uncertainties in the application of tree ring data to palaeoclimatology, let alone palaeotempestology, as summarised in the review paper by Sternberg et al. (2009): the complex biology of cellulose uptake, changes in the isotopic composition of soil water, and assumptions about the relationship between leaf temperature and ambient temperature, among others.

However, every study adds to the wealth of information, and since each site represents a single-location slice through time, the science of dendroclimatology will only continue to benefit from new data. There still seems to be a push to collect and analyse more data: the National Climatic Data Center, hosted by NOAA, is a fount of old and recent tree ring datasets.

A recent review by Schubert and Jahren, published in October this year (2015), takes a wide view. It aims to unify tree ring datasets to build a global picture of past extreme precipitation events based on low oxygen-18 records. They conducted 5 new surveys and used 28 sites from the literature to create a relationship in which seasonal temperature and precipitation explain most of the isotopic oxygen ratio in tree cellulose. This seems to be a step up in resolution, as looking at seasonal variations rather than annual cycles may bring us a step closer to identifying individual storms or storm clusters in tree ring data. It is interesting to see a comment in the paper’s conclusion that much of the uncertainty still remaining in this relationship derives from disturbances, such as storms.
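
As a hypothetical illustration of fitting that kind of relationship (the data and coefficients below are invented, and this is not Schubert and Jahren’s actual model), one can regress cellulose δ18O on seasonal temperature and precipitation:

```python
import numpy as np

# Hypothetical illustration: fit delta-18O of tree cellulose as a linear
# function of seasonal temperature and precipitation, in the spirit of
# (but not reproducing) Schubert and Jahren's global relationship.
rng = np.random.default_rng(7)
n = 33  # cf. their 5 new surveys plus 28 literature sites
temperature = rng.uniform(5, 30, n)        # seasonal mean temperature (deg C)
precipitation = rng.uniform(100, 2000, n)  # seasonal precipitation (mm)
# Made-up "true" relationship plus noise, for demonstration only.
d18O = 20.0 + 0.4 * temperature - 0.004 * precipitation + rng.normal(0, 0.5, n)

# Least-squares fit: d18O ~ a + b*T + c*P
X = np.column_stack([np.ones(n), temperature, precipitation])
coef, *_ = np.linalg.lstsq(X, d18O, rcond=None)
predicted = X @ coef
r = np.corrcoef(predicted, d18O)[0, 1]
print(f"fitted coefficients: {coef.round(3)}, r^2 = {r**2:.2f}")
```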


Figure 3: Comparison between measured δ18O in the cellulose of studied trees and the δ18O calculated using the model developed by Schubert and Jahren from known climate characteristics. It shows a good correlation relating seasonal temperature and precipitation to oxygen-18 isotope ratios. Source: Schubert and Jahren, 2015.

It seems clear that it would be much more difficult to develop a simple equation relating the extremes of the isotopic ratio chronologies to extreme storms. However, Schubert and Jahren seem to have taken a step forward while remaining focussed on average seasonal conditions. Nevertheless, I can’t help but wonder if there is a way for extreme events to be linked in somehow.

Alternatives to isotopes

When looking specifically at past storms in tree rings, I did find a couple of other approaches to tree ring data that may also be worth a mention.

Firstly, an interesting couple of papers by Akachuka (one in 1991 and another in 1993) examined trees that had been forced to lean by a hurricane, assessing how they recover from such disturbances for any extra clues that recovery may provide. Although the papers do not look specifically at characterising the storms themselves (i.e. there is no wind speed to bole displacement relationship), I couldn’t help but wonder whether there is some extra information to gather from these trees and whether we could build a relationship to specific storms or storm seasons.

Another paper, by Sheppard et al. in 2005, looks at the effect of a 1992 tornado on a specific dendrochronology and re-evaluates the pre-historical record from wood samples retrieved from an 11th-century ruin in Arizona, looking for similar patterns in wood growth (see Figure 2 for a conceptualisation). Unfortunately, the patterns in the tree rings caused by the 1992 tornado were not replicated in the ring patterns of the 11th-century sample. This is certainly interesting work, but I imagine that finding enough data from trees that are damaged by tornadoes yet still survive is not easy, especially when comparing against single older samples.

Conclusions

Although individual studies using tree lean or damage from specific events like tornadoes are interesting and worthwhile academic endeavours that help us understand the ways in which storms of various scales affect tree growth, they do seem somewhat less applicable to thinking about climate change and how the frequency and severity of storms are changing over a wide area.

With so many subtleties depending on factors such as tree species or the topography of a study site, I feel that the broader synthesis approaches (as per Schubert and Jahren above) using stable oxygen isotopes offer greater immediate potential for aiding our understanding of past changes in storm activity, with possible application to risk assessments and projecting the impacts of future climate change.

Saturday, 28 November 2015

Palaeotempestology: Lake sediment records

Digging in to sediment records

With continued debate among scientists on exactly how future climate change will affect storm frequency and severity, it seems logical to see if we can find out more about variability in storm activity from the past.

Lake sediments are extremely useful in studying past climates for which we have no observational record (through conventional weather recording equipment). They provide a slice through time in which to examine changes in lake chemistry and the environmental activity affecting the make-up of suspended particles in the lake that eventually settle at the bottom.

Radiocarbon dating, the thickness of different sediment layers, analysis of diatoms, and inference from the occasional break in the record (a hiatus, perhaps due to the drying out of a lake) are all ways in which lake sediments can give us clues about the past.

Within this range of different approaches there are a few ways in which lake sediments can be used to look at past storm events. In my previous blog, I highlighted a paper by Dr Jeff Donnelly et al. in 2015 entitled “Climate forcing of unprecedented intense-hurricane activity in the last 2000 years”. It presents a history of storm events over the past two thousand years, using an analysis of sediment grain size in their collected samples, with a resolution of around one year. The work uses evidence gathered from field work during the project (and previous studies) to determine the presence of two distinct periods of higher intense-hurricane activity along the western North Atlantic coastline of North America: one between 1400 and 1675 C.E., and another further back in time, between 250 and 1150 C.E.

The study location is a place called Salt Pond, in Massachusetts. It has a tidal inlet linking it to the ocean, filling it with brackish water. This proximity to the ocean means the pond is exposed to ‘overwash’ during storm surge events associated with large storms heading northwards along the Eastern Seaboard of the United States. These salt water incursions occur when the storm surge is higher than any natural or man-made defences. The overwash leads to ‘coarse grain event beds’, which can be used as an indicator of severe storm activity. The process is validated using known hurricane landfalls, which are represented in the sediment records and act as ‘anchors’ to verify that the samples are valid.
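
A simplified sketch of how such event beds might be flagged in a grain-size record (the core data here are synthetic, and real studies use detailed grain-size analysis and multiple cores rather than a single threshold):

```python
import numpy as np

# Sketch: flag "coarse grain event beds" in a sediment core as layers
# whose grain size stands well above the fine organic background.
rng = np.random.default_rng(3)
depths_cm = np.arange(0, 500)                  # one sample per cm of core
grain_um = rng.normal(30, 5, depths_cm.size)   # fine background sediment
grain_um[[40, 175, 390]] += 120                # three sandy overwash layers

background = np.median(grain_um)
spread = np.median(np.abs(grain_um - background))  # robust (MAD) spread
events = depths_cm[grain_um > background + 10 * spread]
print(f"event beds at depths (cm): {events}")
```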

The study builds on a number of papers produced after a workshop on Atlantic palaeohurricane reconstructions, convened in 2001 at the University of South Carolina, which aimed to identify new opportunities in the field of palaeotempestology. A summary of the workshop can be found here. Dr Jeff Donnelly and colleagues studied a number of lakes in the north-east of the US, in New Jersey and New England, so to learn a bit about the methodology I dug into some of the papers in more depth.


Getting your hands dirty

It seems the only way to get at the clues available from sediment records is to get your hands dirty. I found an earlier paper by Donnelly et al. from 2001 which built a 700-year sediment record of severe storms in New England. This paper (and a couple more: Boldt et al. 2010; Liu and Fearn, 2000) started to show me that each project strategy is subtly different.

Various sampling schemes are planned around the conditions of the study sites, to find the best locations for sampling overwash areas in a consistent manner. The aim is to consistently capture the process by which more intense storms erode more sand from the coastal beach and carry this coarse sediment into the brackish lakes and ponds; larger storms are assumed to produce wider fans of overwash sand deposits, thicker near the shore and thinner near the centre of the study lake. A range of samples should be taken to try to represent the range of possible characteristics of past intense storms. Figure 1 (below) is a hypothetical diagram from Liu and Fearn (2000) showing various patterns of deposition. Note the radial patterns associated with the various directions of storm approach, with the larger fans associated with more intense storms.

Figure 1: Hypothetical coarse grain deposition fans in severe storm surge events. Source: Liu and Fearn, 2000 
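
As a toy version of the idea in Figure 1 (the exponential thinning form and every parameter are illustrative assumptions, not taken from Liu and Fearn), one could model how deposit thickness decays with distance from the shore and grows with surge height:

```python
import numpy as np

# Toy model: overwash deposits thin with distance from the shore, and
# stronger surges push sand further into the lake. The exponential form
# and all parameters are illustrative assumptions only.
def deposit_thickness_cm(distance_m, surge_m):
    max_thickness = 2.0 * surge_m   # thicker layer for a bigger surge
    e_folding = 40.0 * surge_m      # bigger surge carries sand further
    return max_thickness * np.exp(-distance_m / e_folding)

for surge in (1.0, 3.0, 5.0):  # weak, moderate and extreme surge heights (m)
    d = deposit_thickness_cm(np.array([50.0, 150.0, 300.0]), surge)
    print(f"surge {surge} m -> thickness at 50/150/300 m: {d.round(2)} cm")
```
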
The coarse sand creates a layer over the more usual organic-based deposits that settle on the bottom of a lake in stratified layers. This happens most effectively in anoxic lake beds (lacking dissolved oxygen), since any mixing from plant or animal life will be minimal.

Having never been in the field to collect sediment samples, I found it interesting to see how Donnelly et al. (and other teams) maintained a consistent chronology in the sediment records. They took multiple samples and used a variety of the methods above to build their chronology.


Markers in time

Radiocarbon dating and stratigraphic markers are used to establish control points that validate the data. Pollution horizons are useful in this respect: for example, lead concentrations mark the beginning of the industrial revolution, as lead quickly made its way into water systems and lakes and was then ‘fixed’ by anoxic sediments. The presence of lead pollution is an indicator of the late 1800s (Donnelly et al. 2001), and another change occurs where lead was removed from gasoline in the 1970s and 1980s. This is a good example, as it shows how these markers are useful for calibrating sediment records in a way that is easily understood and recognised.
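
Here is a minimal sketch of how such markers anchor a chronology (the depths and dates below are hypothetical): pin the horizons you can date, then interpolate linearly between them to assign a calendar year to any depth.

```python
import numpy as np

# Simple age-depth model: pin the core chronology to known marker
# horizons (depths and dates here are hypothetical) and interpolate.
marker_depths_cm = np.array([0.0, 55.0, 120.0, 480.0])
marker_years = np.array([2000, 1975, 1875, 300])
# 1975: lead removed from petrol; 1875: industrial lead horizon;
# 300 C.E.: a radiocarbon date near the base of the core.

def depth_to_year(depth_cm):
    return np.interp(depth_cm, marker_depths_cm, marker_years)

print(depth_to_year(200.0))  # estimated calendar year for a 200 cm layer
```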

Pollen records can also mark certain points in history: for example, the European colonisation of the eastern U.S. led to large-scale clearance of vegetation for farmland, meaning that the pollen composition changes drastically (Russell et al. 1993).

Once these markers are established, known historical storms are used to calibrate the record of storm events, and then earlier coarse grain event layers are identified and carbon dated.


Clear as mud?

So, having learned a lot more about sediment analysis in relation to palaeotempestology, I now have a greater respect for what these cores of old mud and sand can tell us about the past. However, it does seem to me that there is still a large degree of uncertainty in the data when trying to discern individual storms. For example, what if two storms occur in quick succession as a cluster, before a sediment layer has had a chance to settle and ‘lock in’ the information? This may end up looking like one larger or more intense storm, when actually it is the frequency of storms in that season which is varying. Donnelly et al. (2001) give an example from their study location of a lack of agreement with historical accounts: two intense storms, in 1635 and 1638, likely created overwash signatures, but only one event was indicated in the sediment proxy data. This means that the estimated frequencies may carry significant uncertainty.

Also, the response of a lake or pond to overwash events may change over time due to changes in natural or man-made barriers. Even with these uncertainties in mind, however, it is still clear that there is great value in understanding the clues left behind by past storms in our coastal lake sediments.

Without any alternative information, the best that we can do is to piece together palaeotempestological proxies and glean snippets of information to build a longer record of storms.

It also provides grounds for comparison when using climate models to try to understand past variability, another subject I intend to explore in a future blog.

For now, I’ll leave you with an informational video by Ocean Today, in conjunction with the Smithsonian Institution and NOAA, made just after Hurricane Sandy in 2012, which will hopefully give a clear demonstration of what overwash looks like and how coastal beach material can be dragged across into lakes or ponds that lie close to the ocean, leaving us these markers of past events.



My next blog will be on the evidence that can be derived from coral cores.

Monday, 26 October 2015

Storms, Species and Ecosystem Stability

An interesting lecture last Friday, from Prof. Anson Mackay, about biodiversity and landscape change got me thinking. He talked about the value of biodiversity and the various ecosystem services that are at risk due to recent global changes. Rising CO2 levels, urban expansion, deforestation, ocean acidification, sea ice loss, habitat loss, marine ecosystem over-exploitation and more are all affecting biodiversity and the associated ecosystem services at both global and local scales.

I was left with one question. Being an ex-weatherman, I couldn’t help but wonder how storms can affect biodiversity. Can a single storm or series of storms lead to the extinction of a species? If so, then in a warmed future world, where storms may be more severe, possibly more frequent in places, and perhaps shifting their tracks to hit places that are not well adapted to such storms, is future storminess a more significant threat to biodiversity than it is today? This also needs to be put in the context of human population growth continuing through this century (UN predictions of around 11 billion people by 2100), which will no doubt continue to put more pressure on natural habitats through urban expansion and increasing numbers of people living in megacities.


Species on the brink
What evidence is there for storms affecting biodiversity? From personal experience, I can think of one species of bird which may be put under more pressure by increased storminess as well as human activity: the Bermuda Longtail. After living for three years on the small island of Bermuda, I only saw one or two, but these stunning marine birds that live on the wing are the subject of keen conservation efforts.
Source: Bermuda Government

Roughly 50% of the breeding pairs in the North Atlantic nest in Bermuda’s cliffs. The Longtails are, however, quite rare and are prone to pressures from:
  • storms and floods,
  • coastal development and human activity,
  • predation by species alien to the island, such as rats, crows and domestic cats,
  • competition for suitable nesting sites from pigeons,
with the alien predators and competitors only present since humans inhabited the island.

Lots of pressures, and surely the Longtail is just one of many species in a similar situation. In general, my instincts tell me that certain vulnerable species will be most at risk on a local scale. As in the case of the Longtail, when a vital breeding ground is as isolated as Bermuda, it seems reasonable to expect that a species already on the brink, and vulnerable to abrupt changes of landscape like the erosion and flooding after a storm or a change in land use along a coastline, could be pushed towards extinction by a single event at a critical time, for example during breeding.

Coral destruction
Teixidó et al. in 2013 looked at the impact of severe storms in the NW Mediterranean Sea on the biodiversity of coral-producing marine life on the sea bed (the benthic region). The case study examines a storm that hit the study region on December 26th 2008 and was considered at the time to be the strongest in 50 years. The study examined the benthic community composition using data gathered over the preceding couple of years, and from surveys during the years after the storm. The damage was severe, largely because the storm generated huge waves that smashed and scoured the coral outcrops and areas of shallow sea bed. Surveys after the event revealed extensive damage to corals and sea-bed communities, including some relatively long-lived species such as sponges, sea fans and anemones. The study makes the interesting point, backed up by numerous citations, that species that exhibit little change in population and few community changes over time, due to a lack of disruption, are particularly susceptible to the impact of low-frequency, high-impact events. This is compounded by evidence from studies that show the Mediterranean to be a potential hotspot for climate change (e.g. Giorgi and Lionello 2008), meaning we may see certain species in the Med under increasing pressure of extinction in the future. If we are going into an Anthropocene, partly characterised by greater extremes of climate, then long-lived species with limited adaptability could well be the first to become extinct.

Disturbance biodiversity boosts
Storms can also trigger an ecological process in which species take advantage of the openings presented by a disturbance, seeking to fill any ecological niche that opens up and temporarily increasing biodiversity. This is known as “gap phase succession”. A non-storm-related example can be found in many forest and grassland ecosystems, with fire being a key trigger. Many species actually rely on these events; however, the question remains of whether they can adapt to the greater ferocity of fires in a future, warmer world, with longer and deeper droughts turning grasslands and forests into tinder just waiting for a spark (normally from human activity or lightning). This idea is similar in concept to the intermediate disturbance hypothesis, first introduced by Joseph Connell in 1978. The diagram below highlights the idea: basically, biodiversity is maximised by disturbances that are neither too frequent nor too rare.
Source:  The intermediate disturbance hypothesis (data from Connell 1978).

But back to storms: the huge numbers of trees blown over during winter storms across Europe will continue to provide a bonus for biodiversity in the short term (downed trees provide food and habitat for invertebrates, fungi and lichen). Again, though, these ecosystems appear to have developed in such a way as to take advantage of the gaps presented by infrequent events. Change the frequency and severity of shocks on a system that has developed around a long period of stable extremes, and who knows what will happen! We can certainly hypothesise.

Goodbye equilibrium, hello climate change
This study by Backlund et al. in 2008 paints a rather pessimistic picture in assessing the broad impacts of climate change and, for the purposes of this blog, describes how ecosystems are likely to be pushed further into alternate states by climate change. It considers that established predator/prey or pollinator/plant interactions may be put under additional stress. This can potentially leave them vulnerable to the impacts of single rare events, such as hurricanes, which could lead to system failure! Stressed systems are generally less able to bounce back from big shocks.
If climate change is modifying the severity of storms, and potentially the frequency and storm tracks for some regions, then we have lots of work to do in identifying where we have vulnerable species and where we need to build in extra resilience to prepare for future severe events that lie outside past experience. Climate change is a slow, creeping effect which can easily catch us off-guard (and is arguably already doing so). In some cases, however, ecosystem management may be the most effective route to building our societal resilience to a future of more intense storms, as highlighted in a recent Royal Society report (see recommendation 4).

Stormy times ahead
It seems from my research for this blog that storms do have a part to play in maintaining our biodiversity, and the benefits that diverse biological systems bring. Storms (and climatic extremes) are important for ecosystems that have evolved to fit the past frequency and severity of rare events, but in an uncertain future under climate change they may also exert significant extra pressure on local ecosystems, and could result in local extinctions of threatened species.

An enhanced focus on vulnerable and keystone species, tightly dependent ecosystem structures, and isolated communities should help guide our conservation and resilience-building efforts in the context of a changing climate.