CCN News

[Apr 26] Integrated River Basin Management Conference - Action Programmes and Adaptation to Climate Change
added on 19 11 2009 by Clare Black

The conference will review technical challenges faced by Member States, stakeholder organisations and scientists while developing the first River Basin Management Plan under the Water Framework Directive (WFD). It will focus on aspects of integration, looking at the way cross-sectoral and multidisciplinary co-operation has developed, and how emerging issues such as adaptation to climate change will be considered in the future.

www.WFDLille2010.org


Human perversity and serendipity
added on 26 10 2009 by Clare Black

It was nice to see the very first response to these blogs… I hope there will be more in future, so that it will be even more interesting when we come to look back at the end of the project.

The role of human perversity and serendipity in the management of risk, highlighted by Adrian Macdonald’s comment on the last entry, is of course a fascinating one. There are many other examples. The demountable defences on their way to Upton-on-Severn in 2007 got stuck in the traffic jam caused by the floods on the M5 and were diverted to what turned out, in hindsight, to be a much more useful purpose in protecting an electricity sub-station – one positive example of serendipity.

One way of looking at this is in the terms of Frank Knight, who published a book on Risk, Uncertainty and Profit back in 1921. He saw risks and uncertainties from an insurance industry perspective and differentiated between those types of uncertainties that an insurer would be prepared to take odds on, and those that he would not. The second type he called the “real uncertainties” that could not be expressed in this way. They are what are now often called epistemic uncertainties, due to lack of knowledge or understanding. Some people also differentiate between epistemic uncertainties that might be reducible by further observation or experiment and those uncertainties that we have not even recognised yet (the unknown unknowns of Donald Rumsfeld). It is because of the epistemic uncertainties that we should expect models to do less well in prediction than in calibration, and that the real system might respond in a surprising way. Sometimes those surprises are treated only as “rogue” observations, but they might be evidence of such real uncertainty (the filtering of zero concentration observations of ozone in the Antarctic is a prime example of this).

I recently came across some nice quotes relevant to these issues written by Bertrand Russell in 1950 in his essay Philosophy for the Layman:

What philosophy should dissipate is certainty, whether of knowledge or of ignorance. Knowledge is not so precise a concept as is commonly thought…

For it is not enough to recognise that all our knowledge is, in a greater or lesser degree, uncertain and vague; it is necessary at the same time to act on the best hypothesis without dogmatically believing it. …… Scientific laws may be very nearly certain, or only slightly probable, according to the state of the evidence. When you act upon a hypothesis which you know to be uncertain, your action should be such as will not have very harmful results if your hypothesis is false.

So some of the concepts about uncertainty in predictions of change are not at all new. Decisions have always been made under uncertainty, and there have always been limits to how far the assessment of that uncertainty can be made free from irrational or unknown human (or other) influences. And, even if that has always been understood, it has not stopped people from being too confident in their hypotheses and making decisions that have had rather harmful consequences (including the over-development of flood plains).

The modern struggle with uncertainty, however, has two aspects that have changed recently. The first is a push to account for uncertainty more explicitly, in a quantifiable way. Sometimes that does not recognise the non-quantifiable aspects of real uncertainties – for example, in interpreting the outputs of an ensemble as probabilities that sum to unity (implying all other possibilities are excluded). The second is the requirement for predictions of future changes – at the heart of the CCN initiative – when the future boundary conditions for flood risk, water quality and water scarcity may be subject to real uncertainties. Hopefully, CCN might be able to throw some light on both of these issues as it develops.
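That ensemble point can be made concrete with a minimal sketch (the member losses are invented purely for illustration): normalising the weights forces the probabilities to sum to one, which silently rules out every outcome the ensemble does not contain.

```python
import numpy as np

# Hypothetical losses predicted by a five-member ensemble (illustrative numbers only).
member_losses = np.array([1.2, 1.5, 1.9, 2.4, 3.1])

# Treating the members as equally likely and normalising the weights to sum
# to unity implicitly assigns zero probability to any outcome the ensemble
# does not contain: the "all other possibilities excluded" assumption.
weights = np.ones_like(member_losses) / member_losses.size
expected_loss = float(np.sum(weights * member_losses))

print(f"weights sum to {weights.sum():.1f}")
print(f"ensemble expected loss = {expected_loss:.2f}")
```

The arithmetic is trivial; the assumption hidden in it (that the five members exhaust the possibilities) is exactly the non-quantifiable part.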


Handling Uncertainties in Catastrophe Modelling
added on 07 10 2009 by Clare Black

I am on the train on the way back from a meeting at Lloyd’s of London on Handling Uncertainties in Catastrophe Modelling for Natural Hazard Impact. The meeting was organised by another Knowledge Transfer Network, the Industrial Mathematics KTN, through its Special Interest Group (SIG) for Environmental Risk Management, which is also supported by NERC. The SIG has prioritised the insurance industry in this area, and the meeting brought together academics and representatives from both underwriting companies and risk modelling companies.

The morning talks gave the insurance industry’s perspective on handling uncertainties. It is clear that they know only too well that their predictions of expected losses from extreme natural events are often based on rather uncertain input data and model components (and on exposure to losses not currently included in the models), but they are already looking forward to being able to take account of some of the relevant uncertainties. One of the issues in doing so, however, is that some of the current models take a week or two to run a single deterministic loss calculation. There was some hope that a new generation of computer technology, such as the use of graphics processing units (GPUs), would reduce model run-times sufficiently to allow some assessment of uncertainty (they clearly have not tried programming a GPU yet, though this is getting easier!). One presentation suggested that being able to make more and more runs would allow uncertainties to be reduced. Over lunch I asked what he really meant by this… it seemed that it was only that the estimation of probabilities for a given set of assumptions could be made more precise given more runs.
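That lunchtime point can be sketched with a toy Monte Carlo experiment (the loss distribution and its parameters are entirely hypothetical): more runs shrink the sampling error of the estimate, but only under the fixed assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# A stand-in loss model: lognormal annual losses under one fixed set of
# assumptions (the parameters are hypothetical).
def simulate_losses(n_runs):
    return rng.lognormal(mean=1.0, sigma=0.8, size=n_runs)

# More runs narrow the Monte Carlo standard error of the expected loss...
stderrs = []
for n in (100, 10_000, 1_000_000):
    losses = simulate_losses(n)
    stderr = losses.std(ddof=1) / np.sqrt(n)
    stderrs.append(stderr)
    print(f"n = {n:>9,}  mean = {losses.mean():.3f}  std. error = {stderr:.4f}")

# ...but only the sampling precision improves: the epistemic uncertainty in
# the assumed distribution itself is untouched by running the model more often.
```

The standard error falls roughly with the square root of the number of runs; nothing in the extra runs tests whether the assumed distribution was right in the first place.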

There was a demonstration of this in the afternoon, in an interesting study to estimate the uncertainty in losses due to hurricanes in Florida. Five insurance modelling companies had been given the same data and asked to estimate both the expected loss for given return periods of events (up to 1000 years) and a 90% confidence range. Two of the companies had run multiple long-term realisations of a given sample distribution of events based on the prior distributions of event parameters. Their confidence limits became smaller as the number of realisations increased, improving the integration over the possible distribution of events allowed by the fixed prior distributions. Two other companies had taken a different strategy, running realisations of a length consistent with historical data periods, resulting in much wider uncertainty limits. Uncertainty estimates, particularly when not conditioned on historical data, will always depend directly on the assumptions on which they are based! An analysis of the Florida hurricane study had suggested that the uncertainty in the estimated hazard was more important than the uncertainty in the estimated vulnerability. I am not sure that this would necessarily be the case in estimating flood risk.
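The two strategies can be caricatured with a toy resampling experiment (the Gumbel parameters, record lengths and quantile are my own hypothetical choices, not the study’s): estimating an extreme quantile from realisations only as long as a historical record leaves a much wider spread than estimating it from very long realisations.

```python
import numpy as np

rng = np.random.default_rng(0)

# A "true" annual-maximum loss distribution fixed by prior assumptions
# (Gumbel parameters chosen purely for illustration).
def annual_max_losses(n_years):
    return rng.gumbel(loc=10.0, scale=3.0, size=n_years)

def spread_of_estimate(n_years, n_realisations=500, q=0.99):
    """90% range, across realisations, of the estimated 1-in-100-year loss."""
    estimates = [np.quantile(annual_max_losses(n_years), q)
                 for _ in range(n_realisations)]
    lo, hi = np.percentile(estimates, [5, 95])
    return hi - lo

# Realisations the length of a historical record leave much wider limits than
# very long realisations that integrate over the assumed distribution.
width_short = spread_of_estimate(50)      # roughly a historical record length
width_long = spread_of_estimate(10_000)   # a long synthetic catalogue
print(f"50-year realisations:     90% range = {width_short:.2f}")
print(f"10,000-year realisations: 90% range = {width_long:.2f}")
```

Note that both widths are conditional on the same fixed prior assumptions; neither says anything about whether those assumptions are themselves right.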

There was some discussion of how to convey these assumptions to the people who actually take the risk for insurance companies in committing to contracts, and whether they should be allowed to play with dials that would show how sensitive the estimated losses are to different parameters. Given long model run times and short decision times in the real world, this was generally not considered feasible (although more flexibility to explore model sensitivities, rather than the ‘black box’ results currently provided, was suggested). There was also a suggestion that it was as important to “understand what is not in the models” as to understand sensitivities to what was in the models, and that “adding more science” would not necessarily be considered advantageous in an industry with a 300-year-old tradition.

One thought that came to me during the meeting was inspired by a passing mention of the verification of uncertainty estimates. It seems to me that this would (a) be very difficult with any form of extreme event, and (b) never happen anyway, because data from a new extreme event will be used to revise the estimates of the prior probabilities that might have been used in estimating the uncertainties. We know that this happens in flood risk estimation, where every new extreme flood is used to revise the estimates of the probabilities of exceedance at a site. Enough for now, it was an early start this morning!!
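As a postscript, that revision effect is easy to sketch with a toy annual-maximum record (the flows, threshold and new flood value are all hypothetical, and the fit is a simple method-of-moments Gumbel):

```python
import numpy as np

# A hypothetical 30-year record of annual maximum flows (m^3/s); the numbers
# are synthetic and purely illustrative.
rng = np.random.default_rng(1)
record = rng.gumbel(loc=200.0, scale=50.0, size=30)

def gumbel_fit(x):
    """Method-of-moments Gumbel fit: returns (location, scale)."""
    scale = x.std(ddof=1) * np.sqrt(6.0) / np.pi
    loc = x.mean() - 0.5772 * scale
    return loc, scale

def exceedance_prob(x, flow):
    """Annual probability that `flow` is exceeded, under the fitted Gumbel."""
    loc, scale = gumbel_fit(x)
    return 1.0 - np.exp(-np.exp(-(flow - loc) / scale))

threshold = 400.0
p_before = exceedance_prob(record, threshold)

# A new record flood arrives and is folded into the data, revising the very
# probabilities that any earlier uncertainty estimate was built on.
p_after = exceedance_prob(np.append(record, 450.0), threshold)

print(f"P(exceed {threshold:.0f}) before: {p_before:.4f}")
print(f"P(exceed {threshold:.0f}) after:  {p_after:.4f}")
```

The estimate the uncertainty bounds were meant to be verified against has itself moved, which is exactly why verification would never happen.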


Change in land management can benefit farmers, the environment and wider population
added on 06 10 2009 by Clare Black

Farmers should be paid for the environmental services they provide to society, they were told at a meeting in Cardiff on 1st October – ‘Meeting Challenges in Land Management’.

Farming and land management in Wales can deliver huge benefits for society and the agriculture industry, according to the Head of Agriculture, Forestry and Soils at the Directorate General for the Environment at the European Commission.

More than 90 per cent of Wales is either forestry or farmland. How we manage this land in the future will be vital to combating climate change and improving our environment.

Different land management techniques could help to store more carbon in the soil, maintain healthier river levels for people and wildlife, keep more rainwater out of rivers, and reduce the risk of flooding to rural and urban communities.

It could also contribute to cleaning up the water in Welsh rivers to meet new tougher standards and make a difference to the bathing water quality at beaches throughout Wales.

These changes have recently been recognised in the new agri-environment scheme, Glastir, announced by the Welsh Assembly Government earlier this year.

Further reading.


Testing catchment models as hypotheses
added on 01 10 2009 by Clare Black

I am currently guest editing the second Annual Review issue of Hydrological Processes, planned for early in 2010, which will have a number of contributions focused on preferential flows in catchments and the estimation of mean travel times or mean residence time distributions in catchments. Both of these pose interesting issues in respect of all three focus areas in the Catchment Change Network – flood generation, water quality and water scarcity. Particularly in the water quality area, the way that they are linked will have an impact over both short and longer time scales. Despite this importance, our understanding of both preferential flows and travel time distributions is still limited, and this got me thinking about developing that understanding through predictive models treated as hypotheses about how a catchment system functions.

This has some implications for predicting the effects of change, since we clearly cannot easily test hypotheses (or sets of modelling assumptions) about what might happen in a particular catchment of interest in the future. We more usually rely on testing hypotheses under current conditions and, given a degree of belief that we are getting the right results for the right reasons, exploring the consequences for scenarios of future change. Increasing that degree of belief is the purpose of testing, but there are two difficulties involved in this process. The first is that, as with classical statistical hypothesis testing, there is a possibility of making Type I errors (false positives, or incorrectly accepting a poor model) or Type II errors (false negatives, or incorrectly rejecting a good model), particularly when there are observational errors in the data being used in testing. The second is that this process of predicting future change relies on a form of uniformitarianism principle, i.e. that a model that has survived current tests has the functionality required to predict the potentially different future conditions. In both cases, classical hypothesis testing will be limited by epistemic errors (see the previous entry of 27th September) in the observations and in our expectations about future processes.
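A toy simulation makes the first difficulty concrete (the “catchment”, its observations and the acceptance rule are all hypothetical): observational error alone can cause a good model to fail a test.

```python
import numpy as np

rng = np.random.default_rng(7)

# A hypothetical "catchment": the true response, and observations of it that
# carry measurement error (all values illustrative).
true_flow = np.sin(np.linspace(0.0, 6.0, 100)) + 2.0
observed = true_flow + rng.normal(scale=0.3, size=100)

def accept(model_flow, obs, rmse_limit=0.2):
    """Toy acceptance rule: keep the model if its RMSE against the
    observations is below a fixed limit."""
    rmse = np.sqrt(np.mean((model_flow - obs) ** 2))
    return bool(rmse <= rmse_limit)

# Even the perfect model fails this test, because the RMSE floor is set by
# the observational error (about 0.3 here), not by any model deficiency:
# a Type II error (false negative) caused by errors in the data, not the model.
print("perfect model accepted?", accept(true_flow, observed))
```

The same mechanism works in reverse: a structurally wrong model whose errors happen to cancel against the observation errors can slip under the limit, giving a Type I error.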

That does not mean, however, that we should not try to test models as hypotheses, only that new ways of doing so might be required. We could, for example, explore the possibility of using real-world analogues for different scenarios of future conditions, with (approximately) the right combinations of expected temperatures, land use and rainfall, to show that, if there are significant differences in processes, the predictive model can represent them acceptably. The analogues would not, of course, be perfect (uniqueness of place suggests that calibrated parameter values would also necessarily reflect other factors), but this might increase the degree of belief in model predictions of future change rather more than relying on a model that has only been shown to reproduce historical conditions at the site of interest. As far as I know, no such study has been reported (although analogues have been used in other ways)… does any reader know of such a study?
