Human perversity and serendipity

It was nice to see the very first response to these blogs. I hope there will be more in future, so that it will be even more interesting when we come to look back at the end of the project.

The role of human perversity and serendipity in the management of risk, highlighted by Adrian Macdonald’s comment on the last entry, is, of course, a fascinating one. There are many other examples. The demountable defences on their way to Upton-on-Severn in 2007 that got stuck in the traffic jam caused by the floods on the M5, so that they were diverted to what turned out, in hindsight, to be a much more useful purpose in protecting an electricity sub-station, is one positive example of serendipity.

One way of looking at this is in the terms of Frank Knight, who published a book on Risk, Uncertainty and Profit back in 1921. He saw risks and uncertainties from an insurance industry perspective and differentiated between those types of uncertainties that an insurer would be prepared to take odds on, and those that he would not. The second type he called the “real uncertainties” that could not be expressed in this way. They are what are now often called the epistemic uncertainties, due to lack of knowledge or understanding. Some people also differentiate between epistemic uncertainties that might be reducible by further observation or experiment and those uncertainties that we have not even recognised yet (the unknown unknowns of Donald Rumsfeld). It is because of the epistemic uncertainties that we should expect that models will do less well in prediction than in calibration, and that the real system might respond in a surprising way. Sometimes those surprises are treated as only “rogue” observations, but they might be evidence of such real uncertainty (the filtering out of near-zero concentration observations of ozone over the Antarctic is a prime example of this).

I recently came across some nice quotes relevant to these issues written by Bertrand Russell in 1950 in his essay Philosophy for the Layman:

What philosophy should dissipate is certainty, whether of knowledge or of ignorance. Knowledge is not so precise a concept as is commonly thought…

For it is not enough to recognise that all our knowledge is, in a greater or lesser degree, uncertain and vague; it is necessary at the same time to act on the best hypothesis without dogmatically believing it. … Scientific laws may be very nearly certain, or only slightly probable, according to the state of the evidence. When you act upon a hypothesis which you know to be uncertain, your action should be such as will not have very harmful results if your hypothesis is false.

So some of the concepts about uncertainty in predictions of change are not at all new. Decision making has always been carried out under uncertainty, and there has always been a limit to how far assessments of that uncertainty can be made free from irrational or unknown human (or other) influences. And, even if that has always been understood, it has not stopped people from being too confident in their hypotheses and making decisions that have had rather harmful consequences (including the over-development of flood plains).

The modern struggle with uncertainty, however, has two aspects that have changed recently. The first is that there is a push to take more explicit account of uncertainty in a quantifiable way. Sometimes that does not recognise the non-quantifiable aspects of real uncertainties, for example in interpreting the outputs of an ensemble as probabilities that sum to unity (implying all other possibilities are excluded). The second is the requirement for predictions of future changes (at the heart of the CCN initiative) when the future boundary conditions for flood risk, water quality and water scarcity may be subject to real uncertainties. Hopefully, CCN might be able to throw some light on both of these issues as it develops.
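That implicit closure assumption can be made concrete in a minimal sketch (the weights below are hypothetical, not from any real ensemble; real weighting schemes would be far more involved):

```python
# Hypothetical likelihood weights for four ensemble members.
weights = [0.3, 0.5, 0.1, 0.2]

# Normalising so the values sum to one treats the ensemble as exhaustive:
# every probability assigned here is conditional on the truth lying
# within the ensemble, so any unrecognised possibility (an epistemic
# "unknown unknown") is silently given zero weight.
total = sum(weights)
probabilities = [w / total for w in weights]

print(probabilities)       # each member's implied probability
print(sum(probabilities))  # sums to 1.0 by construction
```

The sum-to-unity property is guaranteed by the arithmetic, not by any claim that the ensemble spans all futures; that is precisely the non-quantifiable residual the paragraph above points to.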
