Uncertainty [1]

Uncertainty is a lack of knowledge. It arises from two sources: natural variation, and incomplete knowledge of the process or event. We must distinguish uncertainty from ‘ignorance’. ‘Ignorance’ refers to a lack of awareness of the factors influencing an issue, which means that the identification of threats may be incomplete; it is a well-recognised weakness in risk assessment. The measures needed to counteract ignorance include drawing widely on different disciplines and on skilled people.

Uncertainty can lead to ‘risk aversion’; for example, a risk-averse financial investor dislikes risk and will therefore avoid high-risk stocks or investments, often forgoing higher rates of return. This type of investor usually looks for ‘safer’ investments (e.g., low-interest bank savings accounts or government bonds), which generally have lower returns.
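Risk aversion can be illustrated with a concave utility function. The sketch below assumes, purely for illustration, a logarithmic utility and a 50/50 gamble with the same expected value as a certain payment:

```python
import math

def log_utility(wealth: float) -> float:
    """Concave (risk-averse) utility: each extra pound is worth less."""
    return math.log(wealth)

# A certain £100 versus a 50/50 gamble on £50 or £150.
# Both options have the same expected value: £100.
certain = log_utility(100)
gamble = 0.5 * log_utility(50) + 0.5 * log_utility(150)

# Certainty equivalent: the sure amount worth the same as the gamble.
certainty_equivalent = math.exp(gamble)

print(f"utility of certain £100:    {certain:.3f}")
print(f"expected utility of gamble: {gamble:.3f}")
print(f"certainty equivalent:       £{certainty_equivalent:.2f}")
# The gamble's certainty equivalent falls below £100: a risk-averse
# investor accepts a lower guaranteed return to avoid the risk.
```

Because the utility curve is concave, the gamble is worth less than the certain payment despite the equal expected values, which is why such an investor settles for lower-return ‘safe’ assets.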

The Rare Event

When we make a decision there will always be some uncertainty. We usually accommodate this uncertainty through our life experience, and make the decision on that basis. This leads to some interesting anomalies [2]:

  • we overestimate small probabilities, and underestimate moderate and high probabilities;
  • we tend to trust conclusions based on very small samples; and,
  • we like to base decisions on lessons from more recent situations (e.g., after a natural disaster).
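The first anomaly, the distortion of probabilities, is often modelled in prospect theory with a probability weighting function. The sketch below uses the Tversky–Kahneman form w(p) = p^γ / (p^γ + (1 − p)^γ)^(1/γ); the parameter value γ = 0.61 is one commonly cited estimate, assumed here for illustration:

```python
def weight(p: float, gamma: float = 0.61) -> float:
    """Tversky-Kahneman probability weighting function.

    Overweights small probabilities and underweights moderate
    and high ones, matching the anomaly described above.
    """
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.01, 0.10, 0.50, 0.90, 0.99):
    print(f"p = {p:.2f}  perceived weight = {weight(p):.3f}")
# A 1-in-100 chance 'feels' several times more likely than it is,
# while a 90% chance 'feels' closer to 70%.
```

Plugging in values shows the pattern: w(0.01) comes out well above 0.01, while w(0.90) comes out well below 0.90.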

We rely on our instincts and experience, even when they have a track record of being wrong [3]. But… what if we are asked to make a decision on an event that we have never heard of, and therefore have no experience of? Occasionally things happen and we are shocked and surprised at the unexpected nature of the event.

If the ‘event’ has low consequence we place little importance on it, but if it has high consequence (e.g., multiple fatalities) we consider it important; and if there are few or no historical records available to predict it, we call it a ‘rare event’.

When we heard of the attack on the Twin Towers in New York we were shocked and surprised that it had happened (the probability), and appalled at the large number of casualties (the consequences). This type of low probability, high consequence event is often called a ‘black swan’ [4].

The Black Swan

The phrase ‘black swan’ is often used to describe a rare event. The term was a common phrase in the 16th century as a statement of impossibility: swans must be white because all historical records of swans reported that they had white feathers; therefore, a black swan was impossible/non-existent. The Dutch explorer Willem de Vlamingh then discovered black swans in Australia in 1697. Now the term ‘black swan’ exposes fixed thinking: a line of reasoning shown to be wrong when one of its fundamental assumptions is disproved.

‘I cannot imagine any condition which would cause a ship to founder. I cannot conceive of any vital disaster happening to this vessel. Modern shipbuilding has gone beyond that.’

Captain Edward Smith (1850-1912), commander of the RMS Titanic.

A black swan event is rare: a ‘surprise’, something difficult to predict. It has a major effect and is the first of its kind, but with hindsight it could have been expected: relevant data were available but unaccounted for.

The discovery of the black swan in 1697 illustrates a ‘limitation to our learning from observations or experience and the fragility of our knowledge. One single observation can invalidate a general statement derived from millennia of confirmatory sightings of millions of white swans’ [5].

What you do not know can be far more relevant than what you do know, which means that risk management models based on historic data have limitations and may not be able to predict a black swan. The past does not always inform the future.

A ‘black swan’ is a low probability, high consequence event, predictable only with hindsight; examples include the rise of the personal computer and the ‘9/11’ attacks on the Twin Towers.

It is an ‘outlier’: it lies outside the realm of current expectations.

References

  1. Health and Safety Executive, ‘Reducing Risks, Protecting People: HSE’s Decision-Making Process’ (R2P2), HSE Books, UK, 2001.
  2. M Abdellaoui et al, ‘Experienced vs. Described Uncertainty: Do We Need Two Prospect Theory Specifications?’ Management Science. Vol. 57, No. 10. pp. 1879–1895. October 2011.
  3. R Carucci, ‘Three Ways Your Brain Is Hazardous To Great Decision Making’. Forbes. January 21, 2016. https://www.forbes.com/sites/roncarucci/2016/01/21/three-ways-your-brain-is-hazardous-to-great-decision-making/#72bf04ec309b
  4. N Taleb, ‘The Black Swan: The Impact of the Highly Improbable’, Random House, 2007. ISBN 978-1400063512.
  5. N Taleb, ‘The Black Swan: The Impact of the Highly Improbable’, The New York Times. April 22, 2007. http://www.nytimes.com/2007/04/22/books/chapters/0422-1st-tale.html