In a world in which risks seem to appear, or return, more often than one might expect (cyber-risk, climate change, driverless cars, or War on Land, anyone?!), it is not sufficient, if it ever was, to maintain the view that “There is nothing new under the Sun”, but rather to consider that “There are more things in Heaven and Earth, Horatio, than are dreamt of in (y)our philosophy.”
Even if one is an underwriter narrowly focused on a standard class or product such as NatCat, one still has to ask whether the models and assumptions one uses are sufficiently robust and predictive to enable one not only to price risk appropriately, but also to deal with the consequences of extreme events.
So, we were interested to come across a recent report co-sponsored by the Global Challenges Foundation and a number of departments of the University of Oxford entitled “Global Challenges: 12 Risks that threaten human civilization”. We won’t list them all, but, apart from the “usual suspects” such as climate change, nuclear war, artificial intelligence and major asteroid impact, we might mention synthetic biology and super-volcanoes.
Of course, in the end we are all dead (!), but we like to think that our institutions and societies will survive and prosper, assisted in part by the underpinnings of robust risk management and (re)insurance.
One aspect of the report that amused us was the fact that it excluded matters for which there are no effective counter-measures, or ways in which to mitigate the consequences, such as a near-space gamma-ray burst (something which, of course, we think about all the time…).
However, what the panel that produced the report did attempt, where it believed it was feasible, was to assign a probability, or range of probabilities, to a category. Those of a nervous disposition should look away now! Bearing in mind that we are in a realm beyond even that of Nassim Taleb’s Black Swans in existential terms, anything greater than zero may give pause for thought. Interestingly, the “winner” was Artificial Intelligence with a range of 0 to 10%. The following quotation, in the context of the next 100 years, gives an idea of the authors’ concerns: “Putting the risk of extinction below 5 per cent would be wildly overconfident”.
Many will, no doubt, dismiss such work and forecasting as pointless, or even as scientific self-indulgence or grandstanding. At Awbury, we would not presume to venture beyond the arena on which we focus, namely E-CAT, or providing protection to our clients against economic or financial risks which can threaten or destroy their business or franchise. We would simply observe that the 1-in-100 or 1-in-250 year risk models common within the (re)insurance industry may provide false comfort; and that stepping back from the mundane and the expected does have its value.
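One reason "1-in-100" language can provide false comfort is that exceedance probabilities compound over a multi-year horizon. A minimal sketch of the standard arithmetic (the function name `prob_exceed` is ours, and the calculation assumes independent years, which is the usual simplification behind return periods):

```python
def prob_exceed(return_period: float, years: int) -> float:
    """Probability of at least one occurrence of a 1-in-`return_period`
    event over `years` years, assuming independence between years."""
    annual = 1.0 / return_period
    return 1.0 - (1.0 - annual) ** years

# Even "remote" return periods imply material odds over longer horizons.
for rp in (100, 250):
    for horizon in (10, 30, 100):
        print(f"1-in-{rp} event over {horizon} years: "
              f"{prob_exceed(rp, horizon):.1%}")
```

Under these assumptions, a 1-in-100 event has roughly a 63% chance of occurring at least once in a century, and even a 1-in-250 event has about a one-in-three chance, which is rather different from the intuition the label invites.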
-The Awbury Team