If I were you, I wouldn’t start from here…

The punchline to the famous Irish joke about an Englishman lost in the backroads of Ireland and seeking directions from a local brings to mind the persistent struggle now facing many longstanding businesses, the (re)insurance industry being no exception.

Burdened by legacy processes and systems, many fail to cope (in terms of creating value) in a world that is changing around them in ways which make a mockery of the argument that incremental, gradual change is still a feasible strategic approach to long-term viability.

While “move fast and break things” is probably not the best alternative, it does contain a kernel of truth: there needs to be not only a comprehensive plan, but also swift and effective execution.

In January, Oliver Wyman published a paper entitled “The Mindset Collision”, in which it contrasted the “Vision” and the “Value” mindsets- the former focused on building the business of the future, foreseeing and adapting to new technologies and disruptors; the latter on delivering financial returns within a fairly short timeframe. In reality, of course, any business which hopes to have a viable and sustainable future needs to blend both approaches. It is all very well having grand visions, but without clear expectations of what return an investment should bring, there is a serious risk of wasting both time and resources- something that will not be lost on those who observe the travails of SoftBank’s “Vision Fund”.

If one considers the (re)insurance industry, it tends over time, in the aggregate and in developed markets, to grow at roughly the rate of overall economic growth. This means that, unless one can find a more efficient way of delivering the product, one has either to seek new markets or to create products that meet unfulfilled needs and command “value added” rather than “cost plus” pricing. Even regulators have realized that the industry needs to be encouraged to take more risks in terms of development, by introducing the concept of the regulatory “sandbox”. Without at least some “vision”, the industry’s incumbents (other than those which have scale and network advantages) run the risk of “death by a thousand cuts”, as new competitors, unburdened by the above-mentioned legacy issues, peel away revenues from the most profitable business areas, or provide better-tailored products. Otherwise, as Oliver Wyman puts it, the risk is that of becoming the financial equivalent of a “dumb utility”, providing capacity priced at the margin, if one is lucky. Not exactly an enticing prospect.

The irony is that in a world full of complex and developing risks, the industry has the opportunity to grow and prosper if it can decide what it wants to be and how to achieve that. Maintaining the status quo, and repeating the usual hopeful mantras about improving underwriting quality, or re-deploying capital and capacity away from “unprofitable lines” is simply evidence of a state of denial. It singularly fails the “vision” test, as well as the “value” one, because it changes nothing.

If a business does not understand that past is no longer prologue, and that nimbleness and adaptability, coupled with a clear purpose, are (frankly, as they always were) essential traits, it risks becoming not just a “dumb utility”, but a “financial zombie”, unaware of its own reality, and doomed to stumble and flail about until someone “disaggregates” it.

The Awbury Team

Risk is what you don’t see…

The always thoughtful Morgan Housel (of the Collaborative Fund) recently wrote an article with the above title. Adding “or won’t” is equally instructive.

The point, like most fundamental ones, is simple and elegant; yet much of the (re)insurance industry (amongst all the rest) spends much of its time obsessing over the things it hopes it can measure and believes it can foresee. Obviously, in many product lines that is the rational and sensible approach, as many risks are, in the real world, bounded and constrained.

However, it is “in the misunderstood tail” that true risk lies. This being the first few months of a new decade (and we are not going to argue that point!), the media and what passes for thoughtful discourse (you know who you are, WEF Davos…) are full of “predictions”, risk surveys, incomprehensible network diagrams and the like. This is presumably in the hope that the “naming of the risk”, or being part of a “knowledgeable consensus”, will somehow make the risk less terrifying (as in “a trouble shared…”), because if it can be named or agreed upon we at least know what we are dealing with (or think we do).

So, as Housel points out: “The same risks, repeated over and over, [occur in narrative] sometimes several years in a row”- elections, trade wars, climate change, nuclear war. Of course, there are existential risks. We have collectively identified a whole range of them. If they occur, it may well be “game over” for Humanity and “civilization”. Harsh as it may seem, and in no way diminishing the need to take them seriously, it is the risks that are unforeseen or not taken sufficiently seriously that tend to cause the most harm. They are “surprises”, by definition- even if, with hindsight, perhaps they should not have been (e.g., the Japanese attack on Pearl Harbor).

As we saw with the Great Financial Crisis (GFC), when a central tenet of belief (that US housing prices could not in aggregate decline significantly) proved to be unfounded, people were shocked and surprised. They panicked, with consequences that we still have to grapple with even now. With few exceptions (read Michael Lewis’s excellent book “The Big Short”), people could not contemplate that what happened not only could occur, but would.

Secondly, once surprised (and compounding the issue), people tend to extrapolate outcomes beyond the logical or probable, becoming unduly paranoid and pessimistic about the future, which often creates a worse outcome than if they had behaved rationally. It is useful to remind oneself of that probability ex ante! Unfortunately, it is also hard to imagine any of today’s political “leaders” being seen as credible when uttering something analogous to Roosevelt’s: “The only thing we have to fear is fear itself”.

One could argue that such risks are “black swans” or “unknown unknowns”, but most are not. Naturally, one cannot plan for what one cannot imagine happening, but convincing oneself that one has foreseen all the possible risks amounts to hubris.

From Awbury’s point of view, we are institutionally paranoid about risk. We know that we are not infallible or omniscient, and that the whole point of being in the (re)insurance business is to accept risk. However, we also aim to design our products and solutions in ways that not only meet our clients’ needs, but also contain mitigants and margins of error that minimize the likelihood of being surprised in a way or to an extent that causes ruin.

The Awbury Team

The really, really long view…

Major central banks have the resources to produce papers that may seem arcane, but which also provide an interesting perspective. A recent Bank of England Working Paper, entitled “Eight centuries of global real interest rates, R-G, and the ‘suprasecular’ decline, 1311–2018”, is no exception.

In it, economist Paul Schmelzing constructed a time series for real interest rates going back some 700 years to 1311. One only has to read the paper to understand the scale of the work undertaken, as well as the fact that there really is not much that is truly new under the (financial) sun. Who knew, for example, that in 1262 (sic) the Venetian Grand Council decreed the establishment of a secondary market in the Serenissima’s long term debt?

Fascinating historical facts aside, what is intriguing about Schmelzing’s work is that it leads him to the conclusion that since the late 1400s there has been a steady decline in real interest rates over the intervening centuries, one that cuts across asset classes, political systems and monetary regimes. He is careful not to be dogmatic about the precise cause (and the time series is volatile year to year): capital accumulation and the ability to save more (and to expect to be able to enjoy the fruits) are probably the most likely (but in no way definitive) explanations. As a result, he posits, in essence, that the “lower for longer” mantra may be rather more than a convenient tag for something hitherto considered to be without much historical context or precedent. In more concrete terms, he suggests that the long-term real rate for 2018 would be around 1.50%, which, set against indicative inflation targets of 2% for most major central banks, implies a nominal cap of around 3.5% for whatever is deemed the safest asset provider- a role which, of course, has shifted materially over the centuries, heading north-west across Europe to the UK and then crossing the Atlantic to the US. In fact, Schmelzing suggests that no-one should be surprised at the current fact of negative nominal interest rates, given ultra-long-term trends.
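
The arithmetic behind that “cap” is simply the Fisher approximation (our gloss, not a formula quoted from the paper), relating the nominal rate $i$, the real rate $r$ and expected inflation $\pi^{e}$:

$$ i \;\approx\; r + \pi^{e} \;\approx\; 1.5\% + 2.0\% \;=\; 3.5\% $$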

Of course, this is merely one study. Arguments will doubtless be made that, given what has happened to inflation rates since the GFC and the piercing of the “zero lower bound”, the author is indulging in the abiding sin of financial modelers: using a backtest (albeit a very long one!) to confirm a desired hypothesis.

However, Schmelzing is studiously careful not to state that what is observed has a single clear cause (because it does not), but simply points out that the data provide evidence of a secular decline in real interest rates, and that it would be foolish to ignore that point simply because it may be an inconvenient truth.

So far as the (re)insurance industry is concerned, in Awbury’s view the study re-emphasizes the need to focus both on the quality of underwriting and on ensuring that nominal investment returns are not used as a “crutch” to mask weak Combined Ratios, while at the same time suggesting that relying mainly on standard fixed income products to generate those investment returns may also be something that bears re-examination.
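
A stylized illustration of why this matters (hypothetical numbers of our own, not Awbury or industry figures): an underwriter with a combined ratio above 100% is relying on investment income from the premium “float” to turn an underwriting loss into an overall profit, and the lower nominal yields implied by a ‘suprasecular’ decline make that crutch progressively weaker.

```python
# Hypothetical, simplified figures purely for illustration.
premiums = 100.0                       # net earned premiums
combined_ratio = 1.03                  # losses + expenses = 103% of premiums
underwriting_result = premiums * (1 - combined_ratio)    # -3.0: an underwriting loss

invested_float = 200.0                 # invested assets backing reserves (assumed)

for nominal_yield in (0.035, 0.015):   # ~3.5% nominal "cap" vs a lower-for-longer world
    investment_income = invested_float * nominal_yield
    operating_result = underwriting_result + investment_income
    print(f"yield {nominal_yield:.1%}: operating result {operating_result:+.1f}")

# yield 3.5%: operating result +4.0  -> investment income comfortably masks the loss
# yield 1.5%: operating result +0.0  -> the "crutch" no longer works
```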

The Awbury Team

A Quantum of Solace…?

It appears that a number of the largest US commercial and investment banks (such as Citi, JP Morgan and Goldman Sachs), as well as European banks (such as Barclays and BBVA), are ramping up their research into the potential applications of so-called quantum computing (which harnesses Quantum Theory’s superposition and quantum entanglement, amongst other exotica)- a technology which, if it can be tamed and harnessed, promises to revolutionize many areas of business and finance that depend upon the swift analysis and computation of huge datasets.

Because of its architecture, quantum computing may, for certain classes of problem, make the power of modern “supercomputers” seem as dated as ENIAC (from the 1940s), opening up the possibility of designing and applying algorithms and models currently beyond the dreams of all but the best-funded entities.

Not surprisingly, banks are exploring “use cases” for quantum computing, the most obvious of which is in risk management, where the speed and power of the technology would significantly enhance their ability to analyze probabilistic outcomes, not only more quickly but also more broadly, giving a boost to long-standing Monte Carlo simulation and CVaR methodologies. Other areas include speeding up the Machine Learning systems that underlie Artificial Intelligence (AI). Of course, banks are hardly alone in investigating what quantum computing might do. We would be shocked if the largest hedge funds and alternative asset managers were not also undertaking or commissioning their own research. When market advantage can be measured in microseconds or less, anything that can speed up analysis and execution, or examine and process a wider range of data, is enticing.
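
For readers less familiar with the terminology, the sketch below (a minimal, hypothetical example of our own, not any bank’s actual model) shows what the classical version of that workload looks like: estimating Value-at-Risk (VaR) and Conditional VaR (CVaR, or expected shortfall) for a portfolio by brute-force Monte Carlo sampling. It is precisely this sampling step that the quantum approaches referred to above aim to accelerate.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

n_scenarios = 100_000            # classical Monte Carlo needs very many scenarios
confidence = 0.99                # 99% confidence level

# Hypothetical one-year portfolio P&L: a "normal market" component plus rare large shocks
market_pnl = rng.normal(loc=0.02, scale=0.05, size=n_scenarios)
shock = rng.binomial(1, 0.01, size=n_scenarios) * rng.exponential(0.20, size=n_scenarios)
pnl = market_pnl - shock

losses = -pnl                                   # express losses as positive numbers
var = np.quantile(losses, confidence)           # loss exceeded in only 1% of scenarios
cvar = losses[losses >= var].mean()             # average loss within that worst 1% tail

print(f"99% VaR:  {var:.3f}")
print(f"99% CVaR: {cvar:.3f}")
```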

That said, no financial institution has yet publicly stated (perhaps in order to maintain an understandable discretion) that it is beyond the initial research phase (although IBM claims to be testing quantum algorithms for the pricing of European options and for portfolio optimization), and there is continuing debate over whether, and when, a stable and reliable quantum computer will be constructed.

Turning to the (re)insurance industry, a number of similar “use cases” can be identified, such as modelling complex, unstable systems like weather and climate, or enhancing actuarial models to enable more accurate pricing and risk selection. Sompo International has already stated that it is examining the potential application of quantum computing in a world driven by 5G communications technology.

However, as well as the benefits, (re)insurers should be aware (as banks are) of the threats posed by quantum computing in the area of data security and transfer. Data are currently secured primarily by so-called “prime number encryption”- public-key schemes whose security rests on the fact that factoring very large numbers is time-prohibitive for classical computers. A sufficiently powerful quantum, or qubit, system able to find those factors quickly would render such encryption obsolete. Not only that, but the properties of quantum entanglement apparently open up the possibility of intercepting quantum information in transit without detection- which would be the worst nightmare of any institution tasked with maintaining confidentiality.
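
As a deliberately toy illustration (our own sketch, nothing resembling production cryptography), the snippet below shows why such schemes work today: anyone can encrypt with the public modulus n = p × q, but decrypting requires knowing the prime factors p and q, which for real key sizes is classically infeasible to recover; it is exactly that factoring step which a large, reliable quantum computer is expected to make easy.

```python
from math import gcd

# Toy RSA with tiny primes (real keys use primes hundreds of digits long).
p, q = 61, 53
n = p * q                      # 3233 - the public modulus
phi = (p - 1) * (q - 1)        # 3120 - computable only if you know p and q
e = 17                         # public exponent, coprime with phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)            # private exponent (modular inverse; Python 3.8+)

message = 65                                   # a number standing in for the plaintext
ciphertext = pow(message, e, n)                # anyone holding (e, n) can encrypt
recovered = pow(ciphertext, d, n)              # only the holder of d can decrypt

print(ciphertext, recovered)                   # recovered == 65
```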

As yet, no-one really knows, or is admitting to, when appropriately-scaled and reliable qubit computing will make its debut (and one can be sure that state actors will want to keep their capabilities hidden). Nevertheless, at Awbury, we believe in being aware of technological developments, so that we can constantly assess and update our own risk analyses. We may not seek quantum supremacy, but we do aim for quantum understanding.

The Awbury Team

Blessed are the Economists, for They shall inherit (what is left of) the Earth…

It is a truth now universally acknowledged that an economist trying to understand a problem must be in search of a model. Unfortunately, many of those models no longer seem to work.

This is a point that Robert Skidelsky (a British economic historian, generally considered the definitive biographer of Keynes) makes in his recent, essentially polemical book “Money and Government: The Past and Future of Economics”.

It is almost an axiom that the upper policy levels of most governments are riddled with economists, as are central banks. They may not obviously be in charge, but they are very influential.

Unfortunately, as Skidelsky points out, many of them are captives of intellectual orthodoxies which, while no longer able to explain the world as it is, nevertheless permeate their thinking. The classic example is inflation. Above a certain level, inflation is considered (quite reasonably) a “very bad thing”- something to be managed and tamed- as the late Paul Volcker famously did in the early 1980s as Chairman of the Federal Reserve. Therefore, in the wake of the Great Financial Crisis (GFC), there was (and still is) great concern that the printing of money by central banks, their monetization of government debt (cf. Japan), and repetitive “quantitative easing” (done in one form or another by most major central banks) would cause a rapid and potentially uncontrollable rise in inflation. That has demonstrably not happened.

Similarly, NAIRU (the Non-Accelerating Inflation Rate of Unemployment) and the so-called Phillips Curve (plotting the relationship between inflation and unemployment) remain tenets of economic orthodoxy, even when, as in the US and the UK, unemployment sits at very low levels without visible signs of any change in the level or expectations of inflation.
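
For reference, the textbook expectations-augmented Phillips curve that underpins this orthodoxy (our gloss, not a formula taken from Skidelsky’s book) is:

$$ \pi_t \;=\; \pi^{e}_t \;-\; \beta\,(u_t - u^{*}) $$

where $\pi_t$ is inflation, $\pi^{e}_t$ expected inflation, $u_t$ unemployment and $u^{*}$ the NAIRU: the prediction is that inflation should accelerate once unemployment falls below $u^{*}$, which is precisely what has failed to happen.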

Now we are not, of course, advocating that the involvement of economists in policy-setting and -management should be avoided (we have a weak spot for the Bank of England’s Andy Haldane); rather that, as in most areas of political economy, a diversity of views and rigorous empiricism should be encouraged. Repeatedly stating that something should work, when it manifestly does not, is both fatuous and harmful.

The problem is that, if orthodox economists (and their educational approach) remain dominant, nothing changes; and hand-wringing or bluster are hardly effective in terms of economic management. Certainly, there remain “orthodoxies” that do hold true, such as the fact that high or arbitrary levels of tariffs are not only harmful in a macro sense, but also a hidden tax, with little, if any, offset in terms of domestic job creation. Nevertheless, change is sorely needed.

In reality (and hardly alone in this respect), many economists and the policy makers they advise are focused on yesterday’s “battles”; blithely ignoring or downplaying the issues that matter in the real world, such as how to identify why levels of productivity change; how to deal with a potential decline in the sustainable demand for labour; the impact of demographics on demand; or the changing landscapes of the financial industry.

At Awbury, we are strong believers in the potency of studying and trying to understand the world as it is and may become, not as we might wish it to be. Some models are always necessary; but being in thrall to any particular approach is something we aim to avoid. What always matters is exploring, testing and incrementally enhancing what demonstrably works!

The Awbury Team

“People tend to overestimate what can be done in one year and to underestimate what can be done in five or ten years”

The above is a quotation (from 1965) by Joseph Licklider (usually referred to as “Lick”), an American psychologist and computer scientist, who was considered by most of his peers and famous successors to be the visionary architect of much of what we now take for granted as part of modern technology and systems.

Skeptics should consider this (from 1960): “Computers are destined to become interactive intellectual amplifiers for everyone in the world, universally networked worldwide”, from a paper entitled “Man-Computer Symbiosis”. Lick was not writing science-fiction (although to most of his then readers it must have seemed so), but deploying his intellect and knowledge to formulate a new concept, which to him must have been obvious. Bear in mind that the integrated circuit, which still forms the core of almost all computers (except those of the nascent quantum design), was only invented in 1958 by Jack Kilby of Texas Instruments.

While the world’s population may now amount to 7.8BN people, and intellectual capacity is theoretically normally distributed across it, the impact of exceptional talent is non-proportional. The late Steve Jobs was notorious for applying this approach, making real the dictum of Robert Taylor (who ran the legendary Xerox PARC computer science laboratory): “Never hire “good” people, because ten good people together can’t do what a single great one can”. Taylor was quite ruthless: “…if you can get rid of people who are not so good, the spirit of the place is improved.”

In essence, Taylor was trying to create an environment in which closely connected and properly incentivized individuals would co-operate in research that would literally change the world. Normal it was not; nor short-term.

To quote at some length another computer scientist (and Turing Award winner), Alan Kay: “Because of the normal distribution of talents and drive in the world, a depressingly large percentage of organizational processes have been designed to deal with people of moderate ability, motivation and trust… [A]dministrators seem to prefer to be completely in control of mediocre processes to being “out of control” with superproductive processes. They are trying to “avoid failure” rather than trying to “capture the heavens””. One can see this in the real world in the guise of the truism that no-one is fired for being wrong in exactly the same way as everyone else.

Clearly, such statements are easy to label as “elitist” and disparaging. However, whether one looks at social structures, bureaucracies or commercial enterprises, most of the value is created or produced at the far right tail of the distribution. By definition, not everyone can be above average, and this applies as much to (re)insurance as to anything else, as we have written previously.

Circling back to the title quotation, creating sustainable value stems from a combination of vision, application and persistence. Its immediate impact may not be obvious, and often there are failures or necessary adjustments along the way. Nevertheless, the goal remains always in sight, and significant change is possible within the medium term.

At Awbury, whatever the circumstances, our aim is always to try to create demonstrable value over time for all our clients and partners by avoiding “normal” frameworks and standard approaches. Paradoxically, to us that just seems normal!

The Awbury Team

Another Year, Another Decade…

The past decade has seen Awbury grow from what one might term a “glint in the eye” of its founders into an established specialized insurer, focused on helping its expanding client base find solutions to their complex credit, economic and financial risks.

Building a business from scratch is both an exhilarating and humbling experience: exhilarating because one has to address multiple issues “in real time”; humbling because, no matter how experienced one is (and the Awbury team has a market memory covering more than four decades), there is still an unrelenting torrent of data and information to absorb as markets, economies and products change over time.

What then are some lessons learned that are worth distilling? Here are ten:

Firstly, to paraphrase von Moltke the Elder, “No business plan survives contact with the market”. In other words, no matter how well-researched, debated and thought-through one’s initial business plan is, its execution will inevitably encounter circumstances that require adjustment. That being said, Awbury has never wavered from its stated purpose, nor been tempted to “pivot” into other product lines. Doing so would be a distraction and would dilute our resources for no good reason. However, the ability to adapt remains critical.

Secondly, size is not everything. Being the biggest is not a sensible goal for a business built upon the ability to create intellectual capital. Much better to pursue targeted, patient, careful growth.

Thirdly, a consistent, definable and effective culture matters. This is much easier with a smaller team, in which everyone knows and interacts regularly with everyone else.

Fourthly, the ability to identify replicable and scalable business products is important in terms of creating sustainable income streams.

Fifthly, being unsentimental and intellectually ruthless is essential in selecting transactions with the best probability of both execution and a compelling risk/reward ratio.

Sixthly, the perfect is the enemy of the good (to slightly adapt Voltaire). What matters is effective execution, not designing some aesthetically perfect artefact.

Seventhly, relationships matter. Our business is based upon creating and maintaining long-term relationships founded on trust, respect and mutual benefit. Win-Win is always the best outcome.

Eighthly, ensuring the proper alignment of interests and incentives underpins effective risk management.

Ninthly, selecting the right professional partners means that one can focus on the factors that create and sustain value and success.

And finally, never believe your own propaganda! Self-confidence and intellectual humility are not incompatible.

Through the past decade we believe we have built a franchise which is structured to remain effective, relevant and valuable to our client base, as well as a source of high-quality premium flows to our partners who help provide our capacity.

We could not have done this without them; nor without our roster of trusted advisors. To all of them, and to our clients, we offer our gratitude.

We look forward to the new decade- both its challenges and its opportunities.

The Awbury Team
