"I have approximate answers and possible beliefs in different degrees of certainty about different things, but I'm not absolutely sure of anything"

Richard Feynman
Embracing uncertainty
How do we embrace uncertainty in our forecasts, predictions and strategies?
One of the key characteristics of quantum mechanics is uncertainty: a fundamental limit on how precisely certain pairs of physical properties can be known. Now, I am not suggesting that you read more about quantum theory, but I am suggesting that by embracing uncertainty you enable a shift in perspective from deterministic to probabilistic thinking, and can therefore make choices based on your assessment of the chances of an event occurring, rather than on the perceived certainty of a single future outcome.

Consider this in the context of our work, particularly our commercial advisory services, where we help clients make informed commercial decisions aligned with their strategic goals.

The majority of this work uses discounted cash flow analysis to model capital appraisals, investment valuations or multivariable financial forecasts. A usual component of this approach is sensitivity and scenario analysis, used to identify risk and forecast a range of scenarios based on a defined future set of actions. This is often achieved using discrete numerical assumptions and conditional logic statements (if X occurs, then perform operation Y), but for reasons I will go on to describe, this method of analysing the future is too simplistic and potentially fundamentally flawed. Given that a large proportion of global assets are valued using discounted cash flow, the implications could be far-reaching.
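To make that pattern concrete, here is a minimal sketch of a single-point DCF appraisal with an "if X, then Y" scenario switch. It is written in Python rather than a spreadsheet, and every figure and variable name is illustrative rather than drawn from a real appraisal:

```python
# A minimal sketch of a conventional DCF appraisal using discrete,
# single-point assumptions (all names and figures are invented).

def npv(cash_flows, discount_rate):
    """Discount a series of annual cash flows back to present value."""
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows, start=1))

rental_growth = 0.02      # one "average" growth assumption
discount_rate = 0.065     # one "average" discount rate
initial_rent = 100_000

# Project ten years of income under the fixed growth assumption
cash_flows = [initial_rent * (1 + rental_growth) ** t for t in range(1, 11)]

# Conditional logic of the "if X occurs, then Y" kind: a simple scenario switch
if discount_rate > 0.06:
    cash_flows = [cf * 0.95 for cf in cash_flows]  # pessimistic haircut scenario

print(f"Deterministic NPV: {npv(cash_flows, discount_rate):,.0f}")
```

Every input here is a single number, so the output is a single number; the rest of this article is about why that hides more than it reveals.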

The flaw of averages
Discounted cash flow analysis is primarily driven by a set of user-defined assumptions about the future, e.g. interest rates, discount rates, bond yields, rental growth and so on. These assumptions usually take the form of discrete numbers that represent an average of a range of possible outcomes, e.g. a future interest rate of 3.5% or a growth rate of 2%. But there is a flaw in using averages in these situations, known as the "flaw of averages" and rooted in Jensen's inequality.

In 1906 the Danish mathematician Johan Jensen proved the inequality that explains this flaw: "the average of all possible outcomes associated with uncertain parameters is generally not equal to the value obtained from using the average value of the parameters".

This inequality underpins the value of stock options and may be familiar to those working in quantitative finance, but among our clients, SMEs and charitable organisations, it is little understood. Appreciating the flaw of averages is crucial if you are trying to model future scenarios and take account of uncertainty in your strategies.
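To see the inequality in action, here is a short numerical sketch using an option-style payoff; the strike, the price distribution and all figures are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# An uncertain parameter: a future price modelled as a distribution, not a point
prices = rng.normal(loc=100, scale=20, size=100_000)

# A convex payoff, as in a stock option: max(price - strike, 0)
strike = 100
payoffs = np.maximum(prices - strike, 0)

# Value obtained from the average parameter: f(E[X])
value_of_average = max(prices.mean() - strike, 0)   # approximately 0

# Average of all possible outcomes: E[f(X)]
average_of_values = payoffs.mean()                  # approximately 8, clearly not 0

print(f"f(E[X]) = {value_of_average:.2f}")
print(f"E[f(X)] = {average_of_values:.2f}")
```

Plugging the average price into the payoff says the option is worth nothing; averaging the payoff over the whole range of prices shows it is clearly worth something. That gap is the flaw of averages.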
Harvard Business Review published a useful article in 2002 that discussed the flaw of averages, including the consequences of basing future plans on discrete average assumptions:

"In one celebrated, real-life case, relying on averages forced Orange County, California, into insolvency. In the summer of 1994, interest rates were low and were expected to remain so. Officials painted a rosy picture of the county's financial portfolio based on this expected future behaviour of interest rates. But had they explicitly considered the well-documented range of interest-rate uncertainties, instead of a single, average interest-rate scenario, they would have seen that there was a 5% chance of losing $1 billion or more—which is exactly what happened. The average hid the enormous riskiness of their investments.

More recently, a failure to appreciate the flaw led to $2 billion in property damage in North Dakota. In 1997, the U.S. Weather Service forecast that North Dakota's rising Red River would crest at 49 feet. Officials in Grand Forks made flood management plans based on this single figure, which represented an average. In fact, the river crested above 50 feet, breaching the dikes, and unleashing a flood that forced 50,000 people from their homes."

So how can this be fixed?

The source of the issue is the use of discrete numbers that represent averages of our assumptions. The solution is to move away from discrete averages and model assumptions as continuous distributions. Using a distribution enables you to model a range of potential values and so simulate uncertainty. A "normal distribution" is often a good starting point, but there are many others; the key is to choose the distribution that best models the assumption you are making.
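As a rough illustration of what "model assumptions as distributions" can look like, here are three common choices sampled in Python; the parameters are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# A normal distribution, where symmetric variation around a central estimate is plausible
interest_rate = rng.normal(loc=0.035, scale=0.01, size=n)

# A lognormal, where the quantity cannot go negative and has a long right tail
rental_growth = rng.lognormal(mean=np.log(0.02), sigma=0.4, size=n)

# A triangular, where only a minimum, most-likely and maximum value are known
build_cost = rng.triangular(left=0.9e6, mode=1.0e6, right=1.3e6, size=n)
```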

The most common tool for simulating outcomes from input distributions is Monte Carlo analysis. Monte Carlo methods use repeated random sampling to solve problems, interpreting the results probabilistically. This may all sound complicated, but you can build the Monte Carlo method into your models quite easily using Excel, generating cumulative probability charts to visualise your results.
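Below is a minimal Monte Carlo version of the earlier DCF sketch. The distributions, their parameters and the target NPV are assumptions chosen for illustration; the same idea can be reproduced in Excel with its built-in random-number functions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims = 100_000

# Sample the uncertain assumptions (distributions and parameters are illustrative)
discount_rate = rng.normal(0.065, 0.012, n_sims)
rental_growth = rng.normal(0.02, 0.015, n_sims)

initial_rent = 100_000
years = np.arange(1, 11)

# Vectorised DCF: one NPV per simulated (rate, growth) pair
cash_flows = initial_rent * (1 + rental_growth[:, None]) ** years
npv = (cash_flows / (1 + discount_rate[:, None]) ** years).sum(axis=1)

target = 750_000
print(f"P(NPV >= {target:,}): {(npv >= target).mean():.1%}")
print(f"5th / 50th / 95th percentiles: {np.percentile(npv, [5, 50, 95]).round(0)}")
```

Sorting or binning the simulated NPVs gives the cumulative probability chart mentioned above: for any target value, you can read off the chance of reaching it.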

The hardest part is often interpreting and explaining your results, but effort here is crucial; once you've embraced uncertainty you need to shift your perspective and language to explain your results clearly and concisely. Often this means changing the way we think, from a deterministic statement, "if I change X to Y then the output is Z", to a probabilistic one, "using relevant distributions to model X and Y, the probability of achieving our goal under these conditions is Z%."

Going back to our previous article, "Embracing complexity", and the interconnected nature of many variables, we can extend this approach by connecting variables that influence one another, for example interest rates and inflation. These are often linked via a central bank policy that aims to keep inflation within a target range. Connecting the variables through an operation that models the policy allows the complexity of the system to be accounted for while building future uncertainty into your forecast assumptions.
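Here is one hypothetical way to encode such a linkage: inflation is sampled first, and the interest rate is then derived from it through a simple, invented policy rule, so the two assumptions move together in every simulation:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Sample inflation first (parameters illustrative)
inflation = rng.normal(0.03, 0.01, n)

# Link the interest rate to inflation via a simple, hypothetical policy rule:
# the central bank leans against deviations from a 2% target, plus some noise.
target, neutral_rate, sensitivity = 0.02, 0.025, 1.5
interest_rate = neutral_rate + sensitivity * (inflation - target) + rng.normal(0, 0.005, n)

# The sampled pairs now carry the policy-driven relationship into the forecast
print(f"correlation(inflation, rate): {np.corrcoef(inflation, interest_rate)[0, 1]:.2f}")
```

Any downstream calculation that consumes these paired samples inherits the relationship, rather than treating the two rates as independent inputs.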

By using this approach we can make more informed decisions, supported by a robust methodology that embraces a system's interconnected nature and the uncertainty implicit in its future configurations as it reacts to change.