Probabilities matter. If society fails to appreciate probabilities, and insists on seeing everything in certainties, a bleak future awaits us all (probably).
Consider five predictions, and common responses to these predictions.
Prediction A: If the UK leaves the EU without a deal, the UK will experience a significant economic downturn.
Response A: We’ve heard that prediction before. Before the Brexit vote, it was predicted that a major economic downturn would happen straightaway if the result was “Leave”. That downturn failed to take place. So we can discard the more recent prediction. It’s just “Project Fear” again.
Prediction B (made in Feb 2020): We should anticipate a surge in infections and deaths from Covid-19, and take urgent action to prevent transmission.
Response B: We’ve heard that prediction before. Bird flu was going to wreak havoc. SARS and MERS, likewise, were predicted to kill hundreds of thousands. These earlier predictions were wrong. So we can discard the more recent prediction. It’s just “Project Pandemic” again.
Prediction C: We should prepare for the advent of artificial superintelligence, the most disruptive development in all of human history.
Response C: We’ve heard that prediction before. AIs more intelligent than humans have often been predicted. No such AI has been developed. These earlier predictions were wrong. So there’s no need to prepare for ASI. It’s just “Project Hollywood Fantasy” again.
Prediction D: If we don’t take urgent action, the world faces a disaster from global warming.
Response D: We’ve heard that prediction before. Climate alarmists told us some time ago “you only have twelve years to save the planet”. Twelve years passed, and the planet is still here. So we can ignore what climate alarmists are telling us this time. It’s just “Project Raise Funding for Climate Science” again.
Prediction E (made in mid December 1903): One day, humans will fly through the skies in powered machines that are heavier than air.
Response E: We’ve heard that prediction before. All sorts of dreamers and incompetents have naively imagined that the force of gravity could be overcome. They have all come to ruin. All these projects are a huge waste of money. Experts have proved that heavier-than-air flying machines are impossible. We should resist this absurdity. It’s just “Langley’s Folly” all over again.
The vital importance of framing
Now, you might think that I write these words to challenge the scepticism of the people who made the various responses listed. It’s true that these responses do need to be challenged. In each case, the response involves an unwarranted projection from the past into the future.
But the main point on my mind is a bit different. What I want to highlight is the need to improve how we frame and present predictions.
In all the above cases – A, B, C, D, E – the response refers to previous predictions that sounded similar to the more recent ones.
Each of these earlier predictions should have been communicated as follows:
- There’s a possible outcome we need to consider. For example, the possibility of a significant economic downturn immediately after a “Leave” vote in the Brexit referendum.
- That outcome is possible, though not inevitable. We can estimate a rough probability of it happening.
- The probability of the outcome will change if various actions are taken. For example, swift action by the Bank of England, after a Leave vote, could postpone or alleviate an economic downturn. Eventually leaving the EU, especially without a deal in place, is likely to accelerate and intensify the downturn.
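To make that framing concrete, here is a minimal sketch in Python, using entirely hypothetical numbers of my own, of how such a forecast could be stated as a set of conditional probabilities rather than as a single certainty:

```python
# A minimal sketch, with purely hypothetical numbers, of framing a prediction
# as conditional probabilities: the chance of the outcome depends on which
# actions are (or are not) taken.

scenario_probabilities = {
    "no mitigating action":           0.60,  # hypothetical estimate
    "swift Bank of England response": 0.35,  # hypothetical estimate
    "no-deal exit, no mitigation":    0.80,  # hypothetical estimate
}

for scenario, p in scenario_probabilities.items():
    print(f"P(significant downturn | {scenario}) is roughly {p:.0%}")
```

Stated this way, the failure of the downturn to materialise under one scenario does not falsify the prediction as a whole; it simply prompts the estimates to be updated.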
In other words, our discussions of the future need to embrace uncertainty, and need to emphasise how human action can alter that uncertainty.
What’s more, the mention of uncertainty must be forceful, rather than something that gets lost in small print.
So the message itself must be nuanced, but the fact that the message is nuanced must be underscored.
All this makes things more complicated. It disallows any raw simplicity in the messaging. Understandably, many activists and enthusiasts prefer simple messages.
However, if a message has raw simplicity, and is subsequently seen to be wrong, observers are likely to draw the wrong conclusion.
That kind of wrong conclusion lies behind each of the flawed responses A to E above.
Sadly, lots of people who are evidently highly intelligent fail to take proper account of probabilities in assessing predictions of the future. At the back of their minds, an argument like the following holds sway:
- An outcome predicted by an apparent expert failed to materialise.
- Therefore we should discard anything else that apparent expert says.
Quite likely the expert in question was aware of the uncertainties affecting their prediction. But they failed to emphasise these uncertainties strongly enough.
Transcending cognitive biases
As we know, we humans are prey to large numbers of cognitive biases. Even people with a good education, and who are masters of particular academic disciplines, regularly fall foul of these biases. They seem to be baked deep into our brains, and may even have conferred some survival benefit, on average, in times long past. In the more complicated world we’re now living in, we need to help each other to recognise and resist the ill effects of these biases, including the ill effects of the “probability neglect” bias which I’ve been writing about above.
Indeed, one of the most important lessons from the current chaotic situation arising from the Covid-19 pandemic is that society in general needs to raise its understanding of a number of principles related to mathematics:
- The nature of exponential curves – and how linear thinking often comes to grief, in failing to appreciate exponentials (see the sketch after this list).
- The nature of probabilities and uncertainties – and how binary thinking often comes to grief, in failing to appreciate probabilities.
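As a small illustration of the first of these points, here is a minimal sketch in Python, again with hypothetical figures, contrasting a linear extrapolation with exponential growth that starts from the same first-day increase:

```python
# A minimal sketch, with hypothetical figures, of how far a linear
# extrapolation drifts from exponential growth over a single month.

cases_today = 100        # hypothetical starting case count
daily_growth = 1.25      # hypothetical 25% daily growth (exponential)
daily_increase = 25      # the same first-day increase, extended linearly

for day in range(0, 31, 5):
    linear = cases_today + daily_increase * day
    exponential = cases_today * daily_growth ** day
    print(f"day {day:2d}: linear estimate {linear:6.0f}, "
          f"exponential estimate {exponential:9.0f}")
```

By day 30 the linear estimate has reached 850, while the exponential one is above 80,000 – roughly the gap that linear intuition fails to anticipate.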
This raising of understanding won’t be easy. But it’s a task we should all embrace.
Image sources: Thanasis Papazacharias and Michel Müller from Pixabay.
Footnote 1: The topic of “illiteracy about exponentials and probabilities” is one I’ll be mentioning in this Fast Future webinar taking place on Sunday evening.
Footnote 2: Some people who offer a rationally flawed response like the ones above are, sadly, well aware of the flawed nature of their response, but they offer it anyway. They do so since they believe the response may well influence public discussion, despite being flawed. They put a higher value on promoting their own cause than on keeping the content of the debate as rational as possible. They don’t mind adding to the irrationality of public discussion. That’s a topic for a separate discussion, but it’s my view that we need to find both “carrots” and “sticks” to discourage people from deliberately promoting views they know to be irrational. And, yes, you guessed it, I’ll be touching on that topic too on Sunday evening.