dw2

25 October 2015

Getting better at anticipating the future

History is replete with failed predictions. Sometimes pundits predict too much change. Sometimes they predict too little. Frequently they predict the wrong kinds of change.

Even those forecasters who claim a good track record for themselves sometimes turn out, on closer inspection, to have included lots of wiggle room in their predictions – lots of scope for creative reinterpretation of their earlier words.

Of course, forecasts are often made for purposes other than anticipating the events that will actually unfold. Forecasts can serve many other goals:

  • Raising the profile of the forecaster and potentially boosting book sales or keynote invites – especially if the forecast is memorable, and is delivered in a confident style
  • Changing the likelihood that an event predicted will occur – either making it more likely (if the prediction is enthusiastic), or making it less likely (if the prediction is fearful)
  • Helping businesses and organisations to think through some options for their future strategy, via “scenario analysis”.

Given these alternative reasons why forecasters make predictions, it perhaps becomes more understandable that little effort is made to evaluate the accuracy of past forecasts. As reported by Alex Mayyasi,

Organizations spend staggering amounts of time and money trying to predict the future, but no time or money measuring their accuracy or improving on their ability to do it.

This bizarre state of affairs may be understandable, but it’s highly irresponsible, none the less. We can, and should, do better. In a highly uncertain, volatile world, our collective future depends on improving our ability to anticipate forthcoming developments.

Philip Tetlock

Mayyasi was referring to research by Philip Tetlock, a professor at the University of Pennsylvania. Over three decades, Tetlock has accumulated huge amounts of evidence about forecasting. His most recent book, co-authored with journalist Dan Gardner, is a highly readable summary of his research.

The book is entitled “Superforecasting: The Art and Science of Prediction”. I wholeheartedly recommend it.

Superforecasting

The book carries an endorsement by Nobel laureate Daniel Kahneman:

A manual for thinking clearly in an uncertain world. Read it.

Having just finished this book, I echo the praise it has gathered. The book is grounded in the field of geopolitical forecasting, but its content ranges far beyond that starting point. For example, the book can be viewed as one of the best descriptions of the scientific method – with its elevation of systematic, thoughtful doubt, and its search for ways to reduce uncertainty and eliminate bias. The book also provides a handy summary of all kinds of recent findings about human thinking methods.

“Superforecasting” also covers the improvements in the field of medicine that followed from the adoption of evidence-based medicine (in the face, it should be remembered, of initial fierce hostility from the medical profession). Indeed, the book seeks to accelerate a similar evidence-based revolution in the fields of economic and political analysis. It even has hopes to reduce the level of hostility and rancour that tends to characterise political discussion.

As such, I see the book as making an important contribution to the creation of a better sort of politics.

Summary of “Superforecasting”

The book draws on:

  • Results from four years of online competitions for forecasters held under the Aggregative Contingent Estimation project of IARPA (Intelligence Advanced Research Projects Activity)
  • Reflections from contest participants who persistently scored highly in these competitions – people who became known as ‘superforecasters’
  • Insight from the Good Judgement Project co-created by Tetlock
  • Reviews of the accuracy of predictions made publicly by politicians, political analysts, and media figures
  • Other research into decision-making, cognitive biases, and group dynamics.

Forecasters and superforecasters in the Good Judgement Project submitted more than 10,000 predictions over four years, in response to questions about the likelihood of specified outcomes occurring within given timescales, typically three to twelve months ahead. The forecasts addressed the fields of geopolitics and economics.
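How were those thousands of predictions scored? The accuracy measure Tetlock describes is the Brier score, which rewards forecasts that are both confident and correct. Here is a minimal sketch in Python; the outcomes and probabilities are invented purely for illustration, and this is not the tournament's actual scoring code.

  def brier_score(forecast_probability, outcome):
      """Brier score for a binary event: the squared gap between the
      forecast probability and what actually happened (1 = event occurred,
      0 = it did not). 0.0 is perfect; a permanent 50/50 hedge earns 0.25;
      1.0 is the worst possible. (The tournaments use the original
      formulation that sums over both alternatives, which simply doubles
      these values, giving a 0-to-2 scale.)"""
      return (forecast_probability - outcome) ** 2

  # Hypothetical forecasts: (probability given to the event, actual outcome)
  forecasts = [
      (0.80, 1),  # 80% that the event occurs; it does occur
      (0.30, 0),  # 30% that the event occurs; it does not
      (0.60, 0),  # 60% that the event occurs; it does not
  ]

  scores = [brier_score(p, o) for p, o in forecasts]
  print(f"Mean Brier score: {sum(scores) / len(scores):.3f}")  # lower is better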

The book highlights the following characteristics as the key causes of the superforecasters’ success:

  • Avoidance of taking an ideological approach, which restricts the set of information that the forecaster considers
  • Pursuit of an evidence-based approach
  • Willingness to search out potential sources of disconfirming evidence
  • Willingness to incrementally adjust forecasts in the light of new evidence (see the brief sketch below for a simple numerical illustration)
  • The ability to break an estimate down into a series of constituent questions that can, individually, be answered more easily
  • The desire to obtain several different perspectives on a question, which can then be combined into an aggregate viewpoint
  • Comfort with mathematical and probabilistic reasoning
  • Adoption of careful, precise language, rather than vague terms (such as “might”) whose apparent meaning can change with hindsight
  • Acceptance of contingency rather than ‘fate’ or ‘inevitability’ as being the factor responsible for outcomes
  • Avoidance of ‘groupthink’ in which undue respect among team members prevents sufficient consideration of alternative viewpoints
  • Willingness to learn from past forecasting experiences – including both successes and failures
  • A growth mindset, in which personal characteristics and skill are seen as capable of improvement, rather than being fixed.

(This section draws on material I’ve added to H+Pedia earlier today. See that article for some links to further reading.)
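Several of these habits, notably aggregating multiple perspectives and incrementally updating in the light of new evidence, lend themselves to a simple numerical illustration. The Python sketch below is in the spirit of the Bayesian belief-updating the book discusses; all the numbers are invented, and the code is purely illustrative rather than a recipe taken from the book.

  def bayes_update(prior, likelihood_if_true, likelihood_if_false):
      """Revise the probability of an outcome after one piece of evidence,
      via Bayes' rule: P(outcome | evidence) is proportional to
      P(evidence | outcome) * P(outcome)."""
      numerator = likelihood_if_true * prior
      denominator = numerator + likelihood_if_false * (1 - prior)
      return numerator / denominator

  # All numbers below are invented purely for illustration.

  # 1. Aggregating perspectives: the simplest aggregate of several
  #    independent estimates is their mean.
  team_estimates = [0.10, 0.25, 0.18, 0.15]
  estimate = sum(team_estimates) / len(team_estimates)
  print(f"aggregate starting estimate: {estimate:.2f}")

  # 2. Incremental updating: nudge the estimate as news arrives. Each item
  #    of news is characterised by how likely it would be to appear if the
  #    outcome were, or were not, on the way.
  news_items = [(0.7, 0.4), (0.6, 0.5), (0.2, 0.5)]
  for lik_if_true, lik_if_false in news_items:
      estimate = bayes_update(estimate, lik_if_true, lik_if_false)
      print(f"revised estimate: {estimate:.2f}")

The point is not the particular numbers, but the discipline: small, explicit, repeatable adjustments rather than dramatic lurches of opinion.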

Human pictures

Throughout “Superforecasting”, the authors provide the human backgrounds of the forecasters whose results and methods feature in the book. The superforecasters have a wide variety of backgrounds and professional experience. What they have in common, however – and where they differ from the other contest participants, whose predictions were less stellar – is the set of characteristics given above.

The book also discusses a number of well-known forecasters, and dissects the causes of their forecasting failures. These include 9/11, the wars in Iraq, the Bay of Pigs fiasco, and many more. There’s much to learn from all these examples.

Aside: Other ways to evaluate futurists

Australian futurist Ross Dawson has recently created a very different method to evaluate the success of futurists. As Ross explains at http://rossdawson.com/futurist-rankings/:

We have created this widget to provide a rough view of how influential futurists are on the web and social media. It is not intended to be rigorous but it provides a fun and interesting insight into the online influence of leading futurists.

The score is computed from the number of Twitter followers, the Alexa score of websites, and the general Klout metric.
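As far as I know the exact formula isn’t published, so the following Python fragment is a purely hypothetical sketch of how a composite influence score could be assembled from those three inputs; the weights and scalings are my own assumptions, not Ross’s method.

  import math

  # Hypothetical composite "online influence" score. The weights, scaling
  # and signature are assumptions for illustration only; they are NOT the
  # actual formula behind Ross Dawson's widget.
  def influence_score(twitter_followers, alexa_rank, klout_score,
                      weights=(0.4, 0.3, 0.3)):
      followers_part = math.log10(max(twitter_followers, 1))  # spans orders of magnitude
      alexa_part = 1.0 / math.log10(max(alexa_rank, 10))      # lower rank is better, so invert
      klout_part = klout_score / 100.0                        # already on a 0-100 scale
      w1, w2, w3 = weights
      return w1 * followers_part + w2 * alexa_part + w3 * klout_part

  print(influence_score(twitter_followers=25_000, alexa_rank=350_000, klout_score=60))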

The widget currently lists 152 futurists. I was happy to find my name at #53 on the list. If I finish writing the two books I have in mind to publish over the next 12 months, I expect my personal ranking to climb 🙂

Yet another approach is to take a look at http://future.meetup.com/, the listing (by size) of the Meetup groups around the world that list “futurism” (or similar) as one of their interests. London Futurists, which I’ve been running (directly and indirectly) over the last seven years, features in third place on that list.

Of course, we futurists vary in the kind of topics we are ready (and willing) to talk to audiences about. In my own case, I wish to encourage audiences away from “slow-paced” futurism, towards serious consideration of the possibilities of radical changes happening within just a few decades. These changes include not just the ongoing transformation of nature, but the possible transformation of human nature. As such, I’m ready to introduce the topic of transhumanism, so that audiences become more aware of the arguments both for and against this philosophy.

Within that particular subgrouping of futurist meetups, London Futurists ranks as a clear #1, as can be seen from http://transhumanism.meetup.com/.

Footnote

Edge has published a series of videos of five “master-classes” taught by Philip Tetlock on the subject of superforecasting:

  1. Forecasting Tournaments: What We Discover When We Start Scoring Accuracy
  2. Tournaments: Prying Open Closed Minds in Unnecessarily Polarized Debates
  3. Counterfactual History: The Elusive Control Groups in Policy Debates
  4. Skillful Backward and Forward Reasoning in Time: Superforecasting Requires “Counterfactualizing”
  5. Condensing it All Into Four Big Problems and a Killer App Solution

I haven’t had the time to view them yet, but if they’re anything like as good as the book “Superforecasting”, they’ll be well worth watching.

1 Comment

  1. I like your thorough review, and all the bullet points you listed. This was a very engaging and persuasive read, and seems to rest on solid research. (Also, that’s an interesting cover picture you have there–haven’t seen the weathervane version yet anywhere else!) Check out my review if you are so inclined: https://leviathanbound.wordpress.com/2016/03/11/superforecasting/

    Regards.

    Comment by Levi — 12 March 2016 @ 6:32 pm

